Google Introduces 4th-Generation TPU Chips

Google will preview its latest machine-learning clusters at its I/O conference tomorrow. The clusters aim for nine exaflops of peak performance while using 90 percent carbon-free energy, which would make them the largest publicly accessible machine-learning hub in the world.

The TPU V4 Pod sits at the heart of the new clusters. These tensor processing units were first announced at Google I/O last year, and AI teams from Meta, LG, and Salesforce have already had access to them. The V4 TPUs let researchers use their preferred framework, whether TensorFlow, JAX, or PyTorch, and have already enabled breakthroughs at Google Research in areas like language understanding, computer vision, and speech recognition.
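The framework flexibility comes from the XLA compiler layer that all three frameworks target. As a hedged illustration (not Google's example code), the same JAX function runs unchanged on CPU, GPU, or TPU backends; on a Cloud TPU VM, `jax.devices()` would report TPU cores instead:

```python
# Minimal JAX sketch: backend-agnostic model code. jax.jit compiles the
# function through XLA for whichever accelerator is present (CPU here,
# TPU cores on a Cloud TPU VM).
import jax
import jax.numpy as jnp

@jax.jit
def predict(w, x):
    # A toy one-layer model; the same code would run on a TPU slice.
    return jnp.tanh(x @ w)

w = jnp.ones((4, 2))
x = jnp.ones((3, 4))
out = predict(w, x)
print(out.shape)        # (3, 2)
print(jax.devices())    # lists the available backend's devices
```

The function body never mentions the hardware; switching from a laptop to a TPU VM requires no code changes, only a different JAX installation.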

Potential workloads for the clusters, which will be based in Google’s Oklahoma data center, are expected to be similar, chewing through data in fields such as natural language processing, computer vision, and recommendation systems.

Access to the clusters is provided in slices ranging from four chips (one TPU VM) to thousands. Slices with at least 64 chips use three-dimensional torus links, which provide higher bandwidth for collective communication operations. The V4 chips can also access twice as much memory as the previous generation — 32GiB versus 16GiB — and deliver roughly twice the speed when training large-scale models.
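Those per-chip figures make it easy to estimate the aggregate memory a slice provides. The helper below is a back-of-the-envelope sketch (a hypothetical function, not a Google API), using only the 16GiB/32GiB per-chip numbers quoted above:

```python
# Per-chip high-bandwidth memory, from the article's figures.
GIB_PER_CHIP = {"v3": 16, "v4": 32}

def slice_memory_gib(generation: str, chips: int) -> int:
    """Total HBM across all chips in a TPU slice (hypothetical helper)."""
    return GIB_PER_CHIP[generation] * chips

# The smallest v4 slice is four chips (one TPU VM):
print(slice_memory_gib("v4", 4))    # 128
# A 64-chip slice, the smallest size wired as a 3D torus:
print(slice_memory_gib("v4", 64))   # 2048
```

At equal chip counts, a v4 slice holds exactly twice the model state of a v3 slice, which is part of what enables the large-scale training gains described above.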

“To make advanced AI hardware more accessible, we launched the TPU Research Cloud (TRC) program a few years ago, which has provided free access to TPUs to thousands of ML enthusiasts around the world,” said Jeff Dean, SVP of Google Research and AI.

“They’ve published hundreds of papers and open-source GitHub libraries on everything from ‘Writing Persian poetry with AI’ to ‘Discriminating between sleep and exercise-induced fatigue using computer vision and behavioral genetics.’ The release of Cloud TPU v4 is a significant milestone for Google Research and our TRC program, and we are very excited about our long-term collaboration with ML developers around the world to use AI for good.”

Google’s sustainability commitment means that the company has been matching the energy usage of its data centers with renewable energy purchases since 2017, and it aims to run its entire business on renewable energy by 2030. The V4 TPU is also more energy-efficient than previous generations, producing three times the FLOPS per watt of the V3 chip.

Access to Cloud TPU v4 Pods is available to all Google AI Cloud users in evaluation (on-demand), preemptible, and committed use discount (CUD) options.
