Here’s how our TPUs power increasingly demanding AI workloads.
Apr 23, 2026
Behind the Google products you use every day are custom chips designed for one job: doing math at massive scale. They're called TPUs, or Tensor Processing Units.
We designed TPUs from the ground up more than a decade ago specifically to run AI models. AI models work by doing enormous amounts of math, and TPUs are built to do that math extremely fast: the newest generation of TPUs delivers 121 exaflops of compute with double the bandwidth of previous generations.
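To give a rough sense of scale, here is a back-of-the-envelope sketch of the math behind a figure like that. The matrix sizes below are hypothetical, chosen only for illustration, and the FLOP-counting rule (two operations per multiply-add) is a standard convention, not anything specific to Google's hardware:

```python
# Illustrative arithmetic: why AI workloads need chips built for math at scale.
# A dense matrix multiply of an (m, k) matrix by a (k, n) matrix costs about
# 2 * m * k * n floating-point operations (one multiply and one add per term).

def matmul_flops(m: int, k: int, n: int) -> int:
    """FLOPs for one dense (m, k) @ (k, n) matrix multiply."""
    return 2 * m * k * n

# Hypothetical layer size, for scale only.
flops_per_layer = matmul_flops(8192, 8192, 8192)  # ~1.1e12 FLOPs

# At 121 exaflops (1.21e20 FLOP/s), that multiply is done in nanoseconds.
EXAFLOPS_121 = 121 * 10**18
seconds = flops_per_layer / EXAFLOPS_121
print(f"{flops_per_layer:.3e} FLOPs -> {seconds:.2e} s at 121 EFLOP/s")
```

The point of the sketch: a single large matrix multiply already costs on the order of a trillion operations, and models chain many of them per input, which is why hardware rated in exaflops matters.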
Learn more about these tiny but mighty processors in the video below.