
E333 - TPUs: Google's home advantage
Published: December 12, 2025
Duration: 29:20
In the race to train and deploy generative AI models, companies have poured hundreds of billions of dollars into GPUs, chips that have become essential for the parallel processing needs of large language models.
Nvidia alone has forecast $500 billion in sales across 2025 and 2026. Jensen Huang, Nvidia's founder and CEO, recently stated that “inference has become the most compute-intensive phase of AI — demanding real-time reasoning at planetary scale”.
Google is meeting these demands in its own way. Unlike firms reliant on chips from Nvidia, AMD, and others, Google has long used its in...