TPU (Tensor Processing Unit)
Google's custom-designed AI accelerator chip, optimized for TensorFlow workloads and offering an alternative to NVIDIA GPUs for AI training and inference within Google Cloud.
Google has deployed six generations of TPUs since 2015, with TPU v5p the most recent. TPU pods connect thousands of chips: a TPU v5p pod links 8,960 chips, delivering roughly 4 exaFLOPS of aggregate bfloat16 compute. Google used TPUs to train PaLM (6,144 TPU v4 chips) and Gemini, and TPU v5e offers competitive price-performance for inference workloads on Google Cloud. NVIDIA GPUs enjoy broader software-ecosystem support through CUDA, but TPUs can offer cost advantages for organizations committed to Google's stack. TPUs are not sold as standalone hardware; Google Cloud is the only provider offering TPU access.
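In practice, TPUs are programmed through XLA-based frameworks such as JAX or TensorFlow. The sketch below is a minimal illustration, not Google's training setup: it assumes you are on a Cloud TPU VM with the jax[tpu] package installed, and the device count it reports depends entirely on the slice you provision.

```python
# Minimal sketch: run a computation on Cloud TPU chips from JAX.
# Assumes a Cloud TPU VM with jax[tpu] installed; on other hardware
# JAX falls back to CPU/GPU and the same code still runs.
import jax
import jax.numpy as jnp

# Each entry is one TPU device visible to this host, e.g. 4 on a v5e-4 slice.
print(jax.device_count(), jax.devices())

# jit compiles the function with XLA, which maps the matmul onto the TPU's
# matrix multiply units; bfloat16 is the TPU's native training precision.
@jax.jit
def matmul(a, b):
    return a @ b

x = jnp.ones((4096, 4096), dtype=jnp.bfloat16)
print(matmul(x, x).shape)  # (4096, 4096), computed on the default device
```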
Related Terms
AI Compute
The computational resources — primarily GPU and TPU processing power — required to train and run AI models, typically measured in FLOP (floating-point operations) or GPU-hours; a rough conversion between the two units is sketched after this list.
Capex (Capital Expenditure)
Long-term investment spending by companies on physical assets like data centers, GPU clusters, and networking infrastructure — the backbone of AI deployment at scale.
Data Center
A facility housing computer systems and infrastructure used to process, store, and distribute data — increasingly built specifically for AI training and inference workloads.
Fine-Tuning
The process of further training a pre-trained AI model on a specific, smaller dataset to specialize it for a particular task or domain, requiring far less compute than training from scratch.
Foundation Model
A large AI model trained on broad data that can be adapted to a wide range of downstream tasks — examples include GPT-4, Claude, Gemini, and Llama.
Frontier Model
The most capable and advanced AI models at any given time, typically trained with the largest compute budgets and achieving state-of-the-art performance on benchmarks.
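To make the FLOP and chip-hour units in the AI Compute entry concrete, here is a hypothetical back-of-the-envelope conversion. Every number below (run length, per-chip throughput, utilization) is an illustrative assumption rather than a figure from this glossary; only the 6,144-chip count echoes the PaLM example above.

```python
# Hypothetical conversion from chip-hours to total training FLOP.
# All inputs are illustrative assumptions, not measured figures.
CHIPS = 6_144                  # chip count, echoing the PaLM TPU v4 example
HOURS = 1_200                  # assumed wall-clock run length
PEAK_FLOP_PER_SEC = 2.75e14    # ~275 TFLOPS bf16 nominal peak per TPU v4 chip
UTILIZATION = 0.5              # assumed fraction of peak actually sustained

chip_hours = CHIPS * HOURS
total_flop = chip_hours * 3_600 * PEAK_FLOP_PER_SEC * UTILIZATION
print(f"{chip_hours:,} chip-hours  ->  {total_flop:.2e} FLOP")
```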