Aiconomy

AI Compute

The computational resources — primarily GPU and TPU processing power — required to train and run AI models, typically measured in FLOP (floating-point operations) or GPU-hours.
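To make the two units concrete, compute budgets are often estimated with the common back-of-the-envelope approximation C ≈ 6·N·D FLOP (N = parameter count, D = training tokens) and then converted to GPU-hours. The sketch below uses illustrative GPT-3-scale inputs and assumed hardware figures (A100-class peak throughput, 40% sustained utilization), not measured data:

```python
# Rough sketch: estimate training compute in FLOP and convert to GPU-hours.
# Uses the common C ~= 6 * N * D approximation (N = parameters, D = tokens).
# Hardware figures below are illustrative assumptions, not measurements.

def training_flop(n_params: float, n_tokens: float) -> float:
    """Approximate total training compute in FLOP."""
    return 6 * n_params * n_tokens

def gpu_hours(total_flop: float, peak_flops: float, utilization: float) -> float:
    """Convert a FLOP budget into GPU-hours at a given sustained utilization."""
    effective_flops = peak_flops * utilization     # FLOP/s actually achieved
    return total_flop / (effective_flops * 3600)   # seconds -> hours

# Example: GPT-3-scale run, 175B parameters trained on 300B tokens.
flop = training_flop(175e9, 300e9)                 # ~3.15e23 FLOP
# Assume an A100-class GPU: ~312 TFLOP/s peak (BF16) at ~40% utilization.
hours = gpu_hours(flop, peak_flops=312e12, utilization=0.40)
print(f"{flop:.2e} FLOP ~= {hours:,.0f} GPU-hours")
```

The utilization assumption dominates the result: moving it between 30% and 50% swings the GPU-hour estimate by tens of percent, which is one reason published compute figures for the same model often disagree.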

AI compute has been doubling roughly every 6 months since 2010, an annual growth rate of about 4.2x that far outpaces Moore's Law. Over 120 million GPU-hours are consumed daily by AI workloads worldwide. The AI chip market generated $66.2 billion in 2024, with NVIDIA holding approximately 64% market share. Training a single frontier model is estimated to cost from $78 million (GPT-4) to $191 million (Gemini Ultra) in compute.
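The two growth figures above are consistent: a 4.2x annual multiplier implies a doubling time of 12·ln(2)/ln(4.2) ≈ 5.8 months, i.e. "roughly every 6 months." A quick check:

```python
import math

# A 4.2x annual growth rate implies compute doubles every ~5.8 months,
# consistent with the "roughly every 6 months" figure.
annual_growth = 4.2
doubling_months = 12 * math.log(2) / math.log(annual_growth)
print(f"Doubling time: {doubling_months:.1f} months")  # -> Doubling time: 5.8 months
```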

