Aiconomy

Model Training

The computationally intensive process of teaching an AI model by feeding it data and adjusting its parameters to minimize errors, often requiring thousands of GPUs running for weeks or months.
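
To make "adjusting its parameters to minimize errors" concrete, here is a minimal Python sketch (assuming NumPy; the linear model, synthetic data, and learning rate are illustrative inventions, not details of any real training run). Frontier training runs apply the same measure-the-error-and-nudge-the-parameters cycle to billions of parameters, sharded across thousands of GPUs.

import numpy as np

# Toy version of the training loop described above: a linear model y = w*x + b
# is fed data, its error is measured, and its parameters are adjusted
# against the gradient of that error.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)
y_true = 3.0 * x + 0.5 + rng.normal(scale=0.1, size=200)  # the data "fed" to the model

w, b = 0.0, 0.0        # parameters start at arbitrary values
learning_rate = 0.1

for step in range(500):
    y_pred = w * x + b                  # forward pass: the model's current predictions
    error = y_pred - y_true
    loss = np.mean(error ** 2)          # how wrong the model is right now (mean squared error)
    grad_w = 2 * np.mean(error * x)     # gradients: how the loss changes with each parameter
    grad_b = 2 * np.mean(error)
    w -= learning_rate * grad_w         # adjust parameters to reduce the loss
    b -= learning_rate * grad_b

print(f"learned w={w:.2f}, b={b:.2f}, final loss={loss:.4f}")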

Training compute for frontier models doubles roughly every 6 months. Training GPT-4 required an estimated $78–191 million in compute costs and consumed an estimated 3,600 MWh of electricity, enough to power roughly 340 American homes for a year. The Stargate project, a $500 billion AI infrastructure initiative, is largely driven by the need for more training compute. Training costs are a key barrier to entry, concentrating frontier AI development among a handful of well-funded companies.
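
The household comparison is a simple ratio. The sketch below redoes the arithmetic, assuming an average of roughly 10,600 kWh of electricity per US home per year (that per-home figure is an assumption added here, based on EIA residential averages, not part of the original).

# Back-of-envelope check of the household comparison above.
TRAINING_ENERGY_MWH = 3_600
HOME_KWH_PER_YEAR = 10_600   # assumption: approximate US residential average

homes_powered_for_a_year = TRAINING_ENERGY_MWH * 1_000 / HOME_KWH_PER_YEAR
print(f"{homes_powered_for_a_year:.0f} homes")   # ~340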
