Inference

The process of running a trained AI model to generate predictions or outputs — as opposed to training, which builds the model in the first place. Inference accounts for the majority of AI's ongoing energy consumption.

While training a frontier model is a one-time (if massive) compute cost, inference runs continuously as millions of users interact with AI systems daily. A single ChatGPT query uses roughly 10x the electricity of a Google search. As AI adoption scales to billions of users, inference energy demand is projected to far exceed the one-time energy cost of training, contributing to the projected growth of AI electricity consumption from 560 TWh in 2025 to potentially 1,000 TWh by 2030.
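The scale argument above can be sketched as a back-of-envelope calculation. This is an illustrative estimate, not a measurement: the ~0.3 Wh figure for a Google search is a commonly cited assumption, the 10x multiplier comes from the comparison above, and the daily query volume is a hypothetical input.

```python
# Back-of-envelope estimate of annual inference energy at scale.
# All inputs are assumptions for illustration:
#   - ~0.3 Wh per Google search (commonly cited, not measured here)
#   - one AI query at ~10x that, per the comparison in the text
#   - query volume is a hypothetical parameter

WH_PER_SEARCH = 0.3                    # assumed Wh per Google search
WH_PER_AI_QUERY = 10 * WH_PER_SEARCH   # ~3 Wh per AI query

def annual_inference_twh(queries_per_day: float) -> float:
    """Annual inference energy in TWh for a given daily query volume."""
    wh_per_year = queries_per_day * WH_PER_AI_QUERY * 365
    return wh_per_year / 1e12          # Wh -> TWh

# At 1 billion queries/day, inference alone is roughly 1.1 TWh/year;
# at hundreds of billions of queries/day, it reaches the hundreds of
# TWh that dominate projected AI electricity demand.
print(annual_inference_twh(1e9))
```

Under these assumptions, per-query energy matters far less than query volume: a 10x swing in adoption moves the total as much as a 10x swing in model efficiency.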

