Aiconomy
Last Updated: March 22, 2026

AI Models Database 2026

Track notable AI models with compute requirements, training costs, parameter counts, and benchmark performance. Sourced from Epoch AI, Stanford HAI, and primary research.

Key Model Statistics

200+

Notable AI models released in 2024

Epoch AI tracked over 200 notable AI models released in 2024, including GPT-4o, Claude 3.5 Sonnet, Gemini 1.5 Pro, and Llama 3.

Source: Epoch AI
67%

Models released with open weights in 2024

67% of notable foundation models released in 2024 were open-weight, including Meta's Llama 3 and Mistral's models.

Source: Stanford HAI
$191M

Estimated cost to train GPT-4

The estimated compute cost to train OpenAI's GPT-4 was approximately $78–191 million.

Source: Epoch AI
1T+

Parameters in largest language models

The largest publicly known language models now exceed 1 trillion parameters.

Source: Epoch AI
4.2x

Annual training compute growth rate

Compute used to train frontier AI models doubles roughly every 6 months — a 4.2x annual growth rate.

Source: Epoch AI
90%+

AI systems surpassing human benchmarks

AI systems now exceed median human performance on more than 90% of standard academic benchmarks.

Source: Stanford HAI
15 months

Time to saturate new AI benchmarks

New AI benchmarks are typically saturated within 15 months of release, down from 3+ years a decade ago.

Source: Stanford HAI
51%

AI papers from industry (vs academia)

For the first time, industry produced 51% of significant AI research papers in 2024, overtaking academia.

Source: Stanford HAI
95K+

AI patents filed per year globally

Over 95,000 AI-related patent applications are filed annually at major patent offices.

Source: USPTO AIPD

Model Trends

The AI Model Landscape

The pace of AI model development is accelerating. Epoch AI tracked over 200 notable models released in 2024 — including GPT-4o, Claude 3.5 Sonnet, Gemini 1.5 Pro, and Llama 3 — with projections of 300+ for 2025. The compute used to train frontier models doubles roughly every 6 months, a 4.2x annual growth rate that shows no signs of slowing.
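As a quick sanity check on the figures above, the "doubles roughly every 6 months" and "4.2x annual growth" claims can be reconciled with a one-line conversion from an annual growth multiplier to a doubling time (a minimal sketch; the function name is illustrative, not from any source):

```python
import math

def doubling_time_months(annual_growth: float) -> float:
    """Months for a quantity to double, given its annual growth multiplier."""
    return 12 * math.log(2) / math.log(annual_growth)

# A 4.2x annual growth in training compute implies doubling about
# every 5.8 months, consistent with the "roughly every 6 months" figure.
print(round(doubling_time_months(4.2), 1))  # → 5.8
```

An exact 6-month doubling time corresponds to 4x annual growth, so the two phrasings bracket the same underlying trend.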

A significant shift toward open-source has emerged: 67% of notable foundation models in 2024 were released with open weights. Meta's Llama 3 family, Mistral's models, and others have made powerful AI capabilities freely available, democratizing access while challenging the business models of closed-source providers like OpenAI.

AI capabilities are advancing faster than ever. Systems now surpass median human performance on most standard academic benchmarks, and new benchmarks are saturated within 15 months of release — down from 3+ years a decade ago. The largest models exceed 1 trillion parameters, with training costs reaching $78–191 million for frontier systems like GPT-4.

Industry now produces 51% of significant AI research papers, overtaking academia for the first time. This shift, combined with the "brain drain" of 40%+ of AI PhDs going directly into industry, raises questions about the future of academic AI research and the concentration of AI capabilities in a handful of companies.

