Aiconomy

Foundation Model

A large AI model trained on broad data that can be adapted to a wide range of downstream tasks. Examples include GPT-4, Claude, Gemini, and Llama.

In 2024, 67% of notable foundation models were released with open weights. The compute used to train frontier models doubles roughly every six months, and estimated training costs for the most expensive frontier models range from $78 million to $191 million. The largest models exceed 1 trillion parameters. Meta's Llama family, OpenAI's GPT series, Google's Gemini, and Anthropic's Claude are among the most prominent foundation model families.

