GPT-3

OpenAI's 175-billion-parameter language model released in 2020, which demonstrated that scaling up model size produced emergent capabilities like few-shot learning and code generation.

GPT-3 was a watershed moment in AI, showing that a single model could handle translation, coding, summarization, and creative writing with minimal task-specific training. At 175 billion parameters, it was roughly 10x larger than the previous largest language model, Microsoft's 17-billion-parameter Turing-NLG. Its few-shot learning ability (performing a task from just a few examples placed in the prompt) was largely unexpected and gave rise to the field of prompt engineering. The model's commercial API became a major revenue driver for OpenAI, with annualized revenue reportedly exceeding $1 billion, and helped deepen Microsoft's partnership, whose cumulative investment in OpenAI reportedly reached about $13 billion.
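Few-shot learning amounts to formatting example input-output pairs plus a new query into a single prompt, with no weight updates. The sketch below is illustrative (the helper name and exact formatting are assumptions, and the translation pairs are in the style of the examples in the GPT-3 paper), but it shows the basic pattern:

```python
def build_few_shot_prompt(examples, query):
    """Format (input, output) example pairs plus a new query
    in the in-context style popularized by GPT-3."""
    lines = []
    for src, tgt in examples:
        lines.append(f"English: {src}")
        lines.append(f"French: {tgt}")
    # End with the new input and an empty output slot for the model to fill.
    lines.append(f"English: {query}")
    lines.append("French:")
    return "\n".join(lines)

examples = [
    ("sea otter", "loutre de mer"),
    ("cheese", "fromage"),
]
prompt = build_few_shot_prompt(examples, "peppermint")
print(prompt)
```

Sent as-is to a large language model's completion endpoint, a prompt like this induces the model to continue the pattern and emit the translation, which is exactly the behavior that made GPT-3's in-context learning so striking.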
