Aiconomy

GPT-3

OpenAI's 175-billion-parameter language model released in 2020, which demonstrated that scaling up model size produced emergent capabilities like few-shot learning and code generation.

GPT-3 was a watershed moment in AI, showing that a single model could handle translation, coding, summarization, and creative writing with minimal task-specific training. At 175 billion parameters, it was roughly 10x larger than any previous non-sparse language model. Its few-shot learning ability (performing a task from just a few examples supplied in the prompt, with no gradient updates) was largely unanticipated and launched the field of prompt engineering. The model's API generated over $1 billion in annualized revenue for OpenAI and helped attract Microsoft's cumulative $13 billion investment.
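The few-shot pattern above can be sketched as plain prompt construction: the task is specified entirely inside the prompt, with no fine-tuning. This is a minimal illustration assuming a generic completion-style model; the helper name is hypothetical, and the translation pairs echo the well-known demo from the GPT-3 paper.

```python
def build_few_shot_prompt(task, examples, query):
    """Assemble a few-shot prompt from a task description, a handful of
    (input, output) example pairs, and a final query left for the model
    to complete."""
    lines = [task, ""]
    for inp, out in examples:
        lines.append(f"Input: {inp}")
        lines.append(f"Output: {out}")
        lines.append("")  # blank line between examples
    # The query ends with a bare "Output:" so the model's continuation
    # is the answer.
    lines.append(f"Input: {query}")
    lines.append("Output:")
    return "\n".join(lines)

prompt = build_few_shot_prompt(
    task="Translate English to French.",
    examples=[("sea otter", "loutre de mer"), ("cheese", "fromage")],
    query="plush giraffe",
)
print(prompt)
```

The resulting string would be sent as the prompt to a completion endpoint; the model infers the task from the two examples and continues after the final "Output:".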
