Aiconomy

Zero-Shot Learning

A model's ability to perform tasks it was never explicitly trained on, using its general knowledge and reasoning capabilities to handle entirely novel situations without any task-specific examples.

Zero-shot learning has become one of the most impressive capabilities of large language models. GPT-4 can perform tasks like sentiment analysis, translation, and code generation without any task-specific training examples. CLIP demonstrated zero-shot image classification by learning from 400 million image-text pairs. Zero-shot capability generally improves with model scale and is considered an emergent property of sufficiently large models. It has dramatically reduced the barrier to deploying AI for new applications.
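In practice, zero-shot use of a language model means sending it only a task description and the input, with no labeled demonstrations. The sketch below shows what such a prompt looks like; `build_zero_shot_prompt` is an illustrative helper, not part of any real API, and actually sending the prompt to a model is left out.

```python
# Minimal sketch of zero-shot prompting: the model is given only a task
# description and the input -- no labeled examples -- and must rely on
# knowledge acquired during pretraining. (Few-shot prompting would add
# worked demonstrations before the input.)

def build_zero_shot_prompt(task: str, text: str) -> str:
    """Assemble a prompt containing only instructions, no demonstrations."""
    return (
        f"Task: {task}\n"
        f"Input: {text}\n"
        "Answer:"
    )

prompt = build_zero_shot_prompt(
    task="Classify the sentiment of the input as positive or negative.",
    text="The battery life on this laptop is fantastic.",
)
print(prompt)
```

The resulting string would be passed to a model via whatever inference API is in use; the key point is that no task-specific examples appear anywhere in it.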
