BERT
Google's Bidirectional Encoder Representations from Transformers, a landmark 2018 model that popularized large-scale pre-training and introduced deeply bidirectional context understanding to NLP, improving search and language tasks worldwide.
BERT transformed NLP by conditioning on text in both directions simultaneously, unlike earlier models that processed text left-to-right. On release it set new state-of-the-art results on 11 NLP benchmarks. Google deployed BERT in Search in 2019, describing it as the biggest improvement to Search in five years, affecting 10% of all English queries. BERT spawned numerous variants, including RoBERTa, ALBERT, DistilBERT, and domain-specific versions for biomedical (BioBERT) and legal (LegalBERT) text. Despite being surpassed by larger models, BERT remains widely deployed in production because of its efficiency.
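The bidirectional-versus-left-to-right distinction can be sketched with a toy example (this is an illustration of the idea, not the actual BERT model): when predicting a masked token, a left-to-right model can only use the tokens before it, while a bidirectional encoder like BERT attends to tokens on both sides.

```python
# Toy illustration of context visibility for a masked position.
# (Assumption: "[MASK]" placeholder notation, as used in BERT's
# masked-language-model pre-training objective.)

def causal_context(tokens, i):
    """A left-to-right model sees only tokens before position i."""
    return tokens[:i]

def bidirectional_context(tokens, i):
    """A bidirectional encoder sees tokens on both sides of position i."""
    return tokens[:i] + tokens[i + 1:]

tokens = ["the", "river", "[MASK]", "was", "steep"]
print(causal_context(tokens, 2))         # ['the', 'river']
print(bidirectional_context(tokens, 2))  # ['the', 'river', 'was', 'steep']
```

Here the right-hand context ("was steep") is exactly what helps disambiguate the masked word, which is why bidirectional pre-training proved so effective.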
Related Terms
Artificial General Intelligence (AGI)
A hypothetical form of AI that can understand, learn, and apply knowledge across any intellectual task at or above human level, rather than being specialized for specific tasks.
AI Alignment
The research field focused on ensuring AI systems behave in accordance with human values and intentions, particularly as systems become more capable.
ChatGPT
OpenAI's conversational AI assistant, launched in November 2022, which catalyzed the current generative AI boom by demonstrating the capabilities of large language models to a mainstream audience.
Fine-Tuning
The process of further training a pre-trained AI model on a specific, smaller dataset to specialize it for a particular task or domain, requiring far less compute than training from scratch.
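Why fine-tuning needs so little compute can be sketched with a toy model (an assumption-laden illustration, where a single-parameter linear model stands in for a pretrained network): training resumes from the pretrained weight, which is already close to the task optimum, so a handful of steps on a tiny dataset suffices.

```python
# Toy sketch of fine-tuning vs. training from scratch.
# (Assumption: a 1-parameter linear model y = w * x stands in for a
# large pretrained network; gradient descent on squared error.)

def train(w, data, lr=0.1, steps=5):
    """Take a few gradient-descent steps on squared error, starting from w."""
    for _ in range(steps):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
            w -= lr * grad
    return w

pretrained_w = 1.9                    # pre-training already landed near the true value 2.0
task_data = [(1.0, 2.0), (2.0, 4.0)]  # tiny task-specific dataset where y = 2x

finetuned_w = train(pretrained_w, task_data)
print(round(finetuned_w, 3))  # converges very close to 2.0 in just 5 steps
```

Starting from a random initialization instead (say `w = 0.0`) would require many more steps and typically much more data, which is the intuition behind fine-tuning's compute savings.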
Foundation Model
A large AI model trained on broad data that can be adapted to a wide range of downstream tasks; examples include GPT-4, Claude, Gemini, and Llama.
Frontier Model
The most capable and advanced AI models at any given time, typically trained with the largest compute budgets and achieving state-of-the-art performance on benchmarks.