BERT

Google's Bidirectional Encoder Representations from Transformers, a landmark 2018 model that popularized large-scale pre-training and introduced deep bidirectional context understanding to NLP, improving search and language-understanding tasks worldwide.

BERT transformed NLP by reading text in both directions simultaneously: during pre-training it masks random tokens and learns to predict them from the surrounding context on both sides, unlike earlier models that read strictly left-to-right. It set new records on 11 NLP benchmarks upon release. Google deployed BERT in Search in 2019, calling it the biggest improvement in five years and noting that it affected about 10% of English-language queries. BERT spawned numerous variants, including RoBERTa, ALBERT, DistilBERT, and domain-specific versions for biomedical (BioBERT) and legal (LegalBERT) text. Despite being surpassed by larger models, BERT remains widely deployed in production because of its efficiency.
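The masked-prediction behaviour is easy to see in practice. The sketch below is a minimal, illustrative example, assuming the Hugging Face transformers library and the publicly released bert-base-uncased checkpoint are available; it asks BERT to fill in a masked word using context from both sides of the gap.

```python
# Minimal sketch of BERT's masked-token prediction, assuming the
# Hugging Face transformers library and the bert-base-uncased checkpoint.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT scores candidate words for [MASK] using context from both
# directions: "went straight to the" on the left and
# "to catch my flight" on the right.
sentence = "I went straight to the [MASK] to catch my flight."
for prediction in fill_mask(sentence, top_k=3):
    print(f"{prediction['token_str']}: {prediction['score']:.3f}")
```

A purely left-to-right model would have to guess the masked word from "I went straight to the" alone; BERT also conditions on "to catch my flight", which is what the bidirectional training objective provides.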
