Aiconomy

Embedding

A learned numerical representation of words, sentences, images, or other data as points in a continuous vector space, where similar items are positioned close together.

Embeddings are foundational to modern AI systems. Word2Vec (2013) and GloVe (2014) pioneered static word embeddings, while BERT and GPT produce contextual embeddings, in which the same word receives a different representation depending on its surrounding context. OpenAI's text-embedding-ada-002 produces 1,536-dimensional vectors used by thousands of applications. Embeddings power semantic search, recommendation systems, and RAG pipelines, and demand for them is growing as enterprises build vector databases to store and query billions of embeddings for AI applications.
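The core idea, "similar items are positioned close together," is usually measured with cosine similarity between vectors. A minimal sketch in plain Python, using tiny made-up 3-dimensional vectors (real models such as text-embedding-ada-002 produce 1,536 dimensions):

```python
import math

def cosine_similarity(a, b):
    # cos(a, b) = dot(a, b) / (|a| * |b|); ranges from -1 to 1,
    # with 1 meaning the vectors point in the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" invented for illustration; a real system would
# obtain these vectors from an embedding model's API.
embeddings = {
    "cat":    [0.90, 0.80, 0.10],
    "kitten": [0.85, 0.75, 0.20],
    "car":    [0.10, 0.20, 0.90],
}

query = embeddings["cat"]
for word, vec in sorted(embeddings.items(),
                        key=lambda kv: -cosine_similarity(query, kv[1])):
    print(f"{word}: {cosine_similarity(query, vec):.3f}")
```

Semantic search and RAG retrieval work the same way at scale: embed the query, then rank stored vectors by similarity, which is the query a vector database is optimized to answer over billions of entries.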
