Gradient Descent
The primary optimization algorithm used to train neural networks, which iteratively adjusts model parameters in the direction opposite the gradient of the loss — the direction of steepest decrease in prediction error.
Gradient descent and its variants (SGD, Adam, AdamW) are the workhorses of deep learning optimization. Stochastic gradient descent (SGD) processes random mini-batches of data rather than the entire dataset, making it practical for large-scale training. The Adam optimizer, introduced in 2015, adapts learning rates per parameter and is the default choice for training most modern neural networks. Training frontier models involves performing gradient descent across trillions of tokens using thousands of GPUs in parallel.
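The core update rule can be sketched in a few lines. This is a minimal illustration, not a production training loop: the quadratic loss, function names, and hyperparameters below are hypothetical choices made for the example.

```python
# Minimal gradient-descent sketch (illustrative; names and loss are hypothetical).
# Minimizes f(w) = (w - 3)^2 with the update: w <- w - lr * f'(w)

def grad(w):
    # Derivative of f(w) = (w - 3)^2
    return 2.0 * (w - 3.0)

def gradient_descent(w0, lr=0.1, steps=100):
    w = w0
    for _ in range(steps):
        # Step against the gradient, scaled by the learning rate
        w -= lr * grad(w)
    return w

w_star = gradient_descent(w0=0.0)
print(w_star)  # converges toward the minimum at w = 3
```

Stochastic variants like SGD apply the same update, but estimate the gradient from a random mini-batch at each step instead of the full dataset; Adam additionally rescales each parameter's step using running estimates of the gradient's first and second moments.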
Related Terms
Artificial General Intelligence (AGI)
A hypothetical form of AI that can understand, learn, and apply knowledge across any intellectual task at or above human level, rather than being specialized for specific tasks.
AI Alignment
The research field focused on ensuring AI systems behave in accordance with human values and intentions, particularly as systems become more capable.
AI Compute
The computational resources — primarily GPU and TPU processing power — required to train and run AI models, typically measured in FLOP (floating-point operations) or GPU-hours.
Capex (Capital Expenditure)
Long-term investment spending by companies on physical assets like data centers, GPU clusters, and networking infrastructure — the backbone of AI deployment at scale.
Data Center
A facility housing computer systems and infrastructure used to process, store, and distribute data — increasingly built specifically for AI training and inference workloads.
Fine-Tuning
The process of further training a pre-trained AI model on a specific, smaller dataset to specialize it for a particular task or domain, requiring far less compute than training from scratch.