Backpropagation

The fundamental algorithm for training neural networks: it calculates how much each weight contributed to the overall error so the weights can be adjusted to improve predictions.

Backpropagation, short for backward propagation of errors, was popularized in 1986 by Rumelhart, Hinton, and Williams. It works by computing the gradient of the loss function with respect to each weight using the chain rule of calculus; an optimizer such as gradient descent then nudges each weight in the direction opposite its gradient, which reduces the error. The algorithm is the backbone of deep learning: without it, training networks with millions or billions of parameters would be computationally infeasible. Modern frameworks like PyTorch and TensorFlow implement reverse-mode automatic differentiation to handle backpropagation efficiently.
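
To make the chain-rule mechanics concrete, here is a minimal sketch that trains a tiny one-hidden-layer network with NumPy and writes out the backward pass by hand. The layer sizes, toy data, activation choice, and learning rate are illustrative assumptions, not anything specified in this entry.

```python
# Minimal backpropagation sketch: one hidden layer, mean squared error loss.
# Shapes, sizes, and hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 8 samples, 3 input features, 1 target value each.
X = rng.normal(size=(8, 3))
y = rng.normal(size=(8, 1))

# Weights and biases for a 3 -> 4 -> 1 network.
W1 = rng.normal(scale=0.5, size=(3, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))
b2 = np.zeros((1, 1))

learning_rate = 0.1

for step in range(200):
    # ---- Forward pass ----
    z1 = X @ W1 + b1                     # hidden pre-activations
    h = np.tanh(z1)                      # hidden activations
    y_hat = h @ W2 + b2                  # network output
    loss = np.mean((y_hat - y) ** 2)     # mean squared error

    # ---- Backward pass: chain rule, applied layer by layer ----
    n = X.shape[0]
    d_y_hat = 2 * (y_hat - y) / n        # dLoss/dy_hat
    dW2 = h.T @ d_y_hat                  # dLoss/dW2
    db2 = d_y_hat.sum(axis=0, keepdims=True)
    d_h = d_y_hat @ W2.T                 # propagate error back to hidden layer
    d_z1 = d_h * (1 - np.tanh(z1) ** 2)  # through the derivative of tanh
    dW1 = X.T @ d_z1                     # dLoss/dW1
    db1 = d_z1.sum(axis=0, keepdims=True)

    # ---- Gradient descent update: step against the gradient ----
    W1 -= learning_rate * dW1
    b1 -= learning_rate * db1
    W2 -= learning_rate * dW2
    b2 -= learning_rate * db2

print(f"final loss: {loss:.4f}")
```

In practice a framework handles this bookkeeping: in PyTorch, for example, calling loss.backward() fills in each parameter's gradient via automatic differentiation, so only the forward pass and the update step are written by hand.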
