Dropout

A regularization technique that randomly deactivates a fraction of neurons during each training step, preventing neural networks from becoming overly dependent on any single neuron and reducing overfitting.

Dropout was introduced by Srivastava et al. in 2014 and quickly became one of the most widely used regularization methods in deep learning. Typical dropout rates range from 10% to 50% of neurons per layer. Because a different random subset of neurons is dropped at each training step, the technique effectively trains an ensemble of sub-networks within a single model, improving generalization to unseen data. At inference, no neurons are dropped; in the common "inverted dropout" formulation, activations are rescaled during training so the network can be used unchanged at test time. While dropout remains common in smaller models, modern architectures like transformers often rely more on other regularization methods such as layer normalization and weight decay.
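
A minimal NumPy sketch of the inverted-dropout formulation described above. The function name, signature, and the 50% rate in the example are illustrative, not taken from any particular library:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during training.

    Surviving activations are scaled by 1 / (1 - p) so each unit's expected
    value is unchanged, which lets inference use the network as-is.
    """
    if not training or p == 0.0:
        return x  # at inference time, dropout is a no-op
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(x.shape) >= p   # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)       # rescale the units that survived

# Example: a layer of 8 activations with a 50% dropout rate.
activations = np.ones(8)
print(dropout(activations, p=0.5))                  # about half zeroed, rest = 2.0
print(dropout(activations, p=0.5, training=False))  # unchanged at inference
```

In practice, frameworks expose this as a built-in layer (for example, torch.nn.Dropout in PyTorch), which handles the training/inference switch automatically.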
