Aiconomy

Perceptron

The simplest form of a neural network: a single artificial neuron that computes a weighted sum of its inputs, applies an activation function, and produces an output. It is the building block of all modern neural networks.
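The definition above can be sketched as a runnable single neuron trained with the classic perceptron learning rule. The step activation, zero initialization, learning rate, and AND-gate training set here are illustrative assumptions, not part of the original entry:

```python
def step(z):
    """Step activation: fire (1) if the weighted sum reaches the threshold."""
    return 1 if z >= 0 else 0

def predict(weights, bias, x):
    """Weighted inputs -> sum -> activation -> output."""
    return step(sum(w * xi for w, xi in zip(weights, x)) + bias)

def train(samples, epochs=20, lr=1):
    """Perceptron learning rule: nudge weights toward each mistake."""
    weights, bias = [0, 0], 0
    for _ in range(epochs):
        for x, target in samples:
            error = target - predict(weights, bias, x)
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

# AND is linearly separable, so the learning rule converges.
and_gate = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights, bias = train(and_gate)
print([predict(weights, bias, x) for x, _ in and_gate])  # [0, 0, 0, 1]
```

Each training step only moves the weights when the neuron makes a mistake, which is why convergence is guaranteed for linearly separable data.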

The perceptron was introduced by Frank Rosenblatt in 1958, marking one of the earliest milestones in AI history. While a single perceptron can only solve linearly separable problems, stacking perceptrons into multi-layer networks (multilayer perceptrons, or MLPs) enables learning complex nonlinear relationships. Minsky and Papert's 1969 critique of perceptron limitations contributed to the first AI winter. The resurgence of neural networks in the 2010s vindicated the perceptron's foundational role in modern deep learning.
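To illustrate the stacking point above: a single perceptron cannot compute XOR, because XOR is not linearly separable, but two perceptron layers can. The weights below are chosen by hand for illustration rather than learned:

```python
def perceptron(weights, bias, x):
    """One artificial neuron: weighted sum plus bias, step activation."""
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias >= 0 else 0

def xor(a, b):
    """Two-layer MLP: XOR(a, b) = AND(OR(a, b), NAND(a, b))."""
    h_or = perceptron([1, 1], -1, (a, b))     # fires unless both inputs are 0
    h_nand = perceptron([-1, -1], 1, (a, b))  # fires unless both inputs are 1
    return perceptron([1, 1], -2, (h_or, h_nand))  # AND of the hidden units

print([xor(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 0]
```

The hidden layer carves the input space with two lines instead of one, which is exactly the extra expressive power Minsky and Papert showed a single perceptron lacks.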
