Aiconomy

Federated Learning

A privacy-preserving machine learning approach in which models are trained across multiple decentralized devices or servers without sharing raw data, so sensitive information never leaves its source.

Federated learning was pioneered by Google in 2016 to improve Android keyboard predictions without collecting users' typing data. Each device trains a local model on its own data, then sends only the model updates (not the data) to a central server, which aggregates them, typically by weighted averaging, into a new global model. The technique is increasingly adopted in healthcare (training on patient data across hospitals without sharing records), finance, and telecommunications; Apple uses federated learning to improve Siri. The approach helps satisfy data privacy regulations such as GDPR while still enabling collaborative model training.
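The train-locally-then-aggregate loop described above can be sketched in a few lines. This is a minimal illustrative simulation, not Google's production system: it assumes a simple linear model trained by local gradient descent on three hypothetical clients, with the server combining updates by dataset-size-weighted averaging (the idea behind the FedAvg algorithm). All names and parameters here are invented for illustration.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    # Each client trains on its own data; the raw (X, y) never leave the device.
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_average(client_weights, client_sizes):
    # The server sees only model parameters, weighted by local dataset size.
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Hypothetical setup: three clients, each holding a private shard of data
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(20):  # 20 communication rounds
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = federated_average(updates, [len(y) for _, y in clients])

print(global_w)  # converges toward the true weights without pooling any data
```

In each round, clients start from the current global model, train locally, and upload only their parameters; the server's average becomes the next global model. Real deployments add secure aggregation and differential privacy on top of this basic loop.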
