Cerebras

An AI chip startup that builds the world's largest processor, the Wafer-Scale Engine, designed specifically for training and running AI models at high speed.

Cerebras' WSE-3 chip contains 4 trillion transistors spread across an entire silicon wafer (approximately 46,000 square millimeters of die area), compared to NVIDIA's H100 at roughly 800 square millimeters. The company's CS-3 system can train LLMs at speeds competitive with large GPU clusters while using significantly less power. Cerebras has raised over $700 million and has partnered with organizations including the Mayo Clinic and the UAE's G42. The company also offers cloud-based inference that it positions as delivering the fastest per-token LLM output speeds commercially available.
