
OpenAI Doubles Down on Cerebras: $20 Billion Deal Includes Equity Stake and $1 Billion for Data Centers

OpenAI is committing more than $20 billion to Cerebras over three years — doubling a prior arrangement — and taking a potential 10% equity stake in the chip startup as it builds an inference stack independent of Nvidia.


By Hector Herrera | April 17, 2026 | Business

OpenAI has agreed to pay chip startup Cerebras more than $20 billion over three years for servers built on its wafer-scale chips — more than doubling a previously reported $10 billion arrangement. The expanded deal includes warrants that could give OpenAI up to a 10% equity stake in Cerebras, plus a separate $1 billion earmarked for new data center construction. According to The Information (via Yahoo Finance), this is now one of the largest compute procurement commitments in AI history.

The core implication: OpenAI is deliberately building an inference stack that reduces its dependency on Nvidia.

What Cerebras Actually Makes

Cerebras builds what it calls wafer-scale chips — processors manufactured from a single, uncut silicon wafer rather than the smaller dies that standard GPUs are cut from. Its current WSE-3 chip contains 4 trillion transistors and 900,000 AI-optimized compute cores. That is a fundamentally different form factor from an Nvidia H100 or H200.

The tradeoff is real: wafer-scale chips are expensive to manufacture, harder to cool, and less flexible for the training workloads that require distributing computation across hundreds or thousands of standard GPUs. Where Cerebras claims a genuine advantage is inference — the work of generating outputs from an already-trained model in real time. For large language models running at scale, Cerebras argues its chips can deliver responses significantly faster than GPU alternatives.

That matters enormously to OpenAI. Every ChatGPT response, every API call, every operator integration is an inference request. Inference speed and cost per token are the two variables that most directly affect OpenAI's margins at scale.
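The margin arithmetic behind that claim is simple to sketch. The snippet below uses entirely hypothetical numbers (neither OpenAI nor Cerebras has disclosed pricing or throughput figures for this deal); it only illustrates why throughput per dollar of hardware is the lever that inference-focused chips pull.

```python
# Hypothetical sketch of inference unit economics.
# All numbers are illustrative placeholders, not figures from the deal.

def cost_per_million_tokens(hourly_hw_cost: float, tokens_per_sec: float) -> float:
    """Hardware cost (USD) to generate one million output tokens."""
    tokens_per_hour = tokens_per_sec * 3600
    return hourly_hw_cost / tokens_per_hour * 1_000_000

# Same hourly hardware cost, different throughput: faster inference
# hardware directly lowers the cost of serving each token.
baseline = cost_per_million_tokens(hourly_hw_cost=10.0, tokens_per_sec=1_000)
faster = cost_per_million_tokens(hourly_hw_cost=10.0, tokens_per_sec=4_000)
print(f"baseline: ${baseline:.2f}/M tokens, faster: ${faster:.2f}/M tokens")
```

At identical hardware cost, a 4x throughput advantage cuts per-token cost by 4x — which is why a chip that is worse for training can still be the cheaper way to serve every ChatGPT response.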

The Deal, Piece by Piece

Per The Information's reporting, the expanded commitment includes:

  • $20 billion+ in payments to Cerebras for server capacity over three years
  • Warrants giving OpenAI the right to acquire up to 10% of Cerebras — an equity component that was not part of the original deal
  • $1 billion specifically designated to fund construction of new data centers capable of hosting Cerebras hardware at scale

The prior reported figure was $10 billion. The new deal more than doubles that. The equity piece is the structural change: it converts OpenAI from a pure customer into a partial financial stakeholder.

Why This Pressures Nvidia

Nvidia controls an estimated 70–80% of the AI accelerator market. For companies like OpenAI that run inference continuously at massive scale, that concentration means limited pricing leverage and perpetual supply constraints. Nvidia sets the terms; buyers absorb them.

Cerebras doesn't threaten Nvidia's dominance in training — the distributed, cluster-based workloads where Nvidia's GPU ecosystem is most entrenched. But inference is a different battlefield, and it's the one that matters most to OpenAI's operating costs. Routing a meaningful share of inference workloads to Cerebras hardware gives OpenAI an alternative with its own committed supply chain — one OpenAI now has a financial incentive to see succeed.

The equity structure reinforces that alignment. If Cerebras scales as a business, OpenAI's warrants appreciate. If OpenAI routes more inference to Cerebras, Cerebras generates more revenue to fund chip development and data center buildout. The incentives point in the same direction.

What to Watch

1. A revived Cerebras IPO. Cerebras filed for a public offering in 2024 before withdrawing amid regulatory scrutiny over a large Saudi customer. A $20 billion committed revenue stream from OpenAI — plus an equity stake held by one of AI's most recognized names — substantially changes the company's IPO narrative. Watch for a refiled S-1.

2. Nvidia's inference roadmap. Nvidia's Blackwell Ultra and the forthcoming Rubin architecture are both designed with inference efficiency in mind. Any acceleration in that roadmap should be read as a direct competitive response to Cerebras gaining traction with a customer of OpenAI's scale and visibility.


Hector Herrera is the founder of Hex AI Systems and editor of NexChron.
