# News

A $10B Vote of Confidence: Why OpenAI is Backing Cerebras’ AI Hardware


Date: January 15, 2026

AI chipmaker Cerebras, known for its wafer-scale AI processors, has secured a multiyear computing partnership with OpenAI valued at more than $10 billion.

This deal marks one of the largest infrastructure agreements in the rapidly expanding AI computing sector. OpenAI is set to purchase up to 750 megawatts of computing power over three years.

Cerebras’ powerful processors have long been discussed as direct competition to NVIDIA, which sells its chips to large cloud providers like Microsoft. While NVIDIA’s hardware relies on massive clusters of thousands of GPUs working together, Cerebras takes a radically different approach by packing an entire AI supercomputer onto a single wafer-scale chip.

This design reduces the need for complex networking between chips, cuts latency, and allows large AI models to run more efficiently, positioning Cerebras as one of the few challengers to NVIDIA’s long-standing leadership in AI computing.

Sam Altman and Andrew Feldman Gearing Up for the Future

Following the partnership, Cerebras CEO Andrew Feldman took to LinkedIn to call it “a decade in the making,” saying the collaboration reflects a long-held belief that hardware architecture and model scale would eventually need to converge.

“That point has arrived,” Feldman said, noting that the multi-year agreement will deploy 750 megawatts of Cerebras wafer-scale systems beginning in early 2026. Emphasizing why performance now matters more than ever, Feldman added, “As models grow more capable, speed becomes the bottleneck. Slow systems limit what users can do and whether AI becomes infrastructure or remains a novelty.”

Cerebras’ wafer-scale design, which keeps computation and memory on a single processor, delivers up to 15× faster inference than traditional GPU systems, enabling AI experiences that feel “instantaneous” and, in his words, positioning 2026 as “a defining year” for both Cerebras and the future of large-scale AI deployment.

Further highlighting the strategic importance of the deal, Sachin Katti of OpenAI said the company is building “a resilient portfolio that matches the right systems to the right workloads,” adding that Cerebras brings “a dedicated low-latency inference solution” that will enable “faster responses, more natural interactions, and a stronger foundation to scale real-time AI to many more people.”

From Breakthrough Hardware to Public Markets: What’s Next for Cerebras?

It was recently reported that Cerebras was in talks to raise another billion dollars at a $22 billion valuation. While the company has postponed its IPO several times, it may finally move forward with a public listing soon.

As the OpenAI–Cerebras partnership moves into execution, both companies say the focus is no longer on proving what AI can do, but on delivering it at speed and scale. The collaboration is set to become the largest high-speed AI inference rollout in the world, aimed at making advanced AI more responsive and widely accessible.

For Cerebras, the collaboration represents a gateway to its most ambitious chapter yet. “Just as broadband transformed the internet, real-time inference will transform AI,” Feldman said, pointing to a future where speed unlocks entirely new ways to build, deploy, and interact with intelligent systems.

With capacity coming online in multiple tranches through 2028, the OpenAI–Cerebras alliance signals a long-term bet on the future of AI. It points to a world where AI is not only more powerful, but also faster, more scalable, and increasingly embedded in everyday digital life.

By Manish
