Nvidia rival Cerebras discloses US IPO filing as AI boom drives listings

by Chief Editor

Breaking the GPU Monopoly: The Rise of Specialized AI Hardware

For years, the AI gold rush has been dominated by a single player. Nvidia’s graphics processing units (GPUs) became the industry standard, but a new architectural approach is attempting to challenge that hegemony. Cerebras Systems is positioning itself not just as another chipmaker, but as a fundamental shift in how AI is powered.


Unlike traditional GPUs, Cerebras utilizes a “wafer-scale engine.” This design is specifically engineered to eliminate high-bandwidth memory bottlenecks, which have long been a primary constraint in scaling artificial intelligence. By focusing on inference—the process where AI systems generate responses to user queries—Cerebras is targeting the most computationally expensive part of the generative AI lifecycle.

Did you know? Cerebras’ wafer-scale engine is designed to speed up both the training and inference of large AI models, offering a direct alternative to the GPU-heavy infrastructure used by most AI labs.

As the industry moves toward wider generative AI adoption, the demand for specialized hardware that can handle massive models more efficiently is expected to grow. This shift suggests a future where “one size fits all” hardware is replaced by a diversified ecosystem of specialized silicon.

From Hardware Vendor to Cloud Powerhouse

One of the most significant trends emerging from the Cerebras model is the transition from selling physical chips to providing “Compute-as-a-Service.” While the company spent years trying to sell hardware directly, it has pivoted to operating its chips within its own data centers as a cloud service.


This strategic pivot places Cerebras in direct competition with some of the largest tech giants in the world, including Microsoft, Amazon, Alphabet, Oracle, and CoreWeave.

By controlling the infrastructure, AI chipmakers can ensure their hardware is optimized for the software running on it, creating a vertically integrated stack that can potentially offer better performance and lower costs for clients than generic cloud environments.

The OpenAI Factor: Redefining Infrastructure Partnerships

The relationship between Cerebras and OpenAI signals a massive shift in how AI giants secure their computing future. The two have entered a multi-year deal valued at over $20 billion, under which OpenAI will deploy 750 megawatts of Cerebras chips through 2028.

This partnership is more than just a customer-vendor relationship; it is a deep financial integration. OpenAI provided a $1 billion loan to Cerebras and received a warrant to purchase company stock, effectively betting on the success of the hardware it uses.

Pro Tip: When analyzing AI infrastructure companies, look beyond current revenue to “remaining performance obligations.” As of December 31, Cerebras reported $24.6 billion in such obligations, providing a clearer picture of future revenue streams.

This trend indicates that the largest AI developers are no longer willing to rely on a single hardware provider. Diversifying their compute sources is now a matter of operational survival.

Navigating the Geopolitical Minefield of AI

The path to the public market has not been without hurdles. Cerebras’ journey highlights the increasing intersection of AI technology and national security. The company previously faced delays due to a U.S. national security review concerning G42, a UAE-based tech conglomerate.


G42 was once a dominant revenue source, contributing 87% of revenue in the first half of 2024. However, U.S. authorities raised concerns that Middle Eastern investments could give China access to advanced American AI technology. While Cerebras obtained clearance from the Committee on Foreign Investment in the United States (CFIUS) in 2025, the episode serves as a case study for the risks associated with global AI supply chains.

The company has since successfully diversified its revenue. In 2025, G42’s contribution dropped to 24%, while the Mohamed bin Zayed University of Artificial Intelligence provided 62% of revenue, showing a shift toward institutional and academic partnerships.

Financial Trajectory and Market Outlook

The financial turnaround of the company provides a glimpse into the scalability of specialized AI hardware. In 2025, Cerebras reported $510 million in revenue—a nearly 76% increase from 2024. More impressively, it swung from a $485 million net loss in 2024 to a net income of $87.9 million in 2025.
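A quick back-of-the-envelope check puts these figures in context. The sketch below (a reader's calculation, not from the filing itself) treats the article's "nearly 76%" growth rate as exactly 76% to estimate the implied 2024 revenue, and computes the total bottom-line swing from loss to profit:

```python
# Sanity check on the reported Cerebras figures.
# All dollar amounts in millions; growth rate taken from the article.
revenue_2025 = 510.0          # reported 2025 revenue
growth_rate = 0.76            # "nearly 76% increase from 2024"

# Implied 2024 revenue from the stated growth rate.
revenue_2024 = revenue_2025 / (1 + growth_rate)

net_2024 = -485.0             # 2024 net loss
net_2025 = 87.9               # 2025 net income
swing = net_2025 - net_2024   # total bottom-line improvement

print(f"Implied 2024 revenue: ~${revenue_2024:.0f}M")
print(f"Net income swing: ${swing:.1f}M")
```

The implied 2024 revenue comes out to roughly $290 million, and the year-over-year bottom-line swing is about $573 million — a useful scale reference when comparing against the $24.6 billion in remaining performance obligations mentioned above.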


With a targeted Nasdaq listing under the ticker “CBRS” and a valuation estimated between $22 billion and $25 billion, Cerebras is testing whether the market is ready to value AI hardware companies based on their ability to disrupt the established GPU order.

AI Infrastructure FAQ

What makes Cerebras chips different from Nvidia’s?

Cerebras uses a wafer-scale engine designed to avoid high-bandwidth memory bottlenecks, focusing heavily on the efficiency of AI inference.

What is the nature of the Cerebras-OpenAI deal?

It is a multi-year compute deal valued at over $20 billion, providing OpenAI with up to 750 megawatts of computing power through 2028.

Who are the primary competitors in this space?

Beyond Nvidia, Cerebras competes with cloud providers like Amazon, Microsoft, Alphabet, Oracle, and CoreWeave.

Why was the IPO delayed previously?

The delay was linked to a U.S. national security review (CFIUS) regarding investments from G42, a UAE-based conglomerate.


What do you believe? Will specialized hardware like the wafer-scale engine eventually replace GPUs for AI inference, or will Nvidia’s ecosystem remain too dominant to disrupt? Share your thoughts in the comments below or subscribe to our newsletter for more deep dives into the future of AI infrastructure.
