
Cerebras IPO to land amid chipmaker mania on Wall Street


The U.S. might be facing a testy economic backdrop, with the Iran War, worries about oil rationing, and a return of inflation, but the market continues to soldier onward and upward.

The market has clung to one balloon in particular: chipmakers. From its March lows, the VanEck Semiconductor ETF has rallied over 53%, with giants like Nvidia, AMD, Micron, and Intel among its 26 holdings helping to buoy the broader market.

This pocket of the market is to thank for fresh records in the Nasdaq and S&P 500. But is there room for one more to join in the fun? We’re about to find out on Thursday, when Cerebras Systems tests its luck in its market debut. The road here, of course, wasn’t without controversy.

Let’s talk numbers

Like many other hot AI stocks of the moment, Cerebras sells chips that help run AI models. However, it has chosen to focus on one corner of the market: inference, or the compute for running models once they’re trained. It purports to have an advantage in this category because it sells the “world’s largest and fastest AI processor.”

In 2025, the company made $510 million in revenue, up 76% year-over-year. Most of this revenue came from sources associated with the United Arab Emirates (UAE). However, new partnerships with Amazon and OpenAI suggest there’s more ahead for the California-based chipmaker.

The semi excitement has already found its way to the long-awaited listing, which was 20x oversubscribed. That’s led the company to raise its IPO price range to $150 to $160 per share, implying a valuation of about $48 billion at the top of the range. That’s nearly double the valuation it fetched in its last round as a private company.

It’s also upsized the offering from 28 million shares to 30 million, which would put it on track to raise about $4.8 billion. It would be far and away the largest listing we’ve seen so far in 2026.
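As a quick back-of-the-envelope sketch, using only the figures reported above, the proceeds math works out like this:

```python
# Proceeds math for the upsized offering, using the figures above.
low_price, high_price = 150, 160   # revised IPO range, USD per share
shares_offered = 30_000_000        # upsized from 28 million shares

proceeds_low = low_price * shares_offered
proceeds_high = high_price * shares_offered

print(f"Gross proceeds: ${proceeds_low/1e9:.1f}B to ${proceeds_high/1e9:.1f}B")
```

At the top of the range this lands at $4.8 billion, matching the figure above; pricing at the bottom would bring in closer to $4.5 billion.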

Size matters

When we’re talking about the world’s “largest” chip, that’s no joke. The company’s Wafer Scale Engine 3 (WSE-3) chip is almost half a square foot of AI chip, with over 4 trillion transistors, 125 petaflops of AI compute, and 900,000 AI-optimized cores. The company says that this is “19x more transistors and 28x more compute than Nvidia’s B200,” which itself is only a fraction of the size of the WSE-3.
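As an illustrative sanity check, the implied B200 figures below are back-derived from Cerebras’ own “19x” and “28x” claims, not from independently sourced Nvidia specs:

```python
# Back-of-the-envelope check of the spec ratios quoted above.
# The B200 figures are implied by Cerebras' "19x"/"28x" claims,
# not pulled from an Nvidia datasheet.
wse3_transistors = 4e12   # "over 4 trillion transistors"
wse3_pflops = 125         # "125 petaflops of AI compute"

implied_b200_transistors = wse3_transistors / 19
implied_b200_pflops = wse3_pflops / 28

print(f"Implied B200: ~{implied_b200_transistors/1e9:.0f}B transistors, "
      f"~{implied_b200_pflops:.1f} petaflops of AI compute")
```

The implied figures (roughly 210 billion transistors and a few petaflops) are at least in the ballpark of publicly discussed flagship GPU specs, which suggests the quoted multiples are internally consistent.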

For Cerebras, the size of WSE-3 is the advantage, allowing it to run models from “OpenAI, Cognition, and Meta at up to 3,000 tokens per second.” In partnership with Amazon Web Services, the company says it’s running these models “up to 70x faster than GPUs,” a shot across the bow of GPU builders like Nvidia and AMD. For this reason, it declares itself the “market leader in high-speed AI inference.”

However, the company’s unique “wafer-scale inference” offering also requires unique approaches. Whereas GPUs are a ‘plug and play’ solution for high-performance computing, WSE-3 is sort of the ‘star’ of a much bigger flagship system called CS-3. This is the actual product that Cerebras is selling, purpose-built to run large language models (LLMs) like OpenAI’s GPT-5.

Second time’s a charm

Cerebras’ original 2024 listing was punted after a national security review brought on by its reliance on the United Arab Emirates (UAE). At the time, there were worries about the company’s revenue concentration: 86% of its 2024 revenue came from UAE-related sources, and the UAE-backed G42 was also a small shareholder in the company. The Committee on Foreign Investment in the United States (CFIUS) ultimately OKed the deal.

However, waiting might have been in Cerebras’ best interest. It has diluted the ties that soured its initial offering by adding some significant new partners. In March, Cerebras inked a term sheet with Amazon to bring its “high-speed AI inference” onto Amazon Web Services (AWS). And last month, OpenAI committed to spend $20 billion on Cerebras chips via cloud deals, while earning an option to take an ownership stake in the company.

These moves have helped to diversify Cerebras’ revenue. It’s also not like Cerebras hasn’t continued to grow in value: in February, it raised $1 billion in fresh funding at a $23 billion valuation, underscoring the excitement for the unique product it offers in an increasingly crowded semiconductor space.

There are skeptics (as always)

The $150 to $160 price range might be a steep ask. After all, a nearly $48 billion valuation for a company that has not yet surpassed $1 billion in revenue invites a great deal of skepticism about execution and about its dependence on partners like OpenAI.

Cerebras also has drawbacks on the hardware side, even as it touts its strengths. While semiconductor industry watcher Vikram Sekar calls the company’s WSE an “extraordinary piece of engineering,” he details some of its bottlenecks in a hefty newsletter post.

Most of these problems with WSE-3 are structural. For example, wafer-size chips are more likely to have defects, requiring ‘redundant’ cores. There’s also an iceberg of other problems underneath:

- The size of the chip and the solutions it requires make it inherently less compatible with other systems, hence the need for CS-3.
- There are memory capacity problems: WSE-3 has just 44GB of on-chip SRAM, while NVIDIA’s H200 has 141GB of HBM3e memory.
- CS-3 demands more power than ordinary data center racks and is not modular like GPU clusters.
- Working with Cerebras hardware also requires a proprietary software stack, which might feel like a setback relative to Nvidia’s heavily entrenched CUDA.
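To put the memory gap above in proportion, a quick ratio using the capacities as quoted (note that on-chip SRAM and off-chip HBM are different memory tiers, so this compares capacity only, not speed):

```python
# Memory capacities as quoted above; capacity comparison only,
# since SRAM and HBM sit at different tiers of the hierarchy.
wse3_sram_gb = 44    # WSE-3 on-chip SRAM
h200_hbm_gb = 141    # NVIDIA H200 HBM3e

ratio = h200_hbm_gb / wse3_sram_gb
print(f"H200 carries ~{ratio:.1f}x the capacity of WSE-3's SRAM")
```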

For these reasons, some investors are skeptical about whether Cerebras could become truly disruptive to the established order. Add in the company’s increased dependence on OpenAI, which has faced skepticism around its finances and backed away from various data center deals, and you get a glimpse of the bear case.

However, one can assume that some of these problems might be solved by a forthcoming successor to WSE-3 and CS-3, such as a WSE-4 or CS-4. Whether its hardware superiority is enough might take time to ascertain, though.

