
Wall Street didn’t like what Google just revealed


Google (GOOGL) just gave Wall Street a reason to rethink the biggest AI trade available.

Alphabet’s Google Research said in a March 24 blog post that it had developed a new family of compression algorithms: TurboQuant, PolarQuant and Quantized Johnson-Lindenstrauss, or QJL.

All three aim to slash the memory required to run large language models and vector search systems.

In Google’s tests, TurboQuant cut key-value cache memory needs by at least six times while preserving accuracy. That raises the bigger question for investors: what happens to memory and storage demand if AI models become dramatically more efficient?

That question hit a nerve fast. Micron Technology (MU), Western Digital (WDC), Seagate Technology (STX) and SanDisk (SNDK) all moved lower as investors digested the possibility that AI workloads may not need as much firepower.

Market coverage tied the decline directly to Google’s breakthrough, which landed at a moment when AI infrastructure stocks had already enjoyed an enormous run on the belief that bigger models translate into more memory, more storage and more capex.

That is what made the reaction so alarming. Wall Street was not simply responding to a research blog. It was responding to the idea that part of the AI boom’s value could shift away from hardware suppliers and toward the companies finding ways to squeeze more performance out of the same infrastructure base.

For a trade built on scarcity, that is an alarming prospect.

“As AI becomes more integrated into all products, from LLMs to semantic search, this work in fundamental vector quantization will be more critical than ever,” Google research scientist Amir Zandieh and Google Fellow Vahab Mirrokni wrote in a company blog post.

Google’s TurboQuant hits the AI memory trade

Google framed TurboQuant as a solution to one of modern AI’s most painful bottlenecks: memory overhead.

As models process longer prompts and larger context windows, the memory needed to store key-value caches grows, which can slow inference and raise operating expenses.
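To see why context length drives memory costs so directly, a back-of-the-envelope sketch helps. The configuration below (layer count, head count, head dimension) is an illustrative mid-size transformer setup, not a figure Google published:

```python
def kv_cache_bytes(n_layers, n_kv_heads, head_dim, seq_len, bytes_per_value):
    """Rough size of a transformer's key-value cache for one sequence."""
    # Both keys and values are cached at every layer, hence the factor of 2.
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_value

# Illustrative model: 32 layers, 8 KV heads of dimension 128, fp16 values.
size = kv_cache_bytes(n_layers=32, n_kv_heads=8, head_dim=128,
                      seq_len=131_072, bytes_per_value=2)
print(f"{size / 2**30:.1f} GiB")  # → 16.0 GiB for one 128K-token context
```

The cache grows linearly with sequence length, so under these assumed numbers a six-times compression of the kind Google described would take this single-sequence cache from roughly 16 GiB to under 3 GiB.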

Traditional vector quantization can make that footprint smaller, but it often adds extra costs because systems still need to store quantization constants with high precision.
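The overhead the article refers to is easy to see in the simplest form of the technique. The sketch below is generic symmetric int8 quantization, not any of Google’s published algorithms: each compressed vector still carries one full-precision scale constant, which is exactly the kind of per-vector overhead that methods like TurboQuant aim to reduce:

```python
def quantize_int8(vec):
    """Compress a float vector to int8 codes plus one full-precision scale."""
    scale = max(abs(x) for x in vec) / 127.0 or 1.0  # fall back for all-zero input
    codes = [round(x / scale) for x in vec]          # each code fits in one byte
    return codes, scale

def dequantize_int8(codes, scale):
    """Approximate reconstruction of the original vector."""
    return [c * scale for c in codes]

# A dim-128 fp32 vector takes 512 bytes; its int8 codes take 128 bytes,
# but the fp32 scale adds 4 more -- the "quantization constant" overhead.
codes, scale = quantize_int8([0.5, -1.0, 0.25, 0.0])
approx = dequantize_int8(codes, scale)  # each entry within scale/2 of the original
```

For a single vector the 4-byte constant is negligible, but systems caching billions of short vectors pay it billions of times, which is why shaving that overhead matters at data-center scale.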

Related: Nvidia CEO makes bombshell call on AI’s next big thing

Google said TurboQuant addresses that weakness by combining PolarQuant for the main compression work with QJL for low-cost error correction.

That technical distinction is why the market reacted so strongly. For the past two years, investors have rewarded the thesis that artificial intelligence will keep forcing hyperscalers and model builders to buy more memory-rich systems, higher-capacity storage and more supporting infrastructure.

Google’s work does not eliminate that thesis. It does, however, complicate it by showing that software innovation can bend the hardware-demand curve.

When a sector is priced for relentless hardware intensity, even a hint of future efficiency can trigger substantial repricing.

There is still an important counterargument. TurboQuant remains research-stage technology; Google says it plans to present the work at ICLR 2026, while PolarQuant is slated for AISTATS 2026.

That means the selloff may have been driven as much by profit-taking in crowded positions as by any sudden change in end-market demand. And bulls still have a case: recent reports indicate hyperscaler infrastructure spending will remain enormous in 2026.

SanDisk added another twist to the story: the selloff landed on the same day news broke of a large strategic move in memory.

Nanya Technology said March 25 that SanDisk Technologies, a wholly owned subsidiary of SanDisk Corp., subscribed for 138.685 million common shares in Nanya’s private placement at NT$223.9 per share.

Nanya said the proceeds would be used for factory facilities and production equipment for advanced memory manufacturing to meet AI-driven computational demand.

SanDisk was the biggest investor in the roughly $2.5 billion fundraising and also inked a long-term DRAM supply deal with Nanya, according to the reports.

That sets up the most interesting split-screen in the story. On one side, Google’s new research suggests future AI models may require less memory per workload.

On the other, SanDisk is still spending real money to lock up memory supply for AI’s long-term growth. That is not something investors can ignore. The real debate now is what happens next in the AI infrastructure trade.

The deeper issue is whether AI remains primarily a hardware story or is becoming an optimization story. Until now, the market has overwhelmingly rewarded hardware beneficiaries, from memory makers to networking suppliers to GPU partners.

But Google is reminding investors that the biggest gains in AI economics may come from smarter compression, better routing, lower-cost inference and more efficient data handling. That does not end the buildout; it simply redistributes some of the profit pool.

That is why these stocks reacted so violently. Investors weren’t just trading news about one algorithm. They were betting that software might start moving faster than the market’s hardware assumptions. If that happens, the winners inside AI may still win big, but they will not win the same way.

Google sparks a fresh selloff in AI memory stocks


SanDisk and Micron now face a tougher AI narrative

For now, the cleanest read is not that Google broke the memory market. It is that Google disrupted the simplicity of the memory trade.

More Tech Stocks:

Morgan Stanley sets jaw-dropping Micron price target after event
Nvidia’s China chip problem isn’t what most investors think
Quantum Computing makes $110 million move nobody saw coming

Micron, Western Digital, Seagate and SanDisk all benefit from a straightforward narrative.

Related: Micron CEO drops a bombshell after Micron’s huge earnings beat

Larger models, heavier inference and more AI traffic should require more chips, more storage and higher spending across the data center stack. Make no mistake: that narrative still has plenty of room to run.

Micron’s own recent results showed that AI demand is still running hot, and recent reports indicate the big hyperscalers still plan heavy infrastructure spending in 2026.

The point is not that demand disappears. The point is that investors need to think hard about how much of that demand will be offset by efficiency gains on the model side.

That is when valuation gets harder. If AI keeps getting smarter while the memory needed per task shrinks, hardware makers may still post strong sales, just not the kind of steady growth investors had priced in.

That possibility matters most for stocks that have already run hard, because when the market sees a new reason to question the slope of future demand, crowded winners are usually the first to get hit. That is precisely what Google’s post on March 24 did.

Key takeaways on Google, Micron and SanDisk

Google Research introduced TurboQuant, PolarQuant and QJL on March 24 to reduce AI memory overhead.
Google said TurboQuant cut key-value cache memory needs by at least six times in its tests without sacrificing accuracy.
Memory and storage stocks including Micron, Western Digital, Seagate and SanDisk sold off as investors reassessed AI hardware demand assumptions.
SanDisk separately agreed to invest in Nanya and secure DRAM supply, signaling continued confidence in long-term memory demand.
The big market question is whether AI’s next gains flow more to hardware suppliers or to software and model companies that make infrastructure more efficient.

The AI memory trade is not dead. Not at all. But it is no longer as simple as “more models, more chips.”

Google just reminded Wall Street that software can shake things up as well.

That makes things harder for Micron, SanDisk and the rest of the group. They now have to show that demand growth can outpace efficiency gains on the model side. For investors, that means the next few quarters will be less about excitement and more about proof.

Related: Palantir just got access to something highly sensitive

