{"id":5216,"date":"2026-04-30T19:38:15","date_gmt":"2026-04-30T19:38:15","guid":{"rendered":"https:\/\/stock999.top\/?p=5216"},"modified":"2026-04-30T19:38:15","modified_gmt":"2026-04-30T19:38:15","slug":"big-techs-700-billion-ai-spending-spree-has-no-clear-end-in-sight","status":"publish","type":"post","link":"https:\/\/stock999.top\/?p=5216","title":{"rendered":"Big Tech\u2019s $700 billion AI spending spree has no clear end in sight"},"content":{"rendered":"<p><img src=\"https:\/\/fortune.com\/img-assets\/wp-content\/uploads\/2026\/03\/Meta-Datacenter.jpg?w=2048\" \/><\/p>\n<p>Welcome to Eye on AI, with AI reporter Sharon Goldman. In this edition: SoftBank plans to list a new AI and robotics company in the US\u2026AI model\u2019s goblin habit, explained\u2026Putting Google\u2019s AI to the test as a trip planner.<\/p>\n<p>If Big Tech\u2019s AI spending spree were a climb up Mount Everest, the companies would still be ascending toward the summit, getting dizzy from the altitude.<\/p>\n<p>In their latest quarterly earnings reports, Alphabet, Amazon, Meta, and Microsoft put combined capital expenditures at more than $130 billion for the quarter, driven by buildouts of data centers and other infrastructure. That spending could surpass $700 billion this year, up sharply from about $410 billion last year. While only Alphabet has explicitly pointed to further increases beyond this year, all four companies signaled sustained high levels of investment as demand for AI infrastructure continues to grow.<\/p>\n<p>The market reaction has been mixed. Shares of Meta fell sharply after its earnings report as investors focused on the scale of its AI spending plans, and Microsoft also slipped. 
By contrast, Alphabet and Amazon rose on strong cloud growth\u2014highlighting a growing divide on Wall Street over whether this buildout is justified or getting ahead of itself.<\/p>\n<p>There\u2019s no doubt that AI companies\u2014from the hyperscalers to startups like OpenAI and Anthropic\u2014are hungry, if not starving, for more computing power. The scale of today\u2019s AI systems, which require far more hardware, energy, and coordination than earlier generations of software, means that more is almost never enough. The result is a surge in spending unlike anything the industry has seen before: McKinsey research from last year projected that AI capex will require $6.7 trillion worldwide by 2030 to keep pace with demand for compute power.\u00a0<\/p>\n<p>Spending big on physical infrastructure<\/p>\n<p>It\u2019s important to understand how much of that spending is going directly into the physical infrastructure that supports AI\u2014both training frontier models and running them. But it can be hard to wrap your mind around the scale of this buildout.\u00a0<\/p>\n<p>It starts with chips\u2014the specialized semiconductors designed to perform the calculations used in AI. A single GPU from Nvidia, for example, can cost up to $40,000. But companies don\u2019t buy them one at a time; they buy systems. An eight-GPU server can cost hundreds of thousands of dollars, and the clusters needed for hyperscale AI data centers\u2014made up of thousands or even hundreds of thousands of GPUs\u2014can run into the billions.<\/p>\n<p>Then there are the data centers that house and power those systems. Pack tens or hundreds of thousands of GPUs into a cluster of buildings spread across hundreds or thousands of acres, and the result starts to look less like a traditional tech investment and more like a utility-scale project\u2014consuming as much electricity as a small city. 
Last month, I looked closely at Meta\u2019s $27 billion Hyperion data center project in northeast Louisiana, which some estimate will use millions of GPUs. <\/p>\n<p>Another key piece is networking\u2014the cables and switches that connect thousands of chips so they can work together. Training and running modern AI models requires constant, high-speed communication between machines, using specialized switches, fiber optic or Ethernet connections, and network cards. Without that, even the most powerful chips can\u2019t do much.<\/p>\n<p>Not everyone agrees spending will keep climbing<\/p>\n<p>Still, not everyone is convinced. Some investors and analysts see the buildout as a gamble, warning of a potential overbuild in which companies pour money into infrastructure that runs too far ahead of demand. There are still plenty of headlines predicting an AI \u201creckoning.\u201d And as my colleague Shawn Tully has pointed out, the fast-depreciating nature of AI hardware means that there are even greater costs coming down the pike.<\/p>\n<p>But this AI spending race is now in its third year and still shows no signs of slowing. In 2024, the combined capex of the four biggest hyperscalers was just over $200 billion. Two years later, it\u2019s on track to approach $700 billion.<\/p>\n<p>If this is a climb, there\u2019s still no clear view of the summit.<\/p>\n<p>With that, here\u2019s more AI news.<\/p>\n<p>Sharon Goldman<br \/>sharon.goldman@fortune.com <br \/>@sharongoldman<\/p>\n<p>FORTUNE ON AI<\/p>\n<p>Microsoft, Meta, and Google just announced billions more in AI spending. 
Only Google convinced investors it\u2019s paying off \u2013 by Amanda Gerut<\/p>\n<p>Half of Google\u2019s and Amazon\u2019s \u2018blowout AI profits\u2019 came from a stake in Anthropic\u2014not from their actual business \u2013 by Eva Roytburg<\/p>\n<p>AWS CEO Matt Garman sees huge business opportunity for Amazon in AI-powered software: \u2018Everything is going to be remade\u2019 \u2013 by Alexei Oreskovic<\/p>\n<p>China\u2019s decision to block the $2 billion Meta-Manus deal shows how far Washington and Beijing are drifting apart over AI \u2013 by Nicholas Gordon<\/p>\n<p>AI IN THE NEWS<\/p>\n<p>SoftBank plans to list new AI and robotics company in the US. The Financial Times reported that SoftBank Group is preparing to spin out and take public a new AI and robotics company called \u201cRoze,\u201d targeting a valuation of up to $100 billion in what would be one of the largest AI IPOs to date. The venture is expected to focus on the physical buildout of AI infrastructure\u2014using robotics to help construct data centers and bundling together SoftBank\u2019s existing bets in energy, land, and digital infrastructure\u2014as CEO Masayoshi Son doubles down on \u201cphysical AI\u201d as the next frontier. The IPO could come as early as the second half of 2026, part of a broader effort to capitalize on surging investor demand for AI while also helping SoftBank manage its massive financial commitments, including tens of billions invested in OpenAI and other large-scale infrastructure projects.<\/p>\n<p>AI model&#8217;s goblin habit, explained. After questions arose about the odd tendency of OpenAI models to reference goblins, gremlins, and similar creatures, the company put out a blog post today acknowledging the problem and saying that it wasn&#8217;t random but a side effect of how the models were trained. 
The behavior first appeared after the GPT-5.1 launch, when the reinforcement learning process used to create the model&#8217;s \u201cNerdy\u201d personality mode\u2014one of several distinct personalities OpenAI began offering users with the roll-out of that model\u2014rewarded whimsical metaphors, including those specifically referencing the mythical creatures. Because of the way this reinforcement learning process works, the linguistic tic seeped into other model personality types too. Even after the Nerdy personality was removed, the habit persisted in later models like Codex because training had already baked it in. The episode is a small but telling example of how subtle reward signals can shape model behavior in unpredictable ways.\u00a0<\/p>\n<p>Putting Google&#8217;s AI to the test as a trip planner. I&#8217;m always interested in how AI is progressing in its ability to help with travel plans. In a New York Times column, author Brian X. Chen put Google&#8217;s Gemini to the test. He found that AI is getting meaningfully better at handling complex, multi-step tasks like trip planning\u2014but still falls short of full autonomy. Gemini\u2019s integration with Google services like Flights, Hotels, Gmail, and Maps allows it to act as a kind of \u201cAI travel agent,\u201d quickly generating itineraries, packing lists, and personalized recommendations that saved significant time and effort. But the system remains inconsistent: it made basic errors (like omitting essentials from packing lists) and struggled with real-time context, such as confusing locations across different legs of a trip. 
The takeaway remains: AI models are useful, but still require human oversight, particularly when context, timing, and accuracy really matter.\u00a0<\/p>\n<p>EYE ON AI NUMBERS<\/p>\n<p>75%<\/p>\n<p>That&#8217;s how many tech leaders agree that their operating models and processes need to change in the next 12 to 18 months in order to drive greater value from AI, according to Deloitte&#8217;s new 2026 Global Tech Leadership Study.\u00a0<\/p>\n<p>But in a sign that there is a widening gap between ambition and capability in scaling AI, the same survey found that 80% of tech leaders are confident in their organization&#8217;s ability to deploy and govern AI capabilities at scale. Confidence, Deloitte emphasized, appears to be surging ahead of readiness.\u00a0<\/p>\n<p>AI CALENDAR<\/p>\n<p>June 8-10: Fortune Brainstorm Tech, Aspen, Colo. Apply to attend\u00a0here.<\/p>\n<p>June 17-20: VivaTech, Paris.<\/p>\n<p>July 6-11: International Conference\u00a0on Machine Learning (ICML), Seoul, South Korea.<\/p>\n<p>July 7-10: AI\u00a0for Good Summit, Geneva, Switzerland.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Welcome to Eye on AI, with AI reporter Sharon Goldman. 
In this edition: SoftBank plans&#8230;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[245],"tags":[876,291,237,552,5916,1862,2400,813,879,2559,667,884,8538],"_links":{"self":[{"href":"https:\/\/stock999.top\/index.php?rest_route=\/wp\/v2\/posts\/5216"}],"collection":[{"href":"https:\/\/stock999.top\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/stock999.top\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/stock999.top\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/stock999.top\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=5216"}],"version-history":[{"count":0,"href":"https:\/\/stock999.top\/index.php?rest_route=\/wp\/v2\/posts\/5216\/revisions"}],"wp:attachment":[{"href":"https:\/\/stock999.top\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=5216"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/stock999.top\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=5216"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/stock999.top\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=5216"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}