{"id":5711,"date":"2026-05-06T23:22:16","date_gmt":"2026-05-06T23:22:16","guid":{"rendered":"https:\/\/stock999.top\/?p=5711"},"modified":"2026-05-06T23:22:16","modified_gmt":"2026-05-06T23:22:16","slug":"fomo-has-proven-a-stronger-incentive-than-poor-stock-performance-goldman-sachs-finds-insecurity-is-a-key-part-of-the-ai-boom","status":"publish","type":"post","link":"https:\/\/stock999.top\/?p=5711","title":{"rendered":"&#8216;FOMO has proven a stronger incentive than poor stock performance&#8217;: Goldman Sachs finds insecurity is a key part of the AI boom"},"content":{"rendered":"<p><img src=\"https:\/\/fortune.com\/img-assets\/wp-content\/uploads\/2026\/05\/GettyImages-2274018153-e1778087808765.jpg?w=2048\" \/><\/p>\n<p>The numbers coming out of Wall Street\u2019s most influential research shop tell a story that Silicon Valley would rather not hear.<\/p>\n<p>In two separate reports published in April, Goldman Sachs analysts examined the great AI infrastructure build-out from opposite ends of the telescope \u2014 one team studying how much the machine will cost to build, another studying whether the machine is actually working \u2014 and arrived at a rare institutional moment: two wings of a single firm arguing, simultaneously, that the machine costs more than anyone knows and produces less than anyone admits.<\/p>\n<p>Notably, it is not the first time Goldman has said something like this. James Covello, the firm\u2019s head of global equity research, has been one of Wall Street\u2019s most prominent and consistent AI skeptics since he co-authored the original \u201cToo Much Spend, Too Little Benefit?\u201d report in June 2024 \u2014 a piece that landed like a thunderclap precisely because it came from inside one of the institutions most deeply enmeshed in financing the boom it was questioning. 
Goldman advises hyperscalers, underwrites chip company offerings, and sits at the table with the companies building the very infrastructure Covello was interrogating.<\/p>\n<p>Two years later, Covello is back with an update. He was wrong about some things, he acknowledges. But on the central question \u2014 whether the spending is producing commensurate returns \u2014 he has only gotten more convinced.<\/p>\n<p>Trillion-dollar bill<\/p>\n<p>Start with the cost. The Goldman Sachs Global Institute, which is not part of the bank\u2019s research arm, issued a report titled \u201cTracking Trillions,\u201d projecting roughly $7.6 trillion in cumulative AI capital expenditure between 2026 and 2031 \u2014 covering chips, data centers, and power infrastructure. Annual spending is expected to more than double over that period, from $765 billion this year to $1.6 trillion by 2031.<\/p>\n<p>Those figures, the report is careful to note, are not forecasts. They are baseline estimates \u2014 and extremely sensitive ones at that. Change a single assumption about how quickly AI chips become obsolete, and cumulative spending swings by hundreds of billions of dollars. Build the next generation of data centers at $19 million per megawatt instead of $15 million, and total data center costs balloon by more than $500 billion over the projection period. That sensitivity is the report\u2019s central message: the $4 trillion to $8 trillion figures that have \u201cfeatured prominently in recent market commentary\u201d are \u201cfar more conditional than they appear.\u201d<\/p>\n<p>The physical reality underlying those numbers is staggering in its own right. Today\u2019s leading AI systems pack 72 processors into a single rack, connected by hundreds of thousands of kilometers of cabling. 
The facilities housing them require industrial-scale liquid cooling, dedicated power delivery, and redundancy systems that didn\u2019t exist in conventional data center design a decade ago. A standard cloud data center from the 2010s might have been built at $10 million per megawatt. The next generation of AI-optimized facilities costs $15 million to $20 million, and some facilities built just two years ago are already considered insufficiently equipped for the chips being manufactured today.<\/p>\n<p>What is the return on investment?<\/p>\n<p>Then there\u2019s Covello\u2019s perspective.<\/p>\n<p>Covello writes that he spent two years tracking what all that investment is actually producing for the companies deploying it. His findings do not make for comfortable reading in the boardrooms of companies that have staked their technology roadmaps on artificial intelligence.<\/p>\n<p>Covello cited the influential MIT Labs report, as reported by Fortune, which found that despite $30 billion to $40 billion in enterprise investment in generative AI, 95% of organizations were getting zero return on their AI pilots. A 2025 EY survey found that 99% of companies in its sample reported financial losses due to AI-related risks, with an average loss of $4.4 million per company. A Wall Street Journal survey found a yawning gap between what C-suites say AI is doing for productivity and what workers on the ground actually report. One AI hiring startup tested frontier AI agents on 480 workplace tasks commonly performed by bankers, consultants, and lawyers. Every agent failed to complete most of its tasks.<\/p>\n<p>\u201c56% of Americans say they use AI,\u201d the report quotes one research firm saying, \u201cyet 85% of the workforce does not have a value-driving AI use case.\u201d<\/p>\n<p>IT budgets, rather than shrinking as executives promised shareholders, are growing. Gartner projects global IT spending to rise from $5 trillion in 2024 to $6.15 trillion in 2026. 
The cost savings have not materialized. Harvard Business Review research cited in the report found that AI-generated errors \u2014 what researchers are calling \u201cworkslop\u201d \u2014 cost a 10,000-person organization more than $9 million annually in lost productivity. Far from efficiencies, AI appears to be generating new headaches and expenses in many cases.<\/p>\n<p>Nvidia: the AI economy\u2019s big winner<\/p>\n<p>Somewhere between the two reports lies what may be the defining structural problem of the AI era: almost none of the money flowing into the AI ecosystem is being captured by the companies deploying it. Nearly all of it is flowing to Nvidia.<\/p>\n<p>\u201cTracking Trillions\u201d anchors its entire baseline model to Nvidia\u2019s forward revenue estimates, noting that the chip giant accounts for roughly 75% of total compute spend \u2014 at gross margins of approximately 75%, far above any competitor, essentially acknowledging that the AI economy as currently constructed is a revenue model for one company.<\/p>\n<p>Covello is less diplomatic about what this means. Semiconductor companies, he writes, \u201care supposed to thrive when their customers thrive [but] in this cycle, the chip companies are thriving at the expense of everyone above them in the chain.\u201d Since the launch of ChatGPT, Nvidia\u2019s net income has grown roughly 20x. The hyperscalers \u2014 Microsoft, Amazon, Google, Meta \u2014 have seen far more modest gains, while enterprises and model companies have been losing money. \u201cSomething has to change with this dynamic,\u201d Covello writes, \u201ceither the companies higher in the chain need to start earning a return on investment or they will eventually need to spend less on the chips that are powering this build.\u201d<\/p>\n<p>FOMO is the motivator<\/p>\n<p>And yet the spending continues. 
Which brings us to perhaps the most remarkable finding from Goldman: the engine driving the fifth industrial revolution does not appear to be a rational capital allocation process. It is insecurity, if not outright fear.<\/p>\n<p>Covello had explicitly predicted in his original 2024 report that if hyperscaler stocks underperformed the market for a sustained period, those companies would cut their AI capital expenditure. The opposite has happened. Microsoft, Amazon, Google, and Meta have dramatically\u00a0increased\u00a0their spending on AI infrastructure even as their stocks have lagged the S&amp;P 500. Hyperscalers have burned through all their free cash flow from operations and are now issuing debt to fund the build-out. Data center debt issuance doubled to $182 billion in 2025 alone.<\/p>\n<p>Covello\u2019s diagnosis: \u201cFOMO has proven a stronger incentive than poor stock performance as hyperscalers have prioritized being involved in the AI arms race over their current shareholders.\u201d<\/p>\n<p>The word \u201carms race\u201d is doing a lot of work in that sentence. Arms races, by definition, are not about winning \u2014 they are about not losing. No hyperscaler CEO is racing to build data centers because the ROI spreadsheet demands it. They are racing because the cost of being wrong about AI \u2014 of sitting it out and watching a competitor transform the industry \u2014 feels existentially higher than the cost of burning through cash on infrastructure that may never fully pay for itself. It is a very human insecurity driving what may be the largest coordinated capital deployment in corporate history.<\/p>\n<p>The \u201cTracking Trillions\u201d report captures the supply-side version of the same dynamic. When physical bottlenecks \u2014 power interconnection queues, transformer shortages, specialized labor constraints \u2014 slow data center deployment, companies don\u2019t scale back their ambitions. 
They work around the constraints, building behind-the-meter power generation, duplicating capacity, absorbing inefficiency rather than reconsidering the underlying bet. \u201cElongation,\u201d the report calls it: the buildout stretches, costs rise, and the gap between capital committed and capacity online widens \u2014 but the commitment itself holds, because no one wants to be the company that blinked.<\/p>\n<p>The big risk in an elongation scenario, the institute argues, is that bottlenecks prove severe or persistent enough to shift the buildout narrative. When enough projects fail simultaneously, the focus suddenly shifts to whether the buildout can actually succeed within the timeline. \u201cAt that point, elongation begins to function as a feedback loop, one in which supply-side friction introduces demand-side doubt, potentially leading to deferred or downsized investment plans.\u201d That being said, the institute finds the current environment closer to the base case than the stress case, \u201cthough the buffer is not wide.\u201d<\/p>\n<p>It hardly needs to be said that insecurity and fear are the stuff that bubbles are built on. The jitters over an AI bubble in 2025 appeared to deflate with the successful release of Google\u2019s new Gemini model and the growing influence of Anthropic\u2019s Claude. But we aren\u2019t out of the woods yet, Goldman points out.<\/p>\n<p>The jobs didn\u2019t disappear<\/p>\n<p>One area where Goldman explicitly revises its prior pessimism is jobs \u2014 though not in the direction AI boosters would prefer.<\/p>\n<p>The firm\u2019s macro team found that while AI has measurably reduced hiring in substitution-heavy occupations \u2014 telephone operators, insurance claims clerks, billing processors \u2014 it has modestly increased employment in augmentation-heavy fields like engineering and operations management. The net drag: roughly 16,000 jobs per month, and a 0.1 percentage point bump to the unemployment rate. 
Goldman\u2019s baseline projects that AI could ultimately displace 6% to 7% of jobs as adoption broadens over the next decade \u2014 meaningful, but nowhere near the \u201cAI will replace 50% of jobs\u201d headlines that have dominated public discourse.<\/p>\n<p>Those headlines \u201cwill most likely persist,\u201d Covello notes drily, \u201cas that drives clicks and views.\u201d<\/p>\n<p>The job finding is, in its own way, a symptom of the same broader problem: AI has been most effective at the margins, augmenting what workers already do rather than replacing them wholesale or generating the sweeping productivity gains that would justify the spending. Consumer adoption has been spectacular \u2014 Goldman acknowledges it was too conservative on this front, noting that generative AI reached roughly 53% adoption within three years, faster than the personal computer or the internet at comparable stages. But 95% of those users are on free tiers. The consumer enthusiasm has not translated into enterprise economics.<\/p>\n<p>Something has to give<\/p>\n<p>Goldman\u2019s investment conclusion is a quiet repudiation of the trade that has defined markets for the last two years: it now favors going long hyperscalers and underweighting semis. The picks-and-shovels play, in other words, may be over. Covello echoed Fortune contributor Jeffrey Sonnenfeld, Lester Crown Professor of Leadership Practice at the Yale School of Management and founder of the Yale Chief Executive Leadership Institute, who recently argued that \u201cdata infrastructure\u201d is the key differentiator for AI scale going forward.<\/p>\n<p>What actually needs to happen, per Covello, is more mundane than most AI coverage would suggest. Data needs to be structured properly \u2014 many AI agents today are being built on top of siloed, misaligned databases that make good outputs impossible. Workloads need to be orchestrated so that expensive frontier models aren\u2019t being deployed to answer questions a cheaper model could handle. 
Small language models, fine-tuned on domain-specific data, need to keep displacing the large general-purpose models that dominate headlines but often underperform in practice.<\/p>\n<p>The logic is asymmetric. If enterprise ROI eventually materializes, hyperscaler stocks \u2014 currently priced with deep skepticism baked in \u2014 have significant room to run. If ROI continues to disappoint, hyperscalers will cut capex and see a cash-flow-relief rally regardless. The semiconductors, by contrast, are priced for a world in which the arms race never ends, and the returns never arrive \u2014 a world that, per Goldman\u2019s own analysis, cannot persist indefinitely.<\/p>\n<p>[This report has been updated to clarify that the Goldman Sachs Global Institute is not a part of the bank\u2019s research arm and is not issuing forecasts of any kind, and to modify the headline accordingly.]<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The numbers coming out of Wall Street\u2019s most influential research shop tell a story 
that&#8230;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[245],"tags":[1715,97,877,403,536,11214,1305,1828,8890,11215,634,6052,2123,4536,3869,1306,91,1642],"_links":{"self":[{"href":"https:\/\/stock999.top\/index.php?rest_route=\/wp\/v2\/posts\/5711"}],"collection":[{"href":"https:\/\/stock999.top\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/stock999.top\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/stock999.top\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/stock999.top\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=5711"}],"version-history":[{"count":0,"href":"https:\/\/stock999.top\/index.php?rest_route=\/wp\/v2\/posts\/5711\/revisions"}],"wp:attachment":[{"href":"https:\/\/stock999.top\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=5711"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/stock999.top\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=5711"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/stock999.top\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=5711"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}