Another tech earnings season, another dizzying escalation of the artificial intelligence capex spending boom. The headlong growth in Big Tech’s data centre spending plans has been notable all year, but this week brought an extra twist. Meta and Microsoft both now predict their spending increases in 2026 will be even bigger than this year’s, while Alphabet jacked up its capex again for the rest of this year.
Following the giant chip and cloud computing deals that OpenAI has forged in recent weeks, it’s easy to see all this as an undifferentiated and undisciplined dash for growth. Look a little deeper, though, and there are big differences in the level of risk and likely returns. Once Wall Street’s AI fervour eases, investors will pay closer attention to discerning which companies have the most sustainable capital spending.
One key difference is near-term demand. Microsoft, for instance, had expected that demand for its AI capacity would exceed supply until at least the end of this year. This week, its executives predicted the supply shortage would run until at least the middle of next year. Microsoft also claims the duration of business it has already booked closely echoes the expected useful life of its AI chips, helping to match its AI costs and revenue.
Meta, by contrast, is building more on spec. Pressed repeatedly by analysts this week on the purpose behind his company’s booming capex, chief executive Mark Zuckerberg was studiously vague. Building new services that reach billions of people, he said, “is a huge muscle that Meta has developed” — though he didn’t explain what those future AI services might be. Investors are still willing to give Zuckerberg the benefit of the doubt. Meta’s stock price is more than six times higher than it was at its nadir three years ago, when Wall Street lost confidence in his earlier bet on the metaverse. But the 10 per cent dent to the shares on Thursday suggests that patience is starting to wear thin.
Another important differentiator is the nature of the AI demand that some companies are starting to report. The backlog of contracted business — known as remaining performance obligations, or RPO — is a critical indicator. But not all future revenue looks the same.
When Oracle revealed last month that its RPO had jumped to $455bn, the news sent its shares soaring. It only sank in later that $300bn of that is linked to a five-year deal with OpenAI, which is still a long way from generating the demand for all that capacity or the cash flow to pay for it.
This week, Microsoft said its RPO rose by half to nearly $400bn, even before counting a $250bn deal it announced with OpenAI. Its executives were at pains to point out that these bookings reflect demand from a wide variety of customers, with a weighted average duration of only two years. In other words, these contracts should generate revenue in short order. This points to two different AI capex booms: a relatively short-term one, tied to some discernible level of actual demand, and a longer-term, speculative one, tied to an almost religious belief in exponential growth.
A third key factor is profitability. With data centres now filling up rapidly with AI servers and other relatively short-lived assets, depreciation charges are starting to surge. At Microsoft, depreciation and amortisation jumped to 16.8 per cent of revenue in the latest quarter, from 11.3 per cent a year before. Expense discipline is becoming critical, even amid the boom — as many workers at Amazon found to their cost this week.
That should bring closer attention to which companies are building the most cost-effective AI infrastructure, and which have sustainable pricing power. Google and Microsoft both registered stronger profit margins than Wall Street had expected in the latest quarter, while Meta’s operating margin dropped by three percentage points, to 40 per cent.
The defensibility of those margins will be one of the big stories of 2026. Google is betting heavily on its full “stack” of in-house AI technology, from AI chips, known as TPUs, to leading foundation models. Microsoft, meanwhile, has been overtly steering away from what could become the most competitive, low-margin part of the AI market — renting out GPU capacity. Its loosening ties with OpenAI have raised some anxiety among investors. CEO Satya Nadella intimated, though, that there will be plenty of more profitable opportunities.
When Wall Street’s AI party finally ends, decisions like these will help determine which companies dominate over the long haul.
richard.waters@ft.com
