8:26 am, Tuesday, 4 November 2025

BIG TECH’S EARNINGS SHOW AN AI ARMS RACE—AND NEW FAULT LINES

Sarakhon Report

Spending surges, profits diverge

This earnings week distilled the new tech economy into a few charts: capital expenditures at historic highs, cloud growth steady, and ad markets resilient but choppy. Hyperscalers and platform giants reiterated multiyear plans to pour hundreds of billions into data centers, chips and power, framing AI as a must-have capability rather than a feature. Margins told a more nuanced story. Companies with disciplined AI integration—tying models to core products and usage—defended operating income, while firms chasing breadth over depth saw pressure from depreciation and electricity costs. Investors rewarded evidence of paying customers, not just demos, and looked for signs that inference workloads are scaling efficiently outside of headline chatbots.

Where the cash is going—and the open questions

The capex wave is flowing into GPUs, custom accelerators and long-term power contracts. Cloud units posted solid, if decelerating, growth as enterprises pilot copilots and vertical AI. Ads stabilized on better targeting, yet policymakers are circling with privacy rules and content standards. The open questions now center on unit economics: can inference costs fall fast enough; will open-model hybrids reduce vendor lock-in; and can supply chains deliver enough power and cooling? For chipmakers and memory suppliers, the cycle looks strong. For software-only players, the bar is proof of revenue per compute dollar. The next few quarters will show whether AI is widening moats or masking fragility.
