Microsoft is facing increasing pressure on its cloud infrastructure, and OpenAI is now at the center of that strain.
In its fiscal second-quarter earnings report, the software giant revealed that its commercial bookings backlog, formally known as remaining performance obligations (RPO), jumped 110% year over year to $625 billion.
For investors, the figure signals strong long-term demand for Azure services, but it also raises questions about how much of that demand rests on a single customer.
OpenAI Represents 45% of Microsoft's Azure Commitments

In a striking disclosure, Microsoft confirmed that OpenAI alone accounts for approximately 45% of its total backlog, roughly $280 billion of the $625 billion figure. While the company had not previously broken out OpenAI's share, the disclosure shows how deeply the AI startup is intertwined with Azure's growth trajectory.
The revelation immediately raised concerns among Wall Street analysts, some of whom questioned whether Microsoft is becoming too reliant on a single AI partner, particularly as demand for compute continues to outpace supply across the industry.
Nadella Stresses Diversification Over Dependence
CEO Satya Nadella pushed back against those concerns, emphasizing that Microsoft is intentionally avoiding over-optimization around any one business or customer.
He reiterated that while Azure growth remains critical, Microsoft is simultaneously investing in Microsoft 365, GitHub, and Copilot, each of which represents a distinct and expanding market opportunity.
Nadella framed the strategy as balanced growth rather than AI-driven tunnel vision.
Capex Surge Fuels Investor Anxiety
Despite beating earnings expectations, Microsoft shares fell more than 6% in after-hours trading. Investor unease centered on slowing Azure revenue growth and an aggressive increase in capital expenditures.
Microsoft's capex surged 66% year over year to $37.5 billion, reflecting the enormous cost of building AI-ready data centers and securing scarce GPUs. Analysts warned that sustained spending at this scale could pressure margins if revenue growth fails to accelerate.
Compute Allocation Becomes a Zero-Sum Game
According to The Register, CFO Amy Hood explained that Microsoft must now make deliberate trade-offs when allocating compute resources. New GPUs and CPUs are first prioritized for first-party products like Copilot, internal research and development, and strategic initiatives. Only the remaining capacity is then made available to Azure customers.
This underscores a growing tension: as AI demand explodes, compute allocation has become a zero-sum game.
A Powerful, But Strained AI Partnership
Microsoft's challenges closely mirror those faced by OpenAI itself. The AI company has repeatedly acknowledged that limited compute capacity forces hard choices between advancing research and scaling products.
Still, Hood described the partnership as a strategic advantage, crediting OpenAI with helping Microsoft stay at the forefront of AI-powered application development.
Originally published on Tech Times