Flash News

Microsoft Has Spent Over $100 Billion on OpenAI Collaboration

Microsoft has cumulatively spent over $100 billion on its collaboration with OpenAI, according to testimony from Microsoft’s Chief Transaction Officer Michael Wetter in the Elon Musk lawsuit.

This figure includes early equity investments as well as the cost of building large-scale cloud infrastructure and hosting computing capacity for OpenAI on Azure; much of the spending occurred before it generated any revenue.

Market Mechanism: Microsoft continues to invest heavily to support the training and deployment of OpenAI's models, driving capital expenditure in AI infrastructure. The funds flow into Azure cloud services, GPU procurement, and data center construction. Both Microsoft and OpenAI benefit from the resulting technological lead, while competing AI cloud providers face mounting pressure.

Source: Public Information

ABAB AI Insight

Microsoft has cumulatively invested about $13 billion in equity in OpenAI since 2019; the total expenditure exceeding $100 billion comes mainly from building out Azure infrastructure. The testimony arises from Elon Musk's lawsuit over OpenAI's transition from a non-profit to a commercial structure and Microsoft's role in that shift.

In terms of capital strategy, Microsoft has traded substantial early investment in computing platforms for commercial rights to OpenAI's models, aiming to lock in dominance in next-generation AI technology. Meanwhile, Azure has grown rapidly on the back of hosting OpenAI's workloads, generating over $30 billion in OpenAI-related revenue.
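A back-of-the-envelope calculation, using only the approximate figures cited above (all reported as "over" the stated amounts), sketches how the spending splits between equity and infrastructure, and how much remains unoffset by OpenAI-related revenue:

```python
# Rough split of Microsoft's reported OpenAI-related spending.
# All figures are the approximate values cited in the testimony, in USD billions.
total_spend_b = 100.0    # cumulative spend ("over $100 billion")
equity_invest_b = 13.0   # equity investments since 2019 ("about $13 billion")
azure_revenue_b = 30.0   # OpenAI-related Azure revenue ("over $30 billion")

# Implied infrastructure/compute outlay: total minus equity.
infra_spend_b = total_spend_b - equity_invest_b

# Spend not yet offset by OpenAI-related revenue.
net_outlay_b = total_spend_b - azure_revenue_b

print(f"Implied infrastructure spend: ~${infra_spend_b:.0f}B")
print(f"Net outlay after OpenAI-related revenue: ~${net_outlay_b:.0f}B")
```

On these reported figures, roughly $87 billion of the total would be infrastructure rather than equity, underscoring how heavily the collaboration skews toward capital expenditure.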

Similar cases include massive capital expenditures by hyperscalers like Google and Amazon in AI; Microsoft is currently in a phase of heavy asset investment in AI infrastructure, with the OpenAI collaboration becoming a core growth engine for its cloud business.

Structural Judgment: This fundamentally represents a reconstruction of the industry chain driven by technological substitution. The extreme computing demands of AI training shift pricing power in cloud infrastructure from general-purpose IaaS to ultra-large-scale platforms customized for frontier models. The mechanism: Microsoft must make substantial upfront investments in capacity to secure exclusive or priority access to the models of leading labs such as OpenAI, forcing tech capital to shift from traditional software licensing toward AI computing power and data centers, and creating new entry barriers and long-term moats.

ABAB News · Cognitive Law

The more money AI burns, the more infrastructure becomes a moat.
An investment of $100 billion buys a stake in the next generation of intelligence.
Those who build computing power in advance will reap the highest returns.

Source: ABAB News