OpenAI Running Losses Due to Microsoft Cloud Overdependence


by Joe Rogan

November 18, 2025

Overview of this episode of The Joe Rogan Experience

This episode summarizes leaked documents and reporting (via tech blogger Ed Zitron and TechCrunch sources) about OpenAI’s revenues, compute costs, and its financial relationship with Microsoft. The host walks through reported revenue-share payments to Microsoft, large and rising inference costs, OpenAI’s cloud-provider concentration (historically Microsoft Azure), executive revenue claims, and the broader implications for AI economics and valuations.

Key takeaways

  • Leaked documents suggest Microsoft received ~$493.8M from OpenAI in 2024 and ~$865.8M in the first three quarters of 2025 — implying OpenAI’s payments to Microsoft could top $1B for 2025.
  • The payments are said to reflect a ~20% revenue-share arrangement Microsoft reportedly negotiated when it invested heavily in OpenAI.
  • Leaked estimates put OpenAI’s inference compute spending at ~$3.8B in 2024 and ~$8.65B in the first nine months of 2025 — indicating rapidly rising cash costs to serve users.
  • Sam Altman is publicly projecting much higher numbers (an annualized revenue run rate above $20B and a claim of $100B by 2026), but those are unverified forward-looking statements.
  • Much of OpenAI’s training spend may be booked as non-cash credit (from Microsoft), while inference — the ongoing cost of answering user requests — is largely cash outflow.

Reported financial figures (leaked / cited)

  • Microsoft revenue-share receipts from OpenAI:
    • 2024: $493.8M (leaked figure)
    • First three quarters of 2025: $865.8M (leaked figure)
  • Implied OpenAI revenue (based on the 20% share assumption): the quoted estimates conflict:
    • Dividing the 2024 receipts by 20% implies roughly $2.5B in 2024 revenue.
    • Other reports claim 2024 revenue ≈ $4B and first-half-2025 revenue ≈ $4.3B.
  • Compute (inference) spend:
    • 2024 inference spend reported: ~$3.8B
    • First nine months of 2025: ~$8.65B
  • Alternate/conflicting figures cited: total compute spend of ~$5.6B in 2024; cost of revenue of ~$2.5B for the first half of 2025.
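The implied-revenue arithmetic above can be sketched as a quick back-of-envelope check. This simply mirrors the summary's own 20%-share calculation using the leaked figures; the share percentage and the gross-vs-net caveats noted below mean these are rough estimates, not an official reconciliation:

```python
# Back-of-envelope check of implied OpenAI revenue from Microsoft's
# reported revenue-share receipts. Figures are from leaked reports and
# the 20% share is itself only a reported (unconfirmed) term.

REVENUE_SHARE = 0.20  # Microsoft's reported cut of OpenAI revenue


def implied_revenue(ms_receipts_musd: float, share: float = REVENUE_SHARE) -> float:
    """Gross OpenAI revenue ($M) implied by Microsoft's revenue-share receipts ($M)."""
    return ms_receipts_musd / share


# 2024: $493.8M in receipts implies ~$2.47B in revenue (the "roughly $2.5B" figure).
rev_2024 = implied_revenue(493.8)
print(f"Implied 2024 revenue: ${rev_2024 / 1000:.2f}B")  # Implied 2024 revenue: $2.47B

# First three quarters of 2025: $865.8M implies ~$4.33B over nine months,
# versus the ~$4.3B *first-half* figure cited elsewhere — one illustration
# of how the sources conflict on time windows or definitions.
rev_2025_9mo = implied_revenue(865.8)
print(f"Implied 9-month 2025 revenue: ${rev_2025_9mo / 1000:.2f}B")  # Implied 9-month 2025 revenue: $4.33B
```

Note that if the leaked receipts are net (after Microsoft's payments back to OpenAI) rather than gross, the divisor changes and these implied figures would understate actual revenue.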

Compute costs, cloud dependence, and provider diversification

  • Historically, OpenAI relied heavily on Microsoft Azure because of Microsoft’s substantial early investments (initially reported at $10B+, later reported at ~$13B in total).
  • The revenue-share deal reportedly requires OpenAI to pay Microsoft ~20% of revenue.
  • Training costs may be covered largely by credits or non-cash arrangements provided by Microsoft; inference costs (serving model queries) are mostly cash and escalating quickly.
  • OpenAI is reportedly broadening cloud partners (CoreWeave, Oracle, AWS, Google Cloud) to diversify compute suppliers and reduce dependence on Microsoft — a notable strategic shift if sustained.

Revenue-share mechanics and accounting caveats

  • Multiple leaks and sources use different definitions: some report Microsoft’s net revenue share (after Microsoft deducts what it pays back to OpenAI or customers) rather than gross receipts.
  • Microsoft and OpenAI publicly decline to detail these flows; Microsoft does not break out Bing/Azure OpenAI revenue in financial reports, making outside estimates uncertain.
  • Credits vs. cash: investment-backed credits from Microsoft can mask true cash burn on training, while inference cash costs remain a major expense for OpenAI.

Implications and broader context

  • If inference costs are as large as reported and OpenAI is not yet profitable on those costs, it raises questions about margins across the AI industry — many smaller players may face similar pressure.
  • High and rising compute costs could reshape valuations and force pricing or product changes (e.g., higher subscription fees, enterprise contracts).
  • OpenAI’s diversification of cloud providers could reduce single-vendor risk but add complexity to operations and contracting.
  • Executive revenue claims (e.g., $20B+ run rate, $100B by 2026) should be treated cautiously: they are forward-looking and not corroborated by public audited financials.

Notable quotes / claims from the episode

  • “Microsoft received $493.8 million in revenue share payments from OpenAI in 2024, and $865.8 million in the first three quarters of this year.” (leaked figures)
  • Sam Altman: OpenAI will “end the year above $20 billion in annualized revenue run rate” and aim to “hit $100 billion by 2026.” (forward-looking claims)

What to watch next

  • Any official OpenAI or Microsoft disclosures (SEC filings, IPO paperwork, or formal press releases) that confirm or refute the leaked numbers.
  • Further leaks or investigative reports (TechCrunch, Ed Zitron) that clarify gross vs. net revenue sharing and the breakdown of credits vs. cash.
  • How OpenAI’s cloud-provider mix evolves and whether shifting providers reduces unit inference costs.
  • Pricing or product changes from OpenAI as a response to rising inference costs.

Limitations and caveats

  • The episode relies on leaked documents and secondhand reporting; many figures are unverified and some numbers conflict across sources.
  • “Revenue-share” may be reported on a net or gross basis depending on the source, which significantly affects implied revenue calculations.
  • Executive statements about future revenues are projections, not historical, audited results.

Conclusion

The episode highlights leaked data suggesting that OpenAI makes substantial and rapidly rising revenue-share payments to Microsoft and that its inference compute costs are ballooning. Those dynamics — combined with a historically heavy reliance on a single cloud provider and conflicting revenue estimates — raise important questions about profitability, industry valuations, and how AI companies will manage escalating operational costs. All reported numbers should be treated with caution until confirmed by official financial disclosures.