AI App Crisis, OpenAI Does Math, Big Nvidia Deal


by Candace Fan

18 min · March 11, 2026

Overview

Host Jaden Schaefer covers three headline stories: 1) new data showing AI-powered apps struggle with long-term retention despite strong early monetization, 2) ChatGPT’s launch of dynamic, interactive visual explanations for math and science, and 3) Thinking Machine Labs’ large multi-year compute partnership with, and strategic investment from, NVIDIA. The episode combines data from a RevenueCat report, product examples, and industry implications for startups and infrastructure builders.

Key news items

  • AI apps: RevenueCat’s 2026 state of subscription apps report shows AI-powered apps monetize quickly but have higher churn and refund rates than non-AI apps.
  • ChatGPT: OpenAI introduced dynamic visual explanations—interactive modules that let users manipulate variables and see equations/diagrams update in real time for 70+ math & science concepts.
  • Thinking Machine Labs + NVIDIA: Thinking Machine Labs announced a multi-year strategic partnership with NVIDIA to deploy large-scale AI systems (deployment starts in 2027), including at least one gigawatt of NVIDIA AI systems; NVIDIA is also taking a strategic stake. Thinking Machine Labs has raised billions and released an API product, Tinker.

RevenueCat findings — data & takeaways

  • Dataset: Analysis of subscription infrastructure for ~75,000 developers, >1 billion in-app subscription transactions, representing about $11B annual developer revenue.
  • Adoption:
    • About 27% of apps analyzed (roughly 1 in 4) were categorized as AI-powered or market themselves as AI-driven.
    • Photo & video apps lead AI adoption (~61% of them are AI-powered); gaming (~6.2%) and travel (~12.3%) lag well behind.
  • Retention:
    • 12-month retention: AI apps ~21% vs non-AI apps ~30.7% (≈10 percentage points lower).
    • Monthly retention: AI apps ~6.1% vs non-AI apps ~9.5%.
    • Weekly retention (a less common cadence): AI apps ~2.5% vs non-AI ~1.7%; the one horizon where AI apps lead, reflecting short-term trial usage.
  • Monetization & value:
    • Median download monetization: AI apps 2.4% vs non-AI 2.0%.
    • Realized lifetime value (median): monthly — AI ~$18 vs non-AI ~$13; annual — AI ~$30 vs non-AI ~$20.
  • Refunds & volatility:
    • AI apps show materially higher refund rates and revenue volatility than non-AI apps, at both the median and the high end of the distribution, per figures cited in the report.
  • Host interpretation: AI features boost conversion and short-term revenue but often fail to deliver durable, long-term value; hype/overpromising and rapid experimentation contribute to churn.
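The medians above can be turned into a quick back-of-envelope comparison. This is illustrative only; the figures are simply the ones cited in the episode, and the variable names are made up here:

```python
# Medians as cited in the episode's summary of the RevenueCat report.
ai = {"retention_12m": 0.21, "monthly_ltv": 18.0}
non_ai = {"retention_12m": 0.307, "monthly_ltv": 13.0}

# 12-month retention gap, in percentage points.
gap_pp = (non_ai["retention_12m"] - ai["retention_12m"]) * 100
print(f"Retention gap: {gap_pp:.1f} pp")  # ~9.7 pp, i.e. roughly 10 points

# AI apps' lifetime-value premium on monthly plans.
premium = ai["monthly_ltv"] / non_ai["monthly_ltv"] - 1
print(f"Monthly LTV premium: {premium:.0%}")  # ~38% higher
```

The two numbers capture the episode’s core tension: AI apps earn noticeably more per retained user, but keep far fewer of them at the 12-month mark.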

ChatGPT: dynamic visual explanations (what changed)

  • Feature: Interactive visual explanations inside ChatGPT that allow users to manipulate variables and see math/science equations and diagrams update in real time.
  • Scope: Supports 70+ concepts (examples in episode: compound interest, exponential decay, linear equations, Ohm’s Law, Coulomb’s Law, kinetic energy, Hooke’s Law).
  • Potential impact:
    • Education: helps students and learners explore concepts rather than just reading static answers; could act as a widely accessible tutor.
    • Product stickiness: interactive, exploratory features can increase perceived utility and reduce churn if executed well.
  • Related context: Other assistants (example: Google’s Gemini) have introduced interactive diagrams; the space is competitive and features drive user expectations.
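The kind of interaction described is easy to picture with one of the listed concepts. The sketch below is just the textbook compound-interest formula, not OpenAI’s implementation; the feature lets users drag these inputs and watch the curve and equation update live:

```python
def compound_interest(principal: float, rate: float,
                      periods_per_year: int, years: float) -> float:
    """Final balance: A = P * (1 + r/n) ** (n * t)."""
    return principal * (1 + rate / periods_per_year) ** (periods_per_year * years)

# "Manipulating a variable": change only the compounding frequency
# and compare outcomes for $1,000 at 5% over 10 years.
monthly = compound_interest(1000, 0.05, 12, 10)
daily = compound_interest(1000, 0.05, 365, 10)
print(f"monthly: {monthly:.2f}, daily: {daily:.2f}")
```

Seeing that daily compounding adds only a dollar or two over monthly is exactly the sort of insight that interactive exploration surfaces faster than a static answer.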

Thinking Machine Labs + NVIDIA (compute deal)

  • Deal highlights:
    • Multi-year strategic partnership to deploy large-scale computing systems, starting in 2027.
    • Commitment includes at least one gigawatt of NVIDIA AI systems (signaling very large-scale compute plans).
    • NVIDIA is making a strategic investment in Thinking Machine Labs.
  • Company context:
    • Thinking Machine Labs (founded by former OpenAI talent) has raised multiple billions and released its Tinker API last year; the episode puts its valuation at over $12B.
  • Industry context:
    • The industry is increasingly competing aggressively for access to massive compute; major compute partnerships are now common for scaling advanced AI products.
    • CEO Jensen Huang (NVIDIA) projects huge industry spending on AI infrastructure through the decade.

Analysis & implications

  • Short-term monetization vs long-term retention:
    • Many AI apps can convert users and command higher prices, but failing to meet expectations and provide durable utility drives churn and refunds.
  • Product lessons:
    • If you build AI features, prioritize reliable core value and product experience at launch rather than overpromising novel capabilities.
    • Weekly, low-price subscription strategies may create churn and annoyance—monthly or annual models often produce more stable LTV.
  • Infrastructure race:
    • As new interactive features and larger models become common, access to enormous compute will be a gating factor for scale and capability.
    • Early compute commitments (like gigawatt deals) are a signal of long-term ambitions and capital intensity.
  • Market dynamics:
    • Rapid experimentation means many apps will be tried briefly and abandoned; long-term winners will be those who turn novelty into habitual utility.

Practical recommendations (for founders, product teams, and operators)

  • Focus onboarding on a single, wow-worthy core use case that works reliably; avoid overselling feature scope in marketing.
  • Measure short- and long-term retention separately and optimize for 3–12 month retention, not just initial conversions.
  • Choose subscription cadence thoughtfully—weekly plans may boost trials but often increase churn and user friction.
  • Invest early in product quality and bug fixes (reducing early churn is cost-effective).
  • If scaling compute-heavy features, secure committed infrastructure partners early and align product roadmap to realistic deployment timelines.
  • Use interactive/explanatory features (like ChatGPT’s visuals) to increase user engagement for educational or technical products.
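The "measure short- and long-term retention separately" point can be sketched as a minimal cohort check. Field names and the 30-day month approximation are illustrative assumptions, not anyone’s production metric:

```python
from datetime import date

def retained(signup: date, last_active: date, month: int) -> bool:
    """True if the user was still active `month` months after signup
    (approximating a month as 30 days for simplicity)."""
    return (last_active - signup).days >= month * 30

# Toy cohort: each record is a signup date and the last day seen active.
users = [
    {"signup": date(2025, 1, 5), "last_active": date(2026, 1, 20)},
    {"signup": date(2025, 1, 9), "last_active": date(2025, 2, 1)},
    {"signup": date(2025, 1, 15), "last_active": date(2025, 6, 30)},
]

for month in (1, 3, 12):  # track each horizon as its own metric
    rate = sum(retained(u["signup"], u["last_active"], month)
               for u in users) / len(users)
    print(f"month {month:>2}: {rate:.0%}")
```

Reporting each horizon separately makes the RevenueCat pattern visible in your own data: strong month-1 numbers can coexist with a steep drop by month 12.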

Notable quotes from the episode

  • “AI features help apps monetize really quickly, but sustaining that long-term is going to be the challenge.”
  • “A lot of this comes down to over-promising and under-delivering what the AI is capable of doing.”
  • “When you build a product…make sure it works really good on launch. Then you’ll be able to keep your churn [down].”

Where to try the models mentioned

  • Host plugs AIbox.ai — access to many AI models and tools (link mentioned in episode).

If you want the episode’s quick takeaway: AI can drive fast revenue and exciting features, but building durable, reliable user value and securing the massive compute to scale are the two biggest challenges for long-term success.