Class #1 | MS&E435: Economics of the AI Supercycle Stanford University Spring '26 Apoorv Agrawal

MS&E 435: Economics of AI


Video Summary

This video introduces a 9-week course on Artificial Intelligence, focusing on the economic landscape and investment opportunities within the AI supercycle. The instructor, Apoorv, outlines the course logistics, grading, and the central theme: understanding where the money is in AI and how value is accruing. A key takeaway is that the current AI ecosystem's value distribution, heavily favoring the semiconductor layer, differs significantly from previous tech revolutions like the internet and cloud, presenting a unique set of challenges and opportunities.

An interesting fact revealed is that despite significant growth in AI applications, the vast majority of revenue added in the last two years has gone to the semiconductor sector, highlighting the foundational, yet highly concentrated, nature of the current AI infrastructure.

Short Highlights

  • The course will explore the economic landscape of AI, including where the money is and what to expect over the next 9 weeks, with guest speakers and an assignment.
  • The current AI ecosystem's value distribution is an inverted triangle, heavily weighted towards the semiconductor layer (e.g., Nvidia), unlike previous tech revolutions.
  • The cost of serving an AI user is significantly higher than that of a traditional software user because of GPU burn for inference, which weighs on profitability.
  • For consumer AI, the biggest questions are how to scale monetization beyond today's roughly $10 per user per year for ChatGPT, and whether ads will prove more profitable than subscriptions.
  • The semiconductor layer is the most profitable, with Nvidia's data center revenues earning approximately 75% margin, while application layer revenues are estimated between 0-30%.

Key Details

Introduction and Course Logistics [0:00]

  • The instructor, Apoorv, introduces himself and the 9-week course, emphasizing a focus on the "money in AI."
  • Logistics include a quiz on day one, a 50/50 grading split (attendance and an assignment), and a course structure involving guest speakers.
  • The course is designed to be time-efficient, requiring no more than 3 hours per week, including class time, readings, and interaction with AI tools.

"And uh the biggest question that we've all been wrestling with, where's the money? Where's the money in AI?"

The AI Ecosystem vs. Past Tech Revolutions [06:02]

  • A significant focus is placed on the economic model of AI, contrasting it with the internet, mobile, and cloud revolutions.
  • The current AI ecosystem is characterized by a massive investment in capex for data centers (energy, chips, power, interconnects, memory) to train and serve models.
  • Unlike software businesses that ran at high gross margins (80-90%), AI applications have a higher incremental user cost due to the need to burn GPUs for inference.
  • The AI ecosystem's value distribution is presented as an "inverted triangle" or a "pyramid," dramatically different from the cloud ecosystem's shape.

"The cloud ecosystem looks dramatically different than the uh AI ecosystem."
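The margin gap described above can be made concrete with a small sketch. All figures below are illustrative assumptions, chosen only to mirror the lecture's point that GPU inference gives an AI user a non-trivial marginal cost, unlike a traditional software user:

```python
# Illustrative unit economics: traditional SaaS vs. an AI application.
# The dollar amounts are hypothetical assumptions, not figures from the lecture.

def gross_margin(revenue_per_user: float, cost_per_user: float) -> float:
    """Gross margin as a fraction of revenue."""
    return (revenue_per_user - cost_per_user) / revenue_per_user

# Traditional software: serving one more user costs almost nothing.
saas_margin = gross_margin(revenue_per_user=100.0, cost_per_user=12.0)

# AI application: each active user burns GPU-hours for inference.
ai_margin = gross_margin(revenue_per_user=100.0, cost_per_user=70.0)

print(f"SaaS gross margin: {saas_margin:.0%}")    # lands in the 80-90% band
print(f"AI app gross margin: {ai_margin:.0%}")    # near the top of the 0-30% estimate
```

The structural point is that the cost side scales with usage: a heavy AI user directly consumes compute, so margins do not improve with scale the way they did for cloud-era software.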

Drivers of the AI Ecosystem's Shape [08:13]

  • The difference in ecosystem shape is attributed to several factors: the early stage of the AI cycle, Nvidia's dominant market share in compute, and the fundamental physics of running inference, which is more costly than traditional software.
  • The concept of "software eating the world" by Marc Andreessen is contrasted with the AI model, where the marginal cost of an AI user is not near zero but significantly higher.
  • Past tech revolutions like mobile took approximately a decade to flip their value distribution triangles.

"These software businesses ran at 80, some even at 90% gross margins. That is not the case with this new economic model of AI, because if we have a set of users using Cursor or, you know, you hear all these stories about large-scale businesses that are still not profitable at billions of dollars of revenue scale, it's because of that."

Profitability and Value Accrual [11:07]

  • The course will delve into profitability across different layers of the AI stack: semiconductors, inference (the most competitive layer), and applications/models.
  • For companies like Nvidia, the discussion will revolve around dominance, competitive threats (like ASICs), and pricing compression.
  • For AI providers like OpenAI and Anthropic, the focus will be on user profitability and revenue models (ads vs. subscriptions).
  • Startups in the inference layer face intense competition from hyperscalers aiming for dominance.

"The most competitive part of the whole ecosystem. There's a lot of startups that are doing really well. They're winning so far, but you've also got the hyperscalers who want to have a dominant say in that layer."

Consumer AI Monetization and User Scale [34:01]

  • The current landscape of consumer AI, outside of coding, shows high usage with a large percentage of free users (e.g., 95% for ChatGPT).
  • Unlike mandatory apps (WhatsApp, YouTube) or social apps (Instagram, TikTok) that reach billions of users, current AI applications like ChatGPT are closer to "niche apps" (Spotify, Twitter) in terms of user engagement and monetization.
  • ChatGPT has reached about 1 billion users, monetized at approximately $10 per user per year, significantly lower than Alphabet's $100/user/year or Meta's $70/user/year.
  • The challenge is to scale user numbers and increase per-user revenue, potentially by moving beyond knowledge work and exploring ad-based models.

"The leading AI provider, ChatGPT, has got about a billion users that are monetized at about $10 a user a year. And so, the question is, how do we get the billion up to 4 billion?"
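The monetization gap quoted above reduces to simple arithmetic. The sketch below uses the lecture's figures (~1B users at ~$10/user/year, versus ~$100 for Alphabet and ~$70 for Meta); the scenario combinations are illustrative, not forecasts:

```python
# Back-of-the-envelope on the figures quoted in the lecture.
# Scenario numbers are illustrative assumptions, not projections.

def annual_revenue(users_billions: float, arpu_dollars: float) -> float:
    """Annual revenue in billions of dollars."""
    return users_billions * arpu_dollars

today = annual_revenue(1.0, 10.0)        # ~1B users at ~$10/user/year
more_users = annual_revenue(4.0, 10.0)   # user growth alone, ARPU unchanged
ads_arpu = annual_revenue(4.0, 70.0)     # 4B users at a Meta-like ARPU

print(f"Today: ${today:.0f}B/yr")
print(f"4B users, same ARPU: ${more_users:.0f}B/yr")
print(f"4B users, Meta-like ARPU: ${ads_arpu:.0f}B/yr")
```

Framed this way, the two levers the lecture identifies (user growth and per-user revenue) are multiplicative, which is why ads, as a path to higher ARPU, matter so much in the next section.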

The Future of AI Monetization and Advertising [38:25]

  • The instructor predicts that ads will become a significant revenue stream for AI providers, offering better pricing due to a deeper understanding of user intent and higher trust.
  • This is likened to the evolution of mobile advertising, which faced skepticism but ultimately found its space and effectiveness.
  • The potential for AI-powered ads to be highly personalized and effective is highlighted as a key future unlock for the AI economic model.

"And I suspect the ads that ChatGPT will be able to serve or Claude will be able to serve will have a lot better pricing because they will understand your intent, that you will be logged in, very good attribution, a lot more trust."
