#408: Epic Games Is Transforming The Development Of Entertainment Content, & More
1. Epic Games Is Transforming The Development Of Entertainment Content
Epic Games’ new deals with Lego and Disney suggest that power is shifting away from large incumbent gaming studios and publishers. The winners are likely to be individual creators and indie teams building on UGC (User-Generated Content) platforms like Fortnite Creative and Roblox.
Last week, during the annual Game Developers Conference (GDC) in San Francisco, Epic Games announcedi that creators now can use Lego-branded assets in their own games. Thanks to its Creator Economy 2.0 initiative,ii creators of Lego-branded games can earn payouts from Epic Games. As Lego’s intellectual property (IP) opens up to creators, the spectrum of UGC created inside Fortnite’s ecosystem should widen Epic’s appeal to gamers currently not served by its portfolio of first-party games. Interestingly, its Lego IP strategy could be pointing the way to a similar deal with Disney for its vast library of entertainment assets.
In February, Disney announcediii a $1.5 billion equity investment in Epic Games in addition to a multiyear deal to build Disney-branded games and other immersive experiences. The deal also offered end gamers the right to use the Unreal Engine to “create their own stories and experiences,” suggesting that third-party developers will be able to use Disney IP in their own Fortnite games.
ARK envisions a future in which most IP holders will outsource the creation of incremental consumer-facing content to end gamers. While emerging AI tools are democratizing the creation of digital assets and experiences, they also are enabling adherence to brand guidelines at scale. By aggregating such experiences, Epic Games and Roblox could capture the majority of incremental value generated by UGC while supporting multi-billion-dollar platform ecosystems that increase consumer entertainment options.
2. Nvidia’s 2024 GTC Conference Showcased A Wellspring Of New AI Platform Offerings
Last week, Nvidia held its first in-person Graphics Processing Unit (GPU) Technology Conference (GTC) since 2019. Now the third-largest companyiv in the world by market cap, Nvidia welcomed ~300,000 people,v including virtual attendees, to the event, roughly 35 times the 8,400 who attended in 2018.vi
During GTC, Nvidia unveiled its new Blackwell platform which, on GPT-4-sized models, will improve AI training performance by 3x and AI inference performance by 15x.vii During an investor Q&A session,viii CEO Jensen Huang said that he had been surprised by the exceptional demand for Hopper after the launch of ChatGPT and that, for Blackwell’s ramp, Nvidia is better prepared to supply the pent-up demand from its largest customers, Microsoft and Meta Platforms.
Nvidia also unveiled NIM (Nvidia Inference Microservices), prepackaged AI models that can be deployed and fine-tuned easily across all Nvidia hardware. Available to Nvidia AI Enterprise customers at $4,500 per GPU per year, the service should accelerate Nvidia’s recurring software revenue base, which crossed $1 billion at an annual rate, nearly 2% of total revenue, in the fourth quarter. Nvidia also announced GR00T, a foundation model, and Isaac Sim, an extensible simulation platform, for the development of autonomous robots.
Among the companies at the conference, software vendors like ServiceNow, LinkedIn, and SentinelOne describedix Generative AI deployments in their own organizations and for customers. Many panelists said they are arming their engineers with coding assistants and augmenting back-office tasks with chatbots for HR and Finance. ServiceNow’s Chief Digital Information Officer, Chris Bedi, noted that Gen AI solutions using internal proprietary data have helped employees cut by 40% the 30-35% of their time spent on administrative tasks.
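Taken at face value, Bedi’s figures imply a net productivity gain that is easy to compute: if administrative work consumes 30-35% of an employee’s time and Gen AI cuts that portion by 40%, roughly 12-14% of total working time is freed up. A quick sketch of that arithmetic (the percentages come from the remarks above; the code is only illustrative):

```python
# Net time savings implied by ServiceNow's reported figures:
# admin work consumes 30-35% of employee time, and Gen AI
# tools reduce that admin workload by 40%.
admin_share_low, admin_share_high = 0.30, 0.35
reduction = 0.40

savings_low = admin_share_low * reduction    # share of total time freed
savings_high = admin_share_high * reduction

print(f"Time freed: {savings_low:.0%}-{savings_high:.0%} of total working hours")
# → Time freed: 12%-14% of total working hours
```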
We look forward to learning how more enterprises harness the power of generative AI to deploy productivity-enhancing solutions on Nvidia’s hardware.
3. Nvidia’s New Blackwell Chip Promises Step-Function Performance Gains
Last week, Nvidia CEO Jensen Huang announcedx the company’s new Blackwell chip, which delivers 2.5 times more FLOPs of computational power than its predecessor Hopper chip, introduced in 2022.xi The innovative chip design stitches two Blackwell “dies” together in the same package and accounts for most of Blackwell’s performance improvement, but it adds complexity and costs compared to a traditional monolithic die.
Compared to Hopper-based systems, a Blackwell data center can train a GPT-4-sized language model with roughly one-fourth the GPUs and energy. Focused heavily on maximizing data transfer speeds between the Blackwell package and its surrounding systems, Nvidia situates the chip within vertically integrated server and data center solutions that incorporate its latest high-bandwidth networking equipment. This approach should minimize bottlenecks and enable full utilization of Blackwell’s immense computing power.
Blackwell also enhances optimizations for AI inference workloads.xii A new 4-bit numeric precision format can be leveraged for inference tasks, by itself improving performance ~5x compared to Hopper.xiii As chip demand soars and competition intensifies from AI chip startups and vertically integrated solutions like Meta's in-house inference designs, cost-effective inference solutions are becoming increasingly important.
ARK’s research suggests that demand for AI compute should skyrocket throughout this business cycle, scaling 69% at an annual rate from $50 billion in 2023 to ~$2 trillion by 2030. While Nvidia should benefit enormously from the secular tailwinds, the competitive landscape is likely to intensify as, even in a demand supercycle, chips are much more cyclical than other parts of the tech stack. Blackwell should keep Nvidia at the forefront of the AI acceleration race for now, but as the AI chip wars heat up, the company will have to innovate at a breakneck pace to protect its enviable market position and margins.
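ARK’s forecast can be sanity-checked with a simple compound-growth calculation: $50 billion growing at 69% annually over the seven years from 2023 to 2030 compounds to roughly $2 trillion. A quick check (figures from the paragraph above; the code is only arithmetic):

```python
# Compound-growth check of ARK's AI compute forecast:
# $50B in 2023 growing at 69% per year through 2030.
base = 50e9          # 2023 AI compute spend, USD
cagr = 0.69          # annual growth rate
years = 2030 - 2023  # 7 years of compounding

projected = base * (1 + cagr) ** years
print(f"2030 projection: ${projected / 1e12:.2f} trillion")
# → 2030 projection: $1.97 trillion
```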
4. Tesla’s Full Autonomy AI Training Compute Is Getting Ready To Roll
Last week, Elon Musk statedxiv that Tesla is no longer constrained by compute for Artificial Intelligence (AI) training as it aims for full autonomy. Last July, having suggested that Nvidia might not be able to supply enough GPUs,xv Tesla committed ~$1 billion to its Dojo supercomputer, which is optimized to ingest video clips from Tesla’s ~2.6 million full self-driving (FSD) miles driven per day.xvi During its most recent earnings call, however, Tesla downplayed Dojo’s progress, implying that it would continue to rely on Nvidia.xvii
We wonder whether Tesla’s compute capacity increased because it acquired more GPUs from Nvidia, or because Dojo is ready for prime time. Regardless, Tesla now seems likely to achieve the aggressive training compute targets it laid out last June.xviii Increased compute should enable accelerated software updates, paving the way to full autonomy. According to Tesla, FSD v12.3 will deliver three significant updates every two weeks.xix Already, users are reporting human-like driving experiences with fewer interventions.xx ARK looks forward to monitoring FSD’s progress toward full autonomy in the coming weeks and months.
i Webster, A. 2024. “Fortnite players can now build their own Lego games.” The Verge.
ii Epic Games. 2023. “Introducing Creator Economy 2.0.”
iii The Walt Disney Company. 2024. “Disney and Epic Games to Create Expansive and Open Games and Entertainment Universe Connected to Fortnite.”
iv Marketcap.com. “Companies ranked by Market Cap.” Data as of March 24, 2024.
v Nvidia. 2024. “See the Future at GTC 2024: NVIDIA’s Jensen Huang to Unveil Latest Breakthroughs in Accelerated Computing, Generative AI and Robotics.”
vi Wikipedia. ND. “Nvidia GTC.”
vii Nvidia. 2024. “NVIDIA HGX AI Supercomputer: The world’s leading AI computing platform.”
viii Nvidia. 2024. “GTC Financial Analyst Q&A.”
ix Nvidia. 2024. “Driving Enterprise Transformation: CIO Insights on Harnessing Generative AI's Potential.”
x Nvidia. 2024. “Keynote by NVIDIA CEO Jensen Huang.”
xi Smith, R. 2024. “NVIDIA Blackwell Architecture and B200/B100 Accelerators Announced: Going Bigger With Smaller Data.” AnandTech.
xii “AI inference” refers to the process of running an AI model to generate language or image output; in contrast, “AI training” involves feeding large datasets to a model to teach it how to generate that output.
xiii Smith, R. 2024. “NVIDIA Blackwell Architecture and B200/B100 Accelerators Announced: Going Bigger With Smaller Data.” AnandTech.
xiv Musk, E. 2024. “Yeah, 99% of people have no idea..” X.
xv Seeking Alpha Transcripts. 2023. “Tesla, Inc. (TSLA) Q2 2023 Earnings Call Transcript.”
xvi Ibid. “FSD daily miles” is an ARK estimate based on the FSD chart disclosed in Tesla’s Q4 earnings deck. See Tesla. 2024. “Q4 and FY 2023 Update.”
xvii Seeking Alpha Transcripts. 2024. “Tesla, Inc. (TSLA) Q4 2023 Earnings Call Transcript.”
xviii Downing, F. 2023. “Tesla forecasts ramping up to 100 exaflops….” X.
xix Musk, E. 2024. “Three significant improvements to FSD…” X.
xx See Dell, M. 2024. “Super impressive, Tesla FSD v12.3...” X; Maurer, R. 2024. “Tested FSD v12.3 in Chicago…” X.