#368: NVIDIA’s First-Quarter Report Blew Away Expectations. What Happens Next?, & More
1. NVIDIA’s First-Quarter Report Blew Away Expectations. What Happens Next?

Shares of NVIDIA soared last Thursday after the company reported[1] its fiscal first-quarter earnings. Although revenue declined 13% on a year-over-year basis, it increased sequentially and exceeded expectations thanks to record Data Center revenue. The showstopper in the report was guidance for second-quarter revenue of $11 billion, well ahead of the $7.11 billion consensus expectation and pointing to year-over-year growth of 63%. As the primary provider of accelerated computing hardware for developing and running large language models, NVIDIA is an early beneficiary of the boom in generative artificial intelligence (AI) that ChatGPT unleashed. Its projected revenue upside also reflects a recovery in its Gaming division now that channel inventory levels have normalized.
ARK research suggests that AI-driven data center spend will rise at an annual rate of 67%, from $17 billion to $1.7 trillion by 2030,[2] an enormous tailwind for NVIDIA even as competition intensifies. Horizontal chip companies like Advanced Micro Devices are entering the market, while major cloud players like Amazon, Microsoft, and Google—recognizing the strategic opportunities for their public clouds and other business units—are developing their own hardware for training and inference workloads. Moreover, industry titans like Tesla (Dojo)[3] and Meta (MTIA)[4] are vertically integrating, developing in-house AI chips tailored to their particular use cases. In our view, while NVIDIA still has substantial runway for growth, its competitors are well armed with the resources and strategic incentives to challenge its reign.
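The compounding implied by that forecast can be sanity-checked with the figures quoted above. The start year for the $17 billion base is not stated here, so the snippet below only verifies the arithmetic relationship between the growth rate and the two endpoints, not the calendar timeline:

```python
import math

# Figures as quoted in the text; the start year is an unstated assumption.
start = 17e9    # $17 billion in AI-driven data center spend
end = 1.7e12    # $1.7 trillion forecast
cagr = 0.67     # 67% annual growth rate

# Years of compounding needed: start * (1 + cagr)^years = end
years = math.log(end / start) / math.log(1 + cagr)
print(round(years, 1))  # prints 9.0
```

In other words, the 100x increase at a 67% annual rate corresponds to roughly nine years of compounding.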
The explosion in demand for AI hardware points toward potentially significant revenue growth for software companies. During the next five to ten years, according to our current estimates, software will generate $8 in revenue for every dollar spent on AI hardware, as companies purchase AI hardware to deliver AI-powered products and services. In what could be "winner take most" opportunities, software companies with strong proprietary data and distribution advantages should be best positioned to capitalize on AI use cases and reap the dramatic productivity gains associated with the promise of generative AI.
[1] NVIDIA. 2023. “NVIDIA Announces Financial Results For First Quarter Fiscal 2024.” [2] Downing, F. 2022. “Applying Wright’s Law To AI Accelerators.” ARK Investment Management LLC. [3] Lambert, F. 2021. “Tesla unveils Dojo supercomputer: world’s new most powerful AI training machine.” Electrek. [4] Meta AI. 2023. “MTIA v1: Meta’s first-generation AI inference accelerator.”
2. Ford Will Adopt Tesla’s Electric Vehicle Charging Standard
Last week, Elon Musk and Ford CEO Jim Farley announced[1] that Ford electric vehicle (EV) owners will have access to more than 12,000 Tesla Superchargers across the US and Canada early next year. Farley also announced that Ford’s next-generation EVs will include Tesla’s charging standard, allowing them to charge at Tesla Superchargers without adapters.[2]
A major step toward making Tesla’s charging infrastructure the standard for all EVs in North America, Ford’s move should generate additional profits for Tesla and provide data on the charging behavior of a competitor’s electric vehicles. Expanding the reach of Tesla’s hardware and software into Ford cars for a better charging experience could put pressure on other automakers to follow suit.
[1] Farley, J. 2023. “@JimFarley98 & @ElonMusk: Accelerating EV adoption.” Twitter. [2] Wayland, M. et al. 2023. “Ford EVs will use Tesla charging tech in surprise partnership between rival automakers.” CNBC Autos.
3. Large Language Models Are Activating Multi-omics Datasets

A recent commentary[1] in Nature Biotechnology highlights the potential of large language models (LLMs) like ChatGPT in the biotech space, particularly drug discovery. By simplifying the way scientists interact with complex data, LLMs offer substantial benefits and could be an emerging force in the industry’s growth. As discussed in the commentary, scientists have observed that LLMs can surface useful evidence from intractably large graph databases, identifying indirect relationships that human researchers otherwise would miss. Thanks to LLMs’ conversational interfaces, scientists and database engineers are spending less time writing queries and analyzing datasets and more time asking research questions. That said, some have raised doubts about the accuracy of LLMs, citing their tendency to “hallucinate” information when they find no obvious answer. In our view, more experience with “prompt engineering” will limit hallucinations with time.
[1] Savage, N. 2023. “Drug discovery companies are customizing ChatGPT: here’s how.” Nature Biotechnology.