MarketWatch

Nvidia may have yet another advantage in AI - and this one is less appreciated

By Emily Bary

Bernstein analysts say spending on AI inference has trailed expectations, putting Nvidia in a relatively better spot than peers as the company dominates the AI training market

Nvidia Corp. may have yet another advantage over peers, but this one skates under the radar.

While the maker of graphics processing units dominates the market for artificial-intelligence hardware, that's well understood by the market. A new analysis from Bernstein, though, suggests that the AI market has shaken out in a somewhat unexpected way that could make Nvidia (NVDA) seem especially well positioned relative to peers.

Specifically, the Bernstein analysts looked at spending on "training" for AI models versus spending on AI inference. Training refers to the process of teaching AI models to draw conclusions, while inference refers to when models actually act on that training to make predictions based on new information.

Analysts and companies have been excited about the inferencing opportunity, but the Bernstein analysts say the size of this market has trailed their earlier expectations. The Bernstein team, led by Toni Sacconaghi and Stacy Rasgon, estimates that inferencing makes up less than 5% of AI infrastructure at present "and will continue to be a relatively small portion of infrastructure spend for the next several years."

On the flip side, "training of [large-language models] continues to be the key driver of AI spend," the analysts said.

What does that mean for Nvidia? The potential for continued strong training trends "is likely constructive for Nvidia at this point," as the company "currently dominates the vast majority of [generative-AI] training infrastructure" and is "on the cusp of [its] new Blackwell product cycle which is likely fueling next-generation models."

By contrast, if the inferencing market doesn't live up to its original size expectations, that could be a setback for some of Nvidia's rivals, according to the analysts. That's because competitors "have (mostly) broadly acknowledged Nvidia's training dominance and have hence focused on inference as the bulk of their long term opportunity."

See also: Nvidia investors just got a $1 trillion reason to be even more bullish

The analysts took a moment to acknowledge a data point that might seem at odds with their analysis: Nvidia itself said that inference made up an estimated 40% of its data-center revenue over the last four reported quarters.

"We note that the statement was not specific to generative AI, but across all data-center GPU use cases, and we note that many other use cases...are far less training intensive and more inferencing intensive than LLMs (and are more likely to already be generating [returns on investment])," the analysts said. They noted other use cases include "traditional" machine learning and recommendation algorithms.

The analysts also said Nvidia "may not have perfect visibility into what its customers are using GPUs for."

All the while, the Bernstein team sees long-run potential in inferencing, though there could be some puts and takes. "While we are generally believers, it is hard to have conviction either way," they wrote. "However, we also observe that if training investment slows in the next 1-2 years amid some mismatch between infrastructure costs and revenues, it is unlikely that inferencing will be able to pick up the slack, potentially leading to a digestion cycle, impacting all players in the AI value chain."

Read: Nvidia's stock is no longer the S&P 500's top gainer this year. Here's what is.

-Emily Bary

This content was created by MarketWatch, which is operated by Dow Jones & Co. MarketWatch is published independently from Dow Jones Newswires and The Wall Street Journal.
(END) Dow Jones Newswires

10-02-24 0935ET

Copyright (c) 2024 Dow Jones & Company, Inc.