Nvidia earnings on deck as AI kingpin tightens grip on $1 trillion market

Nvidia shares ended lower last week, giving back around $150 billion in market value amid a broader selloff in rate-sensitive tech stocks ahead of the group's hotly anticipated third-quarter earnings on Wednesday.

Nvidia (NVDA), which commands a near 80% share of the market for high-end AI-powering chips and processors, is finding that its biggest challenge isn't the technological advances of its rivals, but rather the ability of its supply-chain partners to help it meet what CEO Jensen Huang has called "insane" demand for its new Blackwell line.

On a standalone basis, Blackwell chips are said to be around two and a half times faster than Nvidia's legacy H100 chips, also known as Hopper, when used to train large language AI models. And they're around five times faster when used to run those models in real applications, a process called inferencing.

That performance, of course, comes at a price: Blackwell GPUs reportedly cost around twice as much as their H100 predecessors, at between $60,000 and $70,000 per unit, with prices reaching as high as $3 million for a fully loaded server with 72 chips stacked inside.

Nvidia's broader chip architecture makes this possible, as chips can be stacked and interlocked, almost Lego-like, based on specific client needs.

Blackwell is also backward-compatible with the H100, enabling customers — if they're lucky enough to get their hands on them — to replace legacy chips with the newer, faster and more efficient models.

Nvidia shares are valued at $3.65 trillion. The whole of Japan's Nikkei 225, the second-largest stock market in the world, is valued at $4.65 trillion.

Shutterstock

Lucky for Nvidia, and ultimately its investors, there's no lack of willingness among its biggest customers to spend.

Nvidia GPUs: Harder to buy than drugs?

Elon Musk, who runs a host of businesses alongside his obligations as Tesla (TSLA) CEO and President-elect Donald Trump's budget-slasher-in-chief, could be one of Nvidia's biggest customers.

His xAI startup, which aims to challenge OpenAI and its industry-benchmark ChatGPT, is looking to raise around $6 billion in fresh capital, CNBC reported last week. A raise of that size would value the group at around $50 billion.

Part of that funding, the report indicated, will go toward buying around 100,000 of the H100 chips next year. That's on top of the 300,000 he wants to buy for Tesla to replace its existing cluster of H100 chips.

"[Nvidia] GPUs at this point are considerably harder to get than drugs," Musk told a Wall Street Journal CEO Council Summit last spring.

He's not far wrong.


Mark Zuckerberg's AI ambitions for Meta Platforms (META), centered on training and running inference for its Llama family of large language models, reportedly require around 350,000 H100 chips. Upgrading those to the faster Blackwell line, which is sold out for all of next year, won't be cheap.