
Motley Fool
Opinion: The Biggest Risk Facing Nvidia Stock, and How the Company Will Solve It


The semiconductor industry has always been cyclical, because whether it's computers, smartphones, or even data centers, consumers and businesses only upgrade their physical hardware once every few years. That means revenue comes in big waves, followed by lengthy troughs.

Artificial intelligence (AI) recently changed that. Since 2023, spending on data center chips and components appears to be growing exponentially, as some of the world's biggest technology companies race to develop the most powerful AI software.

Nvidia (NASDAQ: NVDA) has been the biggest beneficiary of that spending boom, because it supplies the most advanced data center graphics processing units (GPUs) for developing AI. Nvidia is selling so many chips that it has become the second-largest company in the world, adding $3 trillion to its market capitalization over the last two years alone.

But data center spending can't continue at this pace forever, and the inevitable slowdown could be the biggest risk to Nvidia's stock price. However, CEO Jensen Huang might have already revealed the solution.

A black Nvidia sign in front of the company's headquarters.
Image source: Nvidia.

Data center spending could slow significantly in a few years

The ultimate goal for every company developing AI is to achieve artificial general intelligence (AGI), which is the point at which the technology matches human intelligence in most cognitive tasks. A researcher who used to work for ChatGPT creator OpenAI predicts AGI could arrive as soon as 2027. Tesla CEO Elon Musk believes 2029 is a more realistic target. In any case, it could be just a few years away.

Developing AI beyond the point of AGI will almost certainly yield diminishing returns, because very few commercial workloads would benefit from such a high degree of machine intelligence. If that's the case, demand for Nvidia's data center GPUs could plunge a few years from now because the pool of developers who want (or who can afford) further performance increases will be very small.

Today, the bulk of data center infrastructure spending comes from just a handful of trillion-dollar tech giants (more on that in a moment), and Nvidia is launching new generations of data center GPUs on an almost annual basis to meet their needs. It released the Hopper architecture in September 2022 and the Blackwell architecture in March 2024, and reports suggest a new architecture called "Rubin" will be revealed by the end of 2025.

That isn't sustainable over the long term, not only because demand is likely to slow in a few years, but also because it cost Nvidia $10 billion to develop Blackwell alone, and research and development costs will only climb from there.