Better Artificial Intelligence Stock: Nvidia vs. AMD

Discrete GPUs were initially designed for video games and professional graphics applications, but they've since become the backbone of data centers, which use them to process all sorts of computations, especially those involving complex artificial intelligence (AI) workloads. Unlike CPUs, which process one piece of data at a time through scalar processing, GPUs use vector processing to handle large batches of floating-point numbers and integers simultaneously.
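
To make that difference concrete, here's a minimal sketch (generic example code, not anything from Nvidia or AMD) that scales an array two ways: an ordinary CPU loop that walks the data one element at a time, and a CUDA kernel that assigns one GPU thread to each element so the whole array is processed in parallel. The kernel name, array size, and launch configuration are all hypothetical.

```cpp
// Minimal sketch of scalar vs. parallel processing (illustrative only).
#include <cstdio>
#include <cuda_runtime.h>

// GPU version: each thread scales exactly one element of the array.
__global__ void scaleKernel(const float* in, float* out, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = in[i] * factor;
}

// CPU version: a scalar loop that touches one element per iteration.
void scaleOnCpu(const float* in, float* out, float factor, int n) {
    for (int i = 0; i < n; ++i) out[i] = in[i] * factor;
}

int main() {
    const int n = 1 << 20;  // about one million elements
    float *in, *out;
    cudaMallocManaged(&in, n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; ++i) in[i] = static_cast<float>(i);

    // Launch enough 256-thread blocks to cover every element at once.
    scaleKernel<<<(n + 255) / 256, 256>>>(in, out, 2.0f, n);
    cudaDeviceSynchronize();
    printf("out[3] = %.1f\n", out[3]);  // expect 6.0

    cudaFree(in);
    cudaFree(out);
    return 0;
}
```

The same idea, scaled up to billions of multiply-and-accumulate operations, is what makes data center GPUs so well suited to training and running AI models.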

That key difference makes powerful data center GPUs the picks and shovels of the AI gold rush. Nvidia (NASDAQ: NVDA) and Advanced Micro Devices (NASDAQ: AMD) share a near-duopoly in this market, but which chipmaker is the better AI-driven investment right now?


The key differences between Nvidia and AMD

Nvidia is the 800-pound gorilla of the GPU market, while AMD is the distant underdog. Nvidia's share of the entire discrete GPU market grew eight percentage points year over year to 90% in the third quarter of 2024, according to Jon Peddie Research (JPR), while AMD's share shrank from 17% to 10%. TechInsights estimates Nvidia accounted for 98% of all data center GPU shipments in 2023.

Nvidia generated 88% of its revenue from the data center market in its latest quarter. Only 9% came from its gaming GPU segment (which was once its biggest business), while the rest mainly came from other chips for the auto and OEM markets.

AMD produces GPUs, x86 CPUs, and APUs (accelerated processing units), which combine a CPU and a GPU on a single chip for gaming consoles and other devices. It controlled 38% of the x86 CPU market in the first quarter of 2025, according to PassMark Software, while Intel held a 60% share. AMD sells both types of chips for the PC and data center markets.

For PCs, AMD sells Ryzen CPUs and Radeon GPUs. For servers, it sells Epyc CPUs and Instinct GPUs. Its Epyc CPUs compete against Intel's Xeon CPUs, while its Instinct GPUs target Nvidia's Hopper-series GPUs at a much lower price point.

However, Nvidia locks many AI developers into its proprietary CUDA (Compute Unified Device Architecture) platform for GPU applications. AMD's GPUs can only run CUDA code through non-native porting tools, which further limits their appeal among large cloud and AI customers. That's why the world's top AI companies -- including OpenAI, Microsoft, Amazon, Meta Platforms, and Alphabet's Google -- still mainly use Nvidia's GPUs to power their generative AI applications.
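
Those porting tools are, in practice, AMD's ROCm software stack and its HIP layer, whose hipify utility rewrites CUDA runtime calls into HIP equivalents. The sketch below shows roughly what such a port looks like; the kernel and buffer names are hypothetical, and this assumes a standard hipify-style translation rather than code from either vendor.

```cpp
// Hypothetical CUDA-to-HIP port, the kind of translation AMD's hipify
// tool performs so CUDA-style code can target AMD GPUs.
//
// Original CUDA calls:
//   cudaMalloc(&buf, bytes);
//   cudaMemcpy(buf, host, bytes, cudaMemcpyHostToDevice);
//   scaleKernel<<<blocks, threads>>>(buf, n);
//   cudaFree(buf);
//
// The same program after porting to HIP:
#include <hip/hip_runtime.h>

__global__ void scaleKernel(float* buf, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) buf[i] *= 2.0f;  // double each element in place
}

int main() {
    const int n = 1024;
    const size_t bytes = n * sizeof(float);
    float host[1024];
    for (int i = 0; i < n; ++i) host[i] = 1.0f;

    float* buf = nullptr;
    hipMalloc(&buf, bytes);                              // was cudaMalloc
    hipMemcpy(buf, host, bytes, hipMemcpyHostToDevice);  // was cudaMemcpy
    hipLaunchKernelGGL(scaleKernel, dim3((n + 255) / 256), dim3(256),
                       0, 0, buf, n);                    // was a <<<...>>> launch
    hipDeviceSynchronize();
    hipFree(buf);                                        // was cudaFree
    return 0;
}
```

Even when the translation is mechanical, it's an extra step that Nvidia's customers never have to take, which is the lock-in dynamic described above.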

Which company is growing faster?

In fiscal 2023 (which ended in January 2023), Nvidia's revenue growth flatlined as the PC market lapped its pandemic-driven growth spurt and macroeconomic headwinds battered the data center market. But in fiscal 2024, its revenue soared 126% as the sudden rise of generative AI applications lit a raging fire under its data center business.