Qualcomm says it will build custom data center CPUs that link with Nvidia chips

Investing.com -- Qualcomm (NASDAQ:QCOM) announced Monday it will develop custom data center CPUs designed to link with Nvidia’s AI chips, marking a renewed push into the server processor market.

While Nvidia (NASDAQ:NVDA) dominates the AI accelerator space, its chips must be paired with CPUs, traditionally supplied by Intel (NASDAQ:INTC) and AMD (NASDAQ:AMD). Nvidia has also entered the CPU market with its Arm-based "Grace" chip.

Now, Qualcomm is rejoining the race, leveraging Nvidia technology to improve communication between its CPUs and Nvidia GPUs.

“With the ability to connect our custom processors to Nvidia’s rack-scale architecture, we’re advancing a shared vision of high-performance energy-efficient computing to the data center,” Qualcomm CEO Cristiano Amon said Monday.

Qualcomm initially pursued an Arm-based server chip in the 2010s and tested it with Meta Platforms (NASDAQ:META), but scaled back the effort amid legal pressures and cost cuts. The initiative was revived following Qualcomm's 2021 acquisition of Nuvia, a startup founded by former Apple (NASDAQ:AAPL) chip engineers that develops Arm-based processor designs and now plays a central role in the company's renewed server chip ambitions.

Since then, Qualcomm has resumed discussions with Meta and signed a letter of understanding with Saudi AI firm Humain to co-develop a custom data center CPU.

The data center CPU space is fiercely competitive, with major cloud providers like Amazon (NASDAQ:AMZN) and Microsoft (NASDAQ:MSFT) already deploying their own custom-built processors, alongside established industry giants AMD and Intel.

Qualcomm’s expansion into this segment is part of a broader effort to reduce its reliance on smartphones, where it has long dominated with processors and modems.

The company is promoting its chips as energy-efficient and capable of running AI workloads directly on devices, rather than offloading them to the cloud. This on-device approach could enable faster AI performance and improved privacy, as sensitive data remains local to the hardware.
