1 Super Semiconductor ETF to Buy in the Wake of the DeepSeek Sell-Off

Artificial intelligence (AI) is moving fast. It was only two years ago that OpenAI's GPT-3.5 models sparked the AI arms race, and the pace of innovation has since been staggering. But the best innovations often come out of left field. For instance, a China-based AI lab called DeepSeek dominated headlines this past week because of its latest large language models (LLMs), which offer a significant leap in efficiency compared to those that currently lead the industry.

Some investors are concerned that this higher efficiency will lower demand for computing capacity, which could be bad news for Nvidia (NASDAQ: NVDA) because it's the leading supplier of graphics processing units (GPUs) for AI data centers. Those concerns caused Nvidia stock to plunge 17% on Monday.

But it isn't all bad news for the chip industry. The recent advancements from DeepSeek could actually create significant opportunities in AI hardware. Picking individual winners and losers right now won't be easy, though, which is why buying the iShares Semiconductor ETF (NASDAQ: SOXX) might be the best, or at least the safest, way to play the next phase of the AI revolution.

[Image: a digital render of a circuit board with a chip in the center, inscribed with the letters AI. Source: Getty Images.]

What is DeepSeek?

DeepSeek was formally established in 2023 by China's leading hedge fund, High-Flyer, which had been using AI to build trading algorithms for several years. By spinning off DeepSeek, High-Flyer's owners created a new value stream from all the hype surrounding AI. But DeepSeek faces a serious disadvantage: The U.S. government banned the sale of Nvidia's most advanced data center GPUs (including the H100) to Chinese firms to keep American developers like OpenAI ahead of the competition. DeepSeek had to rely on chips like the H800, a variant of the H100 that was designed with throttled performance specifically for the Chinese market.

That forced DeepSeek's team to innovate. While American AI companies were spending tens of billions of dollars to build new data centers to train ever-bigger LLMs, the Chinese start-up focused on optimizing the software side and creating a more efficient LLM architecture to squeeze more out of its limited compute capacity.

It leaned on a technique called distillation, which involves taking a successful model like OpenAI's GPT-4o and using its outputs to partially train a smaller model of your own. That allowed DeepSeek to accelerate its progress using a fraction of the compute that training from scratch would have required.
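To make the idea concrete, here is a minimal sketch of the core distillation objective: the large "teacher" model's output probabilities serve as soft training targets for the small "student" model. This is a generic, illustrative implementation of the textbook technique, not DeepSeek's actual training code; the logit values are made up for the example.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Convert raw logits into a probability distribution; a higher
    temperature softens the distribution, exposing more of the
    teacher's 'dark knowledge' about near-miss answers."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy of the student's distribution against the
    teacher's softened distribution -- the quantity a distilled
    student minimizes during training."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    return -np.sum(p_teacher * np.log(p_student + 1e-12))

# Hypothetical next-token logits: the teacher is confident the answer
# is token 0; the untrained student is still unsure. Training nudges
# the student's logits to shrink this loss.
teacher = [4.0, 1.0, 0.5]
student = [1.0, 0.9, 0.8]
print(round(distillation_loss(teacher, student), 3))
```

The payoff is that the student learns the full shape of the teacher's distribution, not just the single correct label, which is why it can reach strong performance with far less compute.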

Fast forward to today: DeepSeek says it built one of its latest models (called V3) for just $5.6 million, and it rivals the industry's leading models across several key performance benchmarks. Its incredible efficiency means the company can charge just $0.14 per 1 million input tokens, which is 94% cheaper than OpenAI's current rate of $2.50 per 1 million input tokens. Plus, DeepSeek's models are completely open source, so developers aren't locked into a closed ecosystem like they are with OpenAI.
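A quick back-of-the-envelope check of the discount cited above (rates are per 1 million input tokens, as quoted in the article):

```python
# Per-1M-input-token prices quoted above.
deepseek_rate = 0.14  # dollars, DeepSeek V3
openai_rate = 2.50    # dollars, OpenAI's comparable rate

# Discount relative to OpenAI's price.
discount = 1 - deepseek_rate / openai_rate
print(f"{discount:.0%}")  # 94%
```

The exact figure is 94.4%, which the article rounds down to 94%.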