Listen and subscribe to Opening Bid on Apple Podcasts, Spotify, YouTube, or wherever you find your favorite podcasts.
Staying on top in the AI chip race won't be easy for market darling Nvidia (NVDA) and its co-founder and CEO Jensen Huang, cautions one fellow tech billionaire.
"The fact that Jensen doesn't even make his own chips — that everyone has Taiwan Semiconductor (TSM) available — is all the more credit to him at the design level, the way they have done things is pretty fantastic. I would say I wouldn't want to be Jensen necessarily because wow, other people are working on the same things," Microsoft (MSFT) co-founder Bill Gates told me on Yahoo Finance's Opening Bid podcast (video above; listen in below).
Added Gates, "But so far he has stayed way ahead and Microsoft is an incredible customer and is always wanting to get as many of his chips as they possibly can. At the same time other big tech companies are working on their own AI chips."
As Gates points out, competition for Nvidia is mounting.
With giants like Amazon (AMZN) pouring $8 billion into its partnership with Anthropic as it pushes into the AI chip space, and Google (GOOG) unveiling Willow, its new quantum computing chip, it's evident Big Tech companies want in on the action.
Further, Broadcom (AVGO) and Marvell (MRVL) have released advanced custom chips.
At the same time, the long-term demand for Nvidia's powerful AI chips is being questioned arguably for the first time.
DeepSeek is a Chinese AI company that has burst onto the tech landscape seemingly out of nowhere.
It rattled markets and traders in hot AI names like Nvidia and AMD (AMD) after unveiling R1, an AI model that delivers ChatGPT-esque performance at a much lower price tag. DeepSeek reportedly spent about $5.6 million to train the underlying base model, compared with the hundreds of millions of dollars spent by US-based companies such as OpenAI and Anthropic.
Fears quickly mounted that US companies are overspending on AI infrastructure, including Nvidia chips.
"So what this [news] basically does is two things. One, it says there's still a lot of innovation left and many companies can aspire to train these models. But, two, it also raises the really interesting question of do you need to spend billions of dollars in order to train cutting-edge, world-class models? I think the jury is still out there for things like that, but it's basically upended our assumptions about where AI is going," Snowflake CEO Sridhar Ramaswamy told me on Opening Bid podcast.