
A Once-in-a-Generation Investment Opportunity: 1 Artificial Intelligence (AI) Growth Stock to Buy Now and Hold Forever


It's been little more than a year since the latest iteration of artificial intelligence (AI) went viral, and we're only just beginning to see the fruits of this breakthrough technology. Early indications suggest one of the biggest benefits will be the time and money savings from increases in productivity, as AI automates mundane and time-consuming chores. Businesses of all kinds are exploring how to best adopt this technology, but it's still early days.

Micron Technology (NASDAQ: MU) CEO Sanjay Mehrotra was clear about the long runway ahead. "We are in the very early innings of a multiyear growth phase driven by AI as this disruptive technology will transform every aspect of business and society," he said.

That's a bold assertion, but one that's increasingly being echoed by the brightest minds in technology, though estimates of the market's value vary widely. Generative AI is expected to be a $1.3 trillion market by 2032, according to Bloomberg Intelligence. Global management consulting firm McKinsey & Company is more bullish, estimating the technology could add between $2.6 trillion and $4.4 trillion annually. What is clear, however, is that the opportunity is vast.

It's also clear that Micron Technology stands to reap a share of this growing AI windfall.


Multiple ways to profit

Micron Technology may not be a household name, but the company provides a number of components that are vital to AI processing, particularly in the data center. Micron is a leading supplier of memory (DRAM) and storage (NAND) chips -- and each one helps accelerate the performance of Nvidia's GPUs, which are the gold standard in data center processing.

In November, Nvidia announced that it had chosen Micron's HBM3E (High Bandwidth Memory 3E) chip to be integrated into its H200 Tensor Core GPUs, providing "advanced memory to handle massive amounts of data for generative AI and high-performance computing workloads," according to the press release. Nvidia went further, saying the HBM3E helped ramp up the performance of the H200, which delivered "nearly double the capacity and 2.4 times more bandwidth compared with its predecessor, the Nvidia A100."

These data center workhorses are scheduled to begin shipping in the second quarter of 2024. Last month, Micron announced it had begun volume production of the HBM3E, which the company said delivers superior performance while using about 30% less power than competing offerings.

As the number and size of data center workloads continue to scale, energy consumption is becoming a key consideration -- and Micron's power-miserly chips were no doubt a factor in Nvidia's decision.