Energy consumption 'to dramatically increase' because of AI

Artificial intelligence is expected to have a bigger impact on practically everything than any technology since the advent of the internet. Wall Street certainly thinks so: the tech-heavy Nasdaq (^IXIC) is up 26% year to date thanks to the frenzy over AI-related stocks.

But AI's big breakout comes at a cost: much more energy.

Take, for example, OpenAI's chatbot ChatGPT. Research done at the University of Washington shows that hundreds of millions of daily queries on ChatGPT can consume around 1 gigawatt-hour a day, roughly the energy consumed by 33,000 US households.
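The household comparison checks out with simple arithmetic. A rough sketch, assuming the 1 GWh/day estimate above and the approximate US average household consumption of about 29 kWh per day (an EIA figure, not from the study itself):

```python
# Back-of-envelope check: how many average US households does 1 GWh/day equal?
KWH_PER_GWH = 1_000_000
AVG_US_HOUSEHOLD_KWH_PER_DAY = 29  # approximate EIA average, an assumption here

daily_kwh = 1.0 * KWH_PER_GWH          # the ~1 GWh/day estimate for ChatGPT
households = daily_kwh / AVG_US_HOUSEHOLD_KWH_PER_DAY
print(f"{households:,.0f} households")  # ~34,000 -- consistent with the ~33,000 cited
```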

"The energy consumption of something like ChatGPT inquiry compared to some inquiry on your email, for example, is going to be probably 10 to 100 times more power hungry," Sajjad Moazeni, a professor of electrical and computer engineering at the University of Washington, told Yahoo Finance.

Industry participants say this is only the very beginning of what's to come.

“We’re maybe at 1% of where the AI adoption will be in the next two to three years,” said Arijit Sengupta, founder and CEO of Aible, an enterprise AI solution company. “The world is actually headed for a really bad energy crisis because of AI unless we fix a few things.”

An energy-hungry Facebook data center under construction in Eagle Mountain, Utah. (George Frey/Getty Images)

Data centers are the heart of the advanced computing process. They are physical facilities housing thousands of processors and servers, and they sit at the core of a cloud computing industry largely run by Google, Microsoft, and Amazon.

"As you think of this shift towards these larger foundation models, at the end of the day you’re going to need these data centers to require a lot more energy as a whole," Angelo Zino, VP and senior equity analyst at CFRA Research, told Yahoo Finance.

Data centers have increasingly shifted from using simpler processors, called CPUs, to more advanced graphics processing units, or GPUs. Those components, made by companies like Nvidia (NVDA), are among the most energy-intensive hardware in a data center.

"For the next decade, GPUs are going to be the core of AI infrastructure. And GPUs consume 10 to 15 times the amount of power per processing cycle than CPUs do. They’re very energy intensive,” explained Brady Brim-Deforest, CEO of Formula Monks, an AI technology consulting company.

Added Brim-Deforest: "Energy consumption is going to dramatically increase on a global scale, simply because of the energy-intensive nature of AI. But if you look at the nuances, what's interesting is AI is also incredibly efficient at things that humans are not as efficient at."

'Huge massive infrastructure cost'

Research done by Benjamin C. Lee, professor of electrical engineering and computer science at the University of Pennsylvania, and professor David Brooks of Harvard showed that data center energy usage grew 25% a year on average between 2015 and 2021. This was before generative AI grabbed national headlines and ChatGPT usage skyrocketed.
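Compounded over that window, 25% annual growth adds up quickly. A quick sketch of the implied multiplier, assuming six yearly steps from 2015 to 2021 (an illustration of the growth rate, not a figure from the study):

```python
# Cumulative growth implied by ~25%/yr over 2015-2021 (six yearly steps).
annual_growth = 0.25
years = 2021 - 2015  # 6

multiplier = (1 + annual_growth) ** years
print(f"{multiplier:.1f}x")  # roughly 3.8x total growth over the period
```

In other words, even before the generative AI boom, data center energy use was on pace to nearly quadruple every six years.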