AI uses a lot of energy. How Big Tech is trying to fix it

By 2027, global AI-related electricity consumption could increase by 64%, putting it on par with the annual electricity use of countries like Sweden or the Netherlands. Tech companies are largely driving this energy spike as they rapidly scale data centers to power AI development. Amazon (AMZN), Google (GOOG, GOOGL), Meta (META), and Microsoft (MSFT) are projected to spend $189B on AI capital expenditures in 2024 alone.

This innovation boom comes with added costs: it strains aging power grids and drives up companies' emissions just as they try to hit net-zero targets. Microsoft's emissions, for example, have increased by 30% since 2020 because of its AI investments.

Companies are now trying to keep up in the AI race while managing their energy footprint by investing in renewable energy and other zero-emission sources, including nuclear power. Earlier this year, Amazon purchased a $650M data center adjacent to a nuclear power plant run by Talen Energy (TLN). Chipmakers that supply the hardware powering AI, including industry leader Nvidia (NVDA), are also working to improve the energy efficiency of their products as Big Tech continues to develop large language models and integrate AI into real-world applications.
