A new report published Wednesday by the International Energy Agency (IEA) estimated that global electricity demand from data centers, artificial intelligence and cryptocurrencies could double by 2026.
The IEA estimated that those three sectors together consumed roughly 460 terawatt-hours (TWh) of electricity globally in 2022, about 2% of total demand.
For context, the average U.S. household consumes about 10,500 kilowatt-hours (kWh) of electricity each year, according to the U.S. Energy Information Administration. One TWh is equivalent to one billion kWh.
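As a rough illustration of that conversion, the 2022 figure works out to the annual usage of tens of millions of U.S. homes. This is a back-of-envelope sketch using only the numbers above, not an IEA calculation:

```python
# Translating the IEA's 2022 figure into household terms (illustrative arithmetic).
SECTOR_TWH = 460                  # data centers, AI and crypto combined, 2022 (IEA)
KWH_PER_TWH = 1e9                 # one TWh is one billion kWh
HOUSEHOLD_KWH_PER_YEAR = 10_500   # average U.S. household (EIA)

households = SECTOR_TWH * KWH_PER_TWH / HOUSEHOLD_KWH_PER_YEAR
print(f"~{households / 1e6:.0f} million U.S. households")  # ~44 million
```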
Related: Artificial Intelligence is a sustainability nightmare – but it doesn’t have to be
Though the report noted that it is difficult to make projections while technological advancements and efficiency gains are arriving so rapidly, the IEA estimated that those sectors will consume between 620 and 1,050 TWh of electricity in 2026, roughly the annual electricity consumption of Sweden at the low end or Germany at the high end.
Data centers, the IEA said, are the most significant driver of regional electricity demand growth; 33% of the world’s 8,000 data centers are located in the U.S. Those data centers consumed 200 TWh of electricity in 2022, which was equivalent to 4% of the country’s electricity demand.
The IEA expects that number to reach 260 TWh, about 6% of the country’s total electricity demand, by 2026, driven largely by the growth of cloud-based services and increased adoption of 5G networks.
Further, the report indicated that incorporating AI into large existing applications could increase electricity consumption substantially.
A typical Google search consumes 0.3 Wh of electricity, the IEA said, while a typical ChatGPT query consumes 2.9 Wh. If Google were to incorporate generative AI at scale across all of its roughly nine billion daily searches, the IEA estimated, the company would require an additional 10 TWh of electricity per year.
1,000 Wh is equivalent to one kWh.
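The IEA's figure can be roughly reproduced from the per-query numbers above. This sketch computes the additional demand from the difference between the two query types; the IEA's exact methodology may differ, but the result lands in the same ballpark:

```python
# Back-of-envelope check of the IEA's estimate (not the IEA's actual method).
SEARCH_WH = 0.3          # Wh per typical Google search (IEA)
CHATGPT_WH = 2.9         # Wh per typical ChatGPT-style query (IEA)
SEARCHES_PER_DAY = 9e9   # roughly nine billion daily Google searches

extra_wh_per_query = CHATGPT_WH - SEARCH_WH              # 2.6 Wh of added demand
extra_wh_per_year = extra_wh_per_query * SEARCHES_PER_DAY * 365
extra_twh_per_year = extra_wh_per_year / 1e12            # 1 TWh = 1e12 Wh

print(f"~{extra_twh_per_year:.1f} TWh per year")  # ~8.5 TWh, near the IEA's ~10 TWh
```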
In 2023, Nvidia (NVDA) shipped 100,000 AI servers that, according to the IEA, consume a combined 7.3 TWh of electricity per year. By 2026, the IEA expects the AI industry to be consuming at least 10 times its 2023 demand.
Related: Here’s the Steep, Invisible Cost Of Using AI Models Like ChatGPT
AI and sustainability
The issue of AI and energy has come under increasing scrutiny from scientists such as Dr. Sasha Luccioni, a leading expert in ethical, responsible AI. But systems like ChatGPT are notoriously closed-source, so while researchers are attempting to understand the energy usage of large language models (LLMs) like ChatGPT, the full reality of these models’ carbon emissions remains unclear.
Part of the difficulty in calculating not just energy usage but carbon emissions is that a company’s carbon footprint depends on far more than electricity consumption alone; the location of its data centers, the type of grid powering them, the load on that grid and the number of requests a model handles on a given day all affect how electricity consumption translates into a carbon footprint.
Luccioni last year studied Bloom, an open-source LLM that, over an 18-day period, received about 230,000 requests, consumed 914 kWh of electricity and had a carbon footprint of 340 kg of CO2.
ChatGPT, in comparison, is clocking about 100 million users (users, not requests) per week. Its website was visited around 1.6 billion times in November, according to data from Similarweb, with each visit averaging seven minutes.
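For a rough sense of scale, Bloom’s reported totals can be divided out into per-request averages. This is illustrative arithmetic using only the figures above; real per-request costs vary with query length, hardware and grid mix:

```python
# Per-request averages from the Bloom study's reported 18-day totals.
REQUESTS = 230_000
ENERGY_KWH = 914
CO2_KG = 340

wh_per_request = ENERGY_KWH * 1000 / REQUESTS   # kWh -> Wh, ~4 Wh per request
g_co2_per_request = CO2_KG * 1000 / REQUESTS    # kg -> g, ~1.5 g per request

print(f"{wh_per_request:.1f} Wh and {g_co2_per_request:.1f} g CO2 per request")
```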
Electricity consumption, though, is only one component of AI’s overall environmental impact; the technology also consumes an enormous amount of water to cool its data centers.
OpenAI CEO Sam Altman said at the World Economic Forum’s annual meeting in Davos that an energy breakthrough is needed for future iterations of AI technology, which he said will consume far more energy than people anticipate.
“There’s no way to get there without a breakthrough,” he said. “It motivates us to go invest more in fusion.”
Altman said both nuclear fusion and fission should be looked at as potential energy breakthroughs; in 2021, the CEO invested $375 million in Helion Energy, a private U.S. nuclear fusion company that later signed a deal to supply energy to Microsoft. Microsoft, which has poured $13 billion into OpenAI, is the company’s top investor.
Altman has said that countries should explore geoengineering — the manipulation of the environment to partially offset the effects of climate change — as a stopgap in the meantime.
“This is so symptomatic of the broken relationship between AI and the environment,” Luccioni said in a post on X. “We can’t magically generate more energy, nor is geoengineering a viable climate solution. We need to stop stuffing genAI into everything and reduce its energy use, right now.”
Related: ChatGPT’s development nearly cost this small Midwestern city its drinking water
Current solutions
The idea of a sustainable data center has been explored by at least one company.
Verne Global, for instance, operates a growing set of sustainable data centers across Europe, largely built in Nordic locations.
Verne’s first data center opened in Iceland, where it takes advantage of a sustainable, reliable mix of geothermal and hydroelectric power, as well as a cool climate that reduces the need for energy-intensive cooling systems.
Machine learning technology, according to the IEA, can be applied to data centers to optimize their energy usage and increase their efficiency.
Hyperscale data centers, the report added, can run large-scale operations without significant increases in electricity consumption.
The IEA additionally estimated that renewables will generate more than a third of the world’s electricity supply by 2025, and that low-carbon sources (a combination of renewables and nuclear energy) will account for nearly half of all global electricity generation by the end of 2026.
Luccioni told TheStreet last year that while certain — small — AI models can be used to help the climate, LLMs like ChatGPT simply consume far too much energy.
“I think that we’re currently seeing a trend of a one-size-fits-all solution,” she said. “Everyone’s trying to plug in LLMs and see what sticks and so I think we’re going to see more energy usage, more compute, just because now everything has to have an LLM behind it, just because it’s trendy.”
OpenAI did not respond to a request for comment.
Contact Ian with AI stories via email, [email protected], or Signal 732-804-1223.
Related: Veteran fund manager picks favorite stocks for 2024