Powering AI Could Soon Need as Much Electricity as a Small Country, Expert Warns

An expert suggests that the energy consumption related to AI could exceed the power demands of some countries.

A recent commentary by Alex de Vries, the founder of Digiconomist, a company that aims to expose the "unintended consequences of digital trends," raised concerns about the substantial energy footprint associated with the widespread adoption of AI.

De Vries suggested that in the future, the energy demands of AI could surpass those of entire countries, Cell Press reported.

"Looking at the growing demand for AI service, it's very likely that energy consumption related to AI will significantly increase in the coming years," the author said.

A recent commentary by Alex de Vries, the founder of Digiconomist, raised concerns about the substantial energy footprint of AI. Leon Neal/Getty Images

Surge of AI Services

De Vries, a PhD candidate at Vrije Universiteit Amsterdam, noted that as the demand for AI services continues to surge, the energy consumption linked to AI is poised for significant escalation.

This is especially true of generative AI, the branch of AI that includes models like OpenAI's ChatGPT, which require extensive data for training, a process known for its high energy demands.

For instance, Hugging Face, a New York-based AI company, revealed that its multilingual text-generating AI tool consumed a staggering 433 megawatt-hours (MWh) during training, an amount sufficient to power 40 typical American homes for a year.
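The homes comparison can be sanity-checked with simple arithmetic. The average annual household figure below is an assumption for illustration (US government statistics put average residential use in roughly this range), not a number from the article:

```python
# Back-of-envelope check of the training-energy comparison above.
# Assumption (not from the article): an average US home uses roughly
# 10,800 kWh of electricity per year.
TRAINING_ENERGY_MWH = 433          # training consumption cited in the article
AVG_US_HOME_KWH_PER_YEAR = 10_800  # assumed average annual household use

homes_powered_for_a_year = (TRAINING_ENERGY_MWH * 1_000) / AVG_US_HOME_KWH_PER_YEAR
print(f"{homes_powered_for_a_year:.0f} homes")  # ~40 homes
```

The result lands almost exactly on the article's "40 typical American homes," which suggests the comparison assumes an average household in the 10-11 MWh-per-year range.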

However, AI's energy usage does not stop at training. De Vries' analysis indicated that when a tool actively generates output in response to prompts, it continues to draw substantial computing power, and therefore energy.

For instance, operating ChatGPT could consume 564 MWh of electricity per day. Although global efforts are underway to improve the energy efficiency of AI hardware and software, de Vries cautioned that efficiency gains often lead to increased demand.
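To put that daily figure on the same annual scale as the country-level numbers quoted elsewhere in the article, a quick conversion helps (illustrative arithmetic only, extrapolating the article's per-day estimate):

```python
# Scale the article's daily ChatGPT estimate to an annual total so it can
# be compared with country-level consumption figures (TWh per year).
DAILY_MWH = 564                     # per the article's estimate
annual_mwh = DAILY_MWH * 365        # simple extrapolation: 205,860 MWh
annual_twh = annual_mwh / 1_000_000 # 1 TWh = 1,000,000 MWh
print(f"{annual_twh:.2f} TWh per year")  # ~0.21 TWh
```

Even at roughly 0.2 TWh per year, a single service remains small next to the multi-TWh scenarios discussed below, which is why De Vries' concern centers on AI being deployed everywhere rather than on any one tool.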

This phenomenon, known as the Jevons paradox, holds that efficiency improvements ultimately result in a net rise in resource consumption.

AI's Electrical Consumption

Companies like Google are already integrating generative AI into their services; Google, for example, is experimenting with powering its search engine with AI.

Given that Google handles billions of searches daily, de Vries estimated that if AI were used for every search, it would require approximately 29.2 TWh of power annually, equivalent to Ireland's yearly electricity consumption.
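The Ireland-scale estimate implies a per-search energy cost that can be backed out with arithmetic. The daily search volume below is an assumption for illustration (a commonly cited ballpark), not a figure stated in the article:

```python
# Implied energy cost of a single AI-assisted search, working backward
# from the article's 29.2 TWh/year scenario.
# Assumption (not from the article): ~9 billion Google searches per day.
ANNUAL_TWH = 29.2          # De Vries' estimate, per the article
SEARCHES_PER_DAY = 9e9     # assumed daily search volume

wh_per_search = (ANNUAL_TWH * 1e12) / (SEARCHES_PER_DAY * 365)
print(f"{wh_per_search:.1f} Wh per search")  # ~8.9 Wh
```

Under that assumption, each AI-assisted search would draw on the order of 9 watt-hours, many times the fraction of a watt-hour typically attributed to a conventional search, which is what makes the aggregate figure so large.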

Although this extreme scenario is unlikely in the immediate future, owing to factors such as cost and supply-chain constraints, the production of AI servers is expected to rise sharply in the coming years, according to de Vries.

Projections suggest that by 2027, worldwide electricity consumption associated with AI could surge to between 85 and 134 TWh annually, driven by the expected increase in AI server manufacturing.

"The potential growth highlights that we need to be very mindful about what we use AI for. It's energy intensive, so we don't want to put it in all kinds of things where we don't actually need it," De Vries noted.

De Vries' commentary was published in the journal Joule.

Tech Times
ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.