As artificial intelligence rapidly advances, consuming substantial energy and threatening environmental harm, companies are seeking solutions to lessen its ecological impact.
Browser company Opera announced its plans to deploy a new AI cluster powered entirely by green energy in Keflavik, Iceland, on Wednesday. All of the company's flagship browsers already boast native AI features, and the company said it believes AI will soon take on a role beyond that of a chatbot and help users perform more elaborate browser tasks, requiring more AI computing power.
"Data centers dissipate a massive amount of power," Benjamin Lee, a Computer and Information Science professor at the University of Pennsylvania, told Tech Times in an interview. "Hyperscale data centers, such as the ones built by the largest technology companies, dissipate tens of megawatts of power. We estimate that such data centers already use more power than the state of Massachusetts. Moreover, power usage is growing rapidly."
Opera's AI cluster is designed to have the smallest possible environmental impact. The computing center relies on hydroelectric and geothermal power for energy and on cold Icelandic air for cooling.
"Iceland will complement our existing infrastructure across the globe as a green, cost-efficient, and centralized hub for computation-heavy tasks, ensuring that Opera has the infrastructure in place to seize the opportunities we see and allowing our company to rapidly evolve and expand its AI services," said Krystian Zubel, vice president of Group IT at Opera.
AI's Environmental Impact
The use of AI is soaring, and so is the power needed to run the computers. Social media firm Meta recently said it could spend up to $37 billion on new digital infrastructure this year, $2 billion more than previously anticipated.
A recent article in the journal Joule outlines the potential environmental footprint of artificial intelligence as it increasingly infiltrates every aspect of popular culture and the workplace. The concern centers on generative AI's need for high-powered servers, raising fears that such extensive computing demands could significantly increase data centers' energy use and carbon emissions. Under one scenario, AI-powered Google Search could consume as much electricity in a year as the entire country of Ireland.
Data centers, the engines of AI development, currently consume about 3% of global energy, and their energy consumption is projected to exceed 1,000 terawatt-hours by 2026, roughly the electricity consumption of Japan, Jonathan Martin, president of WEKA, pointed out in an interview.
"And while we've seen a lot of discussion around how to cool power-hungry data centers that process AI, the focus really should be on how we prevent data centers from overheating in the first place," he added. "The time, money, and energy spent diagnosing and treating the symptoms of data cooling should be used to address the root of the problem: inefficient energy consumption in the data center."
If you have a lot of high-performance computer processors in one place, you need to power them and then cool them down, Ilia Badeev, Head of Data Science at Trevolution Group, said in an interview. Energy consumption for operating and cooling has become a significant issue. For example, Microsoft's Project Natick explored the feasibility of underwater data centers for efficient cooling.
There's growing political pressure to do something about the environmental costs of AI. A team of Democratic legislators recently put forward a bill aimed at examining the ecological effects of artificial intelligence. The bill mandates that the EPA investigate AI's climate effects and directs government agencies to devise a voluntary disclosure framework for firms to report their AI models' potential environmental impacts.
Sen. Ed Markey, D-Mass., highlighted the dual nature of AI's environmental impact in a statement, noting, "AI harbors a Dickensian aspect towards our environment: it has the potential to either enhance or deteriorate our planet's health. The AI Environmental Impacts Act intends to establish precise standards and voluntary disclosure norms for gauging AI's environmental footprint. We cannot allow the advancement of next-gen AI technologies to compromise our planet's well-being."
Solutions To The Crisis?
One solution to AI energy usage might be to make the technology more efficient. The most promising innovations could reduce the computing power AI consumes by 10-fold or more within the next two to five years, while still sustaining the ever-increasing demand for AI computing that shows no signs of relenting, RK Anand, founder and Chief Strategy Officer at Recogni, said in an interview.
"This will alleviate the need for massive energy required over the next decade," he added.
"Hyperscalers, in particular, are also investing in their own renewable energy generations, including solar, geothermal, and nuclear sources."
One crucial step is shifting to renewable energy sources, like solar, wind, and hydroelectric power. This move can significantly cut down the carbon footprint of these data centers, Adnan Masood, Chief AI Architect at UST, said in an interview. Companies, including Google and Apple, are already investing heavily in this area.
"It's not just a nod to environmentalism; it's a practical, long-term cost-saving measure," he added.
Cooling is another area where improvements can make a big difference. The traditional methods, which rely on air, are energy hogs. More innovative approaches, like natural cold air or liquid cooling systems, can drastically reduce energy use. Some are even experimenting with underwater data centers, an approach that sounds more like science fiction than reality but is a testament to the kind of creative solutions being explored.
Then there's the hardware itself. Masood said that using energy-efficient servers and other equipment can lower the overall energy consumption of a data center.
"New processor technologies, especially those tailored for AI tasks, are more efficient in their energy use, getting more done for each watt of power," he added.
The methods data centers use to mitigate their impact, such as buying offsets, purchasing renewable power, or trying to match power use with renewable generation around the clock, only partially address their adverse effects, Andrew Chien, a professor of computer science at the University of Chicago, said in an interview. Moreover, concerns are growing over electronic waste, the environmental impact of producing and discarding electronics, and various other forms of ecological damage attributed to data centers.
Chien is among the researchers exploring flexible, dynamic data centers driven by the vision of the "Zero Carbon Cloud" (Zccloud). The idea behind Zccloud is that data centers can respond to surpluses of renewable energy on the power grid, soaking up the excess to do important but flexible computing such as AI training or engineering product optimization.
"When there are fewer renewables, the data centers can reduce their consumption, reducing stress on the grid and avoiding blackouts," Chien said. "Another direction is harmonizing water use with local circumstances. We have demonstrated that these approaches can accommodate the continued rapid growth of data centers while reducing their total environmental impact."
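The flexible-scheduling idea Chien describes can be illustrated with a minimal sketch. This is not Zccloud's actual code; the function name, thresholds, and megawatt figures below are all hypothetical, chosen only to show how a cluster might scale its draw with the renewable surplus on the grid.

```python
def flexible_capacity(renewable_mw: float, demand_mw: float,
                      min_mw: float = 5.0, max_mw: float = 50.0) -> float:
    """Return how many megawatts of flexible compute to run right now.

    When renewable generation exceeds baseline grid demand, the cluster
    absorbs the surplus (capped at its maximum draw). When it does not,
    the cluster falls back to a minimal footprint so checkpointed jobs
    such as AI training can resume later.
    """
    surplus = renewable_mw - demand_mw
    if surplus <= 0:
        return min_mw                      # grid is tight: shed load
    return min(max_mw, min_mw + surplus)   # absorb surplus up to capacity


# A windy hour with 40 MW of surplus lets the cluster ramp up;
# an evening deficit drops it back to its minimum footprint.
print(flexible_capacity(renewable_mw=120, demand_mw=80))  # 45.0
print(flexible_capacity(renewable_mw=60, demand_mw=80))   # 5.0
```

The design choice worth noting is that the workload must be interruptible: training jobs that checkpoint regularly can tolerate this kind of throttling, which is what makes them a good match for variable renewable supply.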
But there are upsides to all that energy pumped out by computers. Repurposing waste heat from data centers for heating buildings or municipal heating systems turns a byproduct into a valuable resource, Badeev said. Facebook's data center in Odense, Denmark, is designed to capture excess heat and redirect it to warm nearly 7,000 homes in the local community, a practical application of waste heat recovery.
"It's a smart way to turn a byproduct into a resource," Masood said.