Generative AI is an energy hog. Is the technology worth the environmental cost?



It can seem like magic. Type a request into ChatGPT, click a button and, in an instant, here is a five-paragraph analysis of Shakespeare’s Hamlet, written, as an added bonus, in iambic pentameter. Or describe the chimeric animal from your dream to DALL-E and up pops an image of a gecko-wolf-starfish hybrid. If you’re feeling down, call up the digital “ghost” of your late grandmother and get some solace (SN: 15.6.24, p. 10).

Despite appearances, none of this materializes out of thin air. Every interaction with a chatbot or other generative AI system runs through wires and cables to a data center, a warehouse full of racks of servers that run those requests through the billions (or even trillions) of parameters that dictate how a generative model responds.

Processing and responding to requests consumes electricity, as does supporting infrastructure such as the fans and air conditioning that cool the whirring servers. Beyond huge utility bills, the result is a huge amount of climate-warming carbon emissions. Power generation and server cooling also consume vast quantities of water, which is used in fossil fuel and nuclear power generation as well as in evaporative and liquid heat dissipation systems.

This year, as the popularity of generative AI continued to grow, environmentalists sounded the alarm about this resource-hungry technology. The debate over how to weigh the costs against the less tangible benefits that generative AI brings, such as increased productivity and access to information, is mired in ideological divisions over the technology’s purpose and value.

Advocates argue that this latest revolution in AI is a social good, even a necessity, bringing us closer than ever to artificial general intelligence: hypercapable computing systems that some argue could be a paradigm-shifting technology on a par with the printing press or the Internet.

Generative AI “is an accelerator for anything you want to do,” says Rick Stevens, an associate lab director at Argonne National Laboratory and a computer scientist at the University of Chicago. According to him, the technology has already enabled huge productivity gains for businesses and researchers.

One analysis found 40 percent performance gains when skilled workers used AI tools, he notes. AI assistants can boost vocabulary learning in schools, he adds. They can also help doctors diagnose and treat patients and improve access to medical information, says Charlotte Blease, an interdisciplinary researcher at Uppsala University in Sweden who studies health data. Generative AI could even help city planners reduce traffic (cutting carbon emissions in the process) or help government agencies better predict the weather, says Priya Donti, an electrical engineer and computer scientist at MIT and cofounder of the nonprofit Climate Change AI. The list goes on.

Now, at this critical juncture, experts from fields as diverse as economics, computer engineering and sustainability are working to assess the true burden of the technology.

How much energy does AI use?

ChatGPT and other generative tools are power-hungry, says Alex de Vries, founder of research and consulting agency Digiconomist and a Ph.D. candidate at the Vrije Universiteit Amsterdam. “The bigger you make these models—the more parameters, the more data—the better they perform. But of course, bigger also requires more computing resources to train and run them, requiring more energy,” says de Vries, who studies the environmental impact of technologies such as cryptocurrency and AI. “Bigger is better works for generative AI, but doesn’t work for the environment.”

Training generative AI models to produce an analysis of Shakespeare or an image of a fantastical animal is costly. The process involves developing an AI architecture, collecting and storing troves of digital data, and then having the AI system ingest and incorporate that data, which can amount to nearly anything publicly available online, into its decision-making processes. Refining the models to be more humanlike and to avoid undesirable responses requires additional effort (SN: 27.1.24, p. 18).

Training a single model can use more energy than 100 US homes consume in a year. A ChatGPT query uses about 10 times more energy than a standard Internet search, according to the International Energy Agency. And composing an email with an AI chatbot could take seven times as much energy as fully charging an iPhone 16, some researchers estimate.

Training is an obvious resource drain, but so is everyday use: When millions of people rely on chatbots for routine tasks, it adds up, says Shaolei Ren, an electrical and computer engineer at the University of California, Riverside. So much so that the AI sector could soon draw as much energy annually as the Netherlands, de Vries estimated in 2023 in Joule. Given the rapid growth of generative AI, usage is already on track to exceed that forecast.

And that’s just electricity. Every 10 to 50 ChatGPT queries consume half a liter of water, according to a 2023 analysis by Ren and colleagues. Even that turned out to be a big underestimate, he says, by a factor of four.
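The cited figures translate into a rough per-query water cost. A back-of-envelope sketch (with one stated assumption: the factor-of-four correction is applied as a uniform multiplier on the original range, which the analysis itself does not spell out):

```python
# Back-of-envelope water cost per ChatGPT query, from the figures cited above.
# Assumption for illustration only: the "factor of four" underestimate scales
# the whole 2023 range uniformly.

LITERS_PER_BATCH = 0.5          # half a liter of water...
QUERIES_PER_BATCH = (10, 50)    # ...per 10 to 50 queries (2023 estimate)
CORRECTION = 4                  # later judged to be roughly 4x too low

low = LITERS_PER_BATCH / QUERIES_PER_BATCH[1]   # fewest liters per query
high = LITERS_PER_BATCH / QUERIES_PER_BATCH[0]  # most liters per query

print(f"2023 estimate: {low * 1000:.0f}-{high * 1000:.0f} mL per query")
print(f"Corrected:     {low * CORRECTION * 1000:.0f}-{high * CORRECTION * 1000:.0f} mL per query")
```

That works out to roughly 10 to 50 milliliters per query under the original estimate, and 40 to 200 milliliters with the correction, a sip to a small glass of water for every question asked.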

Some engineers and AI experts dispute these numbers. “I don’t understand what science is behind these [estimates],” says David Patterson, an engineer at Google and professor emeritus at the University of California, Berkeley. “The only way I can imagine getting an [accurate] answer would be in close cooperation with a company like Google.”

Right now, that is impossible. Tech companies release limited information about their data centers and AI models, de Vries and Ren say. So it’s hard to accurately estimate the cradle-to-grave cost of AI or predict the future. For their estimates, the two researchers relied on proxies, such as AI server production numbers from technology company Nvidia, or combined knowledge of data center locations with information from corporate sustainability reports.

However, real-world trends point to AI’s voracious appetite for power. For decades before the generative AI boom, efficiency gains offset the increased energy demand that came with the expansion of data centers and computing, says Andrew Chien, a computer scientist at the University of Chicago. That has changed. Around the end of 2020, data center expansion began to outpace efficiency improvements, he says. Google’s and Microsoft’s self-reported energy use doubled between 2019 and 2023. The release of ChatGPT in late 2022 kicked off a generative AI frenzy, making matters worse, Chien says. Before 2022, total energy demand in the United States had been stable for about 15 years. Now it’s growing.

“The easiest way to save energy is to do nothing,” says Patterson. But “progress involves investment and cost.” Generative AI is a very new technology, and banning it now would stunt its potential, he argues. “It’s too early to conclude that [generative AI] will not more than recoup the investment.”

A more sustainable path for AI

The choice need not be between shutting down generative AI development entirely and allowing it to grow unchecked. Instead, most experts say, there is a more responsible way to approach the technology: mitigating the risks while maximizing the rewards.

Policies requiring companies to disclose where and how they are using generative AI, as well as the corresponding energy consumption, would be a step in the right direction, says Lynn Kaack, a computer science and public policy expert at the Hertie School in Berlin. Regulating technology use and access to it can be difficult, but Kaack says it’s key to minimizing environmental and social harm.

Perhaps not everyone, for example, should be able to freely produce voice clones and photorealistic images with a single click. And should we pour the same resources into supporting a meme-generating machine as into running a hurricane prediction model?

More research into the technology’s limitations could also prevent a lot of wasted energy. AI “is very powerful in certain types of applications, but completely useless in others,” says Kaack.

In the meantime, data centers and AI developers can take steps to reduce their carbon emissions and resource use, Chien says. Simple changes, such as training models only when carbon-free energy is abundant on the grid (say, on sunny days when solar panels produce surplus power) or scaling back computation at times of peak energy demand, can make a measurable difference. Replacing water-intensive evaporative cooling with liquid immersion cooling or other closed-loop strategies that allow water recycling would also minimize demand.
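The first of those changes, carbon-aware scheduling, can be sketched in a few lines. The hourly carbon-intensity values and the job length below are invented for illustration; a real scheduler would pull a live forecast from a grid operator or a carbon-intensity data service:

```python
# Minimal sketch of carbon-aware scheduling: defer a training job to the
# contiguous window of hours when grid carbon intensity is lowest.
# The forecast values (gCO2 per kWh) are fictional, shaped to show a midday
# dip when solar output peaks.

def pick_training_window(hourly_intensity, job_hours):
    """Return the start hour of the job_hours-long window whose summed
    carbon intensity is lowest, i.e. the greenest time to run the job."""
    best_start, best_total = 0, float("inf")
    for start in range(len(hourly_intensity) - job_hours + 1):
        total = sum(hourly_intensity[start:start + job_hours])
        if total < best_total:
            best_start, best_total = start, total
    return best_start

# Fictional 24-hour forecast, one value per hour starting at midnight.
forecast = [420, 410, 400, 390, 380, 360, 330, 290,
            240, 190, 150, 120, 110, 120, 160, 210,
            270, 330, 380, 410, 430, 440, 445, 450]

start = pick_training_window(forecast, job_hours=4)
print(f"Greenest 4-hour window starts at hour {start}")  # hour 10, mid-morning
```

Production schedulers layer on deadlines, checkpointing and migration between regions, but the core trade, shifting flexible compute toward cleaner hours, is exactly this simple comparison.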

Each of these choices involves trade-offs. More carbon-efficient systems generally use more water, Ren says. There is no one-size-fits-all solution. But the alternative to exploring and pushing for these options, even if they make it a little harder for companies to build ever-bigger AI models, is gambling with part of our collective environmental fate, he says.

“There’s no reason to believe that technology is going to save us,” says Chien—so why not hedge our bets?


Image Source : www.sciencenews.org