The use of generative artificial intelligence (AI) technology has ballooned over the past few years, with the growth of chatbots like ChatGPT and image generators like Midjourney.
As AI has become ubiquitous, people have raised concerns about the environmental impacts of the technology. One of the more common criticisms is that it requires more water and power than older technology.
Some people have drawn links between this resource use, climate change and the wildfires in Los Angeles. A viral Instagram post, since widely reshared, claimed that a single interaction with ChatGPT uses 10 times as much energy as a Google search.
Readers Olive and Dean also asked us to VERIFY the impact artificial intelligence has on water and power usage.
THE SOURCES
- University of Illinois Urbana-Champaign’s Center for Secure Water
- 2023 study by engineering researchers from the University of California, Riverside
- 2023 study by a researcher with Digiconomist, a research company focused on unintended consequences of digital trends
- Sunbird, a company that makes data center management software
WHAT WE FOUND
Our current online world relies on vast numbers of computers and data centers to operate. These centers power everything we do online, from conducting internet searches to streaming movies.
Artificial intelligence is more sophisticated than a regular web search or movie stream, and it requires far more computing power to complete what may seem like simple tasks.
The AI boom has therefore led to a surge in new data centers. These facilities, built to supply the additional computing power, are the source of AI’s outsized environmental impact.
How AI uses electricity
An AI tool like ChatGPT relies on large amounts of data and equally large amounts of computer processing power to provide a result. Tech companies keep computer systems to store this data and run programs to process it in physical locations called data centers.
When someone gives an AI program a prompt, it uses computational power to sift through and process all of that data, Katherine Bourzac, a science writer for Nature journals, wrote in a 2024 article. The more computational power used, the more electricity is needed.
How AI uses water
Electricity isn’t the only resource data centers need more of as their computational loads grow; they also need more water, according to the University of Illinois Urbana-Champaign’s Center for Secure Water.
The more power a computer uses, the more heat it generates. If a computer gets too hot, it’ll start running into problems. That’s why laptops and personal computers have fans inside of them that spin faster when the computer works harder.
Many data centers use industrial-sized fans to do the same thing on a large scale. However, traditional air cooling isn’t always enough to dissipate the heat generated by the computing power AI requires, according to Sunbird, a data center management software company. So AI data centers use liquid coolants, which absorb and transfer heat better than air does.
When data centers use water as their liquid coolant, the water is pumped through pipes around the center’s equipment, where it absorbs excess heat. The heated water then typically passes through a heat exchanger to a cooling tower, where some of it evaporates. That means these data centers need a constant source of water to run through their systems.
AI resource usage by the numbers
The average electricity demand of a typical Google search without AI is 0.3 Wh (watt-hours) of electricity, while the average electricity demand of a ChatGPT request is 2.9 Wh, according to the International Energy Agency (IEA).
In 2023, John Hennessy, chairman of Google parent company Alphabet, told Reuters that he predicted an exchange with an AI chatbot would likely be 10 times more energy intensive than a standard Google search without AI.
While 0.3 to 2.9 Wh might not seem like much (a toaster typically uses 10 to 160 Wh per use), those numbers add up. In 2021, before Google began integrating AI overviews into its search engine, Google consumed more than 18 trillion watt-hours of electricity, according to a study by a researcher with Digiconomist. At that time, AI accounted for 10-15% of the total electricity Google used.
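To see how small per-query figures add up, here is a rough sketch using the IEA numbers above. The daily query volume is an illustrative assumption for the sake of the arithmetic, not a figure from this article.

```python
# Rough scale check using the per-query energy figures cited above.
GOOGLE_SEARCH_WH = 0.3   # IEA estimate: Wh per standard Google search
CHATGPT_WH = 2.9         # IEA estimate: Wh per ChatGPT request

QUERIES_PER_DAY = 8.5e9  # assumed daily query volume, for illustration only

def daily_mwh(wh_per_query: float, queries: float) -> float:
    """Convert per-query watt-hours into total megawatt-hours per day."""
    return wh_per_query * queries / 1e6  # 1 MWh = 1,000,000 Wh

search_mwh = daily_mwh(GOOGLE_SEARCH_WH, QUERIES_PER_DAY)
ai_mwh = daily_mwh(CHATGPT_WH, QUERIES_PER_DAY)

print(f"Standard searches: {search_mwh:,.0f} MWh/day")
print(f"If every query cost as much as a ChatGPT request: {ai_mwh:,.0f} MWh/day")
print(f"Ratio: {ai_mwh / search_mwh:.1f}x")
```

Whatever the true query volume, the ratio stays the same: at these per-query figures, an AI request draws roughly ten times the electricity of a standard search.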
Estimates in that study suggested that Google search integrated with AI could use between 6.9 and 8.9 Wh per search. Google didn’t include its total electricity consumption in its most recent environmental report, but it did say that in 2023 it released 37% more emissions from electricity use than it did in 2022.
Google said the increase in emissions was primarily because its increasing demand for electricity for its data centers outpaced its ability to bring more carbon-free energy projects online.
In its most recent environmental report, Google reported it consumed 14% more water in 2023 than it did in 2022. This is “primarily due to water cooling needs” at Google’s data centers, “which experienced increased electricity consumption year-over-year.”
The exact amount of water used to cool the machines in a data center can depend on the data center’s design and location; data centers in hotter locations need more water for cooling.
On average, data centers can consume approximately 1-9 liters of water per kWh of server energy, according to an estimate from engineering researchers at the University of California, Riverside.
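Combining that per-kWh water estimate with the IEA’s per-query energy figure gives a back-of-envelope sense of the water cost of a single request. This is a sketch that simply multiplies the two published ranges; actual usage varies with data center design and location, as noted above.

```python
# Back-of-envelope: water consumed per ChatGPT request, combining the
# IEA figure of 2.9 Wh per request with the UC Riverside estimate of
# 1-9 liters of water consumed per kWh of server energy.
CHATGPT_KWH = 2.9 / 1000           # 2.9 Wh per request, expressed in kWh
WATER_L_PER_KWH_LOW = 1.0          # low end of the cooling estimate
WATER_L_PER_KWH_HIGH = 9.0         # high end of the cooling estimate

low_ml = CHATGPT_KWH * WATER_L_PER_KWH_LOW * 1000    # liters -> milliliters
high_ml = CHATGPT_KWH * WATER_L_PER_KWH_HIGH * 1000

print(f"Roughly {low_ml:.1f} to {high_ml:.1f} mL of water per request")
```

That works out to only a few milliliters to a few tablespoons per request, which, like the electricity figures, only becomes significant when multiplied across billions of queries.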