Artificial intelligence is everywhere now and it consumes a lot of water

From our collaborating partner “Living on Earth,” an environmental news magazine broadcast on public radio: Aynsley O’Neill interviews Shaolei Ren, associate professor of electrical and computer engineering at the University of California, Riverside.

Artificial intelligence has become part of everyday life, but so far there are few regulations governing its implementation and use. No law in the US currently requires AI companies to disclose their environmental impact in terms of energy and water use, so concerned researchers rely on voluntary disclosures from companies such as Apple, Meta and Microsoft.

However, research shows that generative artificial intelligence may require even more resources than initially thought. Imagine you ask an AI program to write a 100-word email for you. You get a near-instant response, but you don’t see the intense computing that went into creating it. In an AI data center, generating just two of these emails can use as much energy as fully charging the latest iPhone. According to a Pew Research Center report, generating this 100-word email can consume an entire bottle of water for the cooling that data centers require.

This interview with Shaolei Ren, professor of electrical and computer engineering at the University of California, Riverside, has been edited for length and clarity.


AYNSLEY O’NEILL: For those of us who aren’t familiar with the technical aspects of how AI works, why does it use so much more energy and so much more water than anything else you do on your computer?

SHAOLEI REN: Well, because a large language model is, by definition, really large. Each model has several billion parameters, or even hundreds of billions of parameters. Suppose you have 10 billion parameters: to generate one token, roughly one word, you have to perform about 20 billion calculations. This is a very energy-intensive process. That energy turns into heat, which we need to get rid of, and water evaporation is one of the most effective ways to cool data center facilities. So, in addition to energy, we use a lot of water.
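Ren’s figure follows from a common rule of thumb: a forward pass performs roughly two floating-point operations (a multiply and an add) per parameter per generated token. A minimal sketch of that arithmetic, with illustrative numbers only:

```python
# Back-of-the-envelope estimate of compute per generated token.
# The 2-ops-per-parameter factor is a rule of thumb, not a measured value.

PARAMS = 10e9          # 10 billion parameters, Ren's example
OPS_PER_PARAM = 2      # one multiply + one add per parameter per token

ops_per_token = PARAMS * OPS_PER_PARAM
print(f"{ops_per_token:.0e} calculations per generated token")
# 2e+10, i.e. the 20 billion calculations mentioned in the interview
```

A 100-word email is on the order of 130 tokens, so the total compute multiplies accordingly.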

Evaporated water enters the atmosphere, so it is sometimes considered lost water; technically it remains in the global water cycle, but it cannot be reused from the same source in the short term. Water consumption is the difference between water withdrawal and water discharge, and that makes it very different from the water we use to shower. When you shower, you withdraw a lot of water, but very little of it is consumed.
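The withdrawal-versus-consumption distinction Ren draws can be sketched in a few lines; the quantities below are made-up illustrative numbers, not measurements:

```python
# Consumption = withdrawal - discharge: water that is not returned
# to the source (e.g. lost to evaporation).

def water_consumption(withdrawal_gal, discharge_gal):
    """Gallons consumed, i.e. withdrawn but not returned."""
    return withdrawal_gal - discharge_gal

# A shower withdraws a lot but sends nearly all of it back down the drain:
shower = water_consumption(withdrawal_gal=20, discharge_gal=19)

# An evaporative cooling tower returns far less of what it takes in:
cooling = water_consumption(withdrawal_gal=20, discharge_gal=4)

print(shower, cooling)  # 1 16
```

Two users with identical withdrawals can thus differ by an order of magnitude in consumption, which is why the two metrics should not be conflated.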

O’NEILL: At least from what I know in the United States, the water used in AI data centers for cooling comes from local or municipal sources. What impact does an AI data center have on the local community surrounding it?

REN: In the US, approximately 80 to 90 percent of the water used in data centers comes from public water sources. Our preliminary research shows that in the US today, data center water use already accounts for approximately 2 to 3 percent of total public water use, and we are talking about withdrawal here, not consumption. Estimates from EPRI (the Electric Power Research Institute) show that artificial intelligence energy demand may grow to 8 percent by the 2030s.

O’NEILL: There’s a debate raging in Memphis, where tech billionaire Elon Musk is trying to build a massive artificial intelligence data center. The local power company estimates that the facility will require approximately one million gallons of water per day for cooling. From your perspective, how should local communities weigh the benefits against the costs of hosting AI data centers?

Shaolei Ren, professor of electrical and computer engineering at the University of California, Riverside

REN: I think there are benefits, especially in terms of economic development. For example, the construction of a data center will generate some tax revenues, and once completed, the local government will have a steady inflow of taxes.

On the other hand, drawing millions of gallons of water per day from local natural resources may be a problem. I’ve heard the local water utility say the data center will account for just 1 percent of its total water withdrawals. But I would say they are equating water consumption with water withdrawal. The residents and industries they supply mostly just withdraw water and return it to the water system right away, whereas a data center evaporates most of the water it takes in. So withdrawal is not the right metric for comparison: a 1 percent share of withdrawals for data centers could translate into something like 5 to 10 percent of water consumption.

O’NEILL: The companies that build and operate these AI systems have their own interest in making the technology more efficient. What improvements to AI technology could make it more energy- or water-efficient over time?

REN: They definitely have an incentive to reduce the energy and resource consumption of training and inference. We have seen many research proposals and solutions that promise to reduce energy consumption, but it turns out that real-world systems are not that well optimized.

I saw a paper from a research team at a leading technology company showing that real-world energy consumption is 10 times higher than previously thought, even with state-of-the-art optimization techniques applied. So they certainly have an incentive to reduce the energy and resources that AI computations consume.

However, the real world is a different story, partly because they have stringent service level goals to meet, which means they have to return responses to users in a short time frame, and this limits how well they can optimize their system. If they only deal with batch processing, they can be very energy efficient, but in reality there are many limitations that prevent them from using optimization techniques.

Maybe we can compare a bus to a passenger car. Generally speaking, on a per-passenger basis, a bus should be more energy-efficient than a passenger car, assuming the bus is full. In reality, though, because of the random patterns of user requests and other constraints, the bus is far from full. If a 50-passenger bus typically carries only five passengers, the average per-passenger fuel economy is much worse than a passenger car’s.
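The bus analogy reduces to simple arithmetic about load factor; batching many requests together is efficient only if the batch actually fills up. A hedged sketch with made-up mileage figures:

```python
# Per-passenger fuel use as a function of vehicle load factor.
# MPG values are illustrative assumptions, not real measurements.

BUS_MPG = 5        # a bus burns far more fuel overall...
CAR_MPG = 30       # ...than a passenger car
BUS_SEATS = 50

def gallons_per_passenger(miles, mpg, passengers):
    """Fuel burned per passenger over a trip of `miles` miles."""
    return miles / mpg / passengers

full_bus = gallons_per_passenger(100, BUS_MPG, BUS_SEATS)   # 0.4 gal
lone_car = gallons_per_passenger(100, CAR_MPG, 1)           # ~3.33 gal
sparse_bus = gallons_per_passenger(100, BUS_MPG, 5)         # 4.0 gal

# Fully loaded, the bus wins; at 5 of 50 seats it loses to the car.
print(full_bus < lone_car < sparse_bus)  # True
```

The same arithmetic applies to batched inference: a large, underutilized batch server can end up less efficient per request than a small dedicated one.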

O’NEILL: Artificial intelligence has really become a huge part of many people’s everyday lives. It is supposed to make our lives easier, but it comes at a huge cost to the environment. What is the solution here, if technological progress is not delivering the efficiency we expect?

REN: One potential fix is that instead of using larger and larger models, we could use smaller models, because these smaller models are usually good enough for many of the tasks we really care about.

For example, if you just want the weather or a text summary, a smaller model is usually sufficient, and a smaller model means you save a lot of resources and power. Sometimes you can even run small models on a mobile phone, which can save a further, say, 80 percent of energy in a very simple way compared with running a larger model in the cloud.

About this story

You may have noticed: This story, like everything we cover, is free to read. That’s because Inside Climate News is a 501c3 nonprofit organization. We don’t charge a subscription fee, lock our stories behind a paywall, or clutter our site with ads. We make our climate and environment news freely available to you and anyone who wants it.

That’s not all. We also share our news free of charge with many other media organizations across the country. Many of them cannot afford to do environmental journalism on their own. We have established bureaus from coast to coast to report local stories, partner with local newsrooms and co-publish articles so that this vital work reaches as wide an audience as possible.

Two of us founded ICN in 2007. Six years later, we won a Pulitzer Prize for National Reporting, and now we run the nation’s oldest and largest dedicated climate newsroom. We tell the story in all its complexity. We hold polluters accountable. We expose environmental injustice. We debunk disinformation. We examine solutions and inspire action.

Donations from readers like you fund every aspect of our work. If you haven’t already, will you support our ongoing work, our reporting on the biggest crisis facing our planet, and help us reach even more readers in more places?

Please take a moment to make a tax-deductible donation. Every one of them makes a difference.

Thank you,