AI Water Footprint: The Thirsty Truth Behind Language Models

KEY TAKEAWAYS
Large language models like ChatGPT and Google's Bard have a significant water footprint due to the energy and cooling requirements for their training process.
Researchers estimate that GPT-3 consumed 185,000 gallons of water during its training, and the water consumption would be higher for newer models like GPT-4.
As AI models become more popular, their water consumption could have a significant impact on water supplies, especially amid increasing environmental challenges in the US.
Researchers suggest several ways to reduce AI's water footprint, including adjusting when and where AI models are trained, using federated learning strategies, and integrating information from electricity providers and advancements in energy storage.
Greater transparency from AI model developers and data center operators in disclosing when and where AI models are trained is crucial to address AI water footprint concerns and develop environmentally sustainable practices.

While large language models (LLMs) like OpenAI’s ChatGPT and Google’s Bard have revolutionized the tech landscape, new research highlights their enormous water footprint. 

The training process for AI models like GPT-3 and GPT-4 requires immense amounts of energy and cooling, resulting in considerable water consumption.

ChatGPT’s Growing Water Consumption

Researchers from the University of California, Riverside and the University of Texas at Arlington published a pre-print paper titled “Making AI Less ‘Thirsty.’”

They estimated that GPT-3 consumed 185,000 gallons (700,000 liters) of water during its training. 

The water consumption would be even higher for newer models like GPT-4, which rely on a larger number of parameters and more training data.

According to the study, a typical user conversation with ChatGPT of roughly 20 to 50 questions and answers is equivalent to pouring out a 500-milliliter bottle of fresh water.

As these AI models become more popular, their water consumption could have a significant impact on water supplies, especially amid prolonged droughts and other growing environmental challenges in the US.

Cooling Data Centers and the AI Water Footprint

Data centers use massive amounts of water to cool down server rooms and maintain an ideal temperature for the equipment. 

Cooling towers, the most common cooling solution for warehouse-scale data centers, consume a significant amount of water. 

The researchers estimate that roughly one gallon of water is consumed for every kilowatt-hour of energy expended in an average data center.
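
To put that rule of thumb in context, here is a minimal back-of-envelope sketch of how an energy figure translates into a water figure using a water usage effectiveness (WUE) ratio. The 1,000 MWh training run and the 1 gallon-per-kWh WUE value are illustrative assumptions, not measurements from the paper.

```python
# Back-of-envelope water estimate from energy use.
# Assumptions (illustrative, not from the paper): a hypothetical 1,000 MWh
# training run and a water usage effectiveness (WUE) of 1 gallon per kWh,
# matching the average data-center figure cited above.

GALLONS_PER_LITER = 0.264172


def estimate_water_gallons(energy_kwh: float, wue_gallons_per_kwh: float = 1.0) -> float:
    """Approximate on-site cooling water consumed for a given energy draw."""
    return energy_kwh * wue_gallons_per_kwh


if __name__ == "__main__":
    training_energy_kwh = 1_000_000  # hypothetical 1,000 MWh training job
    gallons = estimate_water_gallons(training_energy_kwh)
    liters = gallons / GALLONS_PER_LITER
    print(f"~{gallons:,.0f} gallons (~{liters:,.0f} liters) of cooling water")
```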

Data centers typically rely on clean freshwater sources to avoid corrosion and bacterial growth and to control humidity.

Addressing the AI Water Footprint Problem

Researchers suggest several ways to reduce AI’s water footprint, including adjusting when and where AI models are trained.

Training models during cooler hours or in data centers with better water efficiency can help reduce water consumption. 
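
As a concrete illustration of that scheduling idea, the sketch below picks the most water-efficient window from an hourly water-efficiency forecast. The forecast values and the `best_training_window` helper are hypothetical; the paper does not prescribe a specific scheduling algorithm.

```python
# Minimal sketch: choose the most water-efficient window for a training job,
# given hourly water usage effectiveness (WUE) forecasts in gallons per kWh.
# The forecast numbers below are made up for illustration only.

from typing import List, Tuple


def best_training_window(hourly_wue: List[float], job_hours: int) -> Tuple[int, float]:
    """Return (start_hour, average WUE) of the lowest-water window."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(hourly_wue) - job_hours + 1):
        avg = sum(hourly_wue[start:start + job_hours]) / job_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg


if __name__ == "__main__":
    # Hypothetical 24-hour forecast: cooler overnight hours evaporate less water.
    forecast = [0.7, 0.6, 0.6, 0.5, 0.5, 0.6, 0.8, 1.0, 1.2, 1.3, 1.4, 1.5,
                1.5, 1.6, 1.5, 1.4, 1.3, 1.2, 1.1, 1.0, 0.9, 0.8, 0.8, 0.7]
    start, avg_wue = best_training_window(forecast, job_hours=6)
    print(f"Schedule the 6-hour job starting at hour {start}; avg WUE ≈ {avg_wue:.2f} gal/kWh")
```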

Chatbot users can also engage with AI models during “water-efficient hours,” much like running household appliances during off-peak hours.

Federated learning strategies, which involve multiple users collaborating on training AI models using local devices, could also help decrease on-site water consumption. 

Integrating information from electricity providers and advancements in energy storage could further aid in distributing the training load based on clean energy availability.

Transparency and Accountability

The researchers call for greater transparency from AI model developers and data center operators in disclosing when and where AI models are trained. 

Such information would be valuable for both the research community and the general public. 

Disclosing the training schedule and location could also help address concerns about the AI water footprint.

As AI continues to advance, it is crucial for the tech industry to develop environmentally sustainable practices to minimize the water footprint of these revolutionary models.
