AI Water Footprint: The Thirsty Truth Behind Language Models

  • April 15, 2023

KEY TAKEAWAYS
Large language models like ChatGPT and Google's Bard have a significant water footprint due to the energy and cooling requirements for their training process.
Researchers estimate that GPT-3 consumed 185,000 gallons of water during its training, and the water consumption would be higher for newer models like GPT-4.
As AI models become more popular, their water consumption could have a significant impact on water supplies, especially amid increasing environmental challenges in the US.
Researchers suggest several ways to reduce AI's water footprint, including adjusting when and where AI models are trained, using federated learning strategies, and integrating information from electricity providers and advancements in energy storage.
Greater transparency from AI model developers and data center operators in disclosing when and where AI models are trained is crucial to address AI water footprint concerns and develop environmentally sustainable practices.

 

While large language models (LLMs) like OpenAI’s ChatGPT and Google’s Bard have revolutionized the tech landscape, new research highlights their enormous water footprint. 

The training process for AI models like GPT-3 and GPT-4 requires immense amounts of energy and cooling, resulting in considerable water consumption.

ChatGPT’s Growing Water Consumption

Researchers from the University of California, Riverside and the University of Texas at Arlington published a pre-print paper titled “Making AI Less ‘Thirsty.’”

They estimated that GPT-3 consumed 185,000 gallons (700,000 liters) of water during its training. 

The water consumption would be even higher for newer models like GPT-4, which rely on larger numbers of parameters and more training data.

According to the study, an average user’s conversation with ChatGPT (roughly 20 to 50 questions and answers) is equivalent to pouring out a 500 ml bottle of fresh water.

As these AI models become more popular, their water consumption could have a significant impact on water supplies, especially amid the increasing environmental challenges in the US.

Cooling Data Centers and the AI Water Footprint

Data centers use massive amounts of water to cool down server rooms and maintain an ideal temperature for the equipment. 

Cooling towers, the most common cooling solution for warehouse-scale data centers, consume a significant amount of water. 

The researchers estimate that roughly a gallon of water is consumed for every kilowatt-hour of energy used in an average data center.
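As a rough illustration of that rule of thumb, the short sketch below converts a training run’s electricity use into an approximate cooling-water figure. The one-gallon-per-kilowatt-hour ratio is the researchers’ average-data-center estimate; the training-energy number in the example is a hypothetical placeholder, not a figure from the study.

```python
# Back-of-the-envelope estimate of on-site cooling water for a training run.
# The ~1 gallon per kWh ratio is the researchers' rough average-data-center figure;
# the training-energy value below is a hypothetical placeholder.

GALLONS_PER_KWH = 1.0        # researchers' rough average for on-site cooling
LITERS_PER_GALLON = 3.785

def cooling_water_estimate(training_energy_kwh: float) -> tuple[float, float]:
    """Return (gallons, liters) implied by the one-gallon-per-kWh rule of thumb."""
    gallons = training_energy_kwh * GALLONS_PER_KWH
    return gallons, gallons * LITERS_PER_GALLON

if __name__ == "__main__":
    hypothetical_energy_kwh = 500_000  # placeholder value, for illustration only
    gallons, liters = cooling_water_estimate(hypothetical_energy_kwh)
    print(f"{hypothetical_energy_kwh:,} kWh -> ~{gallons:,.0f} gallons (~{liters:,.0f} liters)")
```

Actual figures vary widely with a facility’s cooling design and local climate, so the result is only an order-of-magnitude indication.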

Data centers typically rely on clean freshwater sources to avoid corrosion and bacterial growth and to control humidity.

Addressing the AI Water Footprint Problem

Researchers suggest several ways to reduce AI’s water footprint, including adjusting when and where AI models are trained.

Training models during cooler hours or in data centers with better water efficiency can help reduce water consumption. 

Chatbot users can also engage with AI models during “water-efficient hours,” similar to running household appliances during off-peak hours.
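One way to picture this kind of “when and where” scheduling is as a simple selection problem: given candidate data centers and hours with an estimated water usage effectiveness (WUE, liters of water per kilowatt-hour), pick the slot expected to consume the least water. The sites, hours, and WUE values below are invented for illustration, not taken from the paper.

```python
# Minimal sketch of water-aware scheduling: choose the (site, hour) slot with the
# lowest estimated water usage effectiveness (WUE, liters of water per kWh).
# Site names and WUE values are hypothetical, for illustration only.

from dataclasses import dataclass

@dataclass
class Slot:
    site: str
    hour: int              # hour of day, 0-23
    wue_l_per_kwh: float   # estimated liters of water per kWh at that site and hour

def pick_slot(slots: list[Slot], energy_kwh: float) -> tuple[Slot, float]:
    """Return the most water-efficient slot and its expected water use in liters."""
    best = min(slots, key=lambda s: s.wue_l_per_kwh)
    return best, best.wue_l_per_kwh * energy_kwh

if __name__ == "__main__":
    candidates = [
        Slot("site-a", hour=14, wue_l_per_kwh=1.8),  # hot afternoon: more evaporative loss
        Slot("site-a", hour=3,  wue_l_per_kwh=0.9),  # cooler overnight hours
        Slot("site-b", hour=3,  wue_l_per_kwh=0.4),  # cooler climate, better water efficiency
    ]
    slot, liters = pick_slot(candidates, energy_kwh=10_000)
    print(f"Train at {slot.site}, hour {slot.hour:02d}:00 -> ~{liters:,.0f} liters expected")
```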

Federated learning strategies, in which multiple users collaboratively train AI models on their own local devices, could also help decrease on-site water consumption.
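To make the federated idea concrete, the toy sketch below has several clients run a few training steps on their own local data and share only model weights, which a central server averages (the standard federated-averaging pattern). It is a generic illustration of the technique the researchers mention, not their actual setup.

```python
# Toy federated averaging: clients train locally and share only model weights,
# which the server averages. Training data never leaves the client devices.

import numpy as np

def local_update(weights, X, y, lr=0.1, steps=20):
    """A few gradient-descent steps on a linear model, performed entirely on the client."""
    w = weights.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
        w -= lr * grad
    return w

def federated_round(global_w, clients):
    """Each client refines the global weights on its own data; the server averages the results."""
    return np.mean([local_update(global_w, X, y) for X, y in clients], axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    # Three clients, each with private local data.
    clients = []
    for _ in range(3):
        X = rng.normal(size=(50, 2))
        clients.append((X, X @ true_w + 0.01 * rng.normal(size=50)))
    w = np.zeros(2)
    for _ in range(10):
        w = federated_round(w, clients)
    print("learned weights:", np.round(w, 2))   # should approach [2.0, -1.0]
```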

Integrating information from electricity providers and advancements in energy storage could further aid in distributing the training load based on clean energy availability.

Transparency and Accountability

The researchers call for greater transparency from AI model developers and data center operators in disclosing when and where AI models are trained. 

Such information would be valuable for both the research community and the general public. 

Acknowledging the training process and location could also help address AI water footprint concerns. 

As AI continues to advance, it is crucial for the tech industry to develop environmentally sustainable practices to minimize the water footprint of these revolutionary models.
