Generative AI Hides High Utility Consumption, Directly Affecting Cost of Living Around Data Centers, Study Says

April 19, 2025 - According to Tom's Hardware, a study by The Washington Post and the University of California, Riverside, published on Wednesday local time, found that the environmental costs of generative AI are far higher than outsiders realize. Even when AI merely generates text, large amounts of water are consumed behind the scenes for server cooling, on top of the huge power consumption.


The research team noted that AI's water usage varies greatly by state and by distance from the data center: where less water is used, power consumption tends to rise, and vice versa. To generate a 100-word email, for example, a data center in Texas consumes about 235 ml of water on average, while in Washington State the figure reaches up to 1,408 ml, roughly equivalent to three bottles of ordinary mineral water.

The study cautions that water consumption adds up quickly if users call GPT-4 repeatedly on a weekly, or even daily, basis, even when they are only generating simple text.
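As a rough illustration, the per-email figures above can be turned into a back-of-the-envelope annual estimate. The sketch below is not from the study; the 235 ml and 1,408 ml values come from the article, while the 500 ml bottle size and one-email-per-week usage pattern are illustrative assumptions.

```python
# Back-of-the-envelope sketch of the study's per-email water figures.
# Per-query values (Texas, Washington State) are from the article;
# bottle size and weekly usage are illustrative assumptions.

WATER_PER_EMAIL_ML = {
    "Texas": 235,        # ml of cooling water per 100-word email
    "Washington": 1408,  # ml of cooling water per 100-word email
}

BOTTLE_ML = 500          # assumed size of an "ordinary" bottle of mineral water
WEEKS_PER_YEAR = 52      # one 100-word email per week, for a year

for state, ml_per_email in WATER_PER_EMAIL_ML.items():
    bottles = ml_per_email / BOTTLE_ML
    yearly_liters = ml_per_email * WEEKS_PER_YEAR / 1000
    print(f"{state}: {ml_per_email} ml per email "
          f"(~{bottles:.1f} bottles), ~{yearly_liters:.1f} L per user per year")
```

Under these assumptions, a single weekly email works out to roughly 12 liters a year per user in Texas and over 70 liters in Washington State, which is how seemingly trivial queries accumulate.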

The study also points out that data centers' consumption of water and electricity is driving up the cost of living for nearby residents, directly inflating their utility bills. For example, Meta consumed a total of 22 million liters of water while training the LLaMA-3 model, roughly the amount needed to grow 4,439 pounds (about 2,013.5 kilograms) of rice, or the combined annual water use of 164 Americans.
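A quick sanity check on that comparison: the 22-million-liter total and the "164 Americans for a year" equivalence are from the article, and the per-person figures below are simply derived from them, not stated in the study.

```python
# Rough sanity check on the LLaMA-3 training-water comparison.
# Totals are from the article; per-person values are derived here.

TRAINING_WATER_L = 22_000_000   # liters reportedly consumed training LLaMA-3
PEOPLE = 164                    # Americans whose annual water use this equals
DAYS_PER_YEAR = 365

per_person_year = TRAINING_WATER_L / PEOPLE       # ~134,000 L per person per year
per_person_day = per_person_year / DAYS_PER_YEAR  # ~368 L per person per day

print(f"Implied annual use per person: {per_person_year:,.0f} L")
print(f"Implied daily use per person:  {per_person_day:,.0f} L")
```

The implied figure of roughly 370 liters per person per day is broadly consistent with typical estimates of American domestic water use, so the comparison holds together.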

In addition to water, GPT-4's demand for electricity is equally staggering. The study estimates that if roughly 1 in 10 working Americans called GPT-4 once a week for a year (52 queries per person, about 17 million users in total), the electricity required would come to 121,517 megawatt-hours, equivalent to 20 days of electricity use by every household in Washington, D.C. And that is under extremely conservative assumptions; actual use is likely to be far higher.
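The scenario's arithmetic can be unpacked as follows. The user count, query rate, and total megawatt-hours are from the article; the per-query energy figure is derived from them here and is not something the study reports.

```python
# Minimal sketch of the study's electricity scenario. Inputs are from the
# article; the per-query energy is derived, not reported by the study.

USERS = 17_000_000                  # ~1 in 10 working Americans
QUERIES_PER_USER_PER_YEAR = 52      # one GPT-4 query per week
TOTAL_MWH = 121_517                 # reported annual electricity demand

total_queries = USERS * QUERIES_PER_USER_PER_YEAR   # ~884 million queries
kwh_per_query = TOTAL_MWH * 1_000 / total_queries   # convert MWh to kWh

print(f"Total queries per year: {total_queries:,}")
print(f"Implied energy per query: {kwh_per_query:.3f} kWh")
```

That works out to about 884 million queries a year and an implied cost of roughly 0.14 kWh per query, which is why even a once-a-week usage pattern scales to a city-sized electricity bill.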

The Washington Post also cited responses from OpenAI, Meta, Google, and Microsoft; several of the companies said they would work to reduce AI's burden on the environment, but none gave a clear course of action.

Microsoft spokesman Craig Cincotta said the company is "working on data center cooling technologies that don't consume water." While this sounds promising, the report notes, researchers remain cautious given the lack of specifics: "Corporate environmental commitments have often been sacrificed in the past for profitability goals."
