Why would a data center need to continuously consume water to cool itself? Leaks?
Evaporative cooling systems, such as cooling towers, consume water by evaporating it, so that water is non-recoverable.
The article, however, mentions that three quarters of the water use it cites is indirect, through power generation.
Didn't know those were a thing
Water is extremely important in most large-scale cooling systems, whether it be swamp coolers (aka evaporative cooling) or traditional HVAC (aka chillers). A rough sense of the quantities involved is sketched below.
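For a back-of-the-envelope sense of scale, here's a minimal sketch. The ~2.45 MJ/kg latent heat of vaporization is standard physics; the 1 MW heat load and the assumption that all heat leaves by evaporation are illustrative, not figures from this thread.

```python
# Back-of-the-envelope: liters of water a cooling tower must evaporate
# to reject a given heat load. Simplification: assumes all heat is
# removed by evaporation (real towers also do some sensible cooling).

LATENT_HEAT_J_PER_KG = 2.45e6  # latent heat of vaporization of water near 25 °C

def water_evaporated_l_per_hour(heat_load_watts: float) -> float:
    """Liters of water evaporated per hour to reject heat_load_watts."""
    kg_per_second = heat_load_watts / LATENT_HEAT_J_PER_KG
    return kg_per_second * 3600  # 1 kg of water is roughly 1 liter

# Hypothetical 1 MW data center heat load:
print(f"{water_evaporated_l_per_hour(1e6):.0f} L/hour")  # ~1470 L/hour
```

That works out to tens of thousands of liters per day for even a modest facility, which is why the water has to be continuously replaced.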
That water will be recovered as rain.
But it will probably end up in the ocean.
And evaporate to become rain again and again.
I mean, sure, but that's not ideal for us
It will rain somewhere. Generally places that already have rain. If you're counting the global total, we have plenty of fresh water; we just don't have it in the places where we need it.
That can still turn into a local deficit in areas with little rainfall
Evaporative coolers are cheap. It can be done with non-evaporative coolers, but those are far more expensive to build.
Not to mention a much higher carbon footprint.
The reason evaporative coolers are cheap is that they use a fraction of the electricity that chillers do.
And note that the majority of data center water usage is indirect, via power generation, so using less water on site but more indirectly by consuming more power is both more expensive and less efficient; see the sketch below.
Unfortunately, evaporative coolers are the best way to go, for now.
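To make that trade-off concrete, here's a minimal sketch. Every number in it — the 1.8 L/kWh water intensity of power generation, the ~1.5 L/kWh tower evaporation rate, the chiller COP of 4 — is an illustrative assumption, not a figure from the article, and the net result swings with the grid's actual water intensity.

```python
# Sketch of the on-site vs. indirect water trade-off per kWh of heat
# rejected. All numbers are illustrative assumptions, not from the
# article; results swing heavily with the grid's water intensity.

GRID_WATER_L_PER_KWH = 1.8  # assumed L of water consumed per kWh generated

def footprint(name: str, direct_l: float, elec_kwh: float) -> None:
    """Print water (direct + indirect) and electricity per kWh of heat."""
    indirect_l = elec_kwh * GRID_WATER_L_PER_KWH
    print(f"{name}: {direct_l + indirect_l:.2f} L water "
          f"({direct_l:.2f} direct + {indirect_l:.2f} indirect), "
          f"{elec_kwh:.2f} kWh electricity")

# Cooling tower: evaporates ~1.5 L/kWh on site, uses little electricity.
footprint("tower  ", direct_l=1.5, elec_kwh=0.03)

# Chiller: no on-site evaporation, but ~0.25 kWh of electricity per kWh
# of heat (COP of 4), so cost and water use move upstream to the plant.
footprint("chiller", direct_l=0.0, elec_kwh=0.25)
```

With these particular numbers the chiller trades roughly eight times the electricity for lower on-site water, which is the cost and carbon argument made above.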
From Google's blog:
From the WaPo article:
They compare it to residential use, and I wonder if they include all those indirect sources when making that comparison?
For California at least, residential use is about 10% of all water usage, iirc. So if data center use is dwarfed by that, it's not a big concern in the big picture.
The issue, I guess, is when data center usage sucks up all the local supply. State- and region-wide they don't use much, but they can use a lot in one small area.