I don’t understand why AI data centers would CONSUME water. Once they fill up their chiller loops, then… that’s it, right?
It’s hard for me to imagine them relying on the temperature of the incoming water, and dumping all the warm water as discharge.
From what I’ve seen it’s “not worth the effort or expense” to reuse the water. Some of them literally just send tap water through the cooling loops and then into the sewer drains.
They’re probably using cooling towers, which cool through evaporation. They should be using reclaimed water, though.
This is the right answer. They use evaporative cooling, which does save a lot of power, so they can claim to be “green”.
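To put a rough, purely illustrative number on that: evaporating water carries away its latent heat of vaporization, about 2.4 MJ per kg at typical cooling-tower temperatures. The sketch below assumes a hypothetical 10 MW heat load rejected entirely by evaporation; both figures are assumptions, not data from any real facility.

```python
# Back-of-the-envelope: water evaporated by a cooling tower.
# Assumed numbers (illustrative only): latent heat of vaporization
# of water ~2.4 MJ/kg at typical cooling-tower temperatures, and a
# hypothetical 10 MW of heat rejected entirely by evaporation.

LATENT_HEAT_J_PER_KG = 2.4e6   # ~2,400 kJ/kg near 30 C (assumption)
HEAT_REJECTED_W = 10e6         # hypothetical 10 MW heat load

kg_per_second = HEAT_REJECTED_W / LATENT_HEAT_J_PER_KG
liters_per_day = kg_per_second * 86_400   # 1 kg of water is about 1 liter

print(f"{kg_per_second:.2f} kg/s evaporated")          # ~4.17 kg/s
print(f"{liters_per_day:,.0f} liters/day evaporated")  # ~360,000 L/day
```

At that scale the water really is consumed: it leaves the site as vapor rather than going back into the loop or the drain.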
Hmm, I wonder if that plays into the wild and frequent thunderstorms in Texas now.
It’s got to be the data centers or global warming overall (and its shifting of the jet stream).
As long as it is cheaper to buy water and evaporate it than to cool by other means, big firms will continue to do so.
With a COP (coefficient of performance) of around 15 and up, it is difficult to argue with the economics of this; rough numbers are sketched below.
Local regulation would be required, but that would need politicians who don’t suck.
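For anyone curious about the economics the comment above is pointing at, here is a back-of-the-envelope comparison. COP here means heat removed per unit of electricity used by the cooling plant; the prices, COP values, and latent-heat figure are all assumptions chosen for illustration, not measured data.

```python
# Back-of-the-envelope: cost to reject 1 kWh of heat, evaporative
# cooling tower vs. mechanical chiller. Every price and COP here is
# an illustrative assumption, not a measured figure.

ELEC_PRICE_PER_KWH = 0.10    # $/kWh of electricity (assumption)
WATER_PRICE_PER_M3 = 3.00    # $/m^3 of municipal water (assumption)
KWH_HEAT_PER_KG_EVAPORATED = 2.4e6 / 3.6e6   # ~0.67 kWh of heat removed per kg of water

def chiller_cost_per_kwh_heat(cop=4.0):
    """Mechanical chiller: pays only for electricity."""
    return ELEC_PRICE_PER_KWH / cop

def evaporative_cost_per_kwh_heat(cop=15.0):
    """Cooling tower: fan/pump electricity plus the water that evaporates."""
    electricity = ELEC_PRICE_PER_KWH / cop
    water_kg = 1.0 / KWH_HEAT_PER_KG_EVAPORATED        # kg evaporated per kWh of heat
    water = (water_kg / 1000.0) * WATER_PRICE_PER_M3   # 1 m^3 of water ~ 1000 kg
    return electricity + water

print(f"chiller:     ${chiller_cost_per_kwh_heat():.4f} per kWh of heat")
print(f"evaporative: ${evaporative_cost_per_kwh_heat():.4f} per kWh of heat")
# With these assumptions the evaporative route costs roughly half as
# much per unit of heat, which is why firms keep buying and evaporating water.
```

Actual prices and COPs vary a lot by region and facility, but the gap is wide enough that only regulation or much more expensive water changes the calculation.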
I worked 10 years at a data center, and all that water is recycled. It is very carefully chemically balanced so as not to corrode the pipes and pumps. No, they do not use it once and dump it out.
Because the massive stacks of high-powered chips that they use tend to get very hot. They don’t use the kind of computers that work through passive cooling.
I say, as my laptop burns into my lap.