Say you can’t move your entire data center to the edge of the Arctic Circle, where it’s constantly frigid and the electricity is cheap. How do you cool the furnace-like heat that big data-storage spaces generate?
Use the hot water from some computers to drive the refrigeration of others, IEEE Spectrum reports.
“The electrical energy that goes into the computer is converted into heat, essentially, and if you could reuse that heat somehow, then you recover a large part of the energy and the cost that you put into this,” says Tilo Wettig at the University of Regensburg in Germany.
Basically, that waste heat can be used to drive a special kind of refrigerator called an adsorption chiller. The chillers produce cold water, which is then used to cool other computers in the data center or provide air-conditioning for the site’s human workers.
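An adsorption chiller converts hot-water heat into chilled water with a thermal coefficient of performance (COP) typically below 1. As a minimal sketch of that conversion (the COP value here is a generic assumption for adsorption chillers, not a figure from the article):

```python
# Minimal sketch of adsorption-chiller cooling output.
# COP_THERMAL is a generic assumption (adsorption chillers often run around 0.5);
# it is NOT a measured figure from the iDataCool study.
COP_THERMAL = 0.5

def cooling_output_kw(waste_heat_kw: float, cop: float = COP_THERMAL) -> float:
    """Chilled-water cooling capacity produced from hot-water waste heat."""
    return waste_heat_kw * cop

print(cooling_output_kw(40.0))  # 40 kW of waste heat -> 20.0 kW of cooling
```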
The Regensburg team spent over a year operating a new liquid-cooling technology meant for large-scale, piping-hot computer clusters. They call their research platform iDataCool.
The key innovation of iDataCool is its low-cost, custom-designed copper heat sink. Water flows through the heat sink, drawing heat away from the processor, and the sink is hard-soldered to a copper pipeline that also connects to heat sinks affixed to other components in the system, such as memory and voltage controllers.
Energy from all this hot water (above 65 degrees Celsius) drives the adsorption chiller, which feeds a separate cooling loop in the data center.
The system recovered about 25 percent of the energy that would otherwise have been lost; the researchers say better thermal insulation could push that to 50 percent.
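To see what those percentages mean in practice, here is a back-of-the-envelope sketch; the 100 kW cluster size is a hypothetical illustration, not a figure from the Regensburg study:

```python
# Back-of-the-envelope energy-recovery estimate.
# The cluster size below is hypothetical, not an iDataCool measurement.

def recovered_power(it_load_kw: float, recovery_fraction: float) -> float:
    """Nearly all electrical power entering the servers leaves as heat;
    the hot-water loop recovers a fraction of that heat as useful energy."""
    return it_load_kw * recovery_fraction

cluster_kw = 100.0  # hypothetical 100 kW compute cluster

today = recovered_power(cluster_kw, 0.25)      # ~25% recovered, as reported
insulated = recovered_power(cluster_kw, 0.50)  # ~50% with better insulation

print(f"Recovered now:   {today:.0f} kW")      # 25 kW
print(f"With insulation: {insulated:.0f} kW")  # 50 kW
```

For such a hypothetical cluster, the difference between the two figures is 25 kW of continuous cooling or heating capacity that would otherwise be thrown away.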
Most of our laptops are cooled by air, often with fans. Liquid-cooled systems are more expensive and require more maintenance (a leak can cause irreparable damage to circuits), so their biggest deterrent might be the fear of putting up a little more money for a system that isn't widely used.
The results were presented at the International Supercomputing Conference in Leipzig, Germany, last week.
[Via IEEE Spectrum]