The Bulletin

Waste heat from computers will cool other computers

Say you can’t move your entire data center to the edge of the Arctic Circle, where it’s constantly frigid and the electricity is cheap. How do you cool the furnace-like heat that big data-storage spaces generate?

Use hot water from some computers to drive the refrigeration of others, IEEE Spectrum reports.

“The electrical energy that goes into the computer is converted into heat, essentially, and if you could reuse that heat somehow, then you recover a large part of the energy and the cost that you put into this,” says Tilo Wettig at the University of Regensburg in Germany.

Basically, that waste heat can be used to drive a special kind of refrigerator called an adsorption chiller. The chillers produce cold water, which is then used to cool other computers in the data center or provide air-conditioning for the site’s human workers.

The Regensburg team spent over a year operating a new hot-water liquid-cooling technology designed for large-scale computer clusters. They call their research platform iDataCool.

Since the project was carried out in collaboration with the IBM Research & Development Laboratory, the researchers based it on IBM’s iDataPlex server system, designed for high-performance computing.

The key innovation of iDataCool is its low-cost, custom-designed copper heat sink, through which water flows, drawing heat away. The processor heat sink is hard-soldered to a copper pipeline carrying the water, and that pipeline also connects to heat sinks affixed to other components in the system, such as memory and voltage controllers.

Energy from all this hot water (above 65 degrees Celsius) drives the adsorption chiller, which feeds a separate cooling loop in the data center.

The energy recovered was about 25 percent of what would’ve been lost (better thermal insulation could get it to 50 percent).
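Those percentages translate into real power at data-center scale. Here is a minimal back-of-the-envelope sketch of the recovery figures; the 100 kW cluster load is a hypothetical number for illustration, not one from the iDataCool project:

```python
# Illustrative calculation of waste-heat energy recovery.
# Assumption: a hypothetical 100 kW cluster; the 25% and 50%
# recovery fractions come from the article.

cluster_power_kw = 100.0    # hypothetical IT load of a cluster
recovery_fraction = 0.25    # ~25% recovered, as measured
improved_fraction = 0.50    # ~50% possible with better thermal insulation

recovered_kw = cluster_power_kw * recovery_fraction
potential_kw = cluster_power_kw * improved_fraction

print(f"Recovered today:        {recovered_kw:.0f} kW")
print(f"With better insulation: {potential_kw:.0f} kW")
```

So a hypothetical 100 kW cluster would feed roughly 25 kW of otherwise-wasted heat into the chiller loop today, and up to about 50 kW with improved insulation.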

Most of our laptops are cooled by air, often with fans. Liquid-cooled systems are more expensive and require more maintenance (water causes irreparable damage to circuits), so liquid cooling's biggest deterrent might be the fear of putting up a little more money for a system that isn't widely used.

The results were presented at the International Supercomputing Conference in Leipzig, Germany, last week.

[Via IEEE Spectrum]

Image: SuperMUC (with a similar warm water cooling system) / LRZ

— By Janet Fang on June 27, 2013, 6:33 AM PST

Janet Fang

Contributing Editor

Janet Fang has written for Nature, Discover and the Point Reyes Light. She is currently a lab technician at Lamont-Doherty Earth Observatory. She holds degrees from the University of California, Berkeley and Columbia University. She is based in New York.