
Google: Our data centers use half the energy

Google has dutifully measured energy consumption for years in pursuit of an efficient fleet of data centers. Those efforts and innovations, like its seawater-cooled data center in Finland, have paid dividends.
Written by Kirsten Korosec, Contributor

As cloud-computing and social networking services have surged, so has demand for power-hungry data centers. That's prompted search engine giant Google to constantly measure the performance of its numerous data centers and take bold (as well as simple) steps to improve efficiency.

The upshot? Google's data centers use 50 percent less energy than those of its competitors, Joe Kava, the company's senior director of data center construction and operations, said in a blog post today that coincided with the company's quarterly power use report.

A data center is more efficient when less of its energy goes to running the facility and more goes to powering the IT equipment. So, Google has made tracking its power usage effectiveness (PUE) -- the ratio of a data center's total energy use to the amount used by its servers -- one of its key measurements. If a data center has a PUE of 2.0, then for every watt of energy that powers the servers, another watt powers the cooling, lighting and other systems.

Most data centers spend as much energy on overhead such as cooling and power conversion as they do on powering their servers, according to Google. An ideal PUE would be 1.0, Kava said.

In 2011, the trailing 12-month PUE for all Google data centers was 1.14, a nearly 2 percent improvement from the previous year. Translation: the search engine giant's data centers use only 14 percent additional power for all sources of overhead combined, meaning the vast majority of their energy powers the machines that directly serve Google searches and products.
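To make that arithmetic concrete, here is a minimal sketch in Python. The function name and the input figures are purely illustrative (they are not Google's code or data); it simply shows how the PUE ratio relates total facility energy to IT energy for the typical 2.0 case and the reported 1.14 case.

```python
def pue(total_facility_energy_kwh: float, it_equipment_energy_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by IT equipment energy."""
    return total_facility_energy_kwh / it_equipment_energy_kwh

# Typical data center: one unit of overhead for every unit of IT load -> PUE of 2.0
print(pue(total_facility_energy_kwh=2.0, it_equipment_energy_kwh=1.0))   # 2.0

# Fleet-wide figure reported in the article: 1.14 units of total energy per unit of
# IT load, i.e. just 14 percent extra for cooling, lighting and conversion losses.
print(pue(total_facility_energy_kwh=1.14, it_equipment_energy_kwh=1.0))  # 1.14
```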

Google says its calculations include the performance of its entire data center fleet, not just its newest facilities. The company also includes all sources of overhead in its efficiency metric and takes measurements in all seasons. That means Google accounts for the electricity used to power the servers and cooling systems as well as the oil and natural gas that heats its offices. Kava said the company also includes system inefficiencies such as transformer, cable and UPS losses. The numbers are based on production data taken from hundreds of meters installed throughout Google's data centers.

Some of Google's innovative data center projects include:

  • Seawater-cooled data center in Finland. In 2009, Google bought an old 1950s-era paper mill that was conveniently equipped with a quarter-mile-long seawater tunnel. The company created a system that takes raw seawater from the Gulf of Finland and runs it through heat transfer units, which cool the servers. The used seawater is later mixed with a fresh, cold batch before it's returned to the gulf to minimize environmental impact.
  • Evaporative cooling with recycled wastewater in Georgia. Google built an evaporative cooling system at its Georgia data center that sprays chilled water to cool its servers. The novel part is Google's decision to tap recycled wastewater as its source: the company built a plant that funnels up to 30 percent of the wastewater from a Douglas County Water and Sewer Authority facility into its data center cooling system. Google sends any water that hasn't evaporated to a cleaning plant, where it's eventually pumped back into the Chattahoochee River.
  • Five simple efficiency measures. Google has released five simple innovations it developed and still uses to make its data centers more efficient. One of its key recommendations: turn up the thermostat.
  • Free cooling in Belgium. Google relies entirely on fresh air for cooling at its Belgium data center. The system works most of the year and has helped the facility become the company's most efficient data center. During hot weather, temperatures inside the data center can rise above 95 degrees Fahrenheit, making it unsuitable for workers; when that happens, Google employees move from the server area to climate-controlled sections of the building.

Photo: brcwcs via stock.xchng

This post was originally published on Smartplanet.com