
Facebook's Data Center: Where Likes Live

To highlight this openness, Facebook recently sponsored a tour of Building Two, led by Crass. At DPR Construction, which builds data centers and other advanced buildings for financial services, healthcare and technology companies, the Facebook example is having an impact. "Facebook has taken the lid off the secrecy about how to bring power and cooling into a modern data center," Andres said, and its example is being copied by other leading data center builders.

Who, you might ask? Standing on the roof of the Prineville facility, Crass looks to the south and can see another mega data center going up next door. At 338,000 square feet, it appears to resemble Facebook's. It's being built by Apple.

One of the keys to power conservation in the mega data center is cooling by evaporation rather than by air-conditioning compressors. There are no industrial air conditioners -- chillers, they're called in data center circles -- in the Prineville facility. Some advanced data centers, such as those built by Vantage in Silicon Valley, install chillers as insurance against hot spells. But in Prineville, the dry desert air is a boon to evaporation, and Facebook has built what has to be one of the most massive air-movement systems in the world to keep its servers at the temperature it wants.

The air warmed by running disks and servers rises into a plenum above the equipment. It's siphoned off to mix with cooler outside air, then pushed through a deep set of air filters -- thick, porous, soft corrugated paper -- that look like square versions of the round filters that used to sit atop a car engine's carburetor.

The air is then pushed through something called Munters Media, an absorbent cellulose material over which a small amount of water trickles down vertical, wavy slats. The air cools as it picks up some of that moisture, and Facebook's sensitive equipment faces a reduced risk of static electrical shock because the outside air is cleansed, cooled and humidified. A product of a Swedish company, Munters Media was used by European farmers in simple fan-and-airflow cooling systems for poultry and cattle before being drafted into advanced data centers.
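Where the outside air is dry, the effect of this kind of direct evaporative cooling can be approximated with the standard saturation-effectiveness relation. The sketch below is a rough illustration of that math, not Facebook's control logic; the effectiveness figure and the example temperatures are assumptions.

```python
def evaporative_supply_temp(dry_bulb_f: float, wet_bulb_f: float,
                            effectiveness: float = 0.9) -> float:
    """Estimate supply-air temperature from a direct evaporative cooler:
    T_supply = T_dry_bulb - effectiveness * (T_dry_bulb - T_wet_bulb).
    The 0.9 effectiveness default is an assumed, typical value."""
    return dry_bulb_f - effectiveness * (dry_bulb_f - wet_bulb_f)


if __name__ == "__main__":
    # A hot, dry high-desert afternoon: 86 F dry bulb, 60 F wet bulb (assumed).
    print(f"Supply air: {evaporative_supply_temp(86.0, 60.0):.1f} F")  # ~62.6 F
```

The drier the air, the bigger the gap between dry-bulb and wet-bulb temperature, and the more cooling the trickling water can deliver.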

Server fans draw the cool air over server motherboards that have been designed to be long and narrow rather than the common rectangular shape. Components are arranged to encourage air flow, and unnecessary components, like video and graphics processors, are stripped away. Instead of acting as dams to the air flowing over the hot CPU, the memory chips are aligned parallel with the direction of air flow. Warm air from two rows of servers, standing back to back, is exhausted into a shared hot aisle, rises to the plenum, and is either pushed out of the building or sent back to the ambient-air mixing room to restart its journey.

A Facebook Server Suite

Crass, pictured here in one of his "server suites," declined to pump up the processes he supervises. At its heart, he said, the building management system is "deciding how much to open the dampers to bring in the cool air, putting it in the data center and getting rid of the hot air." It's also pushing air through banks of filters, controlling the valves that let water flow to the Munters Media, monitoring the humidity as well as the temperature of the data center air, and adjusting the speeds of the six-foot fan blades that exhaust air from the last in a row of rooftop chambers.
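That supervisory logic amounts to a continuous control loop over dampers, water valves and fan speeds. The sketch below is an illustrative, much-simplified version of such a loop, not Facebook's actual building management system; the sensor names, setpoints and proportional gains are all assumptions.

```python
COLD_AISLE_TARGET_F = 80.0   # assumed cold-aisle setpoint
HUMIDITY_TARGET_PCT = 45.0   # assumed relative-humidity setpoint


def clamp(x: float, low: float = 0.0, high: float = 1.0) -> float:
    return max(low, min(high, x))


def control_step(outside_f: float, hot_aisle_f: float,
                 cold_aisle_f: float, humidity_pct: float):
    """Return (damper, water_valve, fan_speed), each on a 0.0-1.0 scale,
    from simple proportional rules."""
    # Open the outside-air dampers wider as the cold aisle drifts above target.
    damper = clamp(0.5 + (cold_aisle_f - COLD_AISLE_TARGET_F) / 10.0)
    # Let more water trickle over the evaporative media when the air is dry.
    water_valve = clamp(0.5 + (HUMIDITY_TARGET_PCT - humidity_pct) / 20.0)
    # Spin the rooftop exhaust fans faster as the hot aisle heats up.
    fan_speed = clamp((hot_aisle_f - outside_f) / 30.0)
    return damper, water_valve, fan_speed


if __name__ == "__main__":
    # A warm afternoon: 82 F outside, 98 F hot aisle, 81 F cold aisle, 30% RH.
    print(control_step(82.0, 98.0, 81.0, 30.0))
```

A real system would layer in safety interlocks, sensor redundancy and careful tuning far beyond these one-line rules, but the division of labor is the same: measure the air, then nudge dampers, valves and fans.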

The building's design lets fans and passive systems accomplish most of the work. Once humidified and cooled, the air flows down a nine-by-nine-foot chute built between the filter and Munters Media rooms to the cold aisle of the data hall below. As it falls, it hits a big deflector plate at the ceiling level of the cold aisle and is spread out over the tops of the running servers, where it's drawn across the warm server motherboards.

In earlier data center design, "the cool air flowed down to the floor of the data center. That didn't work as well. It didn't scatter as far and you need it at the top of the server rack, not just the bottom," said Crass.

The air-flow cooling also works because of the cool temperatures in the high desert. On average, the hottest days of July and August do not go above 86 degrees. As the Prineville complex was being designed, the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) raised the acceptable temperature for air used to cool computer equipment to 80.6 degrees. The conventional wisdom had held that it was better to run computer equipment with air-conditioned air that might be in the low 60s or even 50s as it emanated from the chiller.

The Facebook data center runs hotter than many enterprise data centers at the peak of the summer, but its servers are not enclosed in metal cases; rather, they sit on simple pull-out trays in server racks that maximize the air flow over their components.

Crass thinks the electrical equipment in his data hall can be adequately cooled with 85-degree air, given the powerful air flows possible with the building's design, and Facebook engineers have raised the upper limit of the cold aisle to that temperature without ill effects. The higher the operating temperature, the less energy needs to be poured into the pumps that move water and into the cooling fans.
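The fan affinity laws give a feel for why that matters: airflow scales roughly linearly with fan speed, while fan power scales roughly with the cube of speed, so even a modest slowdown buys a large energy saving. The sketch below illustrates that general relationship; the speed figure is an assumption, not a measurement from Prineville.

```python
def fan_power_fraction(speed_fraction: float) -> float:
    """Fan affinity law: power scales roughly with the cube of speed."""
    return speed_fraction ** 3


if __name__ == "__main__":
    # If a warmer cold-aisle setpoint lets the exhaust fans run at 80%
    # of full speed (an assumed figure), fan power drops to about half.
    print(f"Power fraction at 80% speed: {fan_power_fraction(0.8):.2f}")  # 0.51
```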