
Facebook's Data Center: Where Likes Live

The road to one of the world's mega data centers is lined with scattered sagebrush, tumbleweeds and gnarly, undersized junipers.

No crops grow without irrigation in the central Oregon high desert; it's not promising territory for large populations. But it serves as home to the most heavily populated Internet application, the 901 million user-strong Facebook.

Every "like," comment or other befriending action on Facebook, at least by users on the West Coast, is executed inside one of two massive 330,000-square-foot "data halls" in a complex on a mesa outside Prineville, Ore. The buildings are identical; if each could be tipped on its end, you would see two 81-story skyscrapers. They are the first data centers that Facebook has designed, owned and operated -- and they're built to a whole new scale.


From a distance, the unfenced complex has a remarkably open look. There is a manned security gate, but Building Two, closest to the gate, has an accessible air about it. The glass of the office space shows up prominently in the middle of the building, and from a distance it appears that a series of big cards line the approach to the front door. Those "cards" are two-story concrete slabs backed by heavy steel supports. The office space itself is fronted by a stone wall, four feet high and equally thick, contained in heavy wire mesh. No vehicle is going to get too close to the more vulnerable parts of the building.

Building Two

Although it isn't self-evident, "there's a pretty sophisticated security system both in the building and on the premises," said Andy Andres, project executive on the Facebook job for the building's co-builder, DPR Construction of Redwood City, Calif. DPR teamed up with Fortis Construction of Portland, Ore., to do the project.

The Prineville complex -- there's room for a third 330,000-square-foot data hall on the site -- is Facebook's answer to leasing data center space in the Silicon Valley, which is what it did before Prineville Building One opened at the end of 2010. It's building another center on the Prineville pattern in Forest City, N.C., slated to begin operations next year. Facebook also has one operating in Lulea, Sweden, where cheap hydropower abounds. In each case, it's striving for data centers that are highly efficient to operate in terms of automation, energy consumption and staffing. The mammoth Prineville buildings operate with a total of 72 people (although the second data hall is still being equipped).

The design, using ambient air for cooling, has cut energy consumption so much that Prineville was named the number one green engineering project of 2011 by the U.S. State Department's director of office construction and other judges selected by Engineering News-Record, a McGraw-Hill publication.

But it stands out in another way. Until recently, the giants of the Internet -- Amazon.com, eBay, Google -- didn't talk about how they built their data centers or the servers inside. These service-oriented data centers -- the first "cloud" data centers -- were different; the servers that went into them were stripped down and optimized in ways that distinguished them from servers that sit in a business department or enterprise data center.

Intel officials recognized where this class of server was being employed -- in the new style of data centers being built by Google, Microsoft, Apple and similar companies. Based on the demand it saw unfolding for related server components, Intel calculated at the end of 2011 that $450 billion a year was being spent on data center construction. Those data centers fuel the iPhone apps, instant searches, eBay trades and Amazon.com e-commerce that make up the unfolding digital economy.

Thus, facilities like Prineville matter. If the world has an expanding appetite for compute power, it's important that the data centers providing the backend services be added to the environment in the most efficient manner possible. The Prineville facility is LEED Gold-certified, meaning it has undertaken industry-leading power-conserving measures. Where the typical enterprise data center pipes in twice as much power as its computing devices actually need, Prineville lowers that ratio -- its power usage effectiveness, or PUE -- to 1.06 or 1.07, one of the best figures established anywhere.
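PUE is a straightforward ratio: total power delivered to the facility divided by the power actually consumed by the IT equipment. A minimal sketch of that arithmetic, using hypothetical loads chosen only to reproduce the ratios quoted above:

    def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
        """Power usage effectiveness: total facility power / IT equipment power."""
        return total_facility_kw / it_equipment_kw

    # Hypothetical figures for illustration only; only the resulting ratios
    # match the numbers cited in the article.
    it_load_kw = 10_000                    # power drawn by servers, storage, network gear

    typical_enterprise = pue(20_000, it_load_kw)  # ~2.0: as much power goes to cooling and
                                                  # distribution losses as to computing
    prineville_class = pue(10_700, it_load_kw)    # ~1.07: only about 7% overhead for cooling,
                                                  # lighting and power conversion

    print(f"Typical enterprise PUE: {typical_enterprise:.2f}")
    print(f"Prineville-class PUE:   {prineville_class:.2f}")

A PUE of 1.0 would mean every watt entering the building reaches the computing gear; the closer to 1.0, the less energy is lost to overhead.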

While some of the features the data center incorporates were invented by others, Facebook is unique in publishing the details of its designs and specifications. In April 2011, Facebook founded the Open Compute Project, where it makes the designs for its servers available as open source information. "We feel our competitive advantage is in our software and the service we deliver," not in the design of the data center, said Tom Furlong, Facebook director of site operations, in the announcement of the Engineering News-Record award.

Joshua Crass, Prineville data center manager and a former Google operations team manager from 2006 to 2010, had a more down-to-earth way of summing up the difference: "When I was working at Google, my wife never saw the inside of my office. Here my two kids come in and play" around Building Two's sprawling open office space.
