
How (and Why) Facebook Excels at Data Center Efficiency

Facebook's data centers are 'second to none' thanks to a homogeneous and nimble design focused on efficiency and cost savings.

Efficiency is at the core of Facebook's ability to scale, according to Jason Taylor, the company's vice president of infrastructure. There are never more than five types of servers in use at any of Facebook's data centers, which span five regions, and no servers or infrastructure sit dormant waiting for services to launch, Taylor told analysts yesterday at the Credit Suisse Technology Conference.

Given the volume of new data Facebook processes every day -- 930 million photo uploads, 6 billion likes and 12 billion messages -- even small efficiency gains compound into outsized cost savings.

Taylor says Facebook focuses on cutting costs in three areas: its data centers, its servers and its software. That effort begins with managing the heat given off by the servers that power its 1.35 billion monthly active users.

Data centers housed in facilities without purpose-designed heat management can easily incur an energy overhead of up to 90 percent for every watt delivered to each server, Taylor says. "Now at Facebook, because we've designed both our own servers and data centers, that heat tax is only seven percent... We are using cold air from the outside. We're not chilling air at all. We're passing it across the servers, mixing it in a hot aisle and then evacuating out the other side of the building."

"In terms of raw thermal efficiency, our data centers are second to none," says Taylor.

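Taylor's "heat tax" maps onto power usage effectiveness (PUE), the standard ratio of total facility power to power delivered to IT equipment. As a rough sketch of that arithmetic, plugging the overhead figures quoted above into the definition (the helper below is purely illustrative, not anything Facebook publishes):

def pue_from_overhead(overhead_fraction: float) -> float:
    # PUE = total facility power / IT power, treating the quoted "heat tax"
    # as overhead energy per watt delivered to the servers.
    return 1.0 + overhead_fraction

print(pue_from_overhead(0.90))  # conventional facility: about 1.9
print(pue_from_overhead(0.07))  # Facebook's quoted figure: about 1.07

On that reading, cutting overhead from 90 percent to 7 percent trims total energy use for the same server load by a bit over 40 percent.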
Facebook's Adherence to Open Source

Facebook's data center design, which is open source as part of its Open Compute Project, also delivers savings through a deliberately homogeneous approach to computing. The advantages include volume pricing, easier repurposing, simpler operations and repairs, and the ability to allocate servers in hours rather than months. Homogeneous infrastructure also makes it easier for technicians to optimize systems on the fly, according to Taylor.

Facebook's third major efficiency win comes from its software, which Taylor says is "absolutely critical" in delivering efficient infrastructure. "Software is far more flexible than hardware. You pay for hardware and your hardware becomes inefficient when you have a lot of variation," he says, adding that most of Facebook's core software is also open source.

"We really believe that the entire industry can benefit from efficiency work that we do, and that we can benefit from the industry feeding back and contributing new ideas and designs," Taylor says. "Fundamentally our company is going to win or lose based on our product, the cost of our infrastructure, and cost efficiency wins on infrastructure are something we'd like the entire industry to benefit from."

While the technician-to-server ratio at a typical data center is around 1 to 450, the ratio in Facebook's facilities falls somewhere between 1 to 15,000 and 1 to 20,000, according to Taylor.

Facebook's Exponential Rise in Network Bandwidth

Taylor says that the amount of network bandwidth available at a reasonable price is increasing dramatically. The standard 1Gb (gigabit) server connections Facebook used when he joined the company in 2009 were upgraded to 10Gb in 2011, and they'll be swapped out again for 25Gb boxes within the next two years.

"I'd say within three years we'll have 100Gb servers," Taylor says. "So over around a six-year period, going 100 times up in the amount of bandwidth that's available. I think network and improvements in networking is going to be the largest driver toward changes in how large-scale Internet companies work."

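A quick sketch of the arithmetic behind that "100 times up" claim, using the per-server bandwidth steps quoted above (the list below is illustrative, and the later years are inferred from Taylor's "within two years" and "within three years" remarks):

# Per-server network bandwidth steps Taylor describes, in gigabits per second:
# 2009, 2011, roughly 2016 and roughly 2017.
steps_gbps = [1, 10, 25, 100]

print(f"Overall increase: {steps_gbps[-1] / steps_gbps[0]:.0f}x")   # 100x
for before, after in zip(steps_gbps, steps_gbps[1:]):
    print(f"{before} -> {after} Gbps: a {after / before:.1f}x step")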