Facebook walks on optical networking's wild side

The social network is using technologies that others may take years to adopt

As a company that draws more than 2 billion eyeballs per month, Facebook was a fitting harbinger of trends to come at an optical networking conference.

The social networking goliath is lighting up its own optical fiber, deploying 100Gbps links in its data centers and looking toward emerging silicon photonics technology, Facebook Director of Technical Operations Najam Ahmad said on Monday at an Optical Society of America meeting held alongside the annual OFC (Optical Fiber Communication) Conference in San Francisco.

Facebook's challenges mirror those of other enterprises and data center operators: fast-growing data traffic and rapidly evolving network needs. But with 1.2 billion monthly active users, it's facing those issues sooner than some. Though the company is unique in some ways, Ahmad's comments in an on-stage interview may shed some light on the future of connectivity.

The biggest traffic at Facebook isn't in and out of its data centers but among the servers within them, Ahmad said. That's because every time a user logs in to the site, hundreds or thousands of servers are called into action to compute different parts of that user's News Feed on the fly. This so-called "east-west" traffic is also growing faster as apps on Facebook become more complex and user interactions grow richer, he said.
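
To illustrate the fan-out Ahmad describes, consider a minimal sketch of the scatter-gather pattern behind a single feed request. This is a hypothetical illustration in Python, not Facebook's actual code; the shard count, timings and function names are invented for the example:

```python
import asyncio
import random

async def fetch_shard(shard_id: int) -> list[str]:
    """Stand-in for one backend shard computing its slice of a user's feed."""
    await asyncio.sleep(random.uniform(0.01, 0.05))  # simulated RPC latency
    return [f"story-{shard_id}-{n}" for n in range(3)]

async def build_feed(user_id: str, num_shards: int = 200) -> list[str]:
    # Scatter: one concurrent request per shard. This server-to-server
    # "east-west" traffic dwarfs the single response sent back to the user.
    partials = await asyncio.gather(*(fetch_shard(s) for s in range(num_shards)))
    # Gather: merge and order the partial results into one feed.
    return sorted(story for shard in partials for story in shard)

if __name__ == "__main__":
    feed = asyncio.run(build_feed("example-user"))
    print(f"Assembled {len(feed)} stories from {200} parallel shard calls")
```

Every login triggers hundreds of such internal calls, which is why traffic among servers grows faster than traffic to and from users.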

So while many enterprises are deploying 10-Gigabit Ethernet today, that's already the minimum at Facebook.

"We haven't deployed anything less than a 10-Gig for about two years now," Ahmad said. Those are the links to servers themselves, and upstream from that, Facebook is using 40-Gigabit Ethernet and a few 100-Gigabit links. It's mostly on 40-Gigabit now because 100-Gigabit is still too expensive, he said. But within a year or two, Ahmad expects the company's fast pipes to become predominantly 100-Gigabit.

Facebook's computing needs are growing so quickly that the company no longer builds just one data center at a time, Ahmad said.

"Now what we do is we buy land, we build one building, and then a second, a third, and a fourth," he said. "All of a sudden, what we've done is build a campus. So our optical needs change slightly."

To link, say, four data centers spread across a campus of 10 to 20 acres, Ahmad would like a fiber technology that can span one or two kilometers and carry 100Gbps to start, then 200Gbps and 400Gbps as traffic grows over time. For that, he envisions connections over single-mode fiber rather than the multimode fiber most commonly used in data centers today, whose reach at those speeds is typically only on the order of 100 to 150 meters.

Such a system is likely to use silicon photonics, an emerging technology that applies the world's most common semiconductor material to optical connectivity, Ahmad said. Facebook is also exploring silicon photonics because it wants to adopt so-called rack-level computing, in which compute, storage and memory resources are concentrated in separate racks and connected at high speed to form the equivalent of many servers. PCIe is another option for those connections, he added.

For the long-distance links between its growing data centers, Facebook is starting to buy and light up its own "dark fiber" capacity instead of leasing connections on carriers' networks. One reason is that leasing the kind of bandwidth Facebook needs is sometimes more expensive than just buying the fiber capacity. But by controlling the fiber itself, the company also can respond to rapid changes in traffic, he said.

"Leasing capacity takes too long. Usually four to six weeks," Ahmad said. "We want to be able to do it in four to six minutes."

That complaint echoed comments by others at the daylong conference. Quicker provisioning was one reason that colocation provider Layer 42 Networks got its own fiber, which it uses to sell network capacity alongside its other services, CEO Derek Garnier said. It's also something BT hopes eventually to achieve through software-defined networking, said Andrew Lord, head of optical research at the U.K. carrier.

Stephen Lawson covers mobile, storage and networking technologies for The IDG News Service. Follow Stephen on Twitter at @sdlawsonmedia. Stephen's e-mail address is stephen_lawson@idg.com