InfiniBand will reach 200-gigabit speed next year

Mellanox will ship a set of products for end-to-end 200Gbps links for HPC and more

InfiniBand is set to hit 200Gbps (gigabits per second) in products announced Thursday, potentially accelerating machine-learning platforms as well as HPC (high-performance computing) systems.

The massive computing performance of new servers equipped with GPUs calls for high network speeds, and these systems are quickly being deployed to handle machine-learning tasks, Dell’Oro Group analyst Sameh Boujelbene said.

So-called HDR InfiniBand, which will be generally available next year in three sets of products from Mellanox Technologies, will double the top speed of InfiniBand. It will also have twice the top speed of Ethernet.

But the high-performance crowd that’s likely to adopt this new interconnect is a small one, Boujelbene said. Look for the top 10 percent of InfiniBand users, who already use 100Gbps InfiniBand, to jump on the new stuff, she said.

InfiniBand itself makes up just a small part of data-center networking. Less than five percent of all server controllers and adapters shipping these days use InfiniBand, with most of the rest using Ethernet, Boujelbene said. Intel Omni-Path, a 100Gbps interconnect that grew out of the company’s acquisition of onetime InfiniBand vendor QLogic, has an even smaller slice.

One thing that sets InfiniBand apart from Ethernet is its low and predictable latency. This matters a lot in HPC, where it’s common to run simulations that go through many iterations. Any delay in performing an iteration is compounded thousands of times, so it needs to be minimal, said Gilad Shainer, Mellanox’s vice president of marketing.
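To see why that compounding matters, here is a rough back-of-the-envelope sketch; the iteration count, compute time and latency figures below are hypothetical illustrations, not Mellanox or benchmark numbers.

```python
# Sketch of how per-iteration interconnect latency compounds in an
# iterative HPC job. All numbers below are illustrative assumptions.

def total_runtime(iterations, compute_us, network_latency_us):
    """Total wall-clock time (seconds) if every iteration pays one
    round of compute plus one round of interconnect latency."""
    per_iteration_us = compute_us + network_latency_us
    return iterations * per_iteration_us / 1e6

ITERATIONS = 100_000   # e.g. time steps in a simulation (assumed)
COMPUTE_US = 50        # per-iteration compute time, microseconds (assumed)

for latency_us in (1, 10, 50):   # assumed interconnect latencies
    t = total_runtime(ITERATIONS, COMPUTE_US, latency_us)
    print(f"latency {latency_us:>3} us -> total runtime {t:6.1f} s")
```

Even a few extra microseconds per round trip, multiplied by hundreds of thousands of iterations, stretches a job by seconds or minutes, which is why predictable low latency is the selling point here.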

HPC systems are the most likely to have InfiniBand today. More than 200 of the huge systems on the Top500 list of supercomputers for 2016 use the interconnect. But it’s also being adopted for machine learning, big data, financial services, fraud detection and other applications, Shainer said.

The HDR InfiniBand product lineup has everything needed for an end-to-end 200Gbps data-center interconnect, Mellanox says. It includes Mellanox’s ConnectX-6 adapters, Quantum switches and LinkX cables and transceivers.

The adapters can connect with any server CPU architecture, including x86, Power and ARM. The switch can have 40 ports of 200Gbps InfiniBand or 80 ports of 100Gbps, with a total of 16Tbps of switching capacity.
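As a quick sanity check of those figures (my own arithmetic, not Mellanox’s spec sheet), the port counts line up with the quoted 16Tbps if that capacity figure counts both directions of traffic:

```python
# Checking the quoted switch specs: 40 x 200Gbps or 80 x 100Gbps ports
# against 16Tbps of switching capacity, assuming the capacity figure
# is bidirectional.

PORTS_200G = 40       # 200Gbps HDR ports
PORTS_100G = 80       # 100Gbps ports in the alternative configuration

one_way_tbps = PORTS_200G * 200 / 1000        # 8.0 Tbps per direction
bidirectional_tbps = one_way_tbps * 2         # 16.0 Tbps both ways

# Either port configuration yields the same aggregate bandwidth.
assert PORTS_100G * 100 / 1000 == one_way_tbps

print(one_way_tbps, bidirectional_tbps)       # 8.0 16.0
```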
