Cloud Standards: Trickier than Nailing Jelly to a Wall

Open cloud computing standards and specifications now could help IT avoid vendor lock-in later. Major virtualization vendors and the DMTF came together this week to accelerate work on standards, but the early plans still need "a lot of work," analysts say.

Just try creating a definition of cloud computing that's broad enough to encompass all its permutations and narrow enough to provide technical guidance on how to get one cloud talking to another.

At least three years after the phrase "cloud computing" was first used to describe on-demand computing services (though possibly eight, arguably 11 and, somewhat untenably, one), nearly every conversation with experts still begins by establishing what you both mean by it.

"At Burton Group it took a meeting where all the analysts were present over two days to come up with a definition, and it ended up being very small, so it could encompass everything," according to Chris Wolf, analyst at the Burton Group.

"Not always, but usually you're talking about either software as a service, infrastructure services, or running an application on someone else's cloud," according to James Staten, analyst for Forrester Research. "Within that there are a lot of variations. There are no standard pricing models, standard offerings, definitions of terms-if enterprises are going to consume this, they have to know what they're getting. There have to be things that will be the same from cloud to cloud," Staten says.

The Distributed Management Task Force (DMTF) may not solve the jelly problem, or even settle the definition of cloud computing, but it is working on a set of specifications that should give both cloud providers and customers a common language for describing what services a cloud offers and how to make use of them, according to Winston Bumpus, president of the DMTF and director of standards architecture at VMware.

The DMTF announced this week that it has formed a working group, the Open Cloud Standards Incubator, to lay out a set of specifications covering how cloud-computing platforms describe their own operations and interoperate with one another.

Key among the desired requirements will be common terms describing the capabilities or services one cloud supports, the quality of service it offers, protocols for customers to provision new servers or applications, billing and verification of the services customers buy, and the secure management of data within the clouds. These elements can help prevent vendor lock-in.
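None of these common terms exist yet, so the sketch below is purely illustrative: if two providers described their offerings using the same vocabulary for capabilities, service levels and billing units, a customer could compare them, and move between them, programmatically. Every field name here is invented for the example.

from dataclasses import dataclass

# Hypothetical, vendor-neutral description of a cloud offering.
# Field names are invented for illustration; they are not DMTF terms.
@dataclass
class ServiceDescription:
    provider: str
    capabilities: set[str]       # e.g. {"compute", "block-storage"}
    uptime_sla_percent: float    # promised availability
    provision_api: str           # URL of the provisioning endpoint
    price_per_vm_hour: float     # a billing unit both providers agree on

def can_migrate(current: ServiceDescription, candidate: ServiceDescription) -> bool:
    """A workload is portable only if the candidate cloud offers at least
    the same capabilities and an equal-or-better availability promise."""
    return (current.capabilities <= candidate.capabilities
            and candidate.uptime_sla_percent >= current.uptime_sla_percent)

cloud_a = ServiceDescription("ProviderA", {"compute", "block-storage"},
                             99.9, "https://a.example/provision", 0.10)
cloud_b = ServiceDescription("ProviderB", {"compute", "block-storage", "queue"},
                             99.95, "https://b.example/provision", 0.12)

print(can_migrate(cloud_a, cloud_b))  # True: common terms make the comparison mechanical

Without an agreed vocabulary, that comparison has to be done by hand for every provider, which is exactly the lock-in the incubator says it wants to avoid.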

"We're focused on three aspects of cloud management and interoperability," Bumpus says. "First is to build on the work we've done on the Open Virtualization Foundation [and the associated standard, the Open Virtualization Format and what extensions we need to have to support the cloud environment. Second is resource management and interoperability between clouds that provide infrastructure as a service. Third is security as people move from their own environments into cloud environments federated trust, inter-cloud communications, protocols and APIs for secure interoperability."

In addition to VMware, the leadership board of the working group includes AMD, Cisco, Citrix, EMC, HP, IBM, Intel, Microsoft, Novell, Red Hat, Savvis and Sun Microsystems. The group doesn't plan to create new standards from scratch; instead, it will build on the Open Virtualization Format and other management specifications the DMTF has already produced, adapting or extending them to apply to cloud platforms, Bumpus says.

Where a relevant open-standard specification already exists, the group will use it, Bumpus says. Where an existing specification doesn't go far enough to cover a cloud-computing function, the incubator will extend it; where no standard exists at all, it will create one.

"A lot of what we're dealing with in this space is management," Bumpus says. "Provisioning, resource management, quality of service. So we're looking at specs that have been developed in a fragmented way and getting together to make sense of them as a group."

"Customers are looking for big things like security, [from cloud computing] but they're also looking to avoid lock-in to one vendor," Bumpus said. "If we can create a set of common definitions, then [vendors] can compete on function, performance and reliability, the way we think they should, not locking customers in to one set of interfaces."

Though it was founded primarily to develop systems-management tools that would help vendors sell more hardware, DMTF isn't a bad choice of organization to create management standards for cloud computing vendors, Forrester's Staten says.

"Given how immature cloud computing is right now-we don't have standard pricing models, standard offerings-having a standards organization like DMTF that's perceived as being more vendor-driven isn't a bad thing, to move forward even without having the ultimate answer to something," Staten says.

"If you want everyone in the world to speak the same language, you take it to someone like the IETF," Staten says. "They'd say to be a cloud it has to support these 35 APIs, and it would work well, but they'd give us something by 2015. DMTF isn't likely to do that."

In fact, DMTF did such a good job defining standards and specifications for virtual servers, applications and storage that the development of OVF gave the organization new life, more than a decade after it first produced workable systems-management protocols, Wolf says.

"Having service providers [such as Amazon, IBM and EDS] as part of the mix is new for DMTF, but it will reach out to those folks for cloud standards," Wolf says. "It's not going to be easy to create standards for trusted federation and interaction between clouds. But wholesale cloud adoption is at least two or three years away, so the year or two before we'll see anything from DMTF isn't too much of a problem."

Though connecting one cloud to another is currently a theoretical problem, Wolf says many of his clients are already trying to figure out how to create trusted links among regional data centers, each of which is set up as a private cloud platform.

"A lot of organizations are looking at it now and making plans regarding architecture, so it's not a big problem that the standards aren't finished. A lot of clients that are interested still have to rework business processes to support that [cloud computing] model, and that may take longer," Wolf says.

Defining how to handle and document the security of data at rest and data in transit across the network will also be important, Wolf says, both to calm security concerns and to meet regulatory compliance requirements.

"People recognize right now that there's a lot of insanity about cloud computing. It needs a lot of work," Wolf says. "You need standard metadata formats and interfaces to move data among different providers, and DMTF does have the connections to pull that off."
