NASA wants to run space missions, not data centers

CTO says the primary goal of NASA's open source cloud effort is to free the agency from IT tasks

The National Aeronautics and Space Administration (NASA) is backing open source cloud computing with a long-term goal in mind: to get out of the data center business.

NASA CTO Chris Kemp said he believes that compute resources are fundamentally a utility, no different from electric power. And "we don't own power plants right now - we don't own other services that are provided as utilities," he said.

"I don't see why NASA needs to operate any infrastructure," said Kemp. "We can build space probes, we can build deep space networks, we can stay out on the frontiers, where the American public wants us to be and not spend over $1 billion a year on it infrastructure."

Kemp talked about NASA's cloud computing efforts during a talk and an interview at Gartner Inc.'s Symposium/ITxpo here this week.

It was author Nicholas Carr who popularized the notion that compute resources would become a commodity that in time would be accessed as utilities now are.

But so far, the computer industry is still far from operating like a utility. Many cloud platforms remain proprietary, and moving from one provider's cloud-based apps to another's is difficult.
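To make that portability problem concrete, here is a minimal, hypothetical sketch of the issue: each cloud exposes its own API for the same basic operation, so applications end up coupled to one vendor unless a common interface sits in between. All class and method names below are illustrative and do not correspond to any real provider SDK.

```python
# Hypothetical sketch: two clouds expose the same operation -- "start a server" --
# through incompatible APIs, so a common interface is needed for portability.
# All names here are illustrative, not part of any real provider SDK.
from abc import ABC, abstractmethod


class CloudDriver(ABC):
    """Minimal common interface an application could code against."""

    @abstractmethod
    def launch_instance(self, image: str, size: str) -> str:
        """Start a server and return its identifier."""


class VendorADriver(CloudDriver):
    # Imagine vendor A's native call is run_instances(ami, instance_type).
    def launch_instance(self, image: str, size: str) -> str:
        return f"vendor-a-instance:{image}:{size}"


class VendorBDriver(CloudDriver):
    # Imagine vendor B's native call is servers.create(name, flavor, image_ref).
    def launch_instance(self, image: str, size: str) -> str:
        return f"vendor-b-server:{image}:{size}"


def deploy(driver: CloudDriver) -> str:
    # Application code depends only on the common interface, so swapping
    # providers means swapping the driver, not rewriting the application.
    return driver.launch_instance(image="ubuntu-10.04", size="small")


if __name__ == "__main__":
    print(deploy(VendorADriver()))
    print(deploy(VendorBDriver()))
```

An open, shared API layer of this kind is essentially what OpenStack's backers argue the industry lacks today.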

That's where Kemp and NASA have stepped in with a potential solution.

NASA developed its own cloud computing platform, called Nebula, and has released it as open source under an Apache 2.0 license. Cloud computing and hosting provider Rackspace, which had developed its own internal cloud management platform, contacted NASA about using some of Nebula's code. That effort led to OpenStack, which emerged from Rackspace, NASA and others this summer as an open source cloud platform.

"Our mandate is to commercialize technology," said Kemp, noting that code from NASA's Nebula cloud software management stack is now part of the OpenStack technology. "That could be one of the most important pieces of technology that NASA has commercialized in a long time," he said.

The benefits of open source to NASA's cloud efforts are clear, said Kemp. It expands the number of developers working on OpenStack code and NASA can help influence its development and standards.

"This furthers our objective of having off-the-shelf products that meet our requirements -- less custom development [and] less proprietary systems," he added.

NASA's long-range plan is to increase reliance on cloud-based services, transitioning from internal systems over a 10- to 20-year period. Kemp believes it is possible that the agency may eventually get much, if not all, of its compute resources delivered via external cloud resources.

Kemp said that at this point the agency doesn't officially use public cloud services -- though some researchers may be using public clouds on their own -- but such services are being investigated.

Even with NASA's support, it remains to be seen how much OpenStack can influence the future of cloud computing.

Daryl Plummer, a Gartner analyst, said today there are no standards in the cloud and "there is extremely limited portability."

Plummer said OpenStack addresses only the infrastructure layer of the cloud, not the layers above it. "It's not a bad goal, you just have to be realistic about how far you can go with it," he said.

John Engates, the CTO of Rackspace, is hoping the open source effort short-circuits any proprietary cloud push, but he also acknowledges that there's much work ahead.

OpenStack includes the code NASA was using with the KVM hypervisor; Rackspace's platform brought Xen support. VMware support is not yet available, and while a driver for its hypervisor could be built, VMware's cooperation would also be needed to support its management tools.
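As a rough illustration of what multi-hypervisor support involves, the sketch below shows how a compute service might pick a hypervisor back end from configuration at startup. This is a simplified, hypothetical example, not OpenStack's actual driver-loading code; the names and the config key are invented for illustration.

```python
# Simplified, hypothetical illustration of selecting a hypervisor back end
# from configuration -- not OpenStack's actual driver-loading code.
from dataclasses import dataclass


@dataclass
class Server:
    name: str
    hypervisor: str


class KvmDriver:
    """Stand-in for a KVM-backed driver (NASA's Nebula deployments used KVM)."""

    def spawn(self, name: str) -> Server:
        return Server(name=name, hypervisor="kvm")


class XenDriver:
    """Stand-in for a Xen-backed driver (Rackspace's platform used Xen)."""

    def spawn(self, name: str) -> Server:
        return Server(name=name, hypervisor="xen")


# A VMware driver would slot in the same way once written, but supporting
# VMware's management tools as well would need VMware's involvement.
DRIVERS = {
    "kvm": KvmDriver,
    "xen": XenDriver,
}


def load_driver(config: dict):
    """Pick the hypervisor driver named in the service configuration."""
    backend = config.get("hypervisor", "kvm")
    try:
        return DRIVERS[backend]()
    except KeyError:
        raise ValueError(f"unsupported hypervisor back end: {backend}")


if __name__ == "__main__":
    driver = load_driver({"hypervisor": "xen"})
    print(driver.spawn("test-node"))
```

The design point is that each hypervisor sits behind a common driver interface, so adding a new back end means writing one driver rather than changing the rest of the compute service.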

Engates says he is buoyed by developer and commercial interest in OpenStack, which is also supported by Dell and Citrix, though he acknowledged that its future is yet to be determined. "It's a community, not a Rackspace thing," Engates added.

Vendors might argue that ensuring portability through open source and standards comes with a trade-off: it could stifle innovation and the benefits their respective services deliver, even if the alternative means customers dealing with a variety of different APIs and data structures.

But Joseph Flynn, CIO of MIT's Lincoln Laboratory, said that "portability is absolutely critical - it's not just where it sits and how it gets there."

While portability may hinder some innovation, from a user perspective, "regulatory risk is regulatory risk," said Flynn.

Patrick Thibodeau covers SaaS and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov, or subscribe to Patrick's RSS feed. His e-mail address is pthibodeau@computerworld.com.

Read more about cloud computing in Computerworld's Cloud Computing Topic Center.
