Legacy Apps in the Cloud: Six Details Worth Sweating


A host of substantial problems with porting legacy apps to the cloud will keep most companies from diving in for now, according to analysts reporting on the cloud's weaknesses and the ISVs trying to fill in its gaps.

But just as important are the smaller issues, the ones that aren't obvious immediately but can stop the show just as effectively as the biggies, according to Bernard Golden, CEO of consulting firm HyperStratus and a CIO.com blogger. That goes double for legacy applications, which are often heavily customized and surrounded by cordons of stored procedures, report-generating scripts and security-auditing tools. Here's a look at the details worth sweating.

Visibility

Some applications require close monitoring, either by IT people on guard to make sure nothing goes pear-shaped unexpectedly, or by software that keeps track of who uses the application, what data they accessed and what they did with it, according to Chris Wolf, infrastructure analyst at The Burton Group.

This isn't an issue of basic security: limiting, either physically or through programmatic controls, the number of people who can use software or data. It's the ability to go much deeper: tracking which authorized users actually used the application, when they did, what data they changed or reports they generated, and who used those reports or data afterward, Wolf says.

That kind of control is overkill if you're talking about Google mail. But it's not only critical, it's required by law if you're talking about software for finance or customer management. Unfortunately, most of the network- and application-access protocols those tracking applications use either don't work across the Internet or have been turned off by cloud providers worried about customer privacy and security.

If you want to see and be able to report, reliably, on who has been using your data and applications, be sure your cloud provider can either build in a gateway for your security tracking, or provide a mechanism within its own environment to track and report what's going on in your part of the cloud, analysts say.
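If the provider can offer neither, one fallback is to emit audit events from the application layer itself. Here's a minimal sketch in Python; the names (audited, fetch_customer_record) are invented for illustration, and a real system would take the user identity from its authentication layer and write to an append-only store rather than stdout.

```python
# A minimal sketch of application-level audit logging, for the case where
# the cloud provider exposes no tracking gateway of its own. All names here
# are hypothetical, not from any particular product.
import functools
import getpass
import json
import time

def audited(action):
    """Record who called a data-access function, with what, and when."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            entry = {
                "user": getpass.getuser(),  # use the authenticated user in practice
                "action": action,
                "args": repr(args),
                "timestamp": time.time(),
            }
            result = func(*args, **kwargs)
            entry["status"] = "ok"
            # Ship this to an append-only store your auditors can query.
            print(json.dumps(entry))
            return result
        return wrapper
    return decorator

@audited("read_customer")
def fetch_customer_record(customer_id):
    return {"id": customer_id, "name": "..."}

fetch_customer_record(42)
```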

Even if your cloud provider does offer strong security assurances, how well those assurances will stand up to an audit, at least for now, may boil down to how well your auditor understands virtualization and the cloud, virtualization security experts say.

Domino Updates

Data isn't static. It has to be updated regularly and correctly. Most companies automate updates to records that may be housed in several databases, and most legacy applications work effectively with those scripts, which were mostly written specifically for them, according to Steve Yaskin, CTO and founder of Queplix, a software and consulting company that specializes in migrating legacy applications to cloud environments.

In the U.S. military, for example, a soldier's total health and performance record is only accessible by using a Social Security Number to identify relevant records stored in Army, Army Reserve, Veterans Administration and other databases.

Changes to one have to replicate to the others, which is that much more difficult if the data, or the applications that access it, have been moved to a cloud environment that may not assign the kind of static location identifiers for data stores that a legacy application expects.

Porting one application to the cloud can break those kinds of connections, requiring a whole series of modifications, rewrites and ports of middleware, purpose-built scripts and other relatively undocumented customizations that suddenly can't find the data they need to function.
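To make the fan-out problem concrete, here's a minimal sketch using in-memory SQLite databases as stand-ins for the Army, Reserve and VA stores; the table layout and the update_everywhere helper are invented for illustration.

```python
# A hedged sketch of the replication problem: one logical record (keyed by
# an ID such as an SSN) lives in several databases, and an update to one
# must fan out to the rest. Database names and fields are illustrative only.
import sqlite3

# Stand-ins for the Army, Reserve and VA stores from the article's example.
stores = {name: sqlite3.connect(":memory:") for name in ("army", "reserve", "va")}
for db in stores.values():
    db.execute("CREATE TABLE health (ssn TEXT PRIMARY KEY, status TEXT)")
    db.execute("INSERT INTO health VALUES ('123-45-6789', 'fit')")

def update_everywhere(ssn, status):
    """Propagate one change to every store. After a cloud move, a stale
    connection string could silently break any one of these writes."""
    for db in stores.values():
        db.execute("UPDATE health SET status = ? WHERE ssn = ?", (status, ssn))
        db.commit()

update_everywhere("123-45-6789", "on leave")
for name, db in stores.items():
    print(name, db.execute("SELECT status FROM health").fetchone()[0])
```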

Naming "Standards"

Over the years most companies have built up inventories of applications that are almost compatible and almost standard, despite differences in the applications themselves or in the data they generate.

An EMEA group might define "customer," "product" and "revenue" differently, for example, than a group in a different part of the world, and IT runs a little field-mapping or data conversion so it doesn't have to tell either half of the world it's doing things wrong.

Even if the only difference is the number of characters or the specific database fields involved in defining what a customer is, that difference can cause bigger problems when you move one of those applications to the cloud, according to David Linthicum, CTO of Saga Software and author of Cloud Computing and SOA Convergence in Your Enterprise: A Step-by-Step Guide.

In the cloud, no matter how well your main application runs on compute resources you can increase at will, those mapping or conversion scripts may no longer link as tightly to the data, or to the chain of reporting routines that just won't work without a little data tweak here and there.
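As a rough sketch of the glue involved, assume an EMEA system and a global one that name the same customer fields differently; the field names and mapping below are invented for illustration, and real versions are usually buried in ETL scripts.

```python
# A minimal sketch of the field-mapping glue the article describes.
# The schema is hypothetical, not from any real system.
EMEA_TO_GLOBAL = {
    "cust_name": "customer_name",  # EMEA's shorter field name
    "net_rev": "revenue",          # EMEA might book revenue net of VAT
}

def to_global(emea_record):
    """Rename EMEA fields to the global schema, passing unknowns through."""
    return {EMEA_TO_GLOBAL.get(field, field): value
            for field, value in emea_record.items()}

print(to_global({"cust_name": "ACME GmbH", "net_rev": 1250000}))
# {'customer_name': 'ACME GmbH', 'revenue': 1250000}
```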

Missing Masters

Many organizations avoid the data-naming problem, and problems with data consistency and currency (essentially version control on a massive scale), using Master Data Management (MDM): a defined set of qualifications and definitions of what constitutes the correct data for the company as a whole.

Geographic divisions or business units may continue to use more recent results, or sales and cost data stripped of components that don't involve them, but the company as a whole defines "revenue" according to a single set of numbers that are updated at a specific time.

If apps feeding into the "master" data set move into the cloud, or if the MDM applications and data themselves go cloudward, it becomes far more difficult to figure out which data are real and which are the imposters. Security and financial-reporting auditors tend to look askance at that level of uncertainty.
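In miniature, the MDM idea looks something like the sketch below: several sources report "revenue", and a single precedence rule decides which figure is the master. The source names and precedence order are assumptions for illustration, not anything from the article.

```python
# A toy model of a master-data rule: several systems report "revenue" and
# one company-wide precedence order picks the authoritative number.
from datetime import date

reports = [
    {"source": "emea_erp", "revenue": 4.1, "as_of": date(2010, 3, 31)},
    {"source": "apac_crm", "revenue": 4.3, "as_of": date(2010, 4, 2)},
    {"source": "finance_mdm", "revenue": 4.2, "as_of": date(2010, 3, 31)},
]

TRUSTED = ["finance_mdm", "emea_erp", "apac_crm"]  # company-wide precedence

def master_revenue(rows):
    """The master value comes from the most trusted source that reported."""
    return sorted(rows, key=lambda r: TRUSTED.index(r["source"]))[0]

print(master_revenue(reports))  # finance_mdm wins regardless of recency
```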

Sprawl

The problem with cloud environments, as with virtual server infrastructures, is the risk that you'll take advantage of all that potential space and just spawn off as many copies of a VM, application or database as you need, and then forget about them.

Sprawl in a cloud environment costs the user extra for wasted resources and increases the risk of security breaches in applications that are insufficiently supervised.

New tools from Appistry, VMware and Elastra, among others, are designed to rein in sprawl within both cloud and VM infrastructures, but legacy applications may have to be retooled to be managed directly by those tools, rather than managed by default when the VM on which they run is set to obey policies on security, capacity and lifecycle.
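With or without those tools, a first step any shop can take is an inventory sweep for unowned or expired machines. Here's a minimal sketch, assuming a hypothetical inventory list and tag names; in practice a cloud API or management tool would supply the data.

```python
# A hedged sketch of sprawl detection: scan an instance inventory for VMs
# with no owner tag or an expired lease. Record format is invented.
from datetime import date

inventory = [
    {"id": "vm-001", "owner": "payroll-team", "expires": date(2026, 1, 1)},
    {"id": "vm-002", "owner": None,           "expires": None},
    {"id": "vm-003", "owner": "qa",           "expires": date(2020, 6, 30)},
]

def find_sprawl(vms, today=None):
    """Yield IDs of VMs nobody claims, or whose lease has lapsed."""
    today = today or date.today()
    for vm in vms:
        if vm["owner"] is None or (vm["expires"] and vm["expires"] < today):
            yield vm["id"]

print(list(find_sprawl(inventory)))  # candidates to reclaim: vm-002, vm-003
```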

Scale

Porting Siebel or Salesforce.com applications to the cloud is easier than porting highly customized Oracle, SAP or other in-house applications, simply by nature of the environments in which they were developed, according to Nathan Brookwood, principal analyst at Insight64.

Many of those applications, especially those whose logic includes heavy-duty processing rather than just monitoring transactions, were designed for large, vertically scaled servers, not for environments like those in most clouds, which rely on a larger number of lower-powered servers, he says.

Legacy applications that scale vertically and have difficulty spreading out the other way may present unexpected performance problems, even if all their other data connections and protocol support make them look like good candidates for the cloud, he says.
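The shape of the mismatch is easy to see in a toy example: the same computation written as one big sequential pass versus split into independent chunks that a fleet of small servers could share. The workload below is invented purely to show the shape of the change.

```python
# A toy illustration of the scaling mismatch: one big sequential pass (the
# vertical-scaling habit) versus partitioned, independent chunks (the
# horizontal shape most clouds reward).
from concurrent.futures import ProcessPoolExecutor

def process(chunk):
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    records = list(range(1_000_000))

    # Vertical habit: one process on one big machine.
    total_vertical = process(records)

    # Horizontal shape: partition, farm out, recombine. This only works
    # when chunks are genuinely independent, which legacy logic (shared
    # state, ordered transactions) often isn't.
    chunks = [records[i::4] for i in range(4)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        total_horizontal = sum(pool.map(process, chunks))

    assert total_vertical == total_horizontal
```

If the work can't be partitioned like this without untangling shared state, that retooling cost can outweigh the benefits of the move.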
