
Bugs!


Software ain't what it used to be.

According to last year's President's Information Technology Advisory Committee report, today's software often fails "in unpredicted ways". It's buggy and unreliable, and it just doesn't do what it's supposed to.

Not that anyone needs to be told that by a federal committee. Everyone already knows about the poor quality of software. It's even hit the daily newspapers.

In April, the Mississippi State Tax Commission filed a lawsuit against American Management Systems in Virginia because the automated tax revenue system the consultancy built could not operate without "multiple incidents of both a critical and serious nature", according to the lawsuit. And The Topeka Capital-Journal, a Kansas newspaper, gleefully reported the Mississippi suit, noting that Kansas' screwed-up tax system relied on the same buggy software.

So everybody knows all about this. But CIOs, either because they're afraid of the vendors they're dependent on or they fear that any admission of failure will reflect negatively on them, aren't talking. At least not on the record.

"I'll talk about it anonymously, but I don't want to push the company name out there," says one IT executive who's frustrated with a vendor and its buggy billing package. "We've had some success dealing with them as a beta tester and wouldn't want to jeopardise that."

As a result of this silence, "there's an amazing lack of detailed examples of serious software failure in the field that can be used to publicise what good and bad [software development] practices are", says Cem Kaner, a Silicon Valley attorney and software quality expert. "CIOs sit on that data." (See "Beta Testing" for tips.)

While vendors shoulder most of the blame for today's shoddy apps, some industry watchers argue that a slice of the blame pie belongs to the all-too-quiet CIOs. Scott Grèaux, technology manager at Windough.com, a Florida-based online sweepstakes Web site, says many of his colleagues "hear the complaints from their users but don't invest the time to partner with software providers to create a more robust tool. This leaves their end users with information gaps and leaves the software manufacturer with a subpar product."

But a handful of CIOs are bucking the trend. They're making sure they get better software by hopping in on vendors' test and development cycles and withholding payment until the vendors fix their apps. For these CIOs, upsetting the status quo and sharing their time and know-how with vendors are worth the cost if it means they won't have to put up with substandard programs, infinite tweaks and endless service calls.

It Was the Best of Times . . .

"The software development process used to be a long and complicated thing," recalls Randy Covill, senior analyst of e-commerce strategies and applications at AMR Research, an IT industry analysis company in Boston. As Roger Sherman, former director of testing for Microsoft, explained in his 1995 report "Shipping the Right Products at the Right Time", software quality is defined by three dimensions: reliability, feature set and schedule [the process]. If market conditions change and cause developers to place disproportionate emphasis on any one leg of this stool, the other two can suffer. If being first to market (the opposite of schedule) or feature set takes precedence, warns Sherman, reliability may suffer. For many years, when the market was less volatile, developers produced reliable software because they stuck to safe schedules.

According to Covill, the accepted best practices for software development used to call for three to six months of study before a 12-month product cycle even began. After that, developers conducted up to 16 types of tests for functionality, stability, ease of integration and scalability. Then, after the developers deemed the software stable enough to be used in controlled, real-world settings, they shipped it out to beta testers, who reported back to the developers.

This beta process benefited both parties. The vendors got a better product because CIOs caught bugs that got past the developers. According to Software Productivity Research, a Massachusetts-based software consultancy, beta testing catches some of the highest rates of errors of omission, commission, ambiguity, speed and capacity in software, compared with 16 other forms of in-house testing. And CIOs benefited by getting the technology first, learning its nuances and gaining leverage with the vendors, which allowed them to ask for and get specific features. Plus, most CIOs received the application either at a discount or with its fee waived.

. . . It Was the Worst of Times

But the old, ideal beta process, says Neil Goldman, director of Internet computing strategies at the Boston technology research organisation The Yankee Group, never went according to plan. Some CIOs signed up for betas simply to receive the software in advance for a lower price and never really tested it or informed the vendor about the software's performance. And many who had every intention of sending feedback became distracted this past decade with priorities like Y2K preparations. Beta-testing responses dropped off so much, says Goldman, that many software shops began holding out on the lower price tag until they received genuine feedback pointing out bugs.

Some CIOs simply passed on testing to their subordinates. Ron Furmaniuk, manager of Notes beta administration at Lotus Development (US), says that of the 4000-plus testers he manages, most are managers of IS groups, project team leaders, system engineers, analysts and consultants. He cannot find a single CIO or director of IT in his database.

And while CIOs became too busy for beta programs, the market changed and developers got busier.

"A few years back, vendors told customers that the old times of waiting 18 months for a new release were over," recalls Jean-Luc Alarcon, senior vice president of the research reports and interactive/media resources division of Meta Group (US). "With the Internet, they said, we'll deliver you functionality on a regular basis." But, says Alarcon, in their attempts to build apps with more features than their competitors, time to market suffered. So, in their drive to work at Internet speed, the vendors began to cut corners, and 12-month cycles and beta tests went out the window. (Chad Robinson, a senior research analyst at Robert Frances Group, adds that some larger vendors also stopped running beta tests because they didn't want the details of their releases to leak to their competitors.) As Microsoft alum Sherman predicted, being first to market and feature set took precedence, and reliability took a back seat.

A few developers still beta test, but they're sending out prototypes that they've spent only a few months developing and that they know have bugs, says AMR Research's Covill. If the testers' feedback is bad, he says, the vendors throw away the prototypes and start over. If the feedback is good (or sparse), the app goes to market. "[CIOs] buying new releases are more likely than ever to get versions not far beyond the prototyping stage," says Covill.

Hosted software provided by application service providers (ASPs) isn't much better, according to Goldman. ASP applications, which only live on the ASPs' servers, can be continually modified, so the ASPs don't have to worry about the robustness of their apps when they release them. There's never anything to recall. "To get 99.999 per cent uptime costs a lot of money," says Goldman, "and they don't spend a lot of time, money or energy [on testing]. They just want to roll it out."

According to Capers Jones, chief scientist at Artemis Management Systems, only 30 per cent of US software companies conducted beta tests in 1998, a number that some say has shrunk since then. In the meantime, unpredictable software failures are being caught not by preliminary testers, but by real-time users.

Why CIOs Aren't Beta Testing

Brian Bertlin, CIO of Washington Group International, doesn't beta test. He hasn't found an app that handles core IT functions so well that it would give his Idaho-based engineering and construction organisation a business advantage, and that, he says, would be the only kind of app that would be worth getting in advance and spending extra time testing. "We're not a dotcom; we don't need to be on the bleeding edge," he says.

First Union's Jeff Scott says he can't justify the manpower requirements that testing demands. As the director of advanced technology at the North Carolina-based bank, Scott wants to find technology that will improve the bank's internal communications and external computerised banking offerings. Once upon a time, he organised a group that tested an advance release of Sun Microsystems' object request broker. And although his team learned the ins and outs of object technology, the bank ultimately decided it wasn't worth implementing. So Scott's team had wasted its time. "Our strategy for now," he says, "is to be an early adopter of technology we let others prove."

While Bertlin and Scott may be in the majority, a minority of IT executives are fighting the good fight against bad software. CIO talked with three who know that working with and pressuring vendors is the only way to find quality software.

Home Depot: La Vida Beta

Ron Griffin, CIO of Atlanta-based Home Depot, not only reviews applications that he buys before putting them in production, he also often beta tests them. The home improvement giant's IT infrastructure connects more than 700 stores, and Griffin says he's always looking for new ways to streamline it. Thirty per cent of that infrastructure comes from vendors.

Griffin believes he is able to use so much foreign, potentially fluky code in his network precisely because he beta tests. He says he always sends new code through his team of testers, but if he does so during beta stages, vendors are more likely to incorporate functionality changes as well as code fixes. "Last year we processed almost a billion customers through our point of sale system," he says. "That probably took 50 billion internal transactions to process. Most applications aren't designed for the scope and scale of our business. So we're constantly rearchitecturing and refining."

Griffin says he understands why many CIOs use intuition rather than testing to decide if new software will work for their businesses ("Testing is slower," he says), but he thinks that not testing ultimately wastes more time and money. "Organisations that don't test find they have a harder time getting [the software] into production and their businesses," says Griffin. Once part of the IT staff has tested a vendor's product, Griffin says they can immediately help train others in the organisation. "They get the feeling of ownership through testing," he says. "And it leads to a much higher level of acceptance in the user community."

Compaq: On Beyond Beta

Joe Batista, the director and chief creatologist of Internet enterprise at Houston-based Compaq Computer, also believes that he avoids bad code by cosying up to vendors. But Batista goes a step further than Griffin and completely immerses himself in a vendor's production processes.

Batista has set up workshops in which he and up to five of Compaq's engineering, testing, design and business development veterans meet with young software companies' management teams. They listen to pitches, and they "prod them about their models, fundamental technologies and processes", says Batista. "It's both collaborative and disruptive."

In February 1999, Batista and his crew conducted a session with Conjoin, a Massachusetts-based company that was developing intranet tools. When Batista heard Conjoin CEO Nick d'Arbeloff tout his company's Field First intranet publishing tool, Batista realised that it could help him with a project. He had been struggling with an intranet he built for Compaq's East Coast sales staff. Managing and updating the content on the home page (timely brochures, spec sheets and customer testimonials) was becoming tedious. Field First, which allowed anyone to publish on an intranet with regular word processing applications, might relieve the pressure. But Batista couldn't just buy Field First because Conjoin hadn't finished developing it. So Batista offered to help d'Arbeloff finish building the prototype.

Batista says this kind of cooperation works because it allows Compaq and its vendors to take chances. "We didn't have to follow the traditional lines of corporate IT where someone had to be an established vendor with three references and millions of dollars," he says. If the quality of the code or experience of the vendor is questionable, Batista knows he has the leeway to hop in on its development cycle and fill in the gaps.

Esävio: Money Talks

Sandy Philips, CIO at Esävio, an e-business consultancy in Pennsylvania, doesn't have the flexibility Griffin and Batista enjoy at Home Depot and Compaq to get involved in vendors' product cycles (or the clout to convince those vendors to relinquish control of their development efforts). With only four full-time IT workers, he's too busy keeping his $US35 million, 300-plus-person company running smoothly. But he does control a seven-figure budget and the juice that goes with it.

Right now, Philips is struggling to integrate a practice management system, bought last year from a vendor he'd prefer not to name, into Esävio's back-end accounting system. Not only is the product (purchased before Philips became CIO) flawed, but the vendor has been slow to fix it.

"One of the invoices the management system generated was not picking up overtime hours," he says. "We found the bad code and had to massage the data by hand to build invoices. If we hadn't, it could have cost thousands upon thousands of dollars a month in mis-billing."

After discovering the problem, Philips rushed to tell the vendor, but it didn't respond. So he took the matter into his own hands. "I held all the bills they sent and said: 'I won't pay until they fix the product and I get real performance'," he says. "Suddenly we started getting a whole lot more voice mails, e-mails and interest." The vendor assigned a full-time account manager to work with Philips.

Because of the problems Philips encountered with this vendor, he told another that he wouldn't pay until he saw performance. "They came to me with set amounts they wanted each month," he says. "I went back to their project plan, pulled out deliverables and said we'll make milestone payments based on those deliverables, provided they're good." Now Philips can refuse to pay if this vendor has a code problem that needs to be fixed.

Philips has also found vendors to be compliant when he entices them with free advertising at conferences and user group meetings. "I have the power to give them free advertising, whether it's positive or negative." That, he believes, keeps them on their toes.

Power in Numbers

Philips isn't alone in using user groups as leverage with vendors. Mike Reilly, chief technology officer of JP Morgan, a New York City-based financial company, joined a user group to stay in close contact with vendors (see "Clout Helps"). Reilly is part of the Open Group, a UK-based user group and one of the few not controlled by vendors. It consists of more than 200 enterprise IT users and vendors, and encourages IT executives from the likes of Charles Schwab, Visa and the US Department of Defense to vent about software problems. In this manner, Allen Brown, president and CEO of the Open Group, has seen CIOs bring about change. "Often when buyers request solutions, vendors reply that changes are in the next release or that they're not hearing that from other customers," he says. "But if they hear it right from a group that includes Boeing, JP Morgan and Shell, those rebuttals are more difficult to make."

Other groups, such as the Research Board, a New York City-based think tank of approximately 100 Fortune 100 CIOs that focuses on business best practices and technology research, exclude vendors. The Research Board even bars CIOs who work for technology vendors. Its researchers visit and interview the vendors, invite major ones to speak and send the vendors the results of CIO surveys.

Representatives from standards and certification organisations, such as the Washington, DC-based Institute of Electrical and Electronics Engineers Computer Society and The Software Engineering Institute (SEI), a process certification body at Carnegie Mellon University, in Pittsburgh, say they would love for CIOs to join and help shape the software benchmarks they create. "CIOs are some of the most knowledgeable observers, and we'd love to tap their knowledge," says Mike Konrad, a senior member of the technical staff at SEI.

Regardless of whether they do so collectively or on their own, David Reid, director of IT at Fazoli's Management, a Kentucky-based Italian fast-food company, says it's imperative that CIOs start talking with their wallets if they don't have the time or staff to review and test. "We've all gone through the cycle of RFP, sales presentations, scripted demos, reference checking and lab testing," he says, "only to discover during implementation significant product shortcomings that were missed in all the previous steps." The answer, he says, is to "be as diligent as you can during the selection process" and remember who pays the bills. "Other than that," he says, "all we can do is wait for evolution and survival of the fittest to move the software industry forward."

Beta Testing

Four tips on how to do it right

1. Check references. AMR Research's senior analyst Randy Covill advises CIOs to ask other beta testers how long they've used the software, how many people used it, what systems it talked to and how it performed.

2. Measure risk. Jean-Luc Alarcon, a senior vice president at Meta Group, reminds CIOs that beta testing is not purely an IT matter; it's an enterprise decision. He recommends that other constituencies weigh in on assessing the risk of bringing in foreign software. Alarcon suggests companies first analyse the consequences of exposing people to software that produces errors and miscommunications. "Identify the cost of disrupting the business," he says. "Then value the risk. What will you lose if customers wait one week for a product they used to receive within 24 hours? Different companies would fare differently."

3. Don't go naked. CIOs can manage the risk by rolling things out in a staged manner with an earlier version or another product on hand to implement if the beta has too many bugs to function effectively.

4. Be the vector. When drafting an agreement with the vendor, CIOs can follow Home Depot CIO Ron Griffin's lead: establish one point of contact for each organisation through which all communication flows. Ensure all parties have executive-level commitment.

"If it's very important that you get something right," says Griffin, "then make sure you make time and do testing, and prove to yourself that what you put in place is going to scratch your itch."

Clout Helps

How one CTO created a standard

Mike Reilly, CTO at JP Morgan, a New York City-based financial company, knew he could attract new customers if he could promise them stock data over the Internet at a specific time every day. But Unicenter TNG, the enterprise management software he had purchased from Computer Associates (CA) to serve as the umbrella program connecting his home-grown apps with CA's, didn't let him do that.

So Reilly devised a new standard for management software applications that creates an open, two-way application programming interface (API). He convinced CA to build the proposed standard, called the Application Instrumentation and Control (AIC) standard, into Unicenter TNG. Together, the two companies tested the AIC standard and the resulting API. Reilly then asked CA to join an industry consortium called the Open Group, a UK-based standards body composed of users and vendors, to formalise the standard and make it official.
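The article doesn't describe the AIC specification itself, but the general shape of a two-way management API (the application exposing instrumentation data outward while accepting control commands inward) can be sketched roughly as follows. This is a minimal, hypothetical illustration: the interface and method names here are assumptions, not the actual Open Group standard or CA's implementation.

```java
// Hypothetical sketch of a two-way instrumentation-and-control interface.
// None of these names come from the actual AIC specification; they are
// illustrative assumptions only.
import java.util.HashMap;
import java.util.Map;

interface Instrumented {
    // Outbound half: the application reports its state to a management console.
    Map<String, String> reportMetrics();
}

interface Controllable {
    // Inbound half: the management console issues commands to the application.
    void start();
    void stop();
}

// A toy in-house application implementing both halves of the two-way API.
class StockFeedJob implements Instrumented, Controllable {
    private boolean running;

    public void start() { running = true; }
    public void stop()  { running = false; }

    public Map<String, String> reportMetrics() {
        Map<String, String> metrics = new HashMap<>();
        metrics.put("status", running ? "running" : "stopped");
        return metrics;
    }
}

public class TwoWayApiSketch {
    public static void main(String[] args) {
        StockFeedJob job = new StockFeedJob();
        job.start();                             // console -> application (control)
        System.out.println(job.reportMetrics()); // application -> console (instrumentation)
    }
}
```

The point, in Reilly's case, was that the management umbrella and his home-grown apps could talk in both directions through one agreed interface rather than through one-off integrations.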

Reilly and CA submitted the AIC to the Open Group's Fast Track Standardisation Process. The Open Group then called for the two parties to bring the AIC spec to the consortium's other systems management vendors for review and refinement. Then Reilly watched as CA worked with competitors such as EMC and Tivoli. Within weeks, the Open Group's members voted to accept the new AIC standard.

"The decision to push for a standard was pretty easy," says Reilly. "We could have built our own technology and horseshoed it into what we get from the market," but he knew that tweaking apps and forcing them to process functions they weren't built to do can result in crashes and downtime. "So we did what we did: got the industry to adopt a standard and had the industry affect the vendors."