Choosing Your Priorities

Planning government ICT strategy can be downright impossible sometimes. Not only do you have to deal with the usual hyperbole, fluff and empty promises of the vendors, but there's the added complexity of ever-changing ministerial priorities, budget competition from other areas of departmental portfolios and the well-entrenched political intrigues of the government machine.

With so many forces buffeting ICT strategists, it is understandable why many government projects run far over budget and time, or end up consuming inordinate amounts of resources. Sure, overriding changes in the past five years have shifted government ICT spending in many organizations away from the big-bang strategies of the late 1990s. Yet even small investments can generate change that spreads like ripples across the organization - and each of these ripples may be enough to sink boats that have for years been rocking gently in the quiet harbours of mission criticality.

For this reason, we have identified several technological and managerial megatrends that encapsulate the major forces driving today's IT strategizing. Each offers the potential for considerable cost savings, improvements in operational efficiency or better alignment of ICT strategy with the shifting sands of government policy.

All are relevant to today's government information executives, who continue to search for new ways to deliver more with less. Meeting this goal is a fundamental requirement for progress, but it does not have to be painful. Plan carefully, and the right technological approach can get you where you need to be.

1 Consolidation: Big is In (Again)

Although it would be much easier to build information systems in greenfield environments, government agencies virtually never have that luxury. Legacy systems of all types abound, with mainframes and even early open systems environments often more than a decade old.

Such platforms can usually be updated and improved, but their functional importance has generally forced an alternative approach: leave the old systems running as they are, and build bridges to new systems using integration software. However, the maturation of two technological concepts - consolidation and virtualization - has finally changed these dynamics by providing a different way to look at the problem of combining old and new.

Consolidation, at both the server and storage level, has come about as systems architects recognize the consequences of years spent arguing that monolithic servers should be avoided in favour of large numbers of commodity (usually Intel-based) systems. That change may have lowered per-unit costs, but those fleets of commodity systems typically run far below the capacity of their CPUs.

In some cases, the surplus of computing capacity is intentional, the result of planning for future growth. In many other cases, however, it is the incidental result of a one-application-one-server mentality that has become unnecessary as overall computing capacity continues to increase.

Current architectural thinking focuses on combining the workloads of numerous servers onto a smaller number of systems. Here, virtualization comes into play. Once the exclusive domain of high-end enterprise systems, virtualization on commodity servers now allows many individual systems to be consolidated into functionally identical virtual machines running on a single server.
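
To see why the arithmetic tends to favour consolidation, consider a rough capacity-planning sketch. It is only an illustration: the utilization figures and the headroom target are hypothetical, not drawn from any agency, and real consolidation planning would also weigh memory, I/O and availability requirements.

```python
# Illustrative sketch only: estimate how many virtualization hosts could
# replace a fleet of under-utilized commodity servers. Utilization figures
# and the headroom target below are hypothetical assumptions.

def hosts_needed(utilizations, host_capacity=1.0, headroom=0.25):
    """First-fit-decreasing packing of per-server CPU load (as a fraction
    of one host's capacity) onto consolidated hosts, keeping spare headroom."""
    usable = host_capacity - headroom
    hosts = []  # running load per consolidated host
    for load in sorted(utilizations, reverse=True):
        for i, used in enumerate(hosts):
            if used + load <= usable:
                hosts[i] += load
                break
        else:
            hosts.append(load)  # no existing host has room; add another
    return len(hosts)

# 20 commodity servers idling at 8-15 percent CPU utilization
fleet = [0.10, 0.15, 0.12, 0.08, 0.11] * 4
print(hosts_needed(fleet))  # a handful of hosts (4 with these figures) instead of 20
```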

This approach both reduces capital acquisition costs and lowers ongoing management expenses. That has made consolidation a key priority for WorkCover Queensland, which as part of its three-year Intelligent Computing initiative is turning to consolidation as a way of simplifying its server operating environment.

In WorkCover Queensland's three-phase consolidation project, discrete storage area networks (SANs) at the organization's head office and backup data centres have been joined into a single SAN fabric. Core business systems, currently spread across 28 HP-UX-based servers, will over the next few years be migrated into 15 virtual partitions managed using the built-in virtualization of two HP Superdome servers.

Finally, the organization is in the process of testing an open systems consolidation in which four Windows Server-based and six Linux-based servers will be moved onto a pair of four-way HP servers using VMware ESX Server. Ultimately, it hopes to consolidate 17 Windows and six Linux servers onto six four-way HP servers.
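
To make the ratios in those figures explicit, a quick back-of-the-envelope calculation - using nothing beyond the numbers quoted above - runs as follows:

```python
# Simple arithmetic on the consolidation figures quoted above, just to make
# the ratios explicit; it assumes nothing beyond the numbers in the article.

hpux_before, hpux_partitions, superdomes = 28, 15, 2
open_before, open_hosts = 17 + 6, 6  # Windows + Linux servers

print(f"HP-UX: {hpux_before} servers -> {hpux_partitions} partitions "
      f"on {superdomes} hosts ({hpux_before / superdomes:.0f} servers per host)")
print(f"Open systems: {open_before} servers -> {open_hosts} hosts "
      f"(~{open_before / open_hosts:.1f}:1 consolidation)")
```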

With network servers currently spread across the state, consolidating them into fewer numbers will support new models for application delivery and eliminate costly on-site visits to far-flung offices, says Lynn Kincade, general manager for IT with WorkCover Queensland. "It used to be that every time we needed more power, we threw in more servers," she says. "This is not smart. The Intelligent Computing program is about being able to use our computing power as best we can. It's time to bring it all together."

Consolidation provides value both in reduced capital costs and in lower ongoing management expenses. It also offers a long-awaited answer to legacy woes, since virtualization can be used to run old operating environments on top of current platforms. Because virtualization separates the application from the software beneath it, increasingly portable application platforms could eventually provide enterprise-grade virtualization of applications written for a range of old CISC and RISC systems - reducing costs now while paving the way for the future.

Virtualization software from EMC subsidiary VMware continues to dominate the market, but increasing recognition of the benefits of virtualization has made it a hot-ticket item that is being built into contemporary operating systems at a fundamental level. Highly granular virtualization has also become de rigueur for storage providers, who long ago embraced the need to virtualize their storage in order to make it available to many different types of servers.

Combining virtual servers with virtual storage is a fundamental step in the shift to dissociate applications from underlying hardware - which will, in turn, lead towards next-generation visions such as grids. Government organizations do not have to wait that long to see benefits, however: with the technology undergoing a flurry of R&D investment, consolidation and virtualization can quickly and painlessly reduce your ongoing costs.

2 Voice Over IP: The New Dial Tone

Demand for Internet Protocol (IP) telephony has been growing for years, and shows no sign of stopping as more and more organizations give in to the inevitable and replace their ageing PABXs.

If your organization is nearing the end of life for its existing equipment and it is not investigating IP telephony, odds are that it will be soon. Analyst firm IDC recently noted that the enterprise IP telephony market grew 98 percent last year, with sales of IP phones growing 176 percent to 200,000 units shipped.

Much of that growth can be attributed to the government sector, where a number of large Voice-over-IP (VoIP) deployments have seen thousands of handsets deployed at once. The Commonwealth Department of Veterans' Affairs (DVA), for one, recently worked with IBM Global Services to complete a major VoIP upgrade that included more than 3200 Avaya 4620 IP phones across 58 department sites. The new environment has successfully trimmed significant costs from the handling of more than 4.5 million calls received annually by DVA.

Victoria Police is considering VoIP for a planned upgrade of up to 250 phone systems, covering several thousand extensions across the state. Indeed, VoIP has infiltrated every level of government from large departments down to the smallest: even the minuscule Australian Antarctic Division has been testing the technology to service Australia's southernmost government offices.

Acceptance of VoIP is strong enough that many state bodies are starting to play the volume game by planning large-scale rollovers to the new technology. InTACT, the ACT government's shared services communications agency, is in the throes of a $27 million, 15,000-extension whole-of-government VoIP rollout. Under the $14 million Victorian Office Telephony Systems (VOTS) contract, NEC has rolled out 24,000 VoIP-capable network points at 43 government sites across that state in an effort expected to save $1 million per year.

With the benefits of high-volume VoIP now widely recognized, similar efforts are on the boil across the country and may provide an easy migration path for government agencies that have not yet made the jump on their own.

There are caveats, however. Security concerns, whether real or perceived, remain a contentious issue in the migration to VoIP. Throughout this year, the Department of Communications, Information Technology and the Arts has been soliciting opinions in a review to better understand VoIP's potential security risks.

The results of the study may eventually lead to encryption or other mandates that must be considered by any agency considering the jump. Any such potential policies cannot, however, obscure the fact that government agencies of all sizes and levels - all of which are heavy users of telephony - are likely to find it easy to build a significant business case around deployment of IP telephony. If it is not on your department's dance card, you should consider learning some new steps.

3 Governance: Following (and Figuring Out) the Rules

Government at all levels has often been notoriously reluctant to get introspective, and even less eager to actively change itself. While many organizations have already taken a page from the private sector and re-engineered their processes to become more client-centric, the string of high-profile financial disasters in the private sector has now driven a new trend towards self-assessment among government organizations.

That is a big change from the past, when performance reviews were generally driven by ministerial mandate or by unwanted attention from Commonwealth or state auditing bodies. Such reviews continue, but there are signs that government organizations now recognize that pushing concerted governance efforts from within, rather than being pushed from outside, will ultimately be a more useful and productive approach - and produce outcomes that are more acceptable to all stakeholders.

One of the most significant opportunities for introducing governance structures has been the formation of shared services bodies, most notably in Western Australia, NSW and Queensland. In WA, for example, the newly formed Office of Shared Services has structured each of its Shared Service Centres with its own Governance Board whose members are responsible for implementing a formal Governance Charter according to best practice guidelines published by the state government. Although they share local responsibility for governance within their units, these boards have their own reporting hierarchy, with each board responsible to the overarching Shared Services Steering Committee.

Mirroring the trend towards greater involvement of chief executives within the governance processes of private sector companies, government directors-general and CEOs are now expected to be involved in governance planning at every level. Such increased activity is expected to push global expenditure on compliance-related projects to $US80 billion over the next five years, according to AMR Research.

Before it is possible to address governance, however, it is critical to figure out exactly what it entails. This is a particularly pointed issue in local governments, where there is little top-down guidance compared with that available to state and Commonwealth regulatory agencies.

As formal governance efforts continue to trickle down to every level of government, there are likely to be many widely divergent ideas about just how governance can be achieved. This is not necessarily a bad thing, as long as a mechanism for self-evaluation and change is built in as part of the governance framework. State and local government associations may also have a role to play by providing best practice governance templates and encouraging sharing between member organizations.

Because it has been such a nebulous concept, governance has been hard to model in software. That has not stopped numerous software providers, however, from trying their hand at turning governance into a feature of their related products. Their generally US roots mean that Sarbanes-Oxley (SOX) compliance is a common target; and, while Australian government agencies are not affected by SOX, it offers valuable guidance for governance efforts.

Each provider has its own approach: testing tools giant Mercury, for example, has built its IT Governance Center around its own competency in business process measurement to reflect SOX and other regulations, while Compuware's IT Governance tool builds on the company's development and workflow capabilities. Computer Associates' recent acquisition of Niku brought it a governance tool that has been linked with related applications such as CA's Remedy help desk. Other vendors see governance as an offshoot of project management or portfolio management.

In the long term, such solutions will likely homogenize as feature sets become consistent. Governance remains a growing priority within government organizations facing the need to build continuous improvement - still a foreign concept for many government bodies - into their everyday procedures. Governance also provides strategic value by offering a consistent way to evaluate and prioritize competing demands. For now, however, forget about the tools: simply assessing the current situation and planning future governance structures will be more than enough to keep most government information strategists busy well into the future.

4 BI: Government Intelligence About Government

Having recognized that clear governance processes are essential, the logical next question is how to monitor those processes and ensure that the focus on governance is actually improving the way the organization is run.

This issue has become particularly relevant after years of whole-of-government outsourcing, in which many departments effectively abdicated responsibility for process monitoring to the outsourcers and demanded only compliance with fixed service level agreements (SLAs). With outsourcing now taking on a far more limited scope and individual business processes increasingly handled by different firms, many departments are finding it essential to ensure they have their own metrics for tracking performance and improving business processes.

To this end, one of the major priorities for information executives is establishing operational management systems that not only sniff out trends in operational data and customer interaction, but can also be tied back to business goals and governance structures. For example, there is little value in touting a reporting structure for business unit performance if that reporting information cannot be quickly used to kick off processes to improve that performance.

"The great [majority] of organizations will continue to be average performers that continue to have some dysfunctional attributes associated with them," says Bill Kirwin, vice-president and research director with Gartner, which recently introduced an "enterprise personality profile" designed to help organizations highlight areas where better information and process change could help them. "We think that if we can give the average organization a way of trying to figure out what's going on, they can internalize this and actually start to execute against it."

Making this happen is far from easy. The widely divergent types of data found in most government organizations may be relatively straightforward to analyze on a project basis using conventional business intelligence (BI) tools, but tying them into overall control objectives can be downright impossible. Data ownership, inconsistencies and the need to identify measurable operational performance measures all contribute to this complexity.

For the Commonwealth Department of Education, Science and Training (DEST), a project to introduce "dashboards" reflecting current business performance is a major step in the ongoing effort to get better information about the programs it administers. In particular, the New Apprenticeships Program - which helps manage training for 1.6 million clients served by 32 recruitment companies dealing with 340,000 employers - is getting the dashboard treatment to expand the utility of its Hyperion business intelligence platform.

In the past, the BI tool has been used to generate regular reports about activity within the apprenticeships program based on the central TYIMS (Training and Youth Internet Management System) database. However, the data traditionally produced was time-based and targeted around day-to-day performance - limiting its usefulness in assessing and improving policies over the long term.

By broadening the scope of its reporting and using the Hyperion Compliance Management Dashboard to report on key performance indicators (KPIs) in real time, the system's nearly 100 users should get more useful information that provides far more meaningful assistance in improving the performance of DEST's services.

"We want to enable a bit more flexible analysis of performance, both within the established KPI framework and beyond it," says David Featherstone, technical director for DEST's Information Services Group, which is also in the throes of a data warehousing project that aims to consolidate information from across DEST's various operations.

"If [managers] are able to get hold of more detailed information on a more timely basis, they can work to achieve the benchmarks or KPIs they need to, and can put focus into the areas that they need to. We have tried to consolidate our data holdings and do more analysis of all the data we have available."

Construction of more comprehensive reporting frameworks - a trend now referred to as business process management (BPM) - will be essential to support departments' strengthening push for greater internal efficiencies. Many organizations have already embarked on qualitative enhancements such as ITIL-based service management improvements, but without a clear methodology for monitoring progress, the value of such initiatives will be severely compromised.

Gartner subsidiary Meta Group recently found that 64 percent of companies had already devoted specific budgets to compliance efforts - at this point, largely focused around BPM deployments - in order to improve their reporting capabilities and strengthen governance controls. By tying in process change with more targeted, real-time reporting, government organizations can both improve their overall performance and meet the governance obligations that require clear processes for business improvement.

5 e-Procurement: Buying Better

Attempts to improve the efficiency of government procurement are as old as government itself, and organizations are continuing to search for better ways to buy. Here, the sheer volume and variety of government buying have proved to be determined enemies: early marketplace-based concepts of government procurement simply failed to gather the momentum expected of them.

Because it has traditionally been such an expensive and time-consuming exercise, however, procurement remains on the radar screen of government. Online distribution of tender information (as through AusTender, the site now used for publishing tenders from 84 FMA Act Commonwealth departments) has been the low-hanging fruit for this process. Limited trials have proved that the concepts - if not the models themselves - are viable. Victoria's EC4P procurement program, for example, reportedly dropped the price of handling procurement orders from $65 to just $12.

Yet much of this value comes not from the specifics of EC4P, but from the simple act of automating what has traditionally been a heavily manual process. This distinction is important because any government organization can automate simple procurement and realize such cost savings - but extending that model across other parts of the organization, necessary to generate the kind of savings once envisioned for e-procurement, is a much more complicated effort altogether. The difference is one of limited purchasing improvements versus broad procurement process change, and it is one that continues to elude many departments.
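
The underlying arithmetic is simple enough to sketch. The per-order figures below come from the EC4P example quoted above; the annual order volume is a purely hypothetical assumption for illustration.

```python
# Back-of-the-envelope savings estimate. The $65 and $12 per-order costs are
# the EC4P figures reported above; the order volume is a hypothetical assumption.

manual_cost_per_order = 65.0      # reported pre-automation handling cost ($)
automated_cost_per_order = 12.0   # reported post-automation handling cost ($)
orders_per_year = 50_000          # hypothetical volume for a mid-sized agency

annual_saving = (manual_cost_per_order - automated_cost_per_order) * orders_per_year
print(f"Estimated annual saving: ${annual_saving:,.0f}")  # $2,650,000
```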

Efforts to unify the situation continue, with observers calling for greater collaboration between bodies including the state-level Australian Procurement and Construction Council (APCC) and the minister-level Online Council. Even their efforts may be handicapped by ongoing problems with implementations; a recent audit of Western Australia's GEM Purchasing e-procurement system, for example, cited local wins but noted the limitations of technological standards that have prevented the system from being used more widely.

Other wins have been few and far between, with common purpose emerging only where smaller groups of departments band together to identify overlaps in their purchasing. For example, NSW Health's own e-procurement initiative has continued to progress thanks to a relatively limited scope and fewer chances for outside interference.

Despite strategic efforts to commoditize procurement through a whole-of-government system, the clustered, siloed approach could well continue to typify government e-procurement efforts for the long term. For information executives, of course, any procurement benefit is better than none at all. Given the high cost of procurement and the proven potential to squeeze costs out of the process, it remains an area of high priority for government organizations of all sizes.

6 Open Source: A New Way of Thinking

The movement towards open source software started with Linux, which has recently gained strong support in the government sector, just as it has in the private sector. Several government organizations - including the NSW Department of Commerce, NSW Roads and Traffic Authority, and the Northern Territory government - have made widely publicized commitments to the operating system.

However, open source certainly does not end with Linux. At all levels of government, pockets of organizations are quickly warming to all manner of other open source applications. One particularly popular role for open source has been content management: systems such as Squiz MySource Matrix are in use at dozens of government organizations, among them the Commonwealth Department of Finance, which recently posted a Request for Tender to find an integration partner for the system.

Open source has gained currency as a philosophy across every part of the IT infrastructure. NSW Police recently installed Red Hat Linux extensively across its back end and remains philosophically supportive of open source as it establishes a single program office across all of its ICT projects. In April, Victorian voting authorities called for tenders for development of open source electronic voting machines. And also in April, NSW signed a $40 million procurement deal that makes Linux and associated services available across the state government.

Why the appeal of open source? Certainly, low cost (licences are usually free, although support still costs) is one factor. Others point out that open source is a way of sidestepping the dominance of Microsoft, although many would argue that government organizations should not be choosing critical software platforms out of spite or political preference.

A far more compelling reason to choose open source: it is particularly well suited to the needs of government organizations because its transparency lends itself to high levels of security and auditability - and therefore to good governance.

Better still, open source platforms traditionally seen as a functional compromise are gaining far more credibility: Linux is robust and, in recent desktop incarnations, eminently usable. Previously proprietary vendors are also coming to the table. IBM and other firms have open-sourced popular databases, while Sun recently went so far as to open-source, and give away, its enterprise-level Solaris operating system. That bold move presents tantalizing new options for government departments, many of which have already standardized on Solaris and may have just found another reason not to give it up for Windows Server.

Simply choosing potential applications because they are open source is hardly a strategy, however; traditional criteria that applications be fit for purpose remain. It is a fine line to walk, and here the Australian Government Information Management Office (AGIMO) has taken a leadership role in helping government organizations by publishing its A Guide to Open Source Software for Australian Government Agencies procurement paper.

Released in April, the paper has received worldwide recognition as a concrete step forward in the encouragement of open source within government, and it provides invaluable procurement advice for government organizations contemplating the move. One of those pieces of advice: that government departments "disregard any industry fads or novelty value" and assess open source applications based on fitness for purpose and value for money.

In making these assessments, government organizations must consider practical issues such as support and ongoing development - especially problematic in smaller state and local government organizations that may not have any Linux skills and only minimal development resources. For such organizations, the move to open source, no matter how appealing it seems, can be fraught with complexity and the risk of depending even more heavily on external suppliers for support and service.

Eventually, greater adoption of open source applications should mean that skill sets will catch up with those for Microsoft applications. For now, support - and the attendant accountability it introduces - remains a significant issue. However, even if Microsoft continues to dominate government desktops and strengthen its presence at the back end, AGIMO's recognition of open source software represents a significant milestone in its evolution. It is a philosophical step towards self-determination on the part of government, and in the long term could prove to be a real empowerment to government organizations that have long felt tied to the whim of proprietary vendors.
