Simplifying Storage to Speed Innovation in the Cloud Era

How senior technology executives are dealing with the tsunami of internal and external data

Data is the new currency for organisations looking to understand more about customer behaviour to help drive their innovative business transformations. The more information they can collect and analyse – increasingly using cognitive applications – the more targeted their products and services will be.

To manage this tsunami of internal and external data and gain the insights they need, many enterprises are looking to optimise their storage infrastructure and are often adopting hybrid cloud storage models.

Senior technology executives gathered at O Bar and Dining in Sydney recently to discuss how they are dealing with this storage growth. The luncheon was sponsored by IBM and Meridian IT.

Smart organisations have adopted some form of data management strategy which could be as simple as a ‘hot and cold’ storage pool with software automatically moving data based on certain thresholds such as the age of the information, says Nick Milsom from IBM Cloud Object Storage A/NZ. 

“The key is the software and its ability to automatically match the value of the data to a class of storage and as the profile changes, move the data up and down the storage pools,” Milsom says.
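The policy Milsom describes can be pictured as a simple age-to-tier lookup. This is a minimal sketch only; the tier names and thresholds below are illustrative assumptions, not IBM's actual defaults.

```python
from datetime import datetime, timedelta

# Hypothetical tier thresholds -- names and ages are illustrative only.
TIER_THRESHOLDS = [
    ("hot", timedelta(days=30)),        # frequently accessed, flash-backed
    ("warm", timedelta(days=180)),      # occasional access, spinning disk
    ("cold", timedelta(days=365 * 2)),  # rare access, object storage
]

def choose_tier(last_accessed: datetime, now: datetime = None) -> str:
    """Match a data object's age to a storage class, newest to oldest."""
    now = now or datetime.utcnow()
    age = now - last_accessed
    for tier, limit in TIER_THRESHOLDS:
        if age <= limit:
            return tier
    return "deep_archive"  # anything older goes to tape or cloud archive
```

In a real software-defined storage product the same rule would run continuously in the background, re-evaluating objects as their access profile changes and migrating them up or down the pools.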

Many organisations are also now realising that all archive data should live in the cloud. But just as with an ill-suited phone plan, if the cloud pricing tier doesn’t match the actual data access pattern, you can end up with a monthly bill that is a lot higher than expected, he says.

“A true hybrid cloud model with common software across ‘on-premise and off-premise’ storage can facilitate the best of both worlds, something we are seeing a lot more demand for,” says Milsom.

Dave Glover, CTO at Salmat, says the multi-channel marketing organisation runs a mix of on-premise and cloud-based storage infrastructure.

“We have storage we manage and storage that is managed by virtue of the software-as-a-service (SaaS) solutions we use such as those provided by Workday, Google, and Salesforce. Our requirements continue to grow; we are dealing with new sources of data and the time we are required to store certain information is increasing,” he says.

“We have more sources of data than we used to have, we are required to retain data for long periods, and disposing of data when it reaches end of life is a time-consuming task where the effort involved can outweigh the incremental cost of keeping it.”

The move to an infrastructure-as-a-service (IaaS) storage model allows the organisation to leave “hygiene” issues of the past behind with “no more expensive upgrades, no more tape libraries and the benefit of check-box redundancy,” he adds.

Each of Enero Group’s companies has different storage requirements, says group IT director, Daniel O’Sheedy.

“For example, BMF Advertising will produce terabytes of video footage for TV advertisements which needs to be stored securely whereas Hotwire PR, which is engaged in a lot of B2B public relations work, is fairly light-on with its storage requirements,” O’Sheedy says.

Enero Group runs a local SAN for file storage at each company across the group and, until around three years ago, backups were sent to tape.

“[A cloud archive] solution became cheaper than tape backup so we switched. But the rise in cybercrime such as the CryptoLocker [ransomware attack] meant that our data could potentially become unavailable in an instant,” says O’Sheedy.

“We then added an extra layer of backups at each site using network attached storage. Protecting against the risk of cybercrime moved us to keep more copies of the data so that we can get our business back online quickly in the event of a disaster.”

“Replicated data in multiple locations also requires more storage and as we have offices in Australia, the UK and the US, we also need to consider the data retention laws worldwide. So even more long-term storage is needed to account for that,” he says.

Douglas Zuzic, information systems manager at Richard Crookes Constructions, says his organisation is currently running a mix of on-premise and cloud storage to deal with a doubling of its storage requirements over the past 18 months.

The digitalisation of construction processes, increasing staff numbers, and the growing amount of data stored to manage construction projects have all contributed to this growth, says Zuzic.

“Using cloud providers definitely makes sense for project-based organisations such as ours, allowing for quick response without the up-front capital,” he says.

Meanwhile, Jonathan Chaitow, general manager, architecture, design and innovation at infrastructure maintenance services provider, Broadspectrum, says data retention requirements, a lack of focus on purging or de-duplication, and increased application logging and encryption are all contributing to storage growth.

Broadspectrum is turning to cloud service providers exclusively as the organisation “prefers to specify the service levels and response times rather than the underlying technologies. Cloud service providers offer economies of scale and growing capabilities, which enable the organisation to focus on its core business rather than infrastructure and operations,” says Chaitow.

Which technologies are most suitable now and in the future?

According to IBM’s Milsom, organisations that have a software-defined storage strategy would ideally have the right quantity of ‘hot, warm, cold and deep archive’ storage.

“Flash is for high performance; tape, object or cloud is for deep archive. What you use really depends on access profile, how long the data needs to be kept for and cost,” he says.

Milsom adds that the ability to write a business application that can store and retrieve data via an API is a game changer. It means the effort of managing large data stores can be reduced by a factor of 10 or more.

“Traditional practices such as backup can be adapted or completely removed – unstructured data is accessed by the application directly via an API. If you can make things simpler and cost-effective, it’s a lot easier to justify keeping more,” he says.
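The pattern Milsom is describing can be sketched with a toy in-memory stand-in for an object store. The class and method names below are illustrative, not any vendor’s actual API; real services expose the same verbs as HTTP PUT/GET/DELETE requests against a bucket and key.

```python
class ObjectStore:
    """Toy stand-in for an object-storage API (names are illustrative).

    A dict plays the role of the remote store; in a real service,
    durability and replication are handled server-side.
    """

    def __init__(self):
        self._objects = {}

    def put(self, bucket: str, key: str, data: bytes) -> None:
        self._objects[(bucket, key)] = data

    def get(self, bucket: str, key: str) -> bytes:
        return self._objects[(bucket, key)]

    def delete(self, bucket: str, key: str) -> None:
        del self._objects[(bucket, key)]

# The application stores and retrieves unstructured data directly,
# so there is no separate file system or tape library to back up.
store = ObjectStore()
store.put("invoices", "2017/05/inv-001.pdf", b"%PDF-...")
document = store.get("invoices", "2017/05/inv-001.pdf")
```

Because the application owns the store-and-retrieve path, lifecycle tasks such as retention and disposal become application logic rather than storage-administration work, which is where the claimed management savings come from.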

Salmat’s Glover says the technology is of little interest to his team.

“Our plans to move more and more storage to the cloud mean [choosing technology] is a problem I don’t need to consider. It seems that every SAN vendor is selling flash and spinning disk ‘combo boxes’ that deliver performance without breaking the bank. The reality is that everyone needs to make their decisions by trading off cost versus performance. The underlying technology is of minor interest.”

Finally, Enero Group’s O’Sheedy says the majority of his organisation’s business requirements are satisfied with ordinary hard drive speeds, and the premium for flash storage hasn’t yet been commercially justifiable.

“However, the one area we would use flash would be with the data analysis teams,” he says. “On a standard ‘old-school server’, teams could take up to 24 hours to do their analysis. If they want to run another analysis to check correlations, there’s another 24 hours gone. If the data file is corrupt, that means another day is gone,” he says.

But because this type of work is usually project-based, it is difficult to justify the capital expenditure to purchase and run such a large piece of hardware. Instead, the organisation has used a fully-managed data warehouse service with great success, says O’Sheedy.

“We can ingest the data, automate processes and give our data scientists the front end they need for their work. If an ongoing project eventuates, we don’t have to rebuild the environment and we can just keep paying for it under an OPEX model. If the project comes to an end, then we export the data and shut it off.”