IBM moves into Stream Computing with System S

Real-time analytics tool can handle data streams of 6GB per second

IBM has launched new 'stream computing' software that allows real-time analysis of several hundred simultaneous streams of data.

System S, which began as a research program at IBM Research seven years ago, is designed to make sense of vast volumes of data, such as stock prices, retail sales, Twitter feeds, and weather information.

A scalable middleware platform, System S can be applied to correlate and parse data and recognise patterns, deriving fast, accurate analytics to aid decision-making.

Glenn Wightwick, IBM Australia Development Laboratory director, said by way of example that supermarket chains typically collect and store information on what products were being sold, and when, in a point of sale (POS) system. At night this information is often uploaded to a decision support system, where queries are run to generate a usable business information report the next morning.

“Instead, you could have RFID tags on each product so that when they were taken off the shelf, those events could flow into System S and create a real-time dashboard of what is happening in the store and perhaps adjust prices or make some business decision that is of competitive advantage to them,” he said.

This information could also be integrated into a retailer’s supply chain system to automatically re-order goods as they were taken off the shelf by customers, Wightwick said.
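To illustrate the kind of pipeline Wightwick describes, here is a minimal Python sketch (not System S code, which has its own dataflow tooling) that consumes a hypothetical stream of RFID shelf-pickup events and maintains a running per-product tally that a real-time dashboard or automatic re-ordering rule could read. The event format, threshold and function name are illustrative assumptions, not details from the article.

```python
from collections import Counter
from typing import Dict, Iterable, Tuple

# Hypothetical event: (product_id, quantity) emitted when an RFID-tagged
# item leaves the shelf. In a stream-computing system these would arrive
# via a source adapter; here we simply iterate over an in-memory stream.
def shelf_pickup_dashboard(events: Iterable[Tuple[str, int]],
                           reorder_threshold: int = 10) -> Dict[str, int]:
    picked = Counter()
    for product_id, quantity in events:
        picked[product_id] += quantity          # update running tally as each event arrives
        if picked[product_id] >= reorder_threshold:
            print(f"re-order trigger: {product_id} ({picked[product_id]} units picked)")
            picked[product_id] = 0              # reset after issuing a re-order
    return dict(picked)

# Example usage with a toy event stream
if __name__ == "__main__":
    stream = [("milk-2L", 1), ("bread", 2), ("milk-2L", 1)] * 6
    shelf_pickup_dashboard(stream, reorder_threshold=10)
```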

According to Wightwick, much of the value of System S lies in its ability to scale up to process massive amounts of data in real time. Uppsala University and the Swedish Institute of Space Physics, for example, expect to be able to use System S to perform analytics on at least 6 gigabytes of data per second over the next year as part of a project to study "space weather" phenomena such as solar flares and geomagnetic storms.

Work carried out with Toronto Dominion Bank and IBM Research demonstrated how the software was able to analyse some 6 million stock trades per second, Wightwick said.

“In the finance industry, if you can reach a conclusion a microsecond ahead of your competitor then you can turn that into an advantage in your decisions to buy and sell stock,” he said.
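As a rough illustration of the kind of low-latency trade analysis described above (again a plain Python sketch, not System S itself), the following maintains a sliding-window average price per stock symbol and flags trades that deviate sharply from it, the sort of signal a buy/sell decision might be based on. The window size and threshold are assumed parameters for illustration only.

```python
from collections import defaultdict, deque

# Hypothetical trade event: (symbol, price). Window size and deviation
# threshold are illustrative, not figures from the article.
def detect_price_moves(trades, window=100, threshold=0.02):
    history = defaultdict(lambda: deque(maxlen=window))
    for symbol, price in trades:
        recent = history[symbol]
        if recent:
            avg = sum(recent) / len(recent)
            if abs(price - avg) / avg > threshold:
                # In a real system this signal would feed a buy/sell decision
                yield symbol, price, avg
        recent.append(price)

# Example usage with a toy stream of trades
if __name__ == "__main__":
    import random
    ticks = [("XYZ", 100 + random.gauss(0, 0.5)) for _ in range(1000)]
    ticks.append(("XYZ", 105.0))  # a jump the detector should flag
    for symbol, price, avg in detect_price_moves(ticks):
        print(f"{symbol}: trade at {price:.2f} vs window average {avg:.2f}")
```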

System S is capable of running on a standard Linux-based blade centre environment, but can scale up to IBM's powerful Blue Gene supercomputer, capable of petaflop-scale computing with hundreds of thousands of cores.

Despite the enormous quantities of data System S is able to process, extensive storage arrays are not required to use the system, Wightwick said.

“One of the motivations for building System S is that you don’t store things [because] the volume of data and events is so large that it isn’t feasible to build a storage environment to handle it,” he said.
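That point about not storing the raw stream can be illustrated in the same sketchy Python style: a running aggregate is updated as events flow past, so only a few summary values are held in memory regardless of how long the stream runs. The generator-based feed below is purely a stand-in for a live data source.

```python
# Compute a running mean over an unbounded event stream without ever
# materialising the events; memory use stays constant.
def running_mean(stream):
    count, total = 0, 0.0
    for value in stream:
        count += 1
        total += value
        yield total / count   # current mean, updated per event

# Example: a generator stands in for an endless sensor or market feed
if __name__ == "__main__":
    feed = (x * 0.1 for x in range(1_000_000))
    for i, mean in enumerate(running_mean(feed)):
        if i % 250_000 == 0:
            print(f"after {i + 1} events, mean = {mean:.2f}")
```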

Currently IBM is in discussions with the University of Melbourne and NICTA regarding collaboration on a project to monitor smart sensors on farms to measure soil temperatures and control automated watering systems, he said.

IBM plans to make System S trial code available for free to help foster a better understanding of the software's capabilities. The trial code will include developer tools, adapters and software to test applications. For more information, visit IBM's InfoSphere Streams Web site.
