6 data analytics success stories: An inside look
- 05 September, 2017 20:00
If data is the new oil, then knowing how to refine it into actionable intelligence is the key to leveraging its potential. To this end, CIOs are playing with predictive analytics tools, crafting machine learning algorithms and battle-testing other solutions in pursuit of business efficiencies and new ways to serve customers.
Hyperaware that reducing costs or boosting revenues can help them shine in the eyes of the C-suite and board of directors, CIOs are spending more than ever on technologies that support data science. Worldwide revenues for big data and business analytics will reach $150.8 billion in 2017, an increase of 12.4 percent over 2016, according to IDC. Commercial purchases of hardware, software and services intended to support big data and analytics are expected to exceed $210 billion. IDC analyst Dan Vesset notes that big data analytics solutions have become key pillars of digital transformation efforts across industries and business processes worldwide.
But there is a dark side to this spending spree: Most data analytics projects fail to yield measurable value. Legacy systems and business-line bureaucracies have spawned data silos and perpetuated poor data quality. And CIOs are still struggling to fill the gaps in talent required to manipulate data for insights. The war for talent is fierce, and the rise of university analytics programs isn't producing qualified candidates fast enough.
Yet data analytics success stories abounded at the CIO100 Symposium earlier this month, where several IT leaders detailed their efforts and were honored for them. CIOs also shared lessons learned and advice for peers undertaking similar efforts.
Making data analytics work at Merck
Merck, which had grown to become a $40 billion global healthcare company operating in 140 markets worldwide, sought to use the data collected in its ERP and core systems for manufacturing execution and inventory control to gain more business insights. But with Merck engineers spending 60 to 80 percent of their effort finding, accessing and ingesting data for each project, the business objective was often lost by the time the analysis began. "We were not viewing data as a viable, permanent and valuable asset," said Michelle D'Alessandro, CIO of manufacturing IT at Merck. "We wanted to establish a culture where we spent far less time moving and reporting the data and far more time using the data for meaningful business outcomes."
Merck created MANTIS (Manufacturing and Analytics Intelligence), an über data warehousing system comprising in-memory databases and open source tools that can crunch data housed in structured and unstructured systems, including text, video and social media. Importantly, the system was designed so that non-technical business analysts could easily explore the data in visualization software, while data scientists could access the same information through sophisticated simulation and modeling tools. MANTIS has helped decrease the time and cost of the company's overall portfolio of IT analytics projects by 45 percent. The tangible business outcomes include a 30 percent reduction in average lead time and a 50 percent reduction in average inventory carrying costs.
Lessons learned: A key to her success, D'Alessandro says, was identifying a "lighthouse" analytics project in an Asia-Pacific plant where Merck would see the biggest payback. Demonstrating success with MANTIS there became a call to action for other sites. She also learned not to bite off more than she could chew. D'Alessandro says she "overreached" in an early experiment to use artificial intelligence and machine learning to analyze the costs of Merck's manufacturing processes. "It wasn't for lack of sponsorship or lack of vision, we just couldn't get it to work," D'Alessandro said.
Dr. Pepper Snapple Group taps machine learning for contextual relevance
For years, Dr. Pepper Snapple Group’s sales route staff grabbed a fat binder filled with customer data, notes on sales and promotions, and hit the road to woo retail clients such as Wal-Mart and Target. Today, instead of a binder, sales staff are armed with iPads that tell them what stores they need to visit, what offers to make retailers, and other crucial metrics. "They were glorified order takers,” said Tom Farrah, CIO of Dr. Pepper Snapple Group. “Now they are becoming intelligent sales people equipped with information to help achieve their goal.”
The platform, MyDPS, is equipped with machine learning and other analytics tools that funnel recommendations and a daily operational scorecard to workers when they load the app. Algorithms show staffers how they are executing against their expected projections, including whether they are on track to meet their plan or falling below it, along with insights as to how they can course correct. "If I am going to make someone successful, I have to ensure the information they have is contextually relevant," Farrah said.
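The on-track calculation behind a scorecard like this can be sketched in a few lines. The following is a hypothetical illustration, not the actual MyDPS logic; the field names and the run-rate projection are assumptions:

```python
def scorecard(sales_to_date, plan_total, day_of_period, days_in_period):
    """Compare actual sales pace against the period plan.

    Illustrative only: the real MyDPS data model and metrics
    are not public, so these inputs are invented.
    """
    # Where sales "should" be at this point in the period.
    expected = plan_total * day_of_period / days_in_period
    gap = sales_to_date - expected
    # Project the end-of-period total at the current run rate.
    projected = sales_to_date / day_of_period * days_in_period
    return {
        "expected": expected,
        "gap": gap,
        "on_track": gap >= 0,
        "projected": projected,
    }
```

For example, a rep at $450 of a $1,000 plan on day 10 of 20 would see a $50 shortfall and a projected finish of $900, a prompt to course correct before the period closes.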
Lessons learned: To test the proof of concept for MyDPS, Farrah gave the software to four people in one branch and had the president of the business visit them. The visit revealed that sell-in execution had improved by 50 percent over the previous month, convincing him to greenlight the project. "He got the result and that's what it took to sell it in," Farrah said. "It's really important that you not just have the business sponsor for the project, but that they want the result it's going to deliver."
Bechtel disrupts itself with big data center of excellence
A little-known fact: Construction-related spending accounts for 13 percent of GDP, but the industry as a whole has generated only a 1 percent productivity gain in the past two decades, says Carol Zierhoffer, CIO of Bechtel. Experts say the sector can boost productivity 50 to 60 percent by rewiring contracts, upskilling workers and improving onsite execution, among other tweaks. Bechtel, which built the Hoover Dam, the English Channel Tunnel and other marvels, began unearthing insights from data buried in various parts of the business.
Zierhoffer met with industry peers at Wal-Mart, Boeing and Lockheed Martin to gain insights about how to move forward. The company built a big data center of excellence, in which sits a data lake comprising 5 petabytes of data, and launched proofs of concept. One used photo recognition technology to inspect and label photos of work sites on behalf of customers, saving $2 million. Natural language processing (NLP) tools parse claims, RFPs and contracts. Estimates and plans that once took days or weeks now take hours. Bechtel has also expanded the analytics efforts to look at staff retention, including trying to anticipate when employees may leave. "We believe we are knocking on the door of that challenge in productivity," Zierhoffer said.
Lessons learned: Data silos and data quality are a bear. Although Bechtel can analyze large volumes of data, the quality of the data scattered across the business must still be improved. "We had to disrupt ourselves and look at how we work and bridge data silos," Zierhoffer said.
RRD’s road to new business, thanks to machine learning
Some years ago RRD, the marketing communications company formerly known as R.R. Donnelley, opened a logistics division to ship its print materials to consumers and businesses. To support the business, it managed loads for itself and shipped anything from washing machines to dog food on behalf of its partners, eventually growing to a $1 billion enterprise. The challenge? Finding optimum shipping rates in a world where FedEx and UPS were the undisputed kings.
Variables such as weather, geography, drivers and political climates made rates volatile and costly to the business. With the pressing need to predict rate variables, RRD turned to machine learning and analytics, said Ken O'Brien, CIO of RRD. It hired staff and partnered with universities to help write algorithms, testing thousands of scenarios across 700 routes until it was able to anticipate freight rates in real time, seven days in advance, with 99 percent accuracy. "The project paid for itself in under a year and we're still seeing growth in that business related to freight," O'Brien said. The company projects that in 2017 its truckload brokerage business will grow from $4 million to $16 million, a $12 million increase in revenue against a $600 million business.
Lessons learned: New enterprises require a high-level commitment, though O’Brien admits that some of his business peers were ready to throw in the towel at various points along the way. The business didn’t trust the technology for a process that was typically done by feel and guesswork. RRD set up a collaborative environment in which business and IT worked together to influence the outcome. “You will stumble, you will have challenges, but be patient,” O’Brien said.
Monsanto taps machine learning for optimal planting plans
Farmers are forever agonizing over which seeds to plant, how much, where and when. Seed giant Monsanto is on the case, using data science to make prescriptive recommendations for planting. Mathematical and statistical models plot out the best times to plant male and female plants and where to plant them, ideally to maximize yield and reduce land utilization. Its machine learning algorithm churns through more than 90 billion data points in days, rather than weeks or months, said Adrian Cartier, director of global IT analytics at Monsanto. The business benefits? In 2016, Monsanto saved $6 million and reduced its supply chain footprint by 4 percent. "In North America, a 4 percent land utilization reduction equates to a lot of land not being used and a lot of money saved," Cartier said.
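Stripped to its core, a planting plan is an optimization problem: assign each field a planting window so that total projected yield is maximized. The toy brute-force sketch below shows the shape of the problem; the field names and yield numbers are invented, and Monsanto's models work statistically at a vastly larger scale than exhaustive search could handle:

```python
from itertools import product

def best_plan(fields, windows, yield_fn):
    """Choose the planting window for each field that maximizes
    total projected yield.

    Toy stand-in for large-scale planting optimization:
    yield_fn(field, window) is an assumed, caller-supplied
    projection, not a real agronomic model.
    """
    best_total, best_assignment = float("-inf"), None
    # Try every combination of window choices across the fields.
    for choice in product(windows, repeat=len(fields)):
        total = sum(yield_fn(f, w) for f, w in zip(fields, choice))
        if total > best_total:
            best_total, best_assignment = total, dict(zip(fields, choice))
    return best_assignment, best_total
```

Brute force works for a handful of fields; at Monsanto's scale (90 billion data points), the same objective would be attacked with statistical models and heuristic search rather than enumeration.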
Lessons learned: The key for Monsanto was instilling a "cradle to grave" collaboration between IT and the supply chain business. "Their domain expertise from an agricultural and supply chain perspective, married with our domain expertise of mathematics and statistics, created the value that we were able to deliver," Cartier said. Cartier said he also sought out "change leaders and advocates" within the supply chain business to counterbalance the inevitable naysayers.
For Pitt Ohio, predictive analytics delivers success
The freight industry is under fire from the so-called “Amazon Impact,” said Scott Sullivan, CIO of Pitt Ohio. Pitt Ohio, a $700 million freight company, had gotten used to picking up freight and delivering it to customers the next day. But thanks to Amazon, customers are increasingly expecting same-day delivery. And they expect more information about their packages.
"Customers now want to know not only when their freight will be picked up but when it will be delivered, so they can plan their workload," Sullivan said. Using historical data, predictive analytics and algorithms that weigh freight weight, driving distance and other factors in real time, Pitt Ohio can estimate the time a driver will arrive at a delivery destination with 99 percent accuracy. The company estimates that it has increased revenue through repeat orders (estimated at $50,000 per year) and reduced the risk of lost customers (estimated at $60,000 per year).
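A bare-bones version of such an arrival estimate combines driving time with time spent at the stops ahead of a delivery. The function below is a simplified sketch under assumed parameter names and default values; Pitt Ohio's actual model weighs many more live factors, such as freight weight and traffic:

```python
def estimate_arrival(depart_hour, distance_miles, stops_before,
                     avg_speed_mph=45.0, minutes_per_stop=20.0):
    """Estimate the hour a driver reaches a delivery destination.

    Simplified illustration: assumes a constant average speed and
    a fixed dwell time per earlier stop, both invented defaults.
    """
    drive_hours = distance_miles / avg_speed_mph
    stop_hours = stops_before * minutes_per_stop / 60.0
    return depart_hour + drive_hours + stop_hours
```

A truck leaving at 8:00 with 90 miles to cover and three stops first would be estimated at 11:00 under these defaults; a production model would recompute the estimate continuously as conditions change en route.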
Lessons learned: Sullivan says it was a cross-departmental affair involving market research, sales operations and IT, all of whom checked and re-checked results to make sure they were accomplishing their objectives. “There’s a lot of data within your four walls — be innovative and look for challenging ways to use it,” Sullivan said.