Predictive Analytics: How Clear Is the ROI?

By David McCann

Companies claim that predictive analytics has earned them rich returns on their investments, but the numbers may not be as solid as they appear.

One of the hottest trends in business intelligence is predictive analytics — tools that sift through a company’s data to identify patterns, risks, and opportunities. Mueller Inc. was so sure predictive analytics could help it solve business problems that when the company bought IBM’s popular SPSS Modeler package last fall, it hadn’t yet identified what to use it for, let alone considered what the return on investment might be. But within just 90 days of implementing the software for a pair of projects, Mueller reaped a 248% ROI on the purchase. Or did it?

Nucleus Research, a research and IT advisory firm, arrived at that figure after analyzing the results of Mueller’s efforts. Mark Lack, manager of strategy analytics and business intelligence for the Ballinger, Tex.-based maker of pre-engineered metal buildings and metal roofing products, was happy with the software’s performance and “comfortable” that the ROI calculation was “in the ballpark.”

“Nucleus used its own methodology to calculate the ROI, but it was based on numbers we gave them,” Lack says. “We’re a very conservative organization, and we’re not comfortable giving out numbers unless we believe they’re realistic.”

But just how solid is that ROI number? Does it incorporate every effect from implementing recommendations suggested by the analysis? Does it matter if it’s not right on the money? Is calculating ROI for predictive analytics even important? Definitive answers to those questions are hard to come by.

To be sure, some predictive analytics projects produce quantitative, verifiable results. One of Mueller’s projects arose from customer complaints about custom-made steel pieces that were being delivered damaged, which the company’s sales force characterized as a big problem. “We deliver material to places that are in the middle of nowhere, and it’s expensive to manufacture another piece, put it on a truck, and get it out there,” says Lack.

The company was about to purchase machinery that would package its products differently when its CEO suggested that Lack first test the assumption that the packaging was to blame. So Lack used the SPSS software to analyze 3 million delivery records spanning five years and found product-damage reports for fewer than one delivery in a thousand. He also found that in 70% of those cases the product had not actually been delivered damaged; it had been produced to erroneous specs supplied by customers and their Mueller sales reps.
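
The arithmetic behind that finding is easy to reproduce. Below is a minimal sketch in Python of how such a check might look, assuming a flat extract of delivery records with a damage-report flag and a root-cause field; the file and column names are illustrative, not Mueller’s actual schema or its SPSS Modeler workflow.

    import pandas as pd

    # Hypothetical extract of five years of delivery records (~3 million rows).
    # File and column names are illustrative only; assumes a boolean damage_report column.
    deliveries = pd.read_csv("deliveries.csv")

    total_deliveries = len(deliveries)
    damaged = deliveries[deliveries["damage_report"]]            # rows with a damage report

    damage_rate = len(damaged) / total_deliveries
    print(f"Deliveries with damage reports: {damage_rate:.2%}")  # <0.1% in Mueller's case

    # Of the reported cases, how many trace back to bad specs rather than transit damage?
    spec_errors = damaged[damaged["root_cause"] == "erroneous_spec"]
    spec_error_share = len(spec_errors) / len(damaged)
    print(f"Share attributable to spec errors: {spec_error_share:.2%}")  # ~70%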

Much of Mueller’s early ROI on the software came in the form of expense avoidance: the company didn’t need the new packaging equipment, because damaged deliveries turned out to be a negligible fraction of the total, Lack notes.

Easier Said Than Done?

That anecdote seems to support the contention of predictive-analytics consultants and researchers that calculating ROI is a straightforward comparison of technology and labor costs with before-and-after business results. Indeed, many companies run controlled studies where, for example, a new marketing tactic suggested by the analytics is directed at a portion of an audience, with the benefit quantified by comparing it with results for a control group.
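
In its simplest form, that before-and-after arithmetic reduces to incremental benefit over cost. The sketch below uses made-up campaign figures purely to illustrate the mechanics; none of the numbers or variable names come from Nucleus, Mueller, or any other company mentioned here.

    # Illustrative only: benefit of an analytics-driven tactic measured against
    # a control group, then converted into a simple ROI figure.
    treated_revenue_per_head = 12.40    # audience that received the new tactic
    control_revenue_per_head = 10.10    # comparable audience that did not
    treated_audience_size = 50_000

    incremental_benefit = (treated_revenue_per_head - control_revenue_per_head) * treated_audience_size

    technology_and_labor_cost = 60_000  # software plus analyst time

    roi = (incremental_benefit - technology_and_labor_cost) / technology_and_labor_cost
    print(f"Incremental benefit: ${incremental_benefit:,.0f}")   # $115,000
    print(f"ROI: {roi:.0%}")                                     # 92%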

But that doesn’t mean the process of calculating ROI is necessarily devoid of subjectivity. The kinds of results factored in (or omitted) and the way control groups are selected, among other things, could influence the results.

Return-on-investment calculations for predictive analytics are generally “not ironclad, yet so many people use them to make investment decisions,” says Lack. He acknowledges that he doesn’t know the best way to calculate ROI in every analytics scenario and defers to Nucleus’ methodology. Still, “it may or may not have been the same method I would have taken,” he notes.

Nina Sandy, principal analyst at Nucleus, notes that the firm uses a standard ROI methodology and is registered with the National Association of State Boards of Accountancy, “ensuring its research will withstand the most detailed financial review.” But when asked whether there could be impacts from predictive analytics that are not measurable, she answers, “There certainly could be. [For example, companies] may find sales anomalies and customer-service improvements from getting better insight on sales cycles. Some of it may be more of an intuitive process. But there are [also] absolute numbers that can be applied to what you gain.”

Must ROI Be Proved?

Convenience-store chain Wawa, an avid user of predictive analytics, seeks to reach a specific ROI threshold for most analytics-driven projects. “We calculate ROI and have benchmarks we need to work within for every expense project and anything involving the use of capital, especially building new stores,” says CFO Cathy Pulos.

But many companies don’t bother to perform ROI analyses. John Elder, CEO of analytics firm Elder Research, says about half his clients “do a pretty good job of calculating ROI, because people have arguments to make higher up the chain about making investments.” The others think predictive analytics generates value but aren’t very interested in quantifying that worth.

Other companies can’t really calculate a valid ROI, says Dan Vesset, program vice president of business analytics at analyst firm IDC. “At times it is very difficult to quantify what it means to make a better decision [based on predictive analytics], because that’s not an end in itself,” he says. “You need to follow through on it and change your marketing campaign, or the way you interact with customers, or stock your inventory, or whatever. Companies can get too focused on just the data and analytics. I don’t think an ‘If we build it they will come’ approach works.”

All Over the Map

A 2011 report by IDC, sponsored by IBM, concluded that the typical ROI for analytics projects that incorporate predictive analytics was about 250%, very much in line with Nucleus’ assessment of Mueller’s efforts. Nonpredictive analytics — focused only on accessing information and internal productivity gains rather than testing hypotheses in order to make predictions — returned 89%.

But ROI claims are all over the map. Nucleus reported in 2012 that based on an examination of 60 analytics-related ROI case studies, every dollar invested in predictive-analytics, business-intelligence, and performance-management products generated a gain of $10.66, or an ROI of more than 1,000%.
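
Converting a gain-per-dollar figure into an ROI percentage is a one-line calculation; the sketch below simply restates Nucleus’ published numbers, with a note on how the reading of “gain” changes the result.

    # Nucleus' 2012 figure: every $1.00 invested generated a gain of $10.66.
    gain = 10.66        # net benefit per dollar invested, as the claim is worded
    investment = 1.00

    roi = gain / investment           # ROI expressed as net gain over cost
    print(f"ROI: {roi:.0%}")          # 1066%, i.e. "more than 1,000%"

    # If the $10.66 were instead read as gross return rather than net gain, the same
    # data would yield (10.66 - 1.00) / 1.00 = 966% -- one small reason such claims vary.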

Elder was part of a team that helped the Internal Revenue Service create an improved model for flagging suspicious-looking tax returns. With the model the government agency had been using, only 1% of flagged returns turned out to be cases of fraud. The new model improved the success rate to 25%, Elder says. After several years, the model was credited with a return of $6 billion to $7 billion, he says.
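
In model-evaluation terms, that jump is an increase in the hit rate (precision) on flagged returns. The sketch below uses a hypothetical audit volume just to show what the improvement means for a fixed amount of investigative effort; the volume is invented, not an IRS figure.

    # Illustrative only: effect of a higher hit rate for a fixed number of
    # investigated returns. The audit volume is hypothetical.
    flagged_returns_worked = 100_000

    old_hit_rate = 0.01     # 1% of flagged returns were actually fraudulent
    new_hit_rate = 0.25     # 25% with the improved model

    old_fraud_found = flagged_returns_worked * old_hit_rate
    new_fraud_found = flagged_returns_worked * new_hit_rate

    print(f"Fraud cases found, old model: {old_fraud_found:,.0f}")   # 1,000
    print(f"Fraud cases found, new model: {new_fraud_found:,.0f}")   # 25,000
    print(f"Improvement: {new_fraud_found / old_fraud_found:.0f}x")  # 25x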

With regard to calculating predictive-analytics ROI, Elder allows that “some things aren’t obvious, and you have to make educated guesses.” For example, there’s the cannibalization effect. It took some time for McDonald’s to realize that when it promoted fish filets, it sold fewer Quarter Pounders, he notes. “Fully factoring in things like that is sometimes harder than it looks,” Elder says.

Many companies err in trying to make ROI “bulletproof,” says Richard Boire, a founding partner of analytics firm Boire Filler Group. That is, they try to come up with an ROI that allocates a portion of fixed costs to an analytics project.

Boire avoids that. “I try to look at measures that are variable based on what I’m doing,” he says. “For a promotional campaign, for example, I may be impacting the number of phone calls, e-mails, or direct-marketing packages that go out. I don’t want to throw in the cost of the building or the people working on the project,” even though labor costs are commonly included in ROI calculations.
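
The effect of that choice is easy to see with invented numbers; the figures below are illustrative only and do not come from Boire Filler Group.

    # Illustrative only: the same incremental benefit measured against variable
    # costs alone versus variable costs plus an allocation of fixed costs.
    incremental_benefit = 150_000
    variable_costs = 40_000          # extra calls, e-mails, direct-mail packages
    allocated_fixed_costs = 60_000   # share of building, overhead, salaried staff

    roi_variable_only = (incremental_benefit - variable_costs) / variable_costs
    roi_fully_loaded = (
        (incremental_benefit - variable_costs - allocated_fixed_costs)
        / (variable_costs + allocated_fixed_costs)
    )

    print(f"ROI, variable costs only: {roi_variable_only:.0%}")   # 275%
    print(f"ROI, fully loaded:        {roi_fully_loaded:.0%}")    # 50%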

With so many gray areas, is there any point in trying to measure the ROI of predictive analytics? Sure, says Elder. “The good news is that predictive analytics is not trying to figure out how the brain works. There are real-world things that it’s trying to optimize. When you implement an analytics recommendation engine, you have a before and an after, and the incremental returns are much more measurable than is the case with many other things.”

Source: www.cfo.com
