Future Results Not Guaranteed

Taking the Market's Pulse

Even if a demand forecasting system had 100 per cent accurate information, there is another problem: The past can't predict the future. Computer-generated forecasts use historical data to make assumptions about what will happen, but there is no way for them to anticipate major market changes. For example, Belvedere International, which is based in Ontario, Canada, makes skin-care products. When SARS broke out in Toronto, Belvedere sold more than a year's worth of its One Step hand disinfectant in a month. No forecasting system could have predicted that. Belvedere has kept its assembly line running 16 hours a day, six days a week - modifying production of other goods in the process - just to keep pace with demand. "It's no different from forecasting the weather," says Gene Alvarez, Meta Group's vice president of technology research services in the US. "Once in a while something the model couldn't figure out catches them off guard. Same thing happens with consumer taste and demand."

Even forecasts built on a limited number of variables and accurate data will be off, because they still rest on the fundamental assumption that what was true yesterday will be true tomorrow. And because data about a change lags behind the change itself, it takes human market watchers to spot shifts in the business climate.

Vicor, which manufactures power converters for electronic circuit boards, found this out at the beginning of the recent economic downturn. Until a year ago, the company had used a homegrown forecasting system that it had built in 1993. CIO Richardson describes it as a straight-line forecast based on sales history. Company executives relied solely on the automated forecasts to predict demand for their products.

"It was reasonably good in the 90s when demand was increasing at a nice steady rate," he says. "Where it broke down was when the product mix increased and the business downturn started."
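The failure mode Richardson describes can be seen in a minimal sketch of a straight-line forecast: fit a linear trend to sales history and extrapolate it forward. The function and figures below are illustrative assumptions, not Vicor's actual system.

```python
# Illustrative sketch of a straight-line (linear-trend) demand forecast.
# Fits a least-squares line to historical sales, then extrapolates forward.

def straight_line_forecast(history, periods_ahead=1):
    """Forecast by extrapolating a least-squares trend line over past sales."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return intercept + slope * (n - 1 + periods_ahead)

# Steady growth, as in the 90s: the extrapolation looks sensible.
print(straight_line_forecast([100, 110, 120, 130]))  # → 140.0

# Growth has stalled, but the fitted line still points up,
# so the forecast overshoots - exactly the downturn failure described.
print(straight_line_forecast([100, 110, 120, 115]))  # → 125.0
```

The model has no notion of a turning point: every forecast is a continuation of yesterday's slope.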

Vicor didn't see it coming. In a conference call in April 2001, CEO Patrizio Vinciarelli said that shipments "fell off the table", and the company was left with a massive inventory glut. "When the future doesn't resemble the past, none of this forecasting software works well," Richardson says.

The mishap taught Vicor the necessity of factoring human intelligence into its forecasts. To make sure that it isn't caught off guard again, the company set up a dual forecasting process in which the sales department comes up with one forecast and the computer system, which was upgraded a year ago, produces another. The two are complementary: the sales department tends to be conservative with its forecasts (Richardson thinks the salespeople are merely cautious; a cynic might point out that they are compensated for selling above quota), while the computer system won't necessarily pick up on changes in the market that salespeople can often see.
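A dual process like Vicor's can be sketched as a simple reconciliation step: compare the two numbers per product and flag any divergence large enough to need a human decision. The product names, field layout, and 15 per cent tolerance below are invented for illustration; the article does not describe Vicor's actual mechanics.

```python
# Hedged sketch of a dual-forecast check: flag products where the sales
# team's forecast and the system's forecast disagree beyond a tolerance.

def reconcile(sales_fc, system_fc, tolerance=0.15):
    """Return per-product flags where the two forecasts diverge for review."""
    flags = {}
    for product in sales_fc.keys() & system_fc.keys():
        s, m = sales_fc[product], system_fc[product]
        baseline = max(abs(s), abs(m), 1)  # avoid division by zero
        if abs(s - m) / baseline > tolerance:
            flags[product] = {"sales": s, "system": m}
    return flags

sales = {"converter_a": 900, "converter_b": 500}
system = {"converter_a": 1200, "converter_b": 520}
# converter_a diverges by 25% and is flagged for executive review;
# converter_b differs by about 4% and passes.
print(reconcile(sales, system))
```

The point of the sketch is that neither number wins automatically; disagreement is surfaced rather than averaged away.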

For example, month after month one telecom customer of Vicor kept placing the same order, and month after month the computer spat out a flat forecast. But a sales manager in the field learned that the telecom had increased its order with another supplier whose parts go into the same product as Vicor's. When the sales manager asked, the telecom confirmed that it had decided to ramp up production. Armed with that information, Vicor increased its own production and was prepared when the bigger order came in.

While Vicor uses computer-generated demand forecasts as a check and balance for the human-generated forecasts, Scotts takes a slightly different approach. It takes its computer-generated forecast and distributes it to designated forecast planners for feedback. The planners, who are experts for the store and area they represent, make changes based on their expertise. For example, a planner in the Northeast might lower a forecast due to bad weather that limits gardening, or another might increase a forecast if he knows that a particular store is planning a promotion. Scotts takes one unusual step to ensure that the planners have access to the most up-to-date information: The planners actually work in the office of the customer company's buyer. If, for instance, the planner represents Home Depot in a particular region, he works in the office of Home Depot's buyer for that region. Sengupta says that the close proximity fosters collaboration between the planners and Scotts' customers.
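Scotts' planner step can be sketched as human overrides applied on top of the machine baseline. The region names and multiplicative adjustment factors below are invented for illustration; the article does not specify how the planners' changes are encoded.

```python
# Hedged sketch of planner overrides on a computer-generated baseline:
# each planner supplies a multiplicative factor for the region they know
# (e.g. 0.8 for weather that limits gardening, 1.3 for a store promotion).

def apply_planner_adjustments(baseline, adjustments):
    """Scale each region's machine forecast by its planner's factor, if any."""
    return {region: round(qty * adjustments.get(region, 1.0))
            for region, qty in baseline.items()}

machine_forecast = {"northeast": 1000, "southwest": 800}
planner_factors = {"northeast": 0.8}  # bad weather limits gardening

print(apply_planner_adjustments(machine_forecast, planner_factors))
# → {'northeast': 800, 'southwest': 800}
```

Regions without an override keep the computer's number, so human judgment is layered onto the forecast rather than replacing it.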

In the end, the demand forecasting failure at Nike and other companies can be laid squarely on the shoulders of executives who put too much faith in technology. Court records in the lawsuits by shareholders against Nike reveal that executives for the company didn't even hold meetings to review and discuss the computerised forecasts that turned out to be so disastrously wrong. In other words, Nike management neglected to put in place a high-level process of human checks and balances for the computerised forecast. While that negligence actually enabled Nike executives to successfully argue that they were initially unaware of the flawed forecast that was generating such a huge inventory glut, it was a Pyrrhic victory. The company still lost $US180 million in sales and a third of its stock market value.

The Nike case powerfully illustrates that forecasting, no matter how advanced vendors claim their technology is, has to be an executive-level process. Executives need to review the computerised forecast and analyse how it squares with information from their sales and marketing reps, and then sign off on a number that the whole company can live with. At Alcatel, executives meet on a regular basis to dissect and discuss forecasts, which are produced by combining computer readouts with human intelligence.

"The final meeting here is attended by me, the head of supply chain and the heads of marketing and sales," says Alcatel CFO Burns. "We all have to approve the decision. We live and die by it."

SIDEBAR: 3 Demand Forecasting Myths

SINCE FORECASTS ARE ALWAYS WRONG THERE IS NO NEED TO FORECAST.

Inaccurate forecasts can still be useful as long as you treat the result as a guide rather than the gospel. At the very least, having one forecast for the whole company keeps departments from coming up with their own grossly different forecasts.

FORECASTING REQUIRES STATISTICS AND MATHS WIZARDS.

While number crunching is important, what will ultimately make or break a forecast is how well you know your customers and the market. That requires a sales force that can both communicate with customers and honestly share that information with the rest of the company.

ONLY THE MOST EXPENSIVE FORECASTING SOFTWARE WILL WORK.

Contrary to what vendors want you to believe, most forecasting software is pretty much the same. The algorithms have been around for so long now that it is unreasonable to expect that one system's maths is better than another's. The important thing when choosing demand software is selecting a system that's robust enough to handle the amount of data that you intend to enter.
