
CONCLUSIONS PAPER

How to Avoid Wasting Time at Forecasting

Featuring:

Mike Gilliland, Forecasting Product Marketing Manager, SAS

Insights from a webinar in the Applying Business Analytics Webinar Series


Table of Contents

The Futility of Forecasting?
Trying to Predict a Random Future
Forecast Value Added Analysis – A Lean Approach to Forecasting
A Simple Example of FVA Analysis
The Practical Value of FVA Analysis
Eliminate Ineffective Steps in the Forecasting Process
Fairly Compare Forecasting Performance
Getting Started with FVA Analysis
Step 1: Map Your Overall Forecasting Process
Step 2: Collect the Necessary Data
Step 3: Analyze the Process
Step 4: Report the Results
Step 5: Interpret the Results and Take Action on the Findings
Industry Adoption of FVA Analysis
Closing Thoughts
About the Author
About SAS
For More Information


The Futility of Forecasting?

The job of the forecast analyst at Fair Coin Toss Inc. is to predict the frequency of heads vs. tails in 1,000 daily tosses of a coin. The forecaster’s predictions have been only 50 percent accurate so far. Management isn’t happy. With so much data and sophisticated software available, surely the forecasts could be 60 percent accurate or better.

More data must be the key, management said. So they had the analyst augment his forecasting models with data about the physical attributes of the coin, the atmospheric and environmental conditions in which the coin was tossed, the performance attributes of the automated coin-tossing machine, the delay between coin tosses, and the rebound characteristics of the surface on which the coin was tossed. They organized a management committee to review the forecasts and tweak them based on their own experience and insights with coin tossing.

Still the forecasts were only 50 percent accurate.

The tossing of a fair coin to land heads-up or tails-up is a random occurrence, with a 50-50 chance of one or the other. No amount of new data, past data, analyst override or management intervention can improve the odds of predicting whether the next toss will land as heads or tails.

So the first aphorism of forecasting ought to be that forecasting is a huge waste of management time – or rather, that it can waste a huge amount of management time.

This is not to say that forecasting is pointless and irrelevant. It doesn’t mean that forecasting isn’t useful or necessary to run our organizations – and it doesn’t mean that managers shouldn’t care about their forecasting issues or seek ways to improve them. It simply means that the amount of time, money and human effort spent on forecasting is not commensurate with the amount of benefit achieved – that is, improvement in forecast accuracy.

We spend far too many organizational resources creating our forecasts, while almost invariably failing to achieve the level of accuracy desired. The whole conversation needs to be turned around. We should be focusing much less attention on modeling and forecast accuracy and much more on process efficiency and effectiveness. We must also consider alternative ways to answer the business questions that, out of habit, we rely on forecasting alone to address.


Trying to Predict a Random Future

Although we live in an uncertain and largely unpredictable world, we prefer to operate with an illusion of control. No matter what kind of behavior or activity we are trying to forecast – be it customer demand, financial costs and revenue, call center activity, loan defaults, insurance claims, or whatever – we think a bigger computer, a fancier model and a more elaborate process are all we need to get better forecasts. Unfortunately, the world doesn’t work that way.

As management at Fair Coin Toss Inc. eventually had to concede, forecast accuracy is largely determined by the nature of the behavior we are trying to forecast – its forecastability. If the behavior is smooth and stable, we should be able to forecast it accurately with simple methods. However, if the behavior is wild and erratic – or completely random, like heads vs. tails – there may be little hope of generating accurate forecasts, no matter how sophisticated our methods or how much effort we put into it.

Therefore, the goal of our efforts should be to develop forecasts as accurate as anyone can reasonably expect them to be – given the nature of what we are trying to forecast – and to do this as efficiently as possible.

Forecast Value Added Analysis – A Lean Approach to Forecasting

Lean is all about identifying and eliminating the wasted efforts in any process. A method called forecast value added or FVA analysis is a way to apply the lean approach to forecasting. FVA is used to find those process activities that are just wasting time – that are failing to improve the forecast, or are even making it worse.

Forecast value added is defined as the change in a forecasting performance metric – whatever metric you happen to use – that can be attributed to a particular step or participant in the forecasting process. Essentially, FVA compares the results of a process activity to the results you would have achieved without doing the activity.
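To make the sign convention concrete, here is a minimal sketch in Python (an illustrative choice; the paper does not prescribe any particular tooling), assuming the performance metric is an error measure such as MAPE, where lower is better:

```python
def forecast_value_added(error_without_step: float, error_with_step: float) -> float:
    """FVA for an error metric such as MAPE, where lower error is better.

    Positive FVA: the step reduced error (added value).
    Negative FVA: the step made the forecast worse.
    """
    return error_without_step - error_with_step

# Using the numbers from the stair-step example later in this paper:
print(forecast_value_added(25.0, 20.0))  # statistical vs. naive:    +5.0 points
print(forecast_value_added(20.0, 23.0))  # override vs. statistical: -3.0 points
```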

FVA can be positive, showing that you are adding value by making the forecast better. Or FVA can be negative, indicating that whatever you are doing is just making the forecast worse.

FVA analysis is used to identify and mercilessly eliminate the non-value-adding activities in your forecasting process, so you can streamline the process and redirect the non-value-adding efforts into more productive activities. For instance, it might mean having your sales people out selling rather than trying to forecast future sales. As an added bonus, when you eliminate the activities that are just making the forecast worse, you can actually achieve better forecasts with less cost and effort.


“FVA is the lean manufacturing mind-set applied to forecasting…”
– Tom Wallace, author and supply chain thought leader


A Simple Example of FVA Analysis

To understand FVA analysis, consider a very simple forecasting process, shown in Figure 1. Here, historical demand feeds into forecasting software, which generates what we call the statistical forecast. Then, an analyst reviews the statistical forecast and can make a manual adjustment.

FVA analysis compares the accuracy of the statistical forecast (generated by modeling software) to the analyst’s override – and compares both forecasts to a “naive” forecast generated by very simple methods.

Figure 1: The simple forecasting process – demand history feeds a statistical model, and the analyst may then override the statistical forecast.

FVA analysis is the application of fundamental scientific method to the business forecasting process. We start with a null hypothesis – that our forecasting process has no effect on forecast accuracy. We then gather data to determine whether we can reject this null hypothesis.

What we are doing is analogous to evaluating the safety and efficacy of a new drug or medical treatment. For example, we find 100 people with colds and randomly divide them into two groups, giving one group the new cold remedy, and the other group a placebo. We then evaluate their recovery to see if those who had the new remedy get better faster.

In FVA analysis, a naive forecast serves as the placebo. The naive forecast must be something that is simple to compute, requiring the minimum of effort and manipulation to prepare a forecast. For example (a short code sketch of these benchmarks follows the list):

• The random walk or “no change” model just uses your last known actual value as the future forecast. If you sold 12 units last week, your forecast for this week is 12 units. If you sell 15 units this week, your new forecast for next week becomes 15 units, and so on.

• For the seasonal random walk, you use the same period from a year ago as the forecast for this year. Thus, if you sold 35 units in October 2012, your forecast for October 2013 would be 35 units.

• A moving average or other simple statistical formula is also suitable to use as your naive model, as long as it stays within the spirit of being simple to compute with a minimum of effort.
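As a rough sketch of how these naive benchmarks might be generated, assuming monthly data held in a pandas series (the function and column names are illustrative, not prescribed):

```python
import pandas as pd

def naive_forecasts(actuals: pd.Series, season_length: int = 12, window: int = 3) -> pd.DataFrame:
    """Three 'placebo' benchmarks for a demand series in chronological order."""
    return pd.DataFrame({
        # Random walk ("no change"): the last known actual becomes the forecast.
        "random_walk": actuals.shift(1),
        # Seasonal random walk: the same period one year ago (12 for monthly data).
        "seasonal_random_walk": actuals.shift(season_length),
        # Simple moving average of the last `window` known actuals.
        "moving_average": actuals.shift(1).rolling(window).mean(),
    })

# Example: if October 2012 sales were 35 units, the seasonal random walk
# forecast for October 2013 is 35 units.
```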


When we conduct FVA analysis, we compare the forecasts generated at various stages in our forecasting process to our placebo, the naive forecast. If the process is doing better than the naive forecast, we are adding value. If we find that our process is doing worse than a naive forecast, we are simply wasting time and resources.

FVA results are commonly displayed in a stair-step report, as we see in Figure 2. There is a row corresponding to each sequential step in the process – here the naive forecast, the statistical forecast and the override. The second column shows whatever metric we are using to measure performance, typically MAPE (the mean absolute percent error) or accuracy. The remaining columns show the pairwise comparisons.

Process Step                 MAPE    FVA vs. Naive Forecast    FVA vs. Statistical Forecast
Naive Forecast               25%     –                         –
Statistical Forecast         20%     +5 points                 –
Analyst Override Forecast    23%     +2 points                 -3 points

Figure 2: A basic FVA analysis asks, “Did the forecasting process do better than a naive forecast?”

Here we see that the naive model achieved a MAPE of 25 percent. The statistical forecast reduced the error by 5 percentage points, with a MAPE of 20 percent. However, while the analyst override had a MAPE 2 percentage points lower than the naive model, it actually made the forecast worse by 3 percentage points compared to the statistical forecast.

In short, if you are doing better than a naive forecast, your process is adding value. If you are doing worse than a naive forecast, then you are simply wasting time and resources. It is not uncommon to find, as we see here, that human tampering with the process can make the forecast worse.
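The stair-step numbers can be reproduced from raw forecasts and actuals along these lines; the sketch below assumes aligned pandas series, and the function and column names are illustrative:

```python
import pandas as pd

def mape(actual: pd.Series, forecast: pd.Series) -> float:
    """Mean absolute percent error, in percent (assumes no zero actuals)."""
    return float((100 * (actual - forecast).abs() / actual).mean())

def stairstep_report(actual, naive, statistical, override) -> pd.DataFrame:
    """Stair-step table: MAPE per step plus FVA versus the earlier steps."""
    m_naive, m_stat, m_over = (mape(actual, f) for f in (naive, statistical, override))
    return pd.DataFrame(
        {
            "MAPE (%)": [m_naive, m_stat, m_over],
            "FVA vs. naive (pts)": [None, m_naive - m_stat, m_naive - m_over],
            "FVA vs. statistical (pts)": [None, None, m_stat - m_over],
        },
        index=["Naive forecast", "Statistical forecast", "Analyst override"],
    )
```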

The Practical Value of FVA Analysis

Eliminate Ineffective Steps in the Forecasting Process

By conducting a thorough and ongoing FVA analysis at your organization, you may be able to find process steps or participants that are failing to add value. The idea is to streamline your process by eliminating those wasted efforts. Resources diverted away from forecasting can then be redirected to more productive activities, like selling product or serving customers.

• When FVA is negative – you can see that a process activity is making the forecast worse – then clearly that activity is unproductive and should be eliminated. FVA can also be used as an ongoing metric for tracking statistical model performance and indicating when models need to be recalibrated.


By identifying and improving (or eliminating) non-value-adding activities, you can streamline your process and reduce the cost of resources invested in forecasting – essentially getting better forecasts for free.

• When FVA is positive from one step to another, it can indicate that the process step is adding value, as long as the incremental benefits justify the cost.

Fairly Compare Forecasting Performance

Another common use of FVA analysis is to compare performance between individual forecasters, product groups or organizations. Suppose you manage a group of three analysts, and you will award a bonus to the best forecaster. Relying solely on MAPE or other traditional performance metrics, you would conclude that Analyst A, with a MAPE of 20 percent, deserves the bonus. But is this correct?

Analyst    MAPE
A          20%
B          30%
C          40%

Figure 3: Traditional analysis based on MAPE would say that Analyst A is the best performer.

Let’s look a bit deeper into the situation using FVA analysis. Suppose we find that Analyst A is responsible for forecasting sales for products that have long life cycles, no promotional activity and stable demand patterns. Sales of these products would be relatively easy to forecast. Even a naive model would have achieved a MAPE of just 10 percent. FVA analysis shows that Analyst A is actually making the forecast worse by 10 percentage points.

Analyst B has products that are moderately difficult to forecast, with some seasonality and promotional activity, a few new products entering the mix, and moderately volatile demand patterns. Analyst B achieved the same MAPE as a naive model would have achieved, so the forecast value added is zero.

It turns out that only Analyst C added any value. Although C had the worst forecast error at 40 percent, C had demand that was very difficult to forecast. A naive model would have achieved a MAPE of 50 percent. So C had 10 percentage points of value added and deserves the bonus.

Use FVA analysis to identify non-value-adding activities:

• Streamline the process by eliminating wasted efforts.

• Direct resources to more productive activities.

• Potentially achieve better forecasts for free.


Analyst    Item Type    Item Life Cycle    Seasonality    Promotions    New Items    Demand Volatility    MAPE    Naive MAPE    FVA
A          Basic        Long               None           None          None         Low                  20%     10%           -10%
B          Basic        Long               Some           Few           Few          Medium               30%     30%           0%
C          Fashion      Short              Highly         Many          Many         High                 40%     50%           10%

Figure 4: FVA analysis shows that MAPE alone can be misleading as an indicator of analyst performance.

This example leads to a warning about one of the perils of benchmarking forecasting performance. You cannot simply compare the MAPE or forecast accuracy achieved. You have to evaluate performance with respect to the underlying forecastability of the demand patterns.

MAPE is the most popular metric for evaluating forecasting performance, and it does tell you the magnitude of your forecast error. But MAPE doesn’t account for the forecastability of what you’re trying to forecast. It doesn’t tell you the level of accuracy you should be able to achieve. And it doesn’t tell you anything about how efficient you are. In short, MAPE by itself is not a legitimate metric for comparing forecasting performance.
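To make the comparison concrete, here is a small illustrative sketch using the figures from the example above (the dictionary layout is simply an assumption made for the example):

```python
# MAPE achieved by each analyst alongside the MAPE a naive model would have
# achieved on that analyst's own products (figures from the example above).
analysts = {
    "A": {"mape": 20, "naive_mape": 10},  # easy-to-forecast basics
    "B": {"mape": 30, "naive_mape": 30},  # moderately difficult mix
    "C": {"mape": 40, "naive_mape": 50},  # volatile, hard-to-forecast fashion items
}

for name, m in analysts.items():
    fva = m["naive_mape"] - m["mape"]  # positive = value added
    print(f"Analyst {name}: MAPE {m['mape']}%, FVA {fva:+d} points")
# Only Analyst C shows positive FVA, despite having the highest MAPE.
```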

Getting Started with FVA Analysis

Step 1: Map Your Overall Forecasting Process

The process may be very simple, like the one on the left in Figure 5, perhaps with just a statistically generated forecast and a manual override – or it can be an elaborate consensus process with lots of participation from various internal departments, customers and suppliers. Many organizations also have a final review step where senior management gets to change the numbers before approving them.

FVA analysis may reveal that having the lowest MAPE is not necessarily the same as being the best forecaster.


Figure 5: Begin your venture into FVA analysis by mapping process steps and contributors. The simple process (left) runs from demand history through a statistical model to an analyst override. The elaborate process (right) also brings in causal factors and inputs from sales, marketing, finance, executive targets, customers and P&IC, feeding a collaboration/consensus step, an executive review and, finally, the approved forecast.

Step 2: Collect the Necessary Data

In a thorough FVA analysis, you capture and record the forecast every period, at every step in the forecasting process: the naive forecast, your software’s statistical forecast, forecasts modified by manual overrides and consensus, and executive-approved forecasts.

You want to gather this information at the most granular level of detail available, such as by product and location. You also need to record the time bucket of the forecast, typically the week or month you are forecasting – and, of course, the “actual” demand or behavior you were trying to forecast.
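One possible layout for these records is sketched below; pandas and the column names are assumptions made for illustration, not a prescribed schema:

```python
import pandas as pd

# Each row records one forecast: one item, one location, one time bucket,
# at one step of the process. Actuals are filled in once demand is known.
fva_records = pd.DataFrame(
    columns=[
        "product",       # most granular item identifier available
        "location",      # e.g., region, DC or store
        "time_bucket",   # the week or month being forecast
        "process_step",  # naive, statistical, override, consensus, approved
        "forecast",      # the forecast recorded at that step
        "actual",        # the demand that actually occurred
    ]
)
```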

Step 3: Analyze the Process

Having gathered the necessary data, you can now do FVA analysis – looking at how each process step produces a positive or negative change in MAPE, weighted MAPE or whatever traditional metric you are using. It doesn’t matter which traditional metric you use, including bias or forecast accuracy, because FVA analysis measures not the absolute value of the metric but the degree of change at each step of the process. Comparisons may include (a short sketch of these pairwise comparisons follows the list):

• Statistical versus naive forecast.

• Analyst override versus statistical forecast.


• Consensus versus analyst forecast.

• Approved versus consensus.

• Consensus participant inputs versus naive.
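Assuming the records are stored in the long layout sketched earlier, the pairwise comparisons might be computed roughly as follows (illustrative only):

```python
def mape_by_step(df):
    """MAPE (in percent) per process step from the long-format records."""
    pct_error = 100 * (df["actual"] - df["forecast"]).abs() / df["actual"]
    return pct_error.groupby(df["process_step"]).mean()

def pairwise_fva(df, pairs):
    """FVA in MAPE points for each (earlier_step, later_step) comparison."""
    m = mape_by_step(df)
    return {f"{later} vs. {earlier}": m[earlier] - m[later] for earlier, later in pairs}

# Example comparisons, mirroring the list above:
# pairwise_fva(fva_records, [("naive", "statistical"),
#                            ("statistical", "override"),
#                            ("override", "consensus"),
#                            ("consensus", "approved")])
```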

Excel works fine for a quick, one-time snapshot of FVA in a single period. However, ongoing analysis of a multistage forecasting process with a lot of products quickly grows into a large amount of data to store and maintain. You will want to automate data collection and storage, so this is not something you do in Excel. The entry-level SAS® Visual Data Discovery software easily handles huge FVA data sets, analysis and reporting, as well as dynamic visualization of FVA data.

Step 4: Report the Results

There is no one fixed way to report FVA results, but a stair-step table is a good place to start. On the left side you list the process steps or participants and their performance in terms of MAPE or accuracy or whatever metric you are using. The columns to the right show the value added (or subtracted) from step to step in the process.

For a more elaborate process, the report layout would be the same, except with more rows to show the additional process steps and more columns to show the additional comparisons between steps. You don’t have to report FVA for every possible pair of forecasts, but you should at least report every major pair in the chronological process.

In addition to a stair-step report such as the simple example in Figure 2, graphical presentation of the data is best. For example, a histogram illustrating the distribution of FVA values for a group of products can be very insightful. Donald Wheeler presents ideas for how to present data in his book about statistical process control, Understanding Variation: The Key to Managing Chaos.
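As one way to realize the histogram idea (matplotlib is an assumed choice here; any charting tool would do):

```python
import matplotlib.pyplot as plt

def plot_fva_distribution(fva_by_product):
    """Histogram of FVA values (MAPE points) across a group of products."""
    plt.hist(fva_by_product, bins=20, edgecolor="black")
    plt.axvline(0, linestyle="--")  # products left of zero had value subtracted
    plt.xlabel("FVA of override vs. statistical forecast (MAPE points)")
    plt.ylabel("Number of products")
    plt.title("Distribution of forecast value added")
    plt.show()
```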

Step 5: Interpret the Results and Take Action on the Findings

Be aware that naive forecasts can be surprisingly difficult to beat. For example, a moving average is a perfectly appropriate forecasting model in some situations, and additional statistical sophistication does not always generate a better forecast.

When a particular participant or step is not adding value, you should first try to understand why. For example, do statistical models need to be updated so they will perform better? Do analysts need additional experience or training on when to make judgment overrides and when to just leave the statistical forecast alone? Do certain participants in the consensus process bias results because of their own personal agendas? Do executives only approve forecasts that meet the operating plan and revise those forecasts that are falling below plan?

Be aware that as you conduct FVA for the first time, the results can be embarrassing to participants who are shown to add no value to the process. You may choose to share results privately and tactfully. Your purpose is to improve the process, not necessarily to humiliate anyone.

If you haven’t conducted FVA analysis but assume you are beating a naive forecast, then maybe you aren’t.


It is also important to be cautious in interpreting your FVA results and not draw conclusions without sufficient evidence. You can’t just look at one period of data, or a short time frame, and determine whether you are adding value or not. Over short time periods, results may just be due to chance.
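One rough safeguard against over-reading a short window is a simple sign test across periods, sketched below under the assumption that per-period absolute errors are available for both the process step and the naive forecast; this particular test is an illustrative choice, not something the paper mandates:

```python
from math import comb

def sign_test_p_value(step_errors, naive_errors):
    """Two-sided sign test: does the process step beat the naive forecast
    more often than a fair coin would over the periods observed?"""
    wins = sum(s < n for s, n in zip(step_errors, naive_errors))
    losses = sum(s > n for s, n in zip(step_errors, naive_errors))
    n = wins + losses  # periods with ties are dropped
    k = max(wins, losses)
    # Probability of a split at least this lopsided under a 50-50 chance, two-sided.
    p = 2 * sum(comb(n, i) for i in range(k, n + 1)) / 2**n
    return min(p, 1.0)

# A small p-value over many periods is stronger evidence than a few lucky months.
```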

Industry Adoption of FVA Analysis

FVA has been applied in companies across many industries, including consumer products, retail, pharmaceuticals, manufacturing, transportation, apparel, and food and beverage. Major corporations such as Cisco, Intel, AstraZeneca, Newell Rubbermaid and others have gone public with their FVA results and the new ways they have applied the FVA concept. For example:

• A premium home furnishings manufacturer used FVA to appeal to the competitive nature of its sales force. Sales reps were challenged to “beat the nerd in the corner” by improving upon the nerd’s statistical forecast.

• When a large technology manufacturer looked at six years of historical data, FVA analysis revealed that half of its forecasts failed to beat a naive model. The naive models were also less biased – neither chronically too high nor too low.

• At a major specialty retailer, analysts were making frequent adjustments to the forecast with each new bit of point-of-sale data from the stores. But FVA analysis showed that 75 percent of the analyst overrides failed to beat a moving average.

• An automotive supplier found that, although management overrides did slightly improve forecast accuracy, the incremental gain might not be worth the cost of management time spent on the overrides.

There is also academic research on the topic of human adjustments to forecasts. One study of four supply chain companies in the UK examined 60,000 forecasts, of which 75 percent were manually adjusted. The study authors concluded that:

• Small adjustments made almost no difference in forecast accuracy, which makes perfect sense. Small adjustments will not make the forecast much better or much worse, so they are probably not worth the effort.

• Larger adjustments, particularly large downward adjustments that reduced the forecast, tended to add value by making the forecast more accurate.¹

¹ Source: Fildes and Goodwin, “Good and Bad Judgment in Forecasting,” Foresight, Fall 2007.


Closing Thoughts

Forecast accuracy is limited by the nature of the behavior you are trying to forecast. While you cannot control the accuracy of your forecasts, you can control the process used and the resources invested. Overly elaborate forecasting processes with many management touch points generally tend to make forecasts worse. More touch points bring more opportunities for people to add their biases and personal agendas – and contaminate what should be an objective, dispassionate and scientific process.

With FVA analysis, we can see whether forecast accuracy is being improved or eroded at each step in the process. We can improve or eliminate process steps that add no value. If good software can give you reasonably accurate forecasts with little or no management intervention, rely on the software and invest that management time in other areas that can bring more value to the company.

About the Author

Mike Gilliland, Product Marketing Manager at SAS, has worked in consumer products forecasting for more than 20 years in the food, electronics and apparel industries, as well as in consulting. He wrote a quarterly column on Worst Practices in Business Forecasting for Supply Chain Forecasting Digest and has published in Supply Chain Management Review, Journal of Business Forecasting, Foresight: The International Journal of Applied Forecasting, Analytics and APICS magazine. He is currently the Forecasting Practice column editor for Foresight.

Gilliland holds master’s degrees in philosophy and mathematical sciences from Johns Hopkins University. He deals with FVA analysis, worst practices and other forecasting topics in his book, The Business Forecasting Deal: Exposing Myths, Eliminating Bad Practices, Providing Practical Solutions. You can follow his blog, The Business Forecasting Deal, at blogs.sas.com/content/forecasting.

[email protected]


About SAS

SAS Forecast Server is SAS’ flagship forecasting product, suitable for the forecasting needs of even the largest enterprises. A high-performance forecasting engine provides large-scale, automatic forecasting from SAS code or via the SAS Forecast Studio interface. SAS Forecast Server can diagnose the historical behavior of a time series, determine the appropriate class of models to deal with that behavior, and customize model parameters for each individual series. SAS Forecast Server has been adopted at more than 500 organizations worldwide, across a wide range of industries.

SAS is the leader in business analytics software and services, and the largest independent vendor in the business intelligence market. Through innovative solutions, SAS helps customers at more than 60,000 sites improve performance and deliver value by making better decisions faster. Since 1976, SAS has been giving customers around the world THE POWER TO KNOW®.

For More Information

To view the on-demand recording of this webinar:

sas.com/reg/web/corp/nn

For information about events in the Applying Business Analytics Webinar Series:

sas.com/ABAWS

To view the Forecasting 101 on-demand webcast:

sas.com/reg/web/corp/907017

To access Mike Gilliland’s blog, The Business Forecasting Deal:

blogs.sas.com/content/forecasting

To view the on-demand webcast, Forecast Value Added Analysis: Step by Step: sas.com/events/cm/176129/index.html

To download the SAS white paper, Forecast Value Added Analysis: Step by Step: sas.com/reg/wp/corp/6216.

The Business Forecasting Deal: Exposing Myths, Eliminating Bad Practices, Providing Practical Solutions, by Mike Gilliland is available through the SAS bookstore, Amazon .com and other booksellers .

Follow us on Twitter: @sasanalytics. Like us on Facebook: SAS Analytics.


SAS Institute Inc. World Headquarters: +1 919 677 8000. To contact your local SAS office, please visit: sas.com/offices

SAS and all other SAS Institute Inc. product or service names are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries. ® indicates USA registration. Other brand and product names are trademarks of their respective companies. Copyright © 2013, SAS Institute Inc. All rights reserved. 106146_S95306_0113