PHD-DESIGN Archives

PHD-DESIGN@JISCMAIL.AC.UK



Subject: Excellent review of Scott Armstrong's forecasting book
From: Ken Friedman <[log in to unmask]>
Date: Tue, 12 Mar 2002 20:01:34 +0100


Dear Colleagues,

While waiting for a copy of Scott Armstrong's book
on forecasting, I had a chance to read the following review.

This review is written from the viewpoint of empirical
finance research, but the careful development and analysis
of issues will give you ideas on how to apply the themes
and methods in this book to design research.

Best regards,

Ken Friedman


Review of

Principles of Forecasting: A Handbook for Researchers and Practitioners

Reviewed by John Aitchison. www.DataSciencesResearch.com


There's a new book in town ... a big one, and an important one. It is called

Principles of Forecasting: A Handbook for Researchers and Practitioners

Edited by J. Scott Armstrong, University of Pennsylvania. Kluwer, 2001.
xii + 849 pages. USD 190 (hardbound). ISBN 0-7923-7930-6.

"Principles of Forecasting" is a review work, a survey of the
state-of-the-art of forecasting (very broadly defined), a
distillation of the theory and hard-won practical knowledge into a
masterwork. 9 of the 30 papers are authored or co-authored by the
editor J. Scott Armstrong, with 39 researchers contributing and over
120 external reviewers.

Let us look first at what it is not: it is not a mathematically
oriented textbook, not specifically focussed on finance, nor for that
matter only on time-series forecasting. It does not have the
field-specific approach of a book such as "Non-linear Time Series
Models in Empirical Finance" (Philip Hans Franses and Dick van Dijk),
nor the in-depth statistical exposition of Chatfield's recent
"Time-Series Forecasting", nor is it a step-by-step "how to" such as
is presented in "Forecasting Methods and Applications" (Makridakis,
Wheelwright and Hyndman) or, more recently, in "Practical Forecasting
for Managers" (Nash and Nash).

What it is, is a handbook for practitioners (and researchers). Its
wide-ranging scope ensures that almost every type of forecasting
activity is reviewed, summarised and distilled into principles by an
expert in that particular field.

So why would a hard-nosed empirical finance person be interested in
"Principles of Forecasting"?

Precisely because of its broad scope. It takes a step back, reviews
ALL the evidence about a diverse set of forecasting methodologies and
provides a framework against which you can assess your own approaches
and tools.

Sometimes we don't have the luxury of good data and lots of it.
Sometimes we are breaking new ground and there is no data at all,
maybe just some analogous situations - situations in which finance
professionals are asked to produce a "forecast" from little or no
data or from data that is only marginally relevant to the problem at
hand. For example, one may be asked to forecast the likely success of
a new financial service or product.

"Principles of Forecasting" is the most comprehensive source that I
am aware of that specifically discusses in depth the problems that
arise when there is insufficient objective data.

A flowchart at the back of the book (and also available on the
Principles of Forecasting website) splits forecasting problems into
those with or without sufficient data. There are further splits in
this methodology recommendation tree, but suffice it to say that
there are 10 terminal leaves ranging from "expert forecasting" to
"econometric methods" which then feed back into a further decision
box on the utility of combining forecasts. Chapter 12 of "Principles
of Forecasting" works through this decision tree and the procedures
for selecting forecasting methods in detail.
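
For readers who think in code rather than flowcharts, the toy sketch
below renders the top-level logic of that selection tree. The function
and the coarse second split on domain knowledge are my own
simplification for illustration; the book's actual tree has ten
terminal leaves and further conditions.

def suggest_forecasting_approach(has_sufficient_data, good_domain_knowledge):
    """Toy simplification of the method-selection tree described above.

    Only the top-level split (sufficient objective data or not) comes
    from the review; the second split is an assumed illustration."""
    if not has_sufficient_data:
        # Little or no objective data: judgmental approaches
        # (expert forecasting, Delphi, intentions, role playing, ...).
        candidates = ["expert / judgmental forecasting"]
    elif good_domain_knowledge:
        # Plenty of data plus causal knowledge: econometric methods.
        candidates = ["econometric methods"]
    else:
        # Plenty of data, little causal knowledge: extrapolation.
        candidates = ["extrapolation / time-series methods"]

    # The tree then feeds into a further decision box on whether
    # combining forecasts from several methods is worthwhile.
    candidates.append("consider combining forecasts")
    return candidates


print(suggest_forecasting_approach(has_sufficient_data=False,
                                   good_domain_knowledge=False))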

If you are one of those people who 'randomly sample' a new book,
particularly a large one - and Principles of Forecasting IS a large
book, weighing in at 738 pages before the Reviewer List, Author
Profiles and the 64-page Forecasting Dictionary - by choosing pages
more or less at random, reading more here and less there and running
your eye over the subject index, then you are likely to find topics
like 'measurement of purchase intentions', 'improving judgement in
forecasting' and 'forecasting with conjoint analysis' as well as the
more traditional forecasting topics. If these do not fit your 'mental
model' of what forecasting is about, do not despair.

The handbook uses the term 'forecasting' in a much more all-embracing
fashion than econometricians and statisticians might normally do, but
there is plenty of material on time-series forecasting - this is the
subject of several specific chapters (Chapter 8, 'Extrapolation';
Chapter 11, 'Econometric Methods'; part of Chapter 15, 'Assessing
Uncertainty'; Chapter 7, 'Analogies'; and parts of Chapter 18,
'Applications of Principles').

The meaning of 'principles' is outlined on p. 3 under the heading
"What do we mean by principles?".

"The purpose of this book is to summarize knowledge of forecasting as
a set of principles. These 'principles' represent advice, guidelines,
prescriptions, condition-action statements, and rules."

It goes on to explain that the principles should be supported by
empirical evidence; the authors describe and summarize the evidence
where possible, and identify 'speculative principles' and those 'based
on expert judgement' as such.

The titles listed below will give an idea of its scope and, in a move
that should be more broadly emulated, abstracts for each chapter are
available on the website - so you can "try before you buy".

The papers are:

"Role Playing: A Method to Forecast Decisions"
"Methods for Forecasting from Intentions Data"
"Improving Judgmental Forecasts"
"Improving Reliability of Judgmental Forecasts"
"Decomposition for Judgmental Forecasting and Estimation"
"Expert Opinions in Forecasting: Role of the Delphi Technique"
"Forecasting with Conjoint Analysis"
"Judgmental Bootstrapping: Inferring Experts' Rules for Forecasting"
"Forecasting Analogous Time Series"
"Extrapolation of Time Series and Cross-Sectional Data"
"Neural Networks for Time Series Forecasting"
"Rule-based Forecasting: Using Judgment in Time-Series Extrapolation"
"Expert Systems for Forecasting"
"Econometric Forecasting"
"Selecting Forecasting Methods"
"Judgmental Time Series Forecasting Using Domain Knowledge"
"Judgmental Adjustments of Statistical Forecasts"
"Combining Forecasts"
"Evaluating Forecasting Methods"
"Prediction Intervals for Time Series"
"Overconfidence in Judgmental Forecasting"
"Scenarios and Acceptance of Forecasts"
"Learning from Experience: Coping with Hindsight Bias and Ambiguity"
"Population Forecasting"
"Forecasting the Diffusion of Innovations: Implications for Time
Series Extrapolation"
"Econometric Models for Forecasting Market Share"
"Forecasting Trial Sales of New Consumer Packaged Goods"
"Diffusion of Forecasting Principles through Books"
"Diffusion of Forecasting Principles: An Assessment of Forecasting
Software Programs"
"Standards and Practices for Forecasting"

Some papers of special interest to finance professionals are listed
below. We start with those that relate more to situations in which
judgement is involved (or the formal data is limited or not directly
applicable) and then move to the more "hard core" time-series related
papers.

"Expert Opinions in Forecasting: Role of the Delphi Technique"

Expert opinion is obviously one way of getting a 'forecast' ... but
there is a great deal more to it than getting a bunch of guys and
gals to sit around in a meeting.

The Delphi technique, a structured group process for eliciting and
combining expert judgments, has been widely used and widely
criticised. This paper examines the proper application of Delphi,
contrasts it with traditional group meetings and the Nominal Group
Technique, and arrives at 11 principles for the design and
application of structured group forecasting techniques.

The contributing authors to "Principles of Forecasting" have done us
a great service in their synthesis of theory and evidence and their
dedication to "getting it right", and I highly recommend a close
study of this chapter to anyone in the situation of having to use
structured group elicitation of forecasts from experts.

"Forecasting with Conjoint Analysis"

Conjoint analysis is widely known in the market research community
but probably less so to finance professionals - there is, of course,
no reason why the techniques cannot be applied to (new) financial
products and services, as well as to packaged goods, and indeed we
have done so. This paper provides an overview of the conjoint
procedure and conditions under which conjoint should work well.

For those unfamiliar with conjoint and its close cousin SPDCM (Stated
Preference Discrete Choice Modeling), the underlying idea is that a
model can be built of the relationship between (stated) preferences
and product or service attributes. This is accomplished by exposing
survey or group respondents to a number of hypothetical scenarios
(which have been developed by varying the product attributes
according to an appropriate experimental design) and asking for
"preferences" or "choices". Models can be fit to this data with
regression or maximum likelihood estimation of multinomial logit
models.
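
As a hedged illustration of that last step (not taken from the book),
the sketch below fits a simple main-effects preference model by
ordinary least squares to ratings of hypothetical product profiles.
The attribute names and the data are invented for the example; a real
study would use a proper experimental design and, often, a choice
model rather than ratings.

import numpy as np

# Hypothetical conjoint data: each row is a product profile shown to a
# respondent, coded as 0/1 attribute levels, plus a stated preference
# rating. Attributes and ratings are invented for illustration.
#             low_fee  online_access  branch_support
profiles = np.array([
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 1],
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
    [0, 0, 0],
], dtype=float)
ratings = np.array([9.0, 8.0, 7.5, 6.5, 6.0, 5.0, 4.5, 2.0])

# Main-effects model: rating ~ intercept + part-worth of each attribute,
# estimated by ordinary least squares.
X = np.column_stack([np.ones(len(profiles)), profiles])
coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
intercept, partworths = coef[0], coef[1:]

for name, w in zip(["low_fee", "online_access", "branch_support"], partworths):
    print(f"estimated part-worth of {name}: {w:.2f}")

# Predicted preference for a new, untested profile (the forecasting step).
new_profile = np.array([1.0, 0.0, 1.0])
print("predicted rating:", intercept + new_profile @ partworths)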

"Judgmental Bootstrapping (exjoint analysis) "

Judgmental bootstrapping is a novel procedure proposed by J. Scott
Armstrong to "predict what an expert would predict". He also suggests
the somewhat more appropriate name "exjoint analysis" drawing on its
conceptual relationship to conjoint analysis. Another name for the
area is "policy capturing" - a type of expert system based on an
expert's opinions and cues. Models are typically estimated by
ordinary least squares.

This is an intriguing approach and while the applications to date
have been limited, Professor Armstrong provides guidance as to
situations in which such an approach might profitably be applied. He
concludes that exjoint analysis can provide more accurate forecasts
than unaided judgments especially when the prediction problem is
complex, the model can be reliably estimated and the experts have
valid knowledge about relationships.
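
A minimal numerical sketch of the idea, with invented cues and expert
judgments, might look as follows: the expert's past predictions are
regressed on the cues the expert had available, and the fitted linear
rule is then used in place of the expert. Everything here is assumed
for illustration; it is not an example from the chapter.

import numpy as np

# Invented example: an expert has previously judged the creditworthiness
# (score 0-100) of applicants, given two cues: income (in $000) and
# years in current job. Judgmental bootstrapping regresses the expert's
# own judgments on those cues to "predict what the expert would predict".
cues = np.array([
    [55.0, 2.0],
    [80.0, 10.0],
    [30.0, 1.0],
    [65.0, 5.0],
    [45.0, 7.0],
    [90.0, 3.0],
])
expert_scores = np.array([52.0, 85.0, 25.0, 64.0, 55.0, 70.0])

# Ordinary least squares fit of the expert's implicit linear rule.
X = np.column_stack([np.ones(len(cues)), cues])
beta, *_ = np.linalg.lstsq(X, expert_scores, rcond=None)

def bootstrapped_expert(income, years_in_job):
    """Apply the fitted rule in place of the expert."""
    return beta[0] + beta[1] * income + beta[2] * years_in_job

print("model of the expert:", np.round(beta, 3))
print("predicted judgment for a new case:",
      round(bootstrapped_expert(70.0, 4.0), 1))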

"Forecasting Analogous Time Series"

The chapter on "analogies" concentrates on pooling analogous time
series. It is thus not about analogies in the broader sense; for a
brief discussion of some of the uses of analogies we recommend
"Forecasting Methods and Applications" (Makridakis, Wheelwright and
Hyndman, Chapter 9, p. 466), in which an analogy between five
important inventions of the Industrial Revolution and corresponding
ones of the Information Revolution is explored.

This chapter concentrates on Bayesian pooling to "borrow strength
from neighbours".

In situations in which organizations have to forecast hundreds or
thousands of time series, it might be expected that these series
include several sets of "analogous time series" - e.g. sales of the
same products in different geographic areas might be expected to
co-vary positively over time. That co-variation can be put to work in
increasing the precision of estimates and adapting readily to pattern
changes, while being somewhat robust to outliers.

The suggested process essentially involves standardising each of the
time series in the analogous group and pooling the standardised series
so that the resulting data set has multiple observations per time
period.
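
A crude sketch of that standardise-and-pool step is given below, with
invented sales data. It shows only the bookkeeping, not the Bayesian
shrinkage the chapter actually describes, so treat it as an
illustration of the set-up rather than of the method itself.

import numpy as np

# Invented example: monthly sales of the same product in three regions.
# The three series are treated as "analogous" and expected to co-vary.
series = np.array([
    [100, 104, 110, 118, 125, 131],   # region A
    [ 60,  61,  66,  70,  76,  79],   # region B
    [210, 214, 228, 241, 256, 268],   # region C
], dtype=float)

# Standardise each series (zero mean, unit variance within the series) ...
means = series.mean(axis=1, keepdims=True)
stds = series.std(axis=1, keepdims=True)
standardised = (series - means) / stds

# ... so that, after pooling, each time period has several observations
# of the common pattern rather than one.
pooled_pattern = standardised.mean(axis=0)

# A naive pooled one-step-ahead forecast for region B: extend the pooled
# pattern by its last increment, then map back to region B's scale.
next_standardised = pooled_pattern[-1] + (pooled_pattern[-1] - pooled_pattern[-2])
forecast_region_b = next_standardised * stds[1, 0] + means[1, 0]
print("pooled pattern:", np.round(pooled_pattern, 2))
print("naive pooled forecast for region B:", round(float(forecast_region_b), 1))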

In spite of the lack of a wide literature on pooling as an area of
forecasting, there are some attractive ideas in this chapter and I
particularly recommend it to those dealing with large numbers of time
series. At the other end of the scale, the author suggests that
pooling might be applicable to micro-scale time-series models, those
in which intermittent demand leads to a series with many zeros and
where Croston's smoothing might usually be applied.

"Neural Networks for Time Series Forecasting"

This short but highly readable chapter on the use of neural networks
in time-series forecasting includes comparisons between neural nets
and traditional models - the Faraway and Chatfield (1998) paper
"Time-series forecasting with neural networks: A comparative study
using the airline data" is discussed, as is some more recent work
using data from the M-competition.

Neural networks are attractive to many practitioners because of their
flexibility and inherent non-linearity, but there has been
significant controversy about their application to the time-series
domain. The authors remain optimistic about the potential for neural
networks for situations with discontinuities in the data or for
forecasting longer time horizons. A set of principles is supplied for
a sensible application of neural net techniques.
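
For readers who have not tried a neural net on a series, the sketch
below shows the usual set-up in miniature: lagged values as inputs,
one hidden layer, trained by gradient descent on an invented series.
It is a toy of my own construction, deliberately missing the
safeguards (hold-out data, early stopping, comparisons with simple
benchmarks) that the chapter's principles call for.

import numpy as np

rng = np.random.default_rng(0)

# Invented monthly series: trend plus seasonality plus noise.
t = np.arange(120)
y = 0.05 * t + np.sin(2 * np.pi * t / 12) + 0.1 * rng.standard_normal(len(t))

# Standardise the series, then predict y[t] from the previous `lags` values.
mu, sigma = y.mean(), y.std()
ys = (y - mu) / sigma
lags = 12
X = np.array([ys[i - lags:i] for i in range(lags, len(ys))])
target = ys[lags:]

# One hidden layer of tanh units, trained by full-batch gradient descent
# on squared error.
hidden = 8
W1 = 0.5 * rng.standard_normal((lags, hidden))
b1 = np.zeros(hidden)
W2 = 0.5 * rng.standard_normal(hidden)
b2 = 0.0
lr = 0.05

for _ in range(3000):
    H = np.tanh(X @ W1 + b1)          # hidden activations
    err = H @ W2 + b2 - target        # residuals of the network output
    dH = np.outer(err, W2) * (1 - H ** 2)   # backpropagated error signal
    W2 -= lr * H.T @ err / len(X)
    b2 -= lr * err.mean()
    W1 -= lr * X.T @ dH / len(X)
    b1 -= lr * dH.mean(axis=0)

# One-step-ahead forecast from the last 12 observations, back on the
# original scale.
forecast = np.tanh(ys[-lags:] @ W1 + b1) @ W2 + b2
print("one-step forecast:", float(forecast * sigma + mu))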

"Extrapolation of Time Series and Cross-Sectional Data"

The chapter on "Extrapolation" is a gentle introduction to
time-series and contains a lot of common sense and useful guidelines.
It emphasises the principle of simplicity and recommends a simple
representation of trend unless there is evidence to the contrary: the
chapter discusses the empirical evidence to support this and refers
to the results of the M-competition. (The M forecasting competitions,
in which one technique is pitted against another, are well known to
time-series analysts but perhaps not so well known outside that
sphere, and this chapter gives pointers to the key results.)
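
The flavour of that advice is easy to show. Below is a minimal sketch
of simple exponential smoothing, one reasonable embodiment of "keep
the extrapolation simple"; the series and the smoothing constant are
invented for the example rather than taken from the chapter.

import numpy as np

def simple_exponential_smoothing(series, alpha=0.3):
    """Simple exponential smoothing: the forecast for every future
    period is the final smoothed level. `alpha` is chosen arbitrarily
    here; in practice it would be estimated or set by judgment."""
    level = series[0]
    for value in series[1:]:
        level = alpha * value + (1 - alpha) * level
    return level

# Invented quarterly sales figures.
sales = np.array([112.0, 118.0, 115.0, 121.0, 127.0, 124.0, 130.0])
print("flat extrapolation for all future quarters:",
      round(simple_exponential_smoothing(sales), 1))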

"Econometric Forecasting"

This chapter (by Geoffrey Allen, Department of Resource Economics,
University of Massachusetts, and Robert Fildes, The Management School,
Lancaster University) is highly readable and accessible to
non-econometricians: there is a refreshing level of understated
humour throughout.

It starts, "Econo-magic and economic tricks are two of the pejorative
terms its detractors use to describe the art and science of
econometrics. No doubt, these terms are well deserved in many
instances", but then goes on to discuss the source of the problems
and gives a brief but illuminating history of econometric forecasting,
including the 1925 work of Charles Sarle on forecasting the price of
hogs. And thence to "Unfortunately for forecasters, research by
econometricians has not focused on what works best with real-world
data but on which particular method would be optimal if a standard
assumption is violated in some well-defined way".

They go on to explain: "The principal tool of the econometrician is
regression analysis, using several causal variables ... compared
with univariate modeling, multivariate analysis opens up many more
choices for the investigator ...".

They identify the fundamental principle as being to "aim for a
relatively simple model specification", with 'relatively' being the
keyword in this context. There is discussion of the fact that an
econometric model can be too simple, of Zellner's KISS (Keep It
Sophisticatedly Simple) principle, and of the econometric axiom that
any model is a misspecification, so that what is wanted is a model
and estimation procedure that are robust to misspecification.

Gilbert's 1995 Monte Carlo study, "Combining VAR estimation and state
space model reduction for simple good predictions", is used as
support for the argument for parsimony.

An 8-step strategy for econometric forecasters is proposed, starting
from the eminently sensible "define the objectives of the modeling
effort". Statements like this are all too easily dismissed as
motherhood, but the thoughtful analyst will recognize that the
objective can radically influence the model form and estimation
methodology.

Each of the eight steps is discussed in compact but sufficient
detail, with sub-sections within each step. For example, there is
much discussion about the use of disaggregated data and aggregating
forecasts, and the evidence to support the bottom-up approach.

Most of the discussion centres on the use of VAR (vector
autoregression) models estimated by ordinary least squares.
Econometricians and non-econometricians alike will find the
discussion of co-integration and ECMs (error correction models) and
the range of possibilities illuminating. There is even a reference to
the classic "A Drunk and Her Dog" article (Murray 1994), which
describes co-integration thus: "As they wander home, the dog and its
owner may make their own little detours but will never be far apart.
But a drunk and someone else's dog will wander their separate ways".
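
To make the VAR-by-OLS idea concrete, here is a stripped-down sketch
with an invented bivariate system and a single lag. It omits the
co-integration testing, ECM machinery and specification checks the
chapter discusses, so it is very much a toy.

import numpy as np

rng = np.random.default_rng(1)

# Invented bivariate system: two series that influence each other with a lag.
n = 200
y = np.zeros((n, 2))
for i in range(1, n):
    y[i, 0] = 0.6 * y[i - 1, 0] + 0.2 * y[i - 1, 1] + rng.standard_normal()
    y[i, 1] = 0.3 * y[i - 1, 0] + 0.5 * y[i - 1, 1] + rng.standard_normal()

# VAR(1): each variable regressed on one lag of every variable, plus an
# intercept, estimated equation by equation with ordinary least squares.
lagged = y[:-1]
current = y[1:]
X = np.column_stack([np.ones(len(lagged)), lagged])
coefs, *_ = np.linalg.lstsq(X, current, rcond=None)   # shape (3, 2)

print("intercepts:", np.round(coefs[0], 3))
print("lag-1 coefficient matrix (rows = equations):\n", np.round(coefs[1:].T, 3))

# One-step-ahead forecast from the final observation.
forecast = coefs[0] + y[-1] @ coefs[1:]
print("one-step forecast:", np.round(forecast, 3))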

The conclusion of the chapter presents 23 principles of econometric
forecasting and for each the conditions under which it applies and
the evidence to support the recommendation. A review of this material
is highly recommended.

"Integrating, Adjusting and Combining Procedures"

The three papers in this section

"Judgmental Time Series Forecasting Using Domain Knowledge," Richard
Webby, Marcus O'Connor, and Michael Lawrence, University of South
Wales

"Judgmental Adjustments of Statistical Forecasts," Nada R. Sanders,
Department of Management Science, Wright State University and Larry
P. Ritzman, Operations and Strategic Management, Boston College

"Combining Forecasts," J. Scott Armstrong

address the issues of combining judgment with hard forecasts,
adjusting statistical forecasts, and combining forecasts produced by
different methods.

Combining forecasts is an area of great interest as it holds out the
promise of a combined forecast being more accurate than the
individual components.

Professor Armstrong reviews 57 studies relating to combining
forecasts and presents a useful set of principles to employ in
forecast combination.
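
The mechanics are as simple as the idea. The sketch below compares an
equal-weight combination of two invented forecasters against each one
alone; nothing in it is specific to Armstrong's chapter beyond the
general principle of combining.

import numpy as np

rng = np.random.default_rng(2)

# Invented hold-out period: 50 actual values and two imperfect forecasts,
# each biased and noisy in a different way.
actual = 100 + np.cumsum(rng.standard_normal(50))
forecast_a = actual + 2.0 + 3.0 * rng.standard_normal(50)   # biased high
forecast_b = actual - 1.5 + 3.0 * rng.standard_normal(50)   # biased low

combined = (forecast_a + forecast_b) / 2.0                  # equal weights

def mae(f):
    """Mean absolute error against the actual values."""
    return np.mean(np.abs(f - actual))

# The combination typically beats either component because the biases
# partly cancel and the independent noise partly averages out.
print("MAE, forecaster A:", round(mae(forecast_a), 2))
print("MAE, forecaster B:", round(mae(forecast_b), 2))
print("MAE, equal-weight combination:", round(mae(combined), 2))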

"Prediction Intervals for Time Series"

In the opinion of the reviewer, this paper (by the well known Chris
Chatfield, Department of Mathematical Sciences, University of Bath)
is one of the most lucid and important papers in "Principles of
Forecasting". It addresses the important issue of estimating
prediction intervals and thus providing interval forecasts as well as
the more usual point forecasts. Time-series specialists will
undoubtedly be aware of Dr Chatfield's most recent book "Time-Series
Forecasting" which addresses many of the issues touched upon here.

For those not familiar with the term, a prediction interval is "an
interval estimate for an unknown future value", and this terminology
is preferred to 'confidence interval'. Density forecasting, the
problem of estimating the complete probability distribution of some
future value, is a related topic.

Chatfield draws a clear distinction between a forecasting method and
a model, a method being a rule or formula for computing a point
forecast, which may or may not be based on a model. Exponential
smoothing, for example, is a method that need not be derived from an
explicit model. Models, of course, permit one to compute theoretical
prediction intervals, which may not be the case with methods.

Much of the balance of the chapter focuses on the computation of PIs,
the reasons for their not being computed routinely or being computed
incorrectly, the effects of model uncertainty on PIs, and the reasons
underlying the common observation that computed PIs are too narrow.

Theoretical formulae are available for computing PIs for various
classes of time-series models and some methods, and Chatfield gives
references and discusses these. He suggests that such theoretical
formulae might better be called "true model" formulae because they
assume there is a true known model and that the model parameters are
known exactly. Of course the parameters have to be estimated from the
data and the effect of parameter uncertainty can be non-trivial, in
certain circumstances. Selecting the correct model is clearly of
major importance, and an example is given of fitting two plausible
models to the same dataset, yielding two very different PIs. Chatfield
makes the important point that a wider PI is not necessarily "bad"
and that in that particular situation it was in fact the more
realistic.
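
As a concrete instance of such a "true model" formula - standard
time-series theory rather than anything specific to Chatfield's
chapter - the sketch below computes h-step-ahead 95% intervals for a
random walk with drift, treating the estimated parameters as known,
which is exactly the simplification the chapter warns about. The data
are invented.

import numpy as np

# "True model" prediction interval for a random walk with drift:
# y[t] = y[t-1] + c + e[t], e[t] ~ N(0, sigma^2). The h-step-ahead
# forecast is y[n] + h*c and its variance is h * sigma^2, so the
# 95% interval is forecast +/- 1.96 * sigma * sqrt(h).
rng = np.random.default_rng(3)
y = 100 + np.cumsum(0.2 + rng.standard_normal(300))   # invented series

diffs = np.diff(y)
drift = diffs.mean()
sigma = diffs.std(ddof=1)   # estimated from data, then treated as known

for h in (1, 4, 12):
    point = y[-1] + h * drift
    half_width = 1.96 * sigma * np.sqrt(h)
    print(f"h={h:2d}: point {point:7.2f}, 95% PI "
          f"[{point - half_width:7.2f}, {point + half_width:7.2f}]")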

There is a brief discussion of the pitfalls of approximate formulae :
interested readers should refer to Chatfield "Time-Series
Forecasting" books for more information.

Computationally intensive methods including bootstrapping are
discussed, including the Williams and Goodman procedure of sequential
data splitting and refitting, which Chatfield suggests is now due for
reassessment. He also makes the point that bootstrapping sometimes
gives poor results and mentions some of the difficulties of
bootstrapping time-series.
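
A rough sketch of the sequential split-and-refit idea, as I
understand the general procedure (the specifics of Williams and
Goodman's method are not reproduced here), is to refit a model at
successive origins, collect the out-of-sample errors and use their
empirical quantiles as interval bounds. Here the "model" is simply the
naive random-walk forecast, and the data are invented.

import numpy as np

rng = np.random.default_rng(4)
y = 50 + np.cumsum(rng.standard_normal(150))   # invented series

# Sequential data splitting: at each origin, "refit" to the data so far
# (trivial for the naive forecast), make a one-step forecast and record
# the error.
errors = []
for origin in range(30, len(y) - 1):
    forecast = y[origin]               # naive forecast from data up to origin
    errors.append(y[origin + 1] - forecast)
errors = np.array(errors)

# Empirical 95% prediction interval for the next observation, built from
# the distribution of observed out-of-sample errors rather than a formula.
lo, hi = np.quantile(errors, [0.025, 0.975])
point = y[-1]
print(f"next-step 95% PI: [{point + lo:.2f}, {point + hi:.2f}]")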

There is some discussion on the use of Bayesian approaches, both for
finding the complete predictive distribution and for Bayesian model
averaging.

"Evaluating Forecasting Methods," J. Scott Armstrong

Professor Armstrong provides a broad overview of the evaluation task
ending with a useful evaluation checklist.

The chapter points up the need to compare methods against reasonable
alternatives, to use multiple competing hypotheses to test the
underlying assumptions, to evaluate outputs by replication, and to
assess outputs against prespecified criteria (in order to avoid the
tendency of people to reject disconfirming evidence).

There is a discussion on alternative error measures and a
recommendation to avoid root mean square errors (RMSE) for
comparisons across series.
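
The reason for that recommendation is easy to demonstrate: a
scale-dependent measure such as RMSE lets one large-valued series
dominate a comparison across series. The toy example below, with
invented numbers, contrasts RMSE with a percentage error measure used
purely for illustration; the chapter itself discusses the choice of
measure in more detail.

import numpy as np

# Two invented series on very different scales, forecast by the same method.
actual_small = np.array([10.0, 12.0, 11.0, 13.0])
fcst_small = np.array([12.0, 10.0, 13.0, 11.0])          # errors of about 2

actual_large = np.array([1000.0, 1200.0, 1100.0, 1300.0])
fcst_large = np.array([1010.0, 1190.0, 1110.0, 1290.0])  # errors of about 10

def rmse(a, f):
    return float(np.sqrt(np.mean((a - f) ** 2)))

def mape(a, f):
    return float(np.mean(np.abs((a - f) / a)) * 100)

# RMSE is dominated by the large-valued series even though, relative to
# its scale, that series is forecast far more accurately.
print("RMSE  small:", rmse(actual_small, fcst_small),
      " large:", rmse(actual_large, fcst_large))
print("MAPE% small:", round(mape(actual_small, fcst_small), 1),
      " large:", round(mape(actual_large, fcst_large), 1))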

"Diffusion of Forecasting Principles: An Assessment of Forecasting
Software Programs"

The focus of the paper (by Leonard J. Tashman, University of Vermont,
and Jim Hoover, U.S. Navy) is the degree to which forecasting
software programs facilitate "best practice". This is probably of
most interest to decision makers attempting to standardize on a
forecasting software package or to those who need to audit some
forecasting practices.

Although it is difficult to come up with a summary score, it would
appear that forecasting software is about "halfway there" - that is,
about 50% of forecasting principles are implemented across all the
software evaluated. However, prediction interval forecasting, not
surprisingly, remains an area of weakness. The authors conclude that
best forecasting practices cannot readily be achieved within the
spreadsheet medium, and that, in general, dedicated business
forecasting programs for time-series data are more likely to provide
best-practice implementation than general statistical packages.

"Standards of Practice"

"Standards and Practices for Forecasting," J. Scott Armstrong

The last chapter of the book presents a somewhat daunting list of
some 139 forecasting principles in 16 categories.

The authors themselves recognize that 139 principles might be too
many, but point out that only some of them would be applicable in any
given situation.

As well as the 139 principles the chapter contains a brief discussion
on auditing forecasts, and re-presents the 139 principles as a
checklist (which is also available from the website).

Given its extensive peer review, it seems reasonable that this
checklist be used as the benchmark for auditing forecasting
procedures.

CONCLUSION:

This is a book to have on hand, not something which you sit down and
read from cover to cover in one weekend. It is a resource of depth,
scope and value.

It might be thought that breadth connotes shallowness, and an expert
in a particular aspect of forecasting might find coverage of his or
her particular area of expertise somewhat thin. While as a generality
that may be true, the work of the domain experts in summarizing and
distilling the evidence results in a clear statement of a set of
principles: those principles are thought-provoking even to a domain
expert.

Its broad scope and thorough summary of the available evidence make
it a valuable guide for any quantitative professional venturing into
an unfamiliar area of forecasting: its clear statement of principles
in each area encourages 'best practice' and the communication of that
best practice, and the supporting evidence, to the end user.

The Forecasting Principles website

http://hops.wharton.upenn.edu/forecast

contains much useful information about the book and forecasting
issues and practices.

A detailed table of contents, with links to abstracts of each of the
30 papers (a particularly nice touch), is available at

http://morris.wharton.upenn.edu/forecast/tofc.html

and the methodology tree at

http://morris.wharton.upenn.edu/forecast/insidecover.pdf


References

Chatfield, C. (2001), "Time-Series Forecasting", London: Chapman & Hall.

Faraway, J. & Chatfield, C. (1998), "Time-series forecasting with
neural networks: A comparative study using the airline data", Applied
Statistics, 47, 231-250.

Franses, P.H. & van Dijk, D. (2000), "Non-linear Time Series Models in
Empirical Finance", Cambridge: Cambridge University Press.

Gilbert, P.D. (1995), "Combining VAR estimation and state space model
reduction for simple good predictions", Journal of Forecasting, 14,
229-250.

Makridakis, S., Wheelwright, S.C. & Hyndman, R.J. (1998), "Forecasting
Methods and Applications" (3rd ed.), New York: Wiley.

Murray, M.P. (1994), "A drunk and her dog: An illustration of
cointegration and error correction", American Statistician, 48, 37-39.

Nash, J.C. & Nash, M.M. (2001), "Practical Forecasting for Managers",
London: Arnold.

The Reviewer

John Aitchison is Director of DataSciencesResearch

www.DataSciencesResearch.com

an independent research design, analysis, evaluation and audit consultancy
