Forecasting: Methods and Applications

1/THE FORECASTING PERSPECTIVE

1/1 Why forecast?  2
1/2 An overview of forecasting techniques  6
1/2/1 Explanatory versus time series forecasting  10
1/2/2 Qualitative forecasting  12
1/3 The basic steps in a forecasting task  13
References and selected bibliography  17
Exercises  19

Chapter 1. The Forecasting Perspective

1/1 Why forecast?

Frequently there is a time lag between awareness of an impending
event or need and occurrence of that event. This lead time is the
main reason for planning and forecasting. If the lead time is zero
or very small, there is no need for planning. If the lead time is
long, and the outcome of the final event is conditional on identifiable
factors, planning can perform an important role. In such situations,
forecasting is needed to determine when an event will occur or a need
arise, so that appropriate actions can be taken.

In management and administrative situations the need for planning
is great because the lead time for decision making ranges from several
years (for the case of capital investments) to a few days or hours
(for transportation or production schedules) to a few seconds (for
telecommunication routing or electrical utility loading). Forecasting
is an important aid in effective and efficient planning.

Opinions on forecasting are probably as diverse as views on any
set of scientific methods used by decision makers. The layperson may
question the validity and efficacy of a discipline aimed at predicting an
uncertain future. However, it should be recognized that substantial
progress has been made in forecasting over the past several centuries.
There are a large number of phenomena whose outcomes can now
be predicted easily. The sunrise can be predicted, as can the speed
of a falling object, the trajectory of a satellite, rainy weather, and a
myriad of other events. However, that was not always the case.

The evolution of science has increased the understanding of various
aspects of the environment and consequently the predictability of
many events. For example, when the Ptolemaic system of astronomy
was developed almost 1900 years ago, it could predict the movement
of any star with an accuracy unheard of before that time. Even then,
however, systematic errors were common. Then came the emergence
of Copernican astronomy, which was much more accurate than its
Ptolemaic predecessor and could predict the movement of the stars
to within hundredths of a second. Today, modern astronomy is far
more accurate than Copernican astronomy. The same increase in
accuracy is shown in the theory of motion, which Aristotle, Galileo,
Newton, and Einstein each improved.

The trend to be able to more accurately predict a wider variety

Chapter 7. The Box-Jenkins methodology for ARIMA models

[Figure 7-2: Time plot of the data in Table 7-1. Horizontal axis: Time, 0 to 30; vertical axis: observed values, roughly 20 to 100.]

r1 = 0.103     r6 = 0.025
r2 = 0.099     r7 = 0.275
r3 = −0.043    r8 = −0.004
r4 = −0.031    r9 = −0.011
r5 = −0.183    r10 = −0.152
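Autocorrelations such as these come from the standard sample ACF formula: cross-products of deviations from the series mean at lag k, divided by the total sum of squared deviations. A minimal sketch follows; since Table 7-1's data is not reproduced here, illustrative white noise stands in for it.

```python
import random

def acf(y, k):
    """Sample autocorrelation at lag k: sum of lag-k cross-products of
    deviations from the mean, divided by the sum of squared deviations."""
    n = len(y)
    ybar = sum(y) / n
    num = sum((y[t] - ybar) * (y[t + k] - ybar) for t in range(n - k))
    den = sum((v - ybar) ** 2 for v in y)
    return num / den

# Illustrative stand-in for the Table 7-1 series (level and spread
# are assumptions, not the book's values).
random.seed(1)
y = [50 + random.gauss(0, 10) for _ in range(36)]
r = [acf(y, k) for k in range(1, 11)]  # r1 ... r10, as plotted in Figure 7-3
```

Note that the denominator uses the full-series sum of squares at every lag, which is the conventional definition and keeps each r_k between −1 and 1 for moderate lags.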

[Figure 7-3: Autocorrelations for the series in Table 7-1; lags 1 to 10 on the horizontal axis, ACF from −1.0 to 1.0 on the vertical axis.]

7/1 Examining correlations in time series data

7/1/2 A white noise model

Equation (7.2) is a simple random model where observation Yt is made
up of two parts, an overall level, c, and a random error component,
et, which is uncorrelated from period to period:

Yt = c + et . (7.2)

The data in Table 7-1 were obtained from this model. It is often called
a “white noise” model—a terminology which comes from engineering.

The white noise model is fundamental to many techniques in time
series analysis. In fact, we have already used it in earlier chapters.
Any good forecasting model should have forecast errors which follow
a white noise model (see Section 2/4/5).
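A minimal simulation of equation (7.2) makes the model concrete. The level c and the error standard deviation below are illustrative assumptions, not the values behind Table 7-1.

```python
import random

# White noise model of equation (7.2): Y_t = c + e_t, where the errors
# e_t are uncorrelated from period to period.
random.seed(42)
c = 50.0          # overall level (assumed value for illustration)
n = 36            # series length (assumed)
series = [c + random.gauss(0, 10) for _ in range(n)]

# Under this model the sample mean estimates c; past observations
# carry no information about future ones.
sample_mean = sum(series) / n
```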

7/1/3 The sampling distribution of autocorrelations

With a time series which is white noise, the sampling theory of rk
is known and so the properties of the ACF can be studied for this
model.

One way of approaching this problem is to study the rk values one
at a time and to develop a standard error formula to test whether a
particular rk is significantly different from zero.

Theoretically, all autocorrelation coefficients for a series of random
numbers must be zero. But because we have finite samples, each of
the sample autocorrelations will not be exactly zero. It has been
shown by Anderson (1942), Bartlett (1946), Quenouille (1949), and
others, that the autocorrelation coefficients of white noise data have
a sampling distribution that can be approximated by a normal curve
with mean zero and standard error 1/√n, where n is the number of
observations in the series. This information can be used to develop
tests of hypotheses similar to those of the F-test and the t-tests
examined in Chapters 5 and 6.

For example, 95% of all sample autocorrelation coefficients must
lie within a range specified by the mean plus or minus 1.96 standard
errors.¹ Since the mean is zero and the standard error is 1/√n, the
95% critical values are ±1.96/√n.

¹The value of 1.96 is found by looking at Table A, Appendix III, of areas under
the normal curve. Since it is close to 2, it is often approximated by 2.
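Applying these bounds to the ten autocorrelations quoted for Figure 7-3 can be sketched as follows. The series length n = 36 is an assumption here, since it is not stated on this page.

```python
import math

# 95% critical bounds +/-1.96/sqrt(n) for the r_k of white noise.
n = 36                          # assumed series length for Table 7-1
bound = 1.96 / math.sqrt(n)     # about 0.327

# The ten sample autocorrelations quoted for Figure 7-3:
r = [0.103, 0.099, -0.043, -0.031, -0.183,
     0.025, 0.275, -0.004, -0.011, -0.152]

# Lags whose coefficients fall outside the bounds, i.e. are
# significantly different from zero at the 5% level.
significant = [k + 1 for k, rk in enumerate(r) if abs(rk) > bound]
```

With these values every |r_k| stays inside the bounds, which is consistent with the series being white noise.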

Appendix III

Table entry gives DWL and DWU for a 1% one-sided test of the
Durbin-Watson statistic. n = number of observations; k = number of
parameters (so the number of explanatory variables is k − 1).

           k = 2        k = 3        k = 4        k = 5        k = 6
  n     DWL   DWU    DWL   DWU    DWL   DWU    DWL   DWU    DWL   DWU

15 0.81 1.07 0.70 1.25 0.59 1.46 0.49 1.70 0.39 1.96
16 0.84 1.09 0.74 1.25 0.63 1.44 0.53 1.66 0.44 1.90
17 0.87 1.10 0.77 1.25 0.67 1.43 0.57 1.63 0.48 1.85
18 0.90 1.12 0.80 1.26 0.71 1.42 0.61 1.60 0.52 1.80
19 0.93 1.13 0.83 1.26 0.74 1.41 0.65 1.58 0.56 1.77
20 0.95 1.15 0.86 1.27 0.77 1.41 0.68 1.57 0.60 1.74
21 0.97 1.16 0.89 1.27 0.80 1.41 0.72 1.55 0.63 1.71
22 1.00 1.17 0.91 1.28 0.83 1.40 0.75 1.54 0.66 1.69
23 1.02 1.19 0.94 1.29 0.86 1.40 0.77 1.53 0.70 1.67
24 1.04 1.20 0.96 1.30 0.88 1.41 0.80 1.53 0.72 1.66
25 1.05 1.21 0.98 1.30 0.90 1.41 0.83 1.52 0.75 1.65
26 1.07 1.22 1.00 1.31 0.93 1.41 0.85 1.52 0.78 1.64
27 1.09 1.23 1.02 1.32 0.95 1.41 0.88 1.51 0.81 1.63
28 1.10 1.24 1.04 1.32 0.97 1.41 0.90 1.51 0.83 1.62
29 1.12 1.25 1.05 1.33 0.99 1.42 0.92 1.51 0.85 1.61
30 1.13 1.26 1.07 1.34 1.01 1.42 0.94 1.51 0.88 1.61
31 1.15 1.27 1.08 1.34 1.02 1.42 0.96 1.51 0.90 1.60
32 1.16 1.28 1.10 1.35 1.04 1.43 0.98 1.51 0.92 1.60
33 1.17 1.29 1.11 1.36 1.05 1.43 1.00 1.51 0.94 1.59
34 1.18 1.30 1.13 1.36 1.07 1.43 1.01 1.51 0.95 1.59
35 1.19 1.31 1.14 1.37 1.08 1.44 1.03 1.51 0.97 1.59
36 1.21 1.32 1.15 1.38 1.10 1.44 1.04 1.51 0.99 1.59
37 1.22 1.32 1.16 1.38 1.11 1.45 1.06 1.51 1.00 1.59
38 1.23 1.33 1.18 1.39 1.12 1.45 1.07 1.52 1.02 1.58
39 1.24 1.34 1.19 1.39 1.14 1.45 1.09 1.52 1.03 1.58
40 1.25 1.34 1.20 1.40 1.15 1.46 1.10 1.52 1.05 1.58
45 1.29 1.38 1.24 1.42 1.20 1.48 1.16 1.53 1.11 1.58
50 1.32 1.40 1.28 1.45 1.24 1.49 1.20 1.54 1.16 1.59
55 1.36 1.43 1.32 1.47 1.28 1.51 1.25 1.55 1.21 1.59
60 1.38 1.45 1.35 1.48 1.32 1.52 1.28 1.56 1.25 1.60
65 1.41 1.47 1.38 1.50 1.35 1.53 1.31 1.57 1.28 1.61
70 1.43 1.49 1.40 1.52 1.37 1.55 1.34 1.58 1.31 1.61
75 1.45 1.50 1.42 1.53 1.39 1.56 1.37 1.59 1.34 1.62
80 1.47 1.52 1.44 1.54 1.42 1.57 1.39 1.60 1.36 1.62
85 1.48 1.53 1.46 1.55 1.43 1.58 1.41 1.60 1.39 1.63
90 1.50 1.54 1.47 1.56 1.45 1.59 1.43 1.61 1.41 1.64
95 1.51 1.55 1.49 1.57 1.47 1.60 1.45 1.62 1.42 1.64

100 1.52 1.56 1.50 1.58 1.48 1.60 1.46 1.63 1.44 1.65

Source: The Durbin-Watson tables are taken from J. Durbin and G.S. Watson
(1951) “Testing for serial correlation in least squares regression,” Biometrika, 38,
159–177. Reprinted with the kind permission of the publisher and the authors.
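The statistic these bounds are read against is the ratio of the sum of squared successive differences of the regression residuals to their sum of squares. A minimal sketch:

```python
def durbin_watson(residuals):
    """Durbin-Watson statistic: sum of squared successive differences
    of the residuals divided by their sum of squares.  Values near 2
    suggest no first-order autocorrelation; values below DWL indicate
    positive autocorrelation, values above 4 - DWL negative."""
    num = sum((residuals[t] - residuals[t - 1]) ** 2
              for t in range(1, len(residuals)))
    den = sum(e * e for e in residuals)
    return num / den

# Reading the table: with n = 20 observations and k = 2 parameters,
# the 1% bounds are DWL = 0.95 and DWU = 1.15, so a computed value
# below 0.95 would signal positive serial correlation; between the
# bounds the test is inconclusive.
```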

Statistical tables

G: Normally distributed observations

Random independent observations
from a standard normal distribution
(mean 0 and standard deviation 1).

0.807 0.550 -0.076 0.147 -0.768 -0.022 -0.671 0.395 0.497 1.008
-1.746 2.101 0.473 2.058 -1.133 0.129 -0.251 -0.685 -0.290 0.034
-0.528 -0.121 -1.262 -0.780 1.173 -0.826 -0.698 0.196 1.590 0.019
-0.487 -0.227 -1.218 0.102 0.541 -0.281 0.634 1.226 -1.755 -0.432
0.548 -0.331 -0.163 0.229 -0.915 -0.406 0.028 -1.653 -0.509 0.635

0.946 0.015 2.992 -0.649 -1.070 0.921 1.012 -0.765 -0.506 -0.128
-1.143 -2.068 -0.449 0.111 0.189 -1.488 0.655 -0.958 -0.472 -1.116
-0.508 -0.500 1.207 0.661 -0.428 0.465 0.282 2.406 0.250 0.331
-0.055 -0.708 0.206 -0.247 -1.333 -0.713 -1.803 -0.016 2.784 0.698
1.722 -0.046 0.158 0.753 -1.180 -0.284 -0.101 -0.289 0.679 1.019

-0.775 -1.225 1.163 -0.677 -0.158 0.184 -0.152 -0.149 0.395 -1.486
-0.425 -0.450 -1.267 -0.254 2.049 -0.195 -0.137 -0.629 -0.085 -0.623
0.052 0.571 -0.057 -0.018 0.023 0.342 1.105 0.891 0.957 0.090
-1.568 0.714 0.372 -2.171 0.001 1.457 -1.583 1.199 0.533 -0.595
-0.402 -0.528 1.679 0.102 -0.933 0.691 0.131 -1.041 -0.381 0.704

-0.379 -1.091 0.702 -1.718 1.925 0.608 1.580 0.110 0.595 -0.894
-0.219 2.480 0.876 0.333 -0.748 0.209 0.173 -0.822 -0.428 -0.515
1.102 -0.964 -0.597 -1.281 -0.493 -0.828 1.862 0.076 -0.238 -0.109
-0.067 -0.592 0.532 -0.136 0.673 -0.184 0.698 1.035 -0.740 2.658
-0.766 -0.547 -0.750 0.070 -0.105 2.796 0.521 -0.528 -0.087 -1.108

-0.040 0.244 0.926 -0.163 -0.882 0.686 -0.351 -0.928 1.128 -0.910
-0.840 -0.276 0.063 0.751 2.457 -1.881 -2.265 0.486 0.293 1.080
0.472 0.150 -1.024 1.265 1.163 -1.864 -1.052 -1.258 -0.246 0.212
-0.238 0.306 -1.478 -1.045 -0.314 0.393 0.507 -0.616 -0.624 -1.839
-1.838 1.940 0.836 0.379 0.450 -3.152 -0.251 1.744 1.088 -0.453

-1.347 -0.498 0.928 -2.171 0.227 -0.401 -0.896 2.266 -1.087 1.406
-0.597 -0.337 0.643 -1.093 0.012 0.735 1.313 -0.542 -1.709 0.114
-0.758 1.332 0.177 -0.394 1.939 0.656 -1.052 0.107 2.193 0.314
-0.629 -1.170 -1.099 -0.914 -0.605 0.451 1.529 -0.706 0.053 0.566
-0.127 0.310 0.881 0.385 0.507 -0.724 1.166 -1.139 0.417 0.979

-1.060 0.780 -0.769 0.558 -0.925 -1.875 -1.737 0.601 -0.096 2.050
-0.748 1.106 -0.558 -1.638 -1.830 1.303 0.190 0.374 1.127 -0.934
-0.747 -0.951 -1.259 -0.153 0.104 -0.520 -0.285 0.448 0.871 -0.447
-0.516 0.563 1.507 0.655 -1.207 0.437 -1.498 0.613 -0.357 0.560
-0.111 -0.359 -1.762 0.332 0.000 -0.650 1.212 0.390 -0.868 1.736

0.554 1.107 -0.204 -0.040 -0.114 0.813 -1.071 -0.321 0.974 -1.463
-0.388 0.527 1.205 -0.238 -0.003 -0.138 -0.926 -1.503 -0.464 -0.388
-0.846 -1.411 0.963 1.980 -0.399 0.258 1.279 -1.105 2.107 -0.769
-1.617 -1.017 0.722 -1.925 -0.128 -0.637 0.550 0.485 2.008 1.008
-1.787 -0.691 0.557 -0.856 0.216 0.695 -0.917 -0.500 0.540 0.137
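A table of draws like this one can be produced from uniform random numbers with the Box-Muller transform; the sketch below is one standard way to do it, not necessarily how this particular table was generated.

```python
import math
import random

def box_muller(u1, u2):
    """Box-Muller transform: maps two independent uniforms on (0, 1)
    to two independent standard normal draws (mean 0, std. dev. 1)."""
    radius = math.sqrt(-2.0 * math.log(u1))
    angle = 2.0 * math.pi * u2
    return radius * math.cos(angle), radius * math.sin(angle)

# Generate one row of ten draws, rounded to three decimals as in
# Table G (the seed is arbitrary).
rng = random.Random(0)
row = []
while len(row) < 10:
    z1, z2 = box_muller(rng.random(), rng.random())
    row.extend([round(z1, 3), round(z2, 3)])
```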
