In the following we assume a stationary stochastic process $(X_t)$, i.e., $\mathrm{E}[X_t]=\mu$ and $\mathrm{Cov}(X_t,X_{t+\tau})=\gamma_\tau$ for all $t$ and $\tau$. In the previous sections we have assumed that the process was known, and thus its moment functions were also known. In practice one observes only a realization of the process, $x_1,\ldots,x_n$, and thus there is the problem of estimating the moment functions.
The parameter $\mu$ can be estimated with the simple arithmetic sample mean:

$$\hat{\mu} = \bar{x} = \frac{1}{n}\sum_{t=1}^{n} x_t. \qquad (12.20)$$

The estimator $\hat{\mu}$ is unbiased since it holds that $\mathrm{E}[\hat{\mu}]=\mu$, and its variance is

$$\mathrm{Var}(\hat{\mu}) = \frac{1}{n^2}\sum_{t=1}^{n}\sum_{s=1}^{n}\mathrm{Cov}(X_t,X_s) = \frac{1}{n}\sum_{\tau=-(n-1)}^{n-1}\Big(1-\frac{|\tau|}{n}\Big)\gamma_\tau.$$
When the autocovariance function $\gamma_\tau$ is absolutely summable, it holds that $\lim_{n\to\infty} n\,\mathrm{Var}(\hat{\mu}) = \sum_{\tau=-\infty}^{\infty}\gamma_\tau$ and $\lim_{n\to\infty}\mathrm{Var}(\hat{\mu}) = 0$. The estimator $\hat{\mu}$ is then also a consistent estimator for $\mu$. In many cases there are more efficient estimators which take advantage of the correlation structure of the process.
The asymptotic variance $\sum_{\tau=-\infty}^{\infty}\gamma_\tau$ is denoted as $f(0)$, since this is exactly the spectral density $f(\lambda)=\sum_{\tau=-\infty}^{\infty}\gamma_\tau e^{-i\lambda\tau}$ at frequency zero. Under the absolute summability of $\gamma_\tau$ the following asymptotic distribution holds for the estimator:

$$\sqrt{n}\,(\hat{\mu}-\mu) \overset{\mathcal{L}}{\longrightarrow} \mathrm{N}\Big(0,\sum_{\tau=-\infty}^{\infty}\gamma_\tau\Big). \qquad (12.21)$$
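As a small numerical illustration (not from the text), the following sketch computes the sample mean (12.20) and a truncated estimate of the asymptotic variance $\sum_\tau \gamma_\tau$; the AR(1) process, its coefficient, and the truncation lag are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) process x_t = phi * x_{t-1} + eps_t (illustrative choice).
n, phi = 10_000, 0.5
eps = rng.standard_normal(n)
x = np.empty(n)
x[0] = eps[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

mu_hat = x.mean()  # the estimator (12.20)

def gamma_hat(x, tau):
    """Sample autocovariance with divisor n, cf. (12.22)."""
    n = len(x)
    xc = x - x.mean()
    return float((xc[: n - tau] * xc[tau:]).sum()) / n

# Truncated estimate of the asymptotic variance sum_tau gamma_tau
M = 50  # truncation lag (illustrative)
lrv = gamma_hat(x, 0) + 2 * sum(gamma_hat(x, k) for k in range(1, M + 1))

# For this AR(1) with unit innovation variance, sum_tau gamma_tau = 1 / (1 - phi)^2 = 4
print(mu_hat, lrv)
```

For long samples the truncated sum should be close to the theoretical value $1/(1-\phi)^2 = 4$ of this simulated process.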
A possible estimator of the covariance function $\gamma_\tau$ is

$$\hat{\gamma}_\tau = \frac{1}{n}\sum_{t=1}^{n-\tau}(x_t-\bar{x})(x_{t+\tau}-\bar{x}) \qquad (12.22)$$

with the mean estimator $\hat{\mu}=\bar{x}$ from (12.20). Instead of dividing by $n$ in (12.22) one could also divide by $n-\tau$, although the estimator would then have less favorable properties. The estimator $\hat{\gamma}_\tau$ is no longer unbiased, since the following can be shown:

$$\mathrm{E}[\hat{\gamma}_\tau] \approx \Big(1-\frac{\tau}{n}\Big)\big\{\gamma_\tau - \mathrm{Var}(\hat{\mu})\big\}.$$
Positive autocovariances are in general underestimated with $\hat{\gamma}_\tau$. Asymptotically, $\hat{\gamma}_\tau$ is nevertheless unbiased: $\lim_{n\to\infty}\mathrm{E}[\hat{\gamma}_\tau]=\gamma_\tau$. For the variance, when terms of higher order are ignored, it holds that

$$\mathrm{Var}(\hat{\gamma}_\tau) \approx \frac{1}{n}\sum_{j=-\infty}^{\infty}\big(\gamma_j^2+\gamma_{j+\tau}\gamma_{j-\tau}\big),$$

and since $\lim_{n\to\infty}\mathrm{Var}(\hat{\gamma}_\tau)=0$ holds, $\hat{\gamma}_\tau$ is a consistent estimator for $\gamma_\tau$. Furthermore, it can be shown that the covariance estimator behaves asymptotically like a normally distributed random variable:

$$\sqrt{n}\,(\hat{\gamma}_\tau-\gamma_\tau) \overset{\mathcal{L}}{\longrightarrow} \mathrm{N}\Big(0,\sum_{j=-\infty}^{\infty}\big(\gamma_j^2+\gamma_{j+\tau}\gamma_{j-\tau}\big)\Big).$$
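A minimal sketch of the estimator (12.22); the white-noise input is an illustrative choice, for which $\gamma_0=1$ and $\gamma_\tau=0$ otherwise.

```python
import numpy as np

def gamma_hat(x, tau):
    """Sample autocovariance (12.22): divide by n, not by n - tau."""
    n = len(x)
    xc = x - x.mean()
    return float((xc[: n - tau] * xc[tau:]).sum()) / n

rng = np.random.default_rng(1)
x = rng.standard_normal(5_000)  # white noise: gamma_0 = 1, gamma_tau = 0 for tau != 0

g0, g5 = gamma_hat(x, 0), gamma_hat(x, 5)
print(g0, g5)
```

One reason the divisor $n$ is usually preferred over $n-\tau$ is that the resulting sequence $\hat{\gamma}_0,\hat{\gamma}_1,\ldots$ is positive semidefinite, like a true autocovariance function.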
An obvious estimator for the ACF $\rho_\tau$ is

$$\hat{\rho}_\tau = \frac{\hat{\gamma}_\tau}{\hat{\gamma}_0}. \qquad (12.23)$$

Once again we have a bias of order $1/n$, i.e., $\mathrm{E}[\hat{\rho}_\tau]=\rho_\tau+\mathcal{O}(1/n)$, and $\hat{\rho}_\tau$ is asymptotically unbiased. For the variance it holds that

$$\mathrm{Var}(\hat{\rho}_\tau) \approx \frac{1}{n}\sum_{k=1}^{\infty}\big(\rho_{k+\tau}+\rho_{k-\tau}-2\rho_\tau\rho_k\big)^2.$$

The estimator $\hat{\rho}_\tau$ is consistent, since $\lim_{n\to\infty}\mathrm{Var}(\hat{\rho}_\tau)=0$. For the asymptotic distribution of the vector $\hat{\rho}=(\hat{\rho}_1,\ldots,\hat{\rho}_m)^\top$ it can be shown that

$$\sqrt{n}\,(\hat{\rho}-\rho) \overset{\mathcal{L}}{\longrightarrow} \mathrm{N}(0,\Omega)$$

with the covariance matrix $\Omega$ with the typical element

$$\omega_{ij} = \sum_{k=1}^{\infty}\big(\rho_{k+i}+\rho_{k-i}-2\rho_i\rho_k\big)\big(\rho_{k+j}+\rho_{k-j}-2\rho_j\rho_k\big)$$

(Bartlett's formula). In particular, for the asymptotic variance of $\hat{\rho}_\tau$ it holds that

$$\omega_{\tau\tau} = \sum_{k=1}^{\infty}\big(\rho_{k+\tau}+\rho_{k-\tau}-2\rho_\tau\rho_k\big)^2.$$
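Bartlett's formula can be evaluated numerically. The following sketch (the truncation lag $K$ and the AR(1) correlation $\rho_k=\phi^{|k|}$ are illustrative assumptions) recovers the known AR(1) value $\omega_{11}=1-\phi^2$.

```python
# Truncated evaluation of Bartlett's formula
# omega_{tau,tau} = sum_{k>=1} (rho_{k+tau} + rho_{k-tau} - 2*rho_tau*rho_k)^2.

def omega(tau, rho, K=2000):
    """Truncated Bartlett sum; rho(k) must accept any integer lag k."""
    return sum(
        (rho(k + tau) + rho(k - tau) - 2 * rho(tau) * rho(k)) ** 2
        for k in range(1, K + 1)
    )

phi = 0.5
rho = lambda k: phi ** abs(k)  # ACF of an AR(1) process (illustrative)

# Known closed form for AR(1) at tau = 1: omega_11 = 1 - phi**2
print(omega(1, rho), 1 - phi**2)
```

The terms of the sum decay geometrically here, so the truncation at $K$ is harmless; for slowly decaying ACFs a larger $K$ would be needed.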
Example 12.5 (MA(q))
For the MA(q) process in (11.1) we know that $\rho_\tau=0$ for all $\tau>q$. Thus the asymptotic variance can be simplified from $\omega_{\tau\tau}=\sum_{k=1}^{\infty}(\rho_{k+\tau}+\rho_{k-\tau}-2\rho_\tau\rho_k)^2$ for $\tau>q$ to

$$\omega_{\tau\tau} = 1+2\sum_{k=1}^{q}\rho_k^2.$$
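A small Monte Carlo sketch can be used to check this result; the MA(1) coefficient, sample size, and replication count below are illustrative assumptions. For $\tau=2>q=1$ the scaled variance of $\hat{\rho}_2$ should be close to $1+2\rho_1^2$.

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 0.8, 500, 2000
rho1 = theta / (1 + theta**2)  # ACF of MA(1) at lag 1

r2 = np.empty(reps)
for i in range(reps):
    eps = rng.standard_normal(n + 1)
    x = eps[1:] + theta * eps[:-1]        # MA(1) process
    xc = x - x.mean()
    g0 = (xc * xc).sum() / n              # gamma_hat_0, cf. (12.22)
    g2 = (xc[:-2] * xc[2:]).sum() / n     # gamma_hat_2
    r2[i] = g2 / g0                       # rho_hat_2, cf. (12.23)

# Empirical n * Var(rho_hat_2) versus the asymptotic value 1 + 2 * rho1^2
print(n * r2.var(), 1 + 2 * rho1**2)
```

With these settings the empirical scaled variance lands near the asymptotic value $1+2\rho_1^2\approx 1.48$, up to Monte Carlo and finite-sample error.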
Example 12.6 (White noise)
If $(X_t)$ is white noise, it holds that $\rho_0=1$ and $\rho_\tau=0$ for $\tau\neq 0$. The asymptotic covariance matrix of $\sqrt{n}\,(\hat{\rho}-\rho)$ is the identity matrix. Using this we can build approximate 95% confidence intervals for the ACF: $[-\frac{1}{n}\pm\frac{2}{\sqrt{n}}]$, where the centre $-\frac{1}{n}$ accounts for the bias of order $1/n$.
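As a quick check (the sample size and number of lags are illustrative choices), one can verify that the sample ACF of simulated white noise stays mostly inside this band:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000
x = rng.standard_normal(n)  # white noise
xc = x - x.mean()
g0 = (xc * xc).sum() / n

# Sample ACF (12.23) at lags 1..20 and the white-noise 95% band
lags = range(1, 21)
acf = [((xc[:-k] * xc[k:]).sum() / n) / g0 for k in lags]
lo, hi = -1 / n - 2 / np.sqrt(n), -1 / n + 2 / np.sqrt(n)

inside = sum(lo <= r <= hi for r in acf)
print(inside, "of 20 lags inside the 95% band")
```

On average about one lag in twenty falls outside the band, which is exactly what a 95% interval promises.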