6.3 ARCH(q) Model

Engle's (1982) original ARCH model assumes


$\displaystyle y_{t}$ $\displaystyle =$ $\displaystyle u_{t}$
$\displaystyle u_{t}$ $\displaystyle =$ $\displaystyle \sigma_{t}\varepsilon_{t},\qquad \varepsilon_{t}\sim \textrm{i.i.d.}\; N(0,1)$
$\displaystyle \sigma_{t}^{2}$ $\displaystyle =$ $\displaystyle \alpha_{0}+\sum_{i=1}^{q}\alpha_{i}u_{t-i}^{2}$ (6.12)

with $ \alpha_{0}>0$, and $ \alpha_{i}\geq 0,\ i=1,...,q$, to ensure that the conditional variance is positive.
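A minimal simulation sketch of this recursion, assuming standard normal innovations $\varepsilon_{t}$ (the function name simulate_arch, the burn-in length and the zero pre-sample values are our own choices, not part of the model):

\begin{verbatim}
import numpy as np

def simulate_arch(alpha0, alphas, T, burn=500, seed=0):
    """Simulate u_t = sigma_t * eps_t with
    sigma_t^2 = alpha0 + sum_i alphas[i-1] * u_{t-i}^2   (ARCH(q))."""
    rng = np.random.default_rng(seed)
    alphas = np.asarray(alphas)
    q = len(alphas)
    u = np.zeros(T + burn)
    sigma2 = np.zeros(T + burn)
    for t in range(T + burn):
        past = u[max(t - q, 0):t][::-1]            # u_{t-1}, ..., u_{t-q}
        sigma2[t] = alpha0 + np.sum(alphas[:len(past)] * past ** 2)
        u[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return u[burn:], sigma2[burn:]
\end{verbatim}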

The basic idea of these models is to increase the order of the autoregressive polynomial described in (1.1).

For this purpose, we define $\upsilon_{t}=u_{t}^{2}-\sigma_{t}^{2}$, so that the squared error can be written as

$\displaystyle u_{t}^{2}=\alpha_{0}+\sum_{i=1}^{q}\alpha_{i}u_{t-i}^{2}+\upsilon_{t}$

where $\upsilon_{t}$ is a non-Gaussian, zero-mean white noise process.
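To make the white-noise claim explicit: by the definition of the conditional variance, $E(u_{t}^{2}\vert I_{t-1})=\sigma_{t}^{2}$, so that

$\displaystyle E(\upsilon_{t}\vert I_{t-1})=E(u_{t}^{2}\vert I_{t-1})-\sigma_{t}^{2}=0$

i.e. $\upsilon_{t}$ is a martingale difference sequence and hence serially uncorrelated, while $u_{t}^{2}\geq 0$ implies $\upsilon_{t}\geq -\sigma_{t}^{2}$, so $\upsilon_{t}$ cannot be Gaussian.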

The derivation of the unconditional moments of the ARCH($ q$) process is analogous to ARCH(1). The necessary and sufficient condition for the existence of stationary variance is

$\displaystyle \sum_{i=1}^{q}\alpha_{i}<1$

When this condition is satisfied, the variance of the process is

$\displaystyle \sigma^{2}=\frac{\alpha_{0}}{1-\sum_{i=1}^{q}\alpha_{i}}$

Although the variance of $u_{t}$ conditioned on $I_{t-1}$ changes with the elements of the information set (it depends on the past through the $q$ most recent values of the squared innovation process), the ARCH process is unconditionally homoscedastic.
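For instance, with $q=2$ and the parameter values $\alpha_{0}=\alpha_{1}=\alpha_{2}=1/3$ used in the example below, the condition holds since $\alpha_{1}+\alpha_{2}=2/3<1$, and the unconditional variance is

$\displaystyle \sigma^{2}=\frac{1/3}{1-1/3-1/3}=1$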

6.3.0.0.1 Example

We simulate an ARCH(2) process with parameters $\alpha_0 = 1/3$ and $\alpha_1=\alpha_2=1/3$ and compare the autocorrelation function (ACF) of the original and the squared simulated values. The ACF of the squared values shows significant autocorrelation at the first two lags.
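A sketch of this experiment, assuming standard normal innovations and computing the sample ACF directly with numpy (the helper sample_acf, the sample size and the seed are arbitrary choices):

\begin{verbatim}
import numpy as np

def sample_acf(x, nlags):
    """Sample autocorrelation function up to lag nlags."""
    x = np.asarray(x) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([1.0] + [np.dot(x[:-k], x[k:]) / denom
                             for k in range(1, nlags + 1)])

# simulate an ARCH(2) process with alpha_0 = alpha_1 = alpha_2 = 1/3
rng = np.random.default_rng(42)
alpha0, a1, a2 = 1/3, 1/3, 1/3
T, burn = 1000, 500
u = np.zeros(T + burn)
for t in range(2, T + burn):
    sigma2 = alpha0 + a1 * u[t - 1] ** 2 + a2 * u[t - 2] ** 2
    u[t] = np.sqrt(sigma2) * rng.standard_normal()
u = u[burn:]

print(sample_acf(u, 10))       # raw series: roughly zero beyond lag 0
print(sample_acf(u ** 2, 10))  # squared series: noticeable at the first lags
print(1.96 / np.sqrt(T))       # approximate 95% white-noise band
\end{verbatim}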

After the model is estimated, we plot a picture similar to Figure 6.7 to show that the volatility does not vanish as quickly as in the ARCH(1) model.

Figure 6.8: Simulated time series with the volatility bands estimated from the ARCH(2) model.
\includegraphics[width=0.75\defpicwidth]{arch2vol.ps}

The log-likelihood function of the standard ARCH($q$) model in (6.12), conditioned on initial observations, is given by

$\displaystyle L(\alpha_{0},\alpha_{1},...,\alpha_{q})=\sum_{t=1}^{T}l_{t}$

where

$\displaystyle l_{t} = -\frac{1}{2}\log(\sigma_{t}^{2})-\frac{1}{2}\frac{u_{t}^{2}}{\sigma_{t}^{2}}$ (6.13)

apart from some constant in the likelihood.

Let $ z_{t}^{\top }=(1,u_{t-1}^{2},...,u_{t-q}^{2})$ and $ \alpha^{\top }=(\alpha_{0},\alpha_{1},...,\alpha_{q}) $ so that the conditional variance can be written as $ \sigma_t^{2}=z_{t}^{\top }\alpha$.
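As a sketch, the conditional log-likelihood (6.13) can be evaluated directly from this representation; the function below (arch_loglik is a hypothetical name) conditions on the first $q$ observations and drops the constant term:

\begin{verbatim}
import numpy as np

def arch_loglik(alpha, u):
    """Conditional Gaussian log-likelihood of an ARCH(q) model.
    alpha = (alpha_0, alpha_1, ..., alpha_q); the first q observations
    of u are treated as fixed initial values."""
    alpha = np.asarray(alpha)
    u = np.asarray(u)
    q = len(alpha) - 1
    ll = 0.0
    for t in range(q, len(u)):
        z_t = np.concatenate(([1.0], u[t - q:t][::-1] ** 2))  # (1, u_{t-1}^2, ..., u_{t-q}^2)
        sigma2_t = z_t @ alpha                                 # sigma_t^2 = z_t' alpha
        ll += -0.5 * np.log(sigma2_t) - 0.5 * u[t] ** 2 / sigma2_t
    return ll
\end{verbatim}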

The first order conditions then become simply

$\displaystyle \frac{\partial L}{\partial \alpha}=\sum_{t=1}^{T}\frac{1}{2\sigma_{t}^{2}}\,z_{t}\left(\frac{u_{t}^{2}}{\sigma_{t}^{2}}-1\right)$

and the estimate of the information matrix is given in (6.10).
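A sketch of how the analytic score could be used in estimation, assuming the residual series is stored in a numpy array u; scipy.optimize.minimize is called with this gradient (the names _design, arch_negloglik and arch_negscore are our own):

\begin{verbatim}
import numpy as np
from scipy.optimize import minimize

def _design(u, q):
    """Rows z_t = (1, u_{t-1}^2, ..., u_{t-q}^2) for t = q, ..., T-1."""
    T = len(u)
    return np.column_stack([np.ones(T - q)] +
                           [u[q - i:T - i] ** 2 for i in range(1, q + 1)])

def arch_negloglik(alpha, u, q):
    Z = _design(u, q)
    sigma2 = Z @ alpha                       # sigma_t^2 = z_t' alpha
    return 0.5 * np.sum(np.log(sigma2) + u[q:] ** 2 / sigma2)

def arch_negscore(alpha, u, q):
    """Gradient of arch_negloglik: minus sum_t z_t (u_t^2/sigma_t^2 - 1)/(2 sigma_t^2)."""
    Z = _design(u, q)
    sigma2 = Z @ alpha
    return -0.5 * Z.T @ ((u[q:] ** 2 / sigma2 - 1.0) / sigma2)

# hypothetical usage, for a residual series u and order q:
# start = np.full(q + 1, 0.1)
# res = minimize(arch_negloglik, start, args=(u, q), jac=arch_negscore,
#                method="L-BFGS-B",
#                bounds=[(1e-6, None)] + [(0.0, None)] * q)
\end{verbatim}

Maximizing the likelihood is done by minimizing its negative; the bounds simply impose the positivity constraints on $\alpha_{0}$ and $\alpha_{i}$ stated below (6.12).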