12.7 Goodness-of-Fit test

We now turn to the derivation of the asymptotic distribution of $ k_n^{-1}\ell_n(\tilde{m}_{\hat{\theta}})$. We do this by discretizing $ \int_0^1 {\cal{N}}^2(s) ds$ as $ (k_n)^{-1}\sum_{j=1}^{k_n} {\cal{N}}^2(t_j)$, where $ \{t_j\}_{j=1}^{k_n}$ are the mid-points of the original bins used in formulating $ \ell_n(\tilde{m}_{\hat{\theta}})$. If we choose $ k_n = [(2h)^{-1}]$ such that $ \vert t_{j+1} - t_j\vert \ge 2h$ for all $ j$, then $ \{ {\cal{N}}(t_j)\}$ are independent and each $ {\cal{N}}(t_j) \sim \textrm{N}(h^{1/4} \Delta_n(t_j)/\sqrt{V(t_j)},1)$. This means that under the alternative $ H_1$

$\displaystyle \sum_{j=1}^{k_n} {\cal{N}}^2(t_j) \sim \chi_{k_n}^2(\gamma_{k_n}), $

a non-central $ \chi^2$ random variable with $ k_n$ degrees of freedom and non-centrality parameter $ \gamma_{k_n}=h^{1/4} \lbrace \sum_{j=1}^{k_n} \Delta_n^2(t_j)/V(t_j)\rbrace^{1/2}$. Under $ H_0$,

$\displaystyle \sum_{j=1}^{k_n} {\cal{N}}^2(t_j) \sim \chi_{k_n}^2 $

is $ \chi^2$-distributed with $ k_n$ degrees of freedom. This leads to a $ \chi^2$ test with significance level $ \alpha$ which rejects $ H_0$ if $ \ell_n(\tilde{m}_{\hat{\theta}}) > \chi^2_{k_n,\alpha}$ where $ \chi^2_{k_n,\alpha}$ is the ( $ 1-\alpha$)-quantile of $ \chi^2_{k_n}$. The asymptotic power of the $ \chi^2$ test is $ \textrm{P}\lbrace\chi_{k_n}^2(\gamma_{k_n}) > \chi^2_{k_n,\alpha}\rbrace$, which is sensitive to alternative hypotheses differing from $ H_0$ in all directions.
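The $ \chi^2$ test and its power can be checked numerically. The following is an illustrative sketch (not the authors' code): it draws the $ k_n$ independent standardized quantities $ {\cal{N}}(t_j)$ under $H_0$ and $H_1$ by Monte Carlo, with the bandwidth $h$ and the standardized shifts $ \Delta_n(t_j)/\sqrt{V(t_j)}$ chosen as hypothetical values for illustration.

```python
import numpy as np

# Monte Carlo sketch of the chi-square test and its power
# P{chi^2_{k_n}(gamma_{k_n}) > chi^2_{k_n, alpha}}.
# h and the shift function below are hypothetical illustrative choices.
rng = np.random.default_rng(0)
h = 0.05
k_n = int(1.0 / (2 * h))                    # k_n = [(2h)^{-1}]
t = (np.arange(k_n) + 0.5) / k_n            # bin mid-points t_j
shift = h ** 0.25 * np.sin(2 * np.pi * t)   # means of N(t_j) under H_1

B = 200_000
z0 = rng.standard_normal((B, k_n))          # N(t_j) under H_0
null_stat = (z0 ** 2).sum(axis=1)           # chi^2_{k_n} draws
alt_stat = ((z0 + shift) ** 2).sum(axis=1)  # noncentral chi^2_{k_n}(gamma_{k_n})

alpha = 0.05
crit = np.quantile(null_stat, 1 - alpha)    # chi^2_{k_n, alpha}
power = (alt_stat > crit).mean()
print(f"k_n={k_n}, size={np.mean(null_stat > crit):.3f}, power={power:.3f}")
```

With this mild hypothetical drift the simulated power sits only slightly above the nominal level, reflecting that the non-centrality $ \gamma_{k_n}^2 = h^{1/2}\sum_j \Delta_n^2(t_j)/V(t_j)$ shrinks with $h$.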

We may also establish the asymptotic normality of $ (k_n)^{-1}\sum_{j=1}^{k_n} {\cal{N}}^2(t_j)$ by applying the central limit theorem for a triangular array, which together with (12.28) and (12.29) means that

$\displaystyle k_n^{-1} \ell_n(\tilde{m}_{\hat{\theta}}) \stackrel{ {\cal{L}} }{\longrightarrow} \textrm{N} \biggl( 1 + h^{1/2} \int \Delta_n^2(s)V^{-1}(s) ds, \;
2 h K^{(4)}(0) \{K^{(2)}(0)\}^{-2} \biggr). $

A test for $ H_0$ with an asymptotic significance level $ \alpha$ is to reject $ H_0$ if

$\displaystyle k_n^{-1} \ell_n(\tilde{m}_{\hat{\theta}}) > 1 + z_{\alpha} \{K^{(2)}(0)\}^{-1} \sqrt{ 2h K^{(4)}(0) }$ (12.29)

where $ \textrm{P}(Z > z_{\alpha}) = \alpha$ and $ Z \sim \textrm{N}(0,1)$. The asymptotic power of this test is

$\displaystyle 1 - \Phi \biggl \lbrace z_{\alpha} - { K^{(2)}(0) \int \Delta_n^2(s)V^{-1}(s) ds \over \sqrt{ 2 K^{(4)}(0) }} \biggr \rbrace.$ (12.30)
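The critical value in (12.29) and the power (12.30) can be evaluated directly once the kernel constants are fixed. The sketch below assumes $ K^{(r)}$ denotes the $r$-fold convolution of the kernel $K$ and takes $K$ to be the standard Gaussian density, so $ K^{(r)}(0) = (2\pi r)^{-1/2}$; the drift integral $ \int \Delta_n^2(s)V^{-1}(s)\,ds = 0.5$ is a hypothetical value for illustration.

```python
import math

# Sketch of the normal-approximation test (12.29) and its power (12.30),
# assuming K^{(r)} is the r-fold convolution of K and K is the standard
# Gaussian density, so K^{(r)}(0) = 1/sqrt(2*pi*r).  Hypothetical inputs.
def K_conv_at_zero(r):
    return 1.0 / math.sqrt(2.0 * math.pi * r)

K2, K4 = K_conv_at_zero(2), K_conv_at_zero(4)
h = 0.05
alpha = 0.05
z_alpha = 1.6448536269514722              # upper 5% point of N(0,1)

# rejection boundary for k_n^{-1} l_n from (12.29)
crit = 1.0 + z_alpha * math.sqrt(2.0 * h * K4) / K2

# power (12.30) with a hypothetical drift integral
drift = 0.5                               # int Delta_n^2(s) V^{-1}(s) ds
Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
power = 1.0 - Phi(z_alpha - K2 * drift / math.sqrt(2.0 * K4))
print(f"critical value={crit:.3f}, power={power:.3f}")
```

Note that the bandwidth $h$ enters the critical value through $\sqrt{2hK^{(4)}(0)}$ but cancels out of the power expression (12.30), since the mean shift in the limiting normal distribution is also of order $h^{1/2}$.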

We see from the above that the binning based on the bandwidth value $ h$ plays a key role in the derivation of the asymptotic distributions. However, the binning discretizes the null hypothesis and unavoidably leads to some loss of power, as shown in the simulation reported in the next section. From the point of view of retaining power, we would like to have the size of the bins smaller than that prescribed by the smoothing bandwidth in order to increase the resolution of the discretized null hypothesis towards the original $ H_0$. However, this will create dependence between the empirical likelihood evaluated at neighbouring bins and make the above asymptotic distributions invalid. One possibility is to evaluate the distribution of $ \int_0^1 {\cal{N}}_0^2(s) ds$ by using the approach of Wood and Chan (1994) to simulate the normal process $ {\cal{N}}_0(s)$ under $ H_0$. However, this is not our focus here and hence is not considered in this chapter.
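The simulation route just mentioned can be sketched as follows. Wood and Chan (1994) use circulant embedding; the minimal stand-in below simply generates the process on a fine grid by kernel-smoothing white noise and standardizing pointwise, which yields unit marginal variance and the short-range dependence $ {\cal{N}}_0(s)$ inherits from the bandwidth. The Gaussian kernel and all numerical settings here are hypothetical choices for illustration, not a prescription from the text.

```python
import numpy as np

# Sketch: approximate the null distribution of int_0^1 N_0^2(s) ds by
# simulating N_0 on a fine grid (a simple stand-in for the circulant-
# embedding method of Wood and Chan, 1994).  N_0 is built by smoothing
# white noise with a Gaussian kernel of bandwidth h and standardizing,
# so each N_0(s) is marginally N(0,1).  All settings are illustrative.
rng = np.random.default_rng(1)
h = 0.05
m = 200                                       # evaluation grid on [0,1]
s = (np.arange(m) + 0.5) / m
du = 5e-4
u = np.arange(-4 * h, 1 + 4 * h, du)          # white-noise grid, padded past [0,1]

Kmat = np.exp(-0.5 * ((s[:, None] - u[None, :]) / h) ** 2)
Kmat /= np.sqrt((Kmat ** 2).sum(axis=1, keepdims=True) * du)  # unit variance

B = 2000
W = np.sqrt(du) * rng.standard_normal((B, u.size))  # Brownian increments
N0 = W @ Kmat.T                               # B sample paths of N_0
integrals = (N0 ** 2).mean(axis=1)            # int_0^1 N_0^2(s) ds on the grid

alpha = 0.05
crit = np.quantile(integrals, 1 - alpha)      # simulated critical value
print(f"mean={integrals.mean():.3f}, 95% point={crit:.3f}")
```

The simulated mean of $ \int_0^1 {\cal{N}}_0^2(s) ds$ is close to one, consistent with the unit marginal variance, while its upper quantile supplies a critical value free of the binning discretization.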