6.3 Exercises

EXERCISE 6.1   Consider a uniform distribution on the interval $[0,\theta]$. What is the MLE of $\theta$? (Hint: the maximization here cannot be performed by means of derivatives. Here the support of $x$ depends on $\theta$!)
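A numerical sketch of the hint (the true $\theta$, the seed, the sample size, and the grid are illustrative assumptions, not part of the exercise): since the likelihood is $\theta^{-n}$ for $\theta\ge\max_i x_i$ and $0$ otherwise, it can be maximized by evaluating it on a grid rather than by differentiating.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 2.0                     # assumed value, for illustration only
x = rng.uniform(0, theta_true, size=200)

# The likelihood is theta^(-n) if theta >= max(x_i), and 0 otherwise,
# so evaluate it on a grid instead of setting a derivative to zero:
grid = np.linspace(0.5, 4.0, 2000)
lik = np.where(grid >= x.max(), grid ** (-len(x)), 0.0)
theta_hat = grid[np.argmax(lik)]
print(theta_hat)                     # lands just above max(x_i)
```

The plot of `lik` against `grid` makes the hint vivid: the likelihood jumps from $0$ to its maximum at the boundary of the admissible region and decreases thereafter.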

EXERCISE 6.2   Consider an i.i.d. sample of size $n$ from the bivariate population with pdf $f(x_1,x_2)=\frac{1}{\theta_1\theta_2}e^{-(\frac{x_1}{\theta_1}+\frac{x_2}{\theta_2})}$, $x_1,x_2>0$. Compute the MLE of $\theta=(\theta_1,\theta_2)$. Find the Cramér-Rao lower bound. Is it possible to derive a minimal variance unbiased estimator of $\theta$?
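A simulation sketch for this exercise (the true values of $\theta_1,\theta_2$, the seed, and the sample size are assumptions of the demo): the pdf factorizes, so the log-likelihood separates in $\theta_1$ and $\theta_2$, and setting each partial derivative to zero suggests the coordinatewise sample means as candidate MLEs, which the simulation can check.

```python
import numpy as np

rng = np.random.default_rng(1)
theta1, theta2 = 1.5, 0.5            # assumed true values for the demo
n = 10_000
x1 = rng.exponential(theta1, n)      # marginal of x_1 is exponential(theta_1)
x2 = rng.exponential(theta2, n)      # marginal of x_2 is exponential(theta_2)

# Candidate MLEs from the separated log-likelihood:
theta1_hat, theta2_hat = x1.mean(), x2.mean()
print(theta1_hat, theta2_hat)        # each close to its true value
```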

EXERCISE 6.3   Show that the MLE of Example 6.1, $\widehat\mu=\overline{x}$, is a minimal variance estimator for any finite sample size $n$ (i.e., without applying Theorem 6.3).

EXERCISE 6.4   We know from Example 6.4 that the MLE of Example 6.1 has $\data{F}_{1}=\data{I}_{p}$. This leads to

\begin{displaymath}\sqrt{n} (\overline x - \mu) \mathrel{\mathop{\longrightarrow}\limits_{}^{\cal L}} N_p(0,\data{I})\end{displaymath}

by Theorem 6.3. Can you give an analogous result for the square $\overline{x}^2$ for the case $p=1$?
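A Monte Carlo sketch for the case $p=1$ (the value of $\mu$, the seed, $n$, and the number of replications are illustrative assumptions): the delta method applied to the map $t\mapsto t^2$ predicts the limiting standard deviation $2|\mu|$ for $\sqrt{n}(\overline{x}^2-\mu^2)$ when $\mathop{\hbox{Var}}(x)=1$, which can be checked empirically.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, n, reps = 2.0, 500, 20_000       # assumed values for the demo
xbar = rng.normal(mu, 1.0, size=(reps, n)).mean(axis=1)

# Delta method: sqrt(n)(xbar^2 - mu^2) should be approximately
# N(0, (2*mu)^2) since the variance of each observation is 1:
z = np.sqrt(n) * (xbar ** 2 - mu ** 2)
print(z.std())                       # should be near 2*mu = 4
```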

EXERCISE 6.5   Consider an i.i.d. sample of size $n$ from the bivariate population with pdf $f(x_1,x_2)=\frac{1}{\theta_1^2 \theta_2}\frac{1}{x_2}e^{-(\frac{x_1}{\theta_1x_2}+\frac{x_2}{\theta_1\theta_2})}$, $x_1,x_2>0$. Compute the MLE of $\theta=(\theta_1,\theta_2)$. Find the Cramér-Rao lower bound and the asymptotic variance of $\widehat{\theta}$.

EXERCISE 6.6   Consider a sample $\lbrace x_i\rbrace_{i=1}^n$ from $N_p(\mu,\Sigma_0)$ where $\Sigma_0$ is known. Compute the Cramér-Rao lower bound for $\mu$. Can you derive a minimal variance unbiased estimator for $\mu$?

EXERCISE 6.7   Let $ X \sim N_p (\mu, \Sigma) $ where $\Sigma$ is unknown but we know
$\Sigma=\mathop{\hbox{diag}}(\sigma_{11},\sigma_{22},\dots,\sigma_{pp})$. From an i.i.d. sample of size $n$, find the MLE of $\mu$ and of $\Sigma$.
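A simulation sketch for this exercise ($\mu$, the diagonal of $\Sigma$, the seed, and the sample size are assumptions of the demo): with a diagonal $\Sigma$ the log-likelihood separates coordinatewise, so each component behaves like a univariate normal sample, which suggests the sample mean and the $1/n$ sample variances as candidate MLEs.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 5_000, 3
mu = np.array([0.0, 1.0, -1.0])            # assumed values for the demo
sig = np.array([1.0, 4.0, 0.25])           # assumed diagonal of Sigma
x = rng.normal(mu, np.sqrt(sig), size=(n, p))

# Candidate MLEs from the coordinatewise-separated log-likelihood:
mu_hat = x.mean(axis=0)
sig_hat = x.var(axis=0)                    # ddof=0: the 1/n variance (MLE)
print(mu_hat, sig_hat)
```

Note that `np.var` with its default `ddof=0` returns the $1/n$ variance, which is the MLE, rather than the unbiased $1/(n-1)$ version.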

EXERCISE 6.8   Reconsider the setup of the previous exercise. Suppose that

\begin{displaymath}\Sigma=\mathop{\hbox{diag}}(\sigma_{11},\sigma_{22},\dots,
\sigma_{pp}).\end{displaymath}

Can you derive in this case the Cramér-Rao lower bound for $\theta^{\top}=(\mu_1\dots\mu_p,\sigma_{11}\dots\sigma_{pp})$?

EXERCISE 6.9   Prove Theorem 6.1. Hint: start from $\frac{\partial }{\partial \theta } E(t^{\top}) =
\frac{\partial }{\partial \theta } \int t^{\top}(\data{X};\theta) L(\data{X};\theta)
d\data{X}$, then permute integral and derivatives and note that $s(\data{X};\theta )=\frac{1}{L(\data{X};\theta)} \frac{\partial }{\partial \theta } L(\data{X};\theta) $.

EXERCISE 6.10   Prove expression (6.12).
(Hint: start from $E(s(\data{X};\theta)) = \int \frac{1}{L(\data{X};\theta)}
\frac{\partial}{\partial \theta}L(\data{X};\theta)\, L(\data{X};\theta)\,
d\data{X}$ and then permute integral and derivative.)