EXERCISE 6.1
Consider a uniform distribution on the interval $[0,\theta]$. What is the MLE of $\theta$? (Hint: the maximization here cannot be performed by means of derivatives. Here the support of $X$ depends on $\theta$!)
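
A small numerical sketch (not part of the original exercise text) of why differentiation fails: for $\theta \ge \max_i x_i$ the likelihood $\theta^{-n}$ is strictly decreasing in $\theta$, and it vanishes for $\theta < \max_i x_i$, so the likelihood is maximized at the sample maximum. The Python snippet below, with an assumed true value of $\theta$, illustrates this by evaluating the log-likelihood on a grid.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 2.0                    # assumed "true" value, for illustration only
x = rng.uniform(0.0, theta_true, size=50)

def log_likelihood(theta, x):
    """Log-likelihood of U[0, theta]: -n*log(theta) if theta >= max(x), else -inf."""
    if theta < x.max():
        return -np.inf              # some observation would fall outside the support
    return -x.size * np.log(theta)

# The log-likelihood is maximal at theta = max(x), not at a zero of its derivative.
grid = np.linspace(0.5, 4.0, 2000)
values = [log_likelihood(t, x) for t in grid]
print("argmax over the grid:", grid[int(np.argmax(values))])
print("sample maximum      :", x.max())
```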
EXERCISE 6.2
Consider an i.i.d. sample of size $n$ from the bivariate population with pdf $f(x_1,x_2;\theta)$. Compute the MLE of $\theta$. Find the Cramér-Rao lower bound. Is it possible to derive a minimal variance unbiased estimator of $\theta$?
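
The pdf of this exercise is not reproduced above, so the following sketch uses a stand-in model purely for illustration: independent exponential margins with means $\theta_1$ and $\theta_2$ (an assumption, not necessarily the density intended in the book). In that model the MLEs are the componentwise sample means, and the Monte Carlo check below compares their variances with the Cramér-Rao lower bound $\theta_j^2/n$, which they attain.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = np.array([1.5, 0.7])        # assumed true (theta_1, theta_2), illustration only
n, reps = 200, 5000

# In the stand-in model the MLE of each exponential mean is the coordinate-wise sample mean.
mles = np.empty((reps, 2))
for r in range(reps):
    x1 = rng.exponential(scale=theta[0], size=n)
    x2 = rng.exponential(scale=theta[1], size=n)
    mles[r] = x1.mean(), x2.mean()

print("empirical variances of the MLEs:", mles.var(axis=0))
print("Cramér-Rao lower bound         :", theta**2 / n)   # Fisher info per obs. is 1/theta_j^2
```

Because the bound is attained exactly (not just asymptotically), the sample means are minimum variance unbiased estimators in this stand-in model.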
EXERCISE 6.3
Show that the MLE of Example 6.1 is a minimal variance estimator for any finite sample size $n$ (i.e., without applying Theorem 6.3).
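
One possible route, sketched as a worked equation under the assumption that Example 6.1 refers to the sample mean $\bar{x}$ of an $N_p(\mu,\Sigma)$ sample with known $\Sigma$ (the formula itself is not reproduced above): compare the exact finite-sample variance of $\bar{x}$ with the Cramér-Rao bound,

$$\operatorname{Var}(\bar{x})=\frac{1}{n}\,\Sigma,\qquad \mathcal{F}_n=n\,\Sigma^{-1}\quad\Longrightarrow\quad \operatorname{Var}(\bar{x})=\mathcal{F}_n^{-1},$$

so under this assumed model the bound is attained for every finite $n$, and no unbiased estimator of $\mu$ can have a smaller variance.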
EXERCISE 6.4
We know from Example 6.4 the Fisher information corresponding to the MLE of Example 6.1. By Theorem 6.3, this leads to the asymptotic distribution of that MLE. Can you give an analogous result for the square of the MLE in the univariate case $p = 1$?
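
A hedged pointer, not part of the original text: the standard tool for the square of an asymptotically normal estimator is the delta method. In the univariate case, if $\sqrt{n}(\bar{x}-\mu)\stackrel{\mathcal{L}}{\longrightarrow}N(0,\sigma^2)$ and $\mu\neq 0$, then $\sqrt{n}(\bar{x}^2-\mu^2)\stackrel{\mathcal{L}}{\longrightarrow}N(0,4\mu^2\sigma^2)$. The short Monte Carlo check below, with assumed values of $\mu$ and $\sigma$, illustrates this limiting variance.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma = 2.0, 1.0                # assumed true values, for illustration only
n, reps = 500, 20000

# Sampling distribution of sqrt(n) * (xbar^2 - mu^2) over many replications.
xbar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
stat = np.sqrt(n) * (xbar**2 - mu**2)

print("empirical variance     :", stat.var())
print("delta-method prediction:", 4 * mu**2 * sigma**2)
```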
EXERCISE 6.5
Consider an i.i.d. sample of size $n$ from the bivariate population with pdf $f(x_1,x_2;\theta)$. Compute the MLE of $\theta$. Find the Cramér-Rao lower bound and the asymptotic variance of $\widehat{\theta}$.
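
Because the pdf of this exercise is likewise not reproduced, the sketch below only illustrates the generic recipe one would apply once the density is known: minimize the negative log-likelihood numerically and read off a rough estimate of the asymptotic variance of $\widehat{\theta}$ from the inverse Hessian. The density used here (independent exponential margins) is a placeholder, not the one from the book.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
theta_true = np.array([1.5, 0.7])   # assumed true values, illustration only
n = 500
# Placeholder data: independent exponential margins (NOT the pdf of the exercise).
x1 = rng.exponential(scale=theta_true[0], size=n)
x2 = rng.exponential(scale=theta_true[1], size=n)

def neg_loglik(theta):
    """Negative log-likelihood of the placeholder model."""
    t1, t2 = theta
    return n * np.log(t1) + x1.sum() / t1 + n * np.log(t2) + x2.sum() / t2

res = minimize(neg_loglik, x0=[1.0, 1.0], method="L-BFGS-B",
               bounds=[(1e-6, None), (1e-6, None)])
print("numerical MLE          :", res.x)
# The limited-memory inverse-Hessian is only a rough stand-in for the inverse
# observed information; the exact asymptotic covariance here is diag(theta_j^2)/n.
print("approx. asymptotic cov :\n", res.hess_inv.todense())
print("theoretical value      :\n", np.diag(theta_true**2 / n))
```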
EXERCISE 6.6
Consider a sample $\{x_i\}_{i=1}^n$ from a parametric model in which part of the parameter is known. Compute the Cramér-Rao lower bound for the unknown part. Can you derive a minimal variance unbiased estimator for it?
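
If, as the wording suggests, the sample comes from a multivariate normal distribution with known covariance matrix $\Sigma_0$ (an assumption here, since the model is not reproduced above), then the Cramér-Rao bound for the mean is $\Sigma_0/n$, and it is attained by the sample mean $\bar{x}$. The Monte Carlo check below, with assumed values of the mean and $\Sigma_0$, compares the empirical covariance of $\bar{x}$ with this bound.

```python
import numpy as np

rng = np.random.default_rng(4)
mu = np.array([0.0, 1.0, -1.0])                  # assumed true mean, illustration only
sigma0 = np.array([[2.0, 0.5, 0.0],
                   [0.5, 1.0, 0.3],
                   [0.0, 0.3, 1.5]])             # assumed known covariance matrix
n, reps = 100, 20000

# Empirical covariance of the sample mean over many independent samples.
means = np.array([rng.multivariate_normal(mu, sigma0, size=n).mean(axis=0)
                  for _ in range(reps)])
print("empirical cov of x-bar:\n", np.cov(means, rowvar=False).round(4))
print("Sigma_0 / n           :\n", (sigma0 / n).round(4))
```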
EXERCISE 6.7
Let $X$ follow a parametric distribution in which a parameter is unknown, but for which a structural constraint on that parameter is known. From an i.i.d. sample of size $n$, find the MLE of each of the model parameters.
EXERCISE 6.8
Reconsider the setup of the previous exercise and suppose, in addition, that the constraint takes a particular form. Can you derive in this case the Cramér-Rao lower bound for the unknown parameter?
EXERCISE 6.9
Prove Theorem 6.1. Hint: start from the appropriate integral identity, then permute integral and derivatives.
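
As an illustration of the hinted technique only (the omitted parts of the hint are not reproduced above), differentiating the normalizing identity $\int L(X;\theta)\,dX=1$ under the integral sign yields the zero-mean property of the score function $s(X;\theta)=\partial\log L(X;\theta)/\partial\theta$:

$$0=\frac{\partial}{\partial\theta}\int L(X;\theta)\,dX=\int\frac{\partial L(X;\theta)}{\partial\theta}\,dX=\int\frac{\partial\log L(X;\theta)}{\partial\theta}\,L(X;\theta)\,dX=E\{s(X;\theta)\},$$

which is the kind of step the permutation of integral and derivative makes available.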
EXERCISE 6.10
Prove expression (6.12). (Hint: start from a suitable integral identity and then permute integral and derivative.)
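
If expression (6.12) is the second-derivative form of the Fisher information, $\mathcal{F}_n=-E\{\partial^2\log L(X;\theta)/\partial\theta\,\partial\theta^{\top}\}$ (an assumption here, since the display itself is not reproduced), then the hinted permutation applied to the identity $\int s(X;\theta)\,L(X;\theta)\,dX=0$ gives

$$0=\frac{\partial}{\partial\theta^{\top}}\int s\,L\,dX=\int\frac{\partial s}{\partial\theta^{\top}}\,L\,dX+\int s\,s^{\top}L\,dX,$$

so that $E(s\,s^{\top})=-E\{\partial s/\partial\theta^{\top}\}=-E\{\partial^2\log L/\partial\theta\,\partial\theta^{\top}\}$, i.e. the two usual expressions for $\mathcal{F}_n$ coincide.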