12.4 Exercises

EXERCISE 12.1   Prove Theorem 12.2 (a) and 12.2 (b).

EXERCISE 12.2   Apply the rule from Theorem 12.2 (b) for $p=1$ and compare the result with that of Example 12.3.

EXERCISE 12.3   Calculate the ML discrimination rule based on observations of a one-dimensional variable with an exponential distribution.
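As a numerical sanity check for this exercise, the following sketch compares the log-likelihoods of two exponential populations; the rate values $\lambda_1,\lambda_2$ and all names are hypothetical stand-ins, not part of the exercise.

```python
import math

def ml_rule_exponential(x, lam1, lam2):
    # Allocate x to population 1 iff its likelihood under Exp(lam1)
    # exceeds that under Exp(lam2); density f_j(x) = lam_j * exp(-lam_j * x).
    logf1 = math.log(lam1) - lam1 * x
    logf2 = math.log(lam2) - lam2 * x
    return 1 if logf1 > logf2 else 2

# For lam1 > lam2 the ML rule reduces to a threshold:
# allocate to population 1 iff x < log(lam1 / lam2) / (lam1 - lam2).
lam1, lam2 = 2.0, 0.5
threshold = math.log(lam1 / lam2) / (lam1 - lam2)
```

The threshold form follows from setting the two log-densities equal and solving for $x$.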

EXERCISE 12.4   Calculate the ML discrimination rule based on observations of a two-dimensional random variable, where the first component has an exponential distribution and the second has a Bernoulli (alternative) distribution. What is the difference between the discrimination rule obtained in this exercise and the Bayes discrimination rule?


EXERCISE 12.5   Apply the Bayes rule to the car data (Table B.3) in order to discriminate between Japanese, European and U.S. cars, i.e., $J=3$. Consider only the ``miles per gallon'' variable and take the relative frequencies as prior probabilities.

EXERCISE 12.6   Compute Fisher's linear discrimination function for the 20 bank notes from Example 11.6. Apply it to the entire bank data set. How many observations are misclassified?
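The bank data themselves are not reproduced here, so the following sketch applies Fisher's rule $a=\data{W}^{-1}(\overline{x}_{1}-\overline{x}_{2})$ to two hypothetical, well-separated simulated groups; the simulated data and all function names are stand-ins for illustration only.

```python
import random

random.seed(0)

# Hypothetical stand-ins for two bank-note groups: 20 observations, 2 variables
g1 = [(random.gauss(0.0, 0.5), random.gauss(0.0, 0.5)) for _ in range(20)]
g2 = [(random.gauss(2.0, 0.5), random.gauss(2.0, 0.5)) for _ in range(20)]

def mean(pts):
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def scatter(pts, m):
    # Within-group sum-of-squares (2x2 scatter matrix about the group mean)
    s = [[0.0, 0.0], [0.0, 0.0]]
    for x, y in pts:
        dx, dy = x - m[0], y - m[1]
        s[0][0] += dx * dx; s[0][1] += dx * dy
        s[1][0] += dy * dx; s[1][1] += dy * dy
    return s

m1, m2 = mean(g1), mean(g2)
W = scatter(g1, m1)
S2 = scatter(g2, m2)
for i in range(2):
    for j in range(2):
        W[i][j] += S2[i][j]

# Fisher's direction a = W^{-1}(m1 - m2), via the explicit 2x2 inverse
d = (m1[0] - m2[0], m1[1] - m2[1])
det = W[0][0] * W[1][1] - W[0][1] * W[1][0]
a = ((W[1][1] * d[0] - W[0][1] * d[1]) / det,
     (-W[1][0] * d[0] + W[0][0] * d[1]) / det)

# Allocate x to group 1 iff a'(x - xbar) > 0, with xbar the midpoint of the means
xbar = ((m1[0] + m2[0]) / 2, (m1[1] + m2[1]) / 2)
def classify(x):
    score = a[0] * (x[0] - xbar[0]) + a[1] * (x[1] - xbar[1])
    return 1 if score > 0 else 2
```

Counting misclassifications then amounts to applying `classify` to each observation and comparing with its true group label.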

EXERCISE 12.7   Use Fisher's linear discrimination function on the WAIS data set (Table B.12) and evaluate the results by estimating the probabilities of misclassification via re-substitution.

EXERCISE 12.8   Show that in Example 12.6
(a) $\data{W}=100\left(\data{S}_{g}+\data{S}_{f}\right)$, where $\data{S}_{g}$ and $\data{S}_{f}$ denote the empirical covariances (3.6) and (3.5) w.r.t. the genuine and counterfeit bank notes,
(b) $\data{B}=100\left\{({\overline x}_g-\overline x)({\overline x}_g-\overline x)^{\top}+({\overline x}_f-\overline x)({\overline x}_f-\overline x)^{\top}\right\}$, where $\overline x = {\textstyle \frac{1}{2}}({\overline x}_g + {\overline x}_f)$,
(c) $a=\data{W}^{-1}(\overline{x}_{g}-\overline{x}_{f})$.
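The scatter identities in (a) and (b) can be checked numerically in one dimension before proving them; in the sketch below the simulated measurements are hypothetical stand-ins for a single bank-note variable (100 notes per group, as in Example 12.6).

```python
import random

random.seed(1)
n = 100  # 100 genuine and 100 counterfeit notes, as in Example 12.6

# Hypothetical one-dimensional stand-ins for one bank-note variable
g = [random.gauss(215.0, 0.4) for _ in range(n)]  # "genuine"
f = [random.gauss(214.8, 0.4) for _ in range(n)]  # "counterfeit"

def emp_var(x):
    # Empirical variance with divisor n, as in (3.5)/(3.6)
    m = sum(x) / len(x)
    return sum((xi - m) ** 2 for xi in x) / len(x)

def within_ss(x):
    # Sum of squared deviations about the group mean
    m = sum(x) / len(x)
    return sum((xi - m) ** 2 for xi in x)

W = within_ss(g) + within_ss(f)   # within-group scatter; part (a): W = 100 (S_g + S_f)

mg, mf = sum(g) / n, sum(f) / n
m = (mg + mf) / 2                 # overall mean (equal group sizes)
B = n * ((mg - m) ** 2 + (mf - m) ** 2)   # between-group scatter; part (b) in 1-D
```

With equal group sizes, the total scatter about the overall mean decomposes exactly as $\data{W}+\data{B}$, which the code can confirm.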

EXERCISE 12.9   Recalculate Example 12.3 with the prior probability $\pi_1=\frac{1}{3}$ and $C(2\vert 1)=2C(1\vert 2)$.
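The cost-weighted Bayes rule allocates $x$ to $\Pi_1$ iff $C(2\vert 1)\,\pi_1 f_1(x) > C(1\vert 2)\,\pi_2 f_2(x)$. A minimal sketch with these priors and costs follows; since the densities of Example 12.3 are not restated here, two equal-variance normal populations with hypothetical means are assumed for illustration.

```python
from math import exp, pi as PI, sqrt

def phi(x, mu, sigma):
    # Normal density; hypothetical N(mu, sigma^2) populations stand in
    # for the densities of Example 12.3.
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * PI))

pi1, pi2 = 1 / 3, 2 / 3
c21, c12 = 2.0, 1.0      # C(2|1) = 2 C(1|2)

def bayes_rule(x, mu1=0.0, mu2=2.0, sigma=1.0):
    # Allocate x to R1 iff C(2|1) pi1 f1(x) > C(1|2) pi2 f2(x).
    # Note: with these values c21*pi1 = c12*pi2 = 2/3, so costs and priors
    # offset and the boundary is the midpoint of the (assumed) normal means.
    return 1 if c21 * pi1 * phi(x, mu1, sigma) > c12 * pi2 * phi(x, mu2, sigma) else 2
```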

EXERCISE 12.10   Explain the effect of changing $\pi_1$ or $C(1\vert 2)$ on the relative location of the regions $R_j$, $j=1,2$.

EXERCISE 12.11   Prove that Fisher's linear discrimination function is identical to the ML rule when the covariance matrices are identical $(J=2)$.

EXERCISE 12.12   Suppose that $x \in \{ 0,1,2,3,4,5,6,7,8,9,10 \}$ and
\begin{eqnarray*}
\Pi_1 &:& X \sim {\rm Bi}(10,0.2) \quad \textrm{with the prior probability } \pi_1=0.5;\\
\Pi_2 &:& X \sim {\rm Bi}(10,0.3) \quad \textrm{with the prior probability } \pi_2=0.3;\\
\Pi_3 &:& X \sim {\rm Bi}(10,0.5) \quad \textrm{with the prior probability } \pi_3=0.2.
\end{eqnarray*}

Determine the sets $R_1$, $R_2$ and $R_3$. (Use the Bayes discriminant rule.)
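Since $x$ takes only eleven values, the regions can be found by evaluating $\pi_j f_j(x)$ for each $x$ and taking the maximizing population. A minimal sketch (the helper names are hypothetical; `math.comb` requires Python 3.8+):

```python
from math import comb

def binom_pmf(x, n, p):
    # Binomial probability mass function Bi(n, p)
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

# (p_j, pi_j) for the three populations of Exercise 12.12
pops = [(0.2, 0.5), (0.3, 0.3), (0.5, 0.2)]

# Bayes discriminant rule: allocate x to the population maximizing pi_j f_j(x)
regions = {1: [], 2: [], 3: []}
for x in range(11):
    scores = [pi_j * binom_pmf(x, 10, p_j) for p_j, pi_j in pops]
    regions[1 + scores.index(max(scores))].append(x)
```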