5.3 Hotelling $T^2$-Distribution

Suppose that $Y\in \mathbb{R}^p$ is a standard normal random vector, i.e., $Y \sim N_p(0,\data{I})$, independent of the random matrix $\data{M}\sim W_p(\data{I},n)$. What is the distribution of $Y^{\top} \data{M}^{-1}Y$? The answer is provided by the Hotelling $T^2$-distribution: $n\, Y^{\top} \data{M}^{-1}Y$ has a Hotelling $T^2(p,n)$ distribution.
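This construction is easy to simulate. The following is a minimal Monte Carlo sketch (assuming NumPy and SciPy; the values of $p$, $n$ and the number of replications are arbitrary illustrative choices, not from the text) that draws $Y$ and $\data{M}$ as above and compares the empirical quantiles of $n\,Y^{\top}\data{M}^{-1}Y$ with the scaled $F$-quantiles implied by Theorem 5.9 below.

\begin{verbatim}
# Monte Carlo sketch of n Y' M^{-1} Y with Y ~ N_p(0, I), M ~ W_p(I, n).
# Assumes NumPy/SciPy; p, n and n_rep are illustrative values only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
p, n, n_rep = 3, 20, 20000

t2 = np.empty(n_rep)
for r in range(n_rep):
    y = rng.standard_normal(p)               # Y ~ N_p(0, I)
    x = rng.standard_normal((n, p))          # rows x_i ~ N_p(0, I)
    m = x.T @ x                              # M = sum_i x_i x_i' ~ W_p(I, n)
    t2[r] = n * y @ np.linalg.solve(m, y)    # n Y' M^{-1} Y ~ T^2(p, n)

# Empirical quantiles vs. np/(n-p+1) * F_{p, n-p+1} quantiles (Theorem 5.9)
q = [0.50, 0.90, 0.95]
print(np.quantile(t2, q))
print(n * p / (n - p + 1) * stats.f.ppf(q, p, n - p + 1))
\end{verbatim}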

The Hotelling $T^2$-distribution is a generalization of the Student $t$-distribution. The general multinormal distribution $N(\mu,\Sigma)$ is considered in Theorem 5.8. The Hotelling $T^2$-distribution will play a central role in hypothesis testing in Chapter 7.

THEOREM 5.8   If $ X \sim N_p (\mu, \Sigma) $ is independent of $ \data{M}\sim W_p(\Sigma ,n)$, then

\begin{displaymath}n(X-\mu )^{\top} \data{M}^{-1}(X-\mu )\sim T^2(p,n).\end{displaymath}

COROLLARY 5.3   If $\overline x$ is the mean of a sample of size $n$ drawn from a normal population $N_p(\mu,\Sigma)$ and $\data{S}$ is the sample covariance matrix, then
\begin{displaymath}
(n-1)(\overline x-\mu )^{\top} \data{S}^{-1}(\overline x-\mu )
= n(\overline x-\mu)^{\top} \data{S}^{-1}_u(\overline x-\mu )\sim T^2(p,n-1).
\end{displaymath} (5.17)
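As a quick numerical illustration of (5.17), the two forms of the statistic indeed coincide. This is a sketch assuming NumPy; $\mu$, $\Sigma$ and the sample size are arbitrary illustrative choices.

\begin{verbatim}
# Both forms of the T^2 statistic in (5.17); NumPy assumed,
# mu, Sigma and the sample size are illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
p, n = 2, 50
mu = np.zeros(p)
x = rng.multivariate_normal(mu, np.eye(p), size=n)   # sample from N_p(mu, Sigma)

xbar = x.mean(axis=0)
s = (x - xbar).T @ (x - xbar) / n            # S   (divisor n)
s_u = n / (n - 1) * s                        # S_u (divisor n-1)

t2_s  = (n - 1) * (xbar - mu) @ np.linalg.solve(s, xbar - mu)
t2_su = n * (xbar - mu) @ np.linalg.solve(s_u, xbar - mu)
print(t2_s, t2_su)                           # equal up to rounding; ~ T^2(p, n-1)
\end{verbatim}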

Recall that $\data{S}_{u}=\frac{n}{n-1} \data{S}$ is an unbiased estimator of the covariance matrix. A connection between the Hotelling $T^2$- and the $F$-distribution is given by the next theorem.

THEOREM 5.9  

\begin{displaymath}T^2(p,n)=\frac{np }{n-p+1 }\ \ F_{p,n-p+1}.\end{displaymath}
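In practice, Theorem 5.9 is what makes the $T^2$-distribution usable: its critical values can be read off $F$ tables or computed in software, which is how the tests of Chapter 7 are carried out. A minimal sketch (SciPy assumed; $p$, $n$ and $\alpha$ are illustrative values):

\begin{verbatim}
# Critical value of T^2(p, n) via Theorem 5.9; SciPy assumed,
# p, n and alpha are illustrative values.
from scipy import stats

p, n, alpha = 3, 20, 0.05
t2_crit = n * p / (n - p + 1) * stats.f.ppf(1 - alpha, p, n - p + 1)
print(t2_crit)   # reject when the observed T^2 statistic exceeds this value
\end{verbatim}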

EXAMPLE 5.5   In the univariate case ($p=1$), this theorem boils down to the well-known result:

\begin{displaymath}\left(\frac{\bar{x}-\mu}{\sqrt{\data{S}_u}/\sqrt{n}}\right)^2 \sim T^2
(1,n-1) = F_{1,n-1}=t^2_{n-1}\end{displaymath}
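To spell out the step from Theorem 5.9 (a short verification added here): setting $p=1$ and replacing $n$ by $n-1$ gives

\begin{displaymath}
T^2(1,n-1)=\frac{(n-1)\cdot 1}{(n-1)-1+1}\;F_{1,n-1}=F_{1,n-1},
\end{displaymath}

and since the square of a $t_{n-1}$-distributed random variable has an $F_{1,n-1}$-distribution, the chain $T^2(1,n-1)=F_{1,n-1}=t^2_{n-1}$ follows.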

For further details on the Hotelling $T^2$-distribution see Mardia et al. (1979). The next corollary follows immediately from (3.23), (3.24) and from Theorem 5.8. It will be useful for testing linear restrictions in multinormal populations.

COROLLARY 5.4   Consider a linear transformation of $X \sim N_p(\mu,\Sigma)$, namely $Y = \data{A}X$, where $\data{A}$ is a $(q\times p)$ matrix of rank $q \leq p$. If $\overline x$ and $\data{S}_X$ are the sample mean and the sample covariance matrix of a sample of size $n$ from $N_p(\mu,\Sigma)$, we have

\begin{eqnarray*}
\overline y & = & \data{A}\overline x \sim N_q \left( \data{A}\mu, \frac{1}{n}\data{A}\Sigma\data{A}^{\top} \right)\\
n\data{S}_y & = & n\data{A}\data{S}_X\data{A}^{\top} \sim W_q ( \data{A}\Sigma\data{A}^{\top} , n-1)
\end{eqnarray*}




It follows that
\begin{displaymath}
(n-1)(\data{A}\overline x - \data{A}\mu)^{\top} (\data{A}\data{S}_X\data{A}^{\top})^{-1}(\data{A}\overline x - \data{A}\mu)\sim T^2(q,n-1).
\end{displaymath}
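For instance, a restriction such as $\mu_1=\mu_2$ can be tested with $\data{A}=(1,-1,0,\ldots,0)$. The following sketch (NumPy/SciPy assumed; $\data{A}$, the sample and the hypothesized mean are illustrative choices, not from the text) computes the statistic of Corollary 5.4 and a $p$-value via Theorem 5.9.

\begin{verbatim}
# T^2 statistic of Corollary 5.4 for a linear restriction A mu = A mu_0.
# NumPy/SciPy assumed; A, the sample and mu_0 are illustrative choices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
p, n = 3, 40
mu0 = np.zeros(p)                                    # hypothesized mean
x = rng.multivariate_normal(mu0, np.eye(p), size=n)  # sample from N_p(mu, Sigma)

a = np.array([[1.0, -1.0, 0.0]])           # A (q x p), q = 1: restriction mu_1 = mu_2
q = a.shape[0]

xbar = x.mean(axis=0)
s_x = (x - xbar).T @ (x - xbar) / n        # S_X (divisor n)
d = a @ (xbar - mu0)                       # A xbar - A mu_0
t2 = (n - 1) * d @ np.linalg.solve(a @ s_x @ a.T, d)    # ~ T^2(q, n-1) under H_0

# p-value via Theorem 5.9: T^2(q, n-1) = (n-1)q/(n-q) F_{q, n-q}
f_stat = (n - q) / ((n - 1) * q) * t2
print(t2, 1 - stats.f.cdf(f_stat, q, n - q))
\end{verbatim}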

The $T^2$-distribution is closely connected to the univariate $t$-statistic. In Example 5.4 we described the manner in which the Wishart distribution generalizes the $\chi^2$-distribution. We can write (5.17) as:

\begin{displaymath}
T^2 =
\sqrt{n}(\overline x-\mu )^{\top}
\left(
\frac{\sum_{i=1}^n (x_i-\overline x)(x_i-\overline x)^{\top}}{n-1}
\right)^{-1}\sqrt{n}(\overline x-\mu )
\end{displaymath}

which is of the form

\begin{displaymath}
\left( \begin{array}{c} \textrm{multivariate normal}\\ \textrm{random vector} \end{array}\right)^{\top}
\left( \begin{array}{c} \textrm{Wishart random matrix}\\ \textrm{divided by its}\\ \textrm{degrees of freedom} \end{array}\right)^{-1}
\left( \begin{array}{c} \textrm{multivariate normal}\\ \textrm{random vector} \end{array}\right).
\end{displaymath}

This is analogous to

\begin{displaymath}
t^2=\sqrt{n}(\overline x-\mu ) (s^2)^{-1} \sqrt{n}(\overline x-\mu )
\end{displaymath}

or

\begin{displaymath}
\left( \begin{array}{c} \textrm{normal}\\ \textrm{random variable} \end{array}\right)
\left( \begin{array}{c} \chi^2\textrm{-random variable}\\ \textrm{divided by its}\\ \textrm{degrees of freedom} \end{array}\right)^{-1}
\left( \begin{array}{c} \textrm{normal}\\ \textrm{random variable} \end{array}\right)
\end{displaymath}

for the univariate case. Since the multivariate normal and Wishart random variables are independently distributed, their joint distribution is the product of the marginal normal and Wishart distributions. Using calculus, the distribution of $T^2$ as given above can be derived from this joint distribution.

Summary
$\ast$
Hotelling's $T^2$-distribution is a generalization of the $t$-distribution. In particular $T^2(1,n) = t^2_{n}$.
$\ast$
$(n-1)(\overline x-\mu )^{\top} \data{S}^{-1}(\overline x-\mu)$ has a $T^2(p,n-1)$ distribution.
$\ast$
The relation between Hotelling's $T^2$- and Fisher's $F$-distribution is given by $T^2(p,n)=\frac{np}{n-p+1}\,F_{p,n-p+1}$.