3.4 Estimating the Tail-dependence Coefficient

Suppose $ X,\;X^{(1)},\dots,X^{(m)}$ are i.i.d. bivariate random vectors with distribution function $ F$ and copula $ C.$ We assume continuous marginal distribution functions $ F_i,\;i=1,2.$ Tests for tail dependence or tail independence are given, for example, in Ledford and Tawn (1996) or Draisma et al. (2004).

We consider the following three (non-)parametric estimators for the upper and lower tail-dependence coefficients $ \lambda_U$ and $ \lambda_L.$ These estimators have been discussed in Huang (1992) and Schmidt and Stadtmüller (2003). Let $ C_m$ be the empirical copula defined by:

$\displaystyle C_m(u,v)=F_m(F^{-1}_{1m}(u),F^{-1}_{2m}(v)),$ (3.7)

with $ F_m$ and $ F_{im}$ denoting the empirical distribution functions corresponding to $ F$ and $ F_i,$ $ i=1,2,$ respectively. Let $ R^{(j)}_{m1}$ and $ R^{(j)}_{m2}$ be the rank of $ X^{(j)}_1$ and $ X^{(j)}_2,\;j=1,\dots,m,$ respectively. The first estimators are based on formulas (3.1) and (3.2):
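For illustration, the empirical copula in (3.7) can be evaluated directly from the coordinatewise ranks; a minimal Python sketch (NumPy assumed, tie-free sample assumed, and the function name `empirical_copula` is ours):

```python
import numpy as np

def empirical_copula(x1, x2, u, v):
    """Evaluate C_m(u, v) from a bivariate sample via ranks:
    C_m(u, v) = (1/m) * #{j : R_{m1}^{(j)} <= m*u and R_{m2}^{(j)} <= m*v}.
    Assumes no ties among the observations."""
    m = len(x1)
    r1 = np.argsort(np.argsort(x1)) + 1  # ranks R_{m1}^{(j)}, 1 = smallest
    r2 = np.argsort(np.argsort(x2)) + 1  # ranks R_{m2}^{(j)}
    return np.mean((r1 <= m * u) & (r2 <= m * v))
```

For a comonotone sample ($X_2^{(j)}$ an increasing function of $X_1^{(j)}$) this returns roughly $\min(u,v)$, and for a countermonotone sample roughly $\max(u+v-1,0)$, as expected for the Fréchet bounds.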


$\displaystyle \hat{\lambda}^{(1)}_{U,m}=\frac{m}{k}C_m\Big(\Big(1-\frac{k}{m},1\Big]\times\Big(1-\frac{k}{m},1\Big]\Big)=\frac{1}{k}\sum_{j=1}^m I\big(R^{(j)}_{m1}>m-k,\;R^{(j)}_{m2}>m-k\big)$ (3.8)

and



$\displaystyle \hat{\lambda}^{(1)}_{L,m}=\frac{m}{k}C_m\Big(\frac{k}{m},\frac{k}{m}\Big)=\frac{1}{k}\sum_{j=1}^m I\big(R^{(j)}_{m1}\leq k,\;R^{(j)}_{m2}\leq k\big),$ (3.9)

where $ k=k(m)\rightarrow\infty$ and $ k/m\rightarrow 0$ as $ m\rightarrow\infty,$ and the first expression in (3.8) has to be understood as the empirical copula-measure of the interval $ (1-k/m,1]\times(1-k/m,1].$ The second type of estimator is already well known in multivariate extreme-value theory (Huang; 1992). We only provide the estimator for the upper TDC.
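The rank-based estimators (3.8) and (3.9) are straightforward to implement; a minimal Python sketch (NumPy assumed, ties ignored; the function names are ours):

```python
import numpy as np

def _ranks(x):
    """Coordinatewise ranks (1 = smallest); ties are ignored in this sketch."""
    return np.argsort(np.argsort(x)) + 1

def tdc_upper_1(x1, x2, k):
    """Upper-TDC estimator (3.8): (1/k) * #{j : R1 > m-k and R2 > m-k}."""
    m = len(x1)
    r1, r2 = _ranks(x1), _ranks(x2)
    return np.sum((r1 > m - k) & (r2 > m - k)) / k

def tdc_lower_1(x1, x2, k):
    """Lower-TDC estimator (3.9): (1/k) * #{j : R1 <= k and R2 <= k}."""
    r1, r2 = _ranks(x1), _ranks(x2)
    return np.sum((r1 <= k) & (r2 <= k)) / k
```

For a comonotone sample both estimators equal one for every admissible $k$; in practice one varies $k$ and looks for a plateau of the estimate, reflecting the variance-bias trade-off discussed below.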



$\displaystyle \hat{\lambda}^{(2)}_{U,m}=2-\frac{m}{k}\Big\{1-C_m\Big(1-\frac{k}{m},1-\frac{k}{m}\Big)\Big\}=2-\frac{1}{k}\sum_{j=1}^m I\big(R^{(j)}_{m1}>m-k\;\textrm{or}\;R^{(j)}_{m2}>m-k\big),$ (3.10)

with $ k=k(m)\rightarrow\infty$ and $ k/m\rightarrow 0$ as $ m\rightarrow\infty.$ The optimal choice of $ k$ is related to the usual variance-bias trade-off, and we refer the reader to Peng (1998) for more details. Strong consistency and asymptotic normality of both types of nonparametric estimators are also addressed in the latter three references.
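A corresponding sketch of estimator (3.10), under the same assumptions (NumPy, no ties; the function name is ours):

```python
import numpy as np

def tdc_upper_2(x1, x2, k):
    """Huang-type upper-TDC estimator (3.10):
    2 - (1/k) * #{j : R1 > m-k or R2 > m-k}."""
    m = len(x1)
    r1 = np.argsort(np.argsort(x1)) + 1  # ranks of the first coordinate
    r2 = np.argsort(np.argsort(x2)) + 1  # ranks of the second coordinate
    return 2.0 - np.sum((r1 > m - k) | (r2 > m - k)) / k
```

Note that for a comonotone sample the "or" count equals $k$, giving $2-1=1$, while for a countermonotone sample (with $k\le m/2$) it equals $2k$, giving $2-2=0$, matching the behaviour of (3.8).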

Now we focus on an elliptically-contoured bivariate random vector $ X.$ In the presence of tail dependence, the preceding arguments justify restricting attention to elliptical distributions having a regularly-varying density generator with regular variation index $ \alpha.$ This implies that the distribution function of $ \vert\vert X\vert\vert _2$ also has a regularly-varying tail with index $ \alpha $. Formula (3.6) shows that the upper and lower tail-dependence coefficients $ \lambda_U$ and $ \lambda_L$ depend only on the regular variation index $ \alpha $ and the ``correlation'' coefficient $ \rho.$ Hence, we propose the following parametric estimator for $ \lambda_U$ and $ \lambda_L$:

$\displaystyle \hat{\lambda}^{(3)}_{U,m}=\hat{\lambda}^{(3)}_{L,m}=\lambda^{(3)}_U(\hat{\alpha}_m,\hat{\rho}_m).$ (3.11)

Several robust estimators $ \hat{\rho}_m$ for $ \rho$ are provided in the literature, such as estimators based on techniques of multivariate trimming (Hahn, Mason, and Weiner; 1991), minimum-volume ellipsoid estimators (Rousseeuw and van Zomeren; 1990), and least-squares estimators (Frahm et al.; 2002).

For more details regarding the relationship between the regular variation index $ \alpha,$ the density generator, and the random variable $ \vert\vert X\vert\vert _2,$ we refer to Schmidt (2002b). Observe that even though the estimator for the regular variation index $ \alpha $ might be unbiased, the TDC estimator $ \hat{\lambda}^{(3)}_{U,m}$ is biased due to the integral transform.
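Formula (3.6) is not reproduced in this excerpt; assuming it takes the representation known for elliptical distributions with a regularly-varying generator (cf. Schmidt, 2002b), namely $\lambda=\int_0^{h(\rho)}\frac{u^\alpha}{\sqrt{1-u^2}}\,du\Big/\int_0^1\frac{u^\alpha}{\sqrt{1-u^2}}\,du$ with $h(\rho)=\sqrt{(1+\rho)/2}$, the plug-in estimator (3.11) can be sketched as follows (SciPy assumed; the function names are ours):

```python
import numpy as np
from scipy.integrate import quad

def lambda_elliptical(alpha, rho):
    """TDC lambda(alpha, rho) for a regularly varying elliptical distribution.
    The substitution u = sin(t) turns u**alpha / sqrt(1 - u**2) du into
    sin(t)**alpha dt, removing the integrable singularity at u = 1."""
    h = np.sqrt((1.0 + rho) / 2.0)
    integrand = lambda t: np.sin(t) ** alpha
    num, _ = quad(integrand, 0.0, np.arcsin(h))
    den, _ = quad(integrand, 0.0, np.pi / 2.0)
    return num / den

def tdc_parametric(alpha_hat, rho_hat):
    """Plug-in estimator (3.11): evaluate lambda at (alpha_hat, rho_hat)."""
    return lambda_elliptical(alpha_hat, rho_hat)
```

As a sanity check, $\lambda(\alpha,1)=1$ for every $\alpha$, and $\lambda$ decreases in $\alpha$ for fixed $\rho<1$: a lighter tail (larger $\alpha$) yields weaker tail dependence.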