Index

absolute regularity condition
16.1.1
additive outliers
18.1.1.2
AFPE
16.2.2
agglom algorithm
9.2.1
$\alpha$-mixing
16.1.1
ANOVA
8.1
ASE
16.1.2
ASEP
16.1.2
asymptotic final prediction error
see AFPE
asymptotic mean squared error
16.2.2
asymptotic MISE
16.2.2
average squared error
see ASE
of prediction
see ASEP
backfitting
GAM
7.1.3
GPLM
6.1.2.0.3
bandwidth choice
16.2.2
bandwidth selection
8.2.2 | 16.1.2 | 16.2.2
cross-validation
16.1.2
Silverman's rule-of-thumb
16.2.2
Bera-Jarque test
16.1.3
Berkson error
3.2
$\beta$-mixing
16.1.1
biplots
correspondence analysis
13.3.6
breakdown point
2.1.2
CAFPE
16.2.2
CART
10.
density estimation
10.5.3
example
10.5.1
growing the tree
10.1
plotting the result
10.4
pruning the tree
10.2
selecting the final tree
10.3
censoring
5.1
classification and regression trees
see CART
cluster analysis
9.
average linkage method
9.2.1.3
centroid method
9.2.1.4
complete linkage method
9.2.1.2
hierarchical
9.2
agglomerative
9.2.1
divisive
9.2.2
median method
9.2.1.5
nonhierarchical
9.3
adaptive K-means
9.3.2
fuzzy C-means
9.3.4
hard C-means
9.3.3
K-means
9.3.1
similarity of objects
9.1.2
single linkage method
9.2.1.1
Ward method
9.2.1.6
compare two
9.3.4
computation
Nadaraya-Watson estimates
16.1.1
confidence intervals
Nadaraya-Watson estimator
16.1.4
constraints
GPLM
6.4.3
contingency table
13.1
controlled-variable model
3.2
correspondence analysis
13.
biplots
13.3.6
XploRe implementation
13.2 | 13.3.2
Cox regression
5.3
hypothesis testing
5.3.3
credit scoring
GPLM
6.2.2
cross-validation
10.3 | 16.2.2
curse of dimensionality
16.2.1
data preparation
multiple time series
17.1.1
density estimation
CART
10.5.3
derivative estimation
16.1.5
diagnostics
flexible time series
16.1.3
distance
$L_p$
9.1.1
Euclidean
9.1.1
Mahalanobis
9.1.1
maximum
9.1.1
distance measures
9.1.1
DPLS
11.
computing
11.3
example
11.4
overview
11.1
theory
11.2
dynamic partial least squares
see DPLS
EIV
3.
calculation
3.3.3
linear EIV models
3.1
nonlinear EIV models
3.2
partially linear EIV models
3.3
regression calibration
3.2.1
simulation extrapolation
3.2.2
variance of error known
3.3.1
variance of error unknown
3.3.2
vector of explanatory variables
3.1.2
endogenous variable
4.1
error
asymptotic final prediction
see AFPE
asymptotic mean squared
16.2.2
average squared
see ASE
of prediction
see ASEP
corrected asymptotic final prediction
see CAFPE
final prediction
see FPE
integrated squared
see ISE
mean integrated squared
see MISE
asymptotic
16.2.2
error model
3.2
errors in variables
see EIV
estimate
leave-one-out cross-validation
16.1.2
estimation
simultaneous-equations
4.2
estimator
local linear
16.1.1
local quadratic
16.1.5
exogenous regressor
4.1
ExploRing Persistence
15.
$\phi$-mixing
16.1.1
final prediction error
see FPE
financial time series
15.
flexible time series
16. | 16.2
bandwidth choice
16.2.2
bandwidth selection
16.1.2
confidence intervals
16.1.4
derivative estimation
16.1.5
diagnostics
16.1.3 | 16.2.3
plot
16.2.3
selection of lags
16.2.2
FPE
16.2.2
corrected asymptotic
see CAFPE
GAM
6.1.1 | 7. | 7.1.1
backfitting
7.1.3
data preparation
7.2
estimation
7.3 | 7.3.4
interactive
7.4
marginal integration
7.1.2
orthogonal series
7.1.4
testing
7.6
theory
7.1
generalized additive models
see GAM
generalized linear model
6.
generalized partial linear models
see GPLM
GLM
3.2 | 6.
GPLM
6. | 7.
backfitting
6.1.2.0.3
estimation
6.3 | 6.3.1
likelihood
6.1.2
models
6.3
output display
6.4.7 | 6.5.2
profile likelihood
6.1.2.0.1
specification test
6.5.3
Speckman estimator
6.1.2.0.2
grid
GPLM
6.4.2
growth regression
8. | 8.1
hazard regression
5.
Cox proportional hazards model
5.3
hypothesis testing
5.3.3
data structure
5.1
Kaplan-Meier estimator
5.2
Hurst coefficient
15.2.1
Hurst exponent
14.1
income distribution
8.
innovation outliers
18.1.1.3
integrated squared error
see ISE
ISE
16.1.2
Kalman filter
18.2
optimality of
18.2.2
robust
see robust Kalman filter
Kaplan-Meier estimator
5.2
kernel density estimation
multivariate
8.2.3
univariate
8.2.2
least median of squares
2.1.2
least trimmed squares
see LTS
leave-one-out cross-validation estimate
16.1.2
likelihood ratio test
GPLM
6.5.3
link function
6.1
local linear estimator
16.1.1 | 16.2.1
rate of convergence
16.2.1
variance of
16.1.4
local quadratic estimator
16.1.5
long-memory analysis
14.
example
15.5
tests
14.2 | 15.3
long-memory process
14.1
spectrum of
14.1
LTS
2. | 2.1.2
marginal integration
GAM
7.1.2
mean integrated squared error
see MISE
MISE
16.1.2 | 16.2.2
asymptotic
16.2.2
model
additive partially linear
7.1.1
additive with interaction
7.1.1
aggregate money demand
17.
dynamic panel data
12.
dynamic partial least squares
see DPLS
generalized additive
see GAM
generalized linear
see GLM | 6.
generalized partial linear
see GPLM
Klein's
4.1
nonlinear autoregressive
see NAR
nonlinear time series
see flexible time series
partial linear
6.1.1
simultaneous-equations
see simultaneous-equations model
vector autoregressive
17.
money-demand system
4.3
multiple time series
17.
analysis in XploRe
17.1.2
data preparation
17.1.1
estimation
17.3.2
plot of
17.2.1
structural analysis
17.4
validation
17.3.3
Nadaraya-Watson estimator
16.2.1
computation
16.1.1
rate of convergence
16.2.1
variance of
16.1.4
NAR
higher order
16.2
measurement error model
3.2
nonlinear autoregressive model
see NAR
nonlinear time series analysis
see flexible time series
optional parameters
GPLM
6.4
orthogonal series
GAM
7.1.4
outliers
18.1.1
additive
18.1.1.2
innovation
18.1.1.3
other types of
18.1.1.4
output
GPLM
6.4.7 | 6.5.2
panel data
12.
dynamic panel data model
12.4
fixed effects model
12.3
unit root tests
12.5
plot
CART
10.4
flexible time series
16.2.3
multiple time series
17.2.1
product kernel
16.2.1
profile likelihood
GPLM
6.1.2.0.1
quantile function
1.1
conditional
1.2.1
quantile regression
1.
asymptotic normality
1.4.1
confidence intervals
1.4.3
definition
1.2.1
equivariance
1.3.1
monotonic transformations
1.3.2
rank test
1.4.3
rank test inversion
1.4.3
robustness
1.3.3
statistical inference
1.4
Wald test
1.4.2
quantile regression process
1.4.1
rankscore function
1.4.3
regression tree
see CART
rIC filter
18.4
rLS filter
18.3
robust Kalman filter
18.
rIC filter
18.4
rLS filter
18.3
rqfit
1.5.1
rrstest
1.5.2
simultaneous-equations
computation
4.2.5
estimation
4.2
example
4.2.5 | 4.3
identification
4.2.1
Klein's model
4.1
three-stage least squares
4.2.4
two-stage least squares
4.2.3
simultaneous-equations model
4.
singular value decomposition
13.1.1
specification test
GPLM
6.5.3
Speckman estimator
GPLM
6.1.2.0.2
start values
GPLM
6.4.2
state-space model
18.
statistical characteristics
GPLM
6.5.1
strong mixing
16.1.1
test
Bera-Jarque
16.1.3
time series
absolute regularity condition
16.1.1
$\alpha$-mixing
16.1.1
antipersistent
14.1
$\beta$-mixing
16.1.1
$\phi$-mixing
16.1.1
financial
see financial time series
flexible
see flexible time series
fractionally integrated
14.1 | 15.2.2
long-memory
14.1 | 15.1
multiple
see multiple time series
nonlinear
see flexible time series
nonstationary
14.1
persistence
15.1
strong mixing
16.1.1
uniform mixing
16.1.1
uncovered interest parity
12.
uniform mixing
16.1.1
WARPing
16.1.1 | 16.1.5 | 16.2.1
weights
GPLM
6.4.3