

8. (Non) Linear Regression Modeling

Pavel Čížek

We will study causal relationships of a known form between random variables. Given a model, we distinguish one or more dependent (endogenous) variables $ \boldsymbol{Y} = (Y_1,\ldots,Y_l), l \in \mathbb{N}$, which are explained by a model, and independent (exogenous, explanatory) variables $ \boldsymbol{X} = (X_1,\ldots,X_p), p \in \mathbb{N}$, which explain or predict the dependent variables by means of the model. Such relationships and models are commonly referred to as regression models.

A regression model describes the relationship between the dependent and independent variables. In this chapter, we restrict our attention to models with a form known up to a finite number of unspecified parameters. The model can be either linear in parameters,

$\displaystyle \boldsymbol{Y} = \boldsymbol{X}^{\top} \boldsymbol{\beta_0} + \varepsilon\;,$    

or nonlinear,

$\displaystyle \boldsymbol{Y} = h(\boldsymbol{X}, \boldsymbol{\beta_0}) + \varepsilon\;,$    

where $ \boldsymbol{\beta_0}$ represents a vector or a matrix of unknown parameters, $ \varepsilon$ is the error term (fluctuations caused by unobservable quantities), and $ h$ is a known regression function. The unknown parameters $ \boldsymbol{\beta_0}$ are to be estimated from observed realizations $ \{ y_{1i},\ldots, y_{li} \}_{i=1}^n$ and $ \{ x_{1i},\ldots, x_{pi} \}_{i=1}^n$ of the random variables $ \boldsymbol{Y}$ and $ \boldsymbol{X}$.
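As a concrete illustration of both model types, the following sketch simulates data from a linear and a nonlinear specification and recovers the unknown parameters by least squares. The particular regression function $ h$, the parameter values, and the Gauss-Newton iteration are illustrative assumptions, not taken from this chapter.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
x = rng.uniform(0.0, 1.0, size=n)

# --- Linear model: Y = X^T beta_0 + eps (intercept and one slope;
#     the parameter values are hypothetical) ---
beta0_lin = np.array([1.0, 2.0])
X = np.column_stack([np.ones(n), x])                  # design matrix
y_lin = X @ beta0_lin + rng.normal(scale=0.1, size=n)

# Ordinary least squares: minimizes ||y - X b||^2 over b
beta_lin = np.linalg.lstsq(X, y_lin, rcond=None)[0]

# --- Nonlinear model: Y = h(X, beta_0) + eps, here with the
#     (assumed) regression function h(x, b) = b[0] * exp(b[1] * x) ---
beta0_nl = np.array([0.5, 1.5])
h = lambda x, b: b[0] * np.exp(b[1] * x)
y_nl = h(x, beta0_nl) + rng.normal(scale=0.05, size=n)

# Gauss-Newton iteration for nonlinear least squares: linearize h
# around the current estimate and solve the resulting linear
# least-squares problem for the update step
b = np.array([1.0, 1.0])                              # starting value
for _ in range(100):
    e = np.exp(b[1] * x)
    J = np.column_stack([e, b[0] * x * e])            # Jacobian of h w.r.t. b
    step = np.linalg.solve(J.T @ J, J.T @ (y_nl - h(x, b)))
    b = b + step
    if np.linalg.norm(step) < 1e-10:
        break
```

With enough observations and a moderate noise level, both estimates land close to the true parameter vectors; in the nonlinear case the quality of the starting value matters, since Gauss-Newton is only locally convergent.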

Here we discuss both kinds of models, primarily from the least-squares estimation point of view: linear models in Sect. 8.1 and nonlinear models in Sect. 8.2. Both sections present the main facts concerning the fitting of these models and the relevant inference, with particular emphasis on estimating these regression models under near and exact multicollinearity.
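To give a flavor of the multicollinearity issue mentioned above, the sketch below constructs two nearly collinear regressors, for which $ \boldsymbol{X}^{\top}\boldsymbol{X}$ is close to singular, and applies a ridge-type estimator — one standard remedy, used here purely as an illustration with an arbitrarily chosen ridge parameter, not necessarily the chapter's own estimator.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100

# Two nearly collinear regressors: x2 is almost an exact copy of x1,
# so X^T X is close to singular (near multicollinearity)
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=1e-6, size=n)
X = np.column_stack([x1, x2])
y = X @ np.array([1.0, 1.0]) + rng.normal(scale=0.1, size=n)

# The condition number of X^T X explodes as the regressors
# become collinear, making OLS numerically unstable
cond = np.linalg.cond(X.T @ X)

# Ridge-type estimator: (X^T X + k I)^{-1} X^T y -- the penalty k > 0
# regularizes the nearly singular X^T X and stabilizes the estimate
# (k = 0.1 is an arbitrary illustrative choice)
k = 0.1
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(2), X.T @ y)
```

The ill-conditioned directions of $ \boldsymbol{X}^{\top}\boldsymbol{X}$ are exactly those along which the data carry almost no information; the ridge penalty shrinks the estimate in those directions while leaving the well-identified directions nearly untouched.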


