Let us first consider the PLM in order to discuss the main ideas
for the estimation of the components of the model.
Our goal is to find the coefficient vector $\beta$ and the values
$m(t_1),\ldots,m(t_n)$ in the following structural equation:
\[
E(Y \mid U, T) = U^\top \beta + m(T). \tag{7.2}
\]
Now, let us take expectations conditioned
on $T$, i.e.
\[
E(Y \mid T) = E(U \mid T)^\top \beta + m(T). \tag{7.3}
\]
Subtracting (7.3) from (7.2) and using $Y = E(Y \mid U, T) + \varepsilon$ yields
\[
Y - E(Y \mid T) = \{U - E(U \mid T)\}^\top \beta + \varepsilon. \tag{7.4}
\]
Once we have calculated one component ($\beta$ or $m$) of a PLM, the
computation of the remaining component is straightforward.
There are two alternative approaches to this task: estimate the
parametric component $\beta$ first and then recover $m$, or estimate the
nonparametric component $m$ first and then recover $\beta$. We begin
with the first route.
Note in particular that $E(U \mid T)$ is a
$p$-dimensional column
vector just like $U$. Thus, $E(Y \mid T)$
and $E(U \mid T)$
in equation (7.4) can be replaced by nonparametric estimates
$\widehat{E}(Y \mid T)$ and $\widehat{E}(U \mid T)$.
Obviously, (7.4) is now simply a version of the standard
linear model.
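Spelled out, the plug-in version of (7.4) is just an ordinary least squares problem. Writing $\widetilde{Y}$ and $\widetilde{U}$ for the stacked residuals $Y_i - \widehat{E}(Y \mid T = t_i)$ and $U_i - \widehat{E}(U \mid T = t_i)^\top$, a sketch of the resulting estimator (assuming $\widetilde{U}$ has full column rank) is
\[
\widehat{\beta} = \bigl(\widetilde{U}^\top \widetilde{U}\bigr)^{-1} \widetilde{U}^\top \widetilde{Y}.
\]
When the conditional expectations are estimated by a linear smoother with smoother matrix $\mathbf{S}$ (as for Nadaraya-Watson regression), this can be written compactly with $\widetilde{U} = (\mathbf{I} - \mathbf{S})U$ and $\widetilde{Y} = (\mathbf{I} - \mathbf{S})Y$.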
Applying the familiar standard linear regression, we can easily
estimate the vector of coefficients $\beta$. Using this estimated
vector $\widehat{\beta}$ to replace $\beta$ in (7.2) allows us to estimate $m$
by nonparametric regression of $Y - U^\top \widehat{\beta}$
on $T$. This estimation idea has been studied by
Speckman (1988)
and Robinson (1988a) for univariate $T$ using
Nadaraya-Watson kernel regression. See Subsection 7.2.2
for more details.
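As an illustration, the two-step procedure just described can be sketched in a few lines of code. This is only a minimal sketch: the helper names (`nw_smooth`, `speckman_plm`) and the simulated data are our own and purely illustrative, not part of the original references.

```python
import numpy as np

def nw_smooth(t, y, h):
    """Nadaraya-Watson estimate of E(y | T = t_i) at every observed t_i (Gaussian kernel)."""
    w = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)  # (n, n) kernel weights
    return (w @ y) / w.sum(axis=1)

def speckman_plm(y, U, t, h=0.05):
    """Two-step estimator for the PLM E(Y | U, T) = U'beta + m(T) (Speckman/Robinson idea)."""
    # Step 1: nonparametric estimates of E(Y | T) and E(U | T) (columnwise).
    Ey = nw_smooth(t, y, h)
    EU = np.column_stack([nw_smooth(t, U[:, j], h) for j in range(U.shape[1])])
    # Step 2: the differenced equation (7.4) is a standard linear model,
    # so beta is estimated by OLS of Y - E(Y|T) on U - E(U|T).
    beta, *_ = np.linalg.lstsq(U - EU, y - Ey, rcond=None)
    # Step 3: recover m by smoothing the partial residuals Y - U'beta on T.
    m_hat = nw_smooth(t, y - U @ beta, h)
    return beta, m_hat

# Simulated example with beta = (1, -2) and m(t) = sin(2*pi*t).
rng = np.random.default_rng(0)
n = 400
t = rng.uniform(0, 1, n)
U = np.column_stack([t + rng.normal(0, 1, n), rng.normal(0, 1, n)])  # U may depend on T
y = U @ np.array([1.0, -2.0]) + np.sin(2 * np.pi * t) + rng.normal(0, 0.3, n)
beta_hat, m_hat = speckman_plm(y, U, t)
```

Because the first column of $U$ is correlated with $T$ in this simulation, naively regressing $Y$ on $U$ alone would bias the coefficients; the residual-on-residual regression of step 2 removes exactly this confounding.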
Under regularity conditions, $\widehat{\beta}$ can be shown
to be
$\sqrt{n}$-consistent for $\beta$ and asymptotically
normal, and there exists a consistent estimator of its limiting covariance matrix. The nonparametric
function $m$
can be estimated (in the univariate case)
with the usual univariate rate of convergence.
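The $\sqrt{n}$-rate can be made tangible with a small Monte Carlo experiment: quadrupling the sample size should roughly halve the average estimation error for $\widehat{\beta}$. The sketch below is illustrative only (simulated data; Gaussian-kernel smoother matrix; bandwidth held fixed across sample sizes purely for simplicity).

```python
import numpy as np

def plm_beta(y, U, t, h):
    """Speckman-type estimate of beta: OLS of smoother-residualized Y on residualized U."""
    w = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
    S = w / w.sum(axis=1, keepdims=True)          # Nadaraya-Watson smoother matrix
    beta, *_ = np.linalg.lstsq(U - S @ U, y - S @ y, rcond=None)
    return beta

def mean_error(n, reps=20, beta=np.array([1.0, -2.0]), h=0.1):
    """Average Euclidean error of beta_hat over `reps` simulated PLM samples of size n."""
    rng = np.random.default_rng(42)
    errs = []
    for _ in range(reps):
        t = rng.uniform(0, 1, n)
        U = rng.normal(size=(n, 2))
        y = U @ beta + np.sin(2 * np.pi * t) + rng.normal(0, 0.5, n)
        errs.append(np.linalg.norm(plm_beta(y, U, t, h) - beta))
    return float(np.mean(errs))

err_small, err_large = mean_error(100), mean_error(400)
# If beta_hat is sqrt(n)-consistent, err_large should be about half of err_small.
```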
In the following section, we review both estimation procedures for the
PLM and their extension to the GPLM in more detail. We restrict
ourselves to Nadaraya-Watson type regression for the
nonparametric component. It will become clear, however, that other
smoothing techniques can be used in the same way. In the later part of
this chapter we also discuss tests of the correct
specification of the GPLM (versus a parametric GLM).