STAT 804
Lecture 11
Likelihood Theory
First we review likelihood theory for conditional and full maximum likelihood estimation.
Suppose the data is $X = (Y,Z)$ and write the density of $X$ as
$$ f_X(x;\theta) = f_{Y|Z}(y|z;\theta)\, f_Z(z;\theta) . $$
Differentiate the identity
$$ \int f_{Y|Z}(y|z;\theta)\, dy = 1 $$
with respect to $\theta_j$ (the $j$th component of $\theta$) and pull the derivative under the integral sign to get
$$ \int \frac{\partial \log f_{Y|Z}(y|z;\theta)}{\partial\theta_j}\, f_{Y|Z}(y|z;\theta)\, dy = 0 , $$
where
$$ U_{Y|Z,j}(\theta) = \frac{\partial \log f_{Y|Z}(Y|Z;\theta)}{\partial\theta_j} $$
is the $j$th component of $U_{Y|Z}(\theta)$, the derivative of the log conditional likelihood; $U_{Y|Z}$ is called a conditional score. Since
$$ {\rm E}\bigl[\, U_{Y|Z}(\theta) \mid Z \,\bigr] = 0 $$
we may take expected values to see that
$$ {\rm E}\bigl[\, U_{Y|Z}(\theta) \,\bigr] = 0 . $$
It is also true that the other two scores
$$ U_X(\theta) = \frac{\partial \log f_X(X;\theta)}{\partial\theta} \qquad\text{and}\qquad U_Z(\theta) = \frac{\partial \log f_Z(Z;\theta)}{\partial\theta} $$
have mean 0 (when $\theta$ is the true value of the parameter).
Differentiate the identity a further time with respect to $\theta_k$ to get
$$ \int \left[ \frac{\partial^2 \log f_{Y|Z}(y|z;\theta)}{\partial\theta_j\,\partial\theta_k} + \frac{\partial \log f_{Y|Z}(y|z;\theta)}{\partial\theta_j}\, \frac{\partial \log f_{Y|Z}(y|z;\theta)}{\partial\theta_k} \right] f_{Y|Z}(y|z;\theta)\, dy = 0 . $$
We define the conditional Fisher information matrix $I_{Y|Z}(\theta|Z)$ to have $jk$th entry
$$ -\,{\rm E}\left[ \frac{\partial^2 \log f_{Y|Z}(Y|Z;\theta)}{\partial\theta_j\,\partial\theta_k} \,\Bigm|\, Z \right] $$
and get
$$ I_{Y|Z}(\theta|Z) = {\rm Var}\bigl(\, U_{Y|Z}(\theta) \mid Z \,\bigr) . $$
The corresponding identities based on $f_X$ and $f_Z$ are
$$ I_X(\theta) = {\rm Var}\bigl(\, U_X(\theta) \,\bigr) = -\,{\rm E}\left[ \frac{\partial^2 \log f_X(X;\theta)}{\partial\theta\,\partial\theta^T} \right] $$
and
$$ I_Z(\theta) = {\rm Var}\bigl(\, U_Z(\theta) \,\bigr) = -\,{\rm E}\left[ \frac{\partial^2 \log f_Z(Z;\theta)}{\partial\theta\,\partial\theta^T} \right] . $$
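As a quick illustration of these identities, here is a minimal simulation sketch (an assumed toy example, not part of the original notes) for the model $Y \mid Z \sim N(\theta Z, 1)$, for which $U_{Y|Z}(\theta) = Z(Y-\theta Z)$ and $I_{Y|Z}(\theta|Z) = Z^2$:

```python
import numpy as np

# Toy check (assumed example, not from the notes): Y | Z ~ N(theta * Z, 1), so
#   U_{Y|Z}(theta) = Z (Y - theta Z)   and   I_{Y|Z}(theta|Z) = Z^2.
rng = np.random.default_rng(0)
theta, z = 0.6, 1.5                          # true parameter and a fixed conditioning value
y = theta * z + rng.standard_normal(200_000)

score = z * (y - theta * z)                  # conditional score at the true theta

print(score.mean())   # ~ 0     : E[ U_{Y|Z}(theta) | Z ] = 0
print(score.var())    # ~ z**2  : Var( U_{Y|Z}(theta) | Z ) = I_{Y|Z}(theta|Z)
```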
Now let's look at the model
$$ X_t = \rho X_{t-1} + \epsilon_t , \qquad \epsilon_t \ \text{iid}\ N(0,\sigma^2) . $$
Putting $Z = X_0$ and $Y = (X_1,\ldots,X_{T-1})$ we find
$$ \log f_{Y|Z}(Y|X_0;\rho,\sigma) = -(T-1)\log\sigma - \frac{1}{2\sigma^2}\sum_{t=1}^{T-1}(X_t - \rho X_{t-1})^2 + \text{constant} , $$
so that the conditional score has components
$$ \frac{\partial \log f_{Y|Z}}{\partial\rho} = \frac{1}{\sigma^2}\sum_{t=1}^{T-1} X_{t-1}(X_t - \rho X_{t-1}) \qquad\text{and}\qquad \frac{\partial \log f_{Y|Z}}{\partial\sigma} = -\frac{T-1}{\sigma} + \frac{1}{\sigma^3}\sum_{t=1}^{T-1}(X_t - \rho X_{t-1})^2 . $$
Differentiating again gives the matrix of second derivatives
$$ \begin{pmatrix} -\dfrac{1}{\sigma^2}\displaystyle\sum_{t=1}^{T-1} X_{t-1}^2 & -\dfrac{2}{\sigma^3}\displaystyle\sum_{t=1}^{T-1} X_{t-1}(X_t-\rho X_{t-1}) \\[10pt] -\dfrac{2}{\sigma^3}\displaystyle\sum_{t=1}^{T-1} X_{t-1}(X_t-\rho X_{t-1}) & \dfrac{T-1}{\sigma^2} - \dfrac{3}{\sigma^4}\displaystyle\sum_{t=1}^{T-1}(X_t-\rho X_{t-1})^2 \end{pmatrix} . $$
Taking conditional expectations given $X_0$ gives
$$ I_{Y|Z}(\theta|X_0) = \begin{pmatrix} \dfrac{1}{\sigma^2}\displaystyle\sum_{t=1}^{T-1} {\rm E}(X_{t-1}^2 \mid X_0) & 0 \\[10pt] 0 & \dfrac{2(T-1)}{\sigma^2} \end{pmatrix} . $$
To compute ${\rm E}(X_{t}^2 \mid X_0)$ write
$$ X_t = \rho^t X_0 + \sum_{j=0}^{t-1} \rho^j \epsilon_{t-j} $$
and get
$$ {\rm E}(X_t^2 \mid X_0) = \rho^{2t} X_0^2 + \sigma_t^2 $$
with
$$ \sigma_t^2 = \sigma^2 \sum_{j=0}^{t-1} \rho^{2j} = \sigma^2\,\frac{1-\rho^{2t}}{1-\rho^2} . $$
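Equivalently, ${\rm E}(X_t^2 \mid X_0)$ satisfies the recursion ${\rm E}(X_t^2 \mid X_0) = \rho^2\,{\rm E}(X_{t-1}^2 \mid X_0) + \sigma^2$. The short sketch below (an illustration, not part of the original notes, with arbitrarily chosen values of $\rho$, $\sigma$ and $X_0$) checks numerically that the recursion and the closed form agree:

```python
# Numerical check (illustration, not from the notes) that the closed form
#   E(X_t^2 | X_0) = rho^(2t) X_0^2 + sigma^2 (1 - rho^(2t)) / (1 - rho^2)
# agrees with the recursion  E(X_t^2 | X_0) = rho^2 E(X_{t-1}^2 | X_0) + sigma^2.
rho, sigma, x0 = 0.7, 2.0, 1.3              # arbitrarily chosen values

m = x0**2                                   # E(X_0^2 | X_0)
for t in range(1, 11):
    m = rho**2 * m + sigma**2               # one step of the recursion
    closed = rho**(2 * t) * x0**2 + sigma**2 * (1 - rho**(2 * t)) / (1 - rho**2)
    assert abs(m - closed) < 1e-10
```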
You can check carefully that in fact
$$ \frac{1}{T}\sum_{t=1}^{T-1} {\rm E}(X_{t-1}^2 \mid X_0) $$
converges to some $\sigma_\infty^2$ as $T \to \infty$. This $\sigma_\infty^2$ satisfies
$$ \sigma_\infty^2 = \rho^2 \sigma_\infty^2 + \sigma^2 , $$
which gives
$$ \sigma_\infty^2 = \frac{\sigma^2}{1-\rho^2} . $$
It follows that
$$ \frac{1}{T}\, I_{Y|Z}(\theta|X_0) \to \begin{pmatrix} \dfrac{1}{1-\rho^2} & 0 \\[10pt] 0 & \dfrac{2}{\sigma^2} \end{pmatrix} . $$
Notice that although the conditional Fisher information might have been expected to depend on $X_0$, it does not, at least for long series.
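To see the limiting information at work, here is a minimal simulation sketch (an illustration added here, not part of the original notes): the conditional MLE of $\rho$ is the least-squares slope of $X_t$ on $X_{t-1}$, and across replications its variance should be close to $(1-\rho^2)/T$, the $(\rho,\rho)$ entry of the inverse of the limiting information divided by $T$.

```python
import numpy as np

# Simulation sketch: the conditional MLE of rho is the least-squares slope of
# X_t on X_{t-1}; its variance across replications should be close to
# (1 - rho^2) / T, the value predicted by the limiting Fisher information.
rng = np.random.default_rng(1)
rho, sigma, T, reps = 0.7, 1.0, 500, 2000

rho_hats = np.empty(reps)
for r in range(reps):
    x = np.empty(T)
    x[0] = rng.normal(scale=sigma / np.sqrt(1 - rho**2))   # stationary start (an assumption)
    for t in range(1, T):
        x[t] = rho * x[t - 1] + sigma * rng.standard_normal()
    rho_hats[r] = np.sum(x[:-1] * x[1:]) / np.sum(x[:-1] ** 2)

print(rho_hats.var())       # simulated variance of the estimate of rho
print((1 - rho**2) / T)     # (1 - rho^2)/T from the limiting information
```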
Richard Lockhart
2001-09-30