If $Y_1,\ldots,Y_n$ are iid Bernoulli($p$) then $X=\sum_{i=1}^n Y_i$ is Binomial($n,p$). We used various algebraic tactics to arrive at the conclusion that $X/n$ is the UMVUE of $p$.
This long, algebraically involved method of proving that $X/n$ is the UMVUE of $p$ is one special case of a general tactic.
In the binomial situation the conditional distribution of the data $Y_1,\ldots,Y_n$ given $X$ is the same for all values of $p$; we say this conditional distribution is free of $p$.
Defn: A statistic $T(X)$ is sufficient for the model $\{P_\theta;\ \theta\in\Theta\}$ if the conditional distribution of the data $X$ given $T=t$ is free of $\theta$.
Intuition: Why do the data tell us about $\theta$? Because different values of $\theta$ give different distributions to $X$. If two different values of $\theta$ correspond to the same joint density or cdf for $X$ then we cannot, even in principle, distinguish these two values of $\theta$ by examining $X$. We extend this notion as follows: if two values of $\theta$ give the same conditional distribution of $X$ given $T$, then observing $T$ in addition to $X$ does not improve our ability to distinguish the two values.
Mathematically precise version of this intuition: If $T(X)$ is a sufficient statistic then we can do the following. If $S(X)$ is any estimate or confidence interval or whatever for a given problem, but we only know the value of $T$, then:

1. Generate an observation $X^*$ from the conditional distribution of $X$ given $T=t$; since $T$ is sufficient this conditional distribution is completely known, even though $\theta$ is not.

2. Use $S(X^*)$ in place of $S(X)$. Since $X^*$ has the same distribution as $X$, the statistic $S(X^*)$ has the same distribution as $S(X)$ and so performs exactly as well.

You can carry out the first step only if the statistic $T$ is sufficient; otherwise you need to know the true value of $\theta$ to generate $X^*$.
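This two-step recipe is easy to carry out in code for the Bernoulli model: given the total number of successes $t$, all arrangements of $t$ ones among $n$ slots are equally likely, so a dataset $X^*$ can be regenerated without knowing $p$. A minimal sketch (the function name `regenerate` and the sample sizes are my own choices for illustration):

```python
import random

def regenerate(n, t, rng=random):
    """Draw X* from the conditional law of (X_1,...,X_n) given sum = t.

    Under iid Bernoulli sampling, all arrangements of t ones among n
    slots are equally likely given the sum, and this law is free of p,
    so no knowledge of p is needed.
    """
    xstar = [1] * t + [0] * (n - t)
    rng.shuffle(xstar)
    return xstar

# Demo: generate real data with some p, then regenerate without using p.
random.seed(1)
n, p = 10, 0.3
x = [1 if random.random() < p else 0 for _ in range(n)]
t = sum(x)
xstar = regenerate(n, t)
print(sum(xstar) == t)  # True: the sufficient statistic is preserved
```

Averaging any estimate $S(X^*)$ over many such regenerations approximates $E\{S(X)\mid T=t\}$, which is exactly the Rao-Blackwell improvement discussed below.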
Example 1: $Y_1,\ldots,Y_n$ iid Bernoulli($p$). Given $X=\sum Y_i = y$, the indexes of the $y$ successes have the same chance of being any one of the $\binom{n}{y}$ possible subsets of $\{1,\ldots,n\}$. This chance does not depend on $p$, so $X=\sum Y_i$ is a sufficient statistic.
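The claim that this conditional chance is free of $p$ can be checked by brute-force enumeration. The sketch below (with $n=4$, $y=2$ and two arbitrary values of $p$ chosen purely for illustration) computes $P(\text{pattern}\mid X=y)$ directly from the joint probabilities:

```python
from itertools import product
from math import comb

def cond_probs(n, y, p):
    """P(pattern | sum = y) for each 0-1 pattern, by direct enumeration."""
    joint = {x: p**sum(x) * (1 - p)**(n - sum(x))
             for x in product([0, 1], repeat=n)}
    total = sum(pr for x, pr in joint.items() if sum(x) == y)
    return {x: pr / total for x, pr in joint.items() if sum(x) == y}

n, y = 4, 2
for p in (0.2, 0.7):
    probs = cond_probs(n, y, p)
    # Every pattern with y successes gets probability 1 / C(n, y),
    # no matter what p is.
    assert all(abs(v - 1 / comb(n, y)) < 1e-12 for v in probs.values())
print("conditional law is uniform on all", comb(n, y), "subsets, for every p")
```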
Example 2: If $X_1,\ldots,X_n$ are iid $N(\mu,1)$ then the joint distribution of $(X_1,\ldots,X_n,\bar X)$ is multivariate normal with mean vector whose entries are all $\mu$ and variance-covariance matrix which can be partitioned as
$$\begin{pmatrix} I_{n\times n} & \tfrac{1}{n}\mathbf{1} \\ \tfrac{1}{n}\mathbf{1}^T & \tfrac{1}{n} \end{pmatrix},$$
where $\mathbf{1}$ is a column vector of $n$ ones. You can now compute the conditional means and variances of the $X_i$ given $\bar X = \bar x$ and use the fact that the conditional law is multivariate normal to prove that the conditional distribution of the data given $\bar X = \bar x$ is multivariate normal with mean vector all of whose entries are $\bar x$ and variance-covariance matrix given by $I - \tfrac{1}{n}\mathbf{1}\mathbf{1}^T$. Since this does not depend on $\mu$ we find that $\bar X$ is sufficient.
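Given the partitioned matrix above, the conditional variance-covariance matrix is the Schur complement $I - b\,c^{-1}b^T$ with $b=\tfrac{1}{n}\mathbf{1}$ and $c=\tfrac{1}{n}$. A small exact check using `fractions` (with $n=4$ chosen only to keep the output readable):

```python
from fractions import Fraction

n = 4
# Partitioned covariance of (X_1,...,X_n, Xbar) for iid N(mu, 1):
# Var(X_i) = 1, Cov(X_i, Xbar) = 1/n, Var(Xbar) = 1/n.
I = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
b = [Fraction(1, n)] * n          # Cov(X_i, Xbar)
c = Fraction(1, n)                # Var(Xbar)

# Schur complement: conditional covariance of the data given Xbar.
cond_cov = [[I[i][j] - b[i] * b[j] / c for j in range(n)] for i in range(n)]

# It equals I - (1/n) 1 1^T, free of mu (and of the observed xbar).
expected = [[Fraction(int(i == j)) - Fraction(1, n) for j in range(n)]
            for i in range(n)]
assert cond_cov == expected
print(cond_cov[0])
```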
WARNING: Whether or not a statistic is sufficient depends on the density function and on the parameter space $\Theta$.
Theorem (Rao-Blackwell): Suppose that $S(X)$ is a sufficient statistic for some model $\{P_\theta;\ \theta\in\Theta\}$. If $T$ is an estimate of some parameter $\phi(\theta)$ then:

1. $E(T|S)$ is a statistic; that is, it does not depend on $\theta$.

2. $E(T|S)$ has the same bias as $T$; in particular, if $T$ is unbiased so is $E(T|S)$.

3. $\mathrm{Var}_\theta\{E(T|S)\} \le \mathrm{Var}_\theta(T)$, so $E(T|S)$ is at least as good an estimate as $T$.
Proof: First review conditional distributions. The abstract definition of conditional expectation is:

Defn: $E(Y|X)$ is any function of $X$ such that $E\{R(X)E(Y|X)\} = E\{R(X)Y\}$ for every bounded function $R$.

Defn: $E(Y|X=x)$ is a function $g(x)$ such that $E(Y|X) = g(X)$.

Fact: If $(X,Y)$ has joint density $f_{X,Y}(x,y)$ and conditional density $f(y|x)$ then $g(x) = \int y f(y|x)\,dy$ satisfies these definitions.

Proof of Fact:
$$E\{R(X)g(X)\} = \int R(x)\left(\int y f(y|x)\,dy\right) f_X(x)\,dx = \int\!\!\int R(x)\,y\,f_{X,Y}(x,y)\,dy\,dx = E\{R(X)Y\}.$$
You should simply think of $E(Y|X)$ as what you get when you average $Y$ holding $X$ fixed. It behaves like an ordinary expected value, except that functions of $X$ alone act like constants: $E\{h(X)Y|X\} = h(X)E(Y|X)$.
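The defining property of $E(Y|X)$ can be verified concretely on a small discrete joint law. In the sketch below the pmf values are an arbitrary assumption for illustration; since $X$ takes only two values, checking $R(x)=1$ and $R(x)=x$ already covers every function of $X$:

```python
# A small discrete joint pmf for (X, Y); the numbers are illustrative only.
pmf = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

def g(x):
    """g(x) = E(Y | X = x), computed from the conditional pmf f(y|x)."""
    fx = sum(p for (a, b), p in pmf.items() if a == x)
    return sum(b * p for (a, b), p in pmf.items() if a == x) / fx

# Check the defining property: E[R(X) g(X)] = E[R(X) Y].
for R in (lambda x: 1.0, lambda x: x, lambda x: 3 * x - 2):
    lhs = sum(R(a) * g(a) * p for (a, b), p in pmf.items())
    rhs = sum(R(a) * b * p for (a, b), p in pmf.items())
    assert abs(lhs - rhs) < 1e-12
print("E[R(X) E(Y|X)] = E[R(X) Y] for each test function R")
```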
Proof of the Rao-Blackwell Theorem:

Step 1: The definition of sufficiency is that the conditional distribution of $X$ given $S$ does not depend on $\theta$. This means that $E\{T(X)|S\}$ does not depend on $\theta$.
Step 2: This step hinges on the following identity (called Adam's law by Jerzy Neyman - he used to say it comes before all the others):
$$E\{E(Y|X)\} = E(Y).$$
From this we deduce that
$$E_\theta\{E(T|S)\} = E_\theta(T),$$
so $E(T|S)$ has the same bias as $T$.
Step 3: This relies on the following very useful decomposition:
$$\mathrm{Var}(Y) = \mathrm{Var}\{E(Y|X)\} + E\{\mathrm{Var}(Y|X)\}.$$
(In regression courses we say that the total sum of squares is the sum of the regression sum of squares plus the residual sum of squares.) We apply this to the Rao-Blackwell theorem to get
$$\mathrm{Var}_\theta(T) = \mathrm{Var}_\theta\{E(T|S)\} + E_\theta\{\mathrm{Var}(T|S)\} \ge \mathrm{Var}_\theta\{E(T|S)\}.$$
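The total-variance decomposition can also be checked numerically on a toy discrete law (the pmf values below are an arbitrary assumption for illustration):

```python
from math import isclose

# Toy joint pmf for (X, Y); the numbers are illustrative only.
pmf = {(0, 0): 0.1, (0, 2): 0.3, (1, 1): 0.2, (1, 5): 0.4}

def E(h):
    """Expectation of h(X, Y) under the joint pmf."""
    return sum(h(x, y) * p for (x, y), p in pmf.items())

def cond(x):
    """Conditional mean and variance of Y given X = x."""
    fx = sum(p for (a, _), p in pmf.items() if a == x)
    m = sum(y * p for (a, y), p in pmf.items() if a == x) / fx
    v = sum((y - m) ** 2 * p for (a, y), p in pmf.items() if a == x) / fx
    return m, v

var_Y = E(lambda x, y: y**2) - E(lambda x, y: y) ** 2
var_of_cond_mean = (E(lambda x, y: cond(x)[0] ** 2)
                    - E(lambda x, y: cond(x)[0]) ** 2)
mean_of_cond_var = E(lambda x, y: cond(x)[1])

# Var(Y) = Var(E(Y|X)) + E(Var(Y|X))
assert isclose(var_Y, var_of_cond_mean + mean_of_cond_var)
print("Var(Y) = Var(E(Y|X)) + E(Var(Y|X)) checked numerically")
```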
Examples:

In the binomial problem $Y_1(1-Y_2)$ is an unbiased estimate of $p(1-p)$. We improve this by computing
$$E\{Y_1(1-Y_2)\mid X=x\} = P(Y_1=1, Y_2=0 \mid X=x) = \frac{\binom{n-2}{x-1}}{\binom{n}{x}} = \frac{x(n-x)}{n(n-1)},$$
an improved unbiased estimate of $p(1-p)$.
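This conditional expectation can be verified by enumeration: given $X=x$, all $\binom{n}{x}$ arrangements are equally likely, so $E\{Y_1(1-Y_2)\mid X=x\}$ is just the fraction of arrangements with a success in slot 1 and a failure in slot 2. A sketch with $n=5$ (chosen for illustration), using exact rational arithmetic:

```python
from itertools import product
from fractions import Fraction

n = 5

def cond_mean(x):
    """E{Y1(1-Y2) | sum = x} by enumerating equally likely arrangements."""
    pats = [y for y in product([0, 1], repeat=n) if sum(y) == x]
    return Fraction(sum(y[0] * (1 - y[1]) for y in pats), len(pats))

# Matches the closed form x(n-x) / (n(n-1)) for every possible x.
for x in range(n + 1):
    assert cond_mean(x) == Fraction(x * (n - x), n * (n - 1))
print("E{Y1(1-Y2)|X=x} = x(n-x)/(n(n-1)) for all x")
```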
Example: If $X_1,\ldots,X_n$ are iid $N(\mu,1)$ then $\bar X$ is sufficient and $X_1$ is an unbiased estimate of $\mu$. Now, from the conditional distribution computed above,
$$E(X_1\mid\bar X) = \bar X,$$
which is the UMVUE.
In the binomial example the log likelihood (at least the part depending on the parameters) was seen above to be a function of $X$ (and not of the original data $Y_1,\ldots,Y_n$ as well). In the normal example the log likelihood is, ignoring terms which don't contain $\mu$,
$$\mu\sum_i x_i - \frac{n\mu^2}{2} = n\mu\bar x - \frac{n\mu^2}{2},$$
a function of the data only through $\bar x$.
These are examples of the Factorization Criterion:

Theorem: If the model for data $X$ has density $f(x;\theta)$ then the statistic $S(X)$ is sufficient if and only if the density can be factored as
$$f(x;\theta) = g(S(x);\theta)\,h(x).$$
Proof: Find a statistic $T(x)$ such that $X$ is a one to one function of the pair $(S,T)$. Apply the change of variables formula to obtain the joint density $f_{S,T}$. If $f$ factors as above then $f_{S,T}(s,t;\theta) = g(s;\theta)\,\tilde h(s,t)$ for some function $\tilde h$ (the Jacobian does not involve $\theta$), so the conditional density of $T$ given $S=s$ is
$$\frac{f_{S,T}(s,t;\theta)}{\int f_{S,T}(s,u;\theta)\,du} = \frac{\tilde h(s,t)}{\int \tilde h(s,u)\,du},$$
which is free of $\theta$; hence $S$ is sufficient.
Example: If $X_1,\ldots,X_n$ are iid $N(\mu,\sigma^2)$ the joint density is
$$(2\pi\sigma^2)^{-n/2}\exp\left\{-\frac{1}{2\sigma^2}\sum_i x_i^2 + \frac{\mu}{\sigma^2}\sum_i x_i - \frac{n\mu^2}{2\sigma^2}\right\},$$
which is evidently a function of the data only through $\left(\sum_i x_i, \sum_i x_i^2\right)$; this pair of statistics is therefore sufficient.
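The factorization can be made concrete: two different datasets with the same values of $\sum x_i$ and $\sum x_i^2$ must have identical likelihoods at every $(\mu,\sigma)$. A sketch using the classic pair $\{1,5,6\}$ and $\{2,3,7\}$ (both have sum 12 and sum of squares 62; the grid of parameter values is arbitrary):

```python
from math import log, pi

def loglik(data, mu, sigma):
    """Gaussian log likelihood, written directly from the joint density."""
    n = len(data)
    return (-n / 2 * log(2 * pi * sigma**2)
            - sum((x - mu)**2 for x in data) / (2 * sigma**2))

# Two different datasets with the same sufficient statistic:
d1, d2 = [1, 5, 6], [2, 3, 7]
assert sum(d1) == sum(d2) and sum(x * x for x in d1) == sum(x * x for x in d2)

# The likelihood depends on the data only through (sum, sum of squares),
# so the two datasets are indistinguishable at every (mu, sigma).
for mu in (-1.0, 0.5, 4.0):
    for sigma in (0.5, 1.0, 3.0):
        assert abs(loglik(d1, mu, sigma) - loglik(d2, mu, sigma)) < 1e-9
print("identical likelihoods: (sum, sum of squares) is sufficient")
```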