Exercise 3
Question 1.
Let T1 and T2 be two independent (for example, obtained
from different samples) unbiased estimators of
a parameter θ, with variances V1 and
V2 respectively. Find the minimum-variance unbiased estimator of θ
among all linear combinations of T1 and T2.
What is its variance?
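A quick Monte Carlo sanity check for whatever weights you derive (a sketch under hypothetical assumptions: T1 and T2 are taken to be normal with the stated variances, and the weights below are the inverse-variance weights, which the exercise asks you to justify):

```python
import random

# Hypothetical setup: T1 ~ N(theta, V1), T2 ~ N(theta, V2), independent.
# Candidate weights proportional to 1/V_i (summing to 1, so the
# combination stays unbiased); the exercise asks you to show these are
# optimal and to compute the resulting variance.
random.seed(0)
theta, V1, V2 = 5.0, 1.0, 4.0
w1 = (1 / V1) / (1 / V1 + 1 / V2)  # = V2 / (V1 + V2)
w2 = 1 - w1                        # weights sum to 1 => unbiased

n_sim = 200_000
draws = [w1 * random.gauss(theta, V1 ** 0.5) + w2 * random.gauss(theta, V2 ** 0.5)
         for _ in range(n_sim)]
mean = sum(draws) / n_sim
var = sum((d - mean) ** 2 for d in draws) / n_sim
print(f"mean ~ {mean:.3f}, var ~ {var:.3f}")  # compare with your closed form
```

Comparing the printed variance with your closed-form answer is a useful check that the algebra came out right.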
Question 2.
Prove that if a UMVUE exists, it is unique, i.e. if
T1 and T2 are both UMVUEs for θ, then
T1 = T2 almost surely
(hint: consider, for example, (T1 + T2)/2).
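One way to begin unpacking the hint (a sketch of the key variance step, assuming both T1 and T2 attain the same minimal variance V; the remaining details are for you to fill in):

```latex
% T = (T1 + T2)/2 is unbiased, so by minimality of V,
V \le \operatorname{Var}\!\left(\frac{T_1 + T_2}{2}\right)
  = \frac{1}{4}\left(V + V + 2\,\operatorname{Cov}(T_1, T_2)\right),
% which gives Cov(T1, T2) >= V.  Cauchy-Schwarz gives Cov(T1, T2) <= V,
% hence Cov(T1, T2) = V, i.e. Corr(T1, T2) = 1.  Correlation 1 means
% T1 = a*T2 + b almost surely; matching means and variances then force
% a = 1 and b = 0.
```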
Question 3.
Let Y1,...,Yn be a sample from the uniform distribution
U[0,θ].
- Compare the MLE and the method-of-moments estimator of θ with
respect to the MSE criterion (to do this you will first need to derive
the distribution of the MLE of θ - it's really simple!).
- Is the MLE unbiased? If not, find an unbiased estimator based on the MLE.
Is it the UMVUE?
(hint: use the fact that ymax is a complete sufficient statistic).
Compare it with the MLE and the method-of-moments estimator.
- Consider the family of estimators αymax, where α
may depend on n but not on the data. Which of the estimators considered
in the previous parts belong to this family? Find the best (minimal-MSE)
estimator within the family.
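The MSE comparison in the first part can be sanity-checked numerically (a simulation sketch only; the exercise asks for the exact closed-form MSEs, and the numbers below should match them):

```python
import random

# Simulation sketch for a U[0, theta] sample: estimates the MSE of the
# MLE (the sample maximum) and of the method-of-moments estimator
# 2 * ybar.  The choices theta = 3 and n = 10 are arbitrary.
random.seed(1)
theta, n, n_sim = 3.0, 10, 100_000

mse_mle = mse_mom = 0.0
for _ in range(n_sim):
    y = [random.uniform(0, theta) for _ in range(n)]
    mse_mle += (max(y) - theta) ** 2
    mse_mom += (2 * sum(y) / n - theta) ** 2
mse_mle /= n_sim
mse_mom /= n_sim
print(f"MSE(MLE) ~ {mse_mle:.4f}   MSE(MoM) ~ {mse_mom:.4f}")
```

Running this for a few values of n also shows how quickly the two MSEs separate as the sample grows, which is worth reconciling with your formulas.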
Question 4.
Let y1,...,yn be a sample from the exponential
distribution Exp(θ) with rate θ, so E(Y) = 1/θ.
- Find UMVUEs for θ and 1/θ and check whether they achieve the
Cramér-Rao lower bound.
- (hint: recall that Y1+...+Yn ~ Gamma(α=n,β=θ))
- The survival function of the exponential distribution is defined as
S(t) = P(Y > t) = exp(-θt). Verify that, for a given t, a trivial unbiased
estimator of S(t) is 1 if y1 > t and 0 otherwise. Using it and
applying the Rao-Blackwell theorem, find the UMVUE for S(t).
- (hint: first derive
the conditional density of Y1 given Y1+...+Yn
via the joint density of Y1 and Y2+...+Yn)
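Before doing the Rao-Blackwell step, it is reassuring to confirm numerically that the trivial estimator really is unbiased (a Monte Carlo sketch; theta and t are arbitrary illustrative values):

```python
import math
import random

# Sketch checking the "trivial" estimator: for a fixed t, the indicator
# 1{Y1 > t} has expectation P(Y > t) = exp(-theta * t), so it is an
# unbiased estimator of the survival function S(t).  This is pure Monte
# Carlo, with no Rao-Blackwell conditioning yet.
random.seed(2)
theta, t, n_sim = 0.5, 1.0, 200_000

hits = sum(1 for _ in range(n_sim) if random.expovariate(theta) > t)
estimate = hits / n_sim
print(f"mean of 1{{Y1 > t}} ~ {estimate:.4f},  S(t) = {math.exp(-theta * t):.4f}")
```

The same simulation, rerun with the conditional expectation you obtain from Rao-Blackwell, should give the same mean but a visibly smaller variance.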