Let $X_1, X_2, \dots, X_n$ be a random sample from a distribution that has p.d.f. (or p.m.f. in the discrete case) $f(x:\theta)$, where $\theta \in \Omega$ is a parameter in the parameter space. Suppose $Y = u(X_1, X_2, \dots, X_n)$ is a sufficient statistic for $\theta$, and let $\{ f_Y(y:\theta) : \theta \in \Omega \}$ be a complete family. If $\operatorname{E}[\varphi(Y)] = \theta$, then $\varphi(Y)$ is the unique MVUE of $\theta$.
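As a concrete illustration (a standard textbook example, not part of the statement above), the theorem applies directly to Bernoulli sampling:

```latex
% Illustration (added example): Bernoulli trials.
% Y = \sum_i X_i is a complete sufficient statistic for \theta
% (one-parameter exponential family), and E[Y/n] = \theta,
% so the theorem identifies the sample mean as the unique MVUE.
\begin{align*}
  X_1, \dots, X_n &\overset{\text{iid}}{\sim} \operatorname{Bernoulli}(\theta),
  \qquad Y = \sum_{i=1}^{n} X_i, \\
  \operatorname{E}\!\left[\tfrac{Y}{n}\right] = \theta
  \ &\Longrightarrow\ \varphi(Y) = \tfrac{Y}{n} = \bar{X}
  \ \text{is the unique MVUE of } \theta.
\end{align*}
```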
Proof
By the Rao–Blackwell theorem, if $Z$ is an unbiased estimator of $\theta$, then $\varphi(Y) := \operatorname{E}[Z \mid Y]$ defines an unbiased estimator of $\theta$ with the property that its variance is not greater than that of $Z$.
Now we show that this function is unique. Suppose $W$ is another candidate MVUE of $\theta$. Then again $\psi(Y) := \operatorname{E}[W \mid Y]$ defines an unbiased estimator of $\theta$ with the property that its variance is not greater than that of $W$. Then

$$\operatorname{E}[\varphi(Y) - \psi(Y)] = 0, \qquad \theta \in \Omega.$$
Since $\{ f_Y(y:\theta) : \theta \in \Omega \}$ is a complete family,

$$\operatorname{E}[\varphi(Y) - \psi(Y)] = 0 \implies \varphi(y) - \psi(y) = 0, \qquad \theta \in \Omega,$$

and therefore $\varphi$ is the unique function of $Y$ with variance not greater than that of any other unbiased estimator. We conclude that $\varphi(Y)$ is the MVUE.
Example using a non-complete minimal sufficient statistic
An example of an improvable Rao–Blackwell improvement, when using a minimal sufficient statistic that is not complete, was provided by Galili and Meilijson in 2016.[4] Let $X_1, \dots, X_n$ be a random sample from a scale-uniform distribution $X \sim U((1-k)\theta, (1+k)\theta)$ with unknown mean $\operatorname{E}[X] = \theta$ and known design parameter $k \in (0,1)$. In the search for "best" possible unbiased estimators for $\theta$, it is natural to consider $X_1$ as an initial (crude) unbiased estimator for $\theta$ and then try to improve it. Since $X_1$ is not a function of $T = \left( X_{(1)}, X_{(n)} \right)$, the minimal sufficient statistic for $\theta$ (where $X_{(1)} = \min_i X_i$ and $X_{(n)} = \max_i X_i$), it may be improved using the Rao–Blackwell theorem as follows:

$$\hat{\theta}_{RB} = \operatorname{E}_\theta\!\left[ X_1 \mid X_{(1)}, X_{(n)} \right] = \frac{X_{(1)} + X_{(n)}}{2}.$$
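A quick Monte Carlo sketch can illustrate the improvement. The parameter values below ($\theta = 1$, $k = 0.5$, $n = 10$) and the simulation size are illustrative choices, not from the cited paper; NumPy is assumed available:

```python
# Monte Carlo sketch of the Rao-Blackwell improvement above.
# theta, k, n, and n_sim are illustrative choices, not from the source.
import numpy as np

def simulate(theta=1.0, k=0.5, n=10, n_sim=200_000, seed=0):
    rng = np.random.default_rng(seed)
    # X ~ U((1-k)*theta, (1+k)*theta): the scale-uniform model with mean theta
    x = rng.uniform((1 - k) * theta, (1 + k) * theta, size=(n_sim, n))
    crude = x[:, 0]                           # initial unbiased estimator X_1
    rb = (x.min(axis=1) + x.max(axis=1)) / 2  # E[X_1 | X_(1), X_(n)] = midrange
    return crude, rb

crude, rb = simulate()
print(crude.mean(), rb.mean())  # both close to theta = 1 (unbiasedness)
print(crude.var(), rb.var())    # the conditioned estimator has smaller variance
```

Both estimators are unbiased, but conditioning on the minimal sufficient statistic collapses the variance, exactly as the Rao–Blackwell theorem guarantees.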
However, the following unbiased estimator can be shown to have lower variance:

$$\hat{\theta}_{LV} = \frac{1}{k^2 \frac{n-1}{n+1} + 1} \cdot \frac{(1-k) X_{(1)} + (1+k) X_{(n)}}{2}.$$
And in fact, it could be even further improved when using the following estimator:

$$\hat{\theta}_{\text{BAYES}} = \frac{n+1}{n} \left[ 1 - \frac{\dfrac{X_{(1)}(1+k)}{X_{(n)}(1-k)} - 1}{\left( \dfrac{X_{(1)}(1+k)}{X_{(n)}(1-k)} \right)^{n+1} - 1} \right] \frac{X_{(n)}}{1+k}.$$
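The estimators can be compared numerically. The sketch below (illustrative parameter choices, not from Galili and Meilijson; NumPy assumed) computes all three from simulated samples; the paper reports that the last estimator improves further on mean squared error:

```python
# Monte Carlo comparison of the three estimators above.
# theta, k, n, and n_sim are illustrative, not taken from the cited paper.
import numpy as np

def estimators(theta=1.0, k=0.5, n=10, n_sim=200_000, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.uniform((1 - k) * theta, (1 + k) * theta, size=(n_sim, n))
    lo, hi = x.min(axis=1), x.max(axis=1)     # X_(1) and X_(n)
    rb = (lo + hi) / 2                        # Rao-Blackwellized midrange
    lv = ((1 - k) * lo + (1 + k) * hi) / 2 / (k**2 * (n - 1) / (n + 1) + 1)
    r = (lo * (1 + k)) / (hi * (1 - k))       # ratio appearing in the last form
    bayes = (n + 1) / n * (1 - (r - 1) / (r**(n + 1) - 1)) * hi / (1 + k)
    return rb, lv, bayes

rb, lv, bayes = estimators()
# rb and lv are unbiased; lv has strictly smaller variance than rb
print(rb.mean(), lv.mean())
print(rb.var(), lv.var(), ((bayes - 1.0) ** 2).mean())
```

The minimal sufficient statistic $(X_{(1)}, X_{(n)})$ is not complete here, which is why the Rao–Blackwellized estimator is not automatically optimal and can still be beaten.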