In statistics a minimum-variance unbiased estimator (MVUE) or uniformly minimum-variance unbiased estimator (UMVUE) is an unbiased estimator that has lower variance than any other unbiased estimator for all possible values of the parameter.
For practical statistics problems, it is important to determine the MVUE if one exists, since less-than-optimal procedures would naturally be avoided, other things being equal. This has led to substantial development of statistical theory related to the problem of optimal estimation.
While combining the constraint of unbiasedness with the desirability metric of least variance leads to good results in most practical settings—making MVUE a natural starting point for a broad range of analyses—a targeted specification may perform better for a given problem; thus, MVUE is not always the best stopping point.
Consider estimation of g(\theta) based on data X_1, X_2, \ldots, X_n independent and identically distributed from some member of a family of densities p_\theta, \theta \in \Omega, where \Omega is the parameter space. An unbiased estimator \delta(X_1, X_2, \ldots, X_n) of g(\theta) is UMVUE if, \forall \theta \in \Omega,
\operatorname{var}(\delta(X_1, X_2, \ldots, X_n)) \leq \operatorname{var}(\tilde{\delta}(X_1, X_2, \ldots, X_n))
for any other unbiased estimator \tilde{\delta}.
If an unbiased estimator of g(\theta) exists, then one can prove there is an essentially unique MVUE. Using the Rao–Blackwell theorem, one can also prove that determining the MVUE is simply a matter of finding a complete sufficient statistic for the family p_\theta, \theta \in \Omega and conditioning any unbiased estimator on it.
Further, by the Lehmann–Scheffé theorem, an unbiased estimator that is a function of a complete, sufficient statistic is the UMVUE estimator.
Put formally, suppose \delta(X_1, X_2, \ldots, X_n) is unbiased for g(\theta), and that T is a complete sufficient statistic for the family of densities. Then
\eta(X_1, X_2, \ldots, X_n) = \operatorname{E}(\delta(X_1, X_2, \ldots, X_n) \mid T)
is the MVUE for g(\theta).
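As a concrete illustration of this conditioning step (a sketch using a standard textbook example, not one from this article), consider estimating g(\lambda) = e^{-\lambda} = P(X = 0) from an i.i.d. Poisson(\lambda) sample. The crude unbiased estimator \delta = \mathbf{1}\{X_1 = 0\}, conditioned on the complete sufficient statistic T = \sum_i X_i, becomes \eta = ((n-1)/n)^T. The snippet below, assuming NumPy is available, checks numerically that both estimators have the same mean but that the conditioned one has much smaller variance:

```python
import numpy as np

# Illustrative sketch (not from the article): estimate g(lambda) = exp(-lambda)
# from an i.i.d. Poisson(lambda) sample of size n.
rng = np.random.default_rng(0)
lam, n, reps = 2.0, 10, 200_000

x = rng.poisson(lam, size=(reps, n))

# Crude unbiased estimator: indicator that the first observation equals zero.
delta = (x[:, 0] == 0).astype(float)

# Complete sufficient statistic T = sum of the observations; conditioning delta
# on T (Rao-Blackwell / Lehmann-Scheffe) gives ((n-1)/n) ** T.
T = x.sum(axis=1)
eta = ((n - 1) / n) ** T

print("target exp(-lambda) :", np.exp(-lam))
print("delta: mean, var    :", delta.mean(), delta.var())
print("eta:   mean, var    :", eta.mean(), eta.var())  # same mean, far smaller variance
```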
A Bayesian analog is a Bayes estimator, particularly with minimum mean square error (MMSE).
An efficient estimator need not exist, but if it does and if it is unbiased, it is the MVUE. Since the mean squared error (MSE) of an estimator δ is
\operatorname{MSE}(\delta) = \operatorname{var}(\delta) + [\operatorname{bias}(\delta)]^2,
the MVUE minimizes MSE among unbiased estimators. In some cases biased estimators have lower MSE because they have a smaller variance than does any unbiased estimator; see estimator bias.
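A small numerical illustration of this trade-off (an illustrative sketch, not part of the article; it assumes NumPy and picks arbitrary parameter values) compares the unbiased sample variance with the maximum-likelihood version that divides by n, for normal data. The biased estimator attains the smaller MSE:

```python
import numpy as np

# Sketch: MSE(delta) = var(delta) + bias(delta)**2, and a biased estimator of
# sigma**2 can have lower MSE than the unbiased one (normal data).
rng = np.random.default_rng(1)
sigma2, n, reps = 1.0, 10, 200_000

x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
s2_unbiased = x.var(axis=1, ddof=1)   # divides by n-1, unbiased
s2_mle = x.var(axis=1, ddof=0)        # divides by n, biased downward

for name, est in [("unbiased (n-1)", s2_unbiased), ("biased   (n)  ", s2_mle)]:
    bias = est.mean() - sigma2
    mse = np.mean((est - sigma2) ** 2)   # ~ est.var() + bias**2 up to Monte Carlo error
    print(f"{name}  bias={bias:+.4f}  var={est.var():.4f}  mse={mse:.4f}")
```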
Consider the data to be a single observation from an absolutely continuous distribution on \mathbb{R} with density
p_\theta(x) = \frac{\theta e^{-x}}{(1 + e^{-x})^{\theta + 1}}, \qquad \theta > 0,
and we wish to find the UMVU estimator of
g(\theta) = \frac{1}{\theta^2}.
First we recognize that the density can be written as
\frac{e^{-x}}{1 + e^{-x}} \exp\bigl(-\theta \log(1 + e^{-x}) + \log(\theta)\bigr),
which is an exponential family with sufficient statistic T = \log(1 + e^{-x}). In fact this is a full-rank exponential family, and therefore T is complete sufficient. Moreover, under p_\theta the statistic T follows an exponential distribution with rate \theta (a change-of-variables check is sketched below), so
\operatorname{E}(T) = \frac{1}{\theta}, \qquad \operatorname{var}(T) = \frac{1}{\theta^2}.
Therefore,
\operatorname{E}(T^2) = \operatorname{var}(T) + [\operatorname{E}(T)]^2 = \frac{2}{\theta^2}.
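The moment formulas above follow from a short change of variables, sketched here for completeness (this step is not spelled out in the original text). Writing t = \log(1 + e^{-x}), so that x = -\log(e^{t} - 1) and |dx/dt| = e^{t}/(e^{t} - 1), the density of T is
p_T(t) = p_\theta(x(t)) \left|\frac{dx}{dt}\right| = \frac{\theta (e^{t} - 1)}{e^{t(\theta + 1)}} \cdot \frac{e^{t}}{e^{t} - 1} = \theta e^{-\theta t}, \qquad t > 0,
which is the exponential distribution with rate \theta; its mean and variance are 1/\theta and 1/\theta^2.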
Here we use the Lehmann–Scheffé theorem to get the MVUE. Clearly
\delta(X) = \frac{T^2}{2}
is unbiased for g(\theta) = 1/\theta^2, and T = \log(1 + e^{-X}) is complete sufficient, thus the UMVU estimator is
\eta(X) = \operatorname{E}(\delta(X) \mid T) = \operatorname{E}\left(\left. \frac{T^2}{2} \right| T \right) = \frac{T^2}{2} = \frac{[\log(1 + e^{-X})]^2}{2}.
This example illustrates that an unbiased function of the complete sufficient statistic will be UMVU, as the Lehmann–Scheffé theorem states.
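A quick numerical check of this example (an illustrative sketch assuming NumPy, not part of the original derivation): since T = \log(1 + e^{-X}) is exponential with rate \theta, we can simulate X by drawing T and inverting the transform, then confirm that \eta(X) = T^2/2 averages to 1/\theta^2.

```python
import numpy as np

# Monte Carlo check that eta(X) = log(1 + exp(-X))**2 / 2 is unbiased for 1/theta**2.
rng = np.random.default_rng(2)
theta, reps = 3.0, 1_000_000

# T = log(1 + exp(-X)) ~ Exponential(rate=theta), so X = -log(exp(T) - 1).
T = rng.exponential(scale=1.0 / theta, size=reps)
X = -np.log(np.expm1(T))          # expm1 keeps small T numerically stable

T_back = np.log1p(np.exp(-X))     # recompute the statistic from the simulated data
eta = T_back ** 2 / 2

print("target 1/theta^2 :", 1.0 / theta**2)
print("mean of eta      :", eta.mean())
print("mean of T        :", T_back.mean(), "(should be close to 1/theta =", 1.0 / theta, ")")
```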
For a normal distribution with unknown mean and variance, the sample mean and (unbiased) sample variance are the MVUEs for the population mean and population variance. However, the sample standard deviation is not unbiased for the population standard deviation – see unbiased estimation of standard deviation.
Further, for other distributions the sample mean and sample variance are not in general MVUEs – for a uniform distribution with unknown upper and lower bounds, the mid-range is the MVUE for the population mean.
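To make the uniform-distribution claim concrete, the sketch below (illustrative only, assuming NumPy and arbitrary endpoints) compares the sample mean with the mid-range as estimators of the mean of a Uniform(a, b) sample; both are unbiased, and the mid-range shows the smaller variance.

```python
import numpy as np

# Compare two unbiased estimators of the mean of a Uniform(a, b) population:
# the sample mean and the mid-range (average of the sample min and max).
rng = np.random.default_rng(3)
a, b, n, reps = 2.0, 7.0, 20, 200_000

x = rng.uniform(a, b, size=(reps, n))
sample_mean = x.mean(axis=1)
mid_range = (x.min(axis=1) + x.max(axis=1)) / 2

print("true mean   :", (a + b) / 2)
print("sample mean : mean =", sample_mean.mean(), " var =", sample_mean.var())
print("mid-range   : mean =", mid_range.mean(), " var =", mid_range.var())
# the mid-range has the same expectation but a markedly smaller variance
```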
If k exemplars are chosen (without replacement) from a discrete uniform distribution over the set \{1, 2, \ldots, N\} with unknown upper bound N, the MVUE for N is
\frac{k+1}{k} m - 1,
where m is the sample maximum. This is a scaled and shifted (so unbiased) transform of the sample maximum, which is a sufficient and complete statistic. See German tank problem for details.
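A brief simulation sketch of this estimator (illustrative only, assuming NumPy and arbitrary values of N and k): draw k values without replacement from \{1, \ldots, N\} and check that (k+1)m/k - 1 averages to N while the raw sample maximum does not.

```python
import numpy as np

# Check unbiasedness of the "German tank" estimator (k+1)/k * m - 1 for the upper
# bound N of a discrete uniform population sampled without replacement.
rng = np.random.default_rng(4)
N, k, reps = 250, 5, 100_000

maxima = np.array([rng.choice(np.arange(1, N + 1), size=k, replace=False).max()
                   for _ in range(reps)], dtype=float)
estimate = (k + 1) / k * maxima - 1

print("true N             :", N)
print("mean of sample max :", maxima.mean())     # biased low
print("mean of MVUE       :", estimate.mean())   # close to N
```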