Minimum-variance unbiased estimator






From Wikipedia, the free encyclopedia
 


In statistics, a minimum-variance unbiased estimator (MVUE) or uniformly minimum-variance unbiased estimator (UMVUE) is an unbiased estimator that has lower variance than any other unbiased estimator for all possible values of the parameter.

For practical statistics problems, it is important to determine the MVUE if one exists, since less-than-optimal procedures would naturally be avoided, other things being equal. This has led to substantial development of statistical theory related to the problem of optimal estimation.

While combining the constraint of unbiasedness with the desirability metric of least variance leads to good results in most practical settings—making MVUE a natural starting point for a broad range of analyses—a targeted specification may perform better for a given problem; thus, MVUE is not always the best stopping point.

Definition

Consider estimation of $g(\theta)$ based on data $X_1, X_2, \ldots, X_n$ independent and identically distributed from some member of a family of densities $p_\theta,\ \theta \in \Omega$, where $\Omega$ is the parameter space. An unbiased estimator $\delta(X_1, X_2, \ldots, X_n)$ of $g(\theta)$ is UMVUE if, for all $\theta \in \Omega$,

$$\operatorname{var}\big(\delta(X_1, X_2, \ldots, X_n)\big) \leq \operatorname{var}\big(\tilde{\delta}(X_1, X_2, \ldots, X_n)\big)$$

for any other unbiased estimator $\tilde{\delta}$.

If an unbiased estimator of $g(\theta)$ exists, then one can prove there is an essentially unique MVUE.[1] Using the Rao–Blackwell theorem one can also prove that determining the MVUE is simply a matter of finding a complete sufficient statistic for the family $p_\theta,\ \theta \in \Omega$ and conditioning any unbiased estimator on it.

Further, by the Lehmann–Scheffé theorem, an unbiased estimator that is a function of a complete, sufficient statistic is the UMVUE estimator.

Put formally, suppose $\delta(X_1, X_2, \ldots, X_n)$ is unbiased for $g(\theta)$, and that $T$ is a complete sufficient statistic for the family of densities. Then

$$\eta(X_1, X_2, \ldots, X_n) = \operatorname{E}\big(\delta(X_1, X_2, \ldots, X_n) \mid T\big)$$

is the MVUE for $g(\theta)$.

A Bayesian analog is a Bayes estimator, particularly with minimum mean square error (MMSE).
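To make the conditioning step above concrete, here is a minimal simulation sketch. The Poisson setup is an assumption chosen for illustration and is not the example used in this article: for $X_1, \ldots, X_n$ i.i.d. Poisson($\theta$), the indicator $\mathbf{1}\{X_1 = 0\}$ is a crude unbiased estimator of $g(\theta) = e^{-\theta}$, and conditioning it on the complete sufficient statistic $T = \sum_i X_i$ gives $(1 - 1/n)^T$, which the Lehmann–Scheffé theorem identifies as the UMVUE.

```python
import numpy as np

# Minimal sketch of the conditioning (Rao-Blackwellization) step. The Poisson
# setup is an assumption chosen for illustration, not taken from the article:
# for X_1, ..., X_n i.i.d. Poisson(theta), the indicator 1{X_1 = 0} is a crude
# unbiased estimator of g(theta) = exp(-theta); conditioning on the complete
# sufficient statistic T = sum(X_i) gives (1 - 1/n)**T, the UMVUE.

rng = np.random.default_rng(0)
theta, n, n_sims = 1.3, 8, 200_000       # arbitrary demo values

x = rng.poisson(theta, size=(n_sims, n))
crude = (x[:, 0] == 0).astype(float)     # unbiased, but ignores most of the data
t = x.sum(axis=1)
umvue = (1.0 - 1.0 / n) ** t             # E[crude | T], available in closed form here

print("target exp(-theta):  %.4f" % np.exp(-theta))
print("crude estimator:     mean %.4f, variance %.5f" % (crude.mean(), crude.var()))
print("conditioned (UMVUE): mean %.4f, variance %.5f" % (umvue.mean(), umvue.var()))
```

Both estimators average to $e^{-\theta}$, but the conditioned one has a much smaller variance, as the Rao–Blackwell theorem guarantees.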

Estimator selection

An efficient estimator need not exist, but if it does and if it is unbiased, it is the MVUE. Since the mean squared error (MSE) of an estimator $\delta$ is

$$\operatorname{MSE}(\delta) = \operatorname{var}(\delta) + [\operatorname{bias}(\delta)]^2,$$

the MVUE minimizes MSE among unbiased estimators. In some cases biased estimators have lower MSE because they have a smaller variance than does any unbiased estimator; see estimator bias.
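As a hedged illustration of that last point (the normal-variance setup below is an assumption for the demo, not part of the article): for Gaussian data the unbiased sample variance is the MVUE of $\sigma^2$, yet dividing the centred sum of squares by $n + 1$ instead of $n - 1$ yields a biased estimator with strictly smaller MSE.

```python
import numpy as np

# Minimal sketch (assumed normal-variance setup, not from the article): the
# unbiased sample variance divides the centred sum of squares by n - 1, but
# dividing by n + 1 gives a biased estimator with strictly smaller MSE.

rng = np.random.default_rng(1)
n, sigma2, n_sims = 10, 4.0, 200_000     # arbitrary demo values

samples = rng.normal(0.0, np.sqrt(sigma2), size=(n_sims, n))
ss = ((samples - samples.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

for divisor, label in [(n - 1, "unbiased (divide by n-1)"),
                       (n + 1, "biased   (divide by n+1)")]:
    est = ss / divisor
    mse = np.mean((est - sigma2) ** 2)
    print(f"{label}: mean = {est.mean():.4f}, MSE = {mse:.4f}")
```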

Example

Consider the data to be a single observation from an absolutely continuous distribution on $\mathbb{R}$ with density

$$p_\theta(x) = \frac{\theta e^{-x}}{(1 + e^{-x})^{\theta + 1}},$$

and we wish to find the UMVU estimator of

$$g(\theta) = \frac{1}{\theta^2}.$$

First we recognize that the density can be written as

$$\frac{e^{-x}}{1 + e^{-x}} \exp\big(-\theta \log(1 + e^{-x}) + \log(\theta)\big),$$

which is an exponential family with sufficient statistic $T = \log(1 + e^{-x})$. In fact this is a full-rank exponential family, and therefore $T$ is complete sufficient. See exponential family for a derivation which shows that

$$\operatorname{E}(T) = \frac{1}{\theta}, \qquad \operatorname{var}(T) = \frac{1}{\theta^2}.$$

Therefore,

$$\operatorname{E}(T^2) = \frac{2}{\theta^2}.$$

Here we use the Lehmann–Scheffé theorem to get the MVUE. Clearly $\delta(X) = \frac{T^2}{2}$ is unbiased and $T = \log(1 + e^{-X})$ is complete sufficient, thus the UMVU estimator is

$$\eta(X) = \operatorname{E}\!\left(\left.\frac{T^2}{2} \,\right|\, T\right) = \frac{T^2}{2} = \frac{\big(\log(1 + e^{-X})\big)^2}{2}.$$

This example illustrates that an unbiased function of the complete sufficient statistic will be UMVU, as the Lehmann–Scheffé theorem states.
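As a quick sanity check of this derivation, the following sketch simulates from the density above by inverse-transform sampling and verifies numerically that $\eta(X)$ averages to $1/\theta^2$. The inverse-transform step uses the CDF $(1 + e^{-x})^{-\theta}$, which is worked out here and not stated in the article.

```python
import numpy as np

# Monte Carlo sanity check of the worked example. The inverse-transform step
# uses the CDF (1 + exp(-x))**(-theta), derived here rather than stated in the
# article. The estimator eta(X) = log(1 + exp(-X))**2 / 2 should average to
# g(theta) = 1 / theta**2.

rng = np.random.default_rng(2)
theta, n_sims = 2.5, 1_000_000           # arbitrary demo parameter

u = rng.uniform(size=n_sims)
x = -np.log(u ** (-1.0 / theta) - 1.0)   # X sampled via inverse transform

eta = np.log1p(np.exp(-x)) ** 2 / 2.0    # the UMVU estimator from the example

print("Monte Carlo mean of eta(X):", eta.mean())
print("target 1 / theta**2:       ", 1.0 / theta ** 2)
```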

Other examples

For $k$ observations sampled without replacement from a discrete uniform distribution on $\{1, 2, \ldots, N\}$ with unknown maximum $N$, the MVUE for $N$ is

$$\frac{k+1}{k} m - 1,$$

where $m$ is the sample maximum. This is a scaled and shifted (so unbiased) transform of the sample maximum, which is a sufficient and complete statistic. See the German tank problem for details. A minimal simulation sketch follows.
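The sketch below draws $k$ values without replacement from $\{1, \ldots, N\}$ and checks that the estimator is unbiased for $N$; the particular values of $N$ and $k$ are arbitrary choices for the demo.

```python
import numpy as np

# Minimal sketch of the estimator described above; N and k are arbitrary demo
# values. Draw k values without replacement from {1, ..., N} and check that
# (k + 1) / k * m - 1, with m the sample maximum, is unbiased for N.

rng = np.random.default_rng(3)
N, k, n_sims = 250, 4, 100_000

estimates = np.empty(n_sims)
for i in range(n_sims):
    sample = rng.choice(np.arange(1, N + 1), size=k, replace=False)
    m = sample.max()
    estimates[i] = (k + 1) / k * m - 1

print("true N:           ", N)
print("mean of estimates:", estimates.mean())
```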

See also

Bayesian analogs

Bayes estimator
Minimum mean square error (MMSE)

References

1. Lee, A. J. (1990). U-Statistics: Theory and Practice. New York: M. Dekker. ISBN 0824782534. OCLC 21523971.
