What is a minimum-variance, mean-unbiased estimator?

1 Answer
Oct 19, 2015

Of all estimators with the property of being "mean-unbiased", it is the one with the smallest variance; it is sometimes also called the "best" estimator.

Explanation:

Say you observe some data on #N# individuals. Label one variable #Y# and the others #X_1, X_2, X_3#, etc.

An estimator is some function of observed data designed to estimate some true underlying relationship.

So we must hold a belief about the true underlying relationship, and statisticians call this the specification assumption. Often, a linear specification is assumed:

#Y = B_1X_1 + B_2X_2 +B_3X_3 +u \quad (1)#
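To make this concrete, here is a minimal Python sketch that simulates data from specification (1). The coefficient values, sample size, and normal errors are purely hypothetical choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000

# Hypothetical true coefficients, chosen only for illustration.
B1, B2, B3 = 2.0, -1.0, 0.5

# Three regressors and an unobserved error term u, all standard normal.
X1, X2, X3 = rng.normal(size=(3, N))
u = rng.normal(size=N)

# Generate Y exactly as in specification (1).
Y = B1 * X1 + B2 * X2 + B3 * X3 + u
```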

Suppose we want an estimator of #B_3#, the effect of #X_3# on #Y#.

We use a hat to denote our estimator, #\hat{B_3}#, which is a function of our observed data:

#\hat{B_3} = f(X,Y)#

Note that this can be any function of the data #(X, Y)#, so there are limitless possible estimators. We narrow down which to use by looking for those with nice properties; two candidates are sketched below.
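As a sketch of that point, here are two of those possible estimators of #B_3#, each just a function of the simulated data above. (The second one happens to work here only because the simulated regressors are mutually independent; with correlated regressors it would be biased.)

```python
def b3_ols(X1, X2, X3, Y):
    """OLS coefficient on X3, controlling for X1 and X2."""
    X = np.column_stack([X1, X2, X3])
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return coef[2]

def b3_simple(X3, Y):
    """A cruder estimator: the slope from regressing Y on X3 alone."""
    return np.cov(X3, Y)[0, 1] / np.var(X3, ddof=1)

print(b3_ols(X1, X2, X3, Y))   # both should land near the true 0.5
print(b3_simple(X3, Y))
```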

An estimator is said to be mean-unbiased if its expected value (mean) is the true parameter value:

# E[\hat{B_3}] = B_3 #
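One way to see this property in the sketch above is by Monte Carlo: simulate many fresh samples, compute the estimate on each, and check that the average is close to the true value (reusing `b3_ols` and the hypothetical coefficients from before):

```python
# Monte Carlo sketch of unbiasedness: average b3_ols over many
# freshly simulated samples and compare with the true B3 = 0.5.
def one_draw(rng, n=200):
    X1, X2, X3 = rng.normal(size=(3, n))
    Y = 2.0 * X1 - 1.0 * X2 + 0.5 * X3 + rng.normal(size=n)
    return b3_ols(X1, X2, X3, Y)

draws = np.array([one_draw(rng) for _ in range(5_000)])
print(draws.mean())   # close to 0.5, as E[B3_hat] = B3 requires
```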

But for any given question there may be many estimators that satisfy this requirement. So among all the mean-unbiased estimators, we look for the one with the minimum variance; that is, we want #\tilde{B_3}# such that:

#Var[\tilde{B_3}] \leq Var[\hat{B_3}]#

for every mean-unbiased estimator #\hat{B_3}#.
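A classic illustration: for normally distributed data, both the sample mean and the sample median are mean-unbiased estimators of the center, but the mean has the smaller variance. A minimal simulation, with a hypothetical center of 3:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, n, reps = 3.0, 101, 20_000

# Each row is one sample of n normal observations centered at mu.
samples = rng.normal(loc=mu, size=(reps, n))
means = samples.mean(axis=1)
medians = np.median(samples, axis=1)

print(means.mean(), medians.mean())  # both near mu: both are unbiased
print(means.var(), medians.var())    # the mean's variance is smaller
# (for large n and normal data, Var[median] ~ (pi/2) * Var[mean])
```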

We like this property because it means our estimate will tend to be closest to the true value among all the unbiased estimators. Establishing it directly is not always simple, given how many estimators are possible, but you can sometimes appeal to known results. For example,
if you restrict your attention to linear models like (1) above, with certain assumptions about #u#, the Gauss-Markov theorem (https://en.wikipedia.org/wiki/Gauss%E2%80%93Markov_theorem) shows that Ordinary Least Squares is the minimum-variance mean-unbiased estimator among all linear estimators.
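As a rough numerical illustration of the Gauss-Markov idea (not a proof), one easy way to build another linear unbiased estimator is to throw away half the sample: it stays unbiased, but its variance is larger than that of full-sample OLS:

```python
import numpy as np

rng = np.random.default_rng(2)
true_B = np.array([2.0, -1.0, 0.5])   # same hypothetical coefficients

def b3_hat(X, Y):
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return coef[2]

full, half = [], []
for _ in range(5_000):
    X = rng.normal(size=(200, 3))
    Y = X @ true_B + rng.normal(size=200)
    full.append(b3_hat(X, Y))              # OLS on all 200 observations
    half.append(b3_hat(X[:100], Y[:100]))  # unbiased, but wastes data

print(np.mean(full), np.mean(half))  # both near 0.5: both unbiased
print(np.var(full), np.var(half))    # full-sample OLS wins on variance
```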