Properties of a Good Estimator

The following are the small-sample and large-sample properties of a good estimator.

Small Sample Properties

1. Unbiased Estimator: Bias is the difference between the parameter’s true value and the expected value of the estimator. When this difference is zero, the estimator is called unbiased.

\(E(\hat\beta) = \beta\)
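As a quick check, here is a minimal simulation sketch (assuming NumPy; the true mean of 5.0 and the sample size of 30 are just illustrative choices): the average of the sample mean over many repeated samples comes out very close to the true parameter, which is exactly what \(E(\hat\beta) = \beta\) says.

```python
# Unbiasedness sketch: the sample mean estimates the population mean
# without systematic error (illustrative values, assuming NumPy).
import numpy as np

rng = np.random.default_rng(0)
true_mean = 5.0

# 10,000 samples of size 30; compute the sample mean of each one.
estimates = rng.normal(loc=true_mean, scale=2.0, size=(10_000, 30)).mean(axis=1)

# The average of the estimates is close to the true mean of 5.0.
print("Average estimate:", estimates.mean())
```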

2. Best Estimator: An estimator is called best when its variance is smaller than the variance of any other estimator of the same parameter.

\(\text{Var}(\hat\beta) < \text{Var}(\beta^\ast)\)

where \(\beta^\ast\) is any other estimator of \(\beta\).
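As an illustration, the sketch below (assuming NumPy; normal data with standard deviation 1 and samples of size 50 are illustrative choices) compares two estimators of the centre of a normal distribution: the sample mean and the sample median. Both are unbiased, but the mean has the smaller variance, so between the two the mean is the best estimator.

```python
# Best-estimator sketch: compare the variance of two estimators
# of the same parameter (illustrative values, assuming NumPy).
import numpy as np

rng = np.random.default_rng(1)
samples = rng.normal(loc=0.0, scale=1.0, size=(20_000, 50))

means = samples.mean(axis=1)          # estimator 1: sample mean
medians = np.median(samples, axis=1)  # estimator 2: sample median

# Var(mean) is about 1/50 = 0.02; Var(median) is roughly pi/2 times larger.
print("Var(mean):  ", means.var())
print("Var(median):", medians.var())
```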

3. Efficient Estimator: An estimator is called efficient when it satisfies the following conditions:

a) \(\hat\beta\) is unbiased, i.e., \(E(\hat\beta) = \beta\)

b) \(\hat\beta\) is best, i.e., \(\text{Var}(\hat\beta) < \text{Var}(\beta^\ast)\)

So an estimator is called efficient when it has both the unbiasedness and the best property.

4. Linear Estimator: An estimator is called linear when it is a linear function of the sample observations. For example, for observations \(y_{1}, y_{2}, y_{3}, \ldots, y_{n}\), the estimator \(k\left( y_{1}+y_{2}+y_{3}+\ldots +y_{n} \right) = k\sum y_{i}\) is linear, where \(k\) is a constant.
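For instance (a tiny sketch assuming NumPy; the four observations are made up), the sample mean can be written in exactly this form with \(k = 1/n\):

```python
# Linear-estimator sketch: the sample mean is k*(y1 + y2 + ... + yn) with k = 1/n.
import numpy as np

y = np.array([2.0, 4.0, 6.0, 8.0])  # made-up observations
k = 1.0 / len(y)

linear_form = k * y.sum()
print(linear_form, y.mean())  # both print 5.0
```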

5. BLUE: An estimator is called BLUE when it has the following three properties:

  • The estimator is Linear.
  • The estimator is Unbiased.
  • The estimator is Best.

So an estimator is called BLUE when it combines the Best, Linear, and Unbiased properties.

6. MSE Estimator: MSE stands for mean square error, and the minimum MSE estimator is the one whose mean square error is smallest. The mean square error combines the variance and the bias of the estimator, and its formula is

\(\text{MSE}(\hat\beta) = \text{Var}(\hat\beta) + \left[\text{Bias}(\hat\beta)\right]^{2}\)
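The decomposition can be checked by simulation. The sketch below (assuming NumPy; the population variance of 4 and sample size of 10 are illustrative, and the variance estimator that divides by \(n\) is used only as an example of an estimator with non-zero bias) computes the mean square error directly and via the variance-plus-squared-bias formula; the two agree.

```python
# MSE sketch: mean square error = variance of the estimator + bias squared
# (illustrative values, assuming NumPy).
import numpy as np

rng = np.random.default_rng(2)
true_var = 4.0   # population variance (sigma = 2)
n = 10

samples = rng.normal(loc=0.0, scale=2.0, size=(100_000, n))
estimates = samples.var(axis=1, ddof=0)   # divides by n, hence biased

bias = estimates.mean() - true_var
mse_direct = ((estimates - true_var) ** 2).mean()
mse_formula = estimates.var() + bias ** 2

print("Direct MSE:  ", mse_direct)
print("Var + Bias^2:", mse_formula)   # matches the direct MSE
```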

7. Sufficient Estimator: An estimator is called sufficient when it uses all the information the sample contains about the parameter being estimated. The arithmetic mean, for example, is a sufficient estimator of the population mean.

Large Sample Properties

A sample is called large when \(n\) tends to infinity, and properties that hold in this limit are called asymptotic properties. The large sample properties are:

1. Asymptotic Unbiasedness: An estimator is called asymptotically unbiased if its expected value approaches the parameter's true value as the sample size grows large.

\(\lim_{n \to \infty} E(\hat\beta) = \beta\)
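One standard example is the variance estimator that divides by \(n\) instead of \(n-1\): it is biased in small samples, but its bias shrinks toward zero as \(n\) grows. The sketch below (assuming NumPy; the population variance of 4 and the chosen sample sizes are illustrative) shows the bias vanishing.

```python
# Asymptotic-unbiasedness sketch: the bias of the divide-by-n variance
# estimator is about -sigma^2/n and fades as n grows (assuming NumPy).
import numpy as np

rng = np.random.default_rng(3)
true_var = 4.0

for n in (5, 50, 500):
    est = rng.normal(0.0, 2.0, size=(20_000, n)).var(axis=1, ddof=0)
    print(n, est.mean() - true_var)   # roughly -0.8, -0.08, -0.008
```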

2. Consistency: An estimator is called consistent when it fulfills the following two conditions:

a) \(\hat\beta\) must be asymptotically unbiased.

b) The variance of \(\hat\beta\) must approach zero as \(n\) tends to infinity.
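The sample mean illustrates both conditions, as in the sketch below (assuming NumPy; the mean of 5, standard deviation of 2, and sample sizes are illustrative): it is unbiased, and its variance \(\sigma^{2}/n\) shrinks toward zero as \(n\) grows, so it is a consistent estimator.

```python
# Consistency sketch: the variance of the sample mean goes to zero
# as the sample size grows (illustrative values, assuming NumPy).
import numpy as np

rng = np.random.default_rng(4)

for n in (10, 100, 1000):
    means = rng.normal(loc=5.0, scale=2.0, size=(5_000, n)).mean(axis=1)
    print(n, means.var())   # roughly 4/n, shrinking toward zero
```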

3. Asymptotic Efficiency: An estimator is called asymptotically efficient when it fulfills the following two conditions:

a) \(\hat\beta\) must be consistent.

b) \(\text{Var}(\hat\beta) < \text{Var}(\beta^\ast)\), where \(\hat\beta\) and \(\beta^\ast\) are consistent estimators.
