Why is the ordinary least squares method used in a linear regression?

1 Answer
Apr 22, 2018

If the Gauss-Markov assumptions hold, then OLS provides the lowest standard error of any linear unbiased estimator, making it the Best Linear Unbiased Estimator (BLUE).

Explanation:

Given these assumptions

  1. The model is linear in the parameters: beta_0 and beta_1 enter linearly, but the regressors themselves do not have to be linear, e.g. x^2 can appear as a regressor (see the code sketch below).

  2. The data come from a random sample.

  3. There is no perfect multicollinearity, i.e. no explanatory variable is perfectly correlated with (or an exact linear combination of) the others.

  4. E(u | x_j) = 0, the zero conditional mean assumption: the regressors x_j provide no information about the mean of the unobserved error term u.

  5. The error variance is the same for any given level of x (homoskedasticity), i.e. var(u | x) = sigma^2.

Then OLS is the best estimator within the class of linear unbiased estimators, i.e. the Best Linear Unbiased Estimator (BLUE).
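
For concreteness, here is a minimal sketch (using NumPy and simulated data, not part of the original answer) of the OLS estimator computed in closed form. Note how the model stays linear in the parameters even though one of the regressors is x^2.

```python
# Minimal OLS sketch on simulated data (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-2, 2, size=n)
u = rng.normal(0, 1, size=n)                  # error term with E(u|x) = 0
y = 1.0 + 2.0 * x + 0.5 * x**2 + u            # true parameters: 1.0, 2.0, 0.5

X = np.column_stack([np.ones(n), x, x**2])    # design matrix: intercept, x, x^2
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # OLS solution of (X'X) beta = X'y
print(beta_hat)                               # estimates close to [1.0, 2.0, 0.5]
```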

If you have this additional assumption:

  6. The errors are normally distributed, i.e. u ~ Normal(0, sigma^2).

Then the OLS estimator becomes the best unbiased estimator overall, whether it is compared against linear or non-linear estimators.

What this essentially means is that if assumptions 1-5 hold, OLS has the lowest variance (and hence the lowest standard errors) of any linear unbiased estimator, and if assumptions 1-6 hold, it has the lowest variance of any unbiased estimator.
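
The BLUE claim can be checked numerically. The following Monte Carlo sketch (simulated data, not from the original answer) compares the OLS slope with another linear unbiased estimator, the slope through the first and last observations: both are centered on the true slope, but OLS has the much smaller variance.

```python
# Monte Carlo sketch: OLS slope vs. another linear unbiased estimator.
import numpy as np

rng = np.random.default_rng(1)
n, reps = 50, 5000
x = np.sort(rng.uniform(0, 10, size=n))       # fixed design across replications
beta0, beta1 = 1.0, 2.0

ols_slopes, endpoint_slopes = [], []
for _ in range(reps):
    u = rng.normal(0, 1, size=n)              # homoskedastic errors (assumption 5)
    y = beta0 + beta1 * x + u
    # OLS slope: sample cov(x, y) / sample var(x)
    ols_slopes.append(np.cov(x, y, bias=True)[0, 1] / np.var(x))
    # Alternative linear unbiased estimator: slope through the two endpoints
    endpoint_slopes.append((y[-1] - y[0]) / (x[-1] - x[0]))

print("mean OLS slope:        ", np.mean(ols_slopes))       # ~2.0 (unbiased)
print("mean endpoint slope:   ", np.mean(endpoint_slopes))  # ~2.0 (unbiased)
print("variance of OLS slope: ", np.var(ols_slopes))        # much smaller
print("variance of endpoint:  ", np.var(endpoint_slopes))
```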