Why is the ordinary least squares method used in a linear regression?
1 Answer
If the Gauss-Markov assumptions hold, then OLS provides the lowest standard error of any linear estimator, making it the best linear unbiased estimator.
Explanation:
Given these assumptions:

1. The model is linear in its parameters. This just means that
#beta_0# and #beta_1# enter linearly; the #x# variable itself doesn't have to be linear, it can be, for example, #x^2#.
2. The data come from a random sample.

3. There is no perfect multicollinearity, i.e. no two explanatory variables are perfectly correlated.

4. Zero conditional mean: #E(u|x_j)=0#, meaning that the #x_j# variables provide no information about the mean of the unobserved error.
5. Homoskedasticity: the error variance is the same for any given level of #x#, i.e. #var(u)=sigma^2#.
Then OLS is the Best Linear Unbiased Estimator (BLUE), i.e. the best in the population of linear estimators.
If you add this sixth assumption:
6. The errors are normally distributed
Then the OLS estimator becomes the best estimator regardless of whether it is compared against linear or nonlinear estimators.
What this essentially means is that if assumptions 1-5 hold, OLS provides the lowest standard error of any linear estimator, and if assumptions 1-6 hold, it provides the lowest standard error of any estimator.
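As a quick illustrative sketch (not part of the original answer), OLS chooses the coefficients that minimize the sum of squared residuals, which has the closed-form solution #hat beta = (X'X)^(-1)X'y#. The toy data below are my own assumption, generated without noise from #y = 2 + 3x# so the estimator recovers the parameters exactly:

```python
import numpy as np

def ols(x, y):
    # Add an intercept column of ones, then solve the least-squares
    # problem min ||y - X beta||^2 (equivalent to the normal equations).
    X = np.column_stack([np.ones_like(x), x])
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta_hat  # [beta_0, beta_1]

# Hypothetical noise-free sample from y = 2 + 3x
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 + 3.0 * x

b0, b1 = ols(x, y)
print(b0, b1)  # approximately 2.0 and 3.0
```

With real, noisy data the estimates would only approximate the true parameters, and their standard errors are what the Gauss-Markov theorem says OLS minimizes among linear unbiased estimators.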