# How is the Ordinary Least Squares formula derived?

Mar 19, 2018

#### Explanation:

Let us say that we expect a linear relation, say $y = a x + b$, between two observable variables $x$ and $y$, where $a$ and $b$ are unknown. The relation need not be linear; we assume it here only for simplicity.

However, actual observations are accompanied by errors or noise for various reasons, so when we plot the observed data on a graph, the points may not lie on a straight line but instead scatter around one.

Each point denotes an observed pair of values of $x$ and $y$, and $n$ observations provide us with $n$ data points $\left({x}_{i} , {y}_{i}\right)$, where $i$ ranges from $1$ to $n$.

We can draw many lines through these points, with varying slopes $a$ and intercepts $b$, but how do we know which one is the best fit? This is decided by the method of least squares: we choose the line that minimizes the sum of squares of the deviations between the observed and expected values.

In other words, the best line has the minimum total error between the line and the data points. Note that had we not squared the deviations, positive and negative errors would have largely cancelled out. We will say more about this later.*

For a particular ${x}_{i}$ the observed value is ${y}_{i}$, the expected value is $a {x}_{i} + b$, and the difference between them, ${d}_{i} = {y}_{i} - a {x}_{i} - b$, is the deviation. In the least-squares method we seek to
minimise the error $E = {\sum}_{i = 1}^{n} {d}_{i}^{2} = {\sum}_{i = 1}^{n} {\left({y}_{i} - a {x}_{i} - b\right)}^{2}$.
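This error function can be computed directly. The sketch below (with illustrative data and candidate lines, none of which come from the original answer) shows how $E$ penalizes a poorly fitting line much more than a close one:

```python
# Sum of squared deviations E for a candidate line y = a*x + b.
# The data points and candidate (a, b) values are illustrative only.
def squared_error(a, b, points):
    """E = sum over i of (y_i - a*x_i - b)^2."""
    return sum((y - a * x - b) ** 2 for x, y in points)

data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 7.8)]
print(squared_error(2.0, 0.0, data))  # line close to the data: small E
print(squared_error(0.0, 0.0, data))  # flat line far from the data: much larger E
```

Squaring also makes every deviation contribute positively, so deviations above and below the line cannot cancel.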

To find this minimum, we use calculus: the first partial derivatives of $E$ with respect to $a$ and $b$ must be zero. Differentiating ${\sum}_{i = 1}^{n} {\left({y}_{i} - a {x}_{i} - b\right)}^{2}$ w.r.t. $a$ and $b$, we get

$\frac{\partial E}{\partial a} = - 2 {\sum}_{i = 1}^{n} {x}_{i} \left({y}_{i} - a {x}_{i} - b\right) = 0$

and $\frac{\partial E}{\partial b} = - 2 {\sum}_{i = 1}^{n} \left({y}_{i} - a {x}_{i} - b\right) = 0$

To solve for $a$ and $b$, we rewrite them as

$a \sum {x}_{i}^{2} + b \sum {x}_{i} = \sum {x}_{i} {y}_{i}$ and

$a \sum {x}_{i} + b n = \sum {y}_{i}$

and solving them for $a$ and $b$ we get

$a = \frac{n \sum {x}_{i} {y}_{i} - \sum {x}_{i} \sum {y}_{i}}{n \sum {x}_{i}^{2} - {\left(\sum {x}_{i}\right)}^{2}}$

and $b = \frac{\sum {y}_{i} \sum {x}_{i}^{2} - \sum {x}_{i} \sum {x}_{i} {y}_{i}}{n \sum {x}_{i}^{2} - {\left(\sum {x}_{i}\right)}^{2}}$
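These closed-form expressions translate directly into code. A minimal sketch in plain Python (the helper name `ols_fit` and the sample data are my own, not from the answer):

```python
# Closed-form OLS: a = (n*Sxy - Sx*Sy) / D, b = (Sy*Sxx - Sx*Sxy) / D,
# where D = n*Sxx - Sx^2 and Sx, Sy, Sxx, Sxy are the sums in the formulas.
def ols_fit(points):
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    denom = n * sxx - sx ** 2  # zero only when all x_i coincide
    a = (n * sxy - sx * sy) / denom
    b = (sy * sxx - sx * sxy) / denom
    return a, b

# Noiseless data lying on y = 3x + 1 should be recovered exactly.
print(ols_fit([(0, 1), (1, 4), (2, 7), (3, 10)]))  # → (3.0, 1.0)
```

Note that the denominator $n \sum x_i^2 - (\sum x_i)^2$ is $n$ times the variance of the $x_i$ (up to a factor), so it vanishes only when all $x_i$ are equal and no unique slope exists.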

* Note that $a \sum {x}_{i} + b n = \sum {y}_{i}$ can be rewritten as $a \sum {x}_{i} / n + b = \sum {y}_{i} / n$, which says the fitted line passes exactly through the averages of the ${x}_{i}$ and ${y}_{i}$. Hence this equation alone does not determine the best fit; the slope is pinned down by $a \sum {x}_{i}^{2} + b \sum {x}_{i} = \sum {x}_{i} {y}_{i}$.
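The property in this note — that the least-squares line always passes through the point of averages $(\bar{x}, \bar{y})$ — can be checked numerically. A sketch, reusing the same closed-form fit with illustrative noisy data:

```python
# Closed-form OLS fit, as derived above (helper name and data are illustrative).
def ols_fit(points):
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    denom = n * sxx - sx ** 2
    return (n * sxy - sx * sy) / denom, (sy * sxx - sx * sxy) / denom

data = [(1, 2.0), (2, 2.9), (3, 5.1), (4, 6.0)]
a, b = ols_fit(data)
x_bar = sum(x for x, _ in data) / len(data)
y_bar = sum(y for _, y in data) / len(data)
# The fitted line evaluated at x-bar recovers y-bar (up to rounding),
# which is exactly the second normal equation divided by n.
print(abs(a * x_bar + b - y_bar) < 1e-9)
```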