# What does the "least squares" in ordinary least squares refer to?

Mar 3, 2016

"Least squares" refers to the smallest (least) possible value of the sum of squared errors...

#### Explanation:

Suppose we are given a set of data points $\{(x_1, y_1), \ldots, (x_n, y_n)\}$ and want to find a straight line that fits it reasonably well.

The equation of a (non-vertical) line can be written:

$y = m x + c$

where $m$ is the slope and $c$ the $y$-intercept.

In ordinary least squares, we seek to find a good fit by minimising the sum of the squares of the vertical errors for each point of our dataset. We can describe this sum of squares as a function of $m$ and $c$:

$s(m, c) = \sum_{i=1}^{n} \left(y_i - (m x_i + c)\right)^2$

We want to find the values of $m$ and $c$ which minimise $s(m, c)$.
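Since $s$ is a differentiable function of $m$ and $c$, one standard way to find this minimum (ordinary calculus, using the definition above) is to set both partial derivatives to zero:

$\frac{\partial s}{\partial m} = -2 \sum_{i=1}^{n} x_i \left(y_i - (m x_i + c)\right) = 0$

$\frac{\partial s}{\partial c} = -2 \sum_{i=1}^{n} \left(y_i - (m x_i + c)\right) = 0$

Solving this pair of simultaneous linear equations gives the familiar closed-form solution:

$m = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left(\sum x_i\right)^2}, \qquad c = \bar{y} - m \bar{x}$

where $\bar{x}$ and $\bar{y}$ are the means of the $x_i$ and $y_i$.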

This minimum value of the sum of squares is the "least squares" of the name. Note that we minimise the total, not each individual term: a different line might make some individual squared errors smaller while making the overall sum larger.
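To make this concrete, here is a minimal sketch of the closed-form solution above in plain Python (the function names are illustrative, not from any particular library):

```python
def least_squares_line(xs, ys):
    """Return (m, c) minimising sum((y - (m*x + c))**2) over the data."""
    n = len(xs)
    sum_x = sum(xs)
    sum_y = sum(ys)
    sum_xy = sum(x * y for x, y in zip(xs, ys))
    sum_xx = sum(x * x for x in xs)
    # Closed-form solution of the normal equations for a straight line.
    m = (n * sum_xy - sum_x * sum_y) / (n * sum_xx - sum_x ** 2)
    c = (sum_y - m * sum_x) / n
    return m, c

def sum_of_squares(xs, ys, m, c):
    """The quantity s(m, c) that least squares minimises."""
    return sum((y - (m * x + c)) ** 2 for x, y in zip(xs, ys))

# Points lying exactly on y = 2x + 1, so the minimum sum of squares is 0.
xs = [0, 1, 2, 3]
ys = [1, 3, 5, 7]
m, c = least_squares_line(xs, ys)
# m → 2.0, c → 1.0; any other line gives a strictly larger sum of squares,
# e.g. sum_of_squares(xs, ys, 2.1, 1.0) > 0.
```

For real work you would typically reach for a library routine such as `numpy.polyfit(xs, ys, 1)`, which solves the same minimisation.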