# Can the standard deviation ever be negative?

Dec 19, 2014

I would suggest recalling the formula for standard deviation. For instance, consider the corrected sample standard deviation:

$s = \sqrt{\frac{1}{N - 1} {\sum}_{i = 1}^{N} {\left({x}_{i} - \overline{x}\right)}^{2}}$

As you can see, you need to take the square root of the above expression in order to find the standard deviation, and the quantity inside that square root can never be negative.

In addition, $N$ stands for the size of the sample (a group of people, animals, etc.), which is a positive number. If you expand the second part of the expression, ${\sum}_{i = 1}^{N} {\left({x}_{i} - \overline{x}\right)}^{2}$, it is clear that you will end up with either zero or a positive number, since you have to square the differences from the mean.

Thus the quantity inside the square root is greater than or equal to zero, and the standard deviation is always a non-negative number; the square root of a negative number simply never arises here.
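The argument above can be checked numerically. Here is a minimal sketch that computes the corrected sample standard deviation from scratch (the function name `sample_std` and the example data are my own, chosen for illustration) and compares it against the standard library:

```python
import math
import statistics

def sample_std(data):
    """Corrected sample standard deviation:
    sqrt of the sum of squared deviations from the mean, divided by N - 1."""
    n = len(data)
    mean = sum(data) / n
    # Each term is a square, so the variance is never negative.
    variance = sum((x - mean) ** 2 for x in data) / (n - 1)
    return math.sqrt(variance)

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
s = sample_std(data)
print(s >= 0)            # True: the square root of a non-negative sum
print(s)                 # agrees with statistics.stdev(data)
```

Because every term in the sum is squared, `variance` is guaranteed to be `>= 0`, so `math.sqrt` never receives a negative argument.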

Sep 22, 2015

It must always be non-negative, because the calculation is based on the square of a difference, which is non-negative no matter what the difference is.

Oct 8, 2015

No.

#### Explanation:

I feel the others are going somewhere a bit different here: they are explaining why the variance can never be negative. But as we all know,

${x}^{2} = 1$

has two solutions, $- 1$ and $1$, which can raise a question much like your own: can square roots be negative?

The answer to this is no. Conventionally, when taking the square root, we only take the positive value. The idea that a negative value appears comes from a frequently omitted step and a not very well known fact.

${x}^{2} = a$
$\sqrt{{x}^{2}} = \sqrt{a}$

So far so good, but notice that $\sqrt{{x}^{2}}$ is exactly the absolute value of $x$, so we have

$| x | = \sqrt{a}$

And since we now have an equation involving an absolute value, we must introduce the plus-minus sign:

$x = \pm \sqrt{a}$

But you see, despite using $s$ or $\sigma$ for the standard deviation and ${s}^{2}$ or ${\sigma}^{2}$ for the variance, historically they were defined the other way around: variance came first.

Standard deviation was then defined as the square root of the variance, and square roots are by convention always non-negative. Since we are not solving for the standard deviation as an unknown value, that plus-minus sign never shows up.
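The distinction between the principal square root and the two solutions of ${x}^{2} = a$ can be seen directly: `math.sqrt` always returns the non-negative root, and the $\pm$ only enters when you enumerate both solutions yourself. A minimal sketch (the variable names are my own):

```python
import math

# The principal square root of x^2 is |x|, never a negative number.
for x in [-3.0, -0.5, 0.0, 2.0]:
    assert math.sqrt(x ** 2) == abs(x)

# Solving x^2 = a still has two solutions, but the minus sign comes
# from the |x| = sqrt(a) step, not from sqrt itself.
a = 9.0
solutions = (math.sqrt(a), -math.sqrt(a))
print(solutions)  # (3.0, -3.0)
```

This mirrors the answer's point: the square-root *function* is single-valued and non-negative by convention, which is why a standard deviation, defined through that function, can never be negative.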