Of the range and the standard deviation, which is more widely used in statistical analysis, and why?

1 Answer
Feb 9, 2015

The standard deviation is the more widely used of the two.

The range simply gives the difference between the lowest and highest values, so even a few extreme values can distort it badly.

The standard deviation #sigma# tells you where most of the values lie: in a normal distribution, 68% of all values fall within one standard deviation of the mean #mu#, and 95% fall within two standard deviations of the mean.
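If you want to verify those 68% and 95% figures yourself, a minimal Python sketch like the following will do. It simulates draws from a normal distribution (here with the mean and standard deviation of the sugar-bag example below) and counts how many land within one and two standard deviations of the mean:

```python
import random

# Draw many values from a normal distribution with mean mu and
# standard deviation sigma, then count the fraction landing within
# one and within two standard deviations of the mean.
mu, sigma, n = 1000.0, 10.0, 100_000
draws = [random.gauss(mu, sigma) for _ in range(n)]

within_1 = sum(abs(x - mu) <= sigma for x in draws) / n
within_2 = sum(abs(x - mu) <= 2 * sigma for x in draws) / n

print(f"within 1 sigma: {within_1:.1%}")  # approximately 68%
print(f"within 2 sigma: {within_2:.1%}")  # approximately 95%
```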

Example:
You have a filling machine that fills kilogram bags of sugar. It will not fill exactly #1000g# every time; say the standard deviation is #10g#.
Then you know that #68%# of the bags weigh between #990g# and #1010g#, and #95%# between #980g# and #1020g#, total spans of #20g# and #40g# respectively.

Every now and again a bag will be far over-filled (say #1100g#), and sometimes a bag will end up empty (#0g#), so the range will be #1100g - 0g = 1100g#.
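To see how two such freak bags blow up the range while barely moving the standard deviation, here is a small Python sketch with simulated weights (the specific numbers are just for illustration):

```python
import random
import statistics

# 10,000 bags from a machine filling with mean 1000 g and sigma 10 g
random.seed(1)
weights = [random.gauss(1000, 10) for _ in range(10_000)]

print("normal production:")
print(f"  range:   {max(weights) - min(weights):.0f} g")
print(f"  std dev: {statistics.stdev(weights):.1f} g")

# Now one empty bag and one badly over-filled bag, as in the example
weights[0] = 0
weights[1] = 1100

print("with two freak bags:")
print(f"  range:   {max(weights) - min(weights):.0f} g")  # jumps to 1100 g
print(f"  std dev: {statistics.stdev(weights):.1f} g")    # only rises from ~10 g to ~14 g
```

The range is determined entirely by the two extreme bags, while the standard deviation still reflects where the bulk of the bags lie.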

You may decide which of the two gives a better idea of the spread in this distribution.