# Question #f4b1c

Sep 25, 2017

Using the range approximation, our estimate for the standard deviation is $s \approx 1.67 .$

#### Explanation:

The range approximation says, "the standard deviation $s$ for a data set is roughly proportional to the range $R$ of that set." In other words:

$\frac{R}{s} \approx \left[\text{some constant}\right]$

Let's call this constant $c$. Since the range is the difference between the highest and lowest value, solving for $s$ gives us:

$s \approx \frac{{x}_{\text{max}} - {x}_{\text{min}}}{c}$

The constant $c$ will depend on how big the data set is (that is, how large $n$ is). For small data sets $\left(n = 5\right)$, we expect that constant to be $2.5$. That is, the range of our data should be about 2.5 times bigger than the standard deviation $\left(R \approx 2.5 s\right)$. When $n = 10$, the table says $R$ should be about 3 times larger than $s$ $\left(R \approx 3 s\right) .$

Since our data set has 10 elements $\left(n = 10\right)$, the value we'll use for our constant $c$ is 3.

Start with the formula, then plug in the known values to solve for $s :$

$s \approx \frac{R}{c}$

$\textcolor{w h i t e}{s} = \frac{{x}_{\text{max}} - {x}_{\text{min}}}{c}$

$\textcolor{w h i t e}{s} = \frac{7 - 2}{3}$

$\textcolor{w h i t e}{s} = \frac{5}{3}$

$\textcolor{w h i t e}{s} \approx 1.67$
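The steps above are easy to check in code. Here is a minimal Python sketch of the range approximation; the original problem only gives us $n = 10$, a minimum of 2, and a maximum of 7, so the individual data values below are hypothetical placeholders consistent with those facts.

```python
def range_approx_sd(data, c):
    """Estimate the standard deviation as R / c, where R is the range."""
    r = max(data) - min(data)  # range R = x_max - x_min
    return r / c

# Hypothetical data set: n = 10, min = 2, max = 7 (matching the problem).
data = [2, 3, 3, 4, 5, 5, 6, 6, 7, 7]

# For n = 10, the range approximation uses c = 3.
s_est = range_approx_sd(data, c=3)
print(round(s_est, 2))  # 1.67
```

Note that the estimate depends only on the extreme values and $c$, not on the values in between, which is why the range approximation is only a rough check and not a substitute for computing $s$ directly.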