What is absolute deviation?


A way to describe how far each data point lies from the average of the data, regardless of sign (hence the "absolute" part).

Explanation:

Let's say I'm doing an experiment and I want to determine gravity experimentally. I'm going to take a dense sphere, drop it from a fixed height several times, and record how many seconds it takes to fall each time.

I end up with a bunch of data in seconds:

0.56
0.64
0.55
0.60

If I want to find the absolute deviation, I first need to calculate the average by adding up all the data points and dividing by how many there are.

So, #(0.56 + 0.64 + 0.55 + 0.60)/4 = 0.5875# seconds

Now I have my average. Next I want to know: how far is each measurement off from the average? That's simply the difference. However, I don't really care which direction (positive or negative) I'm off in, so I ignore the sign and take the absolute value.

#|0.56 - 0.5875| = 0.0275# seconds
#|0.64 - 0.5875| = 0.0525# seconds
#|0.55 - 0.5875| = 0.0375# seconds
#|0.60 - 0.5875| = 0.0125# seconds

On their own, these individual deviations aren't very useful. Usually we then take the average of these deviations (the mean absolute deviation) so we have a general sense of how far off we are from the average, on average!
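Carrying that out with the four deviations above:

#(0.0275 + 0.0525 + 0.0375 + 0.0125)/4 = 0.0325# seconds

So on average, each measurement is about 0.0325 seconds away from the mean.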

Hope this helps!!

(Please note: to actually calculate #g# experimentally, you'd first square each fall time and solve #g = (2d)/t^2# for each trial before doing these deviation calculations. This example isn't meant to be physically accurate, just to show how absolute deviation works in a practical sense!)
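If you'd like to check the arithmetic, here's a minimal Python sketch of the same calculation (the function name mean_absolute_deviation and the sample list are just for this example, not a standard library):

def mean_absolute_deviation(data):
    # Average of the data
    mean = sum(data) / len(data)
    # Unsigned distance of each point from the mean
    deviations = [abs(x - mean) for x in data]
    # Average of those distances
    return sum(deviations) / len(deviations)

times = [0.56, 0.64, 0.55, 0.60]  # fall times from the example, in seconds
print(mean_absolute_deviation(times))  # prints roughly 0.0325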