What is absolute deviation?
A way to describe how far each data point differs from the average of the data, regardless of sign (hence the absolute part).
Let's say I'm doing an experiment and I want to estimate the acceleration due to gravity, g. I'm going to take a dense sphere, drop it from a fixed height several times, and record how many seconds it takes to fall each time.
I end up with a bunch of fall times, in seconds.
If I want to find the absolute deviation, I first need to calculate the average: add up all the data and divide by the number of data points.
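The original post doesn't show the actual measurements, so here's a quick sketch with made-up fall times just to make the averaging step concrete:

```python
# Hypothetical fall times in seconds (the real data isn't shown above)
times = [0.63, 0.65, 0.62, 0.66, 0.64]

# Average: sum of all the data divided by the number of data points
average = sum(times) / len(times)
print(average)  # about 0.64 s
```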
Now I have my average, and I want to know: how far off is each measurement from the average? That's simply the difference. But I don't care which direction (positive or negative) I'm off in, so I ignore the sign by taking the absolute value.
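With the same hypothetical data, the per-measurement deviations look like this:

```python
times = [0.63, 0.65, 0.62, 0.66, 0.64]  # hypothetical fall times (seconds)
average = sum(times) / len(times)       # about 0.64 s

# Absolute deviation: distance of each point from the average, sign ignored
deviations = [abs(t - average) for t in times]
print(deviations)  # roughly [0.01, 0.01, 0.02, 0.02, 0.00]
```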
On its own, a list of deviations is a little bit useless. Usually we then take the average of these deviations (the mean absolute deviation) so we get a general sense of how far off we are from the average, on average!
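Putting the whole thing together (again with made-up times), the mean absolute deviation is just one more averaging step:

```python
times = [0.63, 0.65, 0.62, 0.66, 0.64]  # hypothetical fall times (seconds)
average = sum(times) / len(times)

# Absolute deviation of each measurement from the average
deviations = [abs(t - average) for t in times]

# Mean absolute deviation: the average of those deviations
mad = sum(deviations) / len(deviations)
print(mad)  # about 0.012 s: typically we're ~0.012 s off the average
```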
Hope this helps!!
(Please note that to actually estimate g, you'd first square each fall time and solve for g from h = ½gt², i.e. g = 2h/t², and then do these deviation calculations on the g estimates. This example isn't meant to be physically rigorous, just to show how the statistics work in a practical sense!)
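For completeness, here's a sketch of what that note means, assuming a hypothetical drop height of 2 metres and the same made-up times:

```python
height = 2.0                            # hypothetical drop height in metres
times = [0.63, 0.65, 0.62, 0.66, 0.64]  # hypothetical fall times (seconds)

# From h = (1/2) * g * t^2, each fall time gives one estimate: g = 2h / t^2
g_estimates = [2 * height / t**2 for t in times]

# You'd then run the deviation analysis on these g values instead of the times
g_average = sum(g_estimates) / len(g_estimates)
print(g_average)  # somewhere near 9.8 m/s^2 for this made-up data
```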