Percent yield is simply the ratio between what your reaction produced, or its actual yield, and what it was supposed to produce, or its theoretical yield, multiplied by 100.
Percent difference is the ratio between the difference of two values and their average, multiplied by 100.
In essence, a reaction's percent yield tells you how your reaction compares with an ideal reaction that has no losses.
The lower a reaction's percent yield is, the further away from this ideal scenario it is.
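As a quick sketch, the percent yield formula described above could be written like this (the function name and sample masses are just illustrative):

```python
def percent_yield(actual, theoretical):
    # percent yield = (actual yield / theoretical yield) * 100
    return actual / theoretical * 100

# Example: a reaction expected to produce 100.0 g of product
# actually produces 75.0 g.
print(percent_yield(75.0, 100.0))  # 75.0, i.e. 75% of the ideal yield
```

A percent yield of 75% means a quarter of the expected product was lost somewhere along the way.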
On the other hand, percent difference is useful when you're taking two measurements and don't know which of the values you got is "exact".
Let's say that you weigh a coin using two different scales, and record two values. If you don't know which of the two scales is calibrated properly, you can't really tell which of the two values you got is correct.
As a result, you'd use percent difference to express the absolute error as relative to the average of the two measurements.
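The coin example can be sketched the same way; here the two readings are hypothetical, and the function simply divides the absolute difference by the average of the two measurements:

```python
def percent_difference(a, b):
    # percent difference = |a - b| / average(a, b) * 100
    return abs(a - b) / ((a + b) / 2) * 100

# Example: the same coin weighed on two scales reads 9.0 g and 11.0 g.
# The difference is 2.0 g, the average is 10.0 g.
print(percent_difference(9.0, 11.0))  # 20.0, i.e. a 20% difference
```

Notice that neither measurement is treated as the "true" value; the average serves as the reference instead, which is exactly why percent difference fits this situation better than percent error.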