How do you use the Taylor remainder term to estimate the error in approximating a function #y=f(x)# on a given interval #(c-r,c+r)#?
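One concrete way to see this in action: the Lagrange form of the remainder says that if #T_n# is the degree-#n# Taylor polynomial of #f# centered at #c#, then for #x# in #(c-r, c+r)# the error satisfies #|R_n(x)| <= M r^(n+1)/((n+1)!)#, where #M# bounds #|f^((n+1))|# on the interval. The sketch below (my choices of #f(x)=e^x#, #c=0#, #r=1#, and #n=4# are just an illustration, not part of the question) compares this bound against the actual error over the interval:

```python
import math

def taylor_exp(x, n):
    # Degree-n Taylor polynomial of e^x centered at c = 0:
    # sum of x^k / k! for k = 0 .. n
    return sum(x**k / math.factorial(k) for k in range(n + 1))

def lagrange_bound(r, n):
    # Lagrange remainder bound: |R_n(x)| <= M * r^(n+1) / (n+1)!
    # For f(x) = e^x every derivative is e^x, so on [-r, r] we can
    # take M = e^r (the maximum of |f^(n+1)| on the interval).
    M = math.exp(r)
    return M * r**(n + 1) / math.factorial(n + 1)

r, n = 1.0, 4
# Sample the interval (c - r, c + r) = (-1, 1) on a fine grid
xs = [-r + 2 * r * i / 1000 for i in range(1001)]
actual_error = max(abs(math.exp(x) - taylor_exp(x, n)) for x in xs)
bound = lagrange_bound(r, n)

print(f"observed max error: {actual_error:.6f}")
print(f"Lagrange bound:     {bound:.6f}")
print(actual_error <= bound)  # the bound always dominates the true error
```

The bound is typically conservative: here the true error is a few times smaller than the guarantee, because the remainder formula uses the worst-case value of #f^((n+1))# over the whole interval.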