How do you use the Taylor Remainder term to estimate the error in approximating a function #y=f(x)# on a given interval #(c-r,c+r)#?
Oct 16, 2014
Assume that there exists a finite #M# such that #|f^((n+1))(x)| le M# for all #x# in #(c-r, c+r)#.

The error of approximating #f(x)# by its Taylor polynomial #P_n(x)# of degree #n# centered at #x = c# is then bounded by the Lagrange remainder estimate

#|R_n(x)| = |f(x) - P_n(x)| le M/((n+1)!)|x - c|^(n+1) le (M r^(n+1))/((n+1)!)#

for all #x# in #(c-r, c+r)#.
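For a concrete illustration (my own example, not part of the original answer): take #f(x) = e^x# on #(-1, 1)#, so #c = 0# and #r = 1#, and approximate with the degree #n = 3# Maclaurin polynomial. Since #|f^((4))(x)| = e^x le e# on the interval, we may take #M = e#, giving the bound #e/(4!) ~~ 0.113#. A minimal Python sketch checking this bound numerically:

```python
import math

# Numerical sanity check of the Lagrange error bound (a sketch; the choice
# of f(x) = e^x, c = 0, r = 1, n = 3 is illustrative, not from the answer).

n, c, r = 3, 0.0, 1.0
M = math.e  # on (-1, 1), |f^(4)(x)| = e^x <= e, so M = e works

# Lagrange bound: M * r^(n+1) / (n+1)!
bound = M * r ** (n + 1) / math.factorial(n + 1)

def taylor_poly(x):
    """Degree-3 Maclaurin polynomial of e^x: 1 + x + x^2/2 + x^3/6."""
    return sum((x - c) ** k / math.factorial(k) for k in range(n + 1))

# Sample the interval (c - r, c + r) and compare the actual error to the bound.
xs = [c + r * (i / 1000) for i in range(-999, 1000)]
worst = max(abs(math.exp(x) - taylor_poly(x)) for x in xs)

print(f"worst observed error ~ {worst:.4f}")  # ~ 0.05
print(f"Lagrange bound       = {bound:.4f}")  # e/24 ~ 0.1133
```

The observed worst-case error stays below the bound, as the estimate guarantees; the bound is conservative because it replaces #|f^((n+1))|# by its maximum #M# over the whole interval.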
I hope that this was helpful.