Prove that if f(x) is increasing and it has a derivative at a, then f'(a) >= 0. (You may use the fact that if a positive function has a limit, that limit is >= 0.)

Part b) If the conclusion of part (a) is changed to f'(a) > 0, the statement becomes false. Indicate why the proof of part (a) fails to show that f'(a) > 0, and give a counterexample to the conclusion f'(a) > 0 (i.e., an example for which it is false).

For part a) A function f is increasing on an interval if x1 < x2 implies f(x1) < f(x2); in particular, for x1 < a < x2 we have f(x1) < f(a) < f(x2). We want to compute f'(a), the slope of the tangent line to the curve y = f(x) at x = a. The two-sided limit of the difference quotient exists because the function is differentiable at x = a. The difference quotient is positive in both cases: for delta x > 0 the numerator f(a + delta x) - f(a) is positive, and for delta x < 0 both numerator and denominator are negative, so the quotient is again positive. A positive quantity whose limit exists has limit >= 0, so f'(a) >= 0.
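Here is the sign argument written out as a displayed computation (a sketch, with h playing the role of delta x, and using the hinted fact that a positive function with a limit has limit >= 0):

\[
f'(a) = \lim_{h \to 0} \frac{f(a+h) - f(a)}{h}
\]
\[
h > 0: \quad f(a+h) > f(a) \implies \frac{f(a+h) - f(a)}{h} > 0
\]
\[
h < 0: \quad f(a+h) < f(a) \implies \frac{f(a+h) - f(a)}{h} = \frac{\text{negative}}{\text{negative}} > 0
\]

Since the quotient is positive for every h != 0 and its limit exists, we conclude f'(a) >= 0.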
In a), is the derivative at a equal to zero only if the function flattens out for values of x near a? No: a function can be strictly increasing near a and still have f'(a) = 0, as the counterexample in b) shows.

In b) The statement does not fail because of an inflection point: f(x) = x^3 changes concavity at x = 0 (concave down for x < 0, concave up for x > 0), yet it is still increasing on all of R. The proof of part (a) fails to give f'(a) > 0 because a strictly positive quantity can still have limit 0, so from a positive difference quotient we can only conclude f'(a) >= 0. The counterexample is f(x) = x^3 with a = 0: f is increasing everywhere, but f'(0) = 0.
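Checking the counterexample directly from the definition of the derivative (a short computation with f(x) = x^3 and a = 0):

\[
f'(0) = \lim_{h \to 0} \frac{(0+h)^3 - 0^3}{h} = \lim_{h \to 0} \frac{h^3}{h} = \lim_{h \to 0} h^2 = 0
\]

The quotient h^2 is strictly positive for every h != 0, yet its limit is 0; this is exactly the gap that prevents the proof in part (a) from concluding f'(a) > 0. And f(x) = x^3 really is increasing on all of R, since x1 < x2 implies x1^3 < x2^3.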