What could go wrong with Newton's Method?

Newton's method works very well if #f'# is not too small, #f''# is not too big, and our initial guess is near the solution. Why are these conditions true?

If we examine the graph of a function that changes very slowly (more slowly than a linear function) where it crosses the x-axis, and we start with an initial guess #x_1# for where the graph intersects #y = 0#, the tangent line we draw to the function at #x_1# can intercept the x-axis at an x-coordinate much further away than the exact value.
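As a minimal illustration of this scenario (my own example, not part of the original question), consider #f(x) = arctan(x)#: the derivative #f'(x) = 1/(1+x^2)# is small at a guess like #x_1 = 2#, so the nearly horizontal tangent sends the next iterate far past the root #x = 0#, and the iterates then move further away at every step.

```python
# Sketch (assumed example): Newton's method on f(x) = arctan(x) with a
# starting guess where f'(x) is small.  The nearly horizontal tangent at
# x_1 = 2 crosses the x-axis far beyond the root x = 0, and each later
# tangent overshoots even more, so the iteration diverges.
import math

def newton_step(x):
    f = math.atan(x)            # f(x)
    fp = 1.0 / (1.0 + x * x)    # f'(x), small when |x| is large
    return x - f / fp

x = 2.0
for k in range(5):
    x = newton_step(x)
    print(k + 1, x)             # iterates: ~ -3.5, 14.0, -279, ... (diverging)
```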

Can we use linear approximations to solve equations of the form #f(x) = 0# instead of Newton's method when the function is not defined for #x# values near the solution? For example, #f(x) = sqrt(x - sqrt(5))#.
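A short sketch of this particular example (assuming the intended function is #f(x) = sqrt(x - sqrt(5))#, whose root is #x = sqrt(5)#): the Newton step simplifies algebraically to #x_(k+1) = 2sqrt(5) - x_k#, which reflects any admissible guess to the other side of the root, where #f# is not defined.

```python
# Sketch (assumed example): f(x) = sqrt(x - sqrt(5)) is undefined for
# x < sqrt(5), and the very first Newton step from any valid guess lands
# on the undefined side of the root.
import math

def f(x):
    return math.sqrt(x - math.sqrt(5))

def newton_step(x):
    fp = 1.0 / (2.0 * math.sqrt(x - math.sqrt(5)))  # f'(x)
    return x - f(x) / fp                            # simplifies to 2*sqrt(5) - x

x0 = 3.0                       # valid: 3 > sqrt(5) ~ 2.236
x1 = newton_step(x0)
print(x1)                      # ~1.472, already outside the domain of f
try:
    f(x1)
except ValueError as err:
    print("next evaluation fails:", err)   # math domain error
```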

Suppose the graph of a function behaves as follows: it is concave down, becomes sharply curved, and then turns concave up. If the root happens to lie in the region where the graph is sharply curved, will any tangent line drawn to approximate the root hug the function so closely that it passes through (or very near) the root itself?

1 Answer
Jul 29, 2017

See below.

Explanation:

Newton's method converges in some circumstances, but not always. An iterative process needs additional conditions to be convergent. Those conditions can be sufficient or, better still, necessary and sufficient.

Sufficient conditions are the easier ones to find, so we will try to explain how to obtain sufficient conditions for the Newton iterative process to be convergent.

Given #f(x)=0#, the Newton algorithm for finding roots is

#x_k = phi(x_(k-1))# where #phi(x) = x-f(x)/(f'(x))#
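Here is a minimal Python sketch of this iteration; the names newton, f and fprime are my own choices for illustration, with #f# and #f'# supplied as callables.

```python
# Sketch of the fixed-point form x_k = phi(x_(k-1)) with
# phi(x) = x - f(x)/f'(x).
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / fprime(x)     # one application of phi
        if abs(x_new - x) < tol:         # successive iterates agree: stop
            return x_new
        x = x_new
    raise RuntimeError("Newton iteration did not converge")

# Example use: root of x^2 - 2 = 0 starting from x0 = 1
print(newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0))   # ~1.41421356
```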

We say that #phi(x)# is a contraction if

#abs(phi(x_1)-phi(x_2)) <= lambda abs(x_1-x_2)# for all #x_1, x_2#, with #lambda in (0,1)#
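One rough numerical way to probe this condition (a heuristic of my own, not a proof) is to sample many pairs #x_1, x_2# on an interval and take the largest observed ratio #abs(phi(x_1)-phi(x_2))/abs(x_1-x_2)#; if that estimate stays below 1, #phi# behaves as a contraction on the sampled interval.

```python
# Heuristic sketch: estimate the smallest workable lambda for phi on [a, b]
# by sampling pairs of points and taking the largest difference quotient.
import itertools

def lipschitz_estimate(phi, a, b, n=200):
    xs = [a + (b - a) * i / (n - 1) for i in range(n)]
    return max(abs(phi(x1) - phi(x2)) / abs(x1 - x2)
               for x1, x2 in itertools.combinations(xs, 2))
```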

If #phi(x)# is differentiable then, by the Mean Value Theorem, there exists #xi# with #x_1 < xi < x_2# such that #phi(x_1)-phi(x_2) = phi'(xi)(x_1-x_2)#, where

#phi'(xi) = (f(xi)f''(xi))/(f'(xi))^2#

(differentiating #phi(x) = x - f(x)/(f'(x))# with the quotient rule gives #phi'(x) = 1 - (f'(x)^2 - f(x)f''(x))/(f'(x))^2 = (f(x)f''(x))/(f'(x))^2#).

Then, if #abs(phi'(xi)) < 1# for all #xi in B(x^@)#, with #B(x^@)# denoting a ball containing the root #x^@#, the process will converge.

Conclusion: analyze the behaviour of #phi'(x)# near the solution. If #abs(phi'(x)) < 1# over a connected open set containing the solution, and the initial guess lies in that set, then the iterative process is guaranteed to converge.
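A minimal sketch of that analysis, assuming #f#, #f'# and #f''# are available as Python callables: evaluate #phi'(x) = (f(x)f''(x))/(f'(x))^2# on a grid around the suspected root and check whether #abs(phi'(x)) < 1# at every grid point (a numerical stand-in for the condition, not a proof).

```python
# Sketch: check the sufficient condition |phi'(x)| < 1 on a sampled interval.
def phi_prime(f, fp, fpp, x):
    return f(x) * fpp(x) / fp(x) ** 2          # phi'(x) = f f'' / (f')^2

def contraction_on_interval(f, fp, fpp, a, b, n=1000):
    xs = (a + (b - a) * i / (n - 1) for i in range(n))
    return all(abs(phi_prime(f, fp, fpp, x)) < 1 for x in xs)
```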

Ex. Determine the root of #f(x) = x + e^x = 0#

Here #phi(x) = x - (e^x + x)/(1 + e^x)# and

#phi'(x) = (e^x(x + e^x))/((1 + e^x)^2)#

As we know, the solution is #x^@ = -W(1) ~~ -0.567143#, where #W(cdot)# is the Lambert W function.

Attached is a plot of #phi'(x)# around the solution point. As expected, #abs(phi'(x)) < 1# in that region, so in this case the iterative process converges.

[Figure: plot of #phi'(x)# in a neighbourhood of the solution #x^@ ~~ -0.567143#]
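For completeness, a short Python sketch of this worked example (the sample points around the root are my own choice): it prints #phi'(x)# at a few points near #x^@# and then runs the iteration from #x = 0#.

```python
# Sketch: f(x) = x + e^x, phi(x) = x - f(x)/f'(x),
# phi'(x) = e^x (x + e^x) / (1 + e^x)^2.
import math

f = lambda x: x + math.exp(x)
fp = lambda x: 1 + math.exp(x)
phi = lambda x: x - f(x) / fp(x)
phi_prime = lambda x: math.exp(x) * f(x) / fp(x) ** 2

# |phi'(x)| stays well below 1 around the root ...
for x in (-1.0, -0.8, -0.567143, -0.3, 0.0):
    print(f"phi'({x:+.3f}) = {phi_prime(x):+.4f}")

# ... so the iteration converges quickly to x^@ = -W(1) ~ -0.567143
x = 0.0
for _ in range(6):
    x = phi(x)
print(x)    # ~ -0.56714329040978...
```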