## Phys 498CQM Lecture 2

Author: Erik Koch; modified by R. Martin
HW 1 (due 1/31) |

There are two kinds of roots to consider:

- Odd roots - the sign of f changes across x_{0}. These are the roots that will be considered in this section.
- Even roots - no sign change. Numerically, these pose problems: if you add some numerical uncertainty to the function (f = f + constant), the root can split into two roots or disappear (become complex). Thus, if you are looking for an even root, it is best to solve for a local extremum instead.

### Bisection Method

Start with an interval [ a, b ] that brackets the root, i.e. f(a) × f(b) < 0. Iterate:

- x_{n+1} = ( a + b ) / 2
- If f( x_{n+1} ) × f(a) > 0, then let a = x_{n+1}
- If f( x_{n+1} ) × f(b) > 0, then let b = x_{n+1}

Thus the new [ a, b ] is a smaller interval that still brackets the root.

Comments:

- Each step halves the interval size.
- Convergence is guaranteed (you can't lose the root).
- This method has "linear" convergence, that is, the log of the error decreases linearly with the number of iterations.
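The steps above can be sketched in Python (a minimal sketch; the function name `bisect` and the tolerance value are illustrative choices, not from the original notes):

```python
def bisect(f, a, b, delta=1e-10, max_iter=200):
    """Find a root of f in [a, b] by bisection; assumes f(a)*f(b) < 0."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("interval does not bracket a root")
    for _ in range(max_iter):
        x = (a + b) / 2            # midpoint halves the interval each step
        fx = f(x)
        if fx * fa > 0:            # root lies in [x, b]
            a, fa = x, fx
        else:                      # root lies in [a, x]
            b, fb = x, fx
        if abs(b - a) < delta:
            break
    return (a + b) / 2

# Example: root of x^2 - 5 in the bracketing interval [2, 3]
root = bisect(lambda x: x * x - 5, 2.0, 3.0)
```

Since the interval halves each step, roughly log2(|b - a| / delta) iterations are needed, matching the "linear convergence" comment above.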

When do you stop? This choice can cause some difficulties:

- Absolute error: | a - b | < delta. This will not work if x_{0} is very large, since round-off error in | a - b | may be larger than delta.
- Relative error: | a - b | < delta × | a |. This runs into problems near x = 0.
- Ill-conditioned root: if the slope of f(x) is very small near the root, then there may be a range of x values for which f(x) is nearly zero. Machine round-off error can prohibit determination of the correct value of x_{0} in this case.

### Newton's Method

Make a linear approximation to the function at each iteration to get a better guess at the root. Start with an x_{1} near the root. Iterate using the formula:

x_{n+1} = x_{n} - f( x_{n} ) / f'( x_{n} )

(This formula can be easily derived from a Taylor expansion about x_{n}.)

**Example** - Calculate the square root of 5:

- f(x) = x^{2} - 5
- f'(x) = 2 x
- so x_{k+1} = x_{k} - ( x_{k}^{2} - 5 ) / ( 2 x_{k} )

| k | x_{k}     |
|---|-----------|
| 1 | 2         |
| 2 | 2.25      |
| 3 | 2.2361111 |
| 4 | 2.2360680 |
| ∞ | 2.2360680 |
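The iteration in the table can be reproduced with a few lines of Python (a sketch; the starting guess of 2 matches the table, and the function name `newton` is an illustrative choice):

```python
def newton(f, fprime, x, tol=1e-12, max_iter=50):
    """Newton's method: repeatedly follow the tangent line to the x-axis."""
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step                  # x_{n+1} = x_n - f(x_n) / f'(x_n)
        if abs(step) < tol:
            break
    return x

# Square root of 5: f(x) = x^2 - 5, f'(x) = 2x, starting from x = 2
root = newton(lambda x: x * x - 5, lambda x: 2 * x, 2.0)
```

Note how few iterations the table needs: the number of correct digits roughly doubles each step, which is the superlinear (quadratic) convergence mentioned below.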
Comments:

- Much faster convergence than bisection (superlinear).
- But - this method can head off to other roots, go to infinity, or get trapped in cycles. NOT GUARANTEED TO CONVERGE.

### Newton's Method with a Numerical Derivative

If the derivative f' cannot be calculated analytically, it can be approximated numerically. The obvious choice (but not a good choice) is the forward difference

f'( x_{n} ) ≈ [ f( x_{n} + h ) - f( x_{n} ) ] / h

Comments:

- Extra evaluations of f are necessary.
- The proper choice for h is not obvious.

### Secant Method

The secant method uses the previous point in the sequence for the approximation of the derivative. Start by choosing [ x_{1}, x_{2} ] near the root. (This interval does not have to bracket the root.) Iterate according to

x_{n+1} = x_{n} - f( x_{n} ) ( x_{n} - x_{n-1} ) / [ f( x_{n} ) - f( x_{n-1} ) ]

Comments:

- Much faster convergence than bisection (superlinear), but can be slower than Newton's method.
- You don't need to calculate f'.
- Just like Newton's method, this method can head off to other roots, go to infinity, or get trapped in cycles. NOT GUARANTEED TO CONVERGE.

### Hybrid Method

Combine the guaranteed convergence of bisection with the speed of the secant method:

- Start with a bracketing interval.
- Perform one secant step to get x_{k}.
- If the secant step goes outside of the bracketing interval, do bisection instead.
- Refine the bracketing interval using the new x_{k}.
- Repeat the loop until converged.

Comments:

The 'standard' method (Brent's method) for calculating roots of a nonlinear equation without using derivatives is a hybrid method, combining root bracketing, bisection, and inverse quadratic interpolation (instead of the linear interpolation used in our example). For an example, see the function zeroin.f from netlib.
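The hybrid loop described above can be sketched as follows (a simplified stand-in for Brent's method: it takes a secant step through the bracket endpoints with a bisection fallback, using linear rather than inverse quadratic interpolation; names and tolerances are illustrative):

```python
def hybrid_root(f, a, b, delta=1e-12, max_iter=100):
    """Secant steps guarded by a bracketing interval; bisect whenever
    the secant step leaves the bracket. Assumes f(a)*f(b) < 0."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("interval does not bracket a root")
    x = (a + b) / 2
    for _ in range(max_iter):
        # Try a secant step through (a, fa) and (b, fb):
        if fb != fa:
            x = b - fb * (b - a) / (fb - fa)
        else:
            x = (a + b) / 2
        if not (min(a, b) < x < max(a, b)):
            x = (a + b) / 2        # secant left the bracket: bisect instead
        fx = f(x)
        if fx == 0:
            return x
        # Refine the bracket so it still contains the root:
        if fx * fa > 0:
            a, fa = x, fx
        else:
            b, fb = x, fx
        if abs(b - a) < delta:
            break
    return x

root = hybrid_root(lambda x: x * x - 5, 2.0, 3.0)
```

Keeping the bracket guarantees convergence, like bisection, while the secant steps supply the speed; production codes such as zeroin use more careful step selection and stopping tests than this sketch.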

Email questions/comments/corrections to rmartin@uiuc.edu