Sunday Function

Again I have to apologize for the sparseness of posting lately, but I've got two research projects going full blast, and time has not been something I've had a lot of. I'll still be writing at least a few times a week, and you can't beat the price. ;) In any case, once things cool down just a little, I should be back to a more regular schedule.

Today's function isn't interesting because of the function itself; the interest comes from what we'll do with it. Let's say we have a function like this:

f(x) = x^2 - 2

If we want to see where the function is equal to zero, it's clear that 0 = x^2 - 2 is solved by x equal to the square root of two, either positive or negative. That's the snappy analytic solution, but let's say we want to use this function to compute a decimal approximation to whatever accuracy we feel like. The brute force method is just to try various decimal numbers and see what gets us closest, refining our guess each time. But this is ugly, slow, and requires a lot of continual work. We'd prefer a faster method where we can just turn the crank easily and get a good answer. To do this, let's plot the function and (for reasons I'll explain in the next paragraph) its tangent line at the point x = 4.

[Graph: the parabola f(x) = x^2 - 2 with its tangent line at x = 4]

Let's say we picked x = 4 for our initial guess as the value of the square root of two. It's an awful guess, obviously, but that's ok since we're looking for a procedure that can turn any terrible guess gradually into better and better approximations of the actual answer. So we pick our initial point and draw the tangent line. We notice that it cuts the x-axis pretty close to the square root of two (which is exactly where the parabola cuts the x-axis, since the square root of two is by definition the number that makes the function equal to zero). So why don't we take the location where the straight line cuts the x-axis, and use that as our second guess, and draw the tangent line there:

[Graph: the parabola with the tangent line at the second guess, cutting the x-axis even closer to the square root of two]

This line cuts the axis even closer to the point x = the square root of two. If we take that point as our next guess and repeat the process, we should be closer still. First, though, we need to work this procedure into mathematical language so we can actually get some numbers out of it. The equation of a line is:

y = mx + b

where m is the slope and b is the y-intercept. We know the slope of a tangent line is just the derivative of the function at that point, and y is just the value of the function at that point. That means we can solve for b:

b = f(x_0) - f'(x_0) x_0

where the prime denotes differentiation, and we're subscripting the x so we know it's the particular value of the guess we're using. Our new x for the next iteration of the guess will be the value that makes our line equation y = mx + b equal zero, i.e., 0 = mx + b. Substitute all the previous stuff into this equation and solve for x. After a little simplification, we get:

x = x_0 - f(x_0)/f'(x_0)

Now this is a very general expression that works to find the zeros of an arbitrary function f, whatever it happens to be. Our particular function can be plugged in (for us, f'(x) = 2x by a little bit of calculus), which gives us the complete procedure:

x = x_0 - (x_0^2 - 2)/(2 x_0)

Let's give this a try. Plug in our initial guess of 4 and the procedure tells us our next guess is 2.25. Plug that in and the procedure gives us 1.56944. Plug that in and we get 1.42189. Repeat again and get 1.41423. So on and so forth, closing in on the true value, which rounds to 1.41421. If we didn't round off at 5 decimal places as I'm doing, we'd eventually get as many digits of the square root as we wanted.
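
If you'd rather turn the crank by machine, here's a minimal Python sketch of the same iteration (my illustration, not part of the original post); it reproduces the guesses above:

```python
# Newton's method specialized to f(x) = x^2 - 2, as in the post:
# each step applies x -> x - (x^2 - 2) / (2x).
x = 4.0  # the deliberately bad initial guess
for step in range(1, 7):
    x = x - (x**2 - 2) / (2 * x)
    print(f"guess {step}: {x:.5f}")
# Prints 2.25000, 1.56944, 1.42189, 1.41423, 1.41421, 1.41421.
```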

This procedure is called Newton's Method, and it's a fine way of calculating the zeros of a function if you know its derivative. The method is not quite perfect: if a function has more than one zero, the one you get will depend on your initial guess in a not-always-predictable way. And while the method is quite general, there are certain functions that don't fulfill its relatively generous convergence conditions. Still, it's a great method with a long and continuing history of use. It's pretty likely that your pocket calculator uses a very similar method when you hit the square root button. And now if you ever find yourself without such a button, you can do it yourself if you're patient.
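
In code, the general recipe might look like the following sketch (the names newton, f, and fprime are my own, not from the post):

```python
def newton(f, fprime, x0, tol=1e-12, max_steps=50):
    """Find a zero of f by Newton's method, starting from the guess x0.

    Stops when successive guesses agree to within tol; gives up after
    max_steps iterations, since convergence isn't guaranteed for every
    function and starting point.
    """
    x = x0
    for _ in range(max_steps):
        x_new = x - f(x) / fprime(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("Newton's method did not converge")

# The post's example: f(x) = x^2 - 2 with f'(x) = 2x, starting at 4.
print(newton(lambda x: x**2 - 2, lambda x: 2 * x, 4.0))  # 1.4142135623...
```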

Comments

As with most root finding algorithms, it helps to have a good idea of where your root is. Otherwise, you might converge to the wrong root (every method has this problem), or (even worse) you might hit a local extremum which sends you shooting off to infinity. Numerical Recipes recommends Newton's method for root polishing, or in multiple dimensions where there are few alternatives. In one dimension, if you do not know the derivative a priori, you are better off with the secant method, and if you have a pathological case, use bisection, which is guaranteed to find a root no matter how bad your initial guess.

By Eric Lund (not verified) on 09 Nov 2009 #permalink
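
To illustrate the bisection fallback this comment mentions, here's a quick sketch (mine, not the commenter's): bisection needs only a sign change between the endpoints, and it halves the bracket every step no matter what.

```python
def bisect(f, lo, hi, tol=1e-12):
    """Find a root of f in [lo, hi], given that f(lo) and f(hi) differ in sign.

    Halving the bracketing interval each step always converges,
    just much more slowly than Newton's method.
    """
    if f(lo) * f(hi) > 0:
        raise ValueError("f(lo) and f(hi) must have opposite signs")
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid  # the sign change is in the left half
        else:
            lo = mid  # the sign change is in the right half
    return (lo + hi) / 2

print(bisect(lambda x: x**2 - 2, 0.0, 4.0))  # ~1.414213562...
```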

This is certainly one way to do it, but I suspect most calculators either use Taylor expansions or built-in log tables for this operation. Not because it's faster, but because it's easier.

I prefer to think of Newton's method slightly differently when applying it to polynomials. One essentially takes a starting approximation x_0 and looks at what happens when you plug x_0 + epsilon into your polynomial. Then, since you assume that epsilon is small, powers of epsilon should be even tinier, so you can ignore any term with epsilon raised to a power greater than 1. Then you solve for epsilon and get your new approximation. This has the advantage that without calculus you can more or less convince someone this process should work if one starts with x_0 close to the actual solution.
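
To see this view in action on the post's function (my worked example, not part of the comment), plug x_0 + epsilon into x^2 - 2 and drop the epsilon^2 term:

(x_0 + epsilon)^2 - 2 = x_0^2 + 2 x_0 epsilon + epsilon^2 - 2 ≈ x_0^2 + 2 x_0 epsilon - 2 = 0

Solving gives epsilon = -(x_0^2 - 2)/(2 x_0), so the improved guess x_0 + epsilon is exactly the Newton step derived in the post.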

@3: I don't see how that's different from what Matt is saying. Both are just assuming local linearity. His is just the visual version.

This is a very nice development of the idea of the square root that I cover differently in my book (Inside Your Calculator) about simple algorithms that could support calculator keys. There, because the book uses no calculus, I derive the square root iteration process differently, only coming to Newton's Method in an appendix.

Sam's comment is interesting because a student of mine tried to find out from several calculator manufacturers what algorithm they used. The answer was always privileged information. I suspect that this was a translation of "I don't know." It took a journal search to find a few of the algorithms, but interestingly not this one for the square root key. In any case, Newton's method, simply implemented, gets 10-digit accuracy as fast as the calculator key. (Once you have one correct decimal digit, it is straightforward to show that the number of correct digits doubles with each iteration.)

By Gerry Rising (not verified) on 10 Nov 2009 #permalink
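
The digit-doubling claim is easy to watch happen; here's a quick check (my sketch, not the commenter's) using Python's decimal module for extra working precision:

```python
from decimal import Decimal, getcontext

getcontext().prec = 60  # carry 60 significant digits

TWO = Decimal(2)
true_root = TWO.sqrt()

x = Decimal(4)  # the post's deliberately bad starting guess
for step in range(1, 8):
    x = x - (x * x - TWO) / (2 * x)
    print(f"step {step}: error ~ {abs(x - true_root):.1E}")
# Once convergence sets in, the error's exponent roughly doubles each
# step; that is, the number of correct digits doubles.
```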

Josh: Sam is right. Newton's method is equivalent to iterated linear extrapolation.

As for implementing this method to find sqrt(x), I would think a reasonable initial choice for sqrt(x) (assuming nonnegative x) would be x. That happens to get it right for x = 0, a difficult root to find because the first derivative of x^2 also vanishes there (multiple roots are hard for any algorithm to find), and for everything else it will converge to the positive square root. It works because quadratic polynomials are well behaved.

By Eric Lund (not verified) on 10 Nov 2009 #permalink

Extending this to complex numbers, the map of the root that you get if you start at z (i.e., colour point z green if you wind up at root A, red if root B, etc.) turns out to be a fractal.

"fractint" used to plot them - I don't know if there's an equivalent anymore.

I took a history of math course a long time ago. Seems there are at least 3 ancient root finding algorithms; this is one of them.