Fractals from Newton's method

One of the most interesting mathematical developments of the last few decades has been the discovery of fractals. These are curves and plane figures that are infinitely intricate, and that sometimes have strange properties, such as surrounding a finite area with an infinite border. These figures appear in a number of areas, such as determining the length of the coastline of Virginia or even finding a shortest route for Meals on Wheels to visit its customers. Some of the most beautiful fractals come about from using root-finding procedures in places where they don't apply. These are the Newton's method fractals.

To explain how these fractals are developed, let us take an elementary mathematics problem: find the square root of 2. You might say the answer is √2, but how big is it? We need its decimal representation. It used to be that grade and high schools taught a method of taking square roots, similar to long division, in which you put in the quotient the digit whose square comes closest to the number you are taking the square root of, then multiply that by 2, put it outside as a trial divisor, and bring down two digits. This will do by hand for a few decimals, but it rapidly becomes cumbersome.

A better way is to use Newton's method, a method for solving equations derived from calculus. For the square root of 2, it works like this. First guess at an answer; call the guess x. Compute 2/x, and average that with x. The idea is that if y = x², then x and y/x are the same. For example, 5² = 25, so 5 and 25/5 are the same. If we guess x, then how good the guess is can be measured by how close x and y/x are, so if we average the two, we should get a guess that is better than either x or y/x. In fact we get a lot closer. The formula for the next guess is therefore (x + y/x)/2, which for the square root of 2 becomes (x + 2/x)/2. This table shows what happens if we initially guess 1 for the square root of 2:

guess x        2/x            x + 2/x        next guess = (x + 2/x)/2
1              2              3              1.5
1.5            1.33333333     2.83333333     1.41666667
1.41666667     1.41176470     2.82843137     1.41421568
1.41421568     1.41421144     2.82842712     1.41421356

The last value is correct to eight decimal places! This method of computing the square root is much easier than the old textbook method, especially if you have a calculator handy. On the other hand, if you learned the old method in the 1950s or earlier, before the advent of the calculator, Newton's method would have been just as cumbersome because of the many multi-digit long divisions you would have had to do by hand.
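If you want to check the table yourself, a few lines of Python will do it. This is only a rough sketch; the starting guess and the number of steps are taken from the table, and everything else is an arbitrary choice of mine.

    # Square root of 2 by repeated averaging: next guess = (x + 2/x)/2.
    x = 1.0                      # initial guess, as in the table
    for step in range(1, 5):
        x = (x + 2.0 / x) / 2.0  # average the guess with 2 divided by the guess
        print(step, x)           # the fourth value agrees with sqrt(2) to eight decimals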

Where does Newton's method come from? The calculus. The idea is to approximate the curve of a function by its tangent line, and then find where that line crosses the x-axis. Here is a diagram:

The initial guess is x'. The equation to solve is f(x) = 0 where f is a function, which for our purposes here will be a polynomial such as f(x) = x² - 2. The vertical line at x' meets the curve at the point (x', f(x')). Draw the tangent to f at (x', f(x')) and compute its equation; this requires calculus, to the extent of knowing what the derivative of a polynomial is. Then find where that tangent line crosses the x-axis. If you do the computations, you get this formula: x" = x' - f(x')/f'(x'), where f' is the derivative of f. To compute the derivative of a polynomial, take each term, multiply its coefficient by the power to which x is raised, and reduce that power by one. For example, the derivative of 5x³ is 15x².
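Written as a little program, the recipe looks like this. Again, this is just a sketch of mine; I have written out f and its derivative by hand for f(x) = x² - 2.

    # One step of Newton's method: x_new = x - f(x)/f'(x).
    def newton_step(x, f, fprime):
        return x - f(x) / fprime(x)

    f      = lambda x: x * x - 2.0   # the polynomial x^2 - 2
    fprime = lambda x: 2.0 * x       # its derivative, by the rule above

    x = 1.0                          # initial guess x'
    for _ in range(5):
        x = newton_step(x, f, fprime)
    print(x)                         # approximately 1.41421356..., the square root of 2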

If you do this for the equation x² = A, that is, for f(x) = x² - A, you get the formula x" = (x' + A/x')/2, the same as the averaging formula above with A in place of 2. Each guess at the square root of A gets you roughly twice as many correct decimals as the previous one, making this an excellent method for taking square roots, that is, if you start close enough. If you guess that the square root of 2 is 407, for example, you will eventually compute the correct square root, but it will take you many more iterations. This method takes square roots, so one would logically select a positive A to take the square root of.
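You can see both claims with a quick experiment. This is my own check, not part of the original method; the stopping tolerance of 0.5e-8 is an arbitrary stand-in for "correct to eight decimals."

    # Count the averaging steps needed to pin down sqrt(2) from two starting guesses.
    def steps_to_sqrt2(x, tol=0.5e-8):
        count = 0
        while abs(x * x - 2.0) > tol:
            x = (x + 2.0 / x) / 2.0
            count += 1
        return count

    print(steps_to_sqrt2(1.0))    # just a handful of steps
    print(steps_to_sqrt2(407.0))  # several more: each early step merely halves the guess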

But what if you take a negative A? The square root of a negative number does not exist, at least, not on the real number line, where we do most of our arithmetic. If you try to compute the square root of -1 using this method, and guess 2 to begin with, you get:

guess x         -1/x            x + (-1/x)      next guess = (x + (-1/x))/2
2               -0.50000000      1.50000000      0.75000000
0.75000000      -1.33333333     -0.58333333     -0.29166667
-0.29166667      3.42857143      3.13690476      1.56845238
1.56845238      -0.63757116      0.93088122      0.46544061
0.46544061      -2.14850182     -1.68306121     -0.84153060
-0.84153060      1.18831092      0.34678031      0.17339016
0.17339016      -5.76734010     -5.59394995     -2.79697497
-2.79697497      0.35752912     -2.43944585     -1.21972292
-1.21972292      0.81985832     -0.39986460     -0.19993230

It looks like the guesses bounce around all over the place. Why does this happen? Actually, square roots of negative numbers do exist, but not on the real number line. They are imaginary and live in the complex plane instead. Such a number is of the form x + y√-1, where x and y are real numbers. However, you have to take the square root of a negative number to get from real numbers to complex ones. In the formula x" = (x + (-1/x))/2, all the operations are closed over the reals: the reciprocal of a real number is a real number, the sum and the product of two real numbers are real numbers, and so forth. So you are never going to reach √-1, which is not real, by applying Newton's formula, if you start with a real number. Therefore, Newton's method cannot possibly converge to √-1, and the method fails for it. Instead, the numbers bounce around endlessly without pattern on the real number line, as frustrated in getting to √-1 as a tomcat is when he sees a female cat in heat that he can't get at because there is a chain-link fence in the way. The poor tomcat simply paces back and forth randomly, not knowing what to do, and so does Newton's method. Newton's method is a frustrated tomcat.
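You can watch the pacing in a few lines of Python; this sketch simply replays the table above for a few more steps (the starting value 2 and the step count are arbitrary).

    # Newton's method for x^2 + 1 = 0 restricted to the real line:
    # next guess = (x + (-1/x))/2.  The iterates never settle down,
    # because the roots i and -i are not real numbers.
    x = 2.0
    for step in range(1, 13):
        x = (x - 1.0 / x) / 2.0
        print(step, x)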

To get at √-1, or i, as it is usually called, you need to start with a point that is not real, such as 1/2 + (1/4)i. If you do this, then it bops around a bit, then heads straight towards i. If you choose a starting point a - bi, where b > 0, so that the point is below the x-axis, the method heads towards -i. If one calls i the sky number, and -i the underground number, then numbers that are above the real ground fly towards i, and numbers that are below the ground burrow their way towards -i. A number on the ground (a real number) bounces back and forth on the ground endlessly, unsure of whether to take off towards i or go subterranean towards -i. This suggests we make a chart to describe these motions. For each point of the complex plane, let's repeat the method over and over, counting the steps, until we get within 0.001 of either the sky number i or the underground number -i. If the number of iterations needed is odd, color the point with a light color; if it is even, color it with a dark color. Further, color points in the sky yellow and underground points red. For example, if it takes 7 steps to get to -i, color the point light red, that is, pink. The result is:

Note how all the points above the real line go to a point above it which represents i, and that points below the line go to a mirror image point, namely -i. I have included three trajectories. One of the trajectories above the real line takes several steps to get near i, and it zigzags back and forth as it does so. This is similar to the behavior of points on the real line, which jump all over the place seemingly randomly. The graph shows that there must be some instability on and near the real line, since the colored bands seem to converge on this line. In fact, the entire picture looks like a great yellow bull's-eye in the sky with its reflection on an enormous magenta sea.
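Here is one way such a picture could be generated. This is my own sketch, not the program that produced the original figure; the grid, the viewing window, the iteration cap of 60, and the exact shades are all arbitrary choices, while the 0.001 tolerance and the parity coloring follow the description above.

    import numpy as np
    import matplotlib.pyplot as plt

    # Newton's method for f(z) = z^2 + 1 on a grid of complex starting points.
    # Record how many steps each point needs to come within 0.001 of i or -i,
    # and which of the two roots it reached.
    xs = np.linspace(-2.0, 2.0, 600)
    ys = np.linspace(-2.0, 2.0, 600)
    X, Y = np.meshgrid(xs, ys)
    Z = X + 1j * Y
    steps = np.zeros(Z.shape, dtype=int)
    root = np.zeros(Z.shape, dtype=int)        # +1 for the sky number i, -1 for -i
    active = np.ones(Z.shape, dtype=bool)

    for n in range(1, 61):
        Z[active] = (Z[active] - 1.0 / Z[active]) / 2.0   # the Newton step for z^2 + 1
        near_sky = active & (np.abs(Z - 1j) < 0.001)
        near_under = active & (np.abs(Z + 1j) < 0.001)
        steps[near_sky | near_under] = n
        root[near_sky] = 1
        root[near_under] = -1
        active &= ~(near_sky | near_under)

    # Yellow for points that reach i, red for points that reach -i;
    # the parity of the step count decides light versus dark.
    img = np.zeros(Z.shape + (3,))
    light = (steps % 2 == 1)
    img[(root == 1) & light] = (1.0, 1.0, 0.5)    # light yellow
    img[(root == 1) & ~light] = (0.7, 0.7, 0.0)   # dark yellow
    img[(root == -1) & light] = (1.0, 0.6, 0.6)   # pink
    img[(root == -1) & ~light] = (0.7, 0.0, 0.0)  # dark red
    plt.imshow(img, extent=(-2, 2, -2, 2), origin='lower')
    plt.show()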

So we have created a surrealistic graphic by studying the behavior of points as we try to use Newton's method to find the square root of -1. This is the same as solving x² + 1 = 0. In fact, Newton's method is really a method for solving equations. So this gives us an idea: what if we try it on other equations? They should be equations with complex roots so that the drawing comes out on the plane. Take for example the cube roots of 1. There are three of them; they are the three solutions of x³ - 1 = 0. One of them is 1, since 1 × 1 × 1 = 1. However, there are two others, (-1 + i√3)/2 and (-1 - i√3)/2. If you plot these points on the complex plane, you get an equilateral triangle. What if you use Newton's method on this equation? One would think that one would get three nicely partitioned areas in the plane, surrounding the vertex bull's-eyes of this triangle, so that each point goes to whichever root is closest to it, as with i above. But instead we get this surprisingly beautiful picture:

Impressive and breathtaking! On the borders are eyeballs which have eyeballs on their borders, which in turn have eyeballs on their borders, and so forth; eyeballs all the way down. But why? Why do points roughly halfway between the pink and yellow areas, in one of the cyan eyeballs, go not to the pink or to the yellow root but to the cyan root? This is one of the mysteries of fractals, and it also illustrates the limits of Newton's method: briefly, the method is reliable if you are already close to a root, but if you are far from all the roots, it can be surprisingly unstable.
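The same sort of sketch as before, adapted to f(z) = z³ - 1, colors each starting point by the root it eventually reaches. Again, this is only my illustration of the idea; the grid, the iteration count, and the default color map are arbitrary, and the picture it draws will not match the original's colors.

    import numpy as np
    import matplotlib.pyplot as plt

    n = 3                                             # 3 for the cube roots; try 5 for the fifth roots
    roots = np.exp(2j * np.pi * np.arange(n) / n)     # the n-th roots of 1

    xs = np.linspace(-2.0, 2.0, 600)
    ys = np.linspace(-2.0, 2.0, 600)
    Z = xs[None, :] + 1j * ys[:, None]                # grid of starting points

    for _ in range(60):
        Z = Z - (Z**n - 1) / (n * Z**(n - 1))         # Newton step for z^n - 1

    # Color each starting point by the root its iterates ended up nearest to.
    nearest = np.argmin(np.abs(Z[..., None] - roots[None, None, :]), axis=-1)
    plt.imshow(nearest, extent=(-2, 2, -2, 2), origin='lower')
    plt.show()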

Another example: try the fifth roots of 1, that is, x⁵ - 1 = 0. Of course 1 is a root, since 1 to the fifth power is 1. The other four roots are complex: they are (-1 + √5)/4 ± i√(10 + 2√5)/4 and (-1 - √5)/4 ± i√(10 - 2√5)/4. Graphing them in the plane produces a regular pentagon. If you do Newton's method on this equation, you get an even more intricate picture:

Now there are three-colored hearts on the boundaries, which in turn have hearts on their boundaries, and so forth. I cut out one of these hearts once and used it as a valentine greeting.
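If you are reproducing these pictures with the sketch above, this one should come out of the same code with n set to 5, so that the roots array holds the five corners of the pentagon; the colors and scale will again be my arbitrary choices rather than the original's.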

All this goes to show that applying a formula to a problem where it does not really apply can produce serendipitous and beautiful patterns. The moral is that if you study mathematics, don't just apply a method to a problem or prove a theorem and leave it at that. Try changing and twisting the problem or theorem, and apply the method improperly, just to see what happens. You will usually get nonsense or nothing at all, but every once in a while, something really interesting will greet you.