Back to Chaos: Bifurcation and Predictable Unpredictability

[Figure: bifurcation diagram of the logistic map, from Wikipedia]

So I'm trying to ease back into the chaos theory posts. I thought that one good
way of doing that was to take a look at one of the classic chaos examples, which
demonstrates just how simple a chaotic system can be. It really doesn't take much
at all to push a system from being nice and smoothly predictable to being completely
crazy.

This example comes from mathematical biology, and it generates a
graph commonly known as the logistic map. The question behind
the graph is, how can I predict what the stable population of a particular species will be over time?

If there were an unlimited amount of food, and there were no predators, then it
would be pretty easy. You'd have a pretty straightforward exponential growth curve. You'd
have a constant, R, which is the growth rate. R would be determined by two factors: the
rate of reproduction, and the rate of death from old age. With that number, you could
put together a simple exponential curve - and presto, you'd have an accurate description
of the population over time.
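To make that concrete, here's a minimal sketch in Python of the unconstrained model (the starting population and the value of R here are arbitrary, hypothetical choices):

    population = 100.0
    R = 1.05   # hypothetical net growth factor per generation: reproduction minus death

    for generation in range(10):
        # each generation, the population is simply multiplied by the growth factor
        population *= R
        print(generation + 1, round(population, 1))

Run it, and the printed populations trace out a simple exponential curve.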

But reality isn't that simple. There's a finite amount of resources - that is, a finite
amount of food for your population to consume. So there's a maximum number of individuals
that could possibly survive - if you get more than that, some will die until the population
shrinks below that maximum threshold. Plus, there are factors like predators and disease,
which reduce the available population of reproducing individuals. The growth rate only
considers "How many children will be generated per member of the population?"; predators
cull the population, which effectively reduces the growth rate. But it's not a straightforward
relationship: the number of individuals that will be consumed by predators and disease is
related to the size of the population!

Modeling this reasonably well turns out to be really simple. You take the
maximum population based on resources, Pmax. You then describe the
population at any given point in time as a population ratio: a
fraction of Pmax. So if your environment could sustain one
million individuals, and the population is really 500,000, then you'd describe
the population ratio as 1/2.

Now, you can describe the population at time T with a recurrence relation:

P(t+1) = R × P(t) × (1 - P(t))

That simple equation isn't perfect, but its results are impressively close
to accurate. It's good enough to be very useful for studying population growth.
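Here's a minimal sketch of that recurrence in Python; the growth rate of 2.8 and the starting ratio of 0.5 are arbitrary choices for illustration:

    def logistic_step(r, p):
        # one generation: growth proportional to p, damped by crowding (1 - p)
        return r * p * (1 - p)

    p = 0.5
    for t in range(20):
        p = logistic_step(2.8, p)
        print(t + 1, round(p, 4))

With R = 2.8, the printed ratios settle toward a single stable value, about 0.643.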

So, what happens when you look at the behavior of that function as you
vary R? You find that below R=1, the population falls to zero. Cross
that threshold, and the population settles to a single stable value, which
rises smoothly as you increase R - roughly what you'd expect. Up until you
hit R=3. Then it splits, and you get an oscillation
between two different values. If you keep increasing R, it will split again -
your population will oscillate between 4 different values. A bit farther, and
it will split again, to eight values. And then things start getting
really wacky - because the curves converge on one another, and even
start to overlap: you've reached chaos territory. On a graph of the function,
at that point, the graph becomes a black blur, and things become almost
completely unpredictable. It looks like the beautiful diagram at the top
of this post, which I copied from Wikipedia
(http://en.wikipedia.org/wiki/Bifurcation_diagram) - it's much more detailed
than anything I could create on my own.
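If you'd like to reproduce the diagram yourself, here's a rough sketch in Python, assuming numpy and matplotlib are available:

    import numpy as np
    import matplotlib.pyplot as plt

    r_values = np.linspace(2.5, 4.0, 2000)   # one column of the diagram per R
    x = np.full(r_values.shape, 0.5)         # arbitrary starting ratio for every R

    # discard the transient, so only the long-term behavior remains
    for _ in range(1000):
        x = r_values * x * (1 - x)

    # plot the next 200 iterates: a stable point draws one dot per R, a period-2
    # oscillation draws two, and chaos smears into a black band
    for _ in range(200):
        x = r_values * x * (1 - x)
        plt.plot(r_values, x, ',k', alpha=0.25)

    plt.xlabel('growth rate R')
    plt.ylabel('long-term population ratio')
    plt.show()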

But here's where it gets really amazing.

Take a look at that graph. You can see that it looks fractal. With a graph
like that, we can look for something called a self-similarity scaling
factor. The idea of an SS-scaling factor is that we've got a system with
strong self-similarity. If we scale the graph up or down, what's the scaling
factor at which a scaled version of the graph will exactly overlap with the
un-scaled graph?

For this population curve, the SSSF turns out to be about 4.669.

What's the SSSF for the Mandelbrot set? 4.669.

In fact, the SSSF for nearly all bifurcating systems that we see,
and their related fractals, is virtually always the same number:
approximately 4.669. There's a basic
structure which underlies all systems of this sort.

What's this sort? Basically, it's a dynamical system with a
quadratic maximum. In other words, if you look at the recurrence relation for
the dynamical system, it's got a quadratic factor, and it's got a maximum
value. The equation for our population system can be written: P(t+1) =
R×P(t) - R×P(t)², which is obviously quadratic, and it will
always produce a value between zero and one, so it's got a fixed maximum
value. Pick any chaotic dynamical system with a quadratic maximum, and
you'll find this constant in it. Any dynamical system with those properties
will have a recurrence structure with a scaling factor of 4.669.
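As a quick illustration of that universality, here's a sketch using a different map with a quadratic maximum - the sine map, x → R×sin(πx). The three R values are illustrative choices that should land in its period-1, period-2, and period-4 regions:

    import math

    def attractor_size(r, step, transient=10000, samples=200, digits=5):
        # iterate past the transient, then count the distinct values in the orbit
        x = 0.5
        for _ in range(transient):
            x = step(r, x)
        seen = set()
        for _ in range(samples):
            x = step(r, x)
            seen.add(round(x, digits))
        return len(seen)

    sine_map = lambda r, x: r * math.sin(math.pi * x)
    for r in (0.7, 0.8, 0.85):
        print(r, attractor_size(r, sine_map))   # expect 1, then 2, then 4

A completely different formula, but the same period-doubling cascade - and the spacing of its doublings scales by the same 4.669.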

That number, 4.669..., is called the Feigenbaum constant, after
Mitchell Feigenbaum, who first discovered it. Most people believe
that it's a transcendental number, but no one is sure! We're not really sure
of quite where the number comes from, which makes it difficult to determine
whether or not it's really transcendental!

But it's damned useful. By knowing that a system's period-doublings are spaced
at the rate given by Feigenbaum's constant, we know when that system will
become chaotic. We don't need to continue to observe it as it scales up to
see when the system will go chaotic - we can predict when it will happen
just by virtue of the structure of the system. Feigenbaum's constant predictably
tells us when a system will become unpredictable.
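You can actually watch the constant emerge from the bifurcation points themselves. The R values below are standard published estimates of where the logistic map's period doubles (1→2, 2→4, and so on); the ratio of each gap between doublings to the next gap closes in on Feigenbaum's constant:

    # standard estimates of the logistic map's period-doubling points
    doublings = [3.0, 3.4494897, 3.5440903, 3.5644073, 3.5687594, 3.5696916]

    for i in range(len(doublings) - 2):
        gap = doublings[i + 1] - doublings[i]
        next_gap = doublings[i + 2] - doublings[i + 1]
        print(round(gap / next_gap, 4))

The printed ratios start near 4.75 and settle toward 4.6692..., and extrapolating with that ratio tells you where the doublings pile up and chaos begins (about R = 3.5699).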

Comments

I've known several software systems which would eventually degrade into chaos (semi-literally), usually due to over-architecting and higher-ups insisting that multi-threading was the only way to solve a problem. I wonder if there is an equation that literally goes chaotic to describe multi-threading issues. That would be both funny, and apt, in my view. :)

By Kyle Szklenski (not verified) on 20 Oct 2009

Oh, also, thanks for getting back to these. I love them. I did a ton of research related to the above and wrote a paper on it in school. It was fun stuff. The Feigenbaum constant is one of the most incredible things I've ever researched, and that's saying something.

Any chance you could do a post or a few on some interesting unsolved math problems, such as the 196 algorithm, or reverse-then-add algorithm?

By Kyle Szklenski (not verified) on 20 Oct 2009

Awesome post! Keep 'em coming.

My name links to an Applet that, on some browsers, allows one to generate the above graph on the fly and drill down to see its fractal structure.

I have posted this here before, but it seems appropriate to do so again. Some years ago I made an applet that draws the above diagram then lets you zoom it to see the fractal detail. Then if you really want to you can turn it into audio, although why I can't remember.
You can try it here: Fractal Sound Applet

For this population curve, the SSSF turns out to be about 4.669.

What's the SSSF for the Mandelbrot set? 4.669.

Although the Feigenbaum constant certainly does show up in many places, these two aren't the best example of this coincidence because the Mandelbrot set basically is the logistic map extended to the complex plane (technically, a conjugate of it). See Wikipedia.

OR you need to carry the R through to the second term I guess.

Thanks Mark, glad to see you back in the saddle

It's obvious if you think about it, but if you use N colors, then the stable area with N values uses each color exactly once, the area with 2N values uses each color exactly twice, etc.

We look at that and forget that the function is deterministic. In many places it's demonstrably periodic. It may even be periodic everywhere if we had an infinitely precise calculator. But in the real world all computers must eventually truncate the number(*), and that means you're moving along a slightly different track that behaves differently. That's the definition of chaos - wildly different results given minuscule changes in the initial condition.

* The problem is that floating point numbers are represented by f×2^n,
where f is between 0.5 and 1 (that is, the leading bit is always 1). If you multiply two floating point numbers, your result will have twice as many bits as f, and you have to truncate the result to the expected size. If you want to keep all of the bits, the size of the floating point number will grow exponentially and soon exhaust all memory.

Then there's the notorious problem with subtraction. Subtracting two numbers that are nearly identical causes a huge loss of precision since you have to shift the results left (so the first bit is 1) and pad everything to the right with 0. That will quickly jump you to another track.
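To see that "different track" effect directly, here's a minimal sketch: two copies of the logistic map at R = 4, with starting ratios differing by one part in a billion (the starting value of 0.4 is an arbitrary choice):

    r = 4.0
    a = 0.4          # one trajectory
    b = 0.4 + 1e-9   # a second trajectory, off by one part in a billion

    for t in range(1, 61):
        a = r * a * (1 - a)
        b = r * b * (1 - b)
        if t % 10 == 0:
            print(t, round(a, 6), round(b, 6), round(abs(a - b), 6))

The gap roughly doubles on average each generation, so by t = 60 the two trajectories are completely unrelated - exactly the sensitivity to initial conditions described above.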

I've loved this since I did Population Biology in grad school 1973-1977. My publications in the field could not yet use the word "Chaos" -- so I usually wrote about "aperiodic oscillations." Now I see that my PhD work was not only Nanotechnology and Artificial Life, but also Chaos Theory.

One of my coauthors insists that the 3 biggest scientific theories of the entire 20th century were: Quantum, Relativity, and Chaos!

Upon seeing the bifurcation diagram of the logistic map again, I began wondering whether the periodic windows have full measure or not. (They correspond to the so-called hyperbolic components of the Mandelbrot set. For instance, the window of period 3 between R=3.8 and R=3.9 corresponds to the cardioid of the largest mini-Mandelbrot.)

The periodic windows form an open dense set (see here; the corresponding question about the Mandelbrot set is still unresolved), but open dense sets don't necessarily have full measure (e.g., the complement of a fat Cantor set). And indeed, it turns out that the regular (periodic) points are known not to have full measure.

In other words, if you choose a value of R uniformly at random from the interval [3,4], there's a nonzero probability that you'll pick a truly stochastic point (which is different from having an insanely large period, although since periodic points are dense you can find one of them arbitrarily close to any given stochastic point).

There aren't very good bounds for what the actual probability is, but it's known that as you approach R=4, the stochastic points take up an ever larger proportion (approaching 1 as R approaches 4).

The question behind the graph is, how can I predict what the stable population of a particular species will be over time? ... And then things start getting really wacky - because the curves converge on one another, and even start to overlap: you've reached chaos territory.

I'm pretty sure you just disproved evolution!

It's really confusing to me that you say the SSSF is "exactly 4.669". Since this is the first time I've read about the Feigenbaum constant, I didn't already know that it is NOT exactly 4.669.

I was especially confused when you appeared to claim that most people believe that 4.669 is a transcendental number. Obviously 4.669 is rational so... I had to assume that you were rounding, but you had already said "exactly". Do you see how that is confusing?

By Christopher (not verified) on 24 Oct 2009

@ #18

The way I understand it, the interesting thing is not that the number is exactly 4.669, but that the number is exactly the same for every case, no matter if it's exactly 4.669 or 4.669201609...

My wife and our dog and I were at a barbecue at Caltech yesterday afternoon. An international bunch of undergrads were discussing weird laws and treaties that have bitten them at Customs. I said something that I often say in such conversations: "What we call 'The Law' is a chaotic attractor in the space of all possible laws." I gave examples of trajectories, i.e. how laws evolve over time due to courtroom resolutions of conflicts between laws, in a way sensitively dependent on initial conditions. One student said: "Yes, I get it!" A second student challenged: "How can you get that? You're a Biology major." The first said "I can SEE it that way." Then they went into a sub-argument over whether Biology or Physics or EE or what field was closest to Chaos Theory. I took another bite of sushi and listened, fascinated... My dog nonverbally reminded me that there were cheeseburgers in the vicinity...
