Chaos
https://scienceblogs.com/taxonomy/term/16136/feed
AWOL
https://scienceblogs.com/casaubonsbook/2011/10/25/awol
<span>AWOL</span>
<div class="field field--name-body field--type-text-with-summary field--label-hidden field--item"><p>The phone rang about 2 on Thursday afternoon, just as I was about to settle down with my book draft for a long, dull afternoon of revisions. If I was implicitly fantasizing about something to get our adrenaline pumping, I got it. Our social worker called and asked if we would consider taking a 17 month old boy with severe speech delays and special needs. Oh, we'd need to come pick him up downtown before 4:30.</p>
<p>Yikes. My first inclination was to say "no" since we've wanted to take a sibling group, but there was something about this that just felt right to both Eric and me. We had planned to close our home to placements for a month starting the previous week, due to my ASPO travel and book deadline, but my 9 year old son Simon had asked us not to. While discussing the Jewish holiday of Sukkot, a week-long festival in which we eat and sometimes sleep outdoors in a sukkah, an open hut, both as a harvest celebration and also to remind ourselves of the vulnerability of being homeless, Simon pointed out that it seemed wrong for us to close our home to foster children during a holiday in which we remember being without shelter. Arriving on the last day of the holiday, this little boy seemed like something we were waiting for. We talked it over for a minute, and said yes.</p>
<p>It would turn out that almost everything we were told about him was wrong, other than that he was male ;-). After packing baby things to take to the office to pick him up, we were called on our way to the office with the sheepish note that the worker had misread his birthdate and that M. was actually 2 1/2. We were told he had no language, just grunting and pointing, but the first word we heard as Asher (my youngest, who has been dying to be a big brother) got off the elevator was "Mine!" Most of the other details would turn out to be wrong too - but M. has turned out to be very right.</p>
<p>Life has been very crazy - besides the legal and logistical details of integrating a two year old into the family with visitation, medical appointments, etc... he arrived about 2 hours before the beginning of Simchat Torah services (one of the two biggest parties of the year at our synagogue), and so we took him with us (he had a great time), and got home only to spend the next two days preparing for a gigantic, forty-plus person party at our place, including a weekend with 16 people sharing the house, including four toddlers ;-). It was awesome, and M. has handled it all with remarkable aplomb, fitting in very gracefully to our chaotic life. When asked how he was handling the new brother, Simon rolled his eyes and said, "Look, yesterday I had a whole bunch of little brothers. Today I have a whole bunch of little brothers. It just isn't that different, Mom."</p>
<p>The most likely scenario is that M. will go home to one of his parents at some point - until late yesterday we had thought it might be quite soon, but now it looks like it may be a while. In the meantime, however, we're just enjoying him and settling into a routine. More details to come as things stabilize - or not, since of course, one week from today I leave for ASPO-USA's annual conference in DC (Eric will have the kids, with help from Grandparents while he's working). Stability? What's that?</p>
<p>I should be back blogging, though, in a day or two. Stay tuned!</p>
<p>Sharon</p>
</div>
<span><a title="View user profile." href="https://scienceblogs.com/author/sastyk" lang="" about="https://scienceblogs.com/author/sastyk" typeof="schema:Person" property="schema:name" datatype="" xml:lang="">sastyk</a></span>
<span>Tue, 10/25/2011 - 08:32</span>
Tue, 25 Oct 2011 12:32:09 +0000 · sastyk · 63761 at https://scienceblogs.com
Cities, Books, Updates.... at Least it Isn't Monday!
https://scienceblogs.com/casaubonsbook/2010/12/06/cities-books-updates
<span>Cities, Books, Updates.... at Least it Isn't Monday!</span>
<div class="field field--name-body field--type-text-with-summary field--label-hidden field--item"><p>I kind of wandered off on y'all - we just spent four days visiting and celebrating Thanksanukah ;-) with biological and chosen family. We didn't go to my parents' place for Thanksgiving this year, so we headed out and ate turkey and latkes together, spent four glorious days goofing off, and are home refreshed.</p>
<p>Of course, in the meantime I realized I was *supposed* to have written my Anyway Project Update before I left (it was kind of like packing my toothbrush - somehow some things just get left behind - and there's no "Anyway Project Update Store" in Beverly MA, the way there are drugstores that will sell you a toothbrush). And today I'm supposed to be doing the first post in my new Urbanization series, with <a href="http://scienceblogs.com/thepumphandle/2010/12/cities_at_their_best_and_worst.php#more">Liz Borkowski at the Pump Handle</a>, and also the first post of my Post-Apocalyptic Novel reading series, on Jim Kunstler's _The Witch of Hebron_. </p>
<p>But there are some hitches. Like the snow that screwed up Eli's bus transport and messed up my morning schedule, and the fact that Eli doesn't sleep much when he gets excited (and he was very excited to see everyone) so we're sleep deprived and chose sleep over early morning diligence at the computer. And the fact that I've been reading Percy Jackson books with Simon obsessively and I haven't quite finished _The Witch of Hebron_ (I will by tonight, though). Oh, and the fact that it is the end of the term for Eric and he's leaving for 12 hours in about 40 minutes, leaving me with the kids. And the fact that last week the temps were in the 50s, and today they are 21 degrees, and I haven't even sealed up the windows... Oh, and the 109 emails in my box when I got back this morning after 5 days away from the computer. And that's after I got rid of the spam.</p>
<p>All of which means that I'm officially cancelling Monday, and moving it to tomorrow, so that I'm not so far behind. I realize yesterday was Sunday, and that already happened, but today cannot be Monday. All of these things will start on Monday, but today is absolutely, positively not Monday, because I'm just plain not ready for it. Today, then, is one of those rare interstitial days, "Not-Monday, December 6." All of you who, like me, are a little under-prepared for today now get an extra day! I will be putting up a short piece on urbanization, which will make me look cool and awesome - here I am, putting things up ahead of time - how does she do it? ;-).</p>
<p>Happy Not-Monday, folks!</p>
<p>Sharon</p>
</div>
<span><a title="View user profile." href="https://scienceblogs.com/author/sastyk" lang="" about="https://scienceblogs.com/author/sastyk" typeof="schema:Person" property="schema:name" datatype="" xml:lang="">sastyk</a></span>
<span>Mon, 12/06/2010 - 03:43</span>
Mon, 06 Dec 2010 08:43:27 +0000 · sastyk · 63545 at https://scienceblogs.com
The End of Defining Chaos: Mixing it all together
https://scienceblogs.com/goodmath/2010/02/07/the-end-of-defining-chaos-mixi
<span>The End of Defining Chaos: Mixing it all together</span>
<div class="field field--name-body field--type-text-with-summary field--label-hidden field--item"><p> The last major property of a chaotic system is topological mixing. You can<br />
think of mixing as being, in some sense, the opposite of the dense periodic<br />
orbits property. Intuitively, the dense orbits tell you that things that are<br />
arbitrarily close together for arbitrarily long periods of time can have<br />
vastly different behaviors. Mixing means that things that are arbitrarily far<br />
apart will eventually wind up looking nearly the same - if only for a little<br />
while.</p>
<p> Let's start with a formal definition.</p>
<p> As you can guess from the name, topological mixing is a property defined<br />
using topology. In topology, we generally define things in terms of <em>open sets</em><br />
and <em>neighborhoods</em>. I don't want to go too deep into detail - but an<br />
open set captures the notion of a collection of points with a well-defined boundary<br />
that is <em>not</em> part of the set. So, for example, in a simple 2-dimensional<br />
Euclidean space, the contents of a circle are one kind of open set; the boundary is<br />
the circle itself. </p>
<p> Now, imagine that you've got a dynamical system whose phase space is<br />
defined as a topological space. The system is defined by a recurrence<br />
relation: s<sub>n+1</sub> = f(s<sub>n</sub>). Now, suppose that in this<br />
dynamical system, we can expand the state function so that it works as a<br />
continuous map over sets. So if we have an open set of points A, then we can<br />
talk about the set of points that that open set will be mapped to by f. Speaking<br />
informally, we can say that if B=f(A), B is the space of points that could be mapped<br />
to by points in A.</p>
<p> The phase space is topologically mixing if, for any two open sets A<br />
and B, there is <em>some</em> integer N such that f<sup>N</sup>(A) ∩ B ≠ ∅. That is, no matter where you start,<br />
no matter how far away you are from some other point, <em>eventually</em>,<br />
you'll wind up arbitrarily close to that other point. <em>(Note: I originally left out the quantification of N.)</em></p>
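<p>To make that concrete, here's a small numerical sketch (my own, not from the post): take the logistic map f(x) = 4x(1−x), a standard chaotic map on [0, 1], sample a tiny open interval A, and iterate until its image intersects another tiny interval B. The specific intervals, sample count, and iteration cap are arbitrary choices for illustration.</p>

```python
# Numerical illustration of topological mixing using the logistic map
# f(x) = 4x(1-x). We approximate an open interval A by dense sampling
# and look for the first N with f^N(A) ∩ B nonempty.
def f(x):
    return 4.0 * x * (1.0 - x)

def first_n_reaching(a_lo, a_hi, b_lo, b_hi, samples=2000, max_iter=100):
    """Smallest N such that some sampled point of A lands in B after N steps."""
    pts = [a_lo + (a_hi - a_lo) * i / (samples - 1) for i in range(samples)]
    for n in range(1, max_iter + 1):
        pts = [f(x) for x in pts]          # advance every sample one step
        if any(b_lo <= x <= b_hi for x in pts):
            return n
    return None

# A tiny interval near 0.1 eventually spreads enough to hit
# a tiny interval near 0.8.
N = first_n_reaching(0.1, 0.1001, 0.8, 0.8001)
print(N)
```

Because the map stretches and folds the interval on every step, even an interval of width 10<sup>−4</sup> smears across essentially all of [0, 1] within a few dozen iterations, so it hits B.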
<p> Now, let's put that together with the other basic properties of<br />
a chaotic system. In informal terms, what it means is:</p>
<ol><li> Exactly where you start has a huge impact on where you'll end up.</li>
<li> No matter how close together two points are, no matter how long their<br />
trajectories are close together, at any time, they <em>can</em><br />
suddenly go in completely different directions.</li>
<li> No matter how far apart two points are, no matter how long<br />
their trajectories stay far apart, eventually, they'll<br />
wind up in almost the same place.</li>
</ol><p> All of this is a fancy and complicated way of saying that in a chaotic<br />
system, you never know what the heck is going to happen. No matter how long<br />
the system's behavior appears to be perfectly stable and predictable, there's<br />
absolutely no guarantee that the behavior is actually in a periodic orbit. It<br />
could, at any time, diverge into something totally unpredictable.</p>
<p> Anyway - I've spent more than enough time on the definition; I think I've<br />
pretty well driven this into the ground. But I hope that in doing so, I've<br />
gotten across the degree of unpredictability of a chaotic system. There's a<br />
reason that chaotic systems are considered to be a nightmare for numerical<br />
analysis of dynamical systems. It means that the most miniscule errors<br />
in any aspect of anything will produce drastic divergence. </p>
<p> So when you build a model of a chaotic system, you know that it's going to<br />
break down. No matter how careful you are, even if you had impossibly perfect measurements,<br />
just the nature of numerical computation - the limited precision and roundoff<br />
errors of numerical representations - mean that your model is going to break.</p>
<p> From here, I'm going to move from defining things to analyzing things. Chaotic<br />
systems are a nightmare for modeling. But there are ways of recognizing when<br />
a system's behavior is going to become chaotic. What I'm going to do next is look<br />
at how we can describe and analyze systems in order to recognize and predict<br />
when they'll become chaotic.</p>
</div>
<span><a title="View user profile." href="https://scienceblogs.com/goodmath" lang="" about="https://scienceblogs.com/goodmath" typeof="schema:Person" property="schema:name" datatype="" xml:lang="">goodmath</a></span>
<span>Sun, 02/07/2010 - 13:35</span>
Sun, 07 Feb 2010 18:35:49 +0000 · goodmath · 92788 at https://scienceblogs.com
More about Dense Periodic Orbits
https://scienceblogs.com/goodmath/2010/01/26/more-about-dense-periodic-orbi
<span>More about Dense Periodic Orbits</span>
<div class="field field--name-body field--type-text-with-summary field--label-hidden field--item"><p> It's been quite a while since my last chaos theory post. I've<br />
been caught up in other things, and I've needed to do some studying. Based<br />
on a recommendation from a commenter, I've gotten another book on Chaos<br />
theory, and it's frankly vastly better than the two I was using before.</p>
<p> Anyway, I want to first return to dense periodic orbits in chaotic<br />
systems, which is what I discussed in <a href="http://scienceblogs.com/goodmath/2009/11/orbits_periodic_orbits_and_den.php">the previous chaos theory<br />
post</a>. There's a glaring hole in that post. I didn't so much get it<br /><em>wrong</em> as I did miss the fundamental point. </p>
<p> If you recall, the basic definition of a chaotic system is<br />
a dynamic system with a specific set of properties:</p>
<ol><li> Sensitivity to initial conditions,</li>
<li> Dense periodic orbits, and</li>
<li> Topological mixing</li>
</ol><p> The property that we want to focus on right now is the<br />
dense periodic orbits.</p>
<!--more--><p> In a dynamical system, an <em>orbit</em> isn't what we typically think of<br />
as orbits. If you look at all of the paths through the phase space of a<br />
system, you can divide it into partitions. If the system enters a state in any<br />
partition, then every state that it ever goes through will be part of the same<br />
partition. Each of those partitions is called an <em>orbit</em>. What<br />
makes this so different from our intuitive notion of orbits is that<br />
the intuitive orbit <em>repeats</em>. In a dynamical system, an<br />
orbit is just a set of points, paths through the phase space of<br />
the system. It may never do anything remotely close to repeating - but it's<br />
an orbit. For example, if I describe a system which is the state<br />
of an object floating down a river, the path that it takes is<br />
an orbit. But it obviously can't repeat - the object isn't going to<br />
go back up to the beginning of the river.</p>
<p> An orbit that repeats is called a <em>periodic orbit</em>. So<br />
our intuitive notion of orbits is really about <em>periodic</em><br />
orbits.</p>
<p> Periodic orbits are tightly connected to chaotic systems.<br />
In a chaotic system, one of the basic properties is a particular<br />
kind of unpredictability. Sensitivity to initial conditions<br />
is what most people think of - but the orbital property is<br />
actually more interesting.</p>
<p> A chaotic system has <em>dense periodic orbits</em>. Now, what<br />
does that mean? I explained it once before, but I managed to<br />
miss one of the most interesting bits of it.</p>
<p> The points of a chaotic system are <em>dense</em> around<br />
the periodic orbits. In mathematical terms, that means that<br />
every point in the attractor for the chaotic system is<br /><em>arbitrarily</em> close to some point on a periodic orbit. Pick<br />
a point in the chaotic attractor, and pick a distance greater than zero.<br />
No matter how small that distance is, there's a periodic orbit<br />
within that distance of the point in the attractor.</p>
<p> The last property of the chaotic system - the one which makes<br />
the dense periodic orbits so interesting - is topological mixing. I'm<br />
not going to go into detail about it here - that's for the next post. But<br />
what happens when you combine topological mixing with the density<br />
around the periodic orbits is that you get an amazing kind of<br />
unpredictability.</p>
<p> You can find stable states of the system, where everything<br />
just cycles through an orbit. And you can find an instance of<br />
the system that <em>appears</em> to be in that stable state. But<br />
in fact, virtually <em>all</em> of the time, you'll be wrong. The<br />
most miniscule deviation, any unmeasurably small difference between<br />
the theoretical stable state and the actual state of the system - and<br />
at some point, your behavior will diverge. You could stay close to the<br />
stable state for a very long time - and then, whammo! the system will<br />
do something that appears to be completely insane.</p>
<p> What the density around periodic orbits means is that<br />
even though <em>most</em> of the points in the phase space aren't<br />
part of periodic orbits, you can't possibly distinguish them<br />
from the ones that are. A point that appears to be stable<br /><em>probably</em> isn't. And the difference between real stability<br />
and apparent stability is unmeasurably, indistinguishably small.<br />
It's not just the <em>initial</em> conditions of the system<br />
that are sensitive. The entire system is sensitive. Even if you<br />
managed to get it into a stable state, the slightest perturbation,<br />
the tiniest change, could cause a drastic change at some unpredictable<br />
time in the future.</p>
<p> This is the real butterfly effect. A butterfly flaps its wings -<br />
and the tiny movement of air caused by that pushes the weather system<br />
that tiny bit off of a stable orbit, and winds up causing the<br />
diversion that leads to a hurricane. The tiniest change at any<br />
time can completely blow up.</p>
<p> It also gives us a handle on another property of chaotic systems<br />
as models of real phenomena: we can't reverse them. Knowing the<br />
measured state of a chaotic system, we <em>cannot</em> tell how it<br />
got there. Even if it appears to be in a stable state, if it's part<br />
of a chaotic system, it could have just "swung in" the chaotic<br />
state from something very different. Or it could have been in what<br />
appeared to be a stable state for a long time, and then suddenly<br />
diverge. Density effectively means that we can't distinguish<br />
the stable case from either of the two chaotic cases. </p>
</div>
<span><a title="View user profile." href="https://scienceblogs.com/goodmath" lang="" about="https://scienceblogs.com/goodmath" typeof="schema:Person" property="schema:name" datatype="" xml:lang="">goodmath</a></span>
<span>Tue, 01/26/2010 - 08:29</span>
Tue, 26 Jan 2010 13:29:06 +0000 · goodmath · 92785 at https://scienceblogs.com
Orbits, Periodic Orbits, and Dense Orbits - Oh My!
https://scienceblogs.com/goodmath/2009/11/05/orbits-periodic-orbits-and-den
<span>Orbits, Periodic Orbits, and Dense Orbits - Oh My!</span>
<div class="field field--name-body field--type-text-with-summary field--label-hidden field--item"><p> Another one of the fundamental properties of a chaotic system is<br /><em>dense periodic orbits</em>. It's a bit of an odd one: a chaotic<br />
system doesn't have to have periodic orbits <em>at all</em>. But if it<br />
does, then they have to be dense.</p>
<p> The dense periodic orbit rule is, in many ways, very similar to the<br />
sensitivity to initial conditions. But personally, I find it a rather more<br />
interesting way of describing the key concept. The idea is, when you've got a<br />
dense periodic orbit, it's an odd thing. It's a repeating system, which will<br />
cycle through the same behavior, over and over again. But when you look at a<br />
state of the system, you can't tell which fixed path it's on. In fact,<br />
miniscule differences in the position, differences so small that you can't<br />
measure them, can put you onto dramatically different paths. There's<br />
the similarity with the initial conditions rule: you've got the same<br />
basic idea of tiny changes producing dramatic results.</p>
<!--more--><p> In order to understand this, we need to step back, and look at some<br />
basics: what's an orbit? What's a periodic orbit? And what are dense<br />
orbits?</p>
<p> To begin with, what's an orbit?</p>
<p> If you've got a dynamical system, you can usually identify certain<br />
patterns in it. In fact, you can (at least in theory) take its<br />
phase space and partition it into a collection of sub-spaces which<br />
have the property that if at any point in time, the system is in a<br />
state in one partition, it will <em>never</em> enter a state in any<br />
other partition. Those partitions are called <em>orbits</em>. </p>
<p> Looking at that naively, with the background that most of us have<br />
associated with the word "orbit", you're probably thinking of orbits as being<br />
something very much like planetary orbits. And that's not entirely a bad<br />
connection to make: planetary orbits <em>are</em> orbits in the<br />
dynamical system sense. But an orbit in a dynamical system is more like the<br /><em>real</em> orbits that the planets follow than like the idealized ellipses<br />
that we usually think of. Planets don't really travel around the sun in smooth<br />
elliptical paths - they wobble. They're pulled a little bit this way, a little<br />
bit that way by their own moons, and by other bodies also orbiting the<br />
sun. In a complex gravitational system like the solar system, the orbits<br />
are complex paths. They might never repeat - but they're still orbits: a state<br />
where Jupiter orbited 25% closer to the sun than it does now<br />
would never be on an orbital path that intersects with the current state of<br />
the solar system. The intuitive notion of "orbit" is closer to what<br />
dynamical systems call a <em>periodic</em> orbit: that is, an orbit that<br />
repeats its path.</p>
<p> A <em>periodic orbit</em> is an orbit that repeats over time. That is,<br />
if the system is described as a function f(t), then a periodic orbit is<br />
a set of points Q where ∃Δt : ∀q∈Q: if f(t)=q,<br />
then f(t+Δt)=q. </p>
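<p>As an illustration (mine, not the post's), we can hunt for a periodic orbit numerically. The logistic map x<sub>n+1</sub> = k·x<sub>n</sub>·(1−x<sub>n</sub>) at k = 3.2 has an attracting orbit of period 2, so after transients die away the state satisfies f(t+Δt) = q with Δt = 2:</p>

```python
# Detecting a periodic orbit of the logistic map numerically: discard
# transients, then find the smallest Δt with |x(t+Δt) - x(t)| below tolerance.
def logistic(k, x):
    return k * x * (1.0 - x)

def find_period(k, x0=0.5, transient=1000, max_period=64, tol=1e-9):
    """Smallest Δt returning the settled orbit to itself, or None."""
    x = x0
    for _ in range(transient):      # let the system settle onto its attractor
        x = logistic(k, x)
    orbit = [x]
    for _ in range(max_period):
        x = logistic(k, x)
        orbit.append(x)
    for p in range(1, max_period + 1):
        if abs(orbit[p] - orbit[0]) < tol:
            return p
    return None

print(find_period(3.2))   # attracting period-2 orbit at k = 3.2
```

The same function reports period 1 at k = 2.5 (a stable fixed point) and period 4 at k = 3.5, tracing the period-doubling route toward chaos.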
<p><img src="http://scienceblogs.com/goodmath/wp-content/blogs.dir/476/files/2012/04/i-6ce1bad8831b28eef37ae750c1f29a21-pendulum.tiff" alt="i-6ce1bad8831b28eef37ae750c1f29a21-pendulum.tiff" /></p>
<p> Lots of non-chaotic things have periodic orbits. A really simple<br />
dynamical system with a periodic orbit is a pendulum. It's got a period,<br />
and it loops round and round through a fixed cycle of states from its<br />
phase space. You can see it as something very much like a planetary orbit,<br />
as shown in the figure to the right.</p>
<p> On the other hand, in general, the real orbits of the planets in the solar<br />
system are <em>not</em> periodic. The solar system never passes through<br /><em>exactly</em> the same state twice. There's no point in time at which<br />
everything will be exactly the same.</p>
<p> But the solar system (and, I think, most chaotic systems) are, if not<br />
periodic, then <em>nearly</em> periodic. The exact same state will never occur<br />
twice - but it will come arbitrarily close. You have a system of orbits that<br />
look almost periodic.</p>
<p> But then you get to the <em>density</em> issues. A dynamical<br />
system with <em>dense</em> orbits is one where you have lots of different<br />
orbits which are all closely tangled up. Making even the tiniest change<br />
in the state of the system will shift the system into an entirely different orbit,<br />
one which may be dramatically different.</p>
<p> Again, think of a pendulum. In a typical pendulum, if you give the pendulum<br />
a little nudge, you've changed its swing: you either increased or decreased the amplitude<br />
of its swing. If it were an ideal pendulum, your tiny nudge would <em>permanently</em><br />
change the orbit. Even the tiniest perturbation of it will create a permanent<br />
change. But it's not a particularly <em>dramatic</em> change.</p>
<p> On the other hand, think of a system of planetary orbits. Give one of the planets<br />
a nudge. It might do almost nothing. Or it might result in a total breakdown<br />
of the stability of the system. There's a very small difference between a path<br />
where a satellite is captured into gravitational orbit around a large body, and<br />
a path where the satellite is ejected in a slingshot.</p>
<p> Or for another example, think of a <em>damped driven pendulum</em>. That's one<br />
of the classic examples of a chaotic system. It's a pendulum that has some force that acts to reduce the swing when it gets too high; and it's got another force that ensures that it keeps swinging. Under the right conditions, you can get very unpredictable behavior. The damped driven pendulum produces a set of orbits that really demonstrate this, as shown to the right. Tiny changes in the state of the pendulum put you in different parts of the phase space very quickly.</p>
<div style="align: right;"><a href="http://scienceblogs.com/goodmath/wp-content/blogs.dir/476/files/2012/04/i-1c138ef7b09aef28e905fd760b9c5237-Damped driven chaotic pendulum - double period behavior.png"><img src="http://scienceblogs.com/goodmath/wp-content/blogs.dir/476/files/2012/04/i-11bba9ebce2c9f2542f2ef41cf78b657-Damped driven chaotic pendulum - double period behavior-thumb-130x85-21821.png" alt="i-11bba9ebce2c9f2542f2ef41cf78b657-Damped driven chaotic pendulum - double period behavior-thumb-130x85-21821.png" /></a></div>
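<p>Here's a rough numerical sketch of that sensitivity (my own code, with parameter values from the standard textbook treatment of this system, not from the post): integrate θ'' + θ'/q + sin θ = g·cos(ω<sub>D</sub>t) with q = 2, g = 1.5, ω<sub>D</sub> = 2/3, a commonly cited chaotic regime, from two starts differing by 10<sup>−6</sup> in angle:</p>

```python
# Damped driven pendulum, integrated with RK4. Two nearby starts are
# evolved side by side and their phase-space separation is recorded.
import math

Q, G, OMEGA = 2.0, 1.5, 2.0 / 3.0   # damping, drive amplitude, drive frequency

def deriv(t, theta, omega):
    """(dθ/dt, dω/dt) for θ'' + θ'/Q + sin θ = G·cos(OMEGA·t)."""
    return omega, -omega / Q - math.sin(theta) + G * math.cos(OMEGA * t)

def rk4_step(t, theta, omega, dt):
    k1t, k1w = deriv(t, theta, omega)
    k2t, k2w = deriv(t + dt/2, theta + dt/2*k1t, omega + dt/2*k1w)
    k3t, k3w = deriv(t + dt/2, theta + dt/2*k2t, omega + dt/2*k2w)
    k4t, k4w = deriv(t + dt, theta + dt*k3t, omega + dt*k3w)
    return (theta + dt/6*(k1t + 2*k2t + 2*k3t + k4t),
            omega + dt/6*(k1w + 2*k2w + 2*k3w + k4w))

def separation_history(d0=1e-6, dt=0.01, steps=30000):
    """Separation between two trajectories whose starting angles differ by d0."""
    t, th1, w1, th2, w2 = 0.0, 0.2, 0.0, 0.2 + d0, 0.0
    seps = []
    for _ in range(steps):
        th1, w1 = rk4_step(t, th1, w1, dt)
        th2, w2 = rk4_step(t, th2, w2, dt)
        t += dt
        seps.append(math.hypot(th1 - th2, w1 - w2))
    return seps

seps = separation_history()
print(max(seps))
```

Over a few hundred drive periods the 10<sup>−6</sup> gap is amplified by many orders of magnitude, which is the practical face of the dense, tangled orbits described above.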
<p> In terms of Chaos, you can think of the orbits in terms of an attractor.<br />
Remember, an attractor is a black hole in the phase space of a system, which<br />
is surrounded by a basin. Within the basin, you're basically trapped in a<br />
system of periodic orbits. You'll circle around the attractor forever, unable<br />
to escape, inevitably trapped in a system of periodic or <em>nearly</em> periodic orbits.<br />
But even the tiniest change can push you into an entirely different<br />
orbit, because the orbits are densely tangled up around the attractor.</p>
</div>
<span><a title="View user profile." href="https://scienceblogs.com/goodmath" lang="" about="https://scienceblogs.com/goodmath" typeof="schema:Person" property="schema:name" datatype="" xml:lang="">goodmath</a></span>
<span>Thu, 11/05/2009 - 06:16</span>
Thu, 05 Nov 2009 11:16:18 +0000 · goodmath · 92763 at https://scienceblogs.com
Chaos and Initial Conditions
https://scienceblogs.com/goodmath/2009/10/26/chaos-and-initial-conditions
<span>Chaos and Initial Conditions</span>
<div class="field field--name-body field--type-text-with-summary field--label-hidden field--item"><p> One thing that I wanted to do when writing about Chaos is take<br />
a bit of time to really home in on each of the basic properties of<br />
chaos, and take a more detailed look at what they mean.</p>
<p> To refresh your memory, for a dynamical system to be chaotic, it needs<br />
to have three basic properties:</p>
<ol><li> Sensitivity to initial conditions,</li>
<li> Dense periodic orbits, and</li>
<li> Topological mixing</li>
</ol><p> The phrase "sensitivity to initial conditions" is actually a fairly poor<br />
description of what we really want to say about chaotic systems. Lots of<br />
things are sensitive to initial conditions, but are definitely not<br />
chaotic.</p>
<p> Before I get into it, I want to explain why I'm obsessing<br />
over this condition. It is, in many ways, the <em>least</em> important<br />
condition of chaos! But here I am obsessing over it.</p>
<p> As I said in the first post in the series, it's the most widely known<br />
property of chaos. But I <em>hate</em> the way that it's usually<br />
described. It's just <em>wrong</em>. What chaos means by sensitivity<br />
to initial conditions is really quite different from the more general<br />
concept of sensitivity to initial conditions.</p>
<!--more--><p> To illustrate, I need to get a bit formal, and really<br />
define "sensitivity to initial conditions".</p>
<p> To start, we've got a dynamical system, which we'll call <em>f</em>.<br />
To give us a way of talking about "differences", we'll establish a<br /><em>measure</em> on f. Without going into full detail, a measure is<br />
a function M(x) which maps each point x in the phase space of f to a<br />
real number, and which has the property that points that are close together<br />
in f have measure values which are close together.</p>
<p> Given two points x and y in the phase space of f, the <em>distance</em><br />
between those points is the absolute value of the difference of<br />
their measures, |M(x) - M(y)|. </p>
<p> So, we've got our dynamical system, with a measure over it<br />
for defining distances. One more bit of notation, and we'll<br />
be ready to get to the important part. When we start our<br />
system f at an initial point x, we'll write it f<sub>x</sub>.</p>
<p> What sensitivity to initial conditions means is that no<br />
matter how close together two initial points x and y are,<br />
if you run the system for long enough starting at each point,<br />
the results will be separated by as large a value as you want. Phrased<br />
informally, that's actually confusing; but when you formalize it, it<br />
actually gets simpler to understand:</p>
<p> Take the system f with measure M. Then f is sensitive to<br />
initial conditions if and only if:</p>
<ul><li> ∀ ε > 0, ∀ x,y : |M(x)-M(y)| < ε.<br /><em>(For any two points x and y that are arbitrarily close together)</em></li>
<li> Let diff(t) = |M(f<sub>x</sub>(t)) - M(f<sub>y</sub>(t))|.<br /><em> (Let diff(t) be the distance between f<sub>x</sub> and f<sub>y</sub><br />
at time t)</em></li>
<li> ∀G, ∃T : diff(T) > G. <em>(No matter what value you<br />
choose for G, at some point in time T, diff(T) will be larger than G.)</em></li>
</ul><p> Now - reading that, a naive understanding would be that the diff(T)<br />
increases <em>monotonically</em> as T increases - that is, that for any two<br />
values t<sub>i</sub> and t<sub>j</sub>, if t<sub>i</sub> > t<sub>j</sub> then<br />
diff(t<sub>i</sub>) > diff(t<sub>j</sub>). And in fact, in many of the new-age<br />
explanations of chaos, that's exactly what they assume. But that's<br /><em>not</em> the case. In fact, monotonically increasing systems aren't<br />
chaotic. (There's that pesky "periodic orbits" requirement.) What makes<br />
chaotic systems interesting is that the differences between different starting<br />
points <em>don't</em> increase monotonically.</p>
<p> To get an idea of the difference, just compare two simple quadratic<br />
recurrence-based systems. For our chaotic system, we'll use the logistic map: f(t) =<br />
k×f(t-1)(1-f(t-1)) with measure M(f(t)) = 1/f(t). And for our non-chaotic<br />
system, we'll use g(t) = g(t-1)<sup>2</sup>, with M(g(t)) = g(t).<br />
Think about arbitrarily small differences in starting values. In the<br />
squaring system g, even if you start off with a miniscule difference -<br />
starting at v<sub>0</sub>=1.00001 and v<sub>1</sub>=1.00002 - you'll<br />
get a divergence. They'll start off very close together - after 10 steps,<br />
they only differ by about 0.01. But they rapidly start to diverge. After 15<br />
steps, they differ by about 0.5. By 16 steps, they differ by about 1.8;<br />
by 20 steps, they differ by about 1.2×10<sup>9</sup>! That's<br />
clearly a huge sensitivity to initial conditions - an initial difference<br />
of 1×10<sup>-5</sup>, and in just 20 steps, their difference is measured<br />
in <em>billions</em>. Pick any arbitrarily large number that you want, and<br />
if you scan far enough out, you'll get a difference bigger than it. But<br />
there's nothing <em>chaotic</em> about it - it's just an incredibly<br />
rapidly growing curve!</p>
<p> In contrast, the logistic curve is amazing. Look far enough out, and you<br />
can find a point in time where the difference in measure between starting at<br />
0.00001 and 0.00002 is as large as you could possibly want; but <em>also</em>,<br />
look far enough out past that divergence point, and you'll find a point in<br />
time where the difference is as <em>small</em> as you could possibly want!<br />
The measure values of systems starting at x and y will sometimes be close together, and sometimes<br />
far apart. They'll continually vary - sometimes getting closer together,<br />
sometimes getting farther apart. At some point in time, they'll be arbitrarily<br />
far apart. At other times, <em>they'll be arbitrarily close together</em>.</p>
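You can watch this happen with a small numerical sketch (mine, not from the post; the post doesn't fix a value of k, so k = 4, a value well inside the chaotic regime, is my choice):

```python
# Two logistic-map trajectories started 1e-5 apart. The gap between
# them never settles: it repeatedly collapses toward zero and then
# swings back up again.
def logistic_traj(x0, k, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(k * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_traj(0.00001, 4.0, 2000)
b = logistic_traj(0.00002, 4.0, 2000)
# Look only at the gaps after the trajectories have decorrelated.
late_gaps = [abs(x - y) for x, y in zip(a, b)][100:]
print("max gap:", max(late_gaps))
print("min gap:", min(late_gaps))
```

The maximum gap is on the order of the whole interval, while the minimum is tiny - the trajectories keep drifting apart and then nearly rejoining.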
<p> That's a major hallmark of chaos. It's <em>not just</em> that given<br />
arbitrarily close together starting points, they'll eventually be far apart.<br />
That's not chaotic. It's that they'll be far apart at some times, and close<br />
together at other times.</p>
<p> Chaos encompasses the so-called butterfly effect: a butterfly flapping its<br />
wings in the Amazon could cause an ice age a thousand years later. But it also<br />
encompasses the sterile elephant effect: a herd of a million rampaging giant<br />
elephants crushing a forest could end up having virtually no effect at all<br />
a thousand years later.</p>
<p>That's the fascination of chaotic systems. They're completely<br />
deterministic, and yet completely unpredictable. What makes them<br />
so amazing is how they're a combination of incredible simplicity<br />
and incredible complexity. How many systems can you think of that are<br />
really much simpler to define than the logistic map? But how many have<br />
outcomes that are harder to predict?</p>
</div>
<span><a title="View user profile." href="https://scienceblogs.com/goodmath" lang="" about="https://scienceblogs.com/goodmath" typeof="schema:Person" property="schema:name" datatype="" xml:lang="">goodmath</a></span>
<span>Mon, 10/26/2009 - 17:07</span>
https://scienceblogs.com/goodmath/2009/10/20/back-to-chaos-bifurcation-and
<span>Back to Chaos: Bifurcation and Predictable Unpredictability</span>
<div class="field field--name-body field--type-text-with-summary field--label-hidden field--item"><div style="align: right;" class="inset right"><img src="http://scienceblogs.com/goodmath/wp-content/blogs.dir/476/files/2012/04/i-6c995763d034c5fe55849fd085e0cb9d-800px-LogisticMap_BifurcationDiagram.png" alt="i-6c995763d034c5fe55849fd085e0cb9d-800px-LogisticMap_BifurcationDiagram.png" /></div>
<p> So I'm trying to ease back into the chaos theory posts. I thought that one good<br />
way of doing that was to take a look at one of the classic chaos examples, which<br />
demonstrates just how simple a chaotic system can be. It really doesn't take much<br />
at all to push a system from being nice and smoothly predictable to being completely<br />
crazy.</p>
<p> This example comes from mathematical biology, and it generates a<br />
graph commonly known as <em>the logistic map</em>. The question behind<br />
the graph is, how can I predict what the stable population of a particular species will be over time?</p>
<!--more--><p> If there were an unlimited amount of food, and there were no predators, then it<br />
would be pretty easy. You'd have a pretty straightforward exponential growth curve. You'd<br />
have a constant, R, which is the growth rate. R would be determined by two factors: the<br />
rate of reproduction, and the rate of death from old age. With that number, you could<br />
put together a simple exponential curve - and presto, you'd have an accurate description<br />
of the population over time.</p>
<p> But reality isn't that simple. There's a finite amount of resources - that is, a finite<br />
amount of food for your population to consume. So there's a maximum number of individuals<br />
that could possibly survive - if you get more than that, some will die until the population<br />
shrinks below that maximum threshold. Plus, there are factors like predators and disease,<br />
which reduce the available population of reproducing individuals. The growth rate only<br />
considers "How many children will be generated per member of the population?"; predators<br />
cull the population, which effectively reduces the growth rate. But it's not a straightforward<br />
relationship: the number of individuals that will be consumed by predators and disease is<br />
related to the size of the population!</p>
<p> Modeling this reasonably well turns out to be really simple. You take the<br />
maximum population based on resources, P<sub>max</sub>. You then describe the<br />
population at any given point in time as a <em>population ratio</em>: a<br /><em>fraction</em> of P<sub>max</sub>. So if your environment could sustain one<br />
million individuals, and the population is really 500,000, then you'd describe<br />
the population ratio as 1/2. </p>
<p> Now, you can describe the population at time t with a recurrence relation:</p>
<p> P(t+1)= R × P(t) × (1-P(t))</p>
<p> That simple equation isn't perfect, but its results are impressively close<br />
to accurate. It's good enough to be very useful for studying population growth.</p>
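To see how the recurrence settles, here's a small sketch (mine; the R values are illustrative choices, not from the post):

```python
# Iterate P(t+1) = R * P(t) * (1 - P(t)) and look at the last few
# values. Below R = 3 the population ratio settles to the fixed point
# 1 - 1/R; a bit past 3 it oscillates between two values.
def iterate_pop(R, p0=0.1, steps=1000, keep=4):
    p = p0
    tail = []
    for i in range(steps):
        p = R * p * (1.0 - p)
        if i >= steps - keep:
            tail.append(p)
    return tail

print(iterate_pop(2.5))  # values all near 0.6, the fixed point 1 - 1/R
print(iterate_pop(3.2))  # alternates between two values
```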
<p> So, what happens when you look at the behavior of that function as you<br />
vary R? You find that below a certain threshold value (R=1), the population falls to zero. Cross<br />
that threshold, and you get a nice increasing curve, which is roughly what<br />
you'd expect. Up until you hit R=3. Then it splits, and you get an oscillation<br />
between two different values. If you keep increasing R, it will split again -<br />
your population will oscillate between 4 different values. A bit farther, and<br />
it will split again, to eight values. And then things start getting<br /><em>really</em> wacky - because the curves converge on one another, and even<br />
start to overlap: you've reached chaos territory. On a graph of the function,<br />
at that point, the graph becomes a black blur, and things become almost<br />
completely unpredictable. It looks like the beautiful diagram at the top<br />
of this post that I <a href="http://en.wikipedia.org/wiki/Bifurcation_diagram">copied from<br />
wikipedia</a> (it's much more detailed than anything I could create on my<br />
own).</p>
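The diagram itself is straightforward to generate: for each value of R, run the recurrence past its transient and record the values the orbit visits. A sketch (my own) that collects the points such a plot would show:

```python
# For each growth rate R, discard a transient and record the orbit's
# values. One recorded value = stable population; two = period-2
# oscillation; a smear of many values = chaos.
def bifurcation_points(r_values, p0=0.5, transient=500, keep=100):
    points = []
    for R in r_values:
        p = p0
        for _ in range(transient):
            p = R * p * (1.0 - p)
        seen = []
        for _ in range(keep):
            p = R * p * (1.0 - p)
            seen.append(p)
        points.append((R, seen))
    return points

for R, seen in bifurcation_points([2.5, 3.2, 3.99]):
    print(R, len({round(v, 6) for v in seen}), "distinct value(s)")
```

Plotting every (R, value) pair over a fine grid of R from 2.5 to 4.0 reproduces the figure at the top of the post.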
<p> But here's where it gets really amazing.</p>
<p> Take a look at that graph. You can see that it looks fractal. With a graph<br />
like that, we can look for something called a <em>self-similarity scaling<br />
factor</em>. The idea of a SS-scaling factor is that we've got a system with<br />
strong self-similarity. If we scale the graph up or down, what's the scaling<br />
factor where a scaled version of the graph will exactly overlap with the un-scaled<br />
graph?</p>
<p> For this population curve, the SSSF turns out to be about 4.669.</p>
<p> What's the SSSF for the Mandelbrot set? 4.669.</p>
<p> In fact, the SSSF for nearly <em>all</em> bifurcating systems that we see,<br />
and their related fractals, is the same: 4.669. There's a basic<br />
structure which underlies <em>all</em> systems of this sort. </p>
<p> What's <em>this sort</em>? Basically, it's a dynamical system with a<br />
quadratic maximum. In other words, if you look at the recurrence relation for<br />
the dynamical system, it's got a quadratic factor, and it's got a maximum<br />
value. The equation for our population system can be written: P(t+1) =<br />
R×P(t)-R×P(t)<sup>2</sup>, which is obviously quadratic, and it will<br />
always produce a value between zero and one, so it's got a fixed maximum<br />
value. Pick any chaotic dynamical system with a quadratic maximum, and<br />
you'll find this constant in it. Any dynamical system with those properties<br />
will have a recurrence structure with a scaling factor of 4.669.</p>
<p> That number, 4.669 is called the <em>Feigenbaum constant</em>, after<br />
Mitchell Feigenbaum, who first discovered it. Most people <em>believe</em><br />
that it's a transcendental number, but no one is sure! We're not really sure<br />
of quite where the number comes from, which makes it difficult to determine<br />
whether or not it's really transcendental!</p>
<p> But it's damned useful. By knowing that a system is subject to recurrence<br />
at a rate determined by Feigenbaum's constant, we know exactly when that system will<br />
become chaotic. We don't need to continue to observe it as it scales up to<br />
see when the system will go chaotic - we can predict exactly when it will happen<br />
just by virtue of the structure of the system. Feigenbaum's constant predictably<br />
tells us when a system will become unpredictable.</p>
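As a quick illustration (my sketch; the r values below are standard published approximations of the logistic map's first few period-doubling points, not derived in the post):

```python
# r[n] is the growth rate at which the logistic map's orbit doubles
# from period 2^n to period 2^(n+1). The ratio of successive gaps
# between these points converges on the Feigenbaum constant, ~4.669.
r = [3.0, 3.449490, 3.544090, 3.564407, 3.568750]
deltas = [(r[i] - r[i - 1]) / (r[i + 1] - r[i])
          for i in range(1, len(r) - 1)]
print(deltas)
```

Even with only five bifurcation points, the estimates already cluster around 4.669.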
</div>
<span><a title="View user profile." href="https://scienceblogs.com/goodmath" lang="" about="https://scienceblogs.com/goodmath" typeof="schema:Person" property="schema:name" datatype="" xml:lang="">goodmath</a></span>
<span>Tue, 10/20/2009 - 04:20</span>
https://scienceblogs.com/goodmath/2009/07/16/chaotic-systems-and-escape
<span>Chaotic Systems and Escape</span>
<div class="field field--name-body field--type-text-with-summary field--label-hidden field--item"><p><img src="http://scienceblogs.com/goodmath/wp-content/blogs.dir/476/files/2012/04/i-255fe74a5014ac38c7b550cbcaad0121-nbody.jpg" alt="i-255fe74a5014ac38c7b550cbcaad0121-nbody.jpg" /></p>
<p> One of the things that confused me when I started reading about chaos is easy to<br />
explain using what we've covered about attractors. <em>(The image to the side was created by Jean-Francois Colonna, and is part of his slide-show <a href="http://www.lactamme.polytechnique.fr/Mosaic/descripteurs/AVirtualSpaceTimeTravelMachine.Ang.html">here</a>)</em></p>
<p> Here's the problem: We know that things like <a href="http://faculty.ifmo.ru/butikov/Projects/Collection6.html">N-body gravitational systems are chaotic</a> - and a common example of that is how a gravity-based orbital system that appears stable for a long time can suddenly go through a transition where one body is violently ejected, with enough velocity to permanently escape the orbital system.</p>
<p> But when we look at <a href="http://scienceblogs.com/goodmath/2009/06/chaos.php">the definition of chaos</a>, we see the requirement for dense periodic orbits. But if the ejection of a body from a gravitational system is a demonstration of chaos, how can that system have periodic orbits? </p>
<!--more--><p> The answer relates to something I mentioned in the last post. A system doesn't have to be<br />
chaotic <em>at all points</em> in its phase space. It can be chaotic <em>under some conditions</em><br />
- that is, chaotic in some parts of the phase space. Speaking loosely, when a phase space has<br />
chaotic regions, we tend to call it a chaotic phase space. </p>
<p> In the gravitational system example, you <em>do</em> have a region of dense periodic orbits. You can create an N-body gravitational system in which the bodies will orbit forever, never actually <em>repeating</em> a configuration, but also never completely breaking down. The system will never repeat. Per <a href="http://scienceblogs.com/goodmath/2007/07/order_from_chaos_using_graphs_1.php">Ramsey theory</a>, given any configuration in its phase space, it <em>must</em> eventually come <em>arbitrarily close</em> to repeating that configuration. But that doesn't mean that it's really repeating: it's chaotic, so even those infinitesimal differences will result in<br />
divergence from the past - it will follow a different path forward.</p>
<p> An attractor of a chaotic system shows you a region of the phase space where the system behaves<br />
chaotically. But it's <em>not</em> the entire phase space. If the attractor covered the entire<br />
space, it wouldn't be particularly interesting or revealing. What makes it interesting<br />
is that it captures a region where you get chaotic behavior. The attractor isn't the whole<br />
story of a chaotic system's phase space - it's just one interesting region with useful analytic<br />
properties.</p>
<p> So to return to the N-body gravitational problem: the phase space of an N-body<br />
gravitational system does contain an attractor full of dense orbits. It's definitely<br />
very sensitive to initial conditions. There are definitely phase spaces for N-body<br />
systems that are topologically mixing. None of that<br />
precludes the possibility that you can create N-body gravitational systems that<br />
break up and allow escape. The escape property isn't a good example of the chaotic<br />
nature of the system, because it encourages people to focus on the wrong properties<br />
of the system. The system isn't chaotic because you can create gravitational<br />
systems where a body will escape from what seemed to be a stable system. It's chaotic<br />
because you can create systems that <em>don't</em> break down, which are stable,<br />
but which are thoroughly unpredictable, and will never repeat a configuration. </p>
</div>
<span><a title="View user profile." href="https://scienceblogs.com/goodmath" lang="" about="https://scienceblogs.com/goodmath" typeof="schema:Person" property="schema:name" datatype="" xml:lang="">goodmath</a></span>
<span>Thu, 07/16/2009 - 08:03</span>
https://scienceblogs.com/goodmath/2009/07/13/strange-attractors-and-the-str
<span>Strange Attractors and the Structure of Chaos</span>
<div class="field field--name-body field--type-text-with-summary field--label-hidden field--item"><p><img src="http://scienceblogs.com/goodmath/wp-content/blogs.dir/476/files/2012/04/i-4191aefff59c41d3ae660b425719ea85-sage0-1.png" alt="i-4191aefff59c41d3ae660b425719ea85-sage0-1.png" /></p>
<p> Sorry for the slowness of the blog; I fell behind in writing my book, which is on a rather strict schedule, and until I got close to catching up, I didn't have time to do the research<br />
necessary to write the next chaos article. (And no one has sent me any particularly<br />
interesting bad math, so I haven't had anything to use for a quick rip.)</p>
<p> Anyway... Where we left off last was talking about attractors. The natural question<br />
is, why do we really care about attractors when we're talking about chaos? That's a question<br />
which has two different answers.</p>
<!--more--><p> First, attractors provide an interesting way of looking at chaos. If you look<br />
at a chaotic system with an attractor, it gives you a way of understanding the chaos. If<br />
you start with a point in the attractor basin of the system, and then plot it over time, you'll<br />
get a trace that shows you the shape of the attractor - and by doing that, you get a nice view<br />
of the structure of the system.</p>
<p> Second, chaotic attractors are strange. In fact, that's their name: <em>strange attractors</em>: a strange attractor is an attractor whose structure has fractal dimension,<br />
and most chaotic systems have fractal-dimension attractors.</p>
<p> Let's go back to the first answer, to look at it in a bit more depth. Why do we want<br />
to look in the basin in order to find the structure of the chaotic system?</p>
<p> If you pick a point in the attractor itself, there's no guarantee of what it's going to do. It<br />
might jump around inside the attractor randomly; it might be a fixed point which just sits in one<br />
place and never moves. But there's no straightforward way of figuring out what the attractor looks<br />
like starting from a point inside of it. To return to (and strain horribly) the metaphor I used in<br />
the last post, the attractor is the part of the black hole past the event horizon: nothing inside of<br />
it can tell you anything about what it looks like from the outside. What happens inside of a black hole? How are the things that were dragged into it moving around relative to one another, or <em>are they</em> moving around? We can't really tell from the outside.</p>
<p> But the <em>basin</em> is a different matter. If you start at a point in the attractor basin,<br />
you've got something that's basically orbital. You know that every path starting from a point<br />
in the basin will, over time, get arbitrarily close to the attractor. It will circle and cycle around. It's never going to escape from that area around the attractor - it's doomed to approach it. So if you start at a point in the basin around a strange attractor, you'll get a path that tells you something about the attractor.</p>
<p> Attractors can also vividly demonstrate something else about chaotic systems: they're not necessarily chaotic <em>everywhere</em>. Lots of systems have the <em>potential</em> for chaos: that is, they've got sub-regions of their phase-space where they behave chaotically, but they also have regions where they don't. Gravitational dynamics is a pretty good example of that: there are plenty of N-body systems that are pretty much stable. We can computationally roll back the history of the major bodies in our solar system for hundreds of millions of years, and still have extremely accurate descriptions of where things were. But there are <em>regions</em> of the phase space of an N-body system where it's chaotic. And those regions are the attractors and attractor basins of strange attractors in the phase space.</p>
<p> A beautiful example of this is the first well-studied strange attractor. The guy who<br />
invented chaos theory as we know it was named Edward Lorenz. He was a meteorologist who was<br />
studying weather using computational fluid flow. He'd implemented a simulation, and while<br />
trying to reproduce a computation, he accidentally entered less precise<br />
values for the starting conditions and got dramatically different results. Puzzling out why,<br />
he laid the foundations of chaos theory. In the course of studying it, he took the particular equations that he was using in the original simulation, and tried to simplify them to get the simplest system that he could that still showed the non-linear behavior.</p>
<p> The result is one of the most well-known images of modern math: the Lorenz attractor. It's<br />
sort of a bent figure-eight. Its dimensionality isn't (to my knowledge) known precisely - but it's a hair above two (the best estimate I could find in a quick search was in the 2.08 range). It's not a particularly complex system - but it's fascinating. If you look at the paths in the Lorenz attractor, you'll see that things follow an orbital path - but there's no good way to tell when two paths that are very close together will suddenly diverge, and one will pass on the far inside<br />
of the attractor basin, and the other will fly to the outer edge. You can't watch a simulation for long without seeing that happen.</p>
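You can reproduce that divergence with a few lines of code. This is my own sketch - plain Euler stepping with the classic parameters (sigma=10, rho=28, beta=8/3), which is crude but fine for a qualitative look:

```python
# Two Lorenz trajectories started 1e-9 apart. Their separation blows up
# to the size of the attractor, yet both trajectories stay bounded -
# they never escape the attractor's region of phase space.
def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0, 1.0, 1.0 + 1e-9)
max_gap = 0.0
for _ in range(30000):  # 30 time units at dt = 0.001
    a, b = lorenz_step(a), lorenz_step(b)
    gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    max_gap = max(max_gap, gap)
print("max separation:", max_gap)
print("final state (bounded):", a)
```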
<p> While searching for information about this kind of stuff, I came across a wonderful demo, which<br />
relates to something else that I promised to write about. There's a fantastic open-source<br />
mathematical software system called <a href="http://sagemath.org">sage</a>. Sage is sort of like<br />
Mathematica, but open-source and based on Python. It's a really wonderful system, which I really<br />
will write about at some point. On the <a href="http://sagemath.blogspot.com/2008/01/josh-kantors-lorenz-attractor-example.html">Sage blog</a>, they posted a simple Sage program for drawing the Lorenz attractor. Follow that<br />
link, and you can see the code, and experiment with different parameters. It's a wonderful<br />
way to get a real sense of it. The image at the top of this post was generated by that Sage<br />
program, with tweaked parameters.</p>
</div>
<span><a title="View user profile." href="https://scienceblogs.com/goodmath" lang="" about="https://scienceblogs.com/goodmath" typeof="schema:Person" property="schema:name" datatype="" xml:lang="">goodmath</a></span>
<span>Mon, 07/13/2009 - 15:40</span>
https://scienceblogs.com/goodmath/2009/06/12/defining-dynamical-systems
<span>Defining Dynamical Systems</span>
<div class="field field--name-body field--type-text-with-summary field--label-hidden field--item"><p> In my first <a href="http://scienceblogs.com/goodmath/2009/06/chaos.php">chaos post</a>, I kept talking about <em>dynamical systems</em> without bothering to define them. Most people who read this blog probably have at least an informal idea of what a dynamical system is. But today I'm going to do a quick walkthrough of what a dynamical system is, and what the basic relation of dynamical systems is to chaos theory.</p>
<p> The formal definitions of dynamical systems are dependent on the notion of phase space. But before going all formal, we can walk through the basic concept informally.</p>
<p> The basic idea is pretty simple. A dynamical system is a system that changes<br />
over time, and whose behavior can be (in theory) described by a function that takes<br />
time as a parameter. So, for example, if you have a gravitational system which<br />
has three bodies interacting gravitationally, that's a dynamical system. If you<br />
know the initial masses, positions, and velocities of the bodies, then the positions of all three at any future point in time are a function of the time.</p>
<!--more--><p> It's important to understand, though, that as I mentioned in the first chaos<br />
post: as is typical in mathematics, most functions are badly behaved. Just because<br />
a function <em>exists</em> doesn't mean that it's <em>computable</em> or<br /><em>derivable</em>. For most dynamical systems, we know that the system<br />
is parametric in time, but we don't know an equation for it.</p>
<p> The most common case for interesting dynamical systems that aren't linear is<br />
to describe the system in terms of differential equations. A differential<br />
equation for a dynamical system basically says "Given the state of the system at<br />
time t, this equation tells you what the state of the system will be at time<br />
t+ε", where ε is an infinitesimally small period of time.</p>
<p> To get a precise answer out of a differential equation, you need to be able<br />
to integrate it. But most of the time, we don't know how to integrate it<br />
symbolically. The closest we can come is to evaluate it as a series of<br />
steps, keeping the steps as small as possible. The result of doing this is <em>not</em> exactly correct, but if you can get the time-steps short enough,<br />
you can get very close to the correct answer.</p>
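A tiny example of stepwise evaluation (mine, using the simplest possible differential equation, dx/dt = x, whose exact solution e<sup>t</sup> we can compare against):

```python
import math

# Euler stepping: repeatedly apply x(t + dt) ~ x(t) + dt * x'(t).
# Smaller timesteps give answers closer to the exact value x(1) = e.
def euler(dt, t_end=1.0):
    x = 1.0
    for _ in range(int(round(t_end / dt))):
        x += dt * x  # x'(t) = x(t) for this equation
    return x

for dt in (0.1, 0.01, 0.001):
    print(dt, "error:", abs(euler(dt) - math.e))
```

Shrinking the step by a factor of ten shrinks the error by roughly the same factor - the result is never exact, but it can be made as close as your patience (and floating point) allows.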
<p> For a lot of systems, this approach works really well. For one prominent<br />
example, it generally works quite well for N-body gravitational dynamics of<br />
things like the solar system. N-body systems are difficult and have some<br />
seriously unstable points. But for many examples, with precise measurements and<br />
small timesteps, you can get astonishingly accurate predictions using stepwise<br />
evaluation of the differential equations. They're very good, but far from<br />
perfect. To give you a sense of what I mean by that: we can predict pretty much<br /><em>exactly</em> where the earth will be at any point for the next 10,000 years.<br />
But there are several asteroids whose orbits come very close to earth (very<br />
close in astronomical terms that is), and we can't be absolutely certain of<br />
where they'll be 30 years from now. The best we can do is talk in terms of<br />
probabilities.</p>
<p> To reiterate: a dynamical system is basically a system that's parametric in time. But for chaos theory, we want to describe it in terms of a phase space. To get to the phase space, you need to think of it in terms of topology.</p>
<p> Using topology, you can describe almost anything continuous in terms of a<br /><em>space</em>. A topological space is a tricky concept, but the gist of it is<br />
that it's an infinite set of objects (called <em>points</em>), along with a<br />
structure that defines what objects are <em>close to</em> one another. If you<br />
want more detail than that, then I've got a whole series of posts on topology<br />
that you can look at, starting <a href="http://scienceblogs.com/goodmath/2006/08/topological_spaces.php">here</a>.</p>
<p> If you look at a complex system, you can define the set of states of that<br />
complex system as the points of a space, and where points are close to each<br />
other when there's a short path through the states of the system from one<br />
of those points to the other. If you define it so that it's got the right properties, you end up with a topological space.</p>
<p> To get from there to the phase space of a dynamical system, you need to add<br /><em>time</em> - the defining characteristic of a dynamical system is that it's<br />
parametric in time. That's done by providing an <em>evolution function</em>: a<br />
mapping which, given any point p in the phase space of the dynamical system and<br />
any interval of time, gives you <em>another</em> point, p' in the space. The meaning of the evolution function is that if you start the system in the<br />
state corresponding to the point p, and then you stop it after time t has passed, the state of the system will be p'.</p>
<p> The evolution function is completely deterministic: given a precise point in<br />
the phase space, after a precise interval has passed, the system will<br /><em>always</em> wind up in a specific state. At this level of the system,<br />
there is nothing obviously chaotic, nothing uncertain, nothing random. The system is precise, fully defined, and fully deterministic.</p>
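Using the logistic map as a toy phase space, the evolution-function idea can be sketched directly (my example; the name `evolve` and the choice k = 3.7 are mine, not the post's):

```python
# evolve(p, n) is a discrete evolution function: it maps a point p of
# the phase space to the state n time steps later. It's fully
# deterministic, and it composes: evolving for t1 + t2 steps is the
# same as evolving t1 steps and then t2 more.
def evolve(p, n, k=3.7):
    for _ in range(n):
        p = k * p * (1.0 - p)
    return p

p0 = 0.123
assert evolve(p0, 50) == evolve(evolve(p0, 20), 30)
print("composition law holds for p0 =", p0)
```

Even though the system is chaotic for this k, the evolution function itself has nothing random about it: the same point and the same interval always give the same result.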
<p> For many systems, the phase space is very clear and well defined, and<br />
we can perform computations in it with great precision. Just for example, there are lots of linear dynamical systems, and they're perfectly stable. In fact, you can make the argument that the ease with which we can analyze linear<br />
dynamical systems is why chaotic systems were such a shock.</p>
</div>
<span><a title="View user profile." href="https://scienceblogs.com/goodmath" lang="" about="https://scienceblogs.com/goodmath" typeof="schema:Person" property="schema:name" datatype="" xml:lang="">goodmath</a></span>
<span>Fri, 06/12/2009 - 02:57</span>