Back to the Basics?

Here at ScienceBlogs, we've got our own back-channel forums for the bloggers to chat with each other. An idea that came up, which a bunch of us are interested in, is doing some posts about basic definitions and basic concepts.

Many people who read the various blogs around here have had problems with the definitions of some basic ideas. For example, there's the word "vector" - there are at least two very different uses of it here at SB: the one that people like me mean (the mathematical vector), and the one that epidemiologists/biologists mean.

For another example, there's logic and proofs - a lot of people just aren't familiar with the concept of a proof, or with how to tell whether an argument is a proper mathematical proof or whether a conclusion follows logically from its premises.

So the question: what kinds of basic ideas or terms would you like to see a very basic-level introductory post about?


Statistical uncertainty - what it means when someone says a result has only a 1/100 chance of being a random occurrence: what does that really mean, and what doesn't it mean? I keep having to explain stuff like this, and I seem to be doing a lousy job, so I could use some help.

I tend to get blown away by the good math (at least I presume it's good based on how well you take down the bad), so I'm glad to hear you'll be doing a series on the basics.

One suggestion I'd like to throw in: an entry covering the definitions of the number types - natural, integer, rational, irrational, real, transcendental, imaginary, and so forth.

Here are a few: vectors, like I suggested in the back-channels; standard deviation; error bars; confidence intervals... I'm sure others will come up with more science-related terms.

Here's one for you - "delta". I'm told there are actually three different kinds of "delta" (one using the capital-Delta Α - which is the one I've been familiar with, one using the lowercase delta and, uh, some third one.)
I seem to recall one has to do with continuous changes (e.g., over the real numbers rather than the integers?), one with discrete(?) changes, and I don't recall what the third was.
What are the differences, and when do you use them?

(Ah, crud, that's "Capital-Delta: Δ", not Capital Alpha...)

I would still very much like to know what the definition of "information" is as used by physicists, as in "it is impossible for information to travel faster than the speed of light" or "there are exotic quantum events with negative information content". I am familiar with information as we CS/math type people use the word, but when used in physics it appears to mean something different and I've never found a clear, rigorous explanation of what.

(On a related note, the information theory texts I've looked at indicate that the use of the word "entropy" as a property of data was chosen based on some kind of analogy to physical entropy, but I've not seen this elaborated on.)

I can't actually think of something I'd like you to do a basics post on, but if this is open to all of ScienceBlogs, I'd love to know the standard philosophical refutation of the argument that inductive reasoning can only be shown to be better than some other reasoning method (say, reading tea leaves) by using inductive reasoning. (I hope no one would suggest that I determine whether tea-leaf reading is an accurate source of knowledge by asking the tea leaves.)

Oh, and I suppose you could as an aside mention the different uses of "induction" - there's the way I used it in the previous paragraph, there's what mathematicians mean, and there's what physicists and electrical engineers mean. Possibly a few other uses as well - it's the kind of word that gets overloaded.

Coin: You can think of "entropy" as the degree to which a frequency histogram is "flat".
As an example, consider a system with N states, and a lot of objects which can be placed in those states. If all of the objects are in the same state, the entropy is 0. If there are equal numbers of objects in each state, the entropy is -log(1/N), i.e., log N. (The logarithm is base 2 in information theory, and base e in thermodynamics.) The "flatter" the histogram, the higher the entropy.
One informal way to understand the second law of thermodynamics is that it states that if you have a bunch of particles in some distribution of thermodynamic states, then as time advances, the frequency distribution can only get "flatter".

By Pseudonym (not verified) on 12 Jan 2007 #permalink

...wow. That's the best explanation of that I've ever heard.

Thanks, Coin, but bear in mind one thing: You only find that explanation "good" because you already understand what "entropy" means to information theorists. :-)
Obviously I've left a lot out. It's still not obvious, for example, what a "state" is in thermodynamics. One might naively think, for example, that if you put a block of ice (all molecules "cold") in a glass of hot water (all molecules "hot"), it might result in the frequency distribution becoming less flat (all molecules "warm"; a thermal distribution with two "peaks" evolves into a thermal distribution with one "peak"). You need a good intuition of what a "state" is to work that one out.

By Pseudonym (not verified) on 12 Jan 2007 #permalink

Oh, while I think of it, my suggestions for topics...

  • Some more category theory would be nice. I don't think we've gotten to adjunctions yet.
  • The Langlands Program.
  • Modern cryptography and cryptanalysis.
  • Differential geometry. (For those who grew up on Euclidean and analytic geometry, this might be quite interesting.)

These aren't "back to the basics", but I'd find them interesting.

By Pseudonym (not verified) on 12 Jan 2007 #permalink

I've been fascinated by what appears to be a particular mode of thinking that is required to do math and related things. Granted, it's more in the field of psychology, but I've noticed that some people just never quite manage to think about math and programming in the right way to be able to make use of them.
So I was wondering if you have some insight into why it is so much harder for some people to 'get it', and what could be done to help people adjust better to math. In particular, how should math be taught to children?

Yes, I can go with SMC and discuss differential operators and derivatives. Especially if it gets into covariance and contravariance, perhaps as a followup.

Coin:

I am familiar with information as we CS/math type people use the word, but when used in physics it appears to mean something different and I've never found a clear, rigorous explanation of what.

I think that is because there is no clear, rigorous explanation, AFAIK. Just as information can be studied by Shannon channel capacity for signal transmission or KC complexity for algorithmic constructions, entropy has different measures too, as Pseudonym shows and discusses (states in thermodynamics). So it is probably hard to say what it really is, at least in all situations.

It is the physics of the system studied that can be interpreted to have properties of information and entropy. For example, "it is impossible for information to travel faster than the speed of light".

If you see light of a single frequency, it will not transmit information. It is just light, and after a while rather boring to watch.

To see anything interesting you can try to pulse it. Now you are effectively adding modulation, other frequencies, to a carrier wave.

While each frequency component may have its own velocity in a medium - its phase velocity - the pulse or modulation itself travels at the group velocity. This velocity is consistent with Lorentz covariance, so it can never exceed the vacuum speed of light.

That was the definition of information travel I believe I once learned: average group velocity. Now it turns out that this isn't an accurate description. What matters is the group velocity of the leading edge.

Because today people can send a pulse of light (or sound!) into a fiber (tube) and observe the peak exiting the other end before it enters the fiber. The pulse is reconstructed from the information in the leading edge. ( Light: http://www.livescience.com/technology/060518_light_backward.html . Sound: http://www.livescience.com/technology/070112_ftl_sound.html )

But in no instance does the information here travel faster than the vacuum speed of light. It is the definition (or perhaps only my own interpretation of it ;-) that has changed, or rather has been made precise, to keep up with what is observed.

By Torbjörn Larsson (not verified) on 13 Jan 2007 #permalink

An interesting followup is, of course: what exactly is the description of "the information in the leading edge"? I sure hope someone puzzles that out; I get stuck after deriving group velocities. :-)

By Torbjörn Larsson (not verified) on 13 Jan 2007 #permalink

Deciding what sorts of basic concepts need definitions is really going to be a hit-or-miss proposition if undertaken without a post on a particular topic as a central organizing factor. Not that I object if anyone wants to do it - it's always great fun to learn new things, or to re-learn old ones better.

Let's face it, there are a bajillion concepts out there, and a post provides a wonderful central idea around which a few of those bajillion can be selected, explained, and then fleshed out in comments and responses to comments.

I like the mathematical proof idea. I've been scavenging Wikipedia without ever finding satisfactory answers.

Markk: "Statistical Uncertainty - what it means when someone says this has only a 1/100 chance of being a random occurance, what does that really mean and not mean."

I think the problem is that people don't agree. I personally believe in the Bayesian interpretation. (In particular, read Chapter 9: Repetitive Experiments: Probability and Frequency, and Chapter 10: Physics of "Random Experiments".) Also read this.

In math? There are more than 80 "branches" of math, or so I've read: graph theory, topology, number theory, etc.
They're all interrelated, of course, but it would be interesting to hear a little about each one - its history, fields of use, and basic theorems and/or axioms.

For the other sciences, just look over your last "good" post and pick the most important concept. Then do an explanation of that. It would keep your posts topical and interesting at several levels.

You're the CompSci guy, so... how about giving the basics of how cryptology works and how prime numbers are important? Linear transformations and 3D image generation...

Weldon urged: "how prime numbers are important."

There are some fine web domains about primes, of which the most diverse, entertaining, and searchable is probably Prime Curios.

There are fascinating and unexpected connections to Biology, such as periodic cicadas (13-year and 17-year cycles). I'm intrigued by this quote, apparently about an inference that amino acid sequences in genetic matter exhibit patterns expected of binary representations of prime numbers: "Additively generated numbers can be primes or nonprimes ('composites' in number theory terminology) [JVP: technically, 0 and 1 are neither primes nor composites]. Thus prime numbers are more creative than nonprimes.... The creativeness and indivisibility of prime numbers leads one to infer that primes smaller than 64 are the number equivalents of amino acids; or that amino acids are such Euclid units of living molecules."

[J. Yan, A. Yan, and B. Yan, "Prime numbers and the amino acid code: analogy in coding properties," J. Theor. Biol. 151(3):333-341, 1991]

Does that mean that semiprimes are semicreative? Actually, semiprimes are worth explaining, as they are the basis of a billion-dollar crypto industry. I happen to have published several hundred things about them, so I am biased. For the silliest, see:

Post, Jonathan Vos. "Emirpimes." From MathWorld--A Wolfram Web Resource, created by Eric W. Weisstein.

I'm not even sure there's one answer for this but here goes...
I've always been stumped when asked in exams to provide a proof of a mathematical relationship. Is there a standard way of approaching this problem that I'm missing? In other words... is there a common thread/strategy in all the simple proofs undergrads encounter in maths textbooks?

I'd love to see an exposition of statistics and uncertainty. Really, I think I'm still dreaming of a world where every preacher's sermon and politician's speech came with its own error bar. (Curse you, Carl Sagan, for putting that hope into my head!) Having just today read John Allen Paulos's Innumeracy, I'm eager to see the problem attacked.

On perhaps a less basic level, I'm curious to know whether Mark (or the others here) have any other favorite theorems in the fixed-point genre. You know, things of this sort, which make you go, "I just proved it, but I know it can't really be true!"

To give you a taste of what's coming: imagine that you have two sheets of graph paper, with the edges numbered with a coordinate system. So you can easily identify any point on the sheet of paper. Take one sheet, and lay it flat on the table. Take the second sheet, and crumple it up into a little ball. No matter how you crumple the paper into a ball, no matter where you put it down on the uncrumpled sheet, there will be at least one point on the crumpled ball of paper which is directly above the point with the same coordinate on the flat sheet.

On the way to the store where I ended up buying Innumeracy, my friends and I chatted about the Hairy Ball theorem and how it applies to nanotechnology. This probably put the question in my mind. . . .

Another thing that comes to mind, perhaps not quite basic, is the relative performance of differing computational schemes. For example, if I had a program running on a given CPU that takes a certain number of clock cycles to execute on a given input, how many steps would it take to run an equivalent program on some Turing machine, or on any other kind of computing machine? Beyond the obvious P/NP distinction, are there any other classes that have some practical properties? For example, could I know that some particular computing system just can't execute a particular algorithm in O(n) when some other system can?

As a side note: MarkCC, I, for one, appreciate the time you take to write on this blog and to answer readers' questions. There must surely be tons of paid work waiting for you, as well as various hobbies and other activities consuming your time. Your efforts to serve the public deserve, at the very least, a tip of the hat.

taki - There is a somewhat common strategy for such proofs. A good book on the topic (especially if you want to study it on your own) is "How To Prove It" by Velleman. The essential idea is that the method of proof follows first from the logical form of the objective, then from the logical form of your premises.

Personally, I would like to see posts on algebraic structures & their applications (groups, rings, modules, etc.). Fixed point theorems are also interesting.


Personally, I'd like to see something about analysis, or logic, or the basic definitions of certain classes of numbers (complex, real, etc.). I'm already doing the algebra stuff, Echidne already posted a series about statistics a while ago, and Tyler DiPietro is going to delve into theoretical comp sci.

John, Blake, and everyone else interested in statistics -

1. You poor fools.
2. There are some good articles from the BMJ by Bland and Altman: they're aimed at medics, so they might not be suitable for philosophers (not abstract enough, and the wrong sorts of long words).
3. For a late Christmas present, there's the Cartoon Guide to Statistics.

Bob

@Bob O'H:

Already own the Cartoon Guide to Statistics, and love it. But not everybody is as lucky as me.

@Stu Savory:

Non-commutative geometry has the problem that we have no experimental evidence whatsoever that spacetime actually exhibits "quantum fuzziness" (with Heisenberg uncertainty limiting our ability to know all coordinates of a particle's position simultaneously). It's therefore one step further into weirdness than string theory itself, and (I'd bet) consequently even harder to turn into a falsifiable prediction.

Besides which, you have the problem that if you work on non-commutative geometry just for the abstract mathematical appeal, you end up benefiting the string theorists anyway. NCG appears within string theory, e.g., when studying the behavior of strings stretched between parallel D-branes.

Folks:

The point of this request isn't just a general open request for topics. I'm always open to suggestions for topics to write about - just send me email! But in this post, I'm looking for suggestions specifically about basic definitions and concepts. Things like non-commutative geometry definitely do not fall into that category! :-)

How about a post defining specified complexity, with a worked example?

*ahem*

Bob

I would propose a slightly modified version of this idea. Glossaries tend to be useful only when used as references, rather than as reading material in and of themselves. So have a "sticky" glossary post that gets augmented whenever a regular post contains terms that need explaining. Don't put a monolithic amount of effort into building a complete glossary right at the outset.

By Halberd Halberd (not verified) on 14 Jan 2007 #permalink

I think it would be worth defining the target before shooting for it. What is the intended niche? No point in duplicating MathWorld, the biggest and best general-purpose math encyclopedia online. No way to outnumber contributors to Wikipedia, which ranks #2 on most math topics. No point in going for the 120,000 pages of the Online Encyclopedia of Integer Sequences (which is primarily about integers, and only gets into complex numbers, real analysis, topology, and the like to the extent that integers can be squeezed out of them, as with sequences of decimal digits of real constants).

So where in niche space would Back to the Math Basics be compared to MathWorld, Wikipedia, and OEIS?

Ready [check], Aim [necessary; where you are now], Fire [not good if too soon and at a repeatedly changing alleged target, as the M.B.A. Fuzzy Math President has proved in Iraq].

The general idea behind mathematical notation: are there any patterns? What is the process for reading this stuff? Is there a fixed set of symbols that have a common meaning across the various branches of mathematics?
What are the general/basic rules and concepts (besides, and including, the idea of a function) for reading most popular material written in the language of mathematics - e.g., articles on Wikipedia, or, I guess, some of your own stuff?

I still think that explaining basic probability and statistics would be the most helpful from a "let's save the world" aspect, but lots of other stuff would be fun to cover too. What about differentiation, integration and the Fundamental Theorem of Calculus? Two hundred twenty-two ways to prove the Pythagorean Theorem? Trigonometry?

Another way of thinking about the question occurred to me today, a way which bears upon JVP's comments above. I know MarkCC has considered writing a book; one way to do so would be to create a wiki, dump a whole bunch of blog posts into it, shuffle the pieces until they fall in a nice order, and write new blog posts to fill the gaps. (I have some server space and a working MediaWiki installation, if you'd like to try. . . .) Another way of phrasing the question "what are good basic topics" is to ask, "What would chapters 1 through 5 be in such a book?"

Blake, there are already two separate wikis about math - Wikipedia and PlanetMath.

I'll repeat what I tried writing yesterday, only for the comment to get into the moderation queue and never see the light of day: there are other bloggers who're writing or have written posts like that. Echidne wrote a superb series on introductory statistics. I wrote a basic post about the motivation behind algebraic number theory, seeing as how my other number theory posts are becoming an online textbook. Tyler DiPietro of Growth Rate n lg n has just started doing information theory and complexity.

I'd love to read any efforts at defining "consciousness" in a rigorously logical manner (such a definition is implied by the term "Intelligent Design"). It seems to me that most ID debates hinge on such unclear semantics.

Also, a post re informal logic in composition and how it relates to formal logic would be interesting.

Wonderful idea! I'd love to see a sort of on-going "glossary" created for the site, maybe via categories (or "tags" or "labels" or whatever the lingo is). What to put in it? Don't ask me. I dunno nuttin from nuttin.

JohnVP:

The purpose of this is basically for ScienceBlogs to have a sort of glossary. There are a lot of topics where we repeatedly see people either asking the same questions or making the same mistakes. We'd like to have some kind of dictionary/glossary type thing for ScienceBlogs that contains the answers to them.

The purpose isn't to create something to rival Wikipedia or MathWorld; it's to create simple, informal, easy-to-read articles that aren't anywhere near as complete as Wikipedia or MathWorld, but that are easy, light reading giving people the basic idea - and then they can go to Wikipedia or MathWorld for the details.

The problem with MathWorld (and to a lesser extent Wikipedia) is that the articles are often too brief or too advanced. I often find that it takes me a huge amount of time and effort to figure out what's what in a MathWorld article if it's not about something I already have some basic familiarity with - it's written at a very advanced level. Wikipedia often provides more background - but quite often not enough for a neophyte. But here at ScienceBlogs, we have people reading things like PZ's biology/evo-devo articles, or Ed's creationist debunking, who aren't familiar with some of the basic math concepts being used to explain something. Referring them to MathWorld is certainly not appropriate; Wikipedia is better; but having something quick and easy that they can skim here would be best.

Now that MarkCC has mentioned "creationist debunking", it's worth considering what basic math topics are most useful in showing that creationist arguments are nonsense. Again, I think this typically comes back to probability.

Probability and statistics are probably the two big things people don't get, and not getting them renders some forms of scientific understanding incomprehensible.

For a glossary, just go for the relevant terms, I guess.

Oh. And complexity - that one shows up every now and then.

By Michael Ralston (not verified) on 14 Jan 2007 #permalink

I'd love to read any efforts at defining "consciousness" in a rigorously logical manner (such a definition is implied by the term "Intelligent Design"). It seems to me that most ID debates hinge on such unclear semantics.

Is "consciousness" even a rigorously logical concept in the first place?

Coin wrote: "Is "consciousness" even a rigorously logical concept in the first place?"

Yes, but only for certain individuals, or on certain sides of logical arguments. Which is all the more reason why the word needs to be dissected, to help prevent its unconscious multiple uses. It appears to me that most of the ID debate could be cleared up with simple semantic analysis.

Mark:

Thank you for directly replying to my comment.

"The purpose of this is basically for ScienceBlogs to have a sort of glossary." -- as opposed to dictionary, encyclopedia, link list, citations to literature?

I still see a problem with the projected glossary, by your approach. MathWorld and Wikipedia articles can indeed be too long, too short, or too advanced - but then how do you evaluate the ideal length and depth as a function of the level of knowledge and area of specialization of the questioner?

Can you give an example of how you might do that? Above in this thread, for instance, someone asked for a definition of "differentiation". How would you summarize and point to today's update of this page at MathWorld:
Weisstein, Eric W. "Derivative." From MathWorld--A Wolfram Web Resource. http://mathworld.wolfram.com/Derivative.html

Is there a plausible way to define length? Number of alphanumerical characters? Column inches? Concepts? Is there a plausible way to define the level or depth of an existing article?

To what extent would you point to articles, to what extent to printed literature, to what extent to interactive graphics and calculators?

"There are a lot of topics where we repeatedly see people either asking the same questions or making the same mistakes."

So do you keep a directory of the questions and build a FAQ around them, or do you do what good teachers do and debug the questioners' false assumptions and point them in the right direction - based not just on the questions they ask, but on the questions they cannot ask until they see things more clearly?

To what extent should you be proactive, leaping ahead to what you believe a subcommunity of readers will need to know, before they ask any of the questions?

"simple, informal, easy to read articles" -- but (I can email you offline a 100-page monograph I've drafted, from which 2 conference papers have already been extracted, and presented) "simple" is extremely non-simple to define. Centuries of effort have gone into attempts to axiomatize Occam's razor (that's part of Kolmogorov's motive to define Complexity his way). Recent formal breakthroughs exist here, but don't help you with the task of "simplicity" in a glossary.

"Easy to read" conflates prose style and other considerations. Isaac Asimov was brilliant on "easy to read" in his science and Math books for popular audiences. He told me that he had no prose style as such, and never took a writing course. But with all due respect to Science Bloggers, none are Asimov, nor am I.

Grade Level formulae based on word length and vocabulary lookup exist, and have been automated. But that's not what you want, is it?

Easy to read the WORDS is not the same as easy to get the IDEAS.

Informal - that gets us to the issue of WHICH formalisms? Hawking explained that his publisher warned him that each equation in a popular book would cut sales in half. So he limited A Brief History of Time to one equation (perhaps E = mc^2). But why cripple yourself by ignoring equations?

On this thread there was a query about mathematical notation, which is a minefield itself.

The different Science Blogs attract different (albeit sometimes overlapping) audiences. Biologists and philosophers may (I oversimplify here) be more averse to equations.

MathWorld and Wikipedia have a spectrum of different lengths and levels of advancement in their articles. How do you evaluate the level of a questioner, and then fit that to the spectra?

I think the project is well intentioned. It is worth continuing the backchannel collaboration. It is worth avoiding redundant effort and internal contradictions. But my 40 years of software experience tell me that you need to be much, much clearer in your conceptual requirements before you get to a preliminary design, let alone a detailed design.

I do not yet see sufficient coherence in the statements of requirements.

It also seems to be reinventing the wheel to start this without reference to what the NAS and AMS have already opined and recommended.

All the above, of course, in my humble opinion. Again, thanks for the personalized and thoughtful reply to which this is surreply (to use the legal term).

Jonathan [not abbreviated to John] Vos Post

OK, since it's primer material we want, here are some suggestions, some of which have already been covered. Define and expand on the following terms:

  • Information
  • Probability
  • Statistical error
  • Tensor

That's off the top of my head.

By Pseudonym (not verified) on 15 Jan 2007 #permalink

Mark, for the usage you describe - "we have people reading things like PZ's biology/evo-devo articles, or Ed's creationist debunking, and they're not familiar with some of the basic math concepts that are being used to explain something" - it sounds like the concept of the FAQ would be appropriate. So, perhaps links to an FAQ somewhere on the upper right or upper left of the page, where people can see 'em, with stuff like proper explanations/debunkings of canards re the 2nd Law of Thermodynamics, terms like "information," specified or irreducible complexity, math concepts useful for understanding PZ & PT, etc., etc.

Well, science Usenet groups have FAQs (which you are presumed to have read before commenting on such questions). So a couple of basic posts under a link, or a set of links, would fill that function.

How about a post defining specified complexity, with a worked example?

Mark has already debunked Dembski's SC as badly defined ( http://scienceblogs.com/goodmath/2006/06/dembskis_profound_lack_of_comp… ).

Now, Elizabeth Liddle noted, first on UD and then on PT, as others have earlier on Talk Reason, that if you stick to Dembski's operational definition of intelligence as "the power and facility to choose between options", the process of evolution becomes an intelligent process by means of selection ( http://www.pandasthumb.org/archives/2007/01/dissent_out_of.html ).

In this case 'specified complexity' is simply "the result of choice". (Though evolution contains non-selective processes such as neutral drift, so it is still not sufficient.)

But as Murray Gell-Mann noted: "A measure that corresponds much better to what is usually meant by complexity in ordinary conversation, as well as in scientific discourse, refers not to the length of the most concise description of an entity (which is roughly what AIC is), but to the length of a concise description of a set of the entity's regularities.

Thus something almost entirely random, with practically no regularities, would have effective complexity near zero. So would something completely regular, such as a bit string consisting entirely of zeroes. Effective complexity can be high only in a region intermediate between total order and complete disorder.

There can exist no procedure for finding the set of all regularities of an entity. But classes of regularities can be identified." - Murray Gell-Mann, in his book The Quark and the Jaguar.

So perhaps "the result of choice" or rather "the result of evolution" is the only description we can make here (instead of 'specified complexity'). And phylogenetic trees are the general objects resulting from these descriptions.

By Torbjörn Larsson (not verified) on 15 Jan 2007 #permalink

information can be studied by Shannon channel capacity for signal transmission or KC complexity for algorithmic constructions

Btw, Tyler DiPietro had a nice formulation that puts these concepts side by side: "Shannon information theory assigns information content to an ensemble of possible messages from sender to receiver, while KCS information assigns information content to an individual message." ( http://growthratenlgn.wordpress.com/2007/01/14/information-complexity-a… )

By Torbjörn Larsson (not verified) on 15 Jan 2007 #permalink

If there are people in your audience who do not have pure math training at the undergrad level, then it might be beneficial to start with senior-high topics like calculus, limits, and logic, then move on to linear spaces, set theory, topology, number theory, and so on. My personal interests, from a reading perspective, are the ZF axioms, the axiom of choice, topology, discrete math, and number theory.

hope this helps
Mark W in Vancouver

Harald:

No, the Haskell tutorial will continue. I tend to hop around between different topics a lot. The basics stuff won't interfere with the topology or Haskell posts. But due to my real job, I've got less time to work on the blog for the next few weeks than I normally do, so everything will be a bit slower. I've got the next piece of the Haskell tutorial about halfway done, but I haven't had a chance to work on it for a few days. It should be ready either tomorrow or Thursday.

I would suggest a little group theory. I think it is the most accessible example of what math is to a mathematician (as opposed to doing long division...).

What is a "proof"? How does one use a computer to "prove" theorems?