Sorry, I've been busy these past few days.
I want to respond to a comment posted by Dan and take this opportunity to broaden the discussion about how we use language to construct models.
Dan's concerns about information and life have been echoed by many out there, for example by John Wilkins. Can we use the term "information" when discussing life? Is there such a thing as "information"? Are these buzzwords without any deeper meaning?
What is lost in such an analysis is that all of our theories are infused with metaphors. These words and concepts help us to better understand the ideas and insights that may come out of any particular model. The metaphor of information and the use of similar terms in molecular biology (signal transduction, transmit, second messenger, genetic program, mRNA, transcription, translation, copying, amplification) help us to comprehend underlying biological processes. A certain sequence of base pairs is passed on from generation to generation: information. A cell secretes a growth factor and causes a nearby cell to differentiate: information.
Does information exist per se? Well, I would like to argue that nothing exists in the exact way that any term implies; words, after all, are simply tools that help us to understand the world in which we live. Some words are more useful than others because of the insight their meanings give when invoked in a particular context. You may choose to eliminate all the words for information in your explanation of biological processes and use another metaphor instead, that of machines for example, and you may gain some different insight. But let's face it: you will not be able to easily understand all the issues and questions that are implicit in the study of classical genetics. What is the unit of heredity? How is genetic information transmitted? Does genetic information have a physical counterpart? How is the sequence of bases along a strand of DNA converted into a sequence of amino acids? To say that information is a buzzword, or that it doesn't exist, misses the point. It is a useful analogy. It's a metaphor that gives insight, just as the idea of molecular motors can promote other forms of deeper understanding.
The point of Nurse's commentary is that when we construct our models of how biomolecular players orchestrate biological phenomena, we should view the process not as a simple flow chart, where "information" is passed from one molecule to the next, but in terms of how molecular interactions alter an incoming signal (or a signal emanating from a genetic program). We need to move beyond the idea that signaling networks can be understood by measuring whether two molecules bind or how efficiently one molecule modifies the next. The idea that a protein interaction map by itself can provide insight is too simplistic. The MAP kinase cascade appears throughout eukaryotes: sometimes it promotes mating (as in yeast), other times it regulates osmolarity sensing, still other times it stimulates cell division. Why? Well, instead of thinking about this kinase cascade as being fixed in one pathway, perhaps this module acts more like an amplifier, taking in a weak signal and making it stronger.

With regard to the Mettetal paper: sure, the exact term "information" is barely used (let alone "information processing"), yet their approach is to treat the osmo-regulator like a CD player (to borrow an analogy straight from a talk that van Oudenaarden gave here at HMS). They take the entire system, feed in various inputs, measure the outputs, and then break the mechanism down into two feedback loops. Next they can probe each loop and determine how it works: does this functional unit delay the signal, or does it amplify the signal? Sure, they don't explicitly say it, but their approach is reminiscent of reverse engineering, of reconstructing the osmo-regulator as an electrical diagram similar to what Nurse was talking about. In some ways, the idea that the various cellular components act like an information processor takes the idea of information in biological systems and fuses it with the mechanistic model. The result is a new way to view the biological process.
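To make that input/output framing concrete, here is a minimal toy sketch (my own illustration, not the actual model from the Mettetal paper; the loop names and rate constants are invented): a black-box system with one fast and one slow negative-feedback loop is probed with a step input, and the shape of the output tells you how much each loop delays or cancels the signal.

```python
import numpy as np

def simulate_response(input_signal, k_fast=1.0, k_slow=0.05, dt=0.1):
    """Toy two-feedback-loop model: a fast loop and a slow loop both act to
    cancel the input; the measured output is the residual the cell still 'sees'."""
    fast, slow = 0.0, 0.0
    output = []
    for u in input_signal:
        error = u - fast - slow      # residual signal after both feedbacks
        fast += dt * k_fast * error  # fast feedback (quickly tracks the input)
        slow += dt * k_slow * error  # slow feedback (takes over on long timescales)
        output.append(error)
    return np.array(output)

# Probe the black box with a step input; the decay of the response reveals
# the timescales of the two loops.
t = np.arange(0, 100, 0.1)
step = np.where(t > 10, 1.0, 0.0)
response = simulate_response(step)
```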
It has been my view for some time now that all legitimate talk in biology about information is merely a replacement term for causal specificity; all the connotative baggage is useless or, worse, misleading. Nurse has fallen prey to the latter.
Alex,
The best book I've seen that approaches this topic from a genetic/biological perspective is "Transducing the Genome", by Gary Zweiger. It seems to be out of print, and I hope someone runs the presses again.
DNA encodes information, but doesn't *know* that it's information; even so, it works *as* information. And that's only one of the angles the book works on.
You have an interesting, thoughtful blog here, and you might want to check out GMObelus, at http://www.gmobelus.com
I am missing the meat of this argument somehow. Ever since Shannon, "information" has had a mathematically precise definition that can be applied to a biological process like transcription. It has concrete mathematical rules and can make predictions about what is possible and what can be expected given error rates and such. In this sense "information" is as real as, say, the pH value as a measure of the acidity of someone's blood. Information theory is an axiomatized, well-worked-out branch of mathematics that is providing insight into biological processes.
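For what it's worth, here is a minimal sketch of that precise definition (my own illustration, assuming a simple independent-site model of a DNA position; the probabilities are invented):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniformly random DNA position carries 2 bits of uncertainty...
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# ...while a strongly biased position carries far less (~0.58 bits).
print(shannon_entropy([0.91, 0.03, 0.03, 0.03]))
```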
The approaches described use this form of "information". I don't see the issue. Is someone claiming that ALL of biology is information transfer? That is as stupid as claiming that all of biology is quantum electrodynamics; it is useless because they are models at different levels. But information theory does help explain and give hard limits to things in biology. In this sense it is not a metaphor, it is a tool that can be applied to things - well or badly, depending on the user.
Information has both a fuzzy common usage and a very precise technical meaning. So do 'information processing' and pretty much every other word/phrase you mention.
At least for evolutionary biology (and I would argue development), the technical meanings are quite applicable and useful.
For example, this paper does a decent (if I do say so myself) job of describing molecular evolution using information theory... providing some insightful corollaries.
http://arxiv.org/abs/quant-ph/0301075
I think most of the folks who use the term 'information' in biology don't know the first thing about information theory or computer science, and therefore tend to butcher it and annoy people who do understand it. Even worse, it leads others to assume that the information / computation model is just a metaphor... no, it really is a precise explanatory model. Biology (and arguably everything else in the universe) really IS an information processing system.
Every biologist should at least skim Tom Schneider's "Molecular Information Theory and the Theory of Molecular Machines".
http://www-lmmb.ncifcrf.gov/~toms/
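As a taste of what that looks like in practice, here is a minimal sketch in the spirit of Schneider's per-position information calculation (simplified: no small-sample correction, and the aligned sites below are invented for illustration):

```python
import math
from collections import Counter

def information_content(aligned_sites):
    """Per-position information (in bits) of a set of aligned binding sites:
    2 bits of prior uncertainty minus the observed entropy at each position.
    A simplified take on Schneider's R_sequence (no small-sample correction)."""
    n = len(aligned_sites)
    info = []
    for i in range(len(aligned_sites[0])):
        counts = Counter(site[i] for site in aligned_sites)
        entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
        info.append(2.0 - entropy)
    return info

# Invented example: four aligned sites, well conserved at most positions.
sites = ["TATAAT", "TATACT", "TATGAT", "TACAAT"]
print(information_content(sites))
```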
Markk... "Danger Danger" as they say ;)
Shannon studied communication / coding channels. The basic info theory formulas he used do apply in biology, but his most famous derived formula and conclusion DO NOT generally apply. Quite simply, they rest on assumptions which are reasonable for a communication channel (radio, telegraphy, etc.), but not a molecular system.
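For example, the celebrated channel-capacity formula for a band-limited channel with additive white Gaussian noise,

$$C = B \log_2\!\left(1 + \frac{S}{N}\right),$$

is derived under assumptions (stationary Gaussian noise, a fixed and well-characterized channel) that a molecular system is under no obligation to satisfy.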
Misapplying Shannon's work (which was brilliant, BTW) is very common and really, really annoys people who know better.
BTW: The term 'entropy' should probably just never be used by a biologist. It is almost inevitably incorrectly applied. Hell, I would prefer it if computer science and info theory people would stop using it too, just to help avoid the confusion. It really belongs to thermodynamics, and if you don't know what it means in that context, don't use it at all.
This is what a "science blog" should be ;)
Thanks, I will read up on Shannon. I must point out that Nurse's piece was aimed (I believe) at biologists, more specifically at how to approach problems dealing with the functioning of basic cellular processes. As the inter-omics trend continues, biology is being flooded with a ton of data, most of it meaningless. Biologists are obsessed with how proteins are connected. From this exercise we've gained ... not much. In his essay, Nurse was calling for a shift in how we address these problems. In a way, what he is calling for is to pursue "systems biology" in the way nicely defined by Marc Kirschner.
In the past my concern with Nurse's ideas was that he never gave a concrete example of how to go about this. But what the Mettetal paper demonstrates is that you can analyze a cell as if it were a set of circuits and reverse engineer it. The cell acts to process incoming signals that are converted into an output - the processing events are analogous to what you would find in a CD player. Proteins and protein modules are more like capacitors, resistors and amplifiers - they modulate the signal in specific ways. Evolution selects a protein module (like the MAP kinase cascade) because it performs a useful function: it is easily inserted into many pathways and tweaks each pathway in a specific manner. That's the advance.
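To illustrate the module idea (a toy sketch of my own; the module names, gains, and inputs are invented and do not model any real pathway), the same 'amplifier' module can be dropped into two different toy pathways and tweak each in a different way:

```python
import numpy as np

def amplifier_module(signal, gain=10.0, threshold=0.1):
    """Toy stand-in for a reusable cascade: weak inputs above a threshold
    are boosted, everything else is left at zero (illustrative only)."""
    return np.where(signal > threshold, gain * signal, 0.0)

def low_pass_module(signal, alpha=0.05):
    """Toy 'capacitor-like' module: smooths and delays a signal."""
    out, state = [], 0.0
    for s in signal:
        state += alpha * (s - state)
        out.append(state)
    return np.array(out)

# The same amplifier module wired into two different toy 'pathways':
t = np.linspace(0, 50, 500)
mating_input = 0.2 * (t > 10)                    # weak, sustained pheromone-like step
osmo_input = 0.2 * np.exp(-((t - 25) ** 2) / 4)  # brief osmotic-shock-like pulse

mating_output = low_pass_module(amplifier_module(mating_input))
osmo_output = amplifier_module(low_pass_module(osmo_input))
```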
The nice thing about Shannon's models is that they model information 'mathematically', which means you can discover coherent radiation from a quasar and recognize that it's not 'noise'.
Implicit in a lot of information theory (few want to mention this) is the notion that encoding implies intention.
Mathematical theory will allow you to detect a quasar against 'noise', but the quasar isn't 'encoding' anything in an intentional sense. A quasar doesn't want to 'say' something. It's simply less noisy than the background. Nor does a gamete have an 'intention' in encoding its offspring. (Actually, there's 'noise' in DNA, but that's another thing.)
I suspect that concerns about 'who is encoding' and 'is it a human encoding' are at the core of many objections to genetic engineering - a conception of Nature as a blind impulse. Once infected with human purpose, the DNA of engineered crops carries a "message", as opposed to a mystery.
I could reinforce my point by referring to the intentional radiation mutation of crops (which relies on DNA accidents that are non-noisy by accident), but you get my drift. For more, see "Atomic crops!" at http://www.gmobelus.com/news.php?viewStory=108
In sum, when it comes to altering biological processes, there seems to be a preference for genomic encoding which *does not* result from an intention.
Andy.
Very level-headed analysis, Mr. Palazzo. I also found Markk's reply very interesting, but unfortunately I'm unfamiliar with these ideas. I look forward to following up on the subsequent comments if I can find the time.
This is more general, but I've always had a lot of interest in what it would be like if all (new) scientific ideas were given new, unique, completely invented words. For example, when Einstein came up with his concept of space-time, he used those two words in a significantly different fashion than they had been previously defined (by Newton, etc.). If he had called space-time "tarfurtin", would people have an easier time comprehending his vision of the world than they do now? (I think the learning of such new concepts would have to be based on experiments and concrete examples to initially illustrate the meaning of the term.) Or is it altogether impossible to replace fundamental concepts like space and time, because eventually everyone will figure out what "tarfurtin" is jargon for and do a little mental substitution from that point on?
Maybe you could rig some kind of psych-type experiment to get a clue into this by testing two subjects, each of whom speaks a different language. Have them learn and be tested on the same material. The only difference (in a perfect world) would be that one of the languages uses a co-opted everyday term for the main idea (etc.), whereas the other has a foreign or otherwise unfamiliar term with no common connotations. Of course, if people learn concepts better when they're NOT associated with a word they already know, then perhaps we could posit that terms like "information" having multiple uses really is an impediment to critical thought. (Just off the top of my head, as if you couldn't tell, but I think you get the idea.)
Peace
Thanks Alex for the response, and sorry for taking SO long to get back to you (I was on vacation for the better part of a week).
Your explanation of the sense in which "information" is used is entirely defensible, but I still object to its use, for reasons similar to some of the other commenters'. That is, these are scientific concepts that you're referring to, and "information" is a very vague and ambiguous word. As with the journal articles that have been mentioned, would it not be far preferable to do away with that ambiguity and use precise explanations of this "information processing"? Your defense of the word "information" is excellent - why not use that explanation in the first place?