Epigenetics

Blogging on Peer-Reviewed Research

Epigenetics is the study of heritable traits that are not dependent on the primary sequence of DNA. That's a short, simple definition, and it's also largely unsatisfactory. For one, the inclusion of the word "heritable" excludes some significant players — the differentiation of neurons requires major epigenetic shaping, but these cells have undergone a terminal division and will never divide again — but at the same time, the heritability of traits that aren't defined by the primary sequence is probably the first thing that comes to mind in any discussion of epigenetics. Another problem is the vague, open-endedness of the definition: it basically includes everything. Gene regulation, physiological adaptation, disease responses…they all fall into the catch-all of epigenetics.

Here's another definition, cited by Mary Jane West-Eberhard in Developmental Plasticity and Evolution. Epigenetic factors are defined as:

…those heritable causal interactions between genes and their products during development that arise externally to a particular cell or group of cells within the same individual, and condition the expression of a cell's intrinsic genetic factors (i.e., genome) in an extrinsic manner. In other words, epigenetic factors are the contributions to a cell's environment by genes in other cells of the same individual.

Just to confuse matters even more, I surveyed the long line of developmental biology textbooks on my office shelf, and most don't even mention epigenetics. Not because it isn't important, of course, but because developmental biology basically takes epigenetics entirely for granted — development is epigenetics in action! Compare an epidermal keratinocyte and a pancreatic acinar cell, and you will discover that they have exactly the same genome, and that their profound morphological, physiological, and biochemical differences are entirely the product of epigenetic modification. Development is a hierarchical process, with progressive epigenetic restriction of the fates of cells in a lineage — a dividing population of cells proceeds from totipotency to pluripotency to multipotency to a commitment to a specific cell type by heritable changes in gene expression; those cases where there is modification of the DNA, as in the immune system, are the exception.

In part, the root of the problem here is that we're falling into an artificial dichotomy, that there is the gene as an enumerable, distinct character that can be plucked out and mapped as a fixed sequence of bits in a computer database, and there are all these messy cellular processes that affect what the gene does in the cell, and we try too hard to categorize these as separate. It's a lot like the nature-nurture controversy, where the real problem is that biology doesn't fall into these simple conceptual pigeonholes and we strain too hard to distinguish the indistinguishable. Grok the whole, people! You are the product of genes and cellular and environmental interactions.

With our current state of knowledge, though, we can at least separate the two operationally. We can go into a cell, or into the online databases, and pull out DNA sequences, like this little snippet from human chromosome 15:

gaattctact aatgtttaaa aaattaatac caataaagtc ttacaaaaat atagaagtag

We can also see how mutations that change that sequence affect the organism, and we can also see that sequence being passed on from parent to child. Those are genetic traits; they are characterized by an overall stability, with deviations being the fascinating and important exception.

Epigenetics is messier and more fluid, and therefore harder to pin down. The genome is actually not a simple sequence of letters; it is a more complex chemical structure, bound up with proteins called histones to form a complex called chromatin. This is what cells are actually working with when they execute a 'genetic program':

[Image: strands of DNA wound around histone spools, forming nucleosomes and chromatin]

Strands of DNA (in blue) are wrapped around spools of histones, forming a unit called the nucleosome; these units are folded and wrapped into great tangled loops and whorls, the chromatin. This is what is modified by epigenetic processes. We can break these processes down into several different categories. The two big ones are methylation of DNA and modification of the histones themselves.

[Figure: the two main components of the epigenetic code, DNA methylation and histone modification]
  • DNA modification. Stretches of DNA can be inactivated by covalently attaching methyl groups, which can interfere with the binding of transcriptional enzymes, and can also be signals to recruit enzymes that modify associated histones. Cells have enzymes called methyltransferases that bind to specific dinucleotides (a cytosine adjacent to a guanine) and attach a methyl group to the cytosine. Methylated DNA is silent DNA.

  • Histone modification. Those roughly spherical histone complexes have dangling N-terminal tails that can be covalently modified by acetylation, phosphorylation, ubiquitination, or methylation. These changes affect how tightly packed the chromatin will be: in loosely packed chromatin, called euchromatin, the DNA is more accessible and more active, while in tightly packed chromatin (heterochromatin), the DNA is largely inactive.

  • Histone variants. All histones are not alike! Some variants are more permissive of transcription, while others facilitate tighter packing. Activity in a region of DNA can be modulated by the kinds of histones used.

  • Chromosomal arrangement. There is growing evidence that at least some aspects of the 3-dimensional arrangement of DNA in the nucleus are non-random — that is, the DNA isn't a willy-nilly tangle of spaghetti, but folded in specific ways that bring widely separated regions into association. One of the prettiest examples is the control of olfactory receptor expression in mice: by unknown mechanisms, a single receptor gene is activated in each olfactory cell through the association of a distant enhancer element with that one gene.

  • Every other mechanism of gene regulation. Heritable modifications of DNA are easily seen as epigenetic factors, but similarly, just about every known regulatory mechanism is in some sense also heritable. The concentration of transcription factors and RNA in the cytoplasm, for instance, affects levels of gene activity…and those factors are passed on in mitosis as well. Especially by the West-Eberhard definition above, epigenetics opens up into a vast catalog of everything that modifies gene expression.
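
The CpG rule in the first bullet above is simple enough to sketch in code. This is only a cartoon, assuming a made-up sequence and marking methylated cytosines with an 'M'; real methyltransferases are enzymes, not string scanners.

```python
# Toy model of CpG methylation: scan a DNA sequence for cytosine-guanine
# dinucleotides and mark each such cytosine as methylated ('M').
# The example sequence is invented, purely for illustration.

def cpg_sites(seq):
    """0-based positions of the C in every CpG dinucleotide."""
    seq = seq.lower().replace(" ", "")
    return [i for i in range(len(seq) - 1) if seq[i:i + 2] == "cg"]

def methylate(seq):
    """Replace each CpG cytosine with 'M'; methylated DNA is silent DNA."""
    seq = seq.lower().replace(" ", "")
    marked = list(seq)
    for i in cpg_sites(seq):
        marked[i] = "M"
    return "".join(marked)

example = "atcgttacgaacgt"  # hypothetical snippet with three CpG sites
print(cpg_sites(example))   # -> [2, 7, 11]
print(methylate(example))   # -> atMgttaMgaaMgt
```

(The chromosome 15 snippet quoted earlier happens to contain no CpG dinucleotides at all, a nice reminder that methylation targets are sparse and specific.)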

How about some examples?

One clear example of a long-term epigenetic modification is X chromosome inactivation. Mammalian females have two X chromosomes, while males have only one, which could create problems of dosage — left unregulated, females would have twice the concentration of the gene products encoded on the X chromosome, and we know that many genetic effects are sensitive to concentration differences. The mammalian strategy is not to make genes on the X in males work twice as hard to compensate, but instead to shut down one whole X chromosome in females. This is accomplished largely by extensive methylation of histones on the inactivated X, and by recruitment of repressive histone variants. The chromosome is heterochromatized to shut it down.

Which X chromosome is silenced is heritable. If a cell in a female embryo happens to shut down the X chromosome it inherited from the mother, leaving the paternal X active, all of its subsequent daughter cells will also shut down that same X. This is fixed; all future progeny of that cell, from that early embryonic state until the female grows old and dies, will be using the same single X. This is definitely a long term commitment.

One other interesting phenomenon occurs in eutherian embryos. Initially, the paternal X chromosome (that is, the one carried in the sperm) is always inactivated, and the earliest steps of development are carried out using only the maternal X chromosome. The extraembryonic tissues persist in this pattern, but the embryo itself later briefly activates all X chromosomes, and then in females, randomly shuts down one of them. This leads to the interesting situation that mammalian females tend to be mosaic, with an invisible (except in cases like calico cats) mottling of cells that have arbitrarily shut down one or the other X chromosome. It's this later choice that is locked in for the rest of the individual's life.
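
The "locked in" character of that choice is easy to caricature in a few lines of code. This is a deliberately simple model with made-up names and numbers: each founder cell flips a coin once, and every mitotic descendant just copies the founder's choice.

```python
import random

# Cartoon of random X inactivation: each founder cell in a female embryo
# silences one X at random; all of its descendants inherit that choice,
# producing the clonal mosaic patches seen in calico cats.

def choose_inactive_x(rng):
    """The one-time random choice made in the early embryo."""
    return rng.choice(["maternal", "paternal"])

def clonal_patch(founder_choice, divisions):
    """Mitosis doubles the population each division; every daughter
    copies the founder's choice, which is never re-flipped."""
    return [founder_choice] * (2 ** divisions)

rng = random.Random(0)
founders = [choose_inactive_x(rng) for _ in range(8)]
patches = [clonal_patch(f, divisions=5) for f in founders]

# Every cell in a patch silences the same X as its founder:
for f, patch in zip(founders, patches):
    assert all(cell == f for cell in patch)
print(founders)  # a random patchwork of 'maternal' and 'paternal' choices
```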

Another instance is genomic imprinting. It's not just the X chromosome that is differentially activated or inactivated depending on whether it is maternal or paternal; other genes on non-sex chromosomes also show differences. The best known example is a bank of genes on human chromosome 15. In males, some of these genes are silenced; in females, a different set of genes in the same area are shut down. The pattern of inactivation is perpetuated in the sperm and egg, so sperm cells always carry a chromosome 15 with those genes methylated, while egg cells similarly have the female pattern of inactivation. Now normally, this has no detectable consequences for the embryo. It has one paternal and one maternal copy of chromosome 15, so it still has one copy of each gene that is entirely functional. All is well.

However, there are instances where the embryo must rely on just one of the chromosomes, and then things can go wrong. What if the sperm cell carries a chromosome 15 that has a defective allele, or one that is completely deleted? Then the embryo must use the maternal chromosome 15 copy, but what if that allele is maternally inactivated, or imprinted? Then it effectively has no copies of that gene product to use in development.
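
The failure mode is just an AND condition: an allele contributes only if it is both intact and not silenced by its parent-of-origin imprint. Here's a hypothetical sketch of the rule (the cases are illustrative, not the real chromosome 15 loci):

```python
# Toy model of genomic imprinting. An allele is expressed only if it is
# intact AND not silenced by the parent-of-origin imprint. The cases
# below are illustrative, not real chromosome 15 genetics.

def expressed_copies(alleles, silenced_when_from):
    """alleles: list of (parent_origin, intact) pairs.
    silenced_when_from: 'maternal', 'paternal', or None for an
    unimprinted gene."""
    return sum(1 for origin, intact in alleles
               if intact and origin != silenced_when_from)

# Normal embryo, maternally imprinted gene: the paternal copy does the work.
print(expressed_copies([("maternal", True), ("paternal", True)],
                       silenced_when_from="maternal"))  # -> 1

# Paternal deletion of that gene: the surviving maternal copy is imprinted
# too, leaving the embryo with zero working copies.
print(expressed_copies([("maternal", True), ("paternal", False)],
                       silenced_when_from="maternal"))  # -> 0
```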

Another situation is called uniparental disomy. Sometimes there are errors in mitosis or meiosis called nondisjunction, in which a cell inherits an extra copy of a chromosome (a well known example is Down syndrome, where individuals have an extra copy of chromosome 21, with serious effects). Being trisomic, or having an extra copy, for chromosome 15 is lethal to the embryo, so that no such individuals make it to term. However, sometimes they can be spontaneously rescued by a second mistake, a loss of one of the extra chromosomes, reducing them back down to two copies of chromosome 15. Here's the catch, though: which one is lost is random. If the individual has two copies of the maternal chromosome 15 and one copy of the paternal chromosome 15, and sheds the paternal chromosome, it's no longer trisomic, but it does bear two chromosomes with only the maternal pattern of imprinting. This can also have serious consequences.
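
The odds in that "catch" fall straight out of a simulation. Again a toy with invented labels: start trisomic with two maternal copies and one paternal, discard one chromosome at random, and count how often the paternal copy is the one lost.

```python
import random

# Toy model of trisomy rescue: a trisomic cell carries two maternal
# copies of chromosome 15 and one paternal; losing one chromosome at
# random yields maternal uniparental disomy one time in three.

def rescue(trisomic, rng):
    """Shed one of the three chromosomes at random."""
    kept = list(trisomic)
    kept.pop(rng.randrange(len(kept)))
    return kept

def is_uniparental(pair):
    return len(set(pair)) == 1

trisomic = ["maternal", "maternal", "paternal"]
rng = random.Random(1)
trials = [rescue(trisomic, rng) for _ in range(10_000)]
upd_rate = sum(is_uniparental(t) for t in trials) / len(trials)
print(round(upd_rate, 2))  # close to 1/3
```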

Individuals that develop with only maternally imprinted copies of these genes on chromosome 15 have something called Prader-Willi syndrome, a disorder characterized by mental retardation, obesity, and short stature; if instead the individual has only the paternally imprinted copies of chromosome 15, they have Angelman syndrome, a different disorder with severe mental retardation, and characteristic changes in facial features and movement (the original descriptions called them "puppet children" for their Howdy Doody-esque appearance and jerky limb movements). These individuals may have identical genetic factors, and the only difference is in the epigenetic modifications of their chromosomes.

Another concern is the role of epigenetics in disease. Some chronic diseases, such as cirrhosis of the liver, are more than just an acute reaction to an environmental insult — they represent long-term changes in the pattern of gene expression in the cell lineages of the organ. Our cells are responsive, and they can be changed epigenetically in our lifetimes.

Some cancers seem to be facilitated by what are called epimutations — changes, not in the DNA itself, but in the pattern of methylation such that genes that play a role in our defenses against cancer are inactivated. Epigenetic silencing of the gene MLH1, for instance, is associated with some colorectal cancers.

One of the ways viruses can affect us is that they insert their genomes into ours — they induce a dramatic genetic change, which can be deleterious and which can be passed on by dividing cells. Epigenetic processes are defenses against the propagation of viral infections. Methyltransferases can sweep through and silence viral insertions, preventing them from promoting viral proliferation.

I began this with a couple of definitions of epigenetics. Perhaps a simpler, non-technical way to think of it all is that it represents a kind of cellular memory that can be passed down to daughter cells. It's not as specific as the sequence of DNA, but it is sufficient to reconstitute the state of gene activity between generations. It's central to understanding development as well as how organisms interact with their environment, and is intertwined and inseparable from our understanding of the gene.


Cavalli G (2006) Chromatin and epigenetics in development: blending cellular memory with cell fate plasticity. Development 133:2089-2094.

Egger G, Liang G, Aparicio A, Jones PA (2004) Epigenetics in human disease and prospects for epigenetic therapy. Nature 429:457-463.

Kiefer JC (2007) Epigenetics in development. Dev Dyn 236:1144-1156.


What is really exciting about epigenetic research is that it might allow an "end around" to bypass the tricky mess that is genetic therapy (see: the "bubble boy" fiasco that pretty much ended gene therapy trials). There would be no need to delete a problematic gene that is driving tumor formation if we could just preferentially silence it. Methinks this is going to get HUGE in the coming years.

On a side note, I would be interested to hear Dawkins' take on this. Has he modified his "selfish gene" hypothesis to take into account these other mechanisms of heritability?

By Leukocyte (not verified) on 22 Jul 2008 #permalink

oops. I really do know the difference between "your" and "you're". It's my proofreading skills that suck.

By LightningRose (not verified) on 22 Jul 2008 #permalink

@LightningRose - PZ already posted about that creature... scroll back a little from the main page and you'll see it, picture and all.

By Leukocyte (not verified) on 22 Jul 2008 #permalink

What ever happened to the simple good ol' days of phenotypic plasticity?

Thanks PZ. I will not presume that it was my request for an overview on this topic that resulted in this post, but I did ask for it and here it is! Many thanks.

Another problem is the vague, open-endedness of the definition: it basically includes everything. Gene regulation, physiological adaptation, disease responses...they all fall into the catch-all of epigenetics.

God?
.
.
.
.
.
.
.
.
.
.
Ducks and runs..............

Good timing! I've seen the word popping up everywhere lately, but it seems I missed a good explanation of exactly what it means.

Another problem is the vague, open-endedness of the definition: it basically includes everything.

...oh well.

Spectacular summary of very complicated shiz there.

Of course, this only creates more questions than answers. Clearly the best explanation for all this is that Magic Man Done It.

-P

And RNAi ought to also count as an epigenetic process...I tell you, it's epigenetics everywhere.

Or, more properly, "Son, it's epigenetics all the way down."

Seriously, though, with the understanding of epigenetics and development; can we finally get people to stop using a "DNA as computer code" analogy? From what I understand, a single gene can be used to create different traits depending on how/when it is copied and to what use it is put. Doesn't the expression of a gene depend on the neutral network in which it interacts? I was reading and trying to understand an article in PLoS Computational Biology which shows evolution doesn't always find the optimal path towards the development of "fittest" organisms, as long as it makes an organism "fit" for its environment.

I am not sure if I understand the study very well, but here is the author's summary:

Evolutionary biology tells us much about the immediate fate of a mutation once it appears, but relatively little about its long-term evolutionary implications. Major evolutionary transitions from one trait to another may depend on a long sequence of interacting mutations, each arising by chance and surviving natural selection. In this study, we characterize the network of mutations that connect diverse molecular structures, and find that this network biases evolution toward traits that are readily produced by one or a short sequence of mutations. This bias may prevent the evolution of optimal traits, a phenomenon they call the "ascent of the abundant."

Citation: Cowperthwaite MC, Economo EP, Harcombe WR, Miller EL, Meyers LA (2008) The Ascent of the Abundant: How Mutational Networks Constrain Evolution. PLoS Comput Biol 4(7): e1000110. doi:10.1371/journal.pcbi.1000110

So, it seems that DNA doesn't translate codon for protein into a predetermined structure. Or have I missed the point completely?

Leukocyte @4: "I would be interested to hear Dawkins' take on this."

From last week's New Scientist:
"What does Dawkins himself think? "The 'transgenerational' effects now being described are mildly interesting, but they cast no doubt whatsoever on the theory of the selfish gene," he says. He suggests, though, that the word "gene" should be replaced with "replicator". This selfish replicator, acting as the unit of selection, does not have to be a gene, but it does have to be replicated accurately, the occasional mutation aside. "Whether [epigenetic marks] will eventually be deemed to qualify as 'selfish replicators' will depend upon whether they are genuinely high-fidelity replicators with the capacity to go on for ever. This is important because otherwise there will be no interesting differences between those that are successful in natural selection and those that are not." If all the effects fade out within the first few generations, they cannot be said to be positively selected, Dawkins points out."

By Jason Failes (not verified) on 22 Jul 2008 #permalink

I haven't read this yet (this is a spur of the moment reply), but I just wanna thank you for writing a post on Epigenetics PZ!

Yay!

OK, time for a dumb question.

Epigenetic modifications like DNA methylation and histone jiggery-pokery can affect an organism's behavior, yes? Mouse A and Mouse B have the same genome, but Mouse A had a gene methylated during its embryo phase and grew slightly differently; with its altered body plan, Mouse A interacts differently with other mice. (Maybe it can no longer recognize kin, or something.) Assuming these epigenetic variations are effectively heritable, then selection should act on them.

Question: do we now have to start talking about the selfish epigene?

What I came up with over at bioblog: From a molecular standpoint, epigenetics consists of modifications to DNA, other than mutations, or chromatin which changes gene expression.

And the improved: Epigenetics is the study of modifications of DNA or chromatin that don't change the DNA sequence, yet have an effect on gene expression that is persistent through replication (either cellular or organismic)

It's not surprising that epigenetics is "messy".

Life is messy & unpredictable, as is any emergent property.

From Physics, you might recall the three-body (n-body) problem. It is impossible to absolutely predict the behavior of three (or more) bodies subject to a single force (i.e., gravity), due to their interactions with each other. (In other words, Newton's clockwork god can't accurately tell what time it is.)

Life is far more complicated & messy, with hundreds of factors (i.e., "bodies") interacting with each other in complex & various ways (i.e., multiple "forces"). The usual focus of science is on component parts (analysis) rather than whole systems (synthesis), where complex properties can easily emerge from simple components. A fine example is found in the behavior of social insects, where a tiny number of preprogrammed behavioral responses can elicit complex social behavior.

Often it is comforting to focus on something relatively simple (i.e., DNA) rather than the messiness of functioning living organisms.

What I came up with over at bioblog: From a molecular standpoint, epigenetics consists of modifications to DNA, other than mutations, or chromatin which changes gene expression.

And the improved: Epigenetics is the study of modifications of DNA or chromatin that don't change the DNA sequence, yet have an effect on gene expression that is persistent through replication (either cellular or organismic)

I don't much care for formal definitions of epigenetics... it seems bizarre to me that something as pedestrian as a transcription factor regulating levels of gene expression in euchromatin might be included, even though this does not necessarily mean "reprogramming cell fate"... Yet the ultra-cool recombination that occurs in B-cells and T-cells isn't considered "epigenetic", merely because the developmental reprogramming does occur on the DNA sequence level.

@Mike Haubrich: I am not so sure that particular article is easy to interpret. It sounds more like the evolutionary version of "when you have a hammer, all your problems look like nails" than DNA sequence != protein function.

Of course, DNA sequence != protein function, but the sequence puts constraints on the chemical composition and therefore spatial constraints that proteins can have. And, since many (coding and noncoding) DNA sequences are also important in regulating temporal and spatial (within the cell) parameters of a protein, they do affect function as well. How much information is in the sequence itself is an interesting question.

"can we finally get people to stop using a 'DNA as computer code' analogy? From what I understand, a single gene can be used to create different traits depending on how/when it is copied and to what use it is put"

That's what my genetics professors always say!

Are there examples of alleles being shut off or turned on in a foetus by maternal hormonal triggers during pregnancy?

By Akheloios (not verified) on 22 Jul 2008 #permalink

When I taught grade 5, we had a girl with Prader-Willi syndrome. Essentially, we always had to have low-carb snacks on hand for her, otherwise she would start eating anything and everything. Nice to understand what was going on though. Thanks !

By Annick Jean (not verified) on 22 Jul 2008 #permalink

So, I'd heard that the explanation for the incidence of color-blindness in men (more common than in women) was due to the lack of redundancy of genes for building colored light receptors. It seems like X deactivation would also eliminate this redundancy. But obviously it doesn't... what am I missing?

Leukocyte #4

What is really exciting about epigenetic research is that it might allow an "end around" to bypass the tricky mess that is genetic therapy

Conversely, epigenetics can very much complicate a straightforward alteration of the genetic sequence.

Nice write up, PZ.

Doesn't the expression of a gene depend on the neutral network in which it interacts?

Um, the way "neutral network" is used in that PLoS Computational Biology article, I believe it refers to the set of nucleotide sequences which are connected by neutral mutations. Each node in the network represents a nucleotide sequence, and nodes are connected to one another if a mutation can turn the first sequence into the second. All the sequences represented by all the nodes in the network are equivalent, in that they map to the same protein (or a functionally equivalent protein). In general, as the authors say, neutral networks

are sets of genotypes with identical fitness that are interconnected by neutral mutations [15]. In the RNA model, the genotypes in a neutral network are sequences that fold into the same shape and are connected to each other by paths of neutral point mutations.

If you start at one node in a neutral network, you can hop around via mutations to lots of other nucleotide sequences. Going through a non-neutral mutation, a mutation which changes fitness, takes you out of the neutral network. The larger your neutral network, the more possibilities you have for reaching different sequences.

Based on these characteristics, researchers have proposed that large neutral networks should facilitate evolution by allowing populations to explore vast regions of fitness landscapes through neutral drift [15],[18],[24],[26],[27]. There is some evidence to support this assertion, though it is largely based on sampling studies [15],[24],[26] or simulation studies with strong assumptions about the fitness landscape [26]. Most recently, Wagner (2008) showed that populations evolving on large neutral networks sample more alternative phenotypes than those evolving on small neutral networks, yet these populations were constrained to explore a single neutral network.

The neutral network is not the same thing as the gene regulatory network, in which the nodes are different genes and the connections tell how the products of one gene affect the expression of another.
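
A bare-bones sketch of that graph structure, with a stand-in phenotype function (GC content here, nothing like real RNA folding):

```python
# Cartoon of a neutral network: genotypes are short sequences, the
# "phenotype" is a stand-in function (GC content), and two genotypes are
# neutral neighbors if they differ by one point mutation yet share the
# same phenotype. Purely illustrative; real studies use RNA folding.

ALPHABET = "acgu"

def phenotype(seq):
    return sum(base in "gc" for base in seq)  # toy stand-in for a fold

def neutral_neighbors(seq):
    """All single point mutations that leave the phenotype unchanged."""
    out = []
    for i, old in enumerate(seq):
        for new in ALPHABET:
            if new != old:
                mutant = seq[:i] + new + seq[i + 1:]
                if phenotype(mutant) == phenotype(seq):
                    out.append(mutant)
    return out

# From 'ag' a population can drift, one neutral step at a time:
print(sorted(neutral_neighbors("ag")))  # -> ['ac', 'ug']
```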

However becca, the genesis of the concept of epigenetics was somewhat of a paradigm shift. I think what's cool is simply what you happen to be researching. For me it's hematopoiesis and tumorogenesis in zebrafish.

PZ, thank you for the well-written article that clearly explains a term that has caused me a lot of confusion. I really enjoy reading your science articles!

Wow! PZ, you laid out a complex agrument in a very clear fashion. I learned something. That makes this a good day.

By Nerd of Redhead (not verified) on 22 Jul 2008 #permalink

Another great science post, the surefire way to leave the godbots behind... your ink squirt, so to speak.

Well played, professor!

By Longtime Lurker (not verified) on 22 Jul 2008 #permalink

Firstly, Thanks PZ for such a thoughtful post on Epigenetics.

Are there any other mechanisms besides epigenetic ones that result in identical twins appearing physically different from each other? Oftentimes they even have opposite handedness, and that's probably also epigenetic in origin. Ultimately, it may be that the question itself is pointless, since everything is epigenetic, including the physical appearance of each individual, regardless of being a twin or not.

By Helioprogenus (not verified) on 22 Jul 2008 #permalink

I don't see how the definition quoted from West-Eberhard fits in.

The diagram of "[t]he two main components of the epigenetic code" really illustrates the truth of the selfish gene. It's all about Me, Me, Me...

By Richard Smith (not verified) on 22 Jul 2008 #permalink

Oh man, this is so much more fun than the usual pop oversimplifications. Plus: calico cats!

I agree strongly with the profs quoted by jj above. Thinking without Procrustean metaphors like "computers" and "programming" might make things seem more complicated, but it has the advantage of not being misleading. Seems to me that metaphors, like most ideas, are productive only when you have a grip on them and not vice-versa, so you can let go when they ramble off in wrong directions.

Thank you PZ for writing this up. I have been trying to figure this epigenetics stuff out and this was a great guide to pointing me in the right direction.

On a side note, if anyone wants to see what the postmodernist new-age retards have done with epigenetics, just do a quick search on youtube. "Dr. Bruce Lipton <----- possible retard"

Does it help, or only make things yet more confusing, to say that it's developmental capacities, rather than developed traits, that are inherited?

By bob koepp (not verified) on 22 Jul 2008 #permalink

Thank you PZ for this fantastic article! I'm soon to begin studying biology at the University of Bath as an undergrad and as such have never heard of epigenetics during my A-level study, until now. It's a really thought provoking topic, hopefully I'll learn more of it at university, thanks for bringing it to my attention now!

Thanks a lot for the well-written article! I understand epigenetics much better now.

@ PZ

Has the amount of information besides DNA that gets passed from gametes been quantified? If not, what would you estimate it to be on a percentage basis?

Also from my perspective as an engineer, the collection of genes in a genome looks like a massive state machine with genes being turned on and off by various mechanisms. How much of the genome would you estimate is responsible for this regulation?

@ Anyone Else

I will probably take a lot of shit for this question but here it goes:

Perhaps viruses are a mechanisms used by the Designer(s) to propagate non-random modifications to genomes. Has anyone else considered this? If you think this is completely impossible, why?

@ Randy Stimpson #43, If viruses were the only means of genetic modification, we would still have to explain the origins of viruses within a natural environment. Viruses don't just pop into existence. What you're proposing is that there is an intelligence guiding a virus into infecting a host with certain genes that may eventually benefit the host and, in turn, affect the evolutionary outcome of said organism. Yet this is definitely not the case. Besides all the other mechanisms that result in hereditary genetic modification, epigenetic traits as we know them can be passed down, yet we don't know if they're ultimately nullified or not after a certain number of generations.

By Helioprogenus (not verified) on 22 Jul 2008 #permalink

"Perhaps viruses are a mechanisms used by the Designer(s) to propagate non-random modifications to genomes. Has anyone else considered this? If you think this is completely impossible, why?"

It's not impossible, but if you don't mind, we'll consider all other naturalistic possibilities before concluding that a magic man done it.

I apologize in advance if you are a member of the small group of ID theorists who is actually trying to develop a science. You should realize that the originators of your movement were and are barely-concealed creationists responding to a court ruling and nothing more. If I was you, I would get as far away from them as possible to preserve my own credibility. Try panspermia research.

Another general comment, what makes the idea of a designer so far-fetched, is that in any known case, when a designer designs something, it has an immediate purpose. For example, if we were to terraform Mars, it would be to live there. If we took the alien equivalent of primates and genetically engineered them to be smarter, it would similarly be for some purpose, given human history probably the purpose of enslavement.

If we were intelligently designed:
1) They didn't do a very good job.
2) They either didn't have a purpose or
3) Their society collapsed before that purpose could be realized.

Oh, and in the absence of and actual phenomenon, any speculation about an intelligent designer is so much philosophizing into an old gym sock. Useless.

By Jason Failes (not verified) on 22 Jul 2008 #permalink

I meant to say "in the absence of ANY actual phenomenon"

By Jason Failes (not verified) on 22 Jul 2008 #permalink

@ Jason (#15) - Thanks! Couldn't have gotten a straighter answer unless he responded himself.

By Leukocyte (not verified) on 22 Jul 2008 #permalink

Perhaps viruses are a mechanisms used by the Designer(s) to propagate non-random modifications to genomes. Has anyone else considered this? If you think this is completely impossible, why?

First, we look for reasons to think it's plausible, rather than following the IDist program of considering everything they've thought up that is not disproven to be "science".

Secondly, no, oddly enough we don't look at parasitic evolving organisms such as viruses and think "designer". It would be strange if viruses were propagating non-random changes when they themselves appear to have enhanced mutation rates in order to supply their own "random mutations" (they're not totally random, but there is no reason to think they're guided) to evolve to survive unpredictable future conditions.

Viruses are doing what the rest of life is doing, evolving without being directed to do so (by all of the evidence we have). They greatly affect evolution, but there is no rationality or intelligence to their existence and persistence.

Glen D
http://tinyurl.com/2kxyc7

I will probably take a lot of shit for this question

Oh, you have no idea....

Perhaps viruses are a mechanism used by the Designer(s) to propagate non-random modifications to genomes.

Hey, perhaps we could, I dunno, study retroviruses and find out what their mechanisms really are?

Has anyone else considered this?

Sigh. Oh, yes.

http://endogenousretrovirus.blogspot.com/2007/07/ervs-are-functional.ht…

Also:

http://scienceblogs.com/erv/ervs/

By Owlmirror (not verified) on 22 Jul 2008 #permalink

Obviously the analogy with computers can be pushed too far - so can any metaphor - but this does sound rather like the behavior of a run-time optimizing compiler, such as is used by Java.

I'm so glad to find a combination of a more serious, educational post AND reasonable, meaningful comments, with content other than "lolz, xtians/atheists suck!" It seems that especially after the whole cracker episode the blog has become a hostage of meaningless drivel from both sides, on the intellectual level of "Your momma so fat..."

Having said this I'm afraid to ask my probably stupid question:

What is the controlling force behind epigenetics, then? I understand from your example of liver cirrhosis that at least environmental factors can mess with epigenetic processes, but what constitutes and controls the "normal" balance of methylated parts, histone variants, etc.? Seeing as it is called EPIgenetics and not e.g. METAgenetics, I'm guessing the answer will be most confusing to a Sunday biologist like me =)

From #8:

"PBS's NOVA had an interesting basic show on epigenetics called "Ghost in Your Genes":"

I caught the first half of that program and it looked pretty good. The level of complexity that it (epigenetics) adds to development and diversity is overwhelming to me. It *almost* makes me sympathetic to those who throw up their hands and say "God must have done it! It is so much easier that way!!"

I sense a bit of a culture clash here, with EvoDevo folks stuck in a bit of a schizophrenic state.

Evolutionary biologists, at least those with a more theoretical bent, rightly focus on non-genetic heritable states. They tend to subdivide the broad 'epigenetic' category based upon relative heritability, stability, and plasticity... the factors that matter for evolutionary dynamics.

Developmental folks do see epigenetics as a kind of 'duh, obviously *rolleyes*' thing. The subdivisions they gravitate to seem mostly mechanistic, which makes perfect sense since discovering those mechanisms is a lot of what developmental bio is about.

I tend to side with the Evo folks. Epigenetics is fundamentally an evolutionary biology term, and defining broad sub-classes (or otherwise narrowing it down) based on the evolutionary dynamics of different heritable mechanisms is a really worthwhile endeavor. Devo should certainly group different mechanisms together as similarities are identified, but this is more of a bottom-up thing.

I don't know if I'm being really clear, but let me put it this way: since epigenetic factors are ubiquitous in developmental biology, the actual term 'epigenetic' is pretty useless. Leave it to evolutionary biology, where it is actually a useful distinction.

That is f***ing awesome. Period. I love biology.

By Mike Beavington (not verified) on 22 Jul 2008 #permalink

I don't know if I'm being really clear, but let me put it this way: since epigenetic factors are ubiquitous in developmental biology, the actual term 'epigenetic' is pretty useless. Leave it to evolutionary biology, where it is actually a useful distinction.

Clear enough; I thought you highlighted the distinction succinctly.

defining broad sub-classes (or otherwise narrowing it down) based on the evolutionary dynamics of different heritable mechanisms is a really worthwhile endeavor.

I think it would be good to go into a few examples that would actually demonstrate why it's worthwhile.

Have you run across a few good recent cites?

#43 Randy Stimpson aka Intelligent Designer

Perhaps viruses are a mechanism used by the Designer(s) to propagate non-random modifications to genomes.

You mean, perhaps those oncoviruses may have been intelligently designed?

http://en.wikipedia.org/wiki/Oncovirus

By Sauceress (not verified) on 22 Jul 2008 #permalink

Damn, PZ, you beat me to it, although you did a better job than I would have... Can you leave Ensatina alone for about 3 more days?

Thanks PZ! That was great. I love epigenetics. Freaking cool.

The first definition given, "the study of heritable traits that are not dependent on the primary sequence of DNA", would include culture, if you judge heritability simply on the correlation between parent and child, without restricting the mechanism. I'm not sure I got the whole of the second, much more complicated definition, but so far as I can see it would allow culture as well. I'm not sure whether this is actually a disadvantage: as both are systems that pass on information acquired by an organism to its descendants, it would surely be useful to compare culture (at least in non-human animals) with the kinds of epigenetics PZ describes.

By Nick Gotts (not verified) on 22 Jul 2008 #permalink

One of the coolest things I have seen about epigenetics is the changes to gametes brought on by environmental stresses such as wars or famines. I saw a study not long ago about how children born of parents who had lived through a famine in their childhood were smaller and more prone to diabetes, even though they never themselves suffered from malnutrition.

That is what is revolutionary about epigenetics: the possibility of transmitting to offspring an acquired trait brought on by an environmental stress. So-called Lamarckism!

Zombie @#25:

So, I'd heard that the explanation for the incidence of color-blindness in men (more common than in women) was due to the lack of redundancy of genes for building colored light receptors. It seems like X deactivation would also eliminate this redundancy. But obviously it doesn't... what am I missing?

Well, if I understood correctly (and if I didn't, someone will no doubt correct me), you missed the implication of " an invisible [...] mottling of cells that have arbitrarily shut down one or the other X chromosome". That is, some of the retinal cone cells will indeed shut down the X chromosome with the functioning pigment gene, but others will shut down the X chromosome with the defective pigment gene, and will work properly.

As I recall, the development of the retina is to a certain degree random (that is, whether any particular cell becomes a cone sensitive to the red, green, or blue frequencies, or a rod cell), and the optic nerve takes the gestalt of the signals from these cells anyway.

It's been a while since I skimmed that info, and the visual system is weird and complex, so take that with a grain of NaCl.

By Owlmirror (not verified) on 22 Jul 2008 #permalink

#43-- @ Anyone Else

I will probably take a lot of shit for this question but here it goes:

Perhaps viruses are a mechanism used by the Designer(s) to propagate non-random modifications to genomes. Has anyone else considered this? If you think this is completely impossible, why?

One of my research projects is on how loss of epigenetic control of ERVs leads to cancer. Does that answer your question? ;)

To add another example to Zombie's question at #25 and owlmirror's response at #61, females get Duchenne muscular dystrophy at much lower rates than males because the dystrophin gene is on the X chromosome. If you actually look at the muscle cells of a female dystrophin mutation carrier, you'll see that about half of the cells do make the "bad" dystrophin. It's just that your body can deal with a loss of 50% of most proteins with few ill effects.

Pianismi #51:

What is the controlling force behind epigenetics, then? I understand from your example of liver cirrhosis that at least environmental factors can mess with epigenetic processes, but what constitutes and controls the "normal" balance of methylated parts, histone variants, etc.?

In many cases, the controlling factors are themselves genetic -- for example, proteins called transcription factors can bind to DNA and recruit histone-modifying proteins to shut down or open up the locus.

My favorite protein is a transcription factor, and it binds to its site in neurons to shut down the transcription of genes that aren't needed in those cells.

By molliebatmit (not verified) on 22 Jul 2008 #permalink

An altogether excellent exposition on the subject! And about time. I asked you to do this 2 years ago in a comment on a thread about clonal differences where most everyone seemed to want to ignore (rather understandably, perhaps) epigenetic effects.

And speaking of about time, where DO you find the time and energy, PZ? Remarkable. Maybe you've been working on this for lo these past two years? Please don't tell us that you typed it up in 25 minutes.

By William Gulvin (not verified) on 22 Jul 2008 #permalink

Re: DNA as computer code - software can do something rather similar. Ken Thompson (the unix guy) wrote a famous paper back in the 70s called "Reflections on Trusting Trust"
http://cm.bell-labs.com/who/ken/trust.html
He tells a story about how a C compiler comes to know that the sequence '\n' in C source code stands for the ASCII newline character, represented by the number 10. If you look at the source code for the C compiler - which is itself written in the C language - you find that it simply defines '\n' as '\n'. The source code must be compiled by a working compiler, and such a compiler will already "know" that '\n' means 10, so the circular definition produces another working compiler that "knows" '\n' is 10, i.e. it's a heritable trait. If a cosmic ray turned the 10 into an 11 (in one particular copy of the binary) and you re-compiled the unmodified source code, you would get a compiler that thinks '\n' is 11, and it would pass that on to its "offspring" reliably.

The web browser you're reading this on was compiled with a compiler that inherited this and many other things "epigenetically" (i.e. not encoded in its high-level source code) for many generations, probably beginning with a C compiler written in assembly language sometime in the early 70s.

Another Sunday biologist here...

Does this mean that Lamarck gets to come back in from the cold?

By pipsqueak (not verified) on 22 Jul 2008 #permalink

to #66
No, epigenetic modifications to DNA are reset during gametogenesis, with a few exceptions. Lamarck was still wrong with the "acquired characteristics" thing, as he was talking about physical traits such as color, extremity length, etc. This is more of an "epigenetic modifications can survive for a few generations" situation. This is probably due to RNAi.

Well, thanks, rhr, but how does survivability affect the '\n' being eleven? Computer language, once compiled, still has to be read and interpreted by memory, and the memory itself isn't a copying mechanism. It simply does what the instructions tell it to do. In copying from DNA to mRNA, the sequences are templates for the structure of the protein being assembled, not instructions.

So, while it's intriguing to follow changes in code as if they were mutations I still think that the analogy is misleading.

"So, it seems that DNA doesn't translate codon for protein into a predetermined structure. Or have I missed the point completely?"

'fraid so! Sorry... :-)

Here's a very simplified version of the DNA -> codons -> amino acids -> protein structure. Reality is more complex, as evolution tinkers away throwing up quirky variations on the basic scheme, so this is the gist of the thing.

The portion of the DNA encoding a gene is transcribed into RNA (messenger RNA). Every three adjacent bases of mRNA make up a codon, which provides the binding site for one tRNA. Each tRNA carries a particular amino acid. Each new amino acid is added to those before it, building up a chain of amino acids in the order given in the gene. Hence, the DNA sequence of the gene specifies the particular order of amino acids in the protein.

The protein fold (or three-dimensional structure) is a consequence of the physical and chemical properties of the particular sequence of amino acids. Here's an explanation I posted on another blog:

The amino acids used to make proteins are composed of a "backbone", which is the same for all the amino acids, and "side chains". The side chains of different amino acids differ and are what make each particular amino acid unique. Different side chains have different chemical and physical properties. Some dislike being in water (are hydrophobic). Others are "charged" in a variety of ways. Those with like charges repel one another. Those with complementary charges will interact to remain in proximity to each other, given the opportunity.

The key point is that amino acids have physical properties that affect how they respond to amino acids near them and to the solvent (the solution the protein is in). These physical properties cause the protein to fold. Order amino acids into a chain, as they are in a protein, and a particular order of amino acids will have particular interactions with its neighbouring amino acids in the chain and with the solvent it's in. The particular set of interactions defined by the order of the amino acids in the chain results in the protein "folding" to form a particular three-dimensional shape.

It's been known for a long time that some proteins bind near genes to control whether they are used or not. So a picture emerges that all that is needed to "drive" a genetic system is the genes and the proteins controlling them. (The other bits being considered subservient to these two.)

What epigenetics does is add other things that can control whether a gene is used or not. It doesn't really change the basic process of "reading the gene" (making the mRNA) and converting it into a protein, or the process of folding a protein that I outlined above, but adds new ways of controlling the process that starts with a DNA sequence and ends up with proteins. The existing understanding of "how genes are read" hasn't been replaced so much as added to. [I'm aware of the details, for those who want to nitpick: I'm trying to be simple here!]

To rattle on a bit more...

You can chemically modify the DNA in response to environmental events (e.g. methylation and ethylation of DNA bases). These modifications can block proteins from binding that portion of DNA in order to control a nearby gene.

You can modify proteins after they are made, for example by adding small carbohydrate molecules to specific parts of a protein. (Some people like to write of "decorating" proteins!) This includes proteins that hold the DNA in place in the nucleus, wrap it into a compact form (e.g. histones), or control whether a gene is used (transcription factors), altering their functions. You could add chemical groups to mark the "state" of the gene (active, inactive), or make the protein inactive or active for a particular function.

Newly identified types of RNA (e.g. RNAi) control parts of the basic process I outlined above.

There is evidence that DNA can be moved within the nucleus, making the DNA available to be "read" (or not). This can be viewed as organising the nucleus into "domains".

DNA can be arranged in loops anchored at the base. Take two "motifs" (specific DNA sequences), bind proteins to them, and pull them together to form a loop. Genes in the DNA within the loop are controlled by protein binding sites within the loop. If you make a loop from two motifs that are further apart, you can include more genes or more binding sites for proteins that control genes, altering how those genes are controlled. [Horribly simplified, again...]

And so on. PZ has given you specific examples of these general things, but I guess putting them another way doesn't hurt!

By Heraclides (not verified) on 22 Jul 2008 #permalink

Thanks for posting this PZ.....always appreciate the information on evolution.....hopefully the Texas heat and approaching hurricane will not cause any environmental insults !!

#19: "It's not surprising that epigenetics is "messy".

right....which also makes the fossil record "messy."

What historically has been perceived as "evolutionary" change in the fossil record may very well be epigenetic (aka non-evolutionary change). Of course this means that any trait once thought to take thousands or millions of years to evolve (assuming it evolved at all and wasn't created) may very well have appeared epigenetically in each member of a population simultaneously. Epigenetics shrinks the fossil record and perceived "evolution" to a vastly more YEC-friendly timetable. Darwin's (and Grant's) finches, for example, probably just experienced an epigenetic change during development, which happened as a response to environmental challenges; this allowed them to generate their adaptive beaks. Same with cichlids and honeycreepers and guppies: all the other so-called examples of "evolution" aren't "evolution" after all. ToE is a complete failure.

"can we finally get people to stop using a 'DNA as computer code' analogy? From what I understand, a single gene can be used to create different traits depending on how/when it is copied and to what use it is put

That's what my genetics professors always say!

Computer programs do this sort of thing all the time, including the ones sending you the html for this page.

The UNIX fork() system call---traditionally used to create every process in the system except the first one---copies the parent process's state, including data as well as code, and then hands the child process whatever code pointer it wants it to start executing.

(Modern implementations use efficient virtual-copy techniques, where actual data pages are shared until one or the other process attempts to modify one, at which point the page is transparently copied to preserve the illusion that the whole program and process state was copied during the fork().)

You might also have a look at dynamically bound variables in LISP or fluid variables in Scheme. It's the same basic idea, applied to selected data for nested subroutine calls within a single process.

Or have a look at prototype-based object-oriented languages like Self. Cloning is the basic operation for creating an object in the language. There are no explicit classes---you just copy an object and modify it, and copy the copy as many times as you want, if you want to use it as a class template. That's like having epigenetics only, and no genes at all.

Many languages and systems have less elegant means of accomplishing the same type of thing, because it's very useful. If the OS or language doesn't support it explicitly, programmers will hack up something with the same effect. (It's unsurprising to me that evolution found this very basic and useful trick a zillion years before we thought of it.)

This is one of the many, many ways biologists say that the genome is "not like a computer program" that turns out to be just like a computer program, if you really understand computer programming.

What historically has been perceived as "evolutionary" change in the fossil record may very well be epigenetic (aka non-evolutionary change).

what part of "heritable" did you miss, fuckwit?

stan, you need to take a refresher course in reading comprehension. Epigenetic traits are HERITABLE. That means that the mechanisms of evolution apply to epigenetics in exactly the same way as they do to genetics. Natural selection works on any form of information that is passed with variation from one generation to another. Whether that information is stored in the form of nucleic acid base pairs, histone protein configurations, methylation patterns, concentration gradients within a germ cell, or any other mechanism, does not matter.

Epigenetics reveals a whole new range of factors that the mechanisms of evolution can act upon, and increases the explanatory power of the ToE.

amphiox, you've got some learning to do and luckily for you I am here to teach you.

Charles Darwin, who died over 100 years ago and was ignorant about genes, cells, and basically every biological process, claimed that natural selection (differential breeding success) was the CAUSE of adaptation and evolution....that any variation that arose was just a crazy accident in a single lucky individual and therefore his beloved NS got the credit for proliferating this accident, thus slowly swaying populations in adaptive directions via reproduction. This still is, to this day, the theory of evolution.

So in order to qualify as "evolution" -- whether it be micro or macro -- natural selection must be the cause. Not only that, but in order to qualify as "evolution" natural selection must select from a randomly-arising genetic event (mutations)....in this way evos would have an explanation of how genomes came to be: the selection of random genetic events. If natural selection is not the cause of a genetic adaptive trait, then it is not, by definition, "evolution," as it is not an example of how a genome (or part of a genome) could come to be accidentally.

Here is Talk Origins' confirmation that in order to qualify as "evolution" that natural selection must be the cause.

http://www.talkorigins.org/faqs/faq-...o-biology.html

Natural Selection

Some types of organisms within a population leave more offspring than others. Over time, the frequency of the more prolific type will increase. The difference in reproductive capability is called natural selection. Natural selection is the only mechanism of adaptive evolution;

Dobzhansky:

"The process of mutation supplies the raw materials of evolution, but the tempo of evolution is determined at the populational levels, by natural selection in conjunction with the ecology and the reproductive biology of the group of organisms"

Simpson:

"Adaptation has a known mechanism: natural selection acting on the genetics of populations

Haldane:

"Variation is in some sense random, but natural selection picks out variations in one direction, and not in another

Mayr:
"It is most important to clear up first some misconceptions still held by a few, not familiar with modern genetics: (1) Evolution is not primarily a genetic event. Mutation merely supplies the gene pool with genetic variation; it is selection that induces evolutionary change."

Gould:

"The essense of Darwinism lies in its claim that natural selection creates the fit. Variation is ubiquitous and random in direction. It supplies the raw material only. Natural selection directs the course of evolutionary change."

So in order for "evolution" to occur, what science needs to show is an example of what their theory actually says, which is that randomly-arising genetic changes get proliferated by natural selection. If the trait -- such as an epigenetic trait -- arises as an internal response to an external cue, then natural selection isn't responsible for the adaptive change; what's responsible is the internal mechanism.

I have yet to see a single example of "evolution" by natural selection. Of course evos will peer into the genome and observe that genomes change on cue in the face of environmental threats, but that does NOT prove their theory -- all that proves is that the genome can somehow restructure itself upon need. There is no proven randomness, there is no spontaneity, there is no selection, there is no death, there is no differential breeding success, there is nothing to prove "evolution," as defined by the theory.

Likewise they'll look out into nature, observe a populational change in organisms, and just blindly give the credit to "evolution," despite the fact that there is no scientific evidence that "evolution" did it...and there is often no confirmation that the adaptation was even genetic. They just blindly proclaim "evolution" and hope people buy it. Yet it's never -- ever -- validated.

What's worse, evos are afraid of conducting the types of controlled experiments on animals that would validate their silly selectionist theory because they KNOW what would happen: each individual placed in an environmentally stressful situation would respond purposefully and adaptively, either during development or during its lifetime after conception. Not only that, but the change very well may not be genetic, which leaves open the question of what actually caused the change in the first place...(aka "mind," "Intelligence," "purposeful response," etc.)

So the challenge here, is for evos to present me even ONE example -- one controlled scientific experiment -- of microevolution, macroevolution or any other sort of evolution that demonstrates that natural selection has caused a genetic adaptation via the selection of a random genetic event in the animal kingdom. You must show the mutation and you must show that natural selection proliferated it.

This stuff IS science, right?

stan @ 71: bull-cocky. You're even more outlandish than Lipton in your desperate attempts to use the word 'epigenetics' to make your beloved ID "right."

More seriously, I doubt epigenetics adds any new problems to paleontology.

Firstly, a separate species is a separate species regardless of what the precise underlying speciation events for that species were.

Secondly, paleontologists already have to consider phenotypic variation within species in a variety of ways (diseased individuals, age, sex, etc.): I doubt new genotypic mechanisms for causing phenotypic differences change much for paleontologists; they'll worry only about the phenotypic side of it, which they already deal with.

Besides, look at the people around you in a busy mall. Let's assume we all have epigenetic variation. OK. But you don't see anyone in the mall who has spontaneously blipped off into a new species, do you? :-)

(Speciation is more than change in individuals: you need at least the selection of, and reproductive isolation of, a changed group.)

By Heraclides (not verified) on 22 Jul 2008 #permalink

heraclides, you don't even know what a "species" is. If you care to debate this, please explain to me why science calls different finch varieties different "species" yet "evolution" (aka RMNS) was not responsible for their differences. (How can speciation happen without evolution?) Also, why does science call polar bears and brown/black bears different species when all of these can interbreed and produce viable, fertile offspring? Why label them as different species?

anyone have a clue why the scientific community hid and suppressed epigenetics for decades? I think it's because Atheist/darwinist scientists are the most corrupt, immoral animals on the planet and they know that purposeful, self-emergent (non-evolutionary) change contradicts the notion that natural selection is responsible for all adaptation/fitness. These pigs have been hiding all sorts of biological processes for many years from the unassuming public. All atheists are interested in is lying to the public...well, that and killing babies, old people, and anyone else who disagrees with them politically. I personally believe all darwinists should be shipped to their own private island somewhere out in the Pacific...somewhere they can't bother people with their silly lies and fairytales.

@owlmirror & molliebatmit: Thanks, for some reason I got the impression that the switch was thrown in the egg and propagated to daughter cells as it divided (except in specific cases of mosaics like calico cats).

Just to be annoying, though: you'd think that if photoreceptors express one or the other copy randomly, tetrachromatic women would be as common as men with anomalous trichromatism (that is, three-color vision but with one of the colors peaking at a different wavelength than normal).

Something I read today (trying to figure this out) suggested that the X inactivation somehow picked the "good" gene, but I'm not clear on how that works, either.

Come to think of it, in the case of calico cats, the expression of one copy or the other is "patchy" rather than completely mixed.

Incidentally, I happen to be in the middle of Zimmer's Microcosm, about E. coli, and he talks about epigenetic switches in E. coli's metabolism that apparently are inherited by daughter cells.

When you come off your meds you come down hard, don't you Stan?

By Wowbagger (not verified) on 22 Jul 2008 #permalink

"stanning" (a spamming version of stan :-) ??!

I often choose not to respond to idiots, particularly if they try the ever-so-original "you're ignorant" lead-in. But read post 74 and others. (My post crossed with 74.)

Paul: Some of the people saying "no", will be computational biologists, who have programmed for 20+ years. Just a thought.

By Heraclides (not verified) on 22 Jul 2008 #permalink

Oh well. Dinosaurs might be no match for modern parasites anyways.

Oh, look, stan is back.

Hey, Bolshevist Lysenkoist Comrade stanislavski, when are you going to demonstrate your magical purposeful self-emergent changing cells by JUMPING IN THE OCEAN AND GROWING GILLS? When will you walk naked in the Antarctic winter?

I personally believe all darwinists should be shipped to their own private island

That is called GULAG, BOLSHEVIST COMRADE LYSENKOIST. We once again see you showing your true bloody colors.

By Owlmirror (not verified) on 22 Jul 2008 #permalink

Stan is a Poe, right? Indistinguishable from satire.

By John Scanlon FCD (not verified) on 22 Jul 2008 #permalink

Some of the people saying "no", will be computational biologists, who have programmed for 20+ years. Just a thought.

Well, OK. I've been programming computers for longer than that, in a variety of programming paradigms, but I'm not a biologist, so it's possible I just don't get it.

All I can say is that (in light of the CS I know) the more biology I learn, the less convinced I am that the genome is not a computer program.

If there's a reason why the genome is not a computer program, maybe a computational biologist can explain it to me, but most of the regular biologists saying so don't seem to actually know what it is. They get hung up on unnecessary and insufficient conditions for being a "computer" or a "program."

They keep saying it's not because computer programs don't do x, where x is in fact something that some computer programs do. (And often something that programs they use every day do frequently.)

BTW, we hashed a bunch of this stuff out a few months ago in this older thread:

http://scienceblogs.com/pharyngula/2008/02/the_genome_is_not_a_computer…

(Oh, and I really like PZ's post on epigenetics. I meant to say that before. Thanks, PZ!)

Paul:

"Well, OK. I've been programming computers for longer than that, in a variety of programming paradigms," and so have plenty of computational biologists ;-) After all the earliest papers in the field are from (at least) the mid 1960s.

You'd do better if you learn some chemistry and physics as well. Biological systems are in the end molecular systems behaving according to the laws of chemistry and physics. And they are collections of molecules with all their random fluctuations, motions, diffusion and whatnot. In the end they are large populations of molecules driven by their molecular properties.

Molecular concentrations, reaction efficiencies, fluid dynamics, molecular dynamics, etc., and the "random" element in all of these (and others) play a role.

For example, you'd find the control of the expression of most genes is often (usually) "leaky". You'd find that thresholds in biological systems aren't hard-and-fast but somewhat fuzzy and prone to quirks of chemistry. You'd find that populations of cells of the same type aren't all perfectly the same. There's plenty of "noise" in the system, as you'd expect in molecular systems.

I could go on and on, but suffice to say the computer program thing is an analogy. As you wrote "like", not "is."

It's an analogy that gets overplayed, basically (or is out of date). Fine for getting rid of the party bore insisting you talk about your work, or perhaps for laying out a few starting points before noting it's a poor analogy for the real thing, but lousy as a serious comparison.

You might also want to remember that a number of developments in computer science borrow principles from elsewhere in science, so vague similarities in terminology and overarching concepts can be seen in some things, but they're not literally the same thing.

To fully appreciate this you need to know both, as you were suggesting.

Thanks for pointing out the older thread, but I doubt I'll have time for it. It's horribly long and, in any event, glancing at it, it's got a bit of a low signal-to-noise ratio for my interest.

By Heraclides (not verified) on 22 Jul 2008 #permalink

A bit of science is like a breath of fresh air in an atmosphere that had got positively fuggy due to the crackers. I wonder, however, about environmental epigenetic factors such as inheritance of constructed niches.

Every physical system can be fully characterized by a computer program. Why would the genome be any different?

Maybe when biologists say the genome isn't a computer program, they only mean that "computer program" isn't the best perspective to adopt.

@zombie #79

I think that the frequency of tetrachromatic women is pretty high, but not tested for very often. Any time I wear more than one item of clothing with red in, I'm pretty much guaranteed to get several female friends going "Eeeew, that clashes, like, sooooo bad!"
As for calico cats, the X-activation must happen at the stage of 20-or-so epidermal cells, which then grow into patches, I'd guess?
DH

Zombie @ #79:

you'd think that if photoreceptors express one or the other copy randomly, tetrachromatic women would be as common as men with anomalous trichromatism

Maybe they are...

http://en.wikipedia.org/wiki/Tetrachromatism

Yes I know it's wiki, but I've heard of this elsewhere too.

I swear comment #89 wasn't there a moment ago, but the timestamps say there was a 15 minute gap! Has it really taken me that long to read the post and all comments?

*mutter mutter grumble grumble*

Prader-Willi Syndrome is god-awful. The reason they are obese is that they have no ability to sense satiety. As you can imagine, this makes them nasty as well as obese. Pyromania is also more frequent among them.

"Hey, Bolshevist Lysenkoist Comrade stanislavski, when are you going to demonstrate your magical purposeful self-emergent changing cells by JUMPING IN THE OCEAN AND GROWING GILLS?"

I would never claim that the origin of all bodily structures can be accounted for by "natural" biological processes. I don't believe humans came from fish or can turn back into fish, genetically, epigenetically, or otherwise.....so your statement is just pretty silly.

DaveH #89:

As for calico cats, the X-activation must happen at the stage of 20-or-so epidermal cells, which then grow into patches, I'd guess?

Human Molecular Genetics (available for free search at the NCBI Bookshelf) says X inactivation occurs pretty early -- late blastula in mice and probably also humans -- and is clonally inherited.

Zombie #79:

Something I read today (trying to figure this out) suggested that the X inactivation somehow picked the "good" gene, but I'm not clear on how that works, either.

I just read a review (Pubmed ID #9442908, if you have access), and it suggests that most nonrandom inactivation is actually the result of selection -- that the cells carrying the active mutant allele are outcompeted.

By molliebatmit (not verified) on 23 Jul 2008 #permalink

I don't believe humans came from fish or can turn back into fish, genetically, epigenetically, or otherwise.....so your statement is just pretty silly. - stanner

No stanner, it's your statement that is just pretty silly, since the evidence that humans are indeed the evolutionary descendants of fish is overwhelming, and accepted by all relevant experts.

By Nick Gotts (not verified) on 23 Jul 2008 #permalink

Well, Stanner, since you've thrown up your hands and decided that a magic man done it, I guess you're out of this discussion.

We'll be up here talking about interesting new research if you decide to join us at any point.

You can go down to the pub and have a drink with Randy Stimpson. You guys can talk all night about how you two already know everything, and why it's entirely sensible to ignore the field of genetics and the fossil record in favor of an invisible magic man who has never in the slightest been detected.

Remember, even if God is the ultimate cause, in every case we have found a natural proximate cause, and that's what science is interested in. So be a Miller, not a Behe.

By Jason Failes (not verified) on 23 Jul 2008 #permalink

Thanks for pointing out the older thread, but I doubt I'll have time for it. It's horribly long and, in any event, glancing at it, it's got a bit of a low signal-to-noise ratio for my interest.

Heraclides, you seem to be making several of the same mistakes several people in that thread made.

Basically, I think you're one of the people who is hung up on unnecessary and insufficient conditions for something to be a computer or a program, and talking down to people who actually have a better understanding of the issues than you seem to.

All that sophisticated rhetoric yielding nothing more than more speculation, more questions, more conjecture, and little in the way of conclusive evidence. Whatever floats your boat!!

By Ronald Cote (not verified) on 23 Jul 2008 #permalink

"...personally believe all darwinists should be shipped to their own private island somewhere out in the Pacific..."

If you can arrange for Tahiti, I'm in!

If you want to explain how Biology Thing X, which might be a genome, a brain or whatever, is or is not like a computer program, you have to go out of your way to explain what you mean by "computer program", first. In my experience, arguments over whether "the mind/brain relationship is analogous to the software/hardware relationship" tend to get confounded by somebody's restricted understanding of the term software. To them, it might mean a sequence of instructions executed one after the other with the odd JUMP command, whereas someone else will see "software" as a higher-level description of computer behavior, as thermodynamics is to statistical mechanics. (Many different configurations of atoms — statistical-mechanical "microstates" — map to the same thermodynamic situation, or "macrostate", much like many different flows of electrons map to the same effective state of a RAM chip. At an even higher level, many different sequences of 1s and 0s in RAM all map to the situation, "My laptop is running Firefox on Ubuntu with Pharyngula open in two tabs, etc." All of these levels of description have to be invoked at one time or another, depending on the problem you're trying to solve.) Thanks to the perennial affliction with Inigo Montoya Syndrome — "I do not think it means what you think it means" — whenever these analogies come up, we have to spend so much time explaining the other end of our comparison that one might well wonder why we bothered in the first place.

I would never claim that the origin of all bodily structures can be accounted for by "natural" biological processes.

Yes. I know. That's why I wrote "magical".

I don't believe humans came from fish

Because you're an ignoranimus.

or can turn back into fish, genetically, epigenetically, or otherwise.....so your statement is just pretty silly.

Yes. Your entire ridiculous idea of how biology works is indeed contradictory and silly. So sprout gills or shut up.

By Owlmirror (not verified) on 23 Jul 2008 #permalink

If you want to explain how Biology Thing X, which might be a genome, a brain or whatever, is or is not like a computer program, you have to go out of your way to explain what you mean by "computer program", first.

Right. And saying that something is not a computer program is a big statement---much bigger than most people saying it seem to realize.

A computer program can be just a set of conditional rules, executed in parallel and nondeterministically by default.

The genome seems to be mostly that---what computer scientists call a "production system."

A rule in a production system is something like

If A and B and not(C) and not(D), then F and G

At a low level, the genome seems to be a production system with a gene implementing a rule. The propositional variables A, B,... etc. are represented by chemical concentrations in the relevant plasm, repressors, etc. More precisely, they're represented by the concentrations of molecules with particular binding sites that represent variables.

At a very low level, you can view each transcription of a gene as implementing a rule firing. The preconditions for firing are just the docking of various molecules that enable the rule to fire, or the non-docking of repressors that would prevent it from firing.

Whether a rule fires depends statistically on chemical concentrations (with a lot of noise affecting whether the rule fires at any given moment); it also depends on whether the transcription machinery is busy doing something else, accidents of things moving around in the cell, etc.

That's okay. It's still a forward-chaining production system and hence a computer program. It's an extremely low-level, nondeterministic, massively parallel bitch-to-program computer program, but a program.

At a slightly higher level, and a longer timescale, the statistical properties of the rule firings make the rule act more reasonably, with concentrations of inputs affecting concentrations of outputs. That lets you (evolution) use feedback loops and hysteresis to implement mostly digital switching, and use chemical concentrations either as boolean variables or as crude analog scalars.

The fact that the variables are noisy analog scalars at a low level, and many are only boolean at a high level, does not affect whether this is a computer program. The fact that some of the variables are analog at a high-level doesn't matter either.

What makes it a production system is that the rules themselves are discrete, with specific dependencies through particular variables. The fact that the variable values are not discrete is not a problem for program-ness. It's a hybrid digital/analog computer, but it's programmed with discrete rules, so it counts.

(The fact that it may not be a universal computer doesn't really matter either. Universal computers are a subset of programmable computers.)

This kind of computer doesn't look much like the von Neumann machines most people are familiar with. That's because von Neumann machines are a special case of this very general sort of thing, with a very stereotyped pattern of rule firing. (Each instruction is a rule, and there's an implicit precondition and postcondition on each rule, to enforce sequential execution by default.)
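The rule scheme described above can be sketched concretely. Here is a minimal forward-chaining production system in Python; the facts, the two rules, and the deterministic fire-until-quiescent loop are invented stand-ins for the noisy, parallel chemistry being described, not real regulatory logic:

```python
# Toy forward-chaining production system. Each rule is a triple
# (must-be-present, must-be-absent, products) -- the first rule reads
# "if A and B and not(C) and not(D), then F and G".
RULES = [
    ({"A", "B"}, {"C", "D"}, {"F", "G"}),
    ({"F"},      set(),      {"H"}),       # a downstream rule chained off F
]

def run(facts, rules):
    """Fire every enabled rule until nothing new is produced.
    (A deterministic simplification: real transcription 'firings'
    would be stochastic and concentration-dependent.)"""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for present, absent, products in rules:
            if present <= facts and not (absent & facts) and not (products <= facts):
                facts |= products
                changed = True
    return facts

print(run({"A", "B"}, RULES))       # the first rule fires, which then enables H
print(run({"A", "B", "C"}, RULES))  # C acts as a "repressor": nothing fires
```

With {A, B} the system derives F and G and then H; adding the "repressor" C blocks the first rule, so no products appear at all.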

The genome = computer program analogy is pretty unhelpful. Not because it is really false (in a deep sense everything is a computational system), but because it is misleading to the vast majority of people.

Instead of saying "computer code" or "computer program", let's go with "computational system" or, even better, "production system". Yeah, these aren't very useful for explaining things to the man on the street... but at least they are much less likely to mislead people.

Here's where I think the problem is --- the metaphor of genes as "programs". The common thinking in biology is that genes act as a "program", like in a computer.

Unfortunately, that's only a small part of the story. Biology is filled with programs, most of which are not "genetic" but act at a higher level. Development is filled with them, but they aren't recognized as distinct programs that are being programmed by genes.

Genes are the master programmer, acting at key points to direct higher level programs, which are themselves programmers, and so forth. There is no "epigenetics" as a field -- there are multiple domains of programs that are organized but not determined by the programmer.

It also makes the whole mess appropriately difficult to study --- anyone who has written code to produce programs that themselves produce code will recognize the difficulty of analyzing such systems, particularly when the code maintains other levels of code.

Pick the right metaphor, and everything becomes clearer.
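The "code that produces code" difficulty mentioned above can be shown in a deliberately tiny Python sketch (the function name and the scaling example are invented for illustration): the behavior you ultimately care about lives in code that does not exist until the first-level program has run, which is exactly what makes such layered systems hard to analyze statically.

```python
def make_scaler_source(n):
    # Level-1 "programmer": emits the source text of a level-2 program.
    return f"def scaled(x):\n    return x * {n}\n"

# "Develop" the level-2 program by executing the generated source...
namespace = {}
exec(make_scaler_source(2), namespace)

# ...and only now does the level-2 behavior exist to be observed.
print(namespace["scaled"](21))  # 42
```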

"Heraclides, you seem to be making several of the same mistakes several people in that thread made."

Oh, right. Start by accusing me of being ignorant. Either you're not worth the time (people who start with this sort of thing rarely are, in my experience) or this is a (subconscious) ploy to get rid of me because you are worried about being shown up. But "whatever."

"Basically, I think you're one of the people who is hung up on unnecessary and insufficient conditions for something to be a computer or a program, and talking down to people who actually have a better understanding of the issues than you seem to."

Ever occur to you that this applies to you? (But about biology.)

The impression I get is of someone who wants this "computer program" idea to be true by repeatedly bashing on, which does make people wonder if you're really "listening." You seem a bit obsessed with it, too. It's only an analogy, for goodness' sake.

By Heraclides (not verified) on 23 Jul 2008 #permalink

Heraclides,

I don't think dueling dismissiveness is going to get us anywhere.

You seem to think I'm an obsessed everything-looks-like-nail computer weenie, and you seem to me unable to take seriously the idea that I might be right.

The only way for either of us to know who's right, or learn something from each other, is to actually discuss the issues, rather than saying "you're wrong, go do your homework in my field," without actually convincing the other that there's something particularly relevant to learn, or giving some direction as to what it might be.

I may be wrong, but you do seem to me not to understand what I'm actually saying. I could well be wrong about that, and would happily stand corrected if so.

I'm not defending calling the genome a computer program as a useful analogy. I'm defending it as a simple literal truth that doesn't mean what most people assume, because they don't understand what it is to be a computer or a program. (Being a computer or program is just not that hard, and there are a lot of ways to do it.)

I agree that as an introductory analogy, calling the genome a program has its problems. (Although I claim it's no worse than the other introductory analogies, e.g., "it's like a recipe.") Too few people understand what kinds of programs there are, or can be, and most people draw too many invalid inferences based on limited knowledge from stereotypical von Neumann programs, hand-coded top-down using structured programming techniques.

Too few people understand what bottom-up asynchronous production system programs look like, so it's natural if they miss what I think I get---that the genome literally is a program of an unfamiliar kind.

When you tell me "it's only an analogy," I take that to mean that you missed my point. (Either that, or you're making an argument by assertion against it, which is not very useful.) What I'm saying might be false, or perhaps boring to most people, but it's not only an analogy.

A fantastic article PZ. Answered many questions that aren't covered in depth by many of my biology text books.

Thank you!

Paul W. (#106):

I'm not defending calling the genome a computer program as a useful analogy. I'm defending it as a simple literal truth that doesn't mean what most people assume, because they don't understand what it is to be a computer or a program. (Being a computer or program is just not that hard, and there are a lot of ways to do it.) [...] Too few people understand what kinds of programs there are, or can be, and most people draw too many invalid inferences based on limited knowledge from stereotypical von Neumann programs, hand-coded top-down using structured programming techniques.

As could probably be inferred from my earlier posts upthread, I agree with this sentiment wholeheartedly.

Paul,

"I don't think dueling dismissiveness is going to get us anywhere."

Speak for yourself, but do bear in mind that my first post on this matter was a short, polite attempt to hint that you were dismissing all biologists out of hand as not understanding this, without considering that some would.

"you seem to me unable to take seriously the idea that I might be right" And there you go again.

"you're wrong, go do your homework in my field," I didn't write quite what you're saying here. I generally dislike people who try re-work my words to mean something different that what I wrote, which they then attack. Its a form a straw-man argument, after all. I pointed you to some broad areas that are relevant but were absent from your earlier posts. And they are part of the story, after all they're whole nature of how biology works at molecular level.

"doesn't mean what most people assume, because they don't understand what it is to be a computer or a program." To the first, I'm aware of that; the second I do. (Remember, my first gentle hint to you on this topic... I was hoping you'd pick up that assuming that others don't understand isn't the best idea.)

"something particularly relevant to learn, or giving some direction as to what it might be." I did, but you'll have to learn it to see why. That's over to you. And that's not a circular argument, its just the nature of the beast. That post was just a collection of loose pointers. When I post to someone who appear to be seriously interested, I usually only give pointers, partly to not not to patronise them and partly as I expect them to follow them up it using their own initiative if they are interested. That's how it usually works with genuinely interested people: you don't have to spell it out for them, just point them the right direction and they'll head off there themselves. I was doing you a favour, but you've undone it with your rudeness. Besides, you know the saying: "You can lead a horse to water..." I'm not going to bother try make you drink the water.

All you've achieved is to annoy someone who might have helped. Well done. (With intended sarcasm.)

I'm no longer interested in "discussing" this or anything else with you because you don't discuss. Please don't try to work around that.

By Heraclides (not verified) on 23 Jul 2008 #permalink

Oh, right. Start by accusing me of being ignorant.

That isn't always an insult. For example, see who PZ calls ignorant in the original post.

windy:

Trying to throw some troll bait around, are we? By the way, PZ never called anyone ignorant in the original post, so goodness knows what thread/commentary you think you're in.

"That isn't always an insult." And I didn't say it was either. (Hint: try the word 'assumption'.)

By Heraclides (not verified) on 24 Jul 2008 #permalink

Heraclides, you're right, I was hallucinating the PZ part. Sorry. However, many of your objections were discussed in the other thread, so it's a pity if you won't take a look. For example, as a geneticist, I found this part very apt:

Keep in mind that every digital machine is analog at the bottom. The itsy-bitsy transistors and capacitors we use as "uniform" switches and flip-flops are actually noisy analog devices with random variations in sensitivity and capacitances. They work only because there are enough charge carriers per transistor that they are statistically reliable, and because we ensure that whatever's downstream can ignore the variations between them and extract a clear signal.
Using chemicals dissolved in a liquid is the same kind of thing---if you have enough liquid with enough molecules dissolved in it, you can use it in much the same way, as a statistically reliable variable with acceptable precision.

You didn't say whether you yourself are one of those computational biologists who've been programming for 20 years, so I don't mean to speculate about you personally, but a computational biologist is not necessarily particularly familiar with computer architecture.
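The quoted point -- individually noisy analog parts yielding a statistically reliable digital signal once there are enough of them -- is easy to demonstrate numerically. A toy Monte Carlo sketch in Python (all the noise parameters are arbitrary, chosen only to make the contrast visible):

```python
import random

def read_bit(intended, n_carriers, rng, noise_sd=0.6):
    """Each 'carrier' reports the intended level (0 or 1) plus Gaussian
    noise; the bit is recovered by thresholding the population average."""
    level = 1.0 if intended else 0.0
    avg = sum(level + rng.gauss(0, noise_sd) for _ in range(n_carriers)) / n_carriers
    return avg > 0.5

rng = random.Random(0)
trials = 1000
errors_few = sum(not read_bit(1, 3, rng) for _ in range(trials))
errors_many = sum(not read_bit(1, 300, rng) for _ in range(trials))
print(errors_few, errors_many)  # a few carriers misread often; many carriers, essentially never
```

The same statistics are why a cell can treat the concentration of thousands of dissolved molecules as a usable variable even though each individual binding event is random.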

Heraclides,

I don't think dueling dismissiveness is going to get us anywhere.

Speak for yourself, but do bear in mind that my first post on this matter was a short, polite attempt to hint that you were dismissing all biologists out of hand as not understanding this, without considering that some would.

I'm sincerely sorry if I wrong-footed this from the start by talking about "biologists" saying things and giving the impression that I meant all biologists. I meant the biologists who say those things, and particularly the ones who've spelled out particular reasons for saying them. (And especially some biologists here, but not all.)

Sorry I didn't make myself clear. If I had, I might also have mentioned that I also know that there are biologists who don't have a problem with describing the genome as a program, and some who explicitly say that the essential core of life is computational. I didn't want to get into dueling arguments from authority on that; I'd rather talk about what it actually means for something to be computational and whether that's actually true of the genome.

you're wrong, go do your homework in my field

I didn't write quite what you're saying here. I generally dislike people who try to re-work my words to mean something different than what I wrote, which they then attack. It's a form of straw-man argument, after all. I pointed you to some broad areas that are relevant but were absent from your earlier posts. And they are part of the story; after all, they're the whole nature of how biology works at the molecular level.

Sorry, but have a look at your posting #86, from where you say You'd do better if you learn some chemistry and physics as well to where you say I could go on and on, but suffice to say the computer program thing is an analogy.

Think about how that has to come across.

It's evident to me and to some other people here that you're telling me a bunch of things I already know, and ending with a pat conclusion that I obviously do not buy.

You don't know me, so it's understandable if you don't realize that you are inappropriately talking down to me. That's why I suggested that you look at the older thread; if you just skim my postings in that thread, I think you'll realize that you've been unintentionally condescending and patronizing to the wrong guy about the wrong stuff.

You told me you wouldn't read that thread, because it's just too long, and seemed to imply that I'm a blowhard who's not worth the time anyway. I need to learn some chemistry and physics, and think about leakiness and randomness.

That's one reason I posted comment #102; it was partly a hint to you that I'm quite aware of the statistical nature of the molecular interactions in gene transcription, and a few other issues you seem to think are problematic, but that I don't think they're actually a problem for my point. It's something I've thought through fairly carefully.

You did not take the hint.

I can be clearer about my background if you want, but I think I'm qualified to talk about the things I'm talking about, by Pharyngula standards. I'm not a troll, as you seemed to imply to Windy.

(I think she and Blake were trying to give you a hint, which you missed.)

That post was just a collection of loose pointers. When I post to someone who appears to be seriously interested, I usually only give pointers, partly so as not to patronise them and partly because I expect them to follow the pointers up using their own initiative if they are interested. That's how it usually works with genuinely interested people: you don't have to spell it out for them, just point them in the right direction and they'll head off there themselves. I was doing you a favour, but you've undone it with your rudeness. Besides, you know the saying: "You can lead a horse to water..." I'm not going to bother trying to make you drink the water.

I think you failed in what you were trying to do, because you didn't know who you were talking to, and assumed I'm much more ignorant than I am about the relevant biology. The "loose pointers" you gave were way too loose. For example, I think I already understand the basic chemistry and physics well enough, as well as the biology, at about the level that PZ talks about it in his more technical posts.

The last book I read was Zimmer's Microcosm. In the last year I've read several other fairly serious (popular) biology books, including Hazen's Gen-e-sis, about abiogenesis. Two of my favorite books are Stuart Kauffman's Origins of Order and Laurie Garrett's The Coming Plague. It's not like I haven't made a sincere effort to understand the basic physical and chemical issues in biology that are most relevant to the kinds of things I'm saying. I already know basic stuff about variable gene transcription rates, stochastic processes, imprinting, repressors, methylation, emergence, etc.

If you think I've missed something relevant, that's fine, but I need you to tell me more specifically what it is; don't just tell me to learn chemistry and physics, and that biology is messy.

And please don't get all huffy that I'm not terribly grateful that you tell me things I already know and proceed to tell me I'm wrong, skipping the interesting part in the middle, and justifying that condescendingly.

I think I'm fairly seriously interested in fairly serious biology, for a non-biologist. I only found your vague recommendations useless because they weren't specific enough; you're not giving me anything to work with.

I'm no longer interested in "discussing" this or anything else with you because you don't discuss.

I think you're mistaken about the latter. I'm perfectly happy to actually discuss things; you don't seem to be.

Please don't try work around that.

Well, OK, then.

Paul:

In juxtaposing a few of my words and leaving out intervening material, you've changed the "tone" of the original. Several of the bits left out show I wrote it in a different tone, but please check that in your own time as I have no wish to argue this with you :-) I tried to help but I wish I hadn't bothered :-/

Please note the smileys, too :-)

(To clarify my previous last words to you: I was asking that you not post to me, please. (Yet you did!) Could I suggest you start your own blog and invite people who want to discuss (or "discuss") this with you over there? I, for one, would rather be discussing epigenetics, which is, after all, the reason I came to this thread.)

windy:

Many computational biologists have "formal" computer science as part of their backgrounds, as opposed to "computing" (i.e., "just" programming). (This may be more true of those in the field prior to the "eukaryote genome era".)

Hope I'm not stating the obvious, but "back then" (!) more-or-less everyone who "played" with computers, with or without formal training, knew computer architecture in a way that seems relatively rare today. Or at least that's the way it seemed to me.

I pointed out earlier that I'm familiar with what "Paul" is driving at, by the way.

By Heraclides (not verified) on 24 Jul 2008 #permalink

To clarify my previous last words to you: I was asking that you don't post to me, please. (Yet you did!) Could I suggest you start you own blog and invite people who want to discuss (or "discuss") this with you over there?

Sorry, but that's not how it works. You don't get to tell people not to reply to you, and get the last word. If you want a blog that works that way, you can start your own.

BTW, if you're implying that I took the thread off-topic, do realize that I wasn't the first person here to bring up the genome-as-program issue. Or the second, or third, fourth, or even fifth; I was late to the party.

I, for one, would rather be discussing epigenetics, which is, after all, the reason I came to this thread.

Go for it, by all means.

Please stop posting to me. About anything. I am entitled to ask. If you don't want to or can't respect others' wishes, that's your idiocy, not mine. And please don't write another silly post trying to read "meanings" into my posts: it was only a suggestion, you great git.

By Heraclides (not verified) on 24 Jul 2008 #permalink

I pointed out earlier that I'm familiar with what "Paul" is driving at, by the way.

Sorry, but this remark is rather cryptic. You did say that you have a certain impression of what Paul is driving at, but that is not necessarily the same as being familiar with what he's really driving at.

Paul, you can always post to me ;)

Cracking post PZ! (pun intended).

Whilst highlighting the stupidity, hypocrisy and hatefulness induced in some people by religion may be both necessary and amusing your evo\devo posts like this are the real reason why I keep dropping by.

By BrightonRocks (not verified) on 25 Jul 2008 #permalink

Heraclides,

You are certainly entitled to ask me not to post to you. If you follow that with three insults that I think are mostly unfair, you will not get your wish. ("your idiocy"... "silly post"... "great git".)

You'll get this instead:

Heraclides, you're acting like a lightweight. First you dismiss what I'm saying by telling me I'm uninformed. Then I show that I'm not so uninformed after all, and maybe you're uninformed about what I'm saying. Rather than addressing the substance, you resort to ad hominems about why I'm badgering you, and repeatedly insult me.

Now you want the right to tell me to leave you alone, while you're still insulting me.

I'll be happy not to post to you anymore, because it's evident that the last thing you'll do is address the substantive questions. It's not worth it.

But you don't get to dismiss me with the back of your hand. If you want me to shut up and let it go, you need to shut up and let it go.

Getting back to epigenetics per se...

Reading about epigenetics, methyl groups, etc., I'm wondering if epigenetic adaptation across multiple generations gives you an intermediate scale of adaptation which makes it easier to evolve genetic adaptations.

In the evolution of neural architectures, there's something called the Baldwin Effect, which makes it easier to evolve instincts and the like if the individual is more adaptive.

Basically, if the individual is adaptive enough to survive in a new environment---maybe just barely in an only somewhat different environment---that gives it a foothold in the new environment and gives it more time to evolve "hardwired" adaptations. Over a longer timescale, those hardwired adaptations tune it better to that environment "out of the box." The individual adaptability broadens the search space that genetic evolution can search.

I'm wondering if methyl groups and whatnot perform a similar function, making it easier for (say) a strain of microbes to evolve genes for a novel environment. If the methyl groups are somewhat randomized but tend to persist across multiple generations, that would broaden the range of environments a set of clones can survive in, giving them more generations to evolve beneficial mutations that result in new strains better tailored to those environments.

I'm not sure if this is a stupid idea, or perhaps an obvious one. It seems that people are saying that this kind of epigenetic adaptation does result in more adaptive sets of genetic clones---variability among clones is good, all other things being equal.

What's counterintuitive about the Baldwin Effect is that people tend to think (in simplistic nature/nurture terms) that individual adaptiveness or intelligence leads to a reduction in selection pressure for specific instincts and biases, so more "intelligently" adaptive organisms would have fewer and less specific "instincts" or biases. Instead, greater individual adaptiveness often acts as a scaffolding to allow the evolution of more specific instincts for the new environment, or a richer and subtler repertoire of instincts that work across more environments.

I'm wondering if there might be similar and similarly easy-to-miss stuff like that going on with epigenetics.
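The speculation above can at least be made concrete as a toy model. This sketch is pure invention -- the fitness rule, the 0.7 persistence factor, and every number in it are illustrative assumptions, not biology -- but it shows the Baldwin-style logic: heritable-but-noisy epigenetic offsets can keep a lineage alive after an environmental shift, buying time for slow genetic change to act.

```python
import random

def evolve(epi_sd, rng, target=3.0, tol=1.0, pop=200, gens=60):
    """Each individual is (genetic value, heritable epigenetic offset).
    Survival requires the combined phenotype g + e to fall near `target`."""
    people = [(0.0, rng.gauss(0, epi_sd)) for _ in range(pop)]
    for _ in range(gens):
        survivors = [(g, e) for g, e in people if abs(g + e - target) < tol]
        if not survivors:
            return None  # extinct
        # Survivors reproduce: genes mutate slowly; epigenetic marks
        # persist partially but are re-randomized each generation.
        people = [(g + rng.gauss(0, 0.05),
                   0.7 * e + rng.gauss(0, epi_sd))
                  for g, e in (rng.choice(survivors) for _ in range(pop))]
    return sum(g for g, _ in people) / pop  # mean genetic value at the end

rng = random.Random(1)
print(evolve(epi_sd=0.0, rng=rng))  # None: without epigenetic variation, instant extinction
print(evolve(epi_sd=1.5, rng=rng))  # typically survives; prints the final mean genetic value
```

In the no-variation run the whole population misses the shifted target and dies in the first generation; with heritable epigenetic variation some lineages land close enough to persist, and selection can then act on the small genetic mutations.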

Just read this blog entry for the first time, after reading a Time article on this subject. Thanks for the details - quite fascinating! As a software developer, I found some of the comments quite interesting (and amusing) as well. Just a quick note to restart the argument: a program is not just a set of instructions in machine code. There are includes, makefiles, and other 'modifiers' in the build process that can tailor the end result based upon 'environmental factors'. In this sense, it does make an interesting analogy... :-).
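That closing point -- the same program text tailored by environmental "modifiers" at build/run time -- in miniature (the variable name and the "modes" are invented for illustration):

```python
import os

def build_message():
    # An environmental factor, read at "build" time, conditionally
    # "includes" an extra component -- loosely like a makefile variable.
    mode = os.environ.get("BUILD_MODE", "default")
    parts = ["core"]
    if mode == "debug":
        parts.append("instrumentation")
    return "+".join(parts)

os.environ["BUILD_MODE"] = "debug"
print(build_message())   # core+instrumentation
os.environ.pop("BUILD_MODE")
print(build_message())   # core
```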