The Clock Metaphor


Chad wrote a neat history of (or should we say 'evolution of') clocks, as in "timekeeping instruments". He points out that biological clocks are "...sort of messy application, from the standpoint of physics..." and he is right - for us biologists, the messier the better. We wallow in mess, cherish ambiguity and relish complexity. Anyway, he is talking about real clocks - things made by people to keep time. And he starts with a simple definition of what a clock is:

In order to really discuss the physics of timekeeping, you need to strip the idea of a clock down to the absolute bare essentials. At its core, a clock really has only one defining characteristic: A clock is a thing that ticks.

OK, I'm using a fairly broad definition of "tick," here, but if you'll grant that leeway, "ticking" is the essential property of clocks. In this context, "ticking" just refers to some regular, repetitive behavior that takes place in a periodic fashion.

This reminds me that a "biological clock" is a metaphor. A useful metaphor, but a metaphor nonetheless (and just as Creationists take metaphors of cellular machinery literally, they have been known on occasion to talk about circadian clocks as if they had real wheels and cogs and gears!).

I want to stress that the clock metaphor has been very useful for the study of biological rhythms. Without Pittendrigh's insight that cycles in nature can be modeled with the math of physical oscillators, we would probably be decades behind in our understanding of the underlying biology (unless someone else of authority in the field had the same insight back then). Just check how useful it was in the entire conceptualization of entrainment and photoperiodism. The Phase-Response Curve, based on the math of physical oscillators, is the Number One tool in the chronobiological repertoire.

But, just as most people in the field take the clock metaphor for granted and without much thinking, there have been a few people who questioned its utility for some areas of research. For instance, for the study of biological rhythms in nature within an ecological and evolutionary context, Jim Enright proposed the metaphor of an audio tape set on continuous play (Enright, J.T. (1975). The circadian tape recorder and its entrainment. In Physiological Adaptation to the Environment (ed. F.J. Vernberg), pp. 465-476. Intext Educational Publishers, New York.). Only a dozen or so publications since then have taken him seriously and tried to apply this concept. Today, in the age of CDs and iPods, who even remembers audio tapes?

While fully appreciating the utility of the clock metaphor and applying it in my own work, I have always been cautious about it. Aware that it is a metaphor, I have always wondered whether it constrains the way we think about the biological process, and whether we may miss important insights by not thinking in terms of other possible metaphors.

While far from mature, my thinking is that different metaphors apply best to different areas of research and different questions. While the clock metaphor is great for understanding the entrainment of the circadian system (including whole organisms, tissues and individual cells) and photoperiodism, and Enright's endless tape (or some modern substitute) may be useful for ecological studies (including temporal learning and memory), other angles of study may require other concepts.

For instance, I think that the study of what goes on inside the cell can benefit from a different metaphor. Studying the molecular basis of circadian rhythms may best be done by utilizing a Rube-Goldberg Machine metaphor: event A triggers event B, which starts process C, which results in event D... and so on until event Z causes event A to happen again. If that last step is missing, it is not a circadian rhythm - it is more akin to an hourglass clock, in which something outside the system needs to start the process all over again.
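The closed chain of triggers can be sketched in a few lines of Python. The stage names and step durations below are entirely made up, but they show why closing the loop (the last event re-triggering the first) is what makes the rhythm self-sustaining rather than an hourglass:

```python
# Hypothetical sketch of a closed event chain: each stage, once
# triggered, starts the next after some delay, and the last stage
# re-triggers the first. Stages and delays are invented for
# illustration only.

STAGES = ["A", "B", "C", "D"]                # stand-ins for events A..Z
DELAYS = {"A": 8, "B": 6, "C": 4, "D": 6}    # "hours" each step takes (made up)

def run_cycle(start="A", hours=48):
    """Walk the closed event chain for a given number of hours."""
    t, stage, log = 0, start, []
    while t < hours:
        log.append((t, stage))
        t += DELAYS[stage]
        # the last event causes the first to happen again:
        stage = STAGES[(STAGES.index(stage) + 1) % len(STAGES)]
    return log

# One full loop takes 8 + 6 + 4 + 6 = 24 "hours", so event A recurs daily.
print(run_cycle())
```

Remove the wrap-around in the last line of the loop and the chain runs once and stops: the hourglass case.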

For studying the outputs, i.e., how the circadian system orchestrates the timing of all the other processes in the body, the metaphor may have to fit the organism. An ON-OFF switch is the best metaphorical description of the clock system in (Cyano)bacteria, where there are only two states of the system: the day state and the night state. For something a little more eukaryotic, a relay may be a better metaphor (more than two states, but not too many). The metaphor of a rod in car engines (what are those called in English, and do modern cars even have them any more?) that times the opening and closing of cylinders would be fine for fungi and plants and perhaps some invertebrates.

But I had a hard time coming up with a decent metaphor that could apply to complex animals, like us. So far, the best I could come up with is the barrel of a Player Piano. Many little knobs on its surface determine when each note will be played. If you make the barrel rotate slowly and the song lasts 24 hours, then outputs from circadian pacemakers are knobs and the target organs (and peripheral oscillators in them) are those long prongs that make music. Can you think of a better metaphor?


At base, ticks. At the next logical level, ticks that are counted in some fashion. At the next level, the counts have some semantics.

Yes, A -> B -> C -> ... -> A is one oscillatory mechanism, though I can say that, in the logic circuit business, it is not a very reliable method. It has some very nasty failure modes, and is very sensitive to noise so even good circuits will fail. And I would expect the staged triggering to be even less reliable if implemented in a biological mechanism. OTOH, in biology, it is more statistical success than reliability that determines what survives. In Nature, "If it _can_ happen, it will" seems to be the rule.

I was thinking that something along the lines of what is called a "Relaxation Oscillator" might be more likely to be found in the wild. All that is required is a system with (at least) two phases, some hysteresis in the way it changes phase, and some property that self-accumulates in one phase until a threshold is reached, after which the system switches to the other phase and the property self-dissipates; the current phase is evidenced by the direction and strength of the property's rate of change (its dv/dt). Neurons seem to demonstrate a particularly clear example with their electro-chemical oscillations. Seeking food when hungry and doing other stuff when not, until the food reserves are exhausted and hunger re-appears, is another.
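As a rough sketch (with arbitrary thresholds and rates), the accumulate-then-dissipate scheme with hysteresis looks like this:

```python
# Bare-bones relaxation oscillator: a quantity self-accumulates in one
# phase until it crosses an upper threshold, then the system switches
# phase and the quantity dissipates until it crosses a lower threshold.
# The gap between the two thresholds is the hysteresis band. All
# numbers are arbitrary illustrations.

UPPER, LOWER = 10.0, 2.0   # hysteresis thresholds
GAIN, DRAIN = 1.0, 3.0     # accumulation / dissipation rates per step

def relaxation(steps=40):
    """Return the phase (True = accumulating) at each time step."""
    level, charging, phases = 0.0, True, []
    for _ in range(steps):
        level += GAIN if charging else -DRAIN
        if charging and level >= UPPER:
            charging = False          # switch to the dissipating phase
        elif not charging and level <= LOWER:
            charging = True           # switch back to accumulating
        phases.append(charging)
    return phases

print(relaxation())  # long runs of True, short runs of False, repeating
```

The asymmetric rates give the slow-charge/fast-discharge sawtooth typical of these oscillators.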

The simplest relaxation oscillator circuit is a capacitor based integrator that charges up until a threshold is reached at which point a switch is thrown to put it into a discharge state which continues until the capacitor is discharged. At that point the switch resets, the discharge load is removed, and the capacitor charges up again.

The absolutely simplest circuit that exhibits this property consists of one resistor, one capacitor, a little neon indicator, and a low current high voltage power supply. Circuit is resistor and neon in series, capacitor parallel across the neon. The capacitor charges until the strike potential of the neon is reached across it. At that point the neon gas ionizes and its resistance drops dramatically, such that it passes more current than the resistor can supply, so it discharges the capacitor. The flow of electrons maintains the ionized state so the resistance is low as long as current is flowing. Ultimately the charge on the capacitor becomes exhausted, current stops flowing through the neon, the gas loses ionization and reverts to high impedance, and the cycle begins anew. Otherwise known as Christmas lights. Bimetallic strips have also been used as relaxation oscillators. The traditional bell is another example.
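A quick numerical toy of that resistor-neon circuit, with invented component values and neon thresholds, shows the behavior: the capacitor charges slowly through the resistor until the (assumed) strike voltage, then the ionized lamp drains it rapidly down to the extinction voltage, and the cycle repeats:

```python
# Euler-integration toy of the RC-plus-neon relaxation oscillator.
# Component values, supply voltage, and neon strike/extinction
# thresholds are all invented for illustration; the point is the
# hysteresis between striking and extinguishing.

V_SUPPLY = 120.0      # supply volts (assumed)
V_STRIKE = 90.0       # neon ionizes above this
V_EXTINGUISH = 60.0   # neon de-ionizes below this
R_CHARGE = 1e6        # series resistor, ohms
R_NEON_ON = 1e4       # ionized neon resistance, ohms
C = 1e-6              # capacitance, farads
DT = 1e-4             # integration step, seconds

def simulate(steps=200_000):
    """Run the circuit for steps * DT seconds; return flash count."""
    v, ionized, flashes = 0.0, False, 0
    for _ in range(steps):
        i_in = (V_SUPPLY - v) / R_CHARGE          # charging current
        i_out = v / R_NEON_ON if ionized else 0.0  # discharge when lit
        v += (i_in - i_out) * DT / C
        if not ionized and v >= V_STRIKE:
            ionized, flashes = True, flashes + 1   # lamp strikes
        elif ionized and v <= V_EXTINGUISH:
            ionized = False                        # lamp goes out
    return flashes

print(simulate())  # the lamp flashes periodically, a few dozen times in 20 s
```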

By Gray Gaffer (not verified) on 30 Jun 2009 #permalink

Almost forgot this basic observation: oscillatory circuits have one common property: positive feedback with a delay. The positive feedback results in responses to stimuli being greater than the stimuli; the delay allows the response to tail off until the stimuli re-assert. Combine this with meta-level negative feedback systems and you have the beginnings of the cybernetic view of things. We have Norbert Wiener to thank for this - cybernetics: the study of self-organizing systems. It feels like it might be a productive analytical tool.

By Gray Gaffer (not verified) on 30 Jun 2009 #permalink

A (at least sometimes) better metaphor:

A Turing Machine

Studying the molecular basis of circadian rhythms may best be done by utilizing a Rube-Goldberg Machine metaphor: event A triggers event B which starts process C which results in event D....and so on until the event Z causes the event A to happen again.

That's what we call a "ring oscillator." A series of stages, each with a gain and phase delay, connected such that the first stage is triggered by the last with a net negative feedback. If the Barkhausen conditions are met, you get oscillatory behavior.
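For the digital flavor, here is a minimal simulation of such a ring: three inverters in a loop, one unit of delay per stage. With an odd number of inversions there is no stable state, so a transition front circulates and the output toggles with a period of 2 x (number of stages). (The unit-delay model and stage count are illustrative only; real rings are analog underneath.)

```python
# Minimal digital sketch of a ring oscillator: an odd number of
# inverters in a closed loop, each with one unit of propagation delay.
# A single transition front is seeded and circulates forever.

def ring_oscillator(n_stages=3, ticks=30):
    """Return the last inverter's output over time."""
    state = [0] * n_stages
    state[1] = 1  # seed one transition front so an edge circulates
    waveform = []
    for _ in range(ticks):
        # each inverter reads its predecessor's previous output;
        # index -1 wraps the last stage's output back to the first
        state = [1 - state[i - 1] for i in range(n_stages)]
        waveform.append(state[-1])
    return waveform

# With 3 stages the output repeats with period 2 * 3 = 6 ticks:
print(ring_oscillator())  # [0, 0, 1, 1, 1, 0, 0, 0, 1, 1, 1, 0, ...]
```

Note that without the seeded front (all stages starting equal), this synchronous model collapses into the degenerate all-stages-toggle mode; in a physical ring, analog dynamics select the circulating mode instead.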

As it happens, I build a lot of those.

Almost forgot this basic observation: oscillatory circuits have one common property: positive feedback with a delay.

ITYM "negative feedback with delay." Which isn't quite true, either, depending on how you look at it. The differential LC tank oscillators used in high-frequency communications PLLs are positive feedback with minimal gain outside of the passband, and ring oscillators are delay but negative feedback.

But I had a hard time coming up with a decent metaphor that could apply to complex animals, like us. So far, the best I could come up with is the barrel of a Player Piano. Many little knobs on its surface determine when each note will be played.

Ring oscillators intrinsically generate multiple phases, which can be used to operate portions of a system. The biological cycle I know best is the female reproductive cycle, which follows a similar "A causes B and C, B causes D and E, D causes F and G, F causes A" cycle, where C, E, and G are external results triggered by the basic cyclical system. This is a bit like a finite state machine, except that canonical FSMs rely on a discrete external clock source and you're interested in something more like an asynchronous self-timed one [1].
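That loop-with-side-outputs structure (internal cycle A -> B -> D -> F -> A, with C, E, and G as externally visible results) can be sketched as a tiny state machine; the labels and step granularity are placeholders, not biology:

```python
# Toy of the cycle structure described above: an internal loop
# A -> B -> D -> F -> A, where most internal states also fire an
# external output (C, E, G) that is not itself part of the loop.
# All names are placeholders.

INTERNAL_NEXT = {"A": "B", "B": "D", "D": "F", "F": "A"}
EXTERNAL_OUTPUT = {"A": "C", "B": "E", "D": "G", "F": None}

def run(start="A", steps=8):
    """Return (internal_state, external_event) pairs around the loop."""
    state, trace = start, []
    for _ in range(steps):
        trace.append((state, EXTERNAL_OUTPUT[state]))
        state = INTERNAL_NEXT[state]
    return trace

print(run())
# [('A', 'C'), ('B', 'E'), ('D', 'G'), ('F', None), ('A', 'C'), ...]
```

Here the "clock" is just the loop itself: removing the F -> A edge halts the whole cycle, external outputs included.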

Finite-state automata are very well-studied, but I suspect that the generalization to biological systems would be very, very messy. Oh, wait -- that's a good thing!

[1] An asynchronous FSM has been described [2] as "a fortuitous collection of race conditions."
[2] The delightful 1970s-era MMI handbook for FIFOs, before they were bought by AMD.

By D. C. Sessions (not verified) on 01 Jul 2009 #permalink

DC: self-timed natural FSMs would probably rely on relaxation phenomena. Also, there is always delay unless the oscillator operates at an infinite frequency. In fact, the operating frequency will be such that there are an integral number of cycles covering the delay time, this being the condition required for positive feedback. And the delay in your second case results in positive feedback by the time the signal recirculates, even if the circuit, shorn of timing considerations, looks like it only has negative feedback. In fact, this characteristic is what gives rise to parasitic oscillations in Op-Amp circuits, when there is a feedback path whose delay is frequency dependent such that at some high frequency the feedback sign is positive instead of the intended negative.

Remember, I am not necessarily talking about explicit delay stages: mostly the delays arise because we are in fact dealing with real analog circuits, a factor often masked by the pretty Boolean logic schematics we design the systems in. Explicit delays may be represented instead by the V-I phase differences, e.g. across L or C components. There is never a zero-time propagation path in a real circuit. Einstein says so.

The problem with ring oscillators comes from either missed triggers or spurious out of sequence triggers, which result in multiple circulating signal fronts. Ground noise is the prime cause of these failures. Counter or similar single-clock timebases do not suffer from this particular failure mode. They have others, to be sure, but tend to recover from disturbances to a single state, not a mixed state like the rings.

After my painful experiences dealing with OEM products whose sequencing was based on ring oscillators, I developed a style that relied more on counters and decoders at first, then later on latched reprogrammable logic arrays such as GAL16V8s. Essentially, single clock single stage encoded FSMs. More recently things like PIC and ARM and higher level embedded processors for the same functions.

But I have a hard time visualizing these kinds of solutions in organisms. So for those I come back to relaxation processes.

By Gray Gaffer (not verified) on 01 Jul 2009 #permalink

In fact, the operating frequency will be such that there are an integral number of cycles covering the delay time, this being the condition required for positive feedback.

Calling negative feedback with insufficient phase margin "positive feedback" is at best misleading, but we're arguing over terminology to little purpose. I already cited the Barkhausen criteria, which neatly summarize the various modes.

Remember, I am not necessarily talking about explicit delay stages: mostly the delays arise because we are in fact dealing with real analog circuits, a factor often masked by the pretty Boolean logic schematics we design the systems in.

Umm -- no. I don't design logic schematics, I design directly with MOS transistors. Lots of them. In my world, pretty much everything is a nonlinear transconductance device.

The problem with ring oscillators comes from either missed triggers or spurious out of sequence triggers, which result in multiple circulating signal fronts.

Barkhausen criteria again. A well-designed ring oscillator doesn't have gain greater than 0 dB above the fundamental frequency, and is unstable below it. They're subject to supply (easily dealt with) and thermal noise, but that's all part of the budget.

After my painful experiences dealing with OEM products whose sequencing was based on ring oscillators, I developed a style that relied more on counters and decoders at first, then later on latched reprogrammable logic arrays such as GAL16V8s. Essentially, single clock single stage encoded FSMs. More recently things like PIC and ARM and higher level embedded processors for the same functions.

Don't look inside those things or you're going to see things you don't like. That's my world.

But I have a hard time visualizing these kinds of solutions in organisms. So for those I come back to relaxation processes.

I'm not a biologist either, but Our Host is -- and it appears that he's looking for a model a bit more complex than a relaxation oscillator. Partly, I suspect, because aside from some neuron/synapse processes which depolarize in a way more like a relaxation circuit, biological processes have too many stages in the causality chain to be modeled like a single-phase clocked FSM. Cardiac rhythms come to mind.

As he reminds us, Nature is messy.

By D. C. Sessions (not verified) on 01 Jul 2009 #permalink

(comment system chucked mine away - so I'll try again...)

Messy, yes. So probably all the above.

MOS level? Then I'll stop trying to teach you to suck eggs. I don't enter the picture until 7400 or RTL level.

However, bio systems are talking long periodicities and are unlikely to be linear systems, so Barkhausen criteria per se do not apply. I can however easily see relaxation oscillators built of chemical concentration swings, exhibiting long periods. I can also see complex ring-like structures built of them, each stage having its own pulse but perhaps relying on the others to close its feedback loop.

But I still do not like ring oscillator pathologies. I had to post-mortem a printer my company thought it might OEM once. Used ring circuits to control the motor and the thermal print head. One pathology is where ground noise injection creates multiple 1s circulating when proper operation demands precisely one 1. Motors do not like it. Thermal print heads just catch fire. Also messy. Yes, good design can recover from this, but there is still at least a transient possibility of failure in a noisy environment.

By Gray Gaffer (not verified) on 01 Jul 2009 #permalink