# You’re a Dim Bulb (And I mean that in the best possible way)

I have a fondness for collecting brain lore: memes about the wonders of the human brain that race around the world for decades. The classic of the genre is the "ten-percent myth." As I wrote here, people often claim we use only ten percent of our brains, implying that we'd be supergeniuses if we could just switch on the rest. But that claim rests on a misinterpretation of some studies from the 1930s. In fact, the energy consumed by the cortex is only enough to power one percent of its neurons at any given time.

In a press release describing the work of Stanford bioengineer Kwabena Boahen, I stumbled on another meme:

According to Boahen, the brain is capable of performing 10 quadrillion (that’s 10 to the 16th) “calculations,” or synaptic events, per second using only 10 watts of power. At this rate, he says, a computer as powerful as the human brain would require 1 gigawatt of power.
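Boahen's numbers can be checked with a little back-of-the-envelope arithmetic. A sketch (the roughly 100-nanojoule-per-operation figure for conventional hardware is my inference from his 1-gigawatt claim, not something he states):

```python
# Back-of-the-envelope check of Boahen's comparison.
OPS_PER_SEC = 1e16   # synaptic events per second (figure from the press release)
BRAIN_WATTS = 10.0

# Energy per synaptic event: about a femtojoule.
joules_per_event = BRAIN_WATTS / OPS_PER_SEC
print(f"{joules_per_event:.0e} J per synaptic event")  # 1e-15 J

# His 1-gigawatt figure for an equally fast computer implies roughly
# 100 nanojoules per equivalent operation on conventional hardware.
COMPUTER_WATTS = 1e9
joules_per_op = COMPUTER_WATTS / OPS_PER_SEC
print(f"{joules_per_op:.0e} J per operation")  # 1e-07 J
```

On this reading, the brain comes out about eight orders of magnitude more energy-efficient per "operation" than the hardware Boahen had in mind.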

I searched for the origin of this meme and traced it to Paul Valéry, an early-twentieth-century poet and essayist. He declared:

The ultimate “computer,” our own brain, uses only ten watts of power — one-tenth the energy consumed by a hundred-watt bulb.

It's a claim that falls into that gray zone at the intersection of cool and crazy. So to see if it was actually true, I asked Bill Leonard, an expert on the evolution of human brains at Northwestern University. He responded thusly:

This is really interesting. The 10 watt estimate looks pretty close to being correct — perhaps a bit on the low side, but certainly in the ballpark.

In terms of calories, here is how the 10 watts translate:

10 watts = 10 joules/sec = 207 kcal/day for the brain

At 200–210 kcal, this is enough energy to support a brain of about 1,000 grams, at the low end of the modern human range.

For an average-size human brain (1,300–1,400 grams), the costs would be a bit higher: 250–300 kcal/day. However, this would only raise the "wattage" to about 15.
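Leonard's conversion is easy to verify. A quick sketch of the arithmetic, using the standard constants (4,184 joules per kilocalorie, 86,400 seconds per day):

```python
# Convert between brain power in watts and daily energy in kilocalories.
J_PER_KCAL = 4184
S_PER_DAY = 86_400

def watts_to_kcal_per_day(watts):
    """A watt is a joule per second; integrate over a day, convert to kcal."""
    return watts * S_PER_DAY / J_PER_KCAL

def kcal_per_day_to_watts(kcal):
    return kcal * J_PER_KCAL / S_PER_DAY

print(round(watts_to_kcal_per_day(10)))      # 207 kcal/day, matching Leonard
print(round(kcal_per_day_to_watts(300), 1))  # 14.5 W, his "about 15"
```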

So there you go. One urban myth survives the cold scrutiny of reason! Pass it on in the full confidence that it’s true (not to mention amazing).

#1 Harlan
March 23, 2006

You know what it’s called when more than 1% of the neurons in your brain fire at once? A seizure.

Also, it’s really a mistake to equate synaptic events with calculations in a computer. Synaptic events are much, much simpler. When computer scientists talk about a “calculation”, they usually mean an amount of computation that takes one clock cycle, which might be to add two long numbers, or decide if one number is bigger or smaller than 0. To the extent we understand what computations synapses compute, they’re probably, at least mostly, small portions of an addition. The simplest model of a neuron basically computes “if the sum of the average firing rate of my input neurons is more than X, fire”. Any single synaptic event affecting that neuron is likely to be one of probably thousands of such events that is involved in that computation…

On the other hand, 1% of the neurons in your brain can compute that kind of rule simultaneously, while your desktop computer can compute only one or two things at once.
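The "simplest model of a neuron" Harlan describes can be sketched in a few lines. The rates, weights, and threshold below are made-up numbers, just to exercise the rule:

```python
# A toy threshold unit, the simplest neuron model Harlan describes:
# fire if the weighted sum of input firing rates exceeds a threshold X.
def threshold_neuron(input_rates, weights, threshold):
    total = sum(r * w for r, w in zip(input_rates, weights))
    return 1 if total > threshold else 0

# Hypothetical inputs: three upstream neurons, one with an inhibitory weight.
rates = [0.2, 0.9, 0.4]
weights = [1.0, 0.5, -0.3]
print(threshold_neuron(rates, weights, 0.5))  # 0.2 + 0.45 - 0.12 = 0.53 > 0.5, so it fires: 1
```

Each term in that sum is the contribution of one synapse, which is Harlan's point: a single synaptic event is only a tiny slice of even this crude computation.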

#2 RPM
March 23, 2006

Good work tracking that down. It looks like Albert Brooks may need to do a rewrite on Defending Your Life.

#3 Janne
March 24, 2006

This "We only use 10% (or 5% or 1% or whatever) of our brain" thing is mighty irritating. Of course we do. The brain is not homogeneous; a given area is doing something pretty well defined. It's not like you can take the fusiform gyrus when there are no faces to detect and have it temporarily help out with the algebra problem you're working on.

It’s like saying you only use 10% of your car at any one time just because you’re not using all gears at the same time, running both the heater and the air conditioner, and constantly opening and closing the doors, honking the horn and blinking the turn signals while weaving in lazy curves along the highway (so as not to waste the use of the steering wheel).

#4 josh
March 24, 2006

“On the other hand, 1% of the neurons in your brain can compute that kind of rule simultaneously, while your desktop computer can compute only one or two things at once.”

That isn't quite true concerning computers. Processor architecture and instruction sets have a lot to do with the number and type of operations per second. For example, dedicated graphics chips can process an immense number of graphics operations per second despite their relatively low clock rates. They suck at general computations, though.

#5 Jim Anderson
March 25, 2006

I noted a similar meme about a boy with no brain a while ago; add it to your collection, if it's not already there.

#6 Stephen Uitti
March 27, 2006

On the computer side, 10^16 IPS requires about 10^6 chips at about 100 watts each, using desktop processors: 100 megawatts. If one uses processors designed for mobile devices, one gets better instructions-per-watt rates, at least 10 times and probably 100 times better. A Game Boy is one tenth the speed, but lasts 18 hours on a charge and doesn't get hot. The other problem with these comparisons is that Moore's Law tends to make the current answer obsolete so quickly.
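Stephen's estimate works out if a 2006-era desktop processor delivers on the order of 10^10 instructions per second, which seems to be his implicit assumption. A sketch:

```python
# Spell out Stephen's chip-count and power estimate.
TARGET_IPS = 1e16   # the brain-equivalent rate from the post
CHIP_IPS = 1e10     # ~10 billion instructions/sec per desktop chip (assumed)
CHIP_WATTS = 100    # per-chip power draw, per his comment

chips = TARGET_IPS / CHIP_IPS
total_watts = chips * CHIP_WATTS
print(f"{int(chips):,} chips")            # 1,000,000 chips
print(f"{total_watts / 1e6:.0f} MW")      # 100 MW
```

A 100x improvement in instructions per watt, as he suggests for mobile-class chips, would bring that 100 MW down to about 1 MW, still five orders of magnitude above the brain's 10 watts.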

All that said, my experience is that through training, one can make muscles perform at least 800% better. Not all the gains are pure increased power; some is efficiency of use. There is plenty of evidence that the brain behaves similarly, though it may be more difficult to measure.

#7 luca
March 28, 2006

Carl, you say: Actually, the energy consumed by the cortex is only enough to power one percent of its neurons at any time.

Does this mean that were you to use more than that, your brain would drain the energy resources of your body too quickly? It may even overheat, I guess…

#8 Emily Sommer
March 28, 2006

That's awesome. I've always heard that statistic, but was always a bit doubtful. Very cool that it's true =)

#9 outeast
March 29, 2006

So thinking really, really hard is a good way to lose weight?

#10 John
March 29, 2006

I think a point that is missed here is that your neurons are "doing" something even when they don't fire, since it is the pattern and timing of neuronal firings that matters in thought (most likely). Adults are able to do many cognitive activities using far less energy than children doing the same activity (a more efficient mind will use less energy). So the myth is that more activity in your brain will equal better thought processes…