How the brain codes numbers is a challenging problem. We know that certain parts of the brain must code numbers because they are involved in numerical calculation. Some of them -- such as the prefrontal cortex (PFC) -- are also involved in the calculation of reward, so it would be good if we knew how numerical rewards were encoded.
Nieder and Merten address this issue of the neural encoding of numbers in a recent paper in the Journal of Neuroscience.
In the paper, they trained monkeys to respond to different numbers of cues in a delayed-response task while recording from the monkeys' prefrontal cortex.
They list two possible ways in which the brain could code numbers.
The first, called summation coding, increases both the number of neurons firing and their firing rate in proportion to the size of the number being encoded. In essence, summation coding encodes the magnitude of the number in the magnitude of the firing. This is depicted in the Figure (from the paper) as A.
The second is called labelled-line coding, depicted in the Figure as B. Labelled-line coding has a special set of neurons that respond to each separate number. The rate of firing does not matter as much as which neurons are firing.
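To make the distinction concrete, here is a toy sketch in Python contrasting the two schemes. It is purely illustrative -- the pool sizes and firing rates are made up, not values from the paper.

```python
import numpy as np

def summation_code(n, n_neurons=10, base_rate=5.0):
    """Summation coding: the pool's firing rate scales with the number n."""
    # every neuron in the pool fires, and the rate grows with the magnitude of n
    return np.full(n_neurons, base_rate * n)

def labelled_line_code(n, n_neurons=10, peak_rate=50.0):
    """Labelled-line coding: a dedicated neuron (index n - 1) signals the number n."""
    rates = np.zeros(n_neurons)
    rates[n - 1] = peak_rate  # which neuron fires carries the information
    return rates

print(summation_code(3))       # every neuron at 15 Hz: the magnitude is in the rate
print(labelled_line_code(3))   # only the third neuron active: the magnitude is in the identity
```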
When the authors examined the firing rates of many neurons during their task, they found that the data supported the second scheme. Neurons show what are called tuning curves for particular numbers of stimuli. A tuning curve is a peak in firing rate at a particular point along the axis of possible stimuli -- thus we say that the neuron is tuned for that particular stimulus.
Here are some samples of neurons that are tuned for particular numbers. The x-axis shows the number of cues. The y-axis shows the firing rate. See how the firing rate peaks for each of these neurons at a different number. That peak indicates the number to which each neuron is tuned.
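A tuning curve like the ones in those plots can be sketched as a bell-shaped function of the stimulus number. The sketch below is an assumption about the general shape, not a fit to the paper's data; the peak rate and width are invented for illustration.

```python
import numpy as np

def tuning_curve(n_stimuli, preferred, peak_rate=40.0, width=1.0):
    """Firing rate of a neuron tuned to `preferred`, as a function of the stimulus number."""
    n = np.asarray(n_stimuli, dtype=float)
    return peak_rate * np.exp(-0.5 * ((n - preferred) / width) ** 2)

numbers = np.arange(1, 6)
for preferred in (2, 4):
    # each neuron fires most for its preferred number and less for neighbouring numbers
    print(preferred, np.round(tuning_curve(numbers, preferred), 1))
```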
What is the significance of this work?
Well, now we know that, as a set, these neurons can encode each of the numbers, depending on which neurons are firing.
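One simple way such a population could be read out -- a generic winner-take-all decoding idea, not a claim about the analysis in the paper -- is to report the preferred number of the most active neuron:

```python
import numpy as np

# hypothetical population: one neuron tuned to each of the numbers 1..5
preferred_numbers = np.arange(1, 6)

def decode(firing_rates):
    """Winner-take-all readout: report the preferred number of the most active neuron."""
    return preferred_numbers[np.argmax(firing_rates)]

# example population response peaking at the neuron tuned to 4
print(decode(np.array([2.0, 5.0, 12.0, 30.0, 9.0])))  # -> 4
```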
This has interesting implications for the mechanics of computation. For example, if each number is represented not as a magnitude but as an abstract entity, how does addition take place? Also, what does the brain do with particularly large numbers? Does it have neurons for them, or do they blur together at the high end? (The paper mentions that 1 is over-represented, but this is probably because a single stimulus can be interpreted as a variety of things besides just a number.)
Hat-tip: Faculty of 1000.
Incredibly cool. Do they discuss implications for the logarithmic mental number line? or do you have any ideas about how/whether that relates?
For example, if each number is represented not as a magnitude but as an abstract entity, how does addition take place?
Well, I can think of at least two techniques, and I suspect humans at least use both of them. One would be visualization and internal combination (for small numbers), the other would be straight associative memory, just like the multiplication tables.
I like the ones with multiple peaks -- maybe there are multiple-of-N neurons there too.