Where do numbers come from?

When I was addressing this lunacy about how God exists because minds and mathematics are supernatural, I was also thinking about a related set of questions: biologically, how are numbers represented in the brain? How did this ability evolve? I knew there was some interesting work by Ramachandran on the representation of digits and numerical processing, coupled with his work on synesthesia (which is also about how we map abstract ideas onto a biological substrate), but I was wondering how I can have a concept of something as abstract as a number -- as I sit in my office, I can count the vertical slats in my window blinds and see that there are 27 of them. How do I do that? Is there a register in my head that stores a tally as I count them? Do I have a mental abacus that's summing everything up?

And then I realized all the automatic associations with the number 27. It's an odd number -- where is that concept in my cortex? It's 3³. It's the atomic number of cobalt, the sum of its digits 2 and 7 is 9, it's the number of bones in the human hand, 2 times 7 is 14, 2⁷ is 128, it's my daughter's age, and 1927 was the year Philo Farnsworth first experimentally transmitted television pictures. It's freakin' weird if you think about it. 27 isn't even a thing, even though we have a label and a symbol for it, and yet it's all wrapped up in ideas and connections and causes sensations in my mind.
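
For what it's worth, the arithmetic in that list is easy to check mechanically; here's a trivial Python sanity check (the hand bones, the daughter, and Farnsworth you'll have to take on faith):

```python
# Checking the arithmetic associations of 27 listed above.
assert 27 % 2 == 1      # it's an odd number
assert 3 ** 3 == 27     # it's 3 cubed
assert 2 + 7 == 9       # the sum of its digits is 9
assert 2 * 7 == 14      # the product of its digits is 14
assert 2 ** 7 == 128    # 2 to the 7th power is 128
print("27 checks out")
```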

And why do I have a representation of "27" in my head? It's not as if this was ever useful to my distant ancestors -- they didn't need to understand that there were precisely 27 antelope over on that hillside, they just needed an awareness that there were many antelope, let's go kill one and eat it. Or here are 27 mangoes; we don't need to count them, we need to sort them by ripeness, or throw out the ones that are riddled with parasites. I don't need a map of "27" to be able to survive. How did this ability evolve?

Really, I don't take drugs, and I wasn't sitting there stoned out of my head and contemplating 27. It's a serious question. So I started searching the scientific literature, because that's what one does. There has been a great deal of work done tangential to these questions. Human babies can tell that 3 things are more than 2 things. An African Grey parrot has been taught to count. Neurons in the cortex have been speared with electrodes and found to respond to numbers of objects with differing degrees of activity. The problem with all that is that it doesn't actually address the problem: I know we can count, I know there is brain activity involved, I can appreciate that being able to tell more from less is a useful ability, but none of it addresses the specific properties of this capacity called number. Worse, most of the literature seems muddled on the concept, and mistakes a qualitative understanding of relative quantity for a precursor to advanced mathematics.
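
To see why that distinction matters: quantity discrimination of this kind tends to follow Weber's law, where what counts is the ratio between two quantities, not their exact values. A toy sketch (my own illustration, with a made-up discrimination threshold):

```python
# Toy model of ratio-based quantity discrimination (Weber's law).
# The Weber fraction here is invented for illustration; real values
# vary by species, age, and task.
WEBER_FRACTION = 0.3

def can_discriminate(a, b):
    """True if two quantities differ by a large enough ratio."""
    smaller, larger = sorted((a, b))
    return (larger - smaller) / smaller > WEBER_FRACTION

print(can_discriminate(2, 3))    # True: a 50% difference is easy
print(can_discriminate(20, 30))  # True: same ratio, equally easy
print(can_discriminate(26, 27))  # False: ~4% difference, hopeless
```

Nothing in that computation is about "26" or "27" as such; it's a fuzzy more-or-less judgment, which is exactly why it can't be treated as proto-arithmetic.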

But then, after stumbling through papers that were rich in detail but vague on the concepts, I found an excellent review by Rafael Núñez that brought a lot of clarity to the problem and summarized the ongoing debates. It also lays out explicitly what had been nagging me about all those other papers: they often leap from "here is a cool numerical capability of the brain" to "this capability is innate and evolved" without adequate justification.

Humans and other species have biologically endowed abilities for discriminating quantities. A widely accepted view sees such abilities as an evolved capacity specific for number and arithmetic. This view, however, is based on an implicit teleological rationale, builds on inaccurate conceptions of biological evolution, downplays human data from non-industrialized cultures, overinterprets results from trained animals, and is enabled by loose terminology that facilitates teleological argumentation. A distinction between quantical (e.g., quantity discrimination) and numerical (exact, symbolic) cognition is needed: quantical cognition provides biologically evolved preconditions for numerical cognition but it does not scale up to number and arithmetic, which require cultural mediation. The argument has implications for debates about the origins of other special capacities – geometry, music, art, and language.

The author also demonstrates that he actually understands some fundamental evolutionary principles, unlike the rather naive versions of evolution that I was recoiling from elsewhere (I'll include an example later). He also recognizes the clear differences between estimating quantity and having a specific representation of number. He even coins a new word (sorta; it's been used in other ways) to describe the prior ability, "quantical".

Quantical: pertaining to quantity related cognition (e.g., subitizing) that is shared by many species and which provides BEPs for numerical cognition and arithmetic, but is itself not about number or arithmetic. Quantical processing seems to be about many sensorial dimensions other than number, and does not, by itself, scale up to produce number and arithmetic.

Oops. I have to unpack a few things there. Subitizing is the ability to immediately recognize a number without having to sequentially count the items; we can only do this with small numbers, typically up to around 4. Drop 3 coins on the floor, and we can instantly subitize them and say "3!". Drop 27, and you're going to have to scan through them and tally them up.
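
To make the contrast concrete, here's a toy sketch of my own (not anything from the paper): subitizing behaves like an immediate lookup that fails past about four items, while exact enumeration is a serial tally whose time grows with the count.

```python
# Toy contrast between subitizing and serial counting -- my own
# illustration, not a model from Núñez's paper.
SUBITIZING_LIMIT = 4  # typical human subitizing range

def subitize(items):
    """One-glance recognition: only works for very small sets."""
    n = len(items)
    return n if n <= SUBITIZING_LIMIT else None

def count(items):
    """Serial enumeration: one step per item, so time grows with n."""
    tally = 0
    for _ in items:
        tally += 1
    return tally

coins = ["coin"] * 3
slats = ["slat"] * 27
assert subitize(coins) == 3      # "3!" at a glance
assert subitize(slats) is None   # 27 is out of subitizing range...
assert count(slats) == 27        # ...so we have to count
```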

BEPs are biologically evolved preconditions.

Biologically evolved preconditions (BEPs): necessary conditions for the manifestation of a behavioral or cognitive ability which, although having evolved via natural selection, do not constitute precursors of such abilities (e.g., human balance mechanisms are BEPs for learning how to snowboard, but they are not precursors or proto-forms of it)

I think this is subtly different from an exaptation. Generally, but not necessarily, exaptations are existing traits with one functional purpose that evolution can modify to take on additional roles; feathers for flight in birds are an exaptation of feathers for insulation in dinosaurs. Núñez is arguing that we have not evolved a native biological ability to do math, but that these BEPs are a kind of toolkit that can be extended cognitively and culturally to create math.

He mentions snowboarding as an example in that definition. No one is going to argue that snowboarding is an evolved ability because some people are really good at it, but for some reason we're more willing to argue that the existence of good mathematicians means math has to be intrinsic. He carries this analogy forward; I found it useful to get a bigger picture of what he's saying.

Other interesting data: numbers aren't universal! If you look at non-industrialized cultures, some have limited numeral systems, sometimes only naming quantities in the subitizing range, and then modifying those with quantifiers equivalent to many. Comparing fMRIs of native English speakers carrying out a numerical task with native Chinese speakers (both groups having a thorough grasp of numbers) produces different results: "the neural circuits and brain regions that are recruited to sustain even the most fundamental aspects of exact symbolic number processing are crucially mediated by cultural factors, such as writing systems, educational organization, and enculturation."

Núñez argues that many animal studies are over-interpreted. They're difficult to do; it may require months of training to get an experimental animal to respond in a measurable and specific way to a numerical task, so we're actually looking at a plastic response to an environmental stimulus, one that is constrained by the basic properties of the brain being tested but isn't present in an unconditioned animal. That tells us the ability is within the range of what the brain can do if it is specifically shaped by training, not that it is a built-in adaptation.

What we need is a more rigorous definition of what we mean by "number" and "numerical", and he provides one: numerical cognition, unlike quantical cognition, is exact and symbolic.

Strangely -- because this is one case where I agree with human exceptionalism -- he argues that exact, symbolic number is a signature of Homo sapiens, but that it is not hard-coded into us, and that it may also be possible to teach non-humans how to do it. I have to add that exact, symbolic number processing is hard-coded into computers, although they currently lack conscious awareness or intent, so being able to process numbers is not sufficient for intelligence, and an absence of the cultural substrate to enable numerical processing also does not imply a lack of intelligence.

The paper doesn't exactly answer all of my questions, but at least it provides a clearer framework for thinking about them.


Up above, I said I'd give an example of bad evolutionary thinking from elsewhere in the literature. Conveniently, the journal Trends in Cognitive Sciences provides one -- they link to a rebuttal by Andreas Nieder. It's terrible and rather embarrassing. It's not often that I get flashed by a naked Panglossian like this:

Our brain has been shaped by selection pressures during evolution. Therefore, its key faculties – in no way as trivial as snowboarding – are also products of evolution; by applying numbers in science and technology, we change the face of the earth and influence the course of evolution itself. The faculty for symbolic number cannot be conceived to simply ‘lie outside of natural selection’. The functional manifestations of the brain need to be adaptive because they determine whether its carrier survives to pass on its genes. Over generations, this modifies the genetic makeup of a population, and this also changes the basic building plan of the brains and in turn cognitive capabilities of the individuals of a population. The driving forces of evolution are variation and natural selection of genetically heritable information. This means that existing traits are replaced by new, derived traits. Traits may also shift their function when the original function becomes less important, a concept termed ‘exaptation’. In the number domain, existing brain components – originally developed to serve nonverbal quantity representations – may be used for the new purpose of number processing.

I don't think snowboarding is trivial at all -- there are a lot of cognitive and sensory and motor activities involved -- but just focus on the claim that "the functional manifestations of the brain need to be adaptive because they determine whether its carrier survives to pass on its genes". It's absurd. It's claiming that if you find any functional property of the brain at all, it had to have had an adaptive benefit and have led to enhanced reproduction. So, apparently, the ability to play Dungeons & Dragons was shaped by natural selection. Migraines are adaptive. The ability to watch Fox News is adaptive. This really is blatant ultra-adaptationism.

He also claims that "it has been recognized that numerical cognition, both nonsymbolically and symbolically, is rooted in our biological heritage as a product of evolution". OK, I'll take a look at that. He cites one of his own papers, "The neuronal code for number", so I read that, too.

It's a good paper, much better than the garbage he wrote in the rebuttal. It's largely about "number neurons", individual cells in the cortex that respond in a roughly quantitative way to visual presentations of number. You show a monkey a field with three dots in it, no matter what the size of the dots or their pattern, and you can find a neuron that responds maximally to that number. I can believe it, and I also think it's an important early step in working out the underlying network behind number perception.
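
As a cartoon of what "responds maximally to that number" means, here's a sketch of an idealized tuning curve. The Gaussian-on-a-log-scale shape is a common way such tuning is summarized; the parameters here are my own invented values for illustration, not Nieder's fits:

```python
import math

def number_neuron(stimulus_n, preferred_n, width=0.35):
    """Idealized 'number neuron': fires most at its preferred
    numerosity and falls off for neighbors. Gaussian tuning on a
    log scale; the width value is invented for illustration."""
    d = math.log(stimulus_n) - math.log(preferred_n)
    return math.exp(-d * d / (2 * width ** 2))

# A neuron tuned to "three": note that dot size and arrangement
# never enter the computation at all, only the count.
for n in (1, 2, 3, 4, 5):
    print(n, round(number_neuron(n, preferred_n=3), 2))
```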

What it's not is an evolutionary study, except in the sense that he has a strong preconception that if something exists in the brain, it had to have been produced by selection. All he's doing in that sentence is affirming the consequent. It also does not address the explanation brought up by Núñez, that these are learned responses. With sufficiently detailed probing, you might be able to find a small network of neurons in my head that encode my wife's phone number. That does not imply that I have a hard-wired faculty for remembering phone numbers, or even that one specific number, that was honed by generations of my ancestors foraging for 10-digit codes on the African savannah.

Nieder has done some valuable work, but Núñez is right -- he's overinterpreting it when he claims this is evidence that we have a native, evolved ability to comprehend numbers.


Núñez RE (2017) Is There Really an Evolved Capacity for Number? Trends in Cognitive Sciences 21(6):409–424.

Nieder A (2017) Number Faculty Is Rooted in Our Biological Heritage. Trends in Cognitive Sciences 21(6):403–404.

Nieder A (2016) The neuronal code for number. Nature Reviews Neuroscience 17(6):366–382.


This problem of the conceptual representation of numbers has played a big role in my independent-scholar AI career. I will preempt ad hominem attacks by reiterating that my work is more philosophy than science. I spent my twenties designing memory channels to model the brain in artificial intelligence, but I was stumped by the problem of the visual recognition of objects, until a series of articles in BYTE Magazine explained the neuronal mechanism of "feature extraction" to me.

Once I had mapped out five memory channels for sensory input, I came up against the problem of how to access plural nouns in the auditory memory channel when the brain is trying to speak or think about the visual recognition of more than quantity one of a known phenomenon. I was really stymied. I could visualize a direct associative tag between one recognized dog in the visual memory channel and the singular noun "dog" in the auditory memory channel, but I could not see how to tag multiple dog-engrams over to the plural "dogs" in audition. At such an impasse it suddenly hit me -- there must be an abstract memory channel intermediating between visual memory recognitions and their linguistic description in auditory memory.

Then I spent three agonizing months fully convinced that there was an abstract memory channel but not having the slightest idea of how it worked, until it all came tumbling out of me over a six-month period onto the pages of my hand-written journal. Borrowing from Ludwig Wittgenstein and his "Tractatus Logico-Philosophicus" I explored the idea of a "logico-conceptual cable" of neuronal fibers containing concepts of things, actions and numbers. I abandoned the problem of noun-plurals as low-hanging fruit and I went instead after the more complex problem of summoning up verbs in auditory memory to describe actions being recognized in visual memory.

Cutting to the chase, right now in May of 2017 I am working on "How the Mind Works" at http://ai.neocities.org/theory.html not to develop my ideas but to convey them.

By Mentifex (Arth…) on 20 May 2017

Where do numbers come from?

From evolution, of course.

By See Noevo on 22 May 2017

Seems to boil down to a more careful look at nature/nurture. Given that so many animals learn while they're young, we're not so special, just more so.
BTW, I appreciate it when you take the time to write these essays.

There is a similar discussion in Matt Parker's "Things to Make and Do in the Fourth Dimension," about the difference between digits and numbers.

I think that Piaget's idea of sensorimotor schemata applies to our concepts of number. Not all concepts are purely verbal. Asked what a spiral staircase is, many people will say something like, "One that goes like this," and make an appropriate gesture. In theory, all numbers can be derived from the natural numbers, the ones we use for counting. Sometimes you can see people gesture when they count. They may nod their heads or point with their hands or arms. People count on their fingers. My grade school teachers discouraged counting on your fingers, but they were probably wrong to do so. Counting is different from looking at a collection of objects and recognizing that there are six of them. Don't we have a concept of 27 because we can count to 27? Doesn't counting have a sensorimotor basis?

It would seem the neural networks of the brain are functionally somewhat similar to the artificial networks being experimented on by computer scientists these days to recognise animals, drive cars and play Go and the like. I imagine evolution hit on a design that is fairly general and can be used for all sorts of cognitive discrimination rather like the computer versions.

By Tim Spear on 25 May 2017

I have a question that I think, when it comes to Christianity, needs to be addressed.

I am not a Christian. My belief is that belief in God is infantile narcissism. This is Freud's theory and I think it helps explain things for me.

I like Freud's theory because, logically, how could someone read the entire Bible (and many Christians do), many times and even examine it, and yet not have some degree of intelligence? Now, many people go to church just for the social atmosphere. But I know that a lot of people who are Christians read a lot: the Bible, Aquinas, writings of the saints, Paradise Lost, Dante, etc. Catholics seem to be especially intelligent when it comes to literature.

So, how can a person have an average or low IQ and yet be able to read and interpret the Bible or Paradise Lost or City of God from a Christian (or Catholic) perspective? I don't think the problem here is intelligence, at least for some, but mental illness: the theory of a belief in God as infantile narcissism as posited by Freud.

Surely someone else has thought of this but I haven't done much research on it; I've just been pondering the logic behind it.

I did a little research, though, on Catholic monastics: they read a lot of Catholic literature and philosophy (including atheist philosophy and probably evolution, too) and examine their own behavior. Surely, some type of intelligence is present here. The problem isn't low intelligence, at least for some...the problem is infantile narcissism.

And I know a lot of people don't believe that Freud is correct, either...but I can't wrap my head around the logic of:
- reads and interprets entire Bible
- reads and interprets Paradise Lost and Dante and Aquinas and City of God from a Christian (or Catholic) perspective
- has average or low IQ

Can anyone else see the logical fallacy here?

The more interesting question, I think, is where do numbers go when you're done with them?