Kids' misconceptions about numbers -- and how they fix them

One of our readers emailed us asking whether there has ever been research on whether kids' understanding of numbers -- especially large numbers -- differs from adults'. Greta did a little poking around and found a fascinating study on second- and fourth-graders.

In the U.S. (and I suspect around the world), kids this age are usually taught about numbers using a number line. In first grade, they might be introduced to a line from 0 or 1 to 10. In second grade, this is typically expanded up to 100. But what happens when second-graders are asked to place numbers on a line extending all the way up to 1,000?

This happens:

[Figure: number lines from 0 to 1,000 showing where second- and fourth-graders placed each number, with 500 marked in gray for reference]

I've placed the 500 (in gray) on these lines for reference; in the real study, conducted by John Opfer and Robert Siegler, the kids used lines with just 0 and 1000 labeled. They were then given numbers within that range and asked to draw a vertical line through the number line where each number fell (they used a new, blank number line each time). The figure above represents (in red) the average results for a few of the numbers used in the study. As you can see, the second-graders are way off, especially for lower numbers. They typically placed the number 150 almost halfway across the number line! Fourth graders perform nearly as well as adults on the task, putting all the numbers in just about the right spot.

But there's a pattern to the second-graders' responses. Nearly all the kids (93 were tested) understood that 750 was a larger number than 366; they just squeezed too many large numbers on the far-right side of the number line. In fact, their results show more of a logarithmic pattern than the proper linear pattern. This chart of the results makes the pattern clear:

[Figure: children's estimates plotted against actual values, with logarithmic and linear best-fit curves]

Each curve fits its group's data quite closely: logarithmic for the second-graders, linear for the older kids. What's more, the second-graders' performance didn't change over time as they worked on the problems. It appears that they simply have a different conception of how numbers are distributed over the number line compared to older kids and adults.
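The curve-fitting comparison can be sketched in a few lines of code. This is only an illustration with made-up placement numbers (loosely mimicking the second-grade pattern described above, NOT the study's data): a simple least-squares fit of position against x and against log x, compared by R².

```python
import math

# Hypothetical second-grader placements (number -> position on a 0-1000
# line). These values are invented for illustration, not taken from the study.
data = {5: 120, 27: 300, 150: 480, 366: 610, 750: 730}

def fit_r2(transform):
    """Least-squares fit y = a * transform(x) + b; return R^2."""
    xs = [transform(x) for x in data]
    ys = list(data.values())
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

print(f"linear fit R^2: {fit_r2(lambda x: x):.3f}")
print(f"log fit R^2:    {fit_r2(math.log):.3f}")
```

With placements like these, the logarithmic fit comes out far better than the linear one, which is the kind of comparison the chart above is making.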

So what happens between second grade and fourth grade to cause this difference? How do kids learn the relationship between numbers and their proper place on the line? In a second experiment, Opfer and Siegler worked with just second graders to see what it took to teach them to place the numbers correctly.

In their first experiment, they had found that about a third of the second-graders matched the performance of fourth-graders. So working just with the 61 second-graders whose results best fit a logarithmic pattern, they repeated the study with a key difference. This time, on certain key trials, the kids were given feedback: If they were off by more than 10 percent, the experimenter told them where the number should go on the number line. If they were correct, they were asked how they knew to put the number in the right spot. At the end of the study, which took less than an hour, the second-graders' results looked remarkably like the fourth-graders': accurate, linear placement of the numbers on the line.

What's more, most of the kids followed the same pattern in learning: When they got it, they got it right away. Take a look at this graph:

[Figure: percentage of results best fit by logarithmic vs. linear functions, aligned on the block where each child's best-fitting function switched]

The experiment was divided into four blocks, with feedback given before each block. The graph defines "0" as the block at which the function that best explained a child's answers switched from logarithmic to the correct linear function. The gray line shows the portion of the results best explained by a logarithmic function, while the black line shows the portion best explained by a linear function. As you can see, once the linear form is learned, the transformation is quick and permanent.

Interestingly, even if students were given feedback only for higher numbers or lower numbers, they still learned the correct linear form for the entire length of the number line. They learned fastest when they were given feedback for numbers around 150, which represented the largest typical error, but nearly all the students who were given feedback eventually understood the concept.

Opfer and Siegler argue that a "rapid and broad" change like this can occur in a variety of different types of knowledge, ranging from fractions to biology. They claim that there are certain concepts which, when grasped, open up a whole different understanding or representation of how a system works.

One final thought: I wonder if even adults would display a similar pattern to second graders, if the numbers were large enough. I'd be interested to see a study on adults using numbers, say, from zero to a billion. How many adults would properly place the value of one million just a third of a millimeter from the left end of a 30-centimeter line?
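The arithmetic behind that last question is easy to check. Here's a quick sketch (assuming a 30-centimeter line running from zero to one billion) comparing where one million lands under a linear placement versus a logarithmic one:

```python
import math

LINE_MM = 300.0          # a 30-centimeter number line, in millimeters
TOP = 1_000_000_000      # right end of the line: one billion
N = 1_000_000            # where does one million go?

# Linear placement: position is proportional to the value itself.
linear_mm = N / TOP * LINE_MM

# Logarithmic placement: position is proportional to log(value);
# the base cancels out in the ratio.
log_mm = math.log(N) / math.log(TOP) * LINE_MM

print(f"linear: {linear_mm:.2f} mm from the left end")   # about a third of a mm
print(f"log:    {log_mm:.1f} mm from the left end")      # two-thirds of the line
```

On a linear scale, one million sits 0.3 mm from the left end; on a logarithmic scale it sits two-thirds of the way across, at 200 mm.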

Opfer, J., Siegler, R. (2007). Representational change and children's numerical estimation. Cognitive Psychology, 55(3), 169-195. DOI: 10.1016/j.cogpsych.2006.09.002


Or perhaps there's a problem with the adults not defining things well enough. Maybe the instructions were more explicit, but all I see here is the notion of the "proper" place for a number. What does that mean? Perhaps improving results are simply the effect of kids being forced into the conventional way of thinking.

Counter-example - I've seen reproductions of very old maps (medieval?). Two come to mind, one of the Holy Land and one of Northern Italy. Common characteristic - they look inaccurate until you realize that they are not done to a linear scale as are most maps today. They're done so as to be compact and yet fit in words or pictures about significant features. There are indications of how far one could expect to travel in a day ... which after all is the significant item for a traveller. They are wildly non-linear, but marvellously practical.

Maybe the kids were just gradually having the flexibility and creativity beaten out of them! :-)

By Scott Belyea (not verified) on 04 Dec 2007 #permalink

Well, maybe there is a misconception about the "right way" of putting numbers on a line. I've been studying physics for three semesters now and had to learn logarithmic scales by heart.

In fact, I find the concept of big, very big and very very big numbers incorporated in them a lot more "natural" than the linear one because, as you remarked, it probably is the way we deal with numbers larger than those we learnt to handle linearly.

All of you, me and the scientists conducting the study have obviously made the step from logarithmic to linear scales at one point; maybe we are the ones doing it wrong now.

To a little girl, there are about as many years between herself and her mother as between her mother and her grandmother, but early on, asked to put the three on an age scale, she would place her mother much nearer her grandmother than herself.

We perceive a lot of things on logarithmic scales -- sound intensity and light intensity, to name a couple of obvious ones.

Perhaps it's natural to perceive number as being logarithmic. Only with learning does the linearity of the number line come to be understood.

Does it have to do with the type of math they are learning at different stages? By fourth grade, students understand multiplication tables and the consistent addition of a particular number when multiplying.

Second graders are dealing with addition, subtraction, and learning to carry and regroup ones and tens. They are not focusing on the "big picture" yet. 750 might as well be 10,000 in their minds. Such numbers represent quantities of things that they have probably never dealt with in "hard copy" reality.

Every kid has had a box of 100 Lego pieces, but not many have had 1000. It seems beyond comprehension to imagine so much of one thing.

It looks like the 2nd grader response could be obtained by the following strategy: Place the numbers in order on the line, approximately evenly and comfortably spaced, until you run out of numbers to place. The hypothesis proposed above ("2nd graders conceive of numbers logarithmically" or something) is nice, but mine seems roughly as simple. Does the data distinguish between these two?

I don't really find using a logarithmic distribution to be "wrong" in any strong sense of the word. It's a very natural way to think of scales (our perceptual systems are logarithmic to boot). It really looks like we're simply defaulting to using a logarithmic scale and switch to linear only when people around us tell us we have to.

I wonder what would happen with school children that get encouraged to continue using logarithms and get their early math presented in a way consistent with them (talking about multiplication early and presenting it graphically on logarithmic scales, for instance)?

I've been doing a fair amount of reading about numerical cognition such as this and think there are a few interesting things to be said. Similarly to what 6EQUJ5 mentioned, logarithmic scales show up all over the place in numerical cognition. One place is an effect known as the distance effect, where performance (both accuracy and reaction time) on number-comparison tasks (judging which of two numbers is larger) improves with increasing distance.

As such, most computational models of number representation posit that numbers are represented in the form of an analogue number line (there's even evidence that numbers are actually spatially coded, too: see Dehaene, S., Bossini, S., & Giraux, P. (1993)) on a logarithmic scale.

I wonder if the log relationship is related to Weber's law.

Perhaps the only numbers that stand out to little kids, as they learn numbers early on, are the ones where they need to add a new digit: 10, 100, 1000. Their attention will skip over the numbers in between because they can fill them in using pattern-matching (ie, once they know to count from 1 to 10, and then from 11 to 20, and 21 to 30, chances are they'll breeze through 31 to 40, 41 to 50, and each other group of ten up until 100.)

Of course, at some point they'll learn to add a digit when you get to 99, 999, 9999, etc. I'm guessing that this concept is solidified by the time they've learned place value in school - somewhere in 2nd or 3rd grade?

Then there's the issue of naming, which might also affect kids' conceptions of the number line. "One," "ten," "hundred," and "thousand" are all different words, whereas the numbers in between are all composites of these words (and the words denoting their multiples). Then the next new number word is "million". A kid will have to stop and learn a new word each time they run out of numbers, but they can skim through the ones in-between by building them up with existing number names. One, ten, hundred, thousand, and million could gain landmark status in his or her conception of the number line.

As for your final question: I, for sure, unconsciously place 1 million, 1 billion, 1 trillion, etc. evenly on a number line - probably because their names suggest a linear relationship. I rarely have to think about these numbers in that form, except perhaps in the populations of big cities. So that region of the number line has never had the chance to mend itself, although I intellectually understand that they increase exponentially. (More often I think of 10^6, 10^7, 10^8, etc, which of course appear on a logarithmic number line.)

It'd be interesting to see more data on this!

Dehaene and co have a lot of data from human fMRI studies and monkey neurophysiology suggesting that the brain has a logarithmic number line (even in monkeys) and that humans need verbal training to switch to a linear number line.
For example, see -
Izard V, Dehaene S. Calibrating the mental number line. Cognition. 2007 Aug 1
Feigenson L, Dehaene S, Spelke E. Core systems of number. Trends Cogn Sci. 2004 Jul;8(7):307-14.

Oh come on, it's blindingly obvious what's going on. The 2nd grade responses in the first figure are almost perfectly even in spacing; the obvious hypothesis is that they're NOT trying to space the numbers at all - they're just putting them in order! The "logarithmic" effect is simply an artifact of the experimentalists' choice of numbers - they have a roughly exponential distribution, so when you space them evenly on a linear scale... you get a logarithm!

{15, 27, 150, 366, 750}

I want to see the control where they gave the 2nd-graders a linear distribution like {130, 260, 390, 520, 650}; I'd strongly expect a null hypothesis, the exact same spacing as in figure 1.

By untenured anonymous (not verified) on 05 Dec 2007 #permalink
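[For what it's worth, the commenter's arithmetic point is easy to check: if the stimuli grow roughly exponentially, spacing them evenly in rank order correlates almost perfectly with their logarithms, so the two hypotheses are hard to distinguish from a fitted curve alone. A quick sketch, using the numbers quoted above (this doesn't address the blank-line objection raised in the replies below):]

```python
import math

nums = [15, 27, 150, 366, 750]   # the stimuli quoted in the comment above

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs))
    sy = math.sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (sx * sy)

even = list(range(len(nums)))          # evenly spaced positions, in rank order
logs = [math.log(n) for n in nums]     # logarithmic placement of the stimuli

r = pearson(even, logs)
print(f"correlation between even spacing and log placement: r = {r:.3f}")
```

The correlation comes out near 0.99 for this stimulus set, which is the commenter's point: even spacing of exponentially distributed stimuli looks logarithmic.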

"untenured anonymous" you are a genius. I love it when people point out how silly some of these educational experiments are. After reading your comment I couldn't stop laughing!

#11: but the post states that the first figure represents only "a few of the numbers used in the study" and more importantly "they used a new, blank number line each time".

"untenured anonymous" didn't even read the article enough to know that students were asked to put one single number on one blank line and that the figure, with it's even spacing" is not a representative response but rather a summarization over many experiments and many children. A little more attention and a little less rabid reactionary-ism would prevent a lot of blather.

I have only a few distinct memories of early childhood, but the ones I still have really stick out. I actually developed my own pattern ideas for patterns I didn't fully understand. This memory is probably from grade 2: I knew the numbers from 1 to 10, and I had somehow heard of higher levels of numbers. Thus, when I tried counting beyond my own means, it went like this: 1 2 3 4 5 6 7 8 9 10 100 1000 1,000,000 1,000,000,000 (there may have been a zillion in here) infinity. There you have it; only 15 or 16 numbers in existence. I didn't even know how to write a million or billion, but I knew the words and their relative order. Someone, of course, corrected me later on. Strangely enough, this was probably way ahead of the class, because while I was trying to count higher than I knew how, the other kids were probably playing games and sports and such...

I already guessed that it was logarithmic, since 10, 100, 1000, *seem* to be a very logical order for numbers and *feel* adjacent to each other. That's why they made it logarithmic. By the way, as a high school student, I would have imagined that the "million" mark would be overshadowed by the "billion" mark, making the million right next to the zero mark.

A very fascinating article.

On a related topic, I'd love to see a discussion of how people relate to "big" and "small" numbers. For example, how do you describe a "micron" in a meaningful way to a lay person? How can you relate the "Big Bang" to someone while discussing time in 10^-11 seconds after the BB? Or for that matter, can the human mind ever *really* wrap itself around the idea that the universe is 13.7 billion years old?

I think this matters because people are constantly confronted with tiny but statistically real risks. Can those people really evaluate the possibility of increasing their risk of cancer by .0001 percent? In a similar vein, how can the average person "make sense" of Evolution and the Big Bang given that they occur over time periods well beyond the realm of normal experience?

Very interesting and thought-provoking--both the article itself and the analysis of it. I've always been very interested in psychology, particularly cog-psych. Looks like I'm going to have to read Cog Daily more often!

Nice post. My father, www.williamghunter.net, a professor of statistics and of Chemical Engineering at the University of Wisconsin, was a guest speaker for my second grade class to teach us about numbers - using dice. He gave every kid a die. I remember he asked all the kids what number do you think will show up when you roll the die. 6 was the answer from about 80% of them (which I knew was wrong - so I was feeling very smart).

Then he had the kids roll the die and he stood up at the front to create a frequency distribution of what was actually rolled. He was all ready for them to see how wrong they were and learn it was just as likely for any of the numbers on the die to be rolled. But as he asked each kid about what they rolled something like 5 out of the first 6 said they rolled a 6. He then modified the exercise a bit and had the kid come up to the front and roll the die on the teachers desk. Then my Dad read the number off the die and wrote on the chart :-)
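[The anecdote above hinges on each face of a fair die being equally likely. A quick simulation sketch (the seed and roll count are arbitrary choices, not from the story) shows the roughly flat frequency distribution the kids were supposed to discover:]

```python
import random
from collections import Counter

random.seed(0)  # arbitrary seed, for a reproducible "classroom"

# Simulate 6000 honest rolls of a fair six-sided die.
rolls = [random.randint(1, 6) for _ in range(6000)]
counts = Counter(rolls)

# Each face should come up close to 1000 times.
for face in range(1, 7):
    print(f"face {face}: {counts[face]} rolls")
```

No face dominates; each lands near one-sixth of the total, which is what the frequency chart at the front of the class was meant to show (absent kids palming the die).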

I assume that sequence effects were protected against by randomization; otherwise it is likely that the children "remembered" where they wrote the last number and compared each new number against it. That could explain why they placed the numbers they could relate to (5, 27, 150) relative to each other, rather than to the scale.

This could also be a dimensional problem. Maybe second graders cannot combine numerical size and geometrical length efficiently enough?

An amazing study. It proves that their mistake follows a mathematical pattern. From beta to alpha and back. "Chapeau," as they say in French.

Very interesting. What I realised after reading the comments is that the decimal system IS a logarithmic number scale, where the significant mileposts grow exponentially further apart. Perhaps the little ones (not sure how old second- and fourth-graders are) have a better grasp of it than we think.

Matthias:

My problem extends well past "big" and "small" numbers on a linear scale... Indeed, even as I KNOW it's not the case, I have always found something "fake" about logarithmic scales.

Yet another simpler explanation: smaller numbers are more real (in the sense of more familiar in everyday experience) to young children than larger numbers. Therefore smaller numbers are allotted more space at the "starting" end of the scale while less interesting larger numbers would be shoved off toward the high end of the scale, sort of as an afterthought. It reminds me of "naive" illustration in which more important people or objects are rendered larger than they would be in correct perspective. Is it also worth recalling that in the process of cortex maturation (oops, a different blog), thickening peaks and pruning begins between 2nd and 4th grades?

Before I started school I used to like to count to see how high I could get. I remember the first time I got to 99, I didn't know what came next. It occurred to me that it might be 100, but I had this conception of 100 as being an impossibly large number that you could never count to.

By Fibonacci (not verified) on 08 Dec 2007 #permalink

6EQUJ5 notes [#3] that we perceive sound and light intensity logarithmically. I suggest another for that list: money. Specifically, financial gains are perceived (and the uncertain realization of those gains is evaluated) to reflect their declining marginal utility. This is seen in the S-shaped value function associated with prospect theory.

Has anyone evaluated that curve against a logarithmic scale?

Apropos suggestions that kids have numeric instincts that are pragmatic and sound from evolutionary perspectives:
"Monkeys Rival College Students' Ability to Estimate" (The monkeys outperformed the college kids in estimating numbers.)

And the best part: "Cantlon said young children probably do something very similar before they learn formal arithmetic."

http://www.npr.org/templates/story/story.php?storyId=17344697

Re the following statement: "Opfer and Siegler argue that a "rapid and broad" change like this can occur in a variety of different types of knowledge, ranging from fractions to biology. They claim that there are certain concepts which, when grasped, open up a whole different understanding or representation of how a system works."

Anyone out there familiar with 'threshold concepts'? This seems to be exactly what Opfer and Siegler are describing. This is a good place to start for anyone interested in finding out more: http://www.tla.ed.ac.uk/etl/docs/ETLreport4.pdf

By Grumblepuss (not verified) on 18 Dec 2007 #permalink