On the subject of basic concepts, here's an essay I originally posted back in June. In it I try to explain what infinity is all about. It seems appropriate for this series, so I thought I would bring it back. Enjoy!

________________________________

Think for a minute about basic arithmetic. Addition is something that is done to two numbers. You take two real numbers and add them together to produce another real number. But suppose you had three numbers, x, y, and z? What does it mean to add three numbers together?

Very simple. You would begin by adding x to y. Then, you would take the result and add that to z. The resulting number can be said to be the sum of x, y and z. The point is that at each step of the process we only added two numbers together. The fact that addition (and all of the other standard arithmetic operations like subtraction, division and multiplication) can only be carried out with two numbers is what is intended by describing addition as a “binary operation.”
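The procedure can be sketched in a few lines of Python (the function name here is just for illustration):

```python
def sum_pairwise(numbers):
    """Add a finite list of numbers using only the binary operation
    'add exactly two numbers at a time'."""
    total = numbers[0]
    for n in numbers[1:]:
        total = total + n  # each step combines exactly two numbers
    return total

print(sum_pairwise([3, 5, 7]))  # prints 15
```

Built-in functions like Python's `sum` do exactly this under the hood: a chain of binary additions.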

The procedure just outlined can be extended to any finite collection of real numbers. Thus, if I hand you ten numbers you can add them all together by starting with any two of them and proceeding from there. But what does it mean to add up infinitely many numbers? You certainly can't go two at a time any more. You will never come to the end of the process if you try it that way.

The answer is that adding up infinitely many numbers is a fundamentally different process from adding up finite collections of numbers. The only way to give meaning to the idea of finding the sum of an infinite series is to use the idea of a limit. The basic idea is this:

Suppose you have a series like:

(1/2) + (1/4) + (1/8) + (1/16) + (1/32) +...

The bottom of each fraction is twice the number that appears on the bottom of the previous fraction.

Notice that the first fraction is (1/2). The sum of the first two fractions is (1/2)+(1/4)=(3/4). The sum of the first three fractions is (7/8), the sum of the first four is (15/16), and the sum of the first five is (31/32).

We then line up these sums and stare at them for a moment:

1/2, 3/4, 7/8, 15/16, 31/32, ...

Since each term in this sequence represents the sum of part of the series, we refer to it as the sequence of partial sums.
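If you'd like to check those partial sums by machine, here is a small sketch in Python using exact fractions (the function name is mine, not standard):

```python
from fractions import Fraction

def partial_sums(k):
    """First k partial sums of 1/2 + 1/4 + 1/8 + ..., as exact fractions."""
    sums = []
    total = Fraction(0)
    for n in range(1, k + 1):
        total += Fraction(1, 2 ** n)
        sums.append(total)
    return sums

print(partial_sums(5))
# [Fraction(1, 2), Fraction(3, 4), Fraction(7, 8), Fraction(15, 16), Fraction(31, 32)]
```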

You might notice that each term in the sequence is closer to one than the term preceding it. In fact, it seems reasonable to say that if I continued evaluating partial sums, I would produce fractions that become arbitrarily close to one. And from there it seems reasonable to take the plunge and say that since the limit of the sequence is one, we may as well say that our original series adds up to one.

I have left out a fair amount of technical detail here. The term “limit” has a precise definition, usually expressed with a multitude of Greek letters (epsilons and deltas, to be specific). But the important point is that summing an infinite series is a fundamentally different process from adding up a finite collection of numbers. The latter is adequately modeled by thinking about how many objects you have when collections of various sizes are put together. The former involves computing a certain sequence of numbers and trying to determine its limit (if, indeed, the limit exists at all).
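For the record, here is the sequence version of that definition (the one with an N rather than a delta), in the usual notation:

```latex
\lim_{n \to \infty} s_n = L
\quad \Longleftrightarrow \quad
\forall \varepsilon > 0 \;\, \exists N \;\, \forall n \ge N : \; |s_n - L| < \varepsilon
```

In words: no matter how small a tolerance epsilon you demand, from some point on every partial sum lies within that tolerance of L.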

That may seem like a subtle distinction, but it is crucial. Many of our intuitions about finite arithmetic break down when you try to apply them to infinite sums. I wrote a lengthy post a while back exploring one bizarre consequence of the way the sum of an infinite series is defined. Specifically, consider the series:

1 - (1/2) + (1/3) - (1/4) + (1/5) - (1/6) + ...

Take my word for it that this series adds up to *log 2*, where the log refers to the natural (base e) logarithm.

Now for the bizarre part. I can rearrange the terms of that series so that when you add it up, instead of getting *log 2*, you get the number five instead. Don't like five? Well, I can rearrange the series so that you get seven, or eleven, or minus eighteen or the square root of two or any other real number you care to mention. Really! The details are given in the post linked to above.
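Here is a rough sketch of how such a rearrangement can be carried out in practice, in Python. The greedy strategy is the one from the proof of Riemann's rearrangement theorem; the function name and the number of terms are my own choices:

```python
def rearranged_sum(target, num_terms=1_000_000):
    """Greedily rearrange 1 - 1/2 + 1/3 - 1/4 + ... toward a chosen target:
    add positive terms (1, 1/3, 1/5, ...) until the running total exceeds
    the target, then negative terms (-1/2, -1/4, ...) until it drops below,
    and repeat. Every term of the original series gets used exactly once."""
    total = 0.0
    pos, neg = 1, 2  # next odd and even denominators
    for _ in range(num_terms):
        if total <= target:
            total += 1.0 / pos
            pos += 2
        else:
            total -= 1.0 / neg
            neg += 2
    return total

print(rearranged_sum(5.0))  # a running total close to 5
```

The trick works because the positive terms alone diverge to infinity and the negative terms alone diverge to minus infinity, while the individual terms shrink to zero; so you can always climb above any target and the overshoots get ever smaller.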

Sums of infinite collections of numbers are fundamentally different from sums of finite collections of numbers.

There is no distinction between saying the sum of the series equals one, on the one hand, and saying it merely converges to one, on the other. When talking about infinite sums, convergence is the only game in town. Saying the sum equals one and saying the series converges to one are two different ways of saying the same thing (with the second formulation a bit more precise).

But in evaluating such a sum, does *n* actually reach infinity or does it merely approach infinity? The answer is that infinity is not a final destination for wandering variables. The entire phrase “as *n* approaches infinity” has a precise definition. You should not think of this phrase as indicating that *n* is the sort of thing that goes places, that infinity is a place for it to go, or that the word “approaches” means the same thing here as it does in everyday speech.

Our newfound insight allows us to make sense of the idea that .99999... repeating equals 1. The expression .9999... repeating is a short-hand way of writing the number obtained when the infinite series

(9/10) + (9/100) + (9/1000) + ...

is evaluated. It is a consequence of the way the sum of an infinite series is defined that the series above converges to one. Therefore, it is meaningful to say that the expression .99999... repeating is another way of writing the number one.
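A quick sanity check in Python, again with exact fractions (the function name is mine): the gap between 1 and the k-th partial sum is exactly 1/10^k, which can be made as small as you please.

```python
from fractions import Fraction

def gap_from_one(k):
    """1 minus the k-th partial sum of 9/10 + 9/100 + 9/1000 + ..."""
    partial = sum(Fraction(9, 10 ** n) for n in range(1, k + 1))
    return 1 - partial

print(gap_from_one(1))  # 1/10
print(gap_from_one(5))  # 1/100000
```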

The final point is that there is no philosophical question here. That .9999... repeating equals one is a logical consequence of the way various terms are defined, and that is all.

Which brings me, finally, to the question asked in the title of this post. What is Infinity? Well, the answer is that infinity, by itself, is not a word that mathematicians use very much. It is definitely not a number. You can't add infinity to other numbers or multiply a number by infinity. And if you ever come across a math book where it appears the author is doing precisely that, I can assure you that specific rules were established for doing so in the context presented in the book.

It is often said that infinity is a concept. It expresses the idea of boundlessness, or of something that never ends. That's a reasonable way of thinking about it. I would repeat, however, that infinity as an abstract concept just isn't something mathematicians talk about very much.

In calculus people often say “as x approaches infinity,” and in geometry you might talk about the “point at infinity,” but again, in these contexts it is the whole phrase that gets defined, not infinity by itself.

What mathematicians do talk about quite a bit is the infinite, especially infinite sets. It seems meaningful to say that there are infinitely many positive integers but only finitely many people on Earth. A great number of fascinating issues arise when you think seriously about infinite sets, but we'll save that for a different post.


Gee thanks Jason,

I think you bruised my brain. =)

My infinite hotel is full, but I'm sure I can find a room for you anyway...

One way that I have explained that 0.9999... is equal to 1 is to get people to consider the arithmetic difference between 1 and 0.9999...: 1 - 0.9999... = 0.0000..., which is kinda exactly equal to zero. So, there is no arithmetic difference between the numbers.

Have you seen Tom Lehrer sing about deltas and epsilons?

This reminds me of the Banach-Tarski paradox, which I believe is a consequence of the axiom of choice. What you describe is not really my specialty, but it seems there is a relationship between the observer's choices in arranging the terms and the outcome.

Sal