A while back I mentioned the St. Petersburg paradox. It’s a hypothetical gambling scenario where you win money based on the outcome of a series of coin tosses. If you get your first tails on the first throw, you get $1. If you get one head before your first tails, you get $2. If you get two heads before tails, you get $4. If three heads before tails, you get $8, and so on, doubling each time.
How much should you be willing to pay to play this game? If you work out the mathematical expected value of the game, it turns out to be infinite. Play this game enough times and it doesn’t matter how much it costs to play: you will come out ahead in the long run. It might be the very long run, because so much of the value of the game is the one-in-a-bazillion chance to win a bazillion dollars. But formally, any casino that offers this game at any price will eventually go broke.
There’s a lot of discussion of this result and why so many people wouldn’t be willing to pay much to play it despite its mathematical value. For most people (including me), it boils down to the fact that while we might jump at the chance to bet a dollar on a 2% chance at winning $100, we’d hesitate to bet ten thousand dollars on a 2% chance at winning a million dollars. The math is the same, but what happens to us after the very likely loss is not.
But real life doesn’t quite fit this mathematical abstraction anyway. Any casino that offers this bet will only have finite resources. Therefore the bet they’re offering is to toss the coin either until tails comes up or until the casino goes broke. And that changes the math quite a bit. I’ll follow the argument on the linked Wikipedia article, since it’s a nice summary and it gets me out of typesetting it myself because ScienceBlogs still doesn’t have LaTeX.
Since your winnings double for each head, the maximum number of times you can flip heads before the casino goes broke is L = 1 + Floor[log(W)], where W is the casino’s total assets and log is the base-2 logarithm. The floor function leaves whole numbers unchanged and rounds everything else down to the nearest whole number. For instance, Floor[2.9] = 2.
So we calculate the expected winnings thusly:
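In symbols, with k counting the number of heads and the payout capped at the casino’s assets W, the calculation runs:

```latex
\begin{aligned}
E &= \sum_{k=0}^{\infty} \left(\tfrac{1}{2}\right)^{k+1} \min\!\left(2^k,\, W\right) \\
  &= \sum_{k=0}^{L-1} \left(\tfrac{1}{2}\right)^{k+1} 2^k \;+\; \sum_{k=L}^{\infty} \left(\tfrac{1}{2}\right)^{k+1} W \\
  &= \frac{L}{2} \;+\; \frac{W}{2^{L}}
\end{aligned}
```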
The first line is pretty easy. It’s the probability of tossing k heads times the winnings you get for tossing that many heads, which is whichever is smaller: 2^k dollars, or the casino’s total assets, because at that point you’ve just bankrupted them.
The second line splits the sum into each of those possibilities. The first term is the pre-bankrupting winnings and the second is the post-bankrupting winnings. The third line evaluates those sums. Remember that L is a function of W, the casino’s total holdings.
We can plot this. Notice I’m using a log-linear plot, so look carefully at the x-axis. The graph shows the expected winnings on the y-axis as a function of the casino’s holdings:
This is pretty instructive. If your friend offers you the bet but he’ll only pay out up to $100, then you shouldn’t be willing to pay more than about $4.28 if you want to come out ahead. But even if the maximum winnings are a million dollars, the game is still only worth about $10.95. And even if you were offered maximum winnings of a trillion dollars, you still wouldn’t be justified in paying much more than a Jackson. The logarithm grows very slowly, and so that infinite expected value only comes into play because of the infinite possible winnings. Such a situation will not obtain in Vegas, so you shouldn’t go betting your house to play the game. After all, they have statisticians too.
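If you want to check these numbers yourself, a few lines of Python will do it. This is my own quick sketch (not from the original post), using the closed form that the capped sum works out to, E = L/2 + W/2^L:

```python
import math

def expected_winnings(W):
    """Expected payout of the St. Petersburg game against a casino
    holding W dollars: E = L/2 + W / 2**L, with L = 1 + floor(log2(W))."""
    L = 1 + math.floor(math.log2(W))  # longest run of heads before the casino is broke
    return L / 2 + W / 2 ** L

for W in (100, 10**6, 10**12):
    print(f"Casino holds ${W:,}: the game is worth about ${expected_winnings(W):.2f}")
```

For $100 in casino assets this gives about $4.28, for a million dollars about $10.95, and for a full trillion only about $20.91.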