Breaking the Bank

In the study of probability there's a concept called expected value. If you're measuring some random process, the expected value is the average outcome over a very large number of trials. For instance, if you roll one dice you can expect to come up with a value of 3.5. Now obviously no individual roll can give you a 3.5; that's just the average you'll get over a large number of trials.

The expected value is calculated in the following way. Take the probability of one of the outcomes, multiply by the value of that outcome, repeat for all possible outcomes, and add them all up. For dice, the probability of each of the 6 outcomes is of course 1/6. So we have 1*(1/6) + 2*(1/6) + ... + 6*(1/6) = 3.5 for the expected value of a dice roll.
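In code, that calculation is just a weighted sum. A quick sketch in Python:

```python
from fractions import Fraction

# Expected value of a fair six-sided die roll:
# multiply each outcome by its probability (1/6) and add them up.
outcomes = [1, 2, 3, 4, 5, 6]
expected_value = sum(outcome * Fraction(1, 6) for outcome in outcomes)
print(expected_value)  # 7/2, i.e. 3.5
```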

If someone offered you the opportunity to roll a dice and gave you dollars equal to the value of your roll, over the long run you could expect to win $3.50 per roll. If you were paying to play the game as a gamble, you'd lose money in the long run if the game cost more than $3.50 to play. Casinos do this exact kind of expected value calculation in great detail for all their games. They want to make sure they charge more than the expected value for each game so they don't go broke. Indeed, there are a lot of mathematicians who make good money doing this very thing.

Now here's a sort of tricky scenario which explores the limits of the expected value concept. A casino offers you a coin flipping game with the following rules:

1. Flip a fair coin. (We may assume it's perfectly fair.)
2. If heads, the game ends and you win $N, where initially N = 1.
3. If tails, double N and repeat from step 1.

Eventually you'll come up heads and win some amount of money. If you flip tails many times first, it may in fact be a pretty huge amount, since the prize has doubled so many times before you come up heads.
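The game is easy to simulate. A minimal sketch in Python (the function name is my own invention):

```python
import random

def play_once():
    """Play one round: keep flipping until heads; the prize starts at $1
    and doubles on every tails."""
    prize = 1
    while random.random() < 0.5:  # tails: double the prize and flip again
        prize *= 2
    return prize  # always a power of two: $1, $2, $4, $8, ...

print(play_once())
```

Run it a few times: most rounds pay a dollar or two, but every so often a long run of tails pays off spectacularly.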

Using the expected value concept, how much money could you pay for this game and still come out ahead? How much would you actually be willing to pay, if it were personally offered to you?

Savvy readers may ask how much money the casino has. Start off assuming infinite wealth, and then, if you'd like, repeat for a casino of large but finite wealth. Go!


If I recall correctly, the expected value is undefined in this case, but I can't recall the argument as to why. I guess I could sit down and do the math, but tomorrow is Christmas and I'm on my holidays.

By Ketil Tveiten (not verified) on 23 Dec 2009 #permalink

Of course math is for holidays! What could be more fun?

Of course, the expected value here is infinite:
sum( prize(n) * probability(n) ) = sum( 2^n * (1/2^n) ) = sum(1) = inf

"n" being number of coins tossed, probability(n) being odds of getting tails only after n throws. I might have an off-by-one somewhere here, but it shouldn't matter.

So, in conclusion: as long as you promised me I could play this game as many times as I'd like, I'd pay any price you'd care to mention for the game, assuming I have enough cash...

I'm not sure about the finite cash scenario - if we get to the point that the casino is at its limit, do I get the money so far or not? If not, then obviously this is a bad bet. If yes, then the game is still worth it.

This looks as though the answer is simpler than it first appears (so I've probably got it wrong).

The value of the prize doubles for each increment in the length of the run of tails.
At the same time, the likelihood of a run of that length occurring halves at the same rate.

So:
Throw 1 - .5 chance of winning $1
Throw 2 - .25 chance of $2
Throw 3 - .125 chance of $4
...

The expected value is therefore $0.50 per throw. Pay more than this and in the long run you'll go broke.
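A quick Python check of the pattern above: the probability halves exactly as fast as the prize doubles, so every throw contributes the same $0.50 to the expectation.

```python
# Throw n: the game ends here with probability 1/2^n, paying $2^(n-1).
for n in range(1, 6):
    prob = 0.5 ** n
    prize = 2 ** (n - 1)
    print(f"Throw {n}: {prob} chance of ${prize}, contributing ${prob * prize}")
```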

#3, that's sort of the "Deal or No Deal" question: in the statistical long run it's almost always better not to take the deal. But in that game you only get the one shot, so the risk/reward decision is weighted by more than just the cash value.

So what do you think you'd pay if you could only play it once? I think my answer is "not much more than a few bucks", despite the considerable mathematical pressure to wager more.

Infinite expected value, sure, but you'd best be careful how much you pay to play this game. The probability of a big win is very, very, very small. In order to answer the question "how much would you pay to play" you need a good understanding of your risk attitude.

pedantry, but...

one die
two or more dice

#4, but you don't control the number of throws. I think the expected length of the game is somewhere between 1 and 2 throws (the math is beyond me).

The fee asked by the casino would be infinite if there is no cut off point.
The break even point would be $0.25 times the maximum number of coin tosses allowed in a row.

By Who Cares (not verified) on 23 Dec 2009 #permalink

Oops, that should have been $0.25 + $0.25 times the maximum number of coin tosses.

By Who Cares (not verified) on 23 Dec 2009 #permalink

Expected value is infinite, but the variance is too high. Your payout could be anywhere from $1 (with prob 0.5) up to arbitrarily large (with probability approaching 0 :-). Personally, I wouldn't pay more than 10 bucks, because I don't believe I can toss more than 5 straight tails.

#3:

I'm not sure about the finite cash scenario - if we get to the point that the casino is at its limit, do i get the money so far or not? If not, then obviously this is a bad bet. If yes, then the game is still worth it.

Actually, the expected values of the two scenarios differ by just $1!

Suppose the max prize is Q. In the case where you get the money so far, where prize(n) = 2^n for n <= log2(Q) (which we'll call N) and 2^N otherwise, the expected value is
sum( prize(n)*probability(n) ) = sum[n=0 to N] 1 + sum[n=N+1 to inf]( 2^N/2^n )
    = (N+1) + (2^N/2^(N+1)) * [1/(1 - 1/2)] = (N+1) + 1 = N+2
so, the expected value is log2(Q) + 2.

On the other hand, in the case where you lose the money so far, where prize(n) = 2^n for n <= N and 0 otherwise, we have the same result as the other case except that the geometric series is multiplied by 0 (since prize(n) = 0 for n > N) and therefore vanishes. So, the expected value is log2(Q) + 1.

By Avi Steiner (not verified) on 23 Dec 2009 #permalink

Shoot. In my previous post (#12), I meant "prize(n)=2^n for n less than or equal to log2Q" in both the first and second paragraphs. Damn html character thingies

By Avi Steiner (not verified) on 23 Dec 2009 #permalink

@Mohan (#11):
$10.00 is statistically speaking too much if you expect to only get 5 tails in a row max.
You only want to spend $1.50.

The first step is to see how many possibilities there are.
In this case that is easy since the coin toss is binary.
Then you want to see the payout for the different possibilities.
Next step is to sum the payouts and divide by the number of possibilities.

So to continue with the example of a max throw of 5 straight tails:
2^5 = 32 possible outcomes
16 outcomes: loss - pay $0
8 outcomes: 1 tail - pay $1
4 outcomes: 2 tails - pay $2
2 outcomes: 3 tails - pay $4
1 outcome: 4 tails - pay $8
1 outcome: 5 tails - pay $16
Summed pay: 16*$0 + 8*$1 + 4*$2 + 2*$4 + 1*$8 + 1*$16 = $48
So you expect to gain $48 in 32 attempts.
Meaning you can expect to make a profit if each attempt costs less than $1.50 ($48/32).
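That enumeration can be double-checked by brute force over all 32 equally likely five-flip sequences. A sketch in Python following the same payout scheme (a heads on the first flip is the "loss" that pays $0):

```python
from itertools import product

def payout(flips):
    # Count the opening run of tails; the prize is $2^(run-1),
    # except that no tails at all (heads first) pays $0.
    run = 0
    for flip in flips:
        if flip != 'T':
            break
        run += 1
    return 2 ** (run - 1) if run else 0

total = sum(payout(seq) for seq in product('HT', repeat=5))
print(total, total / 32)  # 48 1.5
```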

n = max expected number of tails in a row.

The amount of possible results is
2^n

for n = 1
Total expected result
2^(n-1)*0 + 1*2^(n-1) = $1
Expected result per attempt
(2^(n-1)*0 + 1*2^(n-1))/2^n = $0.50

for n > 1
The total expected result is given by
2^(n-1)*0 + sum[i=0 to n-2]( 2^i * 2^(n-2-i) ) + 1*2^(n-1)

The expected result (and because of that the break-even point) per attempt is
( 2^(n-1)*0 + sum[i=0 to n-2]( 2^i * 2^(n-2-i) ) + 1*2^(n-1) ) / 2^n

It is possible to simplify the expected result per attempt:
( 2^(n-1)*0 )/2^n + sum[i=0 to n-2]( 2^i * 2^(n-2-i) )/2^n + ( 1*2^(n-1) )/2^n
0 + sum[i=0 to n-2]( 2^i * 2^(n-2-i) )/2^n + 0.5
The sum part (I'm too lazy to do the entire thing) reduces to 0.25*(n-1),
so you end up with an expected result per attempt of
0.25*(n-1) + 0.5
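Brute-force enumeration (a sketch in Python, using the same payout scheme as the 5-toss example above) agrees with the 0.25*(n-1) + 0.5 formula:

```python
from itertools import product

def payout(flips):
    # Prize for an opening run of k tails is $2^(k-1); no tails pays $0.
    run = 0
    for flip in flips:
        if flip != 'T':
            break
        run += 1
    return 2 ** (run - 1) if run else 0

for n in range(1, 8):
    brute = sum(payout(seq) for seq in product('HT', repeat=n)) / 2 ** n
    formula = 0.25 * (n - 1) + 0.5
    print(n, brute, formula)  # the two columns always agree
```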

By who cares (not verified) on 23 Dec 2009 #permalink

This is, of course, the famous St. Petersburg Paradox. There is a good treatment of this in Poundstone's FORTUNE'S FORMULA, pp. 182-188, in which he discusses Bernoulli's utility function as an alternative to expectation.

By Gerry Rising (not verified) on 24 Dec 2009 #permalink

Like @7 said:
Reading this was like hearing fingernails on a blackboard.

By CCPhysicist (not verified) on 24 Dec 2009 #permalink

If you look at the individual probabilities on this, a sixth-flip win happens about 1.6% of the time (1 in 2^6). A tenth-flip win happens about 0.1% of the time (1 in 2^10). If you're looking to make enormous wealth, it looks like you might have a better chance at the lottery.

The lotto will pay you $24 mil right now and your odds are roughly 1 in 26 million.

The coin flip will pay you about $16 mil (2^24 dollars) on a 25th-flip win, odds 2^25 -> roughly 1 in 34 million.

So, I suppose you could look at it this way -
There's a 1/8 chance of getting HHT. So give me 8 guys. 1 gets $4, another gets $2 (the 1/4 guy), and everyone else gets $1.

There's a 1/16 chance of getting HHHT. So give me 16 guys. 1 gets $8 (the 16th guy), 1 gets $4 (the 8th guy), 2 get $2 (the 4th and 12th guys), and everyone else gets $1.

Admittedly, someone might bang out 34 Hs in a row and only be the 20th guy in line, but if everyone pays up front before you flip, you should be covered if you charged enough and there's absolutely no variance (insane, I know).

In an 8 man case (where 1 gets $4, 1 gets $2, and the rest get $1), you can charge $12 total - $1.50 a head - and break even.

In a 16 man case ($8/$4/$2/$2/$1...), you need $1.75 a head - and that's when you realize this is a really shitty game for casinos. Because, the more people who play, the more you have to charge per player if you really plan to break even.

Since casinos make their business in volume and this game is more expensive the more players you add, I think I know why casinos don't offer it.

So, my logic sucks. In an 8-flip case, it would be 4($1), 2($2), 2($4) = $16 or $2 / flip. In a 16-flip case, it would be 8($1), 4($2), 2($4), 2($8) = $40 or $2.50 / flip.

The break even point would be $0.25 times the maximum number of coin tosses allowed in a row.

That doesn't make any sense. You get a dollar bare minimum. So in the base case (1 flip - you always win $1) and the first iterative case (2 flips - 50/50 you win $1 or $2), you'd have to collect $1 and $1.50 respectively.

But yeah, in the infinite case, the casino will always lose at some point in the future.

@18

"But yeah, in the infinite case, the casino will always lose at some point in the future."

Only if they are playing against a bigger casino. What is left out of this scenario (and is much more important than the casino's bank size) is the size of the player's bank. Gambler's ruin guarantees that the player will go broke before the casino does, given that the price of the game is set high enough that he runs out of money to play before he hits the big win.

Only if they are playing against a bigger casino. What is left out of this scenario (and is much more important than the casino's bank size) is the size of the player's bank.

Not really. Players can't bid to flip. You pay a flat fee and win what you win. So one player flipping a thousand times is no different to the casino than a thousand players flipping once.

That's why the game is a dud. The casino is expected to pay out progressively more money the more people that play the game.

Unlike in craps or roulette, where players can vary their winnings based on how much they wager and how much risk they take, the wager and the risk in this game is constant. The only way to break the bank is to go on a head-flipping hot streak. But the more people who play, the more likely that is to happen. With no upper bound on winnings, and exponentially impressive payouts, increased traffic at the table means an increased chance to win big.

You expect the 1/2^n player to win twice as much as the 1/2^(n-1) player. And you can't draw in losers' income fast enough to outpace the winnings.
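One way to see the effect is simulation. A rough sketch in Python (seed chosen arbitrarily): as the table gets busier, the average payout per player tends to creep upward, roughly like log2(players)/2, though any single run is noisy.

```python
import random

random.seed(20091223)  # arbitrary seed, for repeatability

def play_once():
    # One round: flip until heads; prize starts at $1 and doubles on tails.
    prize = 1
    while random.random() < 0.5:
        prize *= 2
    return prize

# Average winnings per player for increasingly busy tables.
for players in (100, 10_000, 1_000_000):
    avg = sum(play_once() for _ in range(players)) / players
    print(players, round(avg, 2))
```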

"who cares" at #14 is on the right track, but I think he or she is using slightly different rules than in the original problem: as I read the problem, the player can never walk away with $0. That's unusual for a casino game, but whatever.

Repeating the rules:
1. Flip a fair coin. (We may assume it's perfectly fair.)
2. If heads, the game ends and you win $N, where initially N = 1.
3. If tails, double N and repeat from step 1.

I'm not sure these are completely unambiguous (it's not obvious to me how the process terminates if the bank has finite finances) but I would interpret them as following:

If we terminate after 1 throw, I'd interpret the payoff scheme as follows:
H : $1
T : $2
expectation value $1.50

If we terminate after 2 throws, I'd interpret the payoff scheme as follows:
H- : $1
TH : $2
TT : $4
expectation value $2

If we terminate after 3 throws, I'd interpret the payoff scheme as:
H-- : $1
TH- : $2
TTH : $4
TTT : $8
expectation value $2.50

Without doing the actual math, the breakeven cost for playing is $(1+0.5*n), where n is the maximum number of throws.
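Doing the actual math in Python (a sketch following the payout tables above, where the all-tails sequence pays double the last heads prize) confirms the $(1+0.5*n) pattern:

```python
def breakeven(n):
    # Win on throw k (first heads) pays $2^(k-1), probability 1/2^k;
    # surviving all n throws (all tails) pays $2^n, probability 1/2^n.
    ev = sum(2 ** (k - 1) * 0.5 ** k for k in range(1, n + 1))
    return ev + 2 ** n * 0.5 ** n

for n in (1, 2, 3, 10):
    print(n, breakeven(n))  # 1.5, 2.0, 2.5, 6.0
```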

Personally, if n was very large I wouldn't be willing to play even if the cost was less than the above amount. But that's not math, that's weird human stuff like the nonlinear utility of money, the idea that I'm willing to pay to reduce risk, the fact that I can only play a finite number of times, etc. etc.

By Anonymous Coward (not verified) on 03 Jan 2010 #permalink