When I was a little kid I used to take a pair of dice and throw them repeatedly. At each throw I’d fill in a box for the corresponding number on some graph paper, essentially “racing” the numbers against each other. I suppose for that reason I’ve always been fascinated not just by probabilities, but also by the convergence of repeated trials to the limiting “probabilities.” Which explains not just why I’m an uber geekazoid, but also why I was quite shocked today when I Googled “gambler’s ruin” and found that the intertubes returned only about 16,000 hits (“card counting,” by the way, returns about 845,000 hits.) Gambler’s ruin is one of my favorite basic probability exercises (and a reason why many a poor soul, even with an advantage, ends up losing their money.)

What is gambler’s ruin? Suppose you have access to a game in which you have a slight advantage. We codify this advantage by postulating that you win the game a certain percent of the time, which we call p (having an advantage means p is greater than one half.) In this game, you can bet a dollar on each run of the game, and if you win you get another dollar. If you lose, well, of course you lose your dollar. Now, starting with a bankroll of D dollars, you might be interested in figuring out the probability that you will run out of money before your bankroll expands to T dollars. In other words, what is your probability of ruin, given a starting bankroll of D dollars, an advantage of p, and a target of T dollars?

The way to calculate this is kind of cute. We want to calculate Pr(Ruin starting with x dollars). But this is equal to the probability of winning the first game times the probability of ruin starting with x+1 dollars, plus the probability of losing the first game times the probability of ruin starting with x-1 dollars. In other words, Pr(Ruin starting with x dollars) = Pr(Ruin starting with x+1 dollars | won first game) Pr(won first game) + Pr(Ruin starting with x-1 dollars | lost first game) Pr(lost first game). To simplify notation, let Pr(Ruin starting with x dollars) be denoted by P(x). Then we can express the above equation as P(x) = P(x+1) Pr(won first game) + P(x-1) Pr(lost first game). But we know the probabilities of winning and losing that first game (they are just p and 1-p, respectively.) Thus we have P(x) = p P(x+1) + (1-p) P(x-1).
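Before solving this recurrence by hand, it’s worth noting that the recurrence alone, plus the obvious boundary conditions (starting broke means you’re already ruined, hitting the target means you’re safe), pins down the whole answer. Here’s a small Python sketch of my own (the function names are made up, not standard) that solves the difference equation numerically by exploiting its linearity: the value at T depends affinely on the unknown P(1), so two forward sweeps determine it.

```python
def ruin_probs(p, T):
    """Solve P(x) = p*P(x+1) + (1-p)*P(x-1) with the boundary
    conditions P(0) = 1 (broke = ruined) and P(T) = 0 (target hit),
    without guessing the closed form."""
    def sweep(P1):
        # Rearrange the recurrence to step forward:
        #   P(x+1) = (P(x) - (1-p)*P(x-1)) / p
        P = [1.0, P1]
        for x in range(1, T):
            P.append((P[x] - (1 - p) * P[x - 1]) / p)
        return P
    # P(T) is an affine function of the guess P(1); sample it twice.
    f0, f1 = sweep(0.0)[T], sweep(1.0)[T]
    P1 = -f0 / (f1 - f0)        # choose P(1) so that P(T) = 0
    return sweep(P1)
```

Calling `ruin_probs(0.55, 20)` gives the full list of ruin probabilities for bankrolls 0 through 20, and the entries agree with the closed-form formula derived below to floating-point accuracy.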

Note that P(0)=1 (since if we start with no money we are already ruined) and P(T)=0 (since if we have our target amount of money then we are never ruined.) The equation P(x) = p P(x+1) + (1-p) P(x-1) is a difference equation, and as such there are a couple of ways to solve it. We will choose the method which is the bane of students everywhere: the method of judicious guessing. In particular, suppose that P(x) = k^x for some unknown k. If this solves the difference equation, we must have k^x = p k^{x+1} + (1-p) k^{x-1}. Canceling out some k’s allows one to write this as k = p k^2 + (1-p). This quadratic equation has two solutions for k: k_+ = 1 and k_- = (1-p)/p (we will assume that p does not equal one half, i.e. someone has an advantage in the game, so these roots are distinct.) Call this latter solution k = k_- = (1-p)/p. From this we can guess that a generic solution to the difference equation is P(x) = A 1^x + B k^x for some yet to be determined A and B. Recalling that P(0)=1, we obtain A+B=1, and further using P(T)=0 we obtain A + B k^T = 0. Thus B = 1/(1-k^T) and A = -k^T/(1-k^T). Putting this all together we obtain P(x) = [k^x - k^T]/[1-k^T]. Since we start with D dollars, the formula for gambler’s ruin is thus P(D) = [k^D - k^T]/[1-k^T].
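The closed-form formula is a one-liner to code up. Here’s a minimal Python sketch (the function name is my own invention):

```python
def ruin_probability(p, D, T):
    """Probability of going broke before reaching T dollars, starting
    from D dollars and winning each $1 bet with probability p.
    Assumes p != 0.5."""
    k = (1 - p) / p          # the non-trivial root of k = p*k^2 + (1-p)
    return (k**D - k**T) / (1 - k**T)
```

Note the boundary cases come out right automatically: `ruin_probability(p, 0, T)` is 1 and `ruin_probability(p, T, T)` is 0.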

It’s useful to try some sample values in the formula to get an idea of what it is telling you. Suppose that your goal is to double your money, i.e. T = 2D. Then a handy table to assemble gives your probability of ruin as a function of your advantage p and the size of your bankroll D:

D \ p | 0.501 | 0.505 | 0.51 | 0.55 | 0.6 |
---|---|---|---|---|---|
1 | 0.499 | 0.495 | 0.49 | 0.45 | 0.4 |
5 | 0.4950 | 0.4750 | 0.4502 | 0.2683 | 0.1164 |
10 | 0.4900 | 0.4502 | 0.4013 | 0.1185 | 0.0170 |
100 | 0.4013 | 0.1192 | 0.0180 | 1.927 x 10^{-9} | 2.460 x 10^{-18} |
1000 | 0.0180 | 2.059 x 10^{-9} | 4.226 x 10^{-18} | 7.077 x 10^{-88} | 8.105 x 10^{-177} |
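If you don’t trust the algebra, you can spot-check a table entry by brute force. This Monte Carlo sketch (again Python, with names of my own) just plays the game over and over and counts how often you go broke:

```python
import random

def simulate_ruin(p, D, T, trials=20000, seed=1):
    """Monte Carlo estimate of the ruin probability: make $1 bets
    until the bankroll hits 0 (ruin) or T (target)."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        bankroll = D
        while 0 < bankroll < T:
            bankroll += 1 if rng.random() < p else -1
        if bankroll == 0:
            ruined += 1
    return ruined / trials
```

With 20,000 trials, `simulate_ruin(0.55, 10, 20)` lands within a percent or so of the table’s 0.1185 (the tiny entries in the bottom rows, of course, are hopeless to estimate this way).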

From this table you can conclude a few things. Even if you have a fairly considerable advantage, say p=0.55, if you start with 10 dollars you will be ruined before reaching 20 dollars with about 11.8 percent probability. As a gambler, this means you need to pay attention not just to your odds of winning, but also to your bankroll: betting a substantial portion of your bankroll is a good way to go broke even when you’ve got a considerable advantage!

Suppose that you want a chance of ruin of one in a thousand. What size bankroll should you use, given an advantage p and a desire for an unlimited upside? A rough, quick way to calculate this is 7p/(2p-1). So if p=0.51, then this is about 178 (the real number is 172.)
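Both the exact calculation and the rule of thumb are one-liners. The exact version uses the fact that with no upper target, k^T goes to 0 as T grows (since k < 1 when p > 1/2), so the ruin probability is just k^D; setting k^D equal to the ruin chance and solving for D gives the bankroll. A sketch (function names are mine):

```python
from math import log

def bankroll_needed(p, ruin=0.001):
    """Exact bankroll (in $1 betting units) so that, with an unlimited
    upside (T -> infinity), the chance of ever going broke is `ruin`.
    Solves k**D == ruin for D, where k = (1-p)/p."""
    k = (1 - p) / p
    return log(ruin) / log(k)

def bankroll_rough(p):
    """The quick rule of thumb from the text: 7p/(2p-1)."""
    return 7 * p / (2 * p - 1)
```

For p=0.51, `bankroll_needed` gives about 172.7 and `bankroll_rough` gives 178.5, matching the numbers above.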

Even more humbling is to use the above formula for gambler’s ruin when you are at a disadvantage. For example, if you win 49 percent of the time and want to double your initial 100 dollars, the probability of ruin is 98.2 percent. Thinking about this, you might realize why casinos make such good money even with games whose odds are only slightly stacked in their favor.
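Plugging the numbers in makes the point concrete; the whole check is three lines of Python:

```python
# At a disadvantage the same formula is brutal: win 49% of the time
# and try to double a $100 bankroll.
p, D, T = 0.49, 100, 200
k = (1 - p) / p                      # now k > 1, since p < 1/2
ruin = (k**D - k**T) / (1 - k**T)    # about 0.982: ruin is nearly certain
```

Flip it around and that 98.2 percent is the casino’s chance of taking your hundred dollars before you take theirs.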