Wisely Using Your Advantage

When I was a little kid I used to take a pair of dice and throw them repeatedly. At each throw I'd fill in a box for the corresponding number on some graph paper, essentially "racing" the numbers against each other. I suppose for that reason I've always been fascinated not just by probabilities, but by the convergence of repeated trials to the limiting "probabilities." Which explains not just why I'm an uber geekazoid, but also why I was quite shocked today when I Googled "gambler's ruin" and found that the intertubes returned only about 16,000 hits ("card counting," by the way, returns about 845,000 hits). Gambler's ruin is one of my favorite basic probability exercises (and a reason why many a poor soul, even with an advantage, ends up losing their money.)

What is gambler's ruin? Suppose you have access to a game in which you have a slight advantage. We codify this advantage by postulating that you win the game a certain fraction of the time, which we call p (having an advantage means p is greater than one half). In this game you can bet a dollar on each run of the game: if you win, you gain a dollar, and if you lose, well, of course you lose your dollar. Now, starting with a bankroll of D dollars, you might be interested in the probability that you will run out of money before your bankroll grows to T dollars. In other words, what is your probability of ruin, given a starting bankroll of D dollars, an advantage of p, and a target of T dollars?
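If you'd rather see this than derive it, a quick Monte Carlo sketch (Python; the function name and defaults are my own, not from the post) estimates the ruin probability by simply playing the game many times:

```python
import random

def estimate_ruin(p, D, T, trials=100_000):
    """Estimate the probability of ruin for a gambler who wins each
    $1 bet with probability p, starts with D dollars, and stops at
    either 0 dollars (ruin) or T dollars (target)."""
    ruined = 0
    for _ in range(trials):
        bankroll = D
        while 0 < bankroll < T:
            bankroll += 1 if random.random() < p else -1
        if bankroll == 0:
            ruined += 1
    return ruined / trials

# With p = 0.55, D = 10, T = 20 this should land near 0.118,
# matching the exact formula derived below.
print(estimate_ruin(0.55, 10, 20))
```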

The way to calculate this is kind of cute. We want to calculate Pr(Ruin starting with x dollars). But this is equal to the probability of winning the first game times the probability of ruin starting with x+1 dollars, plus the probability of losing the first game times the probability of ruin starting with x-1 dollars. In other words, Pr(Ruin starting with x dollars) = Pr(Ruin starting with x+1 dollars | won first game) Pr(won first game) + Pr(Ruin starting with x-1 dollars | lost first game) Pr(lost first game). To simplify notation, let Pr(Ruin starting with x dollars) be denoted P(x). Then we can express the above equation as P(x) = P(x+1) Pr(won first game) + P(x-1) Pr(lost first game). But we know the probabilities of winning and losing that first game (they are just p and 1-p, respectively). Thus we have P(x) = p P(x+1) + (1-p) P(x-1).

Note that P(0)=1 (since if we start with no money we are already ruined) and P(T)=0 (since if we have our target amount of money we are never ruined). The equation P(x) = p P(x+1) + (1-p) P(x-1) is a difference equation, and as such there are a couple of ways to solve it. We will choose the method which is the bane of students everywhere: the method of judicious guessing. In particular, suppose that P(x) = k^x for some unknown k. If this solves the difference equation, we must have k^x = p k^{x+1} + (1-p) k^{x-1}. Canceling out some k's allows one to write this as k = p k^2 + (1-p). This quadratic equation has two solutions for k: k_+ = 1 and k_- = (1-p)/p (we will assume that p does not equal one half, i.e. someone has an advantage in the game). Call this latter solution k = k_- = (1-p)/p. From this we can guess that a generic solution to the difference equation is P(x) = A 1^x + B k^x for some yet-to-be-determined A and B. Recalling that P(0)=1 we obtain A + B = 1, and further using P(T)=0 we obtain A + B k^T = 0. Thus B = 1/(1-k^T) and A = -k^T/(1-k^T). Putting this all together we obtain P(x) = [k^x - k^T]/[1 - k^T]. Since we start with D dollars, the formula for gambler's ruin is thus P(D) = [k^D - k^T]/[1 - k^T].
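The closed form is one line of code. A minimal sketch, assuming p is not equal to 1/2 (the naming is mine):

```python
def ruin_probability(p, D, T):
    """Exact probability of ruin: start with D dollars, target T dollars,
    win each $1 bet with probability p (p != 1/2).
    Implements P(D) = (k^D - k^T) / (1 - k^T) with k = (1-p)/p."""
    k = (1 - p) / p
    return (k**D - k**T) / (1 - k**T)

print(ruin_probability(0.55, 10, 20))  # ~0.1185
```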

It's useful to try some sample values in the formula to get an idea of what it is telling you. Suppose that your goal is to double your money, i.e. T = 2D. Then a handy table shows your probability of ruin as a function of your advantage p and the size of your bankroll D:

```
D \ p    0.501           0.505           0.51            0.55            0.6
1        0.499           0.495           0.49            0.45            0.4
5        0.4950          0.4750          0.4502          0.2683          0.1164
10       0.4900          0.4502          0.4013          0.1185          0.0170
100      0.4013          0.1192          0.0180          1.927 x 10^-9   2.460 x 10^-18
1000     0.0180          2.059 x 10^-9   4.226 x 10^-18  7.077 x 10^-88  8.105 x 10^-177
```

From this table you can conclude a few things. Even if you have a fairly considerable advantage, say p = 0.55, if you start with 10 dollars you will be ruined before reaching 20 dollars with about 11.8 percent probability. As a gambler this really means that you need to pay attention not just to your odds of winning, but also to your bankroll: betting a substantial portion of your bankroll is a good way to go broke even when you've got a considerable advantage!

Suppose that you want a chance of ruin of one in a thousand. What size bankroll should you use, given an advantage p and a desire for an unlimited upside? (Letting T go to infinity kills the k^T terms, since k < 1, so the probability of ruin is simply k^D.) A rough, quick way to calculate the needed bankroll is 7p/(2p-1). So if p = 0.51, this is about 178 (the real number is about 172.)
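As a check, here is a sketch comparing the rule of thumb to the exact bankroll obtained by solving k^D = 1/1000 (the function names are my own):

```python
from math import log

def bankroll_exact(p, ruin=0.001):
    """Bankroll D with Pr(ruin) = ruin when the target T -> infinity:
    the ruin probability is then k^D, so solve k^D = ruin for D."""
    k = (1 - p) / p
    return log(ruin) / log(k)

def bankroll_rule_of_thumb(p):
    """The post's quick approximation for a one-in-a-thousand ruin chance."""
    return 7 * p / (2 * p - 1)

print(bankroll_exact(0.51))          # ~172.7
print(bankroll_rule_of_thumb(0.51))  # ~178.5
```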

Even more humbling is to use the above formula for gambler's ruin when you are at a disadvantage. For example if you win 49 percent of the time, and want to double your initial 100 dollars, the probability of ruin is 98.2 percent. Thinking about this you might realize why casinos make such good money even with games whose odds are only slightly stacked in their favor.
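A quick check of that figure, plugging p = 0.49, D = 100, T = 200 into the closed form:

```python
# Casino-side odds: win 49% of the time, start with $100, target $200.
p, D, T = 0.49, 100, 200
k = (1 - p) / p
print((k**D - k**T) / (1 - k**T))  # ~0.982, i.e. ruin 98.2% of the time
```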


What caught me is that you ran probability tests as a child.

By Paul Johnson (not verified) on 03 Sep 2008

There's a general criterion for deciding how much to bet when you have an advantage, called the Kelly criterion. It has wide applicability; for example, it can be used in the share market. If you can gauge the odds of return accurately (a big if!) then you can also tell how much of your bankroll you should invest.

Basically, you invest so as to maximize the expectation for the log of the bankroll. You can also think of it as maximizing the fraction by which your bank will increase. In games where you can invest any fraction of your bankroll, and play indefinitely, this strategy is optimal, in the following sense. Suppose you are competing with another player, who has some other investment strategy, playing the same games, but betting some different fraction of the bank. The probability that the Kelly investor has the larger bank approaches unity as the time increases.

Example. You mention a game of double or nothing where you have a probability p = 0.55 chance of winning. If you bet a fraction f of your bank, then your expected log is p*log(1+f) + (1-p)*log(1-f). This has a maximum at f = 2p-1.

Thus when p = 0.55, you should invest 10% of your bank for the best possible growth of your bankroll. Less than this, and you are not taking enough advantage of your edge. More than this, and your losses drop your bank too far for you to have enough remaining to take advantage of subsequent opportunities to recover.
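For anyone who wants to verify the commenter's calculus numerically, a small sketch (my naming) scans bet fractions and confirms the peak at f = 2p - 1:

```python
from math import log

def expected_log_growth(p, f):
    """Expected log-growth per round when betting fraction f of the bank
    on a double-or-nothing game won with probability p."""
    return p * log(1 + f) + (1 - p) * log(1 - f)

p = 0.55
fractions = [f / 1000 for f in range(1, 1000)]
best = max(fractions, key=lambda f: expected_log_growth(p, f))
print(best)  # ~0.1, i.e. the Kelly fraction 2p - 1
```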

A "risk-averse" investor will invest a bit less than the Kelly optimum, to avoid short term down turns. But you should never invest more than the Kelly criterion. That way lies ruin.

Dave,

Great post.

I used to also run probability tests when I was a kid... still do. However, mine were so I could find an edge while playing in illegal craps and poker games during my misspent youth. That being said, I find gambler's ruin to be very useful when applied to analyzing a trade.

Jeff

Many people who go to a casino do so with a specific affordable 'float' and the hope that they might increase it, but the expectation that they will play until it is all lost. To maximise their time at the table they will place smallish bets. Unfortunately this strategy is the worst possible, and there is effectively no chance of increasing your money. In the larger sense it is not a gamble at all, but a payment for the 'pleasure' of time at the table.
Assuming that p < .5 (which it will be in a casino), and that you insist on gambling all of your float, your best strategy is to bet the whole lot in one go and then leave!
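The closed form from the post bears this out. For example, with p = 0.49 and a $100 float aiming for $200, dollar-at-a-time play succeeds only about 1.8 percent of the time, while one all-in bet succeeds with probability p itself:

```python
# Timid play (bet $1 at a time) vs. bold play (bet the whole float once).
p, D, T = 0.49, 100, 200
k = (1 - p) / p
print(1 - (k**D - k**T) / (1 - k**T))  # ~0.018: chance of doubling via $1 bets
print(p)                               # 0.49:   chance of doubling via one all-in bet
```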

"What caught me is that you ran probability tests as a child."

Not much to do in a small rural town! Well, not much if you don't include hours spent playing games (probability and competitiveness), stacking wood (patience and hard work), and doing long division (math and concentration). Seriously, that last one: my father used to take me to work with him on Saturdays (work ethic) and would give me long division to do while he was working.

Duae Quartunciae, thanks for that comment. Now you've got me wasting my time thinking about why this is optimal! :)

Running probability tests? You were a nerd!

I only did two things as a kid, but they were of course sensible. One was to take random walks through the neighbourhood (by throwing coins). Now I tell my students about it to motivate the exercise of making equiprobable choices among 3 options using coins. The other thing was to track the winning numbers of Loto; my Atari program would then suggest a bet using the numbers least drawn so far. I also tell the students about that one, in this case to highlight misunderstandings of the Law of Large Numbers and independence. I wonder what the whispers and strange looks in class are about...

By dileffante (not verified) on 04 Sep 2008
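The coin trick dileffante mentions is usually done by rejection sampling: two flips give four equally likely outcomes, three are mapped to the options, and the fourth triggers a re-flip. A minimal sketch (my implementation, not the commenter's):

```python
import random

def choose_among_three():
    """Pick uniformly from {0, 1, 2} using only fair coin flips:
    two flips give four equally likely outcomes; map three of them
    to the options and re-flip on the fourth (rejection sampling)."""
    while True:
        outcome = 2 * random.randint(0, 1) + random.randint(0, 1)  # 0, 1, 2, or 3
        if outcome < 3:
            return outcome
```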