A Great Mathematics Paradox

When you flip a fair coin, each flip has exactly a 50/50 chance of coming up heads. No matter how many times you flip it, the odds remain 50/50 on every flip. But we all know that sooner or later it has to come up tails. If you were betting on each toss, starting with a one-dollar bet on the first flip and doubling the bet each time you lose, you would eventually win and could walk away with a profit or start again with a new low bet, provided you don't run out of money before the losing streak ends, and provided someone is willing to take those bets in the first place. That is why casinos have a maximum bet at their blackjack tables: the limit prevents you from simply doubling your bet after every loss and eventually walking away with a sizeable win when your losing streak ends.

Mathematicians call this the “gambler’s fallacy.” Gamblers sometimes base their betting strategy on the belief that streaks always end sooner or later. Remember, you can eventually win, but only if you have enough money to double your bet with each hand of blackjack, because each new bet has to win back all your previous losses. A long streak can put an awful lot of cash on the table. Consider this: you start with a one-dollar bet and lose the round, so you now put two dollars on the table; having lost again, you must put out four dollars, and as the losing streak continues, you double your previous bet each hand. By just the tenth hand, you’ll have $512 on the table. If they let you continue and you keep losing, you’ll have $524,288 out there by the 20th hand, and when your losing streak hits the 30th hand, you’ll have just under 537 million dollars bet on a single hand. And you started with one dollar. That’s why they give you free drinks in casinos. (See the chart below for how the 30-day progression goes, beginning with one penny.)
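The doubling arithmetic above is easy to check with a few lines of code. This is a minimal sketch; the hand numbers and the one-dollar starting bet come straight from the example:

```python
def bet_on_hand(n):
    """Bet on hand n of a losing streak, doubling from a $1 start:
    hand 1 -> $1, hand 2 -> $2, ..., hand n -> 2**(n-1) dollars."""
    return 2 ** (n - 1)

def total_lost_through_hand(n):
    """Total wagered across hands 1..n is 2**n - 1 dollars, which is
    why each new bet only just wins back the previous losses plus $1."""
    return 2 ** n - 1

for hand in (10, 20, 30):
    print(f"Hand {hand}: bet ${bet_on_hand(hand):,}, "
          f"total wagered so far ${total_lost_through_hand(hand):,}")
```

Note that the bet on the table is only half the story: by the 30th hand you have also already lost every earlier bet, so the total at risk is nearly twice the final wager.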

Perhaps this erroneous thinking is just a mind-bender like this story: Three men enter a hotel and approach the clerk to pay for a room they will share for the night, which costs $30. Each man gives the clerk ten dollars for his share of the fee, and they go upstairs to the room. Moments later the clerk realizes he has overcharged them; the room should have cost $25, so he takes five one-dollar bills from the register and hands them to the bellhop to take upstairs to the guests. At the room the bellhop explains to the men what happened and tries to give them the five dollars, but instead they each take one dollar from him and tell him to keep the remaining two dollars as a tip. So let’s sum it up. Each man wound up spending nine dollars for the room, and 3 × 9 equals 27 dollars; add the bellhop’s two-dollar tip and you get 29 dollars. But recall that they had actually paid 30 dollars downstairs, so what happened to the other dollar?
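If you want to check the books yourself, a few lines of code make the accounting explicit. Fair warning: the comments give away the trick, so stop here if you would rather work it out on your own.

```python
paid = 3 * 10     # $30 handed to the clerk downstairs
refund = 3 * 1    # $1 handed back to each man
tip = 2           # kept by the bellhop

net_paid = paid - refund   # $27: what the men are actually out
room = 25                  # what the hotel kept

# The $27 ALREADY contains the $2 tip ($25 room + $2 tip),
# so the riddle's "27 + 2 = 29" counts the tip twice.
assert net_paid == room + tip

# The honest tally: hotel + bellhop + refunds = the original $30.
assert room + tip + refund == paid
print("All 30 dollars accounted for.")
```

Nothing is missing; the riddle simply adds the tip to a total that already includes it.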

Day 1: $.01
Day 2: $.02
Day 3: $.04
Day 4: $.08
Day 5: $.16
Day 6: $.32
Day 7: $.64
Day 8: $1.28
Day 9: $2.56
Day 10: $5.12
Day 11: $10.24
Day 12: $20.48
Day 13: $40.96
Day 14: $81.92
Day 15: $163.84
Day 16: $327.68
Day 17: $655.36
Day 18: $1,310.72
Day 19: $2,621.44
Day 20: $5,242.88
Day 21: $10,485.76
Day 22: $20,971.52
Day 23: $41,943.04
Day 24: $83,886.08
Day 25: $167,772.16
Day 26: $335,544.32
Day 27: $671,088.64
Day 28: $1,342,177.28
Day 29: $2,684,354.56
Day 30: $5,368,709.12

That’s one cent rising to over five million dollars in thirty days.
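The chart above can be reproduced in a few lines. This quick sketch counts in whole cents to avoid floating-point rounding, doubling once per day:

```python
cents = 1  # Day 1: one penny
for day in range(1, 31):
    print(f"Day {day}: ${cents / 100:,.2f}")
    cents *= 2  # double the amount for the next day
```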
