The Pig in a Poke Paradox

The 1985 Amy Heckerling film National Lampoon's European Vacation opens with the Griswold family competing for cash and prizes on the fictitious game show Pig in a Poke. The Griswolds, having already won some impressive prizes, are presented with a choice: walk away with their winnings or "be a pig" and vie for the top prize. Clark W. Griswold (Chevy Chase) decides on the family's behalf to be a pig. The family goes on to accidentally win the top prize, which turns out to be an all-expenses-paid tour of Europe and hilarity, as the saying goes, ensues.

The scene pokes fun at a number of clichés found in American television game shows. There's the pedantry and self-satisfaction of some Jeopardy! contestants. There's the creepiness of Richard Dawson's insistence on kissing each female contestant on Family Feud. Then there's the dilemma over whether the Griswolds will quit while they're ahead or "be a pig". In game show after game show—Wheel of Fortune, Press Your Luck, Who Wants to Be a Millionaire?, Deal or No Deal—we see this same moment. A contestant is given the chance to consolidate his or her winnings and terminate gameplay, or risk all or a significant portion of those winnings to continue in pursuit of a larger prize. It's a well that game show creators return to because viewers are drawn into the drama. We celebrate those rare-but-not-too-rare moments when the contestant wins the top prize, but more often we indulge in the schadenfreude of seeing contestants' greed catch up with them.

The contestants' comeuppance has less to do with the universe dispensing justice than it does with mathematical inevitability. An experiment with two possible random outcomes is known in mathematics as a Bernoulli trial. The two outcomes can be equally likely, as with a coin toss, or unequally likely, as with a roulette bet on green, which loses far more often than it wins. As the number of Bernoulli trials increases, the probability of one particular outcome occurring at least once increases. A member of a B-17 crew during the Second World War might consider a 93% probability of surviving a mission to be not too bad considering there are people shooting at his plane, but the probability of completing a 25-mission tour of duty would be 0.93^25, or about 16%. As the number of Bernoulli trials approaches infinity, the probability of one particular outcome occurring at least once approaches one. Returning from the life-and-death reality of bomber crews to the more entertaining world of game shows, a contestant who accepts enough opportunities to risk his or her winnings on a bigger payout will eventually lose.
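
For the arithmetic-minded, here's a quick Python sketch of that calculation. The function name and the printed mission counts are just stand-ins I chose to mirror the bomber-crew example above; nothing here comes from the film or the game shows.

```python
# Chance of surviving all n independent missions when each mission is
# survived with probability p (a run of identical Bernoulli trials).
def survive_all(p: float, n: int) -> float:
    return p ** n

# One mission at 93% doesn't sound so bad...
print(f"1 mission:   {survive_all(0.93, 1):.0%}")   # 93%
# ...but a 25-mission tour works out to about 16%.
print(f"25 missions: {survive_all(0.93, 25):.1%}")  # roughly 16.3%
```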

Now let's consider a hypothetical game show where a contestant draws a card from a shuffled deck. If the contestant draws any card other than the four of clubs, he or she wins $1,000. The contestant is then given a choice: place the drawn card back in the deck, reshuffle, and draw again, or stop play. Each time the contestant draws a card other than the four of clubs, another $1,000 is added to the accumulated winnings, but if the four of clubs is ever drawn, the cash award is reduced to zero and the game ends. After each draw the contestant faces the same choice: play another draw for a chance to add another $1,000, or end the game and keep the winnings accumulated so far. For each draw, the probability of winning is 51/52, or about 98%, but over an infinite number of draws, the probability of never losing is zero. There are any number of reasonable stopping strategies. The contestant might set a target amount of winnings: a $1,000 prize might not be worth the trip to the television studio, but a $10,000 prize, which has an 82% probability of being achieved, might be a tidy little payday. Or the contestant might choose to accept no worse than a 50% chance of walking away a winner, which allows for at most 35 draws. What would never be a sensible strategy is to keep playing forever; that strategy is guaranteed to lose.
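
If you'd rather check those numbers than take my word for them, here's a short Python sketch. The closed-form part is exact; the small Monte Carlo run is only a sanity check, and the 10-draw stopping target is just the $10,000 example from the paragraph above.

```python
import random

P_SAFE = 51 / 52  # any card but the four of clubs, about 98% per draw

def p_survive(draws: int) -> float:
    """Closed-form chance of surviving the given number of consecutive draws."""
    return P_SAFE ** draws

print(f"10 draws ($10,000): {p_survive(10):.1%}")  # about 82%
print(f"35 draws:           {p_survive(35):.1%}")  # just over 50%
print(f"36 draws:           {p_survive(36):.1%}")  # just under 50%

def simulate(target_draws: int, games: int = 100_000) -> float:
    """Estimate how often a contestant who stops after target_draws
    safe draws actually walks away with the money."""
    wins = sum(
        all(random.random() < P_SAFE for _ in range(target_draws))
        for _ in range(games)
    )
    return wins / games

print(f"Simulated 10-draw strategy: {simulate(10):.1%}")
```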

But here's the paradox. Neither the outcomes of previous trials nor the number of trials already played has any bearing on the outcome of the next one. To suggest otherwise would be to invoke the Gambler's Fallacy. Random means random. There is no cosmic force balancing things out; the law of averages only applies at infinity. At each decision point, a completely rational choice would be to continue gameplay, accepting a 2% risk of losing everything in exchange for a 98% likelihood of increasing the winnings, but making that rational decision trial after trial will, with mathematical certainty, ultimately lead to the player losing.
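
A few lines of Python make the tension concrete: the chance of busting on the very next draw never changes, while the chance of having survived every draw so far keeps collapsing toward zero. The particular draw counts below are arbitrary illustrations I picked, not anything from the game.

```python
P_SAFE = 51 / 52  # probability of a safe draw, identical on every trial

for n in (1, 10, 35, 100, 500):
    print(f"after {n:>3} draws: survived them all = {P_SAFE ** n:.4%}; "
          f"bust on the next single draw = {1 - P_SAFE:.2%}")
```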

I looked around and couldn't find a description of this paradox, or a name for it. If any math majors or math-adjacent majors would like to chime in, please leave a comment. Otherwise, I'm claiming the naming rights. I'm calling it the Pig in a Poke Paradox.
