Can You Beat The Game Show?

Welcome to The Riddler. Every week, I offer up problems related to the things we hold dear around here: math, logic and probability. There are two types: Riddler Express for those of you who want something bite-size and Riddler Classic for those of you in the slow-puzzle movement. Submit a correct answer for either,1 and you may get a shoutout in next week’s column. If you need a hint or have a favorite puzzle collecting dust in your attic, find me on Twitter.

Riddler Express

From Dan Calistrate, a geometric puzzle of agricultural inheritance:

A farmer has three daughters. He is getting old and decides to split his 1-mile-by-1-mile farm equally among his daughters using fencing. What is the shortest length of fence he needs to divide his square farm into three sections of equal area?

Submit your answer

Riddler Classic

From Jared Bronski, our ace game-show-puzzle submitter, we head back to the bright lights and the big money:

You and three of your friends are on a game show. On stage is a sealed room, and in that room are four sealed, numbered boxes. Each box contains one of your names, and each name is in one box. You and your friends take turns entering the room alone and opening up to two boxes, with the aim of finding the box containing your name. Everyone enters exactly once. Your team can confer on a strategy before stepping on stage, but there is no communication allowed during the show — no player knows the outcome of another player’s trip into the room.

Your team wins if it’s ultimately revealed that everyone found the box containing his or her name and loses if any player failed to do so. Obviously, the odds of winning are no better than 50 percent because any single player has a 50 percent chance of finding his or her own name. If each person opens two boxes at random, the chance of winning is \((1/2)^4 = 1/16 = 6.25\) percent. Or to put it in technical terms: The chance of winning is not so great. Call this the naive strategy.
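As a sanity check on that arithmetic, here is a short illustrative script (not part of the puzzle) that enumerates every assignment of names to boxes and every combination of random picks, confirming the naive strategy's exact 1-in-16 odds:

```python
from fractions import Fraction
from itertools import combinations, permutations, product

def naive_win_probability(n=4, opens=2):
    """Exact win probability when each of n players independently opens
    `opens` boxes chosen uniformly at random, by brute-force enumeration."""
    choices = list(combinations(range(n), opens))
    wins = total = 0
    for perm in permutations(range(n)):        # perm[box] = name inside box
        for picks in product(choices, repeat=n):
            total += 1
            # The team wins only if every player's opened boxes include
            # the box holding that player's own name.
            if all(player in (perm[b] for b in picks[player])
                   for player in range(n)):
                wins += 1
    return Fraction(wins, total)

print(naive_win_probability())  # 1/16
```

Each player's two picks contain his or her name with probability 1/2, independently of everyone else's picks, so the product comes out to exactly 1/16.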

Your goal: Concoct a strategy that beats the naive strategy — one that gives the team a better chance of winning than 1/16.

Extra credit: Suppose there are 100 contestants and 100 boxes. Each player may open 50 boxes. The chance of winning by using the naive strategy is 1 in \(2^{100}\), or about 1 in \(1.2\times 10^{30}\). How much can you improve the team’s chances?

Submit your answer

Solution to last week’s Riddler Express

Congratulations to 👏 Ben Knox 👏 of Madison, Wisconsin, winner of last week’s Express puzzle!

Last week, Riddler Nation was asked to find the optimal strategy in a new version of rock-paper-scissors. Besides the usual three options, there was a fourth — double scissors — which is played by making a scissors with two fingers on each side (like a Vulcan salute). Double scissors defeat regular scissors, and just like regular scissors, they cut paper and are smashed by rock. The three traditional options interact just as they do in the standard game. A rock-paper-scissors-double scissors match is always played best two out of three. There is just one exception: If your opponent throws paper and you throw regular scissors, you immediately win the match regardless of the score. What was the optimal strategy at each possible score (0-0, 1-0, 0-1, 1-1)? And what was the probability of winning the match given a 1-0 lead?

In a traditional game of rock-paper-scissors, the only optimal strategy is to randomize equally between the three objects — to play each with probability 1/3. Any other strategy can be exploited by a savvy opponent. If you favored scissors as your go-to throw, for example, your opponent could always throw rock and would best you in the long run.

A similar argument applies to our modified game. But first, notice that it is never optimal for a player with one win to play regular scissors. With one win already in the bag, there is no benefit from the exceptional rule that a win with regular scissors ends the match automatically, and double scissors beats everything that regular scissors beats, plus regular scissors itself. In game theory parlance, double scissors dominates regular scissors. Therefore, if the score is tied 1-1, our modified game reduces to rock-paper-double scissors, and the optimal strategy is to play each with probability 1/3. That’s one piece of our solution done!

What about a match with a score of 1-0? Suppose you are in the lead and I am trailing. We both know that you’ll select from rock, paper and double scissors (because regular scissors is dominated) and I’ll select from rock, paper, scissors and double scissors. Let’s make a table showing all our possible plays and your probability of winning given those plays. Call your chances of winning with a 1-0 lead \(X\), which is what we’re searching for to solve this problem.

Solving for your next throw when you’re up 1-0

Rows show my throw, columns show your throw, and each entry is your probability of winning the match:

My throw           You: Rock   You: Paper   You: Double scissors
Rock               X           1            0.5
Paper              0.5         X            1
Scissors           1           0            1
Double scissors    1           0.5          X

For instance, if you play rock and I play double scissors, you smash my double scissors, take a 2-0 lead and win the match — so your probability of winning the match is 1. If you play rock and I play paper, I cover your rock and tie the match 1-1. Because we’re both playing optimally, a tied match must give us both a 50-50 chance of winning it all. If you play paper and I play scissors, I cut your paper and win the match automatically, thanks to the special rule — so your probability of winning the match is 0.

But there are some \(X\)s in that table that we don’t yet know! Those are the chances that you win when you play optimally with your 1-0 lead. Let’s say, in your optimal strategy, that you play rock with some probability \(R_1\), double scissors with probability \(DS_1\), and paper with probability \(P_1\). Whichever throw I make from my own optimal mix, playing against this strategy should deliver your optimal 1-0 win probability, \(X\). So, for example, if I play rock, we get an equation like this from the table above:

\begin{equation*}(R_1)(X)+(DS_1)(0.5)+(P_1)(1) = X\end{equation*}

If I throw rock: in the times you throw rock, you win with probability \(X\); in the times you throw double scissors, you win with probability 0.5; and in the times you throw paper, you win with probability 1. The same reasoning applies to my other throws, with one catch: the equality only needs to hold for throws that actually appear in my optimal mix, and, as we’ll see, I never throw double scissors when trailing. So the double scissors row gives an inequality, \((R_1)(1)+(DS_1)(X)+(P_1)(0.5) \geq X\), rather than an equation, while my scissors and paper throws give two more equations:

\begin{equation*}(R_1)(1)+(DS_1)(1)+(P_1)(0) = X\end{equation*}

\begin{equation*}(R_1)(0.5)+(DS_1)(1)+(P_1)(X) = X\end{equation*}

I am trying to solve a very similar problem, where I am playing rock with probability \(R_2\), double scissors with probability \(DS_2\), scissors with probability \(S_2\), and paper with probability \(P_2\). So if you play rock, for example, I know that:

\begin{equation*}(R_2)(X)+(DS_2)(1)+(S_2)(1)+(P_2)(0.5) = X\end{equation*}
The solution to this system gives \(X\approx 0.73\). It also tells us that you should play rock, double scissors and paper with probabilities 0.40, 0.33 and 0.27, respectively, and that I should play rock, double scissors, scissors and paper with probabilities 0.55, 0, 0.21 and 0.25, respectively. Three more pieces of our solution done!
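Those numbers can be recovered numerically. The sketch below is an illustration, not the submitter’s method: it uses only the indifference equations for the three throws I actually use when trailing (rock, scissors and paper), solves them for your mix at a trial value of \(X\), and bisects on \(X\) until the probabilities sum to 1:

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def leader_mix(X):
    """Your (R1, DS1, P1) that makes me indifferent among rock, scissors
    and paper, for a trial match-win probability X."""
    A = [[X,   0.5, 1.0],   # I throw rock:     R1*X + 0.5*DS1 + P1 = X
         [1.0, 1.0, 0.0],   # I throw scissors: R1 + DS1 = X
         [0.5, 1.0, X]]     # I throw paper:    0.5*R1 + DS1 + X*P1 = X
    return solve3(A, [X, X, X])

# The three probabilities must also sum to 1; bisect on X until they do.
lo, hi = 0.6, 0.9
for _ in range(60):
    X = (lo + hi) / 2
    if sum(leader_mix(X)) > 1:
        hi = X
    else:
        lo = X
X = (lo + hi) / 2
R1, DS1, P1 = leader_mix(X)
print(round(X, 2), round(R1, 2), round(DS1, 2), round(P1, 2))
```

The bisection converges to \(X \approx 0.727\) and a mix of roughly 0.40, 0.33 and 0.27, matching the numbers above.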

All that’s left is to figure out the strategy at the beginning of the game — score 0-0. We can make another table to help with that:

Solving for your next throw when you’re tied 0-0

Rows show my throw, columns show your throw, and each entry is your probability of winning the match:

My throw           You: Rock   You: Paper   You: Scissors   You: Double scissors
Rock               0.5         X            1 – X           1 – X
Paper              1 – X       0.5          1               X
Scissors           X           0            0.5             X
Double scissors    X           1 – X        1 – X           0.5

The solving process is the same as above, too, but I’ll spare you all the algebra. (Turns out a rock-paper-scissors problem isn’t very “Express” after all!) The optimal strategy that pops out of the math is that at score 0-0, we should both play rock, double scissors, scissors and paper with probabilities 0.52, 0, 0.24, 0.24, respectively. One interesting finding, noted by the puzzle’s submitter, Patrick Coate: No one should ever play double scissors before a win or regular scissors after a win.
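Here is a quick check of those 0-0 numbers. It is a standalone sketch, not the full algebra: it takes \(X \approx 0.7267\) from the 1-0 analysis as given and assumes, per the result above, that nobody throws double scissors at 0-0, so the mix just has to make the opponent indifferent among rock, paper and scissors with match value 0.5.

```python
X = 0.7267  # match-win probability with a 1-0 lead, from the 1-0 analysis

# Indifference among the opponent's rock, paper and scissors forces paper
# and scissors to be equally likely (p = s). The paper condition
# X*r + 0.5*p = 0.5, with r = 1 - 2*p, then pins down p.
p = (X - 0.5) / (2 * X - 0.5)    # paper
s = p                            # scissors
r = 1 - 2 * p                    # rock

# The opponent's match-win chances against this mix, one per throw:
e_rock     = 0.5 * r + (1 - X) * p + X * s
e_paper    = X * r + 0.5 * p                # scissors vs. paper auto-loses: 0
e_scissors = (1 - X) * r + p + 0.5 * s
e_double   = (1 - X) * r + X * (p + s)      # should come out below 0.5

print(round(r, 2), round(p, 2), round(s, 2))  # 0.52 0.24 0.24
```

Rock, paper and scissors all give the opponent exactly a 0.5 chance, while double scissors gives less, confirming that nobody should throw it at 0-0.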

It’s like the old saying goes: In the world of double scissors, the rock is king.

Solution to last week’s Riddler Classic

Congratulations to 👏 Michael Goss 👏 of San Jose, California, winner of last week’s Classic puzzle!

On the table in front of you last week were two coins. They looked and felt identical, but you knew one of them had been doctored. The fair coin came up heads half the time while the doctored coin came up heads 60 percent of the time. How many flips — you flipped both coins at once, one with each hand — did you need to give yourself a 95 percent chance of correctly identifying the doctored coin?

You would need 143 flips.

The flips of each coin follow a binomial distribution — the number of successes in a sequence of independent trials. Because the two coins’ flips are independent, the joint distribution is simply the product of two binomial probability mass functions. So the following expression is the probability of j heads for the fair coin and k heads for the biased coin after n flips of each coin:

\begin{equation*}{n\choose k} 0.6^k (1-0.6)^{n-k} {n\choose j}0.5^j (1-0.5)^{n-j}\end{equation*}

\begin{equation*}= (0.5)^n{n\choose k}{n\choose j}(0.6)^k (0.4)^{n-k}\end{equation*}

Using this expression for the probabilities of the individual heads-tails outcomes, we can calculate the probability that the coin with more heads is the biased coin. We simply sum the probabilities over all the cases in which \(k \leq n\) and \(k>j\).

\begin{equation*}(0.5)^n \sum_{k=0}^{n} \sum_{j=0}^{k-1} {n\choose k}{n\choose j}(0.6)^k (0.4)^{n-k}\end{equation*}

When \(n=143\), this probability is 95.01 percent.
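That figure is straightforward to reproduce. The following script (my own illustration of the strict-majority calculation above, not the paper’s code) sums the double series, accumulating the fair coin’s cumulative distribution as it goes:

```python
from math import comb

def p_biased_wins(n, q=0.6):
    """Probability the doctored coin shows strictly more heads than the
    fair coin after n flips of each."""
    half = 0.5 ** n
    fair_below = 0.0   # P(fair coin shows fewer than k heads), built up in the loop
    total = 0.0
    for k in range(n + 1):
        pk = comb(n, k) * q ** k * (1 - q) ** (n - k)  # biased coin: k heads
        total += pk * fair_below
        fair_below += comb(n, k) * half                # now includes j = k
    return total

print(round(p_biased_wins(143), 4))  # about 0.9501
```

At \(n=143\) the sum crosses the 95 percent threshold, in line with the solution above.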

A trio of authors posed just this problem in an economics paper earlier this year. The authors ran a survey asking finance professionals to estimate, without doing any math, how many flips it would take. The vast majority thought it would take fewer than 143. The median response was 40.

Hector Pefo charted how the number of necessary flips falls as the doctored coin becomes more doctored and its probability of landing heads rises.

Keep on flippin’, Riddler Nation. See you next week.

Want to submit a riddle?

Email me at


  1. Important small print: For you to be eligible, I need to receive your correct answer before 11:59 p.m. EDT on Sunday. Have a great weekend!

Oliver Roeder was a senior writer for FiveThirtyEight. He holds a Ph.D. in economics from the University of Texas at Austin, where he studied game theory and political competition.