Can You Flip The Magic Coin?

Welcome to The Riddler. Every week, I offer up problems related to the things we hold dear around here: math, logic and probability. Two puzzles are presented each week: the Riddler Express for those of you who want something bite-size and the Riddler Classic for those of you in the slow-puzzle movement. Submit a correct answer for either,1 and you may get a shoutout in next week’s column. If you need a hint or have a favorite puzzle collecting dust in your attic, find me on Twitter.

Riddler Express

I have a coin with a sun on the front and a moon on the back. I claim that on most days, it’s a fair coin, with a 50 percent chance of landing on either the sun or the moon.

But once a year, on the summer solstice, the coin absorbs the sun’s rays and exhibits a strange power: It always comes up the opposite side as the previous flip.

Of course, you are skeptical of my claim. You figure there’s a 1 percent chance that the coin is magical and a 99 percent chance that it’s just an ordinary fair coin. You then ask me to “prove” the coin is magical by flipping it some number of times.

How many successfully alternating coin flips will it take for you to think there’s a 99 percent chance the coin is magical (or, more likely, that I’ve rigged it in some way so it always alternates)?

The solution to this Riddler Express can be found in the following week’s column.

Riddler Classic

From Dean Ballard comes a riddle of radiant spheres and fatherhood, just in time for the summer solstice and Father’s Day:

King Auric adored his most prized possession: a set of perfect spheres of solid gold. There was one of each size, with diameters of 1 centimeter, 2 centimeters, 3 centimeters, and so on. Their brilliant beauty brought joy to his heart. After many years, he felt the time had finally come to pass the golden spheres down to the next generation — his three children.

He decided it was best to give each child precisely one-third of the total gold by weight, but he had a difficult time determining just how to do that. After some trial and error, he managed to divide his spheres into three groups of equal weight. He was further amused when he realized that his collection contained the minimum number of spheres needed for this division. How many golden spheres did King Auric have?

Extra credit: How many spheres would the king have needed to be able to divide his collection among other numbers of children: two, four, five, six or even more?

The solution to this Riddler Classic can be found in the following week’s column.

Solution to last week’s Riddler Express

Congratulations to 👏 Christopher R. Green 👏 of Oxford, Mississippi, winner of last week’s Riddler Express.

Last week, you tried your hand at a technique for rolling dice called “bowling,” in which you placed your index finger and thumb on two opposite sides of the die and rolled it along the table. When done correctly, the die never landed on the two faces you were holding, leaving you with a 25 percent chance of landing on each of the remaining four faces.

You had to apply this technique to optimize your chances of rolling a 7 or 11 in a game of craps. With a standard rolling technique, your chances were about 22.2 percent. But if you bowled the dice one at a time (i.e., you knew the outcome of the first die before rolling the second), what were your chances of rolling a 7 or 11?

If your first roll was a 1, 2, 3 or 4, then the only way to win was to have the two rolls add up to 7 — 11 would have been out of reach, since the second roll could add at most 6. But if your first roll was a 5, then you could win with a second roll that was a 2 or 6. And if your first roll was a 6, then you could win with a second roll that was a 1 or 5. So with the first roll, you wanted to get a 5 or 6, since that doubled your chances of winning.

At this point, it was helpful to write out the three possible strategies, which we’ll call A, B and C, keeping in mind that opposing faces of a standard die always add up to 7:

  • A: Place your fingers on the 1 and the 6, resulting in a 25 percent chance of rolling a 2, 3, 4 or 5.
  • B: Place your fingers on the 2 and the 5, resulting in a 25 percent chance of rolling a 1, 3, 4 or 6.
  • C: Place your fingers on the 3 and the 4, resulting in a 25 percent chance of rolling a 1, 2, 5 or 6.

As noted by solver Carolyn Phillips, strategy C gave you the best chance of rolling a 5 or 6, so it was the right strategy for your first roll. And the result of your first roll determined your strategy for the second roll:

  • If you rolled a 1, then you needed a 6. Strategies B and C both had a 25 percent chance of resulting in a 6.
  • If you rolled a 2, then you needed a 5. Strategies A and C both had a 25 percent chance of resulting in a 5.
  • If you rolled a 5, then you needed a 2 or 6. Strategy C had a 50 percent chance of resulting in a 2 or 6.
  • If you rolled a 6, then you needed a 1 or 5. Strategy C had a 50 percent chance of resulting in a 1 or 5.

Averaging these four equally likely cases together, your chances of winning were 3/8, or 37.5 percent. That was quite the improvement over the 22.2 percent chance you had with a standard rolling technique.
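The case-by-case reasoning above can be checked with a short brute-force enumeration. Here is a sketch in Python (not part of the original column), using the strategy names A, B and C as defined in the list above:

```python
from fractions import Fraction

# The three bowling strategies, each leaving four equally likely faces
# (opposite faces of a standard die always sum to 7).
STRATEGIES = {
    "A": (2, 3, 4, 5),  # fingers on the 1 and the 6
    "B": (1, 3, 4, 6),  # fingers on the 2 and the 5
    "C": (1, 2, 5, 6),  # fingers on the 3 and the 4
}

def win_probability(first_strategy):
    """Chance of totaling 7 or 11, picking the best second-roll strategy."""
    total = Fraction(0)
    for first in STRATEGIES[first_strategy]:
        # For each first-roll outcome, choose the second-roll strategy that
        # maximizes the chance the two dice sum to 7 or 11.
        best = max(
            Fraction(sum(first + second in (7, 11) for second in faces), 4)
            for faces in STRATEGIES.values()
        )
        total += best
    return total / 4  # the four first-roll faces are equally likely

print(max(win_probability(s) for s in STRATEGIES))  # Fraction(3, 8)
```

Strategy C for the first roll indeed comes out on top, with a winning probability of exactly 3/8.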

For extra credit, you still wanted to roll a 7 or 11 (earning you a point), but you also wanted to avoid rolling a 2, 3 or 12 (losing you a point). With a standard rolling technique, your average score would have been one-ninth of a point. But if you “bowled” to maximize your expected score, what was that average?

Once again, you could determine the optimal strategy by determining your second roll based on your first. For example, if your first roll was a 6, you wanted your second roll to be a 1 or a 5 (giving you a total of 7 or 11), but not a 6 (giving you a total of 12). Your best bets for the second roll would have been strategies A or C; either netted you 0.25 points on average.

If you did this analysis for each of the first rolls, you found that if your first roll was a 1, you’d get 0 points on average. If your first roll was a 2, 3, 4 or 6, you’d get 0.25 points on average. And if you were fortunate enough to get a 5 on your first roll, you’d get 0.5 points on average.

Putting these all together, your best option for the first roll was Strategy A, netting you an average of 5/16 points. That was about three times better than how you would have done with standard rolling.
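The extra-credit scoring can be verified the same way, by enumerating every first-roll strategy and, within it, the best second-roll response. This is my own sketch, not the column's code:

```python
from fractions import Fraction

STRATEGIES = {
    "A": (2, 3, 4, 5),  # fingers on the 1 and the 6
    "B": (1, 3, 4, 6),  # fingers on the 2 and the 5
    "C": (1, 2, 5, 6),  # fingers on the 3 and the 4
}

def score(total):
    """+1 point for a 7 or 11, -1 point for a 2, 3 or 12, otherwise 0."""
    if total in (7, 11):
        return 1
    if total in (2, 3, 12):
        return -1
    return 0

def expected_score(first_strategy):
    acc = Fraction(0)
    for first in STRATEGIES[first_strategy]:
        # Best second-roll strategy for this particular first roll.
        best = max(
            Fraction(sum(score(first + second) for second in faces), 4)
            for faces in STRATEGIES.values()
        )
        acc += best
    return acc / 4

for name in STRATEGIES:
    print(name, expected_score(name))  # A gives the maximum, 5/16
```

Strategy A for the first roll yields the stated 5/16 of a point on average, beating strategies B (3/16) and C (1/4).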

Solver David Alpert took this problem one step further, wondering what would happen if you bowled non-standard dice where the numbers on opposing sides did not have to add up to 7. Based on his analysis, your average point total jumped up to 3/8 — the exact same result as the original problem.

If there’s a lesson to be learned in all this, it’s that it pays to cheat in dice games. No, that can’t be right.

Solution to last week’s Riddler Classic

Congratulations to 👏 Dan Upper 👏 of Corvallis, Oregon, winner of last week’s Riddler Classic.

Last week, you were studying a new strain of bacteria, Riddlerium classicum. Each R. classicum bacterium did one of two things: split into two copies of itself or die. There was an 80 percent chance of the former and a 20 percent chance of the latter.

If you started with a single R. classicum bacterium, what was the probability that it would lead to an everlasting colony (i.e., the colony would theoretically persist for an infinite amount of time)?

Sometimes a puzzle is so good that it’s worth solving twice. After posting this puzzle last week, I learned that a very similar variation (the extra credit, in fact, in which the 80 percent was replaced by a probability p) previously appeared in the Riddler column. It’s also on page 83 of “The Riddler” book.

Nevertheless, it’s a truly excellent puzzle, and I would like to acknowledge a few solvers from this past week who had never seen it before.

First off, the first bacterium had a 20 percent chance of dying outright, which meant the answer was at most 80 percent. After that, things got hairy.

As with many Riddler Classics, it was tempting to start with simulations in order to gain some intuition for what was happening. However, how could a finite simulation truly determine whether a colony was “everlasting”? Solver Greg Y. and a team who identified as the “MassMutual Crew” both tried their hand at this, each running 1 million simulations and seeing how many times the colony lasted 100 generations. It turned out that approximately 750,000 of those 1 million colonies made it, suggesting the answer was close to 75 percent.
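A Monte Carlo run along these lines can be sketched in a few lines of Python. This is my own version, not the solvers' code, and it makes the simplifying assumption (also used by other solvers) that a colony reaching some large population cap is effectively everlasting:

```python
import random

random.seed(2020)

CAP = 200        # assumption: a colony this large is treated as everlasting
TRIALS = 20_000  # far fewer than the solvers' 1 million, but enough here

def colony_survives(p=0.8):
    """Simulate one colony: each bacterium splits with probability p, else dies."""
    population = 1
    while 0 < population < CAP:
        # Each survivor becomes two bacteria in the next generation.
        population = 2 * sum(random.random() < p for _ in range(population))
    return population >= CAP

survived = sum(colony_survives() for _ in range(TRIALS))
print(survived / TRIALS)  # typically lands near 0.75
```

Even this modest run puts the survival probability within a percentage point or so of 75 percent.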

Meanwhile, Jason Ash modeled the problem as an absorbing Markov chain and assumed that once the colony reached a population of 500 it was effectively guaranteed everlasting survival. With this approach, along with some code to construct the transition matrix, Jason approximated the answer as 75.00000000000211 percent.

Emma Knight solved the problem analytically and head-on, directly computing the probability that the colony dies out in each generation and then summing that infinite series. The sum is the total probability the colony dies out, so 1 minus the sum is the probability of survival.

But many solvers, including Alain Bruguières, Hector Pefo, Josh Silverman and the international team (from Manila, Philippines and Ottawa, Canada) of Erin and Nicole, discovered a roundabout method that got to the answer in far fewer steps. They first defined x as the probability of extinction when starting from a single bacterium. Without yet knowing exactly what x was, you had enough information to determine the probability that two such bacteria would both go extinct. Since they were independent events, this probability was x².

Returning to our single bacterium, what was the probability it led to an everlasting colony? By definition, it was 1−x. But it was also the probability that it divided into two bacteria, which did not both go extinct. And that probability was 0.8(1−x²). Setting those probabilities equal gave the equality 1−x = 0.8(1−x²), which meant x was equal to 0.25 and 1−x was 0.75. Frankly, it was remarkable that any solvable equation popped out at all, given the infinities inherent to this puzzle. And sure enough, the answer turned out to be 75 percent, just as the simulations predicted.
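Rearranged, the equation says x = 0.2 + 0.8x², and iterating that recurrence is a nice numerical check. Conveniently, after n iterations starting from x = 0, the value is exactly the probability the colony is extinct within n generations, so the iteration also mirrors the generation-by-generation approach. A quick sketch (mine, not from the column):

```python
# x = P(extinction) satisfies 1 - x = 0.8(1 - x^2), i.e. x = 0.2 + 0.8*x^2.
# Starting from x = 0, the nth iterate is the probability of extinction
# within n generations, which converges to the overall extinction probability.
x = 0.0
for _ in range(200):
    x = 0.2 + 0.8 * x * x

print(x, 1 - x)  # x converges to 0.25, so the survival probability is 0.75
```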

For extra credit, you were asked to solve the problem when the probability each bacterium divided was p, rather than 80 percent. The above reasoning worked just as well for this general case, leading to the equation 1−x = p(1−x²). When p was between 0.5 and 1, the answer was 2−1/p. But when p was less than 0.5 (i.e., when the value of 2−1/p dipped negative), x was 1, meaning the colony never survived.
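The general-case answer is compact enough to write as a one-line function. A sketch (the function name is mine):

```python
def survival_probability(p):
    """Chance a colony lasts forever when each bacterium splits with probability p.

    From 1 - x = p(1 - x^2): survival is 2 - 1/p when p > 0.5, and 0 otherwise.
    """
    return max(0.0, 2 - 1 / p)

for p in (0.4, 0.5, 0.8, 0.9):
    print(p, survival_probability(p))
```

Note the sharp threshold at p = 0.5: survival is exactly 0 there, and 2−1/p climbs away from zero only once p exceeds one-half.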

This was a truly bizarre result. Imagine two strains of bacteria, one with a 50 percent chance of dividing and the other with a 50.0000000001 percent chance of dividing. Only the latter has any chance of forming an everlasting colony. Until I apply some antibacterial soap, that is.

Want more riddles?

Well, aren’t you lucky? There’s a whole book full of the best puzzles from this column and some never-before-seen head-scratchers. It’s called “The Riddler,” and it’s in stores now!

Want to submit a riddle?

Email Zach Wissner-Gross at riddlercolumn@gmail.com.

Footnotes

  1. Important small print: Please wait until Monday to publicly share your answers. In order to 👏 win 👏, I need to receive your correct answer before 11:59 p.m. Eastern time on Monday. Have a great weekend!

Zach Wissner-Gross leads development of math curriculum at Amplify Education and is FiveThirtyEight’s Riddler editor.
