Welcome to The Riddler. Every week, I offer up problems related to the things we hold dear around here: math, logic and probability. Two puzzles are presented each week: the Riddler Express for those of you who want something bite-size and the Riddler Classic for those of you in the slow-puzzle movement. Submit a correct answer for either,^{1} and you may get a shoutout in next week’s column. Please wait until Monday to publicly share your answers! If you need a hint or have a favorite puzzle collecting dust in your attic, find me on Twitter.

## Riddler Express

From Drew Mathieson comes an exploration of basketball’s storied hot hand:

This season, on the way to winning her fourth WNBA championship in her 17-year career, Sue Bird made approximately 50 percent of her field goal attempts. Suppose she and Seattle Storm teammate Breanna Stewart are interested in testing whether Bird has a “hot hand” — that is, whether her chances of making a basket depend on whether or not her previous shot went in. Bird happens to know that her chances of making any given shot are *always* 50 percent, independent of her shooting history, but she agrees to perform an experiment.

In each trial of the experiment, Bird will take three shots, while Stewart will record which shots Bird made or missed. Stewart will then look at all the trials that had at least one shot that was immediately preceded by a made shot. She will randomly pick one of these trials and then randomly pick a shot that was preceded by a made shot. (If there was only one such shot to pick from, she will choose that shot.)

What is the probability that Bird made the shot that Stewart picked?

The solution to this Riddler Express can be found in the following week’s column.
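Readers who want to experiment before next week’s solution arrives could sketch Stewart’s procedure as a quick Monte Carlo simulation. The function below is a hypothetical helper, not part of the puzzle; it deliberately prints an estimate rather than the exact answer.

```python
import random

def hot_hand_probability(num_trials=100_000, seed=0):
    """Estimate the chance that the shot Stewart picks was made.

    Each trial is three independent 50/50 shots. A shot is "eligible" if the
    shot immediately before it was made; trials with no eligible shot are
    discarded. From each surviving trial, one eligible shot is chosen
    uniformly at random, and we record whether it went in.
    """
    rng = random.Random(seed)
    made = kept = 0
    for _ in range(num_trials):
        shots = [rng.random() < 0.5 for _ in range(3)]
        # Shots 2 and 3 (indices 1 and 2) can be preceded by a made shot.
        eligible = [i for i in (1, 2) if shots[i - 1]]
        if not eligible:
            continue  # Stewart ignores trials with no eligible shot
        kept += 1
        if shots[rng.choice(eligible)]:
            made += 1
    return made / kept
```

Picking a random eligible trial and then a random eligible shot within it is equivalent to what the loop does, since every trial is equally likely to be generated.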

## Riddler Classic

Now that LeBron James and Anthony Davis have restored the Los Angeles Lakers to glory with their recent victory in the NBA Finals, suppose they decide to play a game of sudden-death, one-on-one basketball. They’ll flip a coin to see which of them has first possession, and whoever makes the first basket wins the game.

Both players have a 50 percent chance of making any shot they take. However, Davis is the superior rebounder and will always rebound any shot that either of them misses. Every time Davis rebounds the ball, he dribbles back to the three-point line before attempting another shot.

Before each of Davis’s shot attempts, James has a probability *p* of stealing the ball and regaining possession before Davis can get the shot off. What value of *p* makes this an evenly matched game of one-on-one, so that both players have an equal chance of winning *before* the coin is flipped?

The solution to this Riddler Classic can be found in the following week’s column.
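One way to explore the Classic numerically is to simulate the possession dynamics for a candidate value of *p* and see how close the game comes to even. This is a sketch under the stated rules (coin flip, 50 percent shooting, Davis rebounds every miss, James can steal only before Davis’s attempts); the function name is my own.

```python
import random

def davis_win_probability(p, num_games=100_000, seed=0):
    """Estimate Davis's chance of winning for a given steal probability p."""
    rng = random.Random(seed)
    davis_wins = 0
    for _ in range(num_games):
        davis_has_ball = rng.random() < 0.5  # opening coin flip
        while True:
            if davis_has_ball:
                if rng.random() < p:          # James steals before the shot
                    davis_has_ball = False
                    continue
                if rng.random() < 0.5:        # Davis makes the shot
                    davis_wins += 1
                    break
                # Miss: Davis rebounds his own shot and keeps possession,
                # so James gets another steal attempt next time around.
            else:
                if rng.random() < 0.5:        # James makes the shot
                    break
                davis_has_ball = True         # Davis rebounds James's miss
    return davis_wins / num_games
```

Sweeping `p` over a grid (or bisecting) until the estimate crosses 0.5 gives a numerical answer to check against your algebra.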

## Solution to last week’s Riddler Express

Congratulations to Frank Probst of Houston, Texas, winner of last week’s Riddler Express.

Last week, the residents of Riddler City were electing a mayor from among three candidates. The winner was the candidate who received an outright majority (i.e., more than 50 percent of the vote). But if no one achieved this outright majority, there would be a runoff election among the top two candidates.

If the voting shares of each candidate were uniformly distributed between 0 percent and 100 percent (subject to the constraint that they add up to 100 percent, of course), then what was the probability of a runoff?

The “uniformly distributed” wording in the problem was ambiguous and was interpreted several ways by readers. How can you randomly choose three numbers between 0 and 100 that add up to 100? Here, I will write about three popular interpretations.

First, imagine randomly picking three values between 0 and 100, and call them *x*, *y* and *z*. Each choice of (*x*, *y*, *z*) corresponded to a point in a cube that measured 100 by 100 by 100. But only *some* points in this cube had coordinates that summed to 100 — those that also lay on the plane *x*+*y*+*z* = 100. This intersection between a cube and a plane might have been hard to visualize — it was an equilateral triangle (shown below). If you divided this triangle into quarters, then three of those quarters had one value (*x*, *y* or *z*) that exceeded 50. That meant the probability of a runoff, with *no* voting shares that exceeded 50, was **1/4**.

That was one interpretation. Another way to “uniformly” pick three values was to draw a number line between 0 and 100 and break it into three segments by randomly picking two points, *a* and *b*. Assuming *b* was greater than *a*, the three lengths that summed to 100 were *a*, *b*−*a* and 100−*b*. The challenge was to find where any of these three values exceeded 50 inside the triangle defined by 0 ≤ *a* ≤ *b* ≤ 100. Each of the three inequalities (*a* ≥ 50, *b*−*a* ≥ 50 and 100−*b* ≥ 50) carved out a quarter of the larger triangle. And so, once again, that meant the probability of a runoff was **1/4**.
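The broken-stick picture (which produces the same uniform distribution over the triangle of vote shares as the first, cube-and-plane interpretation) is easy to spot-check by simulation. A minimal sketch, with a function name of my own choosing:

```python
import random

def runoff_probability_broken_stick(num_samples=100_000, seed=0):
    """Estimate the runoff probability under the broken-stick interpretation.

    Two uniform cut points a <= b split [0, 100] into three vote shares;
    a runoff happens when no share exceeds 50.
    """
    rng = random.Random(seed)
    runoffs = 0
    for _ in range(num_samples):
        a, b = sorted(rng.uniform(0, 100) for _ in range(2))
        shares = (a, b - a, 100 - b)
        if max(shares) <= 50:  # no outright majority
            runoffs += 1
    return runoffs / num_samples
```

With enough samples the estimate settles near 1/4, matching the geometric argument.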

Yet another way to “uniformly” pick three values was to go ahead and pick three numbers between 0 and 100 (again, let’s call them *x*, *y* and *z*) and then “normalize” them — that is, divide each number by *x*+*y*+*z* and multiply by 100, so that they were guaranteed to add up to 100. Like before, each choice of (*x*, *y*, *z*) corresponded to a point in a cube. But this time, to avoid a runoff, you needed one of the values to exceed the sum of the other two, meaning it would exceed 50 percent of the sum of all three numbers. There were three regions in the cube where a runoff would *not* occur: *x* > *y*+*z*, *y* > *x*+*z* and *z* > *x*+*y*. Each region made up one-sixth of the cube, so all together they represented half the cube, as shown below. In the other half, a runoff was necessary. So according to this interpretation of the problem, the answer was **1/2**. (Some solvers, like Benjamin Dickman, noted that this approach was identical to finding the probability that three lengths from a random, uniform distribution satisfy the triangle inequality.)
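The normalization interpretation can be checked the same way. One convenience, reflected in the sketch below (again with a hypothetical function name): rescaling doesn’t change *which* value exceeds half the total, so we can test the raw picks directly.

```python
import random

def runoff_probability_normalized(num_samples=100_000, seed=0):
    """Estimate the runoff probability when three uniform picks are rescaled
    to sum to 100. A runoff is avoided only when one pick exceeds the sum of
    the other two, i.e., exceeds 50 percent of the total."""
    rng = random.Random(seed)
    runoffs = 0
    for _ in range(num_samples):
        x, y, z = (rng.random() for _ in range(3))
        if 2 * max(x, y, z) <= x + y + z:  # no share exceeds 50 percent
            runoffs += 1
    return runoffs / num_samples
```

Here the estimate settles near 1/2, in line with the cube-volume argument (and the triangle-inequality connection Benjamin Dickman noted).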

For extra credit, you wanted the probability of a runoff when there were *N* candidates instead of three. Once again, the answer depended on your interpretation of the problem. Based on the first interpretation, this general solution was 1−*N*/2^{N−1} (nicely explained by Josh Silverman), since each of the *N* candidates had a 1/2^{N−1} chance of winning an outright majority. Similarly, based on the second interpretation, the answer was again 1−*N*/2^{N−1}. But based on the third interpretation, the answer was 1−1/(*N*−1)!.
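Both extra-credit formulas can be spot-checked numerically. The sketch below (a hypothetical helper, generalizing the two-cut broken stick to *N* − 1 cuts) estimates the runoff probability for any number of candidates under either reading; for *N* = 4 the two formulas predict 1 − 4/2³ = 1/2 and 1 − 1/3! ≈ 0.833, respectively.

```python
import random

def runoff_probability(n_candidates, num_samples=100_000, seed=0,
                       normalized=False):
    """Estimate the runoff probability for n_candidates under two readings:
    broken-stick shares (normalized=False) or iid uniform picks rescaled to
    a common total (normalized=True). A runoff occurs when no single share
    is an outright majority."""
    rng = random.Random(seed)
    runoffs = 0
    for _ in range(num_samples):
        if normalized:
            # Rescaling preserves which pick exceeds half the total,
            # so the raw picks can be tested directly.
            shares = [rng.random() for _ in range(n_candidates)]
        else:
            cuts = sorted(rng.random() for _ in range(n_candidates - 1))
            points = [0.0] + cuts + [1.0]
            shares = [points[i + 1] - points[i]
                      for i in range(n_candidates)]
        if 2 * max(shares) <= sum(shares):  # no outright majority
            runoffs += 1
    return runoffs / num_samples
```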

That’ll do it for Riddler City’s mayoral election. Don’t forget to vote in *any other elections* that may be happening!

## Solution to last week’s Riddler Classic

Congratulations to Asher S. of Chicago, Illinois, winner of last week’s Riddler Classic.

Last week, you were playing a modified version of “The Price is Right.” In this version’s bidding round, you and two (not three) other contestants had to guess the price of an item, one at a time.

The true price of this item was a randomly selected real number between 0 and 100. Among the three contestants, the winner was whoever guessed the closest price *without going over*. For example, if the true price was 29 and you guessed 30, while another contestant guessed 20, then they would be the winner even though your guess was technically closer.

In the event all three guesses exceeded the actual price, the contestant who had made the lowest (and therefore closest) guess was declared the winner.

If you were the first to guess, and all contestants played optimally (taking full advantage of the guesses of those who had gone before them), what were your chances of winning?

At first, this three-player game might have seemed unsolvable. As the first to guess, you would want to know what the second and third players’ strategies would be. But their strategies depended on yours and on each other’s. Was there any way out of this mess?

Indeed there was. One approach was to work backwards. Suppose you (the first player) guessed a price *A* and the second player guessed a price *B*. What should the third player do? For now, let’s assume *A* was less than *B*. The third player would then choose from among three options:

- Guess a value of zero, in which case they’d win if the true price was between 0 and *A* — a range of *A*.
- Guess a value infinitesimally greater than *A*, in which case they’d win if the true price was between *A* and *B* — a range of *B*−*A*.
- Guess a value infinitesimally greater than *B*, in which case they’d win if the true price was between *B* and 100 — a range of 100−*B*.

But which of the three values should the third player guess? Whichever corresponded to the greatest range. (If *A* had been greater than *B*, there would again be three options, but with *A* and *B* reversed.)

So for any combination of *A* and *B*, the third player’s strategy was known. Next, it was time to look more closely at the second player.

For each value of *A*, the second player could figure out their chances of winning for any *B* they picked, since they now knew what the third player would do given *A* and *B*. For each *A*, the second player would then pick a *B* that maximized their own chances of winning.

At last, we’re back to you, the first player. By now, you knew exactly what the second and third players would do in response to any guess *A*. As with the second player, that meant you had to pick the value that maximized your own chances of winning.

Amidst all this strategizing, I neglected to mention just what these optimal guesses were. In the end, your best guess was two-thirds of 100 (~66.7). Then the second player’s best move was to guess one-third of 100 (~33.3), and the third player’s best move was to guess anything less than that (e.g., zero). All players had a **one-third** chance of winning. If you deviated from these optimal values, the second and third players would both have the advantage over you, each with a greater than one-third chance of winning.
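The bookkeeping behind this argument fits in a few lines of code. The helpers below are my own framing (ignoring the infinitesimal offsets, which vanish in the limit): one computes how much of the 0-to-100 price range each of three distinct guesses wins, and the other lists the third player’s three candidate payoffs.

```python
def win_shares(guesses):
    """Length of the price interval (out of 100) that each guess wins.

    guesses: three distinct floats. The highest guess below the price wins;
    if every guess exceeds the price, the lowest guess wins.
    """
    order = sorted(range(3), key=lambda i: guesses[i])
    g = sorted(guesses)
    shares = [0.0] * 3
    shares[order[0]] = g[1]          # wins [0, g0) plus (g0, g1)
    shares[order[1]] = g[2] - g[1]   # wins (g1, g2)
    shares[order[2]] = 100 - g[2]    # wins (g2, 100)
    return shares

def third_player_options(a, b):
    """Player 3's three candidate payoffs given earlier guesses a and b:
    undercut both, slot just above the lower, or top the higher."""
    lo, hi = sorted((a, b))
    return {"undercut": lo, "between": hi - lo, "overcut": 100 - hi}
```

At the claimed optimum — *A* = 200/3, *B* = 100/3, third guess near zero — `third_player_options` returns three equal payoffs of 100/3 (so the third player has no strictly better deviation), and `win_shares([200/3, 100/3, 0.0])` gives each player one-third of the range. Plugging in *A* = 100/3 with an undercutting Player 2 and a Player 3 guess just above *A* instead reproduces Emma Knight’s observation that the first player can be squeezed out entirely.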

Some solvers said they would have guessed a price that was *one*-third of 100. But as Emma Knight observed, Player 2 could then have guessed anything less than that and Player 3 slightly more. That would have meant Player 2 *still* had a one-third chance of winning, while Player 3’s chances went up to two-thirds, leaving you with nothing. Player 2 might not have chosen to sabotage your hopes of winning, but why leave it to chance?

Finally, Keith Wynroe of Edinburgh, Scotland, offered a neat extension to this problem, asking how the three players’ strategies might change if the goal was not simply maximizing one’s chances of winning, but rather maximizing the *expected value* of the prize won. According to Keith, while this shift would incentivize all three players to bet higher values, no one’s chances of winning actually changed.

After all was said and done, it turned out to be a fair game. How sweet.

## Want more riddles?

Well, aren’t you lucky? There’s a whole book full of the best puzzles from this column and some never-before-seen head-scratchers. It’s called “The Riddler,” and it’s in stores now!

## Want to submit a riddle?

Email Zach Wissner-Gross at riddlercolumn@gmail.com