How Much Do State Jobless Rates Differ?

North Dakota’s unemployment rate is a rock-bottom 2.6 percent. Rhode Island’s is a painfully high 8.7 percent. How unusual is that kind of variance between states?

DataLab czar Micah Cohen [Editor’s note: I prefer “benevolent dictator”] asked me that question last week when I was writing about California’s impressive economic recovery. Despite spending a lot of time writing about the job market, I wasn’t sure of the answer — which, I figured, meant it was an interesting question. So I took a look at the data.

The simplest way to measure this kind of variance (we mean the plain-language sense of variance here, not the statistical definition) is to look at the raw gap between the highest state unemployment rate and the lowest. By that measure, today’s gap of 6.1 percentage points isn’t at all unusual. A few years ago, when states such as Nevada and Michigan had unemployment rates well into the double digits, the gap got as wide as 10.2 points. In the early 1980s, the gap topped 12 points. (Back then, the state with the highest unemployment rate was West Virginia; South Dakota had the lowest.)

That doesn’t really tell us much, though. In terms of raw numbers, the gap between the highest and lowest unemployment rates is always going to be widest during recessions simply because a higher unemployment rate leaves more room for variance. Instead, we can look at the ratio between the maximum and minimum unemployment rates. As the chart below shows, the current ratio of 3.3 (8.7 divided by 2.6) is pretty typical over the past few decades. (The Bureau of Labor Statistics has been collecting state-level unemployment data since 1976.)
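The two measures above are simple arithmetic on the set of state rates. Here is a minimal sketch in Python: the 2.6 (North Dakota) and 8.7 (Rhode Island) figures come from this article, while the other states’ values are hypothetical placeholders for illustration.

```python
# State unemployment rates in percent. North Dakota and Rhode Island
# are the figures cited in the article; the rest are hypothetical.
rates = {
    "North Dakota": 2.6,
    "Rhode Island": 8.7,
    "California": 7.4,  # hypothetical
    "Texas": 5.1,       # hypothetical
}

# Raw gap: highest rate minus lowest, in percentage points.
gap = max(rates.values()) - min(rates.values())

# Ratio: highest rate divided by lowest, which controls for the fact
# that gaps naturally widen when overall unemployment is high.
ratio = max(rates.values()) / min(rates.values())

print(f"gap = {gap:.1f} points, ratio = {ratio:.1f}")  # gap = 6.1 points, ratio = 3.3
```

The ratio is the more comparable measure across time, since a 6-point gap means something different when national unemployment is 4 percent than when it is 10 percent.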

[Chart: Ratio of highest to lowest state unemployment rate, 1976–present]

Three quick notes on this chart: First, the spike in the mid-1980s corresponds with the 1986 oil-price collapse, which was a boon to much of the country but sent unemployment soaring in Texas, Louisiana and other energy-producing states. Second, notice the very brief spike in 2005. At first I thought it was the result of an error in the data, but it wasn’t: It was caused by Hurricane Katrina, which sent Louisiana’s unemployment soaring from 4.9 percent in August to 11.2 percent in September. (By December, it was down to 6 percent.) Lastly, the interstate variance did rise in the aftermath of the most recent recession, although it has since fallen back to normal.

Of course, looking just at the maximum and minimum doesn’t tell us anything about what’s going on in between. But looking at the standard deviation — essentially, how widely the individual state rates are spread around the average — tells pretty much the same story. Variance among the states peaked in the mid-1980s, was unusually low in the early 2000s, and has now settled in close to its long-run average. (Interestingly, the standard deviation doesn’t show the same post-recession rise that the maximum-to-minimum ratio does.)

So, today’s variance isn’t unusual by historical standards, but it is higher than in the years leading up to the recession.

Ben Casselman is a senior editor and the chief economics writer for FiveThirtyEight.
