The United States and most of the rest of the developed world enjoy very high vaccination rates for routine childhood illnesses — typically over 90 percent. But in the last decade or two, these vaccination rates have stagnated and even declined in some locations. Measles vaccination coverage in the United Kingdom, for example, went from a high of 92 percent in the mid-1990s to just 81 percent by the mid-2000s (it has since recovered). Poor countries often can’t achieve complete vaccination because of cost, but non-universal vaccination in the developed world is more often a matter of choice.
There are many reasons people choose not to vaccinate their children. Some parents view vaccinations as risky — some people even think they can cause autism, despite the fact that this theory has been thoroughly debunked. Some object for religious reasons. In other cases, medical conditions make vaccination impossible or dangerous.
And there are still others who aren’t necessarily convinced that vaccines are dangerous, but simply view the benefits as minimal. Why subject your child to a painful shot, given that you’ve never actually seen anyone get the measles? This logic relies on the idea that “herd immunity” will protect even unvaccinated children. If enough people are vaccinated against measles, then no one will get it, so even unvaccinated children will be protected. Since vaccination rates are extremely high in the U.S., herd immunity surely operates, right?
Well, it turns out, yes and no. High vaccination rates do provide significant protection. But when I analyzed the data, I was surprised to find that even within the U.S., states and counties with lower vaccination rates do experience higher rates of illness.
Looking at vaccination rates across all 50 states, as reported in the National Immunization Survey, I found that household-reported DPT (diphtheria, pertussis and tetanus) vaccination rates among children 19 months to 35 months vary across states but in a tight range: 95 percent to 99 percent. If herd immunity is operating over this range, we’d expect no relationship between the vaccination rate and disease outbreaks.1
Pertussis, also known as whooping cough, is by far the most common of the diseases. It’s a respiratory infection with symptoms similar to a cold, but typically followed by a long period (in some cases months) of serious coughing episodes. It can be deadly in small infants. The chart below shows the state-level cases of pertussis (per 100,000 people) plotted against vaccination rates. The best-fit line is also shown.
The relationship is negative. The higher the vaccination rate, the fewer the whooping cough cases — and in a regression, the relationship is significant. Going from a 95 percent vaccination rate to a 99 percent vaccination rate makes a difference in the number of people who get sick. Compare Louisiana, which has a vaccination rate of 98.5 percent, to Montana, with a rate of 95 percent. Although both numbers are high, Montana has four times as many whooping cough cases per person as Louisiana. The case numbers are small, but they show that even small decreases in vaccination rates matter.
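The state-level analysis described above is a simple bivariate regression. As a minimal sketch of the method, here is how one might fit that best-fit line — the numbers below are made-up illustrative values, not the National Immunization Survey figures used in the article:

```python
import numpy as np
from scipy import stats

# Hypothetical state-level data, for illustration only.
# Each pair: (DPT vaccination rate in percent, pertussis cases per 100,000).
rates = np.array([95.0, 95.8, 96.5, 97.2, 98.0, 98.5, 99.0])
cases = np.array([8.0, 6.5, 5.1, 4.4, 3.0, 2.2, 1.5])

# Ordinary least squares: case rate regressed on vaccination rate.
slope, intercept, r_value, p_value, std_err = stats.linregress(rates, cases)

print(f"slope: {slope:.2f} cases per 100,000 per percentage point")
print(f"p-value: {p_value:.4f}")
```

A negative, statistically significant slope is what the article reports finding in the real survey data.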
One might think that since, on average, poor kids are sick more often than rich kids, this relationship might show up if poor states have lower vaccination rates. But that’s not the case: Some of the poorest states (Louisiana, for example) have the highest vaccination rates.2
The story looks slightly different for measles, mumps and rubella, which all produce rash and fever. Measles can be deadly in a small share of cases (the Centers for Disease Control and Prevention estimate one to two deaths per 1,000). These diseases are much less common, and we see basically no relationship at the state level in the chart below. Lower vaccination rates — at least in this range — do not seem to correlate with outbreaks.
Based on what we know about herd immunity, there is a sense in which the relationship for whooping cough is surprising. Estimates suggest a vaccination rate of 92 percent to 94 percent is high enough to confer herd immunity against whooping cough, so a 95 percent vaccination rate should be high enough to deliver good protection. In other words, I suspect most epidemiologists would say that if you chose not to vaccinate 5 percent of people and randomly sprinkled them around, that wouldn’t put an area at much risk of disease.
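That 92-to-94-percent figure is consistent with the standard herd immunity formula: the threshold share of the population that must be immune is 1 − 1/R0, where R0 is the disease's basic reproduction number. R0 estimates for pertussis are commonly cited in roughly the 12-to-17 range (the exact values here are assumptions for illustration):

```python
def herd_immunity_threshold(r0: float) -> float:
    """Share of the population that must be immune so that each case
    infects, on average, fewer than one other person: 1 - 1/R0."""
    return 1.0 - 1.0 / r0

# Commonly cited R0 range for pertussis, used here for illustration.
for r0 in (12, 17):
    print(f"R0 = {r0}: threshold = {herd_immunity_threshold(r0):.1%}")
```

Plugging in R0 of 12 to 17 gives thresholds of roughly 92 to 94 percent, matching the estimates above.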
The obvious explanation is that a 95 percent state-level vaccination rate doesn’t mean 95 percent of people are vaccinated everywhere in that state — it means there are probably some areas whose vaccination rate is significantly below 95 percent and some areas where the rate is higher. To get at that, we need to look at smaller areas.
California, a big state, has a lot of variation in vaccination rates, and it helpfully reports immunization rates by school for all entering kindergarten classes. I got the data on vaccination rates for the 2009 and 2010 entering classes and aggregated up to the county level. I then combined these vaccination rates with information on disease outbreaks by county for 2006 through 2010 (when these kids were still very young), since vaccines for whooping cough and for measles, mumps and rubella are all given in a child’s first year of life.
I divided the counties into five groups based on their vaccination rates from low to high. The charts below show disease rates (per 10,000 kindergarten-age children) for whooping cough and for measles, mumps and rubella.3
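The quintile grouping described above is a standard binning step. A minimal sketch of the procedure — using randomly generated county data as a stand-in for the California school immunization reports — might look like this:

```python
import numpy as np
import pandas as pd

# Hypothetical county-level data, for illustration only -- the article's
# actual figures come from California's school immunization reports.
rng = np.random.default_rng(0)
counties = pd.DataFrame({"vaccination_rate": rng.uniform(80, 99, size=50)})

# Simulated disease rate: loosely declining in vaccination rate, plus noise.
counties["cases_per_10k"] = (
    100 - counties["vaccination_rate"] + rng.normal(0, 2, size=50)
).clip(lower=0)

# Split counties into five equal-sized groups (quintiles) by vaccination rate.
counties["quintile"] = pd.qcut(
    counties["vaccination_rate"], q=5, labels=[1, 2, 3, 4, 5]
)

# Average disease rate within each quintile, from low to high vaccination.
summary = counties.groupby("quintile", observed=True)["cases_per_10k"].mean()
print(summary)
```

`pd.qcut` assigns each county to one of five equal-sized bins, so the comparison is between groups of counties rather than individual noisy county estimates.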
What we see is very stark: Low vaccination rates increase disease outbreaks.4 And it’s notably the vaccination rates on the lower end of California’s spectrum that really matter. For measles, counties with vaccination rates below 90 percent have a lot more cases — twice as many as anywhere else. For whooping cough, disease rates are more than three times as high in counties with a vaccination rate lower than 86 percent compared to those with a rate above 94 percent.
For both vaccines, it looks like the areas with the highest vaccination rate also have slightly higher rates of disease outbreaks, but that is just statistical noise. When we test for significant differences, only the difference between the lowest quintile and the rest meets the standard threshold for statistical significance.
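The significance check described above is a comparison of group means. A minimal sketch, again with made-up numbers standing in for the county data: a two-sample t-test of the lowest-vaccination quintile against all other counties pooled together.

```python
import numpy as np
from scipy import stats

# Hypothetical county disease rates (per 10,000), for illustration only.
rng = np.random.default_rng(1)
lowest_quintile = rng.normal(15.0, 3.0, size=12)  # higher disease rates
other_counties = rng.normal(5.0, 3.0, size=46)

# Welch's two-sample t-test (does not assume equal variances).
t_stat, p_value = stats.ttest_ind(lowest_quintile, other_counties,
                                  equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

A small p-value for this comparison — but not for comparisons among the upper quintiles — is what the pattern in the article describes.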
For parents, this information would seem to caution against reliance on herd immunity. Yes, if your county (or better, your neighborhood) has a 99 percent vaccination rate, you’re probably safe. But knowing that your state vaccination rate is 95 percent really isn’t enough. For the vast majority of people, there is absolutely no medical reason not to vaccinate, and the idea that there are no benefits is foolish.
From a public health standpoint, this data argues for continuing to push to increase vaccination rates and stem any declines. A 95 percent vaccination rate in a state doesn’t mean every place in the state is at 95 percent. At rates even a bit lower, we start to see increases in whooping cough and measles cases. And remember that parents who choose not to vaccinate their kids are also putting other people’s kids at risk: Many of the victims of whooping cough are babies who are too young to be vaccinated.