Grading Your State’s Preparedness For An Ebola Outbreak

From time to time, without warning, more than 200 public health workers in Nevada get an urgent call on their cellphones.

“We say, ‘You’re being called in,’ ” said Erin Seward, the state’s public health preparedness manager. “This is an emergency.”

Generally, it’s not. It’s a drill aimed at testing how ready the state is for a public health emergency — whether an anthrax attack or a major outbreak of flu, for example. When the call comes, first responders from state, local and private agencies across Nevada’s 111,000 square miles practice distributing vaccines, antibiotics or other emergency supplies.

And based on these exercises, Seward said she is “very confident that Nevada is prepared.”

A lot of us could use some confidence about now. As the U.S. deals with its first cases of Ebola, however small and contained, people everywhere are nervously wondering how ready we are for a serious public health emergency. What are the most and least prepared places for something like an Ebola outbreak? If an infectious disease is spreading, what states do you want to be in and not be in?

Since 2002, the Centers for Disease Control and Prevention has reviewed states’ plans and checked response times to answer this very question.[1] One important assessment conducted by the CDC is a technical assistance review (TAR) — basically, a measure of how well prepared the states are to distribute medicine and supplies during an emergency.[2] When the review is done, each state gets a score ranging from 0 to 100, with points off if its plan lacks a clear communication strategy or if it hasn’t adequately taught procedures for securing and dispensing supplies. Eighty-nine is the minimum acceptable score.[3]

By this metric, it’s easy to see which states might be a good bet if you want to stay healthy amid a large-scale emergency. California, Michigan, New Jersey, New York, Oklahoma and Virginia received perfect 100 scores for the past three years. And Nevada? It has one of the lowest three-year average TAR scores, at 89,[4] putting it just higher than Hawaii and Minnesota but well above the big outlier, Alaska.

When asked about the relatively low scores, Seward pointed out that TAR merely looks at preparedness plans and doesn’t evaluate what first responders do. “TAR scores don’t get into exercises,” she said.

Besides, things in Nevada have improved, she said. In 2007, the state registered a score of only 34.

A more comprehensive index

The CDC stats have limited value because they examine the states’ preparedness plans but don’t look at a host of other factors — the readiness of the health care system, how many doctors and pharmacists are ready to volunteer, whether a state has someone overseeing school health policies, and so on.[5] To get a more holistic understanding of preparedness, a group of state health officials and the CDC teamed up last year with an assortment of other organizations — ranging from the American Red Cross to the Department of Defense to McKinsey, a consultancy — to create a new metric: the National Health Security Preparedness Index (NHSPI).

The index is meant to “build understanding in the [preparedness] community,” to measure readiness and to be a catalyst for improvements, said Thomas Inglesby, CEO and director of the Center for Health Security at the University of Pittsburgh Medical Center. Inglesby, a doctor, was a leader in the creation of the index and the person the CDC referred me to when I asked for comment.[6]

The NHSPI, launched last December, draws on 35 sources, including CDC stats and hospital and health-care-system reports, and 128 measures, such as the number of epidemiologists per 100,000 people in a given state. These are aggregated into five domains: health surveillance, community planning and engagement, incident and information management, surge management, and countermeasure management. Each state is rated on a 1-to-10 scale, with 10 being the best. Their scores — overall and in the five domains — are in the table at the bottom of this article.
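As a rough illustration of how a composite index like this can roll individual measures up into domain scores and an overall 1-to-10 rating, here is a minimal sketch. The domain names come from the article; the measure values and the equal-weighted averaging are hypothetical assumptions, not the NHSPI’s actual methodology.

```python
# Hypothetical sketch of a composite preparedness index.
# The five domain names are from the NHSPI; the measure values and
# equal weighting below are illustrative assumptions only.

# Each domain maps to a list of measures already normalized to 0-1.
domains = {
    "health surveillance": [0.8, 0.6, 0.9],
    "community planning and engagement": [0.5, 0.7],
    "incident and information management": [0.9, 0.8, 0.7],
    "surge management": [0.4, 0.6],
    "countermeasure management": [0.7, 0.7, 0.8],
}

def domain_score(measures):
    """Average the normalized measures and rescale to a 1-to-10 score."""
    return 1 + 9 * (sum(measures) / len(measures))

# One score per domain, plus a simple overall average of the five domains.
scores = {name: round(domain_score(m), 1) for name, m in domains.items()}
overall = round(sum(scores.values()) / len(scores), 1)
```

The real index aggregates 128 measures from 35 sources, and its weighting scheme may differ; the point here is only the two-stage structure — measures into domains, domains into one headline number.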

The index is a “combination of resources, processes and outcomes,” Inglesby said.

Outcomes, he said, are the best way to judge preparedness: how authorities delivered in an actual emergency, or how they performed in an emergency drill like the one Seward described in Nevada.[7]

But tracking data on outcomes is costly. Only nine of the 128 NHSPI measures are outcomes. It’s easier to rate the process, which is laid out in the states’ emergency plans. As Inglesby said: “Sometimes it’s just hard to get at the outcome data; sometimes you have to use a proxy, such as a pandemic planning document that they’ve posted online.”

The first NHSPI scores came out last December. By this metric, Massachusetts and Rhode Island are well prepared. Nevada? It scored 5.9, the lowest in the nation. It’s TAR all over again.[8]

[Chart: 2013 NHSPI scores plotted against three-year average TAR scores, by state]

Seward, the Nevada public health preparedness manager, said NHSPI isn’t an appropriate measure of her department’s work “because it covers so many different areas” beyond what government is supposed to do.[9]

Preparedness depends on resources

There’s another crucial measure of preparedness: money. That’s what Inglesby means by resources, whether judged by the number of trained staff or by funding.

But the connection between funding and preparedness is not so simple. It turns out there’s often an inverse relationship between the CDC’s funding per capita and a state’s average TAR scores.

[Chart: CDC preparedness funding per capita vs. average TAR score, by state]

This is skewed by the handful of sparsely populated states that get far more money per capita than most states. Alaska, for example, is one of the best-funded states (at more than $7 per person), yet it lags in performance. (But it’s also an isolated state and may not pose as big a threat in a pandemic emergency.)[10] According to CDC data, total preparedness funding to the states fell by about $1 billion between 2002 and 2013.

However, when we consider the total dollars sent to states, the relationship is reversed: Better-funded states get higher TAR scores.

[Chart: total CDC preparedness funding vs. average TAR score, by state]

Given Nevada’s level of funding ($6.6 million, or about $2.40 per person), you’d expect it to have much higher TAR scores (somewhere in the mid-90s).

Yet funding is not the only measure of “resources.” Trained staffers matter, too; these are the individuals who dispense supplies from the strategic national stockpile in the event of a pandemic. By this measure, Nevada does better: it has eight trained individuals per million residents, putting it near the median of all states.

Other data from the CDC — beyond the TAR scores and trainee numbers — is limited. Although the CDC collects data on states’ preparedness in 15 “capabilities,” it has released recent information on only three: laboratory testing, emergency operations coordination, and emergency public information and warning. The CDC knows far more about state preparedness than the public does.

As a comprehensive measure of preparedness, the NHSPI, too, is imperfect and evolving.

“In year two, there will be additional components added,” Inglesby said. “There will be more focus on elements in the health care system that contribute toward preparedness.”

NHSPI doesn’t tell us much about how ready we are for Ebola, Inglesby said. For now, he said, assessing actual hospital experiences with Ebola patients is the best way to measure preparedness. System-level or state-level measures are of limited use.

“It’s extremely unlikely for Ebola to become a pandemic in the U.S.,” he said.

“A state’s score doesn’t give you a lot of insight into an individual institution’s ability to respond.” If the problem were to grow, how well a state was prepared would matter a lot more.

“But let’s say [we had] something like the H1N1 scare — then there are many things in the [NHSPI] index that would provide some anticipation of strengths and challenges.”[11]

As for Nevada’s challenges, Seward remained upbeat: “We work so well together with our local health authorities and our tribal partners. It gets showcased with our exercises and drills.”

Next year, Nevada will hold a full-scale exercise, the kind it conducts every five years.

This time, at least, there will be a warning.

[Table: NHSPI overall and domain scores by state]

Footnotes

  1. The findings are published in the 2013-14 National Snapshot of Public Health Preparedness.

  2. The CDC relies on supplies of vaccines, antibiotics and other medical resources from the strategic national stockpile (SNS). The SNS assets are spread throughout the country to allow for efficient distribution in such a scenario.

  3. In 2011-12, the minimum was moved from 79 to 89.

  4. The CDC’s statistics are confusing on this. Its 2011-12 fact sheet for Nevada said the state’s TAR score was 88, dragging down the three-year average. Seward said that’s an error and that Nevada’s actual TAR score was a 90. This was confirmed by the CDC.

  5. The CDC only measures programs that it funds through its Public Health Emergency Preparedness grants, and thus covers only the government response.

  6. He is also the chair of the CDC’s Board of Scientific Counselors under the Office of Public Health Preparedness and Response.

  7. Nevada’s exercises are based on the Homeland Security Exercise and Evaluation (HSEEP) guidelines.

  8. This is a plot of the 2013 NHSPI scores for all states against their three-year average TAR scores. There is a weak positive correlation (0.24) between the two series, statistically significant at the 10 percent level.

  9. Fair enough, but the data also shows that Nevada scores low in monitoring potential health threats, which might be in the purview of public health officials.

  10. The funding levels presented here are from 2011-12 in order to match the metrics.

  11. The H1N1 virus, or swine flu, killed more than 18,000 people globally in 2009.

Andrew Flowers writes about economics and sports for FiveThirtyEight.
