Numbers and data are a backbone of modern life. We cite them buzzily at bars and soberly to bosses so often that “studies show” might as well be given its own entry in the dictionary. Much of what we cite comes from government data — weather patterns, the population or average income of a city, even honeybee activity — collected across innumerable departments, agencies and centers, then made public.
Now, watchdogs are worried that a Donald Trump administration could erode the quality of government data collection and systems. Transparency advocates have raised the possibility that a Trump administration could simply remove data sets. But they have a more foundational worry: that the integrity of U.S. government data could be compromised more subtly and more systematically over the next four years.
Certain steps being taken by the president-elect’s transition team have raised alarm bells for some who worry that Trump’s glibness with the truth could take root in a more institutional form. Trump’s nominee for EPA administrator, Scott Pruitt, is a climate change denialist. His choice for director of the National Economic Council, Gary Cohn, once called the U.S. employment rate “a very, very fictitious rate.” And last week, Bloomberg reported that the Trump transition team sent a memo to the Department of Energy requesting the names of employees involved in determining climate change metrics and asking the Energy Information Administration “in what instances the EIA’s independence was most challenged over the past eight years.”
Widespread government data tampering remains unlikely given the vast numbers of career bureaucrats working across agencies, according to Alex Howard, senior analyst at the Sunlight Foundation, an open government advocacy organization. But Howard said that Trump’s refusal to release his tax returns could signal that Washington will operate within a new paradigm. “What that tells us is that unless there’s a statutory requirement, there are norms that can be bucked,” he said. Some data, like the census, is collected and protected by law, but not all of it.
Two things in particular have observers worried: budget cuts that could significantly affect data collection and quality, especially within the government's statistical agencies, such as the Bureau of Labor Statistics and the Bureau of Economic Analysis, which produce key economic indicators; and the willful miscommunication of scientific research that proves politically inconvenient to the White House.
Fears of data manipulation might conjure up images of black-turtleneck-clad men changing numbers in Excel sheets late at night, but there are safeguards against such outright meddling. A series of directives and best practices currently governs the collection of U.S. government data, all with an overwhelming emphasis on divorcing fact-gathering from politics. “To have credibility, an agency must be and must be perceived to be free of political interference and policy advocacy,” the Committee on National Statistics states in its guiding text on best principles and practices.
The safeguarding of institutional trust in data is seen as the guiding mission of these statistical agencies, and certain ritual practices to maintain data integrity are therefore observed. When the Bureau of Economic Analysis releases gross domestic product numbers, for example, a sort of economic conclave is convened. Working separately, groups of career employees compile the different numerical inputs for the GDP before combining them to produce the final figure.
“It doesn’t all come together until the night before the release, and they actually go into lockup where they don’t communicate with the outside world,” Justin Antonipillai, acting undersecretary for economic affairs at the Department of Commerce, said of the team that puts the numbers together. “I’m a political appointee — I see the number in a lockup after the journalists have it,” he said. “So there’s no ability for me to go back and alter and manipulate the data or affect the staff in any way.”
The same guardrails protect the Energy Information Administration findings the Trump transition team was questioning. Those findings were not subject to political review, said Richard Newell, former administrator of the EIA. “If you look at specific questions that were asked, they’re all questions that EIA will be able to readily answer — they’re based in evidence and EIA’s analytical modeling capabilities.”
But if outright manipulation is unlikely — though not impossible — there are other ways for a budget-slashing White House to undermine and skew data. Funding issues have plagued statistical agencies for some time, and there are worries that a budget-cutting or statistically skeptical Trump administration could deal fiscal body blows to those organizations. (A spokesperson for the Trump transition did not immediately respond to questions about these concerns.)
“Several of the [statistical] agencies have struggled with budgets since basically the early 2000s,” said Ken Poole, executive director of the Association of Public Data Users. Over the last few years, “all the agencies would have haircuts, and there’s a point at which you have no hair left, and you’re starting to cut into the bone.” He pointed to the Bureau of Labor Statistics and the Census Bureau, two of the largest producers of government data, as being particularly strapped. The Census Bureau’s American Community Survey, which collects data annually about where and how Americans live, is seen as occupying a particularly precarious position. It’s currently administered as a mandatory survey, acting essentially as an extension of the census, which is required by law. “One of the most efficient ways to get the American Community Survey done is to keep it mandatory,” Poole said. “But there are proposals to make it voluntary or to eliminate it completely because the questions are seen as being intrusive.”
These arguments were made in 2012, when the Republican-led House voted to eliminate the survey, citing its expense and intrusiveness, despite appeals from industry groups who use the information to inform business decisions. In 2015, Jack Kleinhenz, chief economist for the National Retail Federation, wrote that “any movement away from a fully-funded, annual ACS survey could do unreasonable and irrevocable harm to this important economic, government and retail resource.”
Canada’s conservative government eliminated its version of the mandatory long-form census in 2010, a move that proved disastrous for statistical quality and was recently reversed. “Canada’s response rate dropped dramatically, and they could not publish data because it was such poor quality,” Poole said. “And that is the risk of what might happen if we make the American Community Survey voluntary.”
Financially depriving programs or agencies viewed as superfluous or counter to the administration’s political mission seems a likely first move should the Trump administration wish to shake things up. “First thing they’d do is starve them financially — just keep cutting back the budget until they’re barely alive,” said Christine Todd Whitman, a former EPA administrator under President George W. Bush.
The other worry for data-watchers is a deliberate program of misrepresenting scientific research and findings. That’s particularly a concern when it comes to agencies like the EPA, which inevitably get caught up in the politicized atmosphere that surrounds topics such as climate change research.
The EPA has been the focus of recent attention, given Trump’s choice of Pruitt, a climate change denialist with close ties to the oil and gas industry, and Trump’s promise at one point during the campaign that he would eliminate the agency. Whitman said the administrator could have the power to “cherry-pick data” in order to fit a more politically convenient narrative.
“They can go outside the agency if they don’t like the data they’re getting from the agency and they can highlight that,” she said. “If you say 97 percent of scientists believe climate change is real, that means you’ve got 3 percent of the scientists who don’t believe it and so you go to them.” Whitman herself resigned after a dispute with Vice President Dick Cheney over Clean Air Act regulations. “It was going to set the standard at a place that was going to let truly bad actors off the hook,” she said. “They were the ones that were elected, and they get to set the policy and they get to do things like set standards in arbitrary ways.” She pointed out that the more permissive administration-backed standards were later overturned in court.
There is a long recent history of outright numbers-fudging or misleading communications surrounding environmental data. Julie MacDonald, a deputy assistant secretary at the Department of the Interior during the George W. Bush administration, was found to have forced scientists to reverse findings that would have made a species of prairie dog eligible for Endangered Species Act protection. (MacDonald later resigned.) In 2005, Philip Cooney, a lawyer and chief of staff for the White House Council on Environmental Quality, was found to have made changes to scientific reports on climate change. And in 2015, last-minute changes were made to the wording of an EPA study on hydraulic fracturing’s impact on drinking water quality, softening the scientific claims significantly.
“This isn’t just a Republican thing, this is a problem in general,” Gretchen Goldman, research director for the Center for Science and Democracy at the Union of Concerned Scientists, said of data interference by political appointees. “Across presidents and issues.”
Louis Clark, executive director and CEO of the Government Accountability Project, a whistleblower protection organization, said that he would be keeping a close eye on the practices of agencies that deal with climate change science in particular — he singled out the EPA and Department of the Interior. But, Clark said, recent scientific data dust-ups have left agencies better prepared to deal with potential political interference. “These scientists have protections that they didn’t have before,” Clark said.
He noted that during the 2000s, problems arose not in data collection, but in the communication of that data to the public. “They set up all these PR operations,” Clark said of the Bush administration. “If a reporter called up and wanted to know about the Arctic, the scientists getting the question couldn’t answer and were required to send the reporter to the government PR person. But they can’t do that anymore because now we have rules against gags on federal employees.”
But Richard Rood, a climate researcher at the University of Michigan and a former NASA scientist, said he was still worried about the quality of data products coming from the federal government during the next administration, citing the potential draining of resources and less emphasis on analysis from government scientists. “I was using NOAA’s new El Niño portal last night, and the way they have organized and presented that information, including access to limited data sets, is vastly improved in the Obama administration,” he said. “What the executive thinks and what initiatives [are taken] can have a significant impact over the length of an administration over the quality of the material and the presentation.”
“I’m personally worried some of those analyses could get taken down,” Rood said.
All of which is not to say that government agencies don’t have ways of pushing back. This week, the Department of Energy announced that it would not release the names of its employees involved in climate change research, despite a request from the Trump transition team.
Whitman, the former EPA administrator, pointed to the unseen power of career staff, and proffered something of a revenge-of-the-bureaucrats scenario. “It’s going to be interesting to see,” she said. “It’s amazing how they can slow things down. Things can disappear.”