If you took one look at the Facebook group Vaccines Exposed, it seemed clear what it was all about. It was “a group opposed to deadly vaccinations,” with over 13,000 members on a platform known to harbor anti-vaccination conspiracy theories. Some of its followers’ recent posts included a video falsely claiming the COVID-19 vaccine will kill people and a post claiming children are having cancer “injected into them. #facts.” Standard anti-vaxx fare.
It was so convincing that Facebook removed the group on January 15 for violating the site’s community standards. But Vaccines Exposed was really a “honeypot” group run by vaccine advocates hoping to attract the attention of anti-vaxxers and people on the fence. When those folks posted something that furthered the anti-vaxx cause, pro-vaxx members responded and tried to persuade them to question their beliefs. It wasn’t always a gentle exchange.
“In this community, there are people whose goal is purely to educate,” said David Litton, a pro-vaccine member who used a fake account to participate in Vaccines Exposed, and is a podcaster and Twitch stream host who covers conspiracy theories online. “Then there’s a spectrum between that and people who are just trying to dunk on anti-vaxxers for being stupid.”
For example, in response to that video falsely claiming the COVID-19 vaccine would kill people, one member asked why we aren’t seeing deaths among the thousands of trial participants; another asked why the original poster chose to trust individuals with no science background over experts; and yet another asked, referring to the original poster: “why do we allow these people to breed?”
These skirmishes between pro- and anti-vaccine users aren’t limited to Vaccines Exposed; all of Facebook is a battlefield. And while those confrontations aren’t unique to Facebook — the anti-vaxx movement is as old as vaccines themselves — the site has created an ecosystem that, intentionally or not, has allowed this battle to flourish. Though the social media giant has made efforts to curb the spread of misinformation, they haven’t been enough to end the battle for hearts and minds. As the American public attempts the most extensive vaccination campaign in half a century, that battle is all the more relevant.
The recently greenlit COVID-19 vaccines represent our best chance at ending the pandemic, so it’s particularly dangerous to have the American public spending time fighting over a basic fact: vaccines are safe, effective and necessary for public health. While the new COVID-19 vaccines don’t have the benefit of decades of research demonstrating their safety and efficacy like other vaccines, many of the common narratives being spread about the COVID-19 vaccines come from existing anti-vaxx beliefs that have been debunked. It’s true that researchers don’t yet know for sure if the vaccines prevent people from spreading the virus, but we do know, for example, that mRNA vaccines don’t change your DNA. The latter is an anti-vaxx belief so prevalent, it led one Wisconsin pharmacist to allegedly tamper with vials of the vaccine.
The preexisting conflict between anti-vaxxers and pro-vaxxers has now seeped into the much broader discussions about the COVID-19 vaccine on Facebook, according to a November report from First Draft News, a nonprofit organization that provides investigative research to newsrooms tracking and reporting on mis- and disinformation. (FiveThirtyEight has partnered with First Draft in the past.)
“Our research shows how seamlessly old narratives can be repurposed to fit new contexts,” said Rory Smith, a research manager at First Draft and a co-author of the report. “When demand for information about a topic is high but the supply of credible information is low, you get a data deficit, and that deficit will quickly be filled up with misinformation.”
The researchers found that familiar tropes about vaccines, such as the idea that they are unnecessary and just a way for big pharma to make money, have been applied to the COVID-19 vaccine as well. But COVID-19 is, naturally, a much more widely discussed topic, so much of the conversation about vaccines online is now about the COVID-19 vaccine specifically, allowing anti-vaxx narratives to reach audiences who might not otherwise come across them. In fact, leaked audio recordings of anti-vaxx leaders, first noted in a report by the U.K.-based Center for Countering Digital Hate, show that they strategized to use this exact scenario — anxiety and confusion about the new COVID-19 vaccines — to sow misinformation to a wider audience.
Data from CrowdTangle, a social media tracking tool, reveals examples of anti-vaxx ideas seeping into COVID-19 vaccine conversations across Facebook, including in otherwise unrelated spaces. In a recent search for the word “vaccine” among Facebook groups, I was able to find dozens of examples of discussions in unrelated groups, many of which inevitably had anti-vaxx misinformation in the comment sections.
The anti-vaxx movement has done so well on Facebook in part because it is controversial, and controversy helps make Facebook a lot of money. In 2019, 98 percent of Facebook’s revenue was from advertising — nearly $70 billion in all. Facebook’s advertising is so valuable because it can be microtargeted, based on the data Facebook collects on its users. To collect more and better data (and to expose users to more ads), Facebook needs its users to be active and engaged: liking posts, sharing links, joining groups and commenting. One surefire way to keep people engaged is to expose them to content that provokes an emotional response, like a post claiming the vaccine you’re planning to give your toddler will cause him or her to develop autism.
“What we saw at Reddit was that conflict and controversy generated the most attention,” said Ellen Pao, the former CEO of Reddit and a Silicon Valley vet who now runs Project Include, a nonprofit diversity consulting organization. “These networks are rewarded for engagement. And when people get heated over something, they stay either to engage or to watch.”
A Wall Street Journal investigation last year uncovered how teams within Facebook tasked with addressing the site’s disinformation crisis cited the platform’s design as the root of the problem. An internal company presentation from 2018 included slides that said Facebook’s algorithms “exploit the human brain’s attraction to divisiveness,” and, if not altered, would surface “more and more divisive content in an effort to gain user attention and increase time on the platform.”
And at a congressional hearing in September, Facebook’s former director of monetization, Tim Kendall, made similar observations.
“Social media preys on the most primal parts of your brain. The algorithm maximizes your attention by hitting you repeatedly with content that triggers your strongest emotions — it aims to provoke, shock and enrage,” Kendall said in his opening statement. “This is not by accident. It’s an algorithmically optimized playbook to maximize user attention — and profits.”
More recently, Facebook has made public statements and efforts to tamp down on the spread of anti-vaxx misinformation specifically.
“We are committed to reaching as many people as possible with accurate information about vaccines, and launched partnerships with WHO and UNICEF to do just that,” said Andrea Vallone, a spokesperson for Facebook. “We’ve banned ads that discourage people from getting vaccines and reduced the number of people who see vaccine hoaxes verified by the WHO and the CDC. We also label Pages and Groups that repeatedly share vaccine hoaxes, lower all of their posts in News Feed, and do not recommend them to anyone.”
Still, misinformation finds a way. “You can do these takedowns but that hasn’t necessarily stopped the flow of misinformation, and we can’t forget about the long tail of misinformation,” said First Draft’s Smith. “There are all of these hundreds of thousands or millions of posts that might not get that many interactions but collectively make up a lot of misinformation.”
THE BATTLE LINES
A typical post in the Facebook group What’s Happening In Aurora, IL? garners a handful of reactions. It’s an 81,000-member community group about, well, what’s happening in Aurora, Illinois. Posts often resemble classifieds: someone looking for bakers in the area to make a cake, someone posting a job opening, someone offering second-hand maternity clothes. But a recent post showing the first local health care worker to receive the COVID-19 vaccine drew more than 1,200 reactions and nearly 900 comments.
Anti-vaxx theories were prominent among the responses, suggesting the vaccine is dangerous and questioning the speed with which it was produced. Both of those doubts were common threads First Draft found in its report. It’s just one example of anti-vaxx beliefs bleeding into otherwise neutral spaces on Facebook.
They’re the same claims pro-vaxx advocates have been battling for years. But the battles don’t all play out the same way. In one private Facebook group called Vaccine Talk, nearly 50,000 pro-vaxxers, anti-vaxxers and people on the fence are encouraged to pursue carefully controlled, civil and evidence-based dialogue — though even in this group, some anti-vaxx and on-the-fence members told me they felt attacked or condescended to by pro-vaccine members. C.I.C.A.D.A. (which stands for Community Immunity Champions and Defenders Association), meanwhile, deploys pro-vaccine users to comment sections overrun with anti-vaxxers.
Take this Facebook post from a children’s hospital in Rochester, New York, showing one of its doctors receiving the COVID-19 vaccine. The post began to attract anti-vaxx comments, such as people questioning the ingredients of the vaccine (in reality, the ingredients are minimal, common and safe) and claiming doctors are only advocating for vaccination to make money (profits are not the motivation for recommending the COVID-19 vaccine). So a C.I.C.A.D.A. member posted in the group, sending up a flare, saying the hospital’s social media team was overwhelmed. Now, the post is flooded with supportive messages, photos of other health care workers getting their shot, and praise for the good example set, burying the anti-vaxx comments and attacks.
“Support doesn’t necessarily mean engaging with the anti-vaccine people; in fact, we encourage people not to do that,” said Dorit Rubinstein Reiss, a law professor who specializes in vaccine law at UC Hastings College of the Law, and a member of C.I.C.A.D.A. “It can mean coming in and providing positive comments. [The group is] there to prevent people from being intimidated into not posting about vaccines.”
Vaccines Exposed, the honeypot group, took a more radical approach, luring anti-vaxxers into an ostensible safe space, only to pull back the curtain on a less sympathetic crowd. One administrator, who asked not to be named, told me she hoped the group might reveal to the anti-vaxx-curious the flaws in many of the claims against vaccination. But the interactions in the group weren’t always constructive, with pro-vaxxers sometimes mocking or ridiculing the anti-vaxx posters.
Group member Litton defended the more combative method, noting the people he affiliates with avoid explicit trolling (things like doxxing or threatening), and that humor — even at someone’s expense — can be an effective strategy in battling misinformation.
But the deception required to draw in members in the first place makes it unlikely the honeypot group will persuade anti-vaxxers, according to Rachel Alter, a research affiliate at the Vaccine Confidence Project at the London School of Hygiene & Tropical Medicine.
“If your goal is to change people’s minds, you don’t want to start out by tricking them right off the bat,” Alter said. “People aren’t going to stick around long enough. They’re going to see what’s going on and get defensive or leave.”
Research on how people form — and change — beliefs suggests that a gentler approach is more effective. People tend to see attacks on their beliefs as attacks on them personally, and we are all biased against information that challenges our existing worldview. Asking questions, sharing stories and lowering the temperature by avoiding insults can make people more receptive to new ideas, according to Karin Tamerius, a former psychiatrist who founded Smart Politics, a program that trains people to have productive conversations with those they disagree with. Tamerius based her program on existing research into beliefs and persuasion and said changing views is a long, difficult process that is unlikely to occur through a single Facebook interaction.
This is ultimately the problem at hand. Best intentions and science-backed strategies are great, but the battle continues to spread because Facebook is designed for just that.
Industry researchers believe there are other efforts Facebook could make to reduce the impact of the anti-vaxx movement on the site. Last year, nonprofit research group Ranking Digital Rights released a report on how algorithmically driven advertising structures have exacerbated the disinformation epidemic by increasing its spread, and recommended social media sites look at changing these systems — rather than moderating content — to curb the spread. People will always post nonsense on the internet. The platforms we use don’t need to be designed to lead people to it.
And despite all of Facebook’s efforts, many users are still being exposed to misinformation at precisely the moment we need them to be well-informed.
CORRECTION (Jan. 22, 10:52 a.m.): An earlier version of this article included the incorrect name for Ellen Pao’s nonprofit. It is Project Include, not Project Inclusive. Separately, an earlier version described Karin Tamerius as a former political psychologist. She is a former psychiatrist with training in political psychology.