Covid-19 Tested U.S. Health Systems - and News Habits


THE LAST TIME THE U.S. FACED A NATIONAL HEALTH CRISIS that could be conquered by vaccination, Americans in 1955 jumped at the chance to get the new Salk polio vaccine, eager to vanquish the debilitating disease that killed thousands each year and left many more permanently disabled. Today, more than a year and a half after free vaccines against COVID-19 became widely available, just over 65% of eligible Americans are fully vaccinated, the omicron and delta variants are killing hundreds of Americans a day and some health care systems are again overwhelmed.

A group of researchers — all of whom have published on the causes and impacts of conspiratorial thinking — corrals more than 150 academic studies, surveys and news reports to construct a framework for understanding how social media aids and abets the flow of misinformation. In their compilation and contextualization of so much research, the authors bring empirical heft to the debate over the role social media has played in America's response to the COVID-19 pandemic.

UCLA Anderson School of Management’s Jennifer Whitson, Washington University in St. Louis’ Benjamin J. Dow, the University of Maryland’s Amber L. Johnson, Northwestern’s Cynthia S. Wang and Ohio State’s Tanya Menon make a case that social media, a very 21st-century mode of communication, is central to the story of vaccine hesitancy, as it has served as fertile ground for conspiracy theories to gain traction and embolden believers.

Their narrative starts at the very beginning of how the pandemic changed Americans’ scripts. Stuck at home during the lockdown, “normal” lives upended, many did what humans do: scrambled to find answers that would provide any sense of order to the chaos. And in lockdown that meant going online.

The authors cite 2021 research that found an uptick in social-media usage as a way to deal with anxieties. People weren’t just swapping sourdough bread wins. Nearly half of people surveyed by Gallup reported turning to social media for COVID-19 information in the early stages of lockdown. An analysis of Instagram hashtags published in the Journal of Medical Internet Research found that two-thirds of Instagram posts in the first three months of lockdown had COVID-19 hashtags.

And that hasn’t abated. A Pew Research report found that nearly six in 10 people say they often rely on social media for their news. Moreover, the authors note that in a heightened state of worry, many were eager to find new “alternative structures” to make sense of their world. The authors cite three research papers establishing that a lack of control increases the likelihood of latching on to a conspiracy theory. “The very same lack of control and, to a lesser extent, uncertainty that drove people onto social media also made them more open to conspiracy theories,” they write.

And those individuals flocking to social media for answers stepped into a medium often untethered from the truth. The authors cite two 2020 studies that found, on social-media sites such as Facebook and Reddit, dubious posts pushing conspiracy theories (such as Big Pharma created COVID-19 to generate profits) spread as quickly as reliably sourced posts.

Over on Twitter, the research suggests, viral spread of misinformation was even swifter. A 2018 paper found that false rumors spread six times faster than the truth and bored deeper: The top 1% of false posts reached as many as 100,000 users, while the truth was rarely retweeted or shared to more than 1,000.

Bots are, of course, part of this narrative. The authors note 2020 research that analyzed a sample of more than 43 million English-language tweets about COVID-19. Bots were more focused on political conspiracies, while human tweets were more focused on mundane matters like public-health concerns.

Then there’s the issue of algorithms. The authors cite four studies published in 2020 and 2021 that suggest that, in the process of trying to boost user engagement, the platforms give oxygen to conspiracy theories. And that’s a bigger deal than you might think, as the authors slide in that four other studies have found that mere exposure to a conspiracy theory can plant a seed even in the unsuspecting.

Someone spending (more) time on social media, in a frame of mind in which they are more susceptible to conspiracy theories, was then likely to land on posts from influencers beating the COVID-conspiracy drum.

One study found that influencers spreading COVID-19 misinformation generated 20% of the volume of false posts, but those posts accounted for nearly 70% of the engagement (likes, comments, shares, etc.) among their followers.

And the internet in general, and social-media platforms especially, makes it easy to find kindred conspiratorial spirits, which then often leads to living online in an echo chamber impervious to other information. The authors cite a study published in 2021 that found “these online bubbles do not simply reinforce existing beliefs; rather, they tend to encourage the adoption of even more extreme beliefs.” Moreover, once one is ensconced in a conspiratorial social-media group, the easy lines of communication help to entrench false beliefs.

A 2020 study provides evidence to support what seems obvious to many: The behavior of people who believe in COVID conspiracies puts them at risk. People who believed that drug companies created COVID-19 for profit, or that the Centers for Disease Control and Prevention was unnecessarily scaremongering for political reasons, were less likely to wear a mask or get vaccinated.

As for trying to engage the believers in a fact-based discussion, the review authors take a dispiriting position. They suggest that any pushback serves to entrench the conspiratorial believer, as it “provides both attention and confirmation that the confronted individual is having an impact on the social world around them.”

And then there’s the further confirmation bias delivered when they can share such encounters with other believers. Being confronted in the grocery store for not wearing a mask is fodder for a social-media post that prompts other true believers to deliver a virtual high-five.

The authors run through potential strategies for stemming the adoption, spread and effectiveness of conspiracy theories. While there are many possibilities, the practical implementation seems less clear, especially in the near-term.

Banning conspiratorial content is obviously one option. The authors give Facebook credit for banning QAnon and Holocaust-denial conspiracy theories, though those moves were made after much public scrutiny and long after the theories had gone viral. Whether the sites will be more proactive when the next conspiracy theory threatens the public good remains to be seen.

And the authors note that attaching factual pushback to posts containing misinformation may be ineffective. Research published in 2020 found that such messaging didn’t penetrate users who were already attached to a conspiracy theory and to a social circle of fellow believers.

While attempts at debunking may be ineffective, the authors suggest other forms of intervention. Multiple studies have shown that when someone is pushed to think critically about the veracity of a conspiracy theory right when they encounter it, they are less likely to hop on board. The authors also reference 2016 research finding that “simply labeling something a ‘conspiracy theory’ did not make people find it any less believable.” And to be fair, it is unclear how social-media platforms, or any organization, can get between susceptible users and conspiracy theories at the very moment they meet.

The authors suggest that “pre-bunking” might be a better approach. Improving science education and boosting science literacy would be one way to embed critical pushback against anti-vaccination conspiracy theories. That may help keep today’s youngsters from growing into anti-vaxxers. But we’ve still got a national problem with those who have ventured deep down social media’s rabbit hole of COVID-19 conspiracy theories.