Rethinking how to view (and slow) conspiracy theories

Elliott Lee

Conspiracy theories are certainly no strangers to mainstream American consciousness. Numerous polls and surveys have quantified Americans’ beliefs in a wide range of conspiracy theories. Some notable examples show sizable portions of Americans believing that explosives caused the collapse of the Twin Towers on 9/11 (Dwyer, 2006), that President Obama was not born in America (Dropp & Nyhan, 2016), or that the COVID-19 pandemic was intentionally planned (Schaeffer, 2020).

More recently, conspiracy theorists challenged the integrity and outcome of the 2020 presidential election, and they played a key role in driving the violent insurrection that took place at the Capitol on January 6, 2021 (Harwell et al., 2021). Some of the most prominent false allegations about the election argue, without evidence, that actual votes impossibly outnumbered registered voters, or that specific voting machines secretly changed ballots to favor President Joe Biden (Kessler, 2020; Bump, 2020). 

Fact-checking has emerged as a popular response to conspiracy theories and their amplification (Saglam, 2020, p. 18). During the 2020 election, Twitter and Facebook took to adding disclaimers to “disputed” election content, and linking to resources with verified information (Brodkin, 2020). News outlets often label conspiratorial narratives as “baseless” or “false” in their reporting, and individual journalists have even gone viral because of their prolific, real-time fact-checking of former President Trump (Kang, 2020; Blake, 2020).  

The rise of fact-checking suggests that the truth has some ability to blunt a conspiracy theory’s influence and reach. But what nuance is lost when we employ a binary framing of fact versus mistruth? How could countermeasures change if we reexamined and recontextualized the characteristics that facilitate conspiracy theories’ spread and staying power? 

Research on conspiracy theories suggests that conspiracy theorists represent a uniquely impenetrable audience for fact-checking, since they are generally inclined to assume that any counterfactual evidence is part of the conspiracy itself (Sunstein & Vermeule, 2009). Authorities’ outright rejection of conspiracy theories can also be interpreted as a confirmation in the eyes of adherents (Barkun, 2016). 

In general, solutions that rely on a factual or logical basis may be destined to fail because they neglect to address the heavily emotional nature of conspiracy theories’ appeal. Narratives and explanations are more likely to spread or be shared if they trigger intense reactions—which creates a mainstream discourse that favors outrage over accuracy, and often dooms truth to a much smaller audience than falsehoods typically enjoy (Sunstein & Vermeule, 2009). Social media platforms’ tendency to algorithmically elevate content with high engagement—online activity such as views, likes, and comments—offers an online parallel to this human tendency, since incendiary posts often garner the activity needed to appear prominently in many users’ feeds (Munn, 2020). 

Broader social and cultural implications are also important to understanding conspiracy theories. One study specifically researched how the cognitive process of distilling meaning from a complex world could “backfire” and result in conspiratorial thinking (Graeupner & Coman, 2017, p. 218). The researchers found social exclusion to be causally associated with the endorsement of superstitious beliefs (Graeupner & Coman, 2017, p. 221). They also hypothesized that social exclusion played a critical role in starting an “exclusion-belief cycle” in which an individual is excluded, endorses conspiratorial ideas, and is then excluded even further (Graeupner & Coman, 2017, p. 221). As exclusion increases, an individual may be more inclined to search for like-minded company who can reinforce and affirm conspiratorial beliefs. Once they find that company, conspiratorial ideas are entrenched, and that individual finds some social inclusion in the resulting echo chamber (Graeupner & Coman, 2017, p. 221). In their conclusion, the researchers highlighted “collective-level processes” as a critical driver of conspiracy theories’ emergence, spread, and persistence beyond the individual level (Graeupner & Coman, 2017, p. 221). 

Other research has focused on the anthropological implications of conspiracy theory belief. Drawing from their time in rural Turkey, one social anthropologist framed conspiratorial narratives as a way for individuals to shape their agency and outward-facing identities: in many cases, belief in conspiratorial narratives allowed interlocutors to project masculinity and possession of rare knowledge (Saglam, 2020, pp. 19–21). In this context, conspiratorial narratives were still essentially immune to facts, but an ethnographic perspective underscored how their usage within a “social performance” was key to forging one’s identity and place in society (Saglam, 2020, p. 20). Psychological research has also pointed to perceived social standing, feelings of control, and belief in simple solutions to complex problems as influential factors for measuring individuals’ susceptibility to conspiracy theory belief (van Prooijen, 2017). It’s easy to see some of these threads in American conspiracy theories today. In a series of now-deleted tweets, the Hawaii GOP account defended QAnon adherents as “largely motivated by a sincere and deep love for America,” framing their belief as an outflow of “Patriotism and love of County [sic]” (Dowd, 2021). To be clear, patriotism alone can’t explain or excuse QAnon believers. However, it doesn’t seem far-fetched to surmise that some people espouse QAnon ideology as a social performance about their love for America, or as a simple explanation for a perceived deterioration of American morals. 

Moving beyond a ‘truth versus falsehood’ framing highlights fact-checking’s fatally insufficient ability to address the varied factors that drive conspiracy theories’ popularity, and underscores latent opportunity for intersectional solutions. For example, could social media networks detect indications of social exclusion among their users, and use that data to connect them with outside resources? Previous studies suggest that feelings of isolation correlate with some basic usage metrics. Individuals with high social media usage experience more social isolation than those with lower usage (Primack et al., 2017). Additionally, users who don’t receive likes and comments on their posts experience lower levels of belonging and control, and the lack of post activity can even influence others’ perception of social exclusion (Tobin et al., 2014; Vinuales & Thomas, 2021). If someone fits the profile for a high-activity, low-feedback user at risk of isolation and lower belonging, a connection to an offline activity or resource could be a step toward de-escalating potential radicalization and belief in conspiracy theories (Papatheodorou, 2017). 
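To make the idea concrete, the high-activity, low-feedback profile described above could be operationalized as a simple screening heuristic. The sketch below is purely illustrative: the thresholds, field names, and `at_risk_of_isolation` function are all hypothetical assumptions, and any real deployment would require empirical calibration against the kinds of findings reported by Primack et al. (2017) and Tobin et al. (2014), plus careful attention to privacy and consent.

```python
from dataclasses import dataclass

# Hypothetical thresholds -- real values would need empirical calibration.
HIGH_ACTIVITY_POSTS_PER_WEEK = 20   # "high usage" cutoff (assumed)
LOW_FEEDBACK_PER_POST = 0.5         # avg likes + comments per post (assumed)


@dataclass
class UserActivity:
    posts_per_week: float
    avg_feedback_per_post: float  # likes + comments, averaged over posts


def at_risk_of_isolation(user: UserActivity) -> bool:
    """Flag users whose usage matches the high-activity, low-feedback
    profile associated with perceived social exclusion in the cited
    research. Returns True only when BOTH conditions hold."""
    return (
        user.posts_per_week >= HIGH_ACTIVITY_POSTS_PER_WEEK
        and user.avg_feedback_per_post < LOW_FEEDBACK_PER_POST
    )


# Example: a prolific poster receiving almost no feedback is flagged;
# an occasional poster or a well-engaged one is not.
print(at_risk_of_isolation(UserActivity(25, 0.1)))   # flagged
print(at_risk_of_isolation(UserActivity(2, 0.0)))    # low activity
print(at_risk_of_isolation(UserActivity(30, 3.0)))   # healthy feedback
```

A flag from such a heuristic would only be a trigger for outreach (e.g., surfacing offline resources), not a diagnosis; both usage and feedback signals are noisy proxies for the underlying feelings of exclusion the studies measured.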

Recognizing that conspiracy theories represent complicated interactions across multiple academic disciplines opens the door to developing better, more comprehensive safeguards. If a conspiracy theory is articulated in text, at a bare minimum, a fact-check should certainly accompany it. But if conspiracy theories are demonstrably undeterred by the truth alone, the policy playbook should reflect an understanding that conspiracy theories operate on a completely different plane, where facts are as good as fiction, and the distinction between them matters little. 

References

Barkun, M. (2016). Conspiracy Theories as Stigmatized Knowledge. Diogenes. https://doi.org/10.1177/0392192116669288

Blake, M. (2020, October 30). Trump makes so many false claims, CNN fact-checker Daniel Dale has lost count. Los Angeles Times. https://www.latimes.com/entertainment-arts/tv/story/2020-10-30/trump-false-claims-cnn-daniel-dale-fact-checker

Brodkin, J. (2020, November 4). As Trump falsely claims victory, Twitter and Facebook counter misinformation. Ars Technica. https://arstechnica.com/tech-policy/2020/11/as-trump-falsely-claims-victory-twitter-and-facebook-counter-misinformation

Bump, P. (2020, December 1). Swing-state counties that used Dominion voting machines mostly voted for Trump. The Washington Post. https://www.washingtonpost.com/politics/2020/12/01/swing-state-counties-that-used-dominion-voting-machines-mostly-voted-trump/

Dowd, K. (2021, January 24). Hawaii GOP deletes long thread defending QAnon believers. SFGATE. https://www.sfgate.com/politics/article/Hawaii-GOP-deletes-QAnon-tweets-15894250.php

Dropp, K. & Nyhan, B. (2016, September 23). It lives. Birtherism is diminished but far from dead. The New York Times. https://www.nytimes.com/2016/09/24/upshot/it-lives-birtherism-is-diminished-but-far-from-dead.html

Dwyer, J. (2006, September 2). 2 U.S. reports seek to counter conspiracy theories about 9/11. The New York Times. https://www.nytimes.com/2006/09/02/nyregion/02conspiracy.html

Graeupner, C. & Coman, A. (2017). The dark side of meaning-making: How social exclusion leads to superstitious thinking. Journal of Experimental Social Psychology, 69, 218–222. https://doi.org/10.1016/j.jesp.2016.10.003

Harwell, D., Stanley-Becker, I., Nakhlawi, R., & Timberg, C. (2021, January 13). QAnon reshaped Trump’s party and radicalized believers. The Capitol siege may just be the start. The Washington Post. https://www.washingtonpost.com/technology/2021/01/13/qanon-capitol-siege-trump/

Kang, C. (2020, November 11). Tweets from Biden aide show campaign’s frustration with Facebook. The New York Times. https://www.nytimes.com/2020/11/11/technology/tweets-from-biden-aide-show-campaigns-frustration-with-facebook.html

Kessler, G. (2020, November 23). Giuliani keeps peddling debunked falsehoods on behalf of Trump. The Washington Post. https://www.washingtonpost.com/politics/2020/11/23/giuliani-keeps-peddling-debunked-falsehoods-behalf-trump/

Munn, L. (2020). Angry by design: toxic communication and technical architectures. Humanities and Social Sciences Communications, 7(1). https://doi.org/10.1057/s41599-020-00550-7

Papatheodorou, K. (2017, October 15). What CVE Can Learn from Guerrilla Marketing. Lawfare. https://www.lawfareblog.com/what-cve-can-learn-guerrilla-marketing

Primack, B., Shensa, A., Sidani, J., Whaite, E., Lin, L., Rosen, D., Colditz, J., Radovic, A., & Miller, E. (2017). Social Media Use and Perceived Social Isolation Among Young Adults in the U.S. American Journal of Preventive Medicine, 53(1), 1–8. https://doi.org/10.1016/j.amepre.2017.01.010

Saglam, E. (2020). What to do with conspiracy theories?: Insights from contemporary Turkey. Anthropology Today, 36(5), 18–21. https://doi.org/10.1111/1467-8322.12606

Schaeffer, K. (2020, July 24). A look at the Americans who believe there is some truth to the conspiracy theory that COVID-19 was planned. Pew Research Center. https://www.pewresearch.org/fact-tank/2020/07/24/a-look-at-the-americans-who-believe-there-is-some-truth-to-the-conspiracy-theory-that-covid-19-was-planned/

Sunstein, C., & Vermeule, A. (2009). Conspiracy Theories: Causes and Cures. The Journal of Political Philosophy, 17(2), 202–227. https://doi.org/10.1111/j.1467-9760.2008.00325.x

Tobin, S. J., Vanman, E. J., Verreynne, M., & Saeri, A. K. (2014). Threats to belonging on Facebook: lurking and ostracism. Social Influence, 10(1), 31–42. https://doi.org/10.1080/15534510.2014.893924

van Prooijen, J. (2017). Why Education Predicts Decreased Belief in Conspiracy Theories. Applied Cognitive Psychology, 31(1), 50–58. https://doi.org/10.1002/acp.3301

Vinuales, G., & Thomas, V. (2021). Not so social: When social media increases perceptions of exclusions and negatively affects attitudes toward content. Psychology & Marketing, 38(2), 313–327. https://doi.org/10.1002/mar.21339

Elliott Lee is a Master of Public Policy student at the USC Price School of Public Policy and an IT analyst at the University of California, Santa Barbara. After graduating with a BA in Political Science and English from UC Santa Barbara, he spent two years working in China, which focused his research interests on the intersection of technology, speech, and misinformation. 

Edited by Derrick Flakoll

Image credit: Ehimetalor Akhere Unuabona