The USAID Center for Democracy, Human Rights, and Governance (DRG Center) recently published the USAID Disinformation Primer, which provides an overview of “information disorder” and “disinformation culture” and presents resources to help counter disinformation, misinformation, and malinformation through USAID programming. These phenomena continue to spread, feeding election dysfunction, altering the course of the COVID-19 pandemic, and complicating a new war in Europe. As the Aspen Institute’s Commission on Information Disorder concluded in its November 2021 final report, “Information disorder is a crisis that exacerbates all other crises. When bad information becomes as prevalent, persuasive, and persistent as good information, it creates a chain reaction of harm.”
Fact-checking initiatives, a form of debunking, are a popular method of mitigating the spread of misinformation. However, some studies suggest that, fueled by social media, misinformation can spread further and faster than fact-checked corrections, and that people exposed to misinformation may continue to rely on it even after it has been debunked.
Screenshot from “Prebunking Anti-Refugee Rhetoric in Central and Eastern Europe,” a six-video series launched in Poland, the Czech Republic, and Slovakia to build resilience against manipulative messages, in this case anti-refugee messages.
This month’s DRG Learning Digest focuses on “inoculation theory” and how to operationalize it against disinformation:
- Inoculation Theory: A “Vaccine for Brainwash”
- “Prebunk” to Build “Mental Antibodies”
- This Isn’t a Game…Or Is It?
Please make use of DRG Evidence and Learning Team resources! (See text box at the end.)
Inoculation Theory: A “Vaccine for Brainwash”
Inoculation theory is a promising approach that can be applied to address the spread of harmful misinformation and hate speech. Developed in the 1960s by social psychology giant William McGuire as a “vaccine for brainwash,” it offers the logical basis for developing psychological “vaccination” against misinformation.
Drawing its metaphor from medicine, inoculation theory suggests that an “attitudinal inoculation” works much like a biological inoculation — pre-exposure to a weakened dose of a virus immunizes our system against, and confers resistance to, future attacks by stronger versions of the virus. Inoculation treatments for misinformation typically include two key components: a forewarning of an impending attack on existing beliefs and a pre-emptive refutation of the anticipated argument, or “prebunk.” The forewarning motivates resistance, while the refutation helps individuals effectively defend against the attack. For example, this video not only warns viewers that disinformation targeting Ukrainian refugees may occur, but also provides them with an effective counterargument to use when it does.
Inoculation theory has been applied across many fields. A 2010 meta-analysis of 54 cases found that inoculation treatments are more effective than supportive treatments (conventional advocacy) or no treatment at fostering resistance to attitude change. A 2021 NATO study by Jon Roozenbeek and Sander van der Linden, “Inoculation Theory and Misinformation,” provides a deep dive into inoculation theory’s history and modern applications, including some of those described later in this edition, as well as an exploration of the limitations of reactive debunking on its own as a means to disrupt “disinformation culture.”
Five inoculation videos featuring common techniques of disinformation produce a significant improvement in “technique discernment,” i.e., the ability to discern manipulative from non-manipulative social media content. (Source: Inoculation Theory and Misinformation)
“Prebunk” to Build “Mental Antibodies”
“Prebunking,” a communication technique designed to help people spot and reject attempts to manipulate them through unwanted persuasion, is often key to effective inoculation against misinformation. Prebunking aims to build the cognitive tools and knowledge that help people evaluate information critically, teaching individuals to identify reliable sources of information, seek out balanced content and independent verification of claims, and recognize hidden narratives in the messaging presented to them.
Prebunking can be categorized into different types, including logic-based and fact-based. Logic-based prebunks help individuals identify tactics used to manipulate information, while fact-based prebunks correct a specific false claim or narrative. Prebunking works in tandem with debunking through “fact checks” that occur after exposure.
Logic-based prebunking, which research has shown to be particularly effective, builds on a foundation of media and information literacy and emphasizes identifying the tactics used to manipulate information. Programs designed to bolster digital and media literacy often help participants recognize unreliable sources of information, which is itself another effective prebunking method.
For example, the “Advancing Media Literacy to New Digital Arrivals” activity supported by the USAID/Indonesia Mission builds critical online consumption skills through a series of trainings, educational videos, and games called Literata that seek to empower “new digital arrivals” in Indonesia with an increased ability to identify misinformation. In one of the few studies conducted in a developing digital economy, the activity found that social media users who engaged with its materials were more likely to identify misinformation than those in a control group. Furthermore, they were 77 percent more likely to decrease online sharing in general and 66 percent more likely to decrease online sharing of misinformation headlines.
Screenshots from anti-misinformation video episodes of Literata (Source: USAID/Indonesia’s “Advancing Media Literacy to New Digital Arrivals” activity)
A recent, large-scale experiment by the technology incubator Jigsaw (a unit of Google), the University of Cambridge, and the University of Bristol found that teaching people how to spot misinformation made them more skeptical of it. The experiment exposed participants to 30- and 90-second “inoculation” videos, followed eight days later by 15-second “boosters,” that warn viewers about manipulation techniques commonly used in misinformation, such as emotionally manipulative language, incoherence, false dichotomies, scapegoating, and ad hominem attacks. In the short term, the experiment yielded statistically significant improvements in users’ ability to detect misinformation, lower willingness to share it, and lower trust in its messengers, regardless of political beliefs, demographics, and critical thinking skills. The approach is designed to be apolitical and non-accusatory, presuming that the audience has been innocently misled. Like any educational intervention, the effects decay over time without additional exposure, and as with biological vaccines, boosters are required.
Fact-based prebunking, another helpful tool for combating disinformation, focuses on addressing a specific claim or narrative. This approach is particularly useful when dealing with clearly defined claims that pose key vulnerabilities. A 2019 study of inoculation theory in the prevention of violent extremism in the United States found that exposure to an inoculation message reduced participants’ intention to support the extremist group and negatively predicted perceptions of the group’s credibility. Jigsaw has complemented its misinformation-identification approach with elements of fact-based prebunking, testing it on specific issues such as alcohol education, climate change, and health. Jigsaw seeks to make its methodology open-source to equip others to prebunk, with free “how to” resources. According to John Cook of the Climate Change Communication Research Hub in Melbourne, Australia, the best prebunks combine “fact and logic so people can understand the facts but also be able to spot attempts to distort the facts.”
This Isn’t a Game…Or Is It?
While media literacy has traditionally been taught in classroom settings, new approaches are needed to deliver these skills and capacities across entire populations. Games, and online video games in particular, offer an effective way to build these capacities at scale. Games that prebunk on social media, meeting people where they already are online, have demonstrated notable, positive impacts in a range of geographic and cultural contexts.
One interesting project is a series of three different “inoculation games,” each of which covers a different domain of misinformation: Bad News (about online “fake news”), Harmony Square (about political disinformation and intergroup polarization), and Go Viral! (“a five-minute game that helps protect you against COVID-19 misinformation”). These were created and are being studied in a collaboration between two design companies, DROG Group and Gusmanson Design, working with researchers at Cambridge University.
Screenshots from “Bad News,” a game in which players take on the role of a fake news-monger.
In Harmony Square, players learn about five commonly used manipulation techniques: trolling, using emotional language, polarizing audiences, spreading conspiracy theories, and artificially amplifying the reach of content through bots and fake likes. A randomized controlled trial found that after playing the game, people rate disinformation as significantly less reliable (by 16 percentage points), are significantly more confident in their ability to spot manipulative content, and are significantly less likely (by 11 percent) to report sharing disinformation.
Bad News has been the subject of multiple research studies, with mixed results. An earlier study found that Bad News can confer psychological resistance to common online misinformation strategies across different cultures. It provides initial evidence that people’s ability to spot and resist misinformation improves after playing Bad News, regardless of education level, age, political ideology, and cognitive style, and concludes that social impact games rooted in basic insights from social psychology can boost immunity to misinformation across a variety of cultural, linguistic, and political settings. However, a paper published in July 2022 raises questions about the effectiveness of current psychological interventions and appears to contradict the earlier research supporting the efficacy of Bad News. More research will be needed as this young field develops and game makers continue to explore games’ potential as remedies for information disorder.
Further Assistance
You can learn more about inoculation theory and other resources found in this Learning Digest by contacting the DRG Center’s Civil Society and Media (CSM) Team at ddi.drg.csm@usaid.gov, and in particular Josh Machleder (jmachleder@usaid.gov).
Recent DRG Learning Events
Open and Inclusive eGOV Systems
On December 14, Arturo Rivera, co-author of the OECD’s Good Practice Principles for Data Ethics in the Public Sector, and Kristina Mänd, Senior Expert on e-Democracy at Estonia’s e-Governance Academy, joined a DRG Center Governance (GOV) Team webinar examining how governments are increasingly using data to design and deliver policies and services, improve their operations, and better understand and respond to evolving conditions in their countries. The panelists discussed both the benefits of these practices and the risks of privacy violations and data misuse in contexts where data governance is weak.
Electoral Cybersecurity
As digital technologies are increasingly used in elections around the world, the risk of cyber attacks against election infrastructure is growing. Under the DAI Digital Frontiers Project, the DRG Center's Democratic Elections and Political Processes (DEPP) Team is supporting the International Foundation for Electoral Systems (IFES) to produce a series of publications and virtual events to help USAID personnel, the interagency, and the broader DRG community better integrate cybersecurity readiness into electoral assistance programs.
The publication series includes the Electoral Cybersecurity Primer, Electoral Cybersecurity Reference Document, and Voter Registration Cybersecurity paper, as well as the forthcoming Election Results Management Cybersecurity paper and Electoral Cybersecurity Programming Guide. To accompany these publications, on December 14 the DRG Center and IFES hosted a virtual panel discussion for U.S. Government staff only, during which leading experts in the field highlighted cybersecurity challenges and threats in voter registration and election results management, as well as approaches that international development programs can take to help address these threats.
Evidence and Learning Talk Series: “Building Social Cohesion Between Christians and Muslims Through Soccer in Post-ISIS Iraq”
Can intergroup contact build social cohesion after war? On December 15, Yale Professor Salma Mousa shared with the DRG Center the results of her experiment that randomly assigned Iraqi Christians to an all-Christian soccer team or to a team mixed with Muslim teammates. She found that the intervention did not substantially affect behaviors in other social contexts, nor did it yield consistent effects on intergroup attitudes. Her conclusion is that although contact can build tolerant behaviors toward peers within an activity, building broader social cohesion outside of it is more challenging.
Dollars and Dissent: Donor Support for Grassroots Organizing and Nonviolent Social Movements
More people than ever before are using nonviolent collective action to secure rights, justice, and democracy around the world. Scholarship shows this strategy has been twice as effective as violent action at attaining these goals. Yet, from 2011 to 2019, public charities and private foundations gave only three percent of their total human rights funding to support nonviolent collective action. On January 10, 2023, Ben Naimark-Rowse, an International Collective Action and Social Movements Expert in the DRG Center, led a Tuesday Group discussion of his recent report, Dollars and Dissent: Donor Support for Grassroots Organizing and Nonviolent Movements.
Developing Countries Are Facing a Renewed Debt Crisis: Is USAID Prepared?
Across the developing world, public debt has reached its highest level since the 1980s. Today, more than half of low-income countries are in or at high risk of debt distress, and many middle-income countries face elevated risk as well. On January 23, the DRG Center and USAID’s Chief Economist Dean Karlan welcomed Rafael Romeu, President and CEO of DevTech Systems, Inc., implementer of USAID’s Fiscal Accountability and Sustainable Trade (FAST) activity, and Gabriel Lopetegui, the principal author of USAID’s new Public Debt: A Primer for Development Practitioners, to discuss the roots of the current situation and its potential impact on government spending on education, health, and agriculture, among other sectors, as well as on the risk of civil unrest and government collapse.
Use Our Resources!
Welcome to the DRG Learning Digest, a newsletter to keep you informed of the latest learning, evaluation, and research in the Democracy, Human Rights, and Governance (DRG) sector. Views expressed in the external (non-USAID) publications linked in this Digest do not necessarily represent the views of the United States Agency for International Development or the United States Government.
Don't forget to check out our DRG Learning Menu of Services! (Link only accessible to USAID personnel.) The Menu provides information on the learning products and services the Evidence and Learning Team offers to help you fulfill your DRG learning needs. We want to help you adopt learning approaches that emphasize best fit and quality.
The Evidence and Learning Team is also excited to share our DRG Learning, Evidence, and Analysis Platform (LEAP) with you. This Platform contains an inventory of programmatic approaches, evidence gap maps, the DRG Learning Harvest, and inventories of indicators and country data portraits, all of which can be very useful in DRG activity design, implementation, evaluation, and adaptation. Some of these resources are still being built, so check back frequently to see what has been newly added.
The DRG Learning Harvest on LEAP is a searchable database of DRG learning products, including summaries of key findings and recommendations, drop-down menus to easily find documents related to a particular country or program area, and links to the full reports on the DEC.
Our friends at the Varieties of Democracy (V-Dem) Institute are also seeking to expand their research partnership with USAID on the complex nature of democracy by inviting research questions from you for V-Dem to work on. If there's a DRG technical question you've been wondering about, please email the Evidence and Learning Team at ddi.drg.elmaillist@usaid.gov.
We welcome your feedback on this newsletter and on our efforts to promote the accessibility, dissemination, and utilization of DRG evidence and research. Please visit the DRG Center's website for additional information or contact us at ddi.drg.elmaillist@usaid.gov.