POLARCHATS | Polarizing Chats? Political Misinformation on Discussion Apps in India and Brazil.

Summary
In many developing countries, political actors make extensive use of discussion apps such as WhatsApp to spread misinformation. This political use of discussion apps has been hypothesized to generate hatred towards targeted social groups, to bolster support for non-democratic tactics and values, and eventually to lead to the election of politicians who rely on this tactic.

In spite of alarming reports, little systematic knowledge exists about political actors’ reliance on discussion apps or about its effects. Since 2016, social media misinformation has attracted considerable attention in academia. However, this literature has almost entirely overlooked discussion apps. This lack of scholarship is puzzling, insofar as discussion apps may constitute a greater menace than other services. Two distinctive technical features of WhatsApp, encryption and the pre-existing ties required between group members, arguably make these apps efficient vehicles for political misinformation. Drawing on this intuition, I hypothesize that WhatsApp constitutes a preferential channel for the circulation of political misinformation, and that features of WhatsApp communities make it more likely that misinformation will be believed, less likely that it will be corrected, and more likely that it will affect users’ downstream attitudes.

The project relies on qualitative and experimental methods to test hypotheses derived from this intuition in two large democracies in which political actors are widely hypothesized to use WhatsApp: India and Brazil. First, relying on in-depth interviews, the project examines how and why party actors embrace the technology. Second, it relies on crowdsourcing to quantify misinformation on WhatsApp. Third, it develops two types of experimental designs to examine the determinants of individual-level belief in misinformation. Finally, it relies on the same designs to analyze the effect of misinformation on the aforementioned attitudes.
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/101002985
Start date: 01-01-2022
End date: 31-12-2026
Total budget - Public funding: 2 000 000,00 Euro - 2 000 000,00 Euro
Cordis data

Status: SIGNED
Call topic: ERC-2020-COG
Update date: 27-04-2024
Structured mapping
Horizon 2020
H2020-EU.1. EXCELLENT SCIENCE
H2020-EU.1.1. EXCELLENT SCIENCE - European Research Council (ERC)
ERC-2020
ERC-2020-COG ERC CONSOLIDATOR GRANTS