India in Transition

India’s Misinformation Crisis: What Role Do BJP WhatsApp Groups Really Play?

Simon Chauchard
July 19, 2021

Building on already worrying dynamics, misinformation has abounded on Indian social media since the beginning of the COVID-19 pandemic. Claims that the virus was deliberately spread, or that minority groups are conspiring to accelerate its spread, have been common. Recommendations for miracle cures have also dominated online spaces, with fact-checkers debunking an unprecedented number of false remedies.

The Indian version of this crisis differs from the misinformation crises other countries face. This is due, in part, to contextual factors, with the country having comparatively lower rates of literacy and digital literacy. But it is likely also due to both technological and partisan differences: much misinformation in India—including misinformation about COVID-19—is disseminated through WhatsApp groups organized by the ruling Bharatiya Janata Party (BJP), a party with a clear organizational edge in the digital world.

This peculiar mode of dissemination raises additional questions. First, does most of the misinformation circulating on Indian WhatsApp originate from these partisan threads? Second, what share of the content on these partisan threads arguably counts as misinformation? 

Answers to these questions pave the way for a more causal, and arguably more meaningful, question: what are the political effects of exposure to the misinformation disseminated on these threads? Political misinformation circulating on WhatsApp has previously been linked to collective violence and electoral swings. COVID-19 misinformation may also have offline effects: rumors accusing groups of deliberately spreading the virus may trigger collective violence, and claims about miracle cures may lead individuals to disregard legitimate scientific guidelines.

Social scientists currently lack adequate data to provide systematic, evidence-based answers to each of these questions, and hence, to separate reality from hyperbolic discourses about the putative impact of these groups. A number of theoretical inferences can nonetheless be made.

First, what is the relative importance of BJP groups in the overall production and dissemination of misinformation (and specifically, COVID-related misinformation) in India? A proper answer would require an extensive analysis of the diversity of groups (partisan and non-partisan) Indians are exposed to through their phones. While this is extremely hard to achieve in light of the private nature of these groups, existing information allows for an educated guess. Partisan threads do contain some misinformation and, as such, constitute a real danger. It, however, seems extremely unlikely that most of the misinformation Indians are exposed to through WhatsApp originates from—or even simply transits through—BJP groups. Most of us are exposed to large amounts of misinformation through other channels, such as family groups. Even if some of this misinformation seems steeped in BJP-compatible beliefs, most of it is not. And in India, as elsewhere, humans do not need the intervention of a party to believe and share misinformation. Religiosity, for instance, may in and of itself be a powerful correlate of belief in misinformation: in the case of COVID-19 misinformation, Sumitra Badrinathan and I have found, in our ongoing research, that levels of religiosity are much better predictors of belief in misinformation than levels of BJP support. In that sense, it is clear that COVID-19 misinformation in India, even if it is frequently amplified by BJP actors, cannot entirely be blamed on partisan threads.

Second, to what extent is misinformation common on these threads? My expectation on this point also runs counter to some of the most hyperbolic claims made about them. Simply put, while they contain some misinformation, including COVID-related misinformation, it is probably inaccurate to say that BJP WhatsApp threads are “packed” with such content; in proportional terms, BJP threads likely contain relatively few misinformed posts (at most a few percent of the overall content). This is simply because these threads also contain disproportionate amounts of other content: “legitimate” partisan propaganda on a diverse range of issues and various other types of partisan content (party workers’ selfies and birthday wishes to political actors) mainly geared toward the mobilization of already loyal supporters and members. In turn, all of this partisan content floats in a bath of entirely non-political content, much of which consists of salutations and wishes relying on religious iconography. But a large share of the content is also neither partisan nor religious, and more easily classifiable as timepass or entertainment. It is not rare to find jokes, songs, local news, and even ads on BJP groups. None of this should surprise us in light of the relatively horizontal architecture of WhatsApp, where “admins” are, despite their best efforts, not very powerful and cannot easily prevent content from being posted. Thus, whatever misinformation appears on these threads appears amid far larger amounts of other content, most of which may be uninteresting to most users but is arguably not misinformation.

The messy nature of the content disseminated through these groups informs the aforementioned causal question: what effects does exposure to this misinformation really have? Namely, it should lead us to wonder whether misinformation circulated through BJP threads—and more generally, through WhatsApp communities—might be dangerous not because these threads unleash floods of misinformation on users, but precisely because such content is interspersed among seemingly innocuous content. If this “other” content helps users develop a sense of community and trust, the effect of the misinformation may be heightened. The influence and power of closed discussion groups may, in other words, lie less in their ability to distribute large quantities of misinformation at a very low cost than in their ability to expose patiently constructed—and trusting—digital communities to selective misinformation.

All of this suggests that misinformation—if it affects behavior—would do so not because it is frequent and common, but more credibly, because it is rare and presented in a context that elicits trust. Should we, however, necessarily expect misinformation delivered in this way to change behaviors? Concretely, how often does reading about a miracle cure for COVID-19 on a group chat actually lead readers to try said cure and/or discard all other health measures?

Here again, it is important to remind readers that we currently possess no credible evidence establishing this potentially tragic causal relationship. Until we have such evidence, this will remain an open question requiring additional research. Even if we were able to scientifically measure the effects of exposure to health-related misinformation, it is unclear whether we would detect such effects. In general, the influence of WhatsApp is likely lower than several actors would have us believe. As a matter of fact, a narrative of all-powerful BJP groups changing minds almost perfectly serves the interests…of the party itself. Party leaders, with Amit Shah leading the pack, have repeatedly boasted of the “digital armies” or “WhatsApp machines” they have created, which may be read in the context of a dominant party eager to project an image of invincibility and discourage opposition.

These networks have grown to surpass those of opposition parties, but they may not constitute the silver bullet some BJP executives would have us believe. Simply put, that notion would go against what over 70 years of research in political psychology, political communication, and advertising generally tells us: that persuasion is far from an automatic process. And for the record, research on similar questions in the US context has, over the past few years, consistently concluded that the effects of online misinformation were largely overblown, in part because the people exposed to this type of social media content were already believers, and in part because fewer people than expected properly processed this content.

It is credible to think that similar dynamics would apply to India and to COVID-19 misinformation. For a variety of reasons, exposure to this content through social media likely changes the behavior of only a tiny group of individuals: much of this misinformation is never fully processed or taken seriously; it is lost in a flow of other information; or, more problematically, those who are prone to believe it have already been exposed to it in any number of ways. In that sense, India’s real challenge may not lie as much in the peculiar role that BJP WhatsApp groups play in the exchange of COVID-19 misinformation as in a much more problematic fact: that the party’s leaders have, for years, loudly promoted scientific misinformation on WhatsApp and everywhere else.

Simon Chauchard is an Assistant Professor of Political Science at Leiden University.


India in Transition (IiT) is published by the Center for the Advanced Study of India (CASI) of the University of Pennsylvania. All viewpoints, positions, and conclusions expressed in IiT are solely those of the author(s) and not specifically those of CASI.

© 2021 Center for the Advanced Study of India and the Trustees of the University of Pennsylvania. All rights reserved.