Dispatches on Tech and Democracy: India’s 2024 Elections #2

Amber Sinha / Apr 18, 2024

Amber Sinha is a Tech Policy Press fellow.

This is the second issue of Dispatches on Tech and Democracy, examining the use of artificial intelligence (AI) and other technologies in the Indian general elections and their implications for critical tech policy discussions and debates. The general elections will begin at the end of this week and conclude on June 4th. State assembly elections in Andhra Pradesh, Arunachal Pradesh, Odisha, and Sikkim, along with elections in 16 other states, will be held simultaneously with the general election.

In Focus: YouTube’s lax treatment of election disinformation in India

In the last dispatch, I briefly mentioned a new report by Access Now and Global Witness. The organizations recently conducted an empirical study to test YouTube’s review of advertising content in the context of election disinformation in India. They released their findings in a report, “Votes will not be counted”: Indian election disinformation ads and YouTube, named after a particularly egregious example of disinformation used in the study.

The study’s methodology was straightforward. Access Now and Global Witness prepared and submitted 48 advertisements in English, Hindi, and Telugu, three of the most widely spoken languages in India. The advertisements contained content that directly violated YouTube’s advertising and election misinformation policies. Yet, according to the report, all 48 advertisements passed through YouTube’s review process and would have been published had the research team not withdrawn them prior to publication.

Related Reading: Dispatches on Tech and Democracy: India’s 2024 Elections - #1

The research team at Access Now and Global Witness first prepared “16 short pieces of election disinformation content drawing on local events and existing disinformation. More than half of these contained language specific to India, while the remainder had language generic for most national elections.” They then translated the pieces from English into two Indian languages, Hindi and Telugu. The team created videos for all 48 pieces and submitted them to YouTube as ad creatives in February and March 2024. They allowed one day of review for each ad, and YouTube’s system appeared to approve all of them for publication. Once the ads were approved, the research team withdrew them to ensure they were never published on the platform.

The kind of content created by the research teams included “voter suppression through false information on changes to the voting age, instructions to vote by text message, and incitement to prevent certain groups from voting.” In a statement provided by Google and included in the report, the company’s primary defense was that “none of these ads ever ran” on its platform and that its “remaining enforcement reviews” were pending. The company went on to claim, “Our enforcement process has multiple layers to ensure ads comply with our policies, and just because an ad passes an initial technical check does not mean it won’t be blocked or removed by our enforcement systems if it violates our policies.”

There seem to be two separate claims here. The first implies that additional review steps were pending and that Google would not ultimately have published the ads. However, Google provides no details of these ‘additional review steps,’ nor is that information available to users when they create ads on YouTube’s platform. The second claim appears to be that even if YouTube approved the ads for publication, they could still be blocked or removed later. While valid, this claim is of limited relevance to the report’s primary argument: that ads in egregious violation of YouTube’s policies appear to have been approved for publication. The existence of processes to remove ads after publication does not mitigate any harm done while they remain visible to users.

Other Developments

The London Story, along with Ekō and Indian Civil Watch International, published a report focusing on heavy spending by shadow advertising networks on Facebook to promote hate speech, communal violence, and disinformation in India. Prior to publication, the NGOs and 38 Indian diaspora groups had written to Meta urging the company to adopt a 10-point plan, including observing an election silence period and banning shadow advertisers.

In another interesting development, YouTube placed monetization limits on videos about Electronic Voting Machines (EVMs) and Voter Verified Paper Audit Trail (VVPAT) machines from two creators: Meghnad, a public policy commentator and former LAMP fellow, and Sohit Mishra, an independent journalist. Mishra informed The Indian Express that four of his videos discussing EVMs had their monetization restricted. Only one video had its monetization reinstated after Mishra requested a review. Meghnad also requested a review of his videos but received no response until the press outlet reported on the issue.

This development follows a recent decision by YouTube to add additional context under videos about the efficacy of EVMs. In this case, the company placed limits on videos it deemed to have violated its advertiser guidelines, which prohibit promoting demonstrably false information about public voting procedures, political candidate eligibility based on age or birthplace, election results, and census participation that contradicts official government records. I will continue to follow this story to better understand how YouTube, one of the most important platforms in the context of the upcoming elections in India, is handling videos on EVMs and the impact of its new moderation standards.

The Lancet also published a short article on the importance of data in the context of the elections, lamenting the poor state of health data in India. The article refers not to the problems in data collection that have long been endemic in India, but to delays in collecting or publishing essential data sets that matter for public health. Among the examples it cites is the decennial census of 2021, which was delayed for the first time in 150 years. In addition, the Sample Registration System survey report for 2021, India’s most reliable source of data on births and deaths, and the periodic measurement of morbidity and out-of-pocket expenditure by the National Sample Survey Organization are both overdue.

Finally, Reporters Without Borders recently asked political parties participating in India’s upcoming general elections to pledge their commitment to ten specific actions aimed at protecting press freedom, in light of its notable decline in India. These included ensuring access to accurate news and information and safeguarding journalists. The demands range from the immediate release of journalists detained by the government, to the overhaul of laws used to threaten media organizations and journalists, to independent inquiries into the use of spyware against journalists and into measures that hinder foreign media’s right to cover India.

Additional Reads

  • Last month, The Ken published a story on the rise of startups providing generative AI services to political campaigns.
  • In January, Srishti Jaswal covered the BJP’s data collection app, Saral, in detail for Rest of World.
  • The Eurasian Times reported on the threats of Chinese intervention in Indian elections.

Please share your feedback. Are there stories or important developments that we have missed? You can write to us at contributions@techpolicy.press with your suggestions, reflections, or critiques.
