Dispatches on Tech and Democracy: India’s 2024 Elections #5

Amber Sinha / May 20, 2024

Amber Sinha is a Tech Policy Press fellow.

This is the fifth issue of Dispatches on Tech and Democracy, examining the use of artificial intelligence (AI) and other technologies in the Indian general elections. Past issues of the dispatch have discussed the Voluntary Code of Ethics (Code) developed by the Election Commission of India (ECI), which is the primary regulatory code for political advertising on internet platforms during elections in India. This dispatch will look at the Code in more detail.

In Focus: Voluntary Code of Ethics

In March 2019, shortly before the general elections, the ECI invited several internet companies, including Google, Facebook, and ShareChat, to participate in the creation of a Voluntary Code of Ethics to ensure the integrity of the election process. The Code responded to the increasing impact social media was having on electoral processes; its stated purpose was “to identify measures[...] put in place to increase confidence in the electoral process.” The companies engaged in this process under the leadership of the Internet and Mobile Association of India (IAMAI), which presented the Code to the ECI that month. After the 2019 elections, IAMAI agreed to abide by the self-regulatory code in all future state and national elections.

A primary focus of the Code was greater transparency in paid political advertising. It brought political advertisements on social media platforms such as Facebook and Twitter under the Model Code of Conduct (MCC). The self-regulatory arrangement allows the ECI to notify platforms of violations of the Code under Section 126 of the Representation of the People Act, 1951, creating a direct line between the ECI and the platforms that enables the notification and speedy removal of violating content. Platforms must remove reported content within three hours during the two-day non-campaigning ‘silence period’ before polling and provide reports on their actions to IAMAI and the ECI. The companies also agreed to help provide information on electoral matters, run educational campaigns to raise awareness about electoral laws, and conduct platform-specific training for ECI nodal officers, who serve as liaisons between the government and the ECI.

As with newspaper and radio advertisements, the Code required parties to disclose expenditure accounts for social media advertisements. Advertisers had to submit pre-certificates issued by the ECI and the Media Certification and Monitoring Committee (MCMC) for election advertisements featuring the names of political parties or candidates in the upcoming elections. Candidates were also required to submit details of their social media accounts when filing their nominations, and both candidates and parties had to declare expenditure on social media advertising, which counted toward the overall spending limit.

Neither the ECI nor the Representation of the People Act defines the term “political advertising.” The various internet platforms were largely left to determine how they would define and govern it. Google, for instance, identifies four kinds of users whose advertising falls under the purview of political advertising: (1) political parties, (2) businesses, (3) non-profit organizations, and (4) individuals. Its criterion for classifying an advertisement as political is that the ad must feature a political party.

X, formerly known as Twitter, on the other hand, defined political advertising as ads purchased by a political party or candidate, or ads advocating for a clearly identified candidate or political party. The platform banned political ads in 2019 after facing criticism for not curbing the spread of misinformation during elections, though the ban was relaxed considerably in 2023.

Although the new ECI guidelines mandate the disclosure of expenditure on social media advertisements by political parties and candidates, they do not adequately cover ads purchased by non-members, such as “supporters” or “well-wishers” of a party, who cannot be directly linked to either the party or the candidate. Neither the MCC nor the Voluntary Code regulates this area effectively. The platforms have stepped in to fill this lacuna through their terms of use, requiring paid ads to carry disclaimers, taking down ads that should carry these disclaimers but do not, and maintaining public archives of these ads and the expenditure incurred by their purchasers.

However, these measures are still inadequate for identifying all types of political content and actors, and they are only partially enforced. The Indian Penal Code (IPC) prescribes only a minor fine of ₹500 for illegal payments, that is, election expenditure incurred on behalf of a candidate without their written authority. This fine is negligible compared to the substantial amounts spent on platform advertising.

More importantly, this self-regulatory code leaves substantial election activity on social media completely unregulated: surrogate advertising, in which ads are not funded directly by the political party or candidate but are instead paid for through networks of supporters.

Other Developments

The Free Speech Collective published a report reviewing free speech violations in the first four months of 2024, in the lead-up to the general elections. The report documents 134 instances of free speech violations, including censorship, attacks, arrests, harassment, violations of online speech, and sedition cases. For example, it details the blocking of social media accounts and web links, as well as internet shutdowns, particularly during the farmer protests in February.

Karen Rebelo of Boom Live wrote about the inefficacy of X’s crowdsourced fact-checking program, Community Notes. The feature allows ‘contributors’ to write notes for tweets they think are misleading or missing important context, and the program uses a bridging algorithm to decide whether a note is shown alongside a tweet. Unlike engagement-based ranking systems, where popular content gains the most visibility, bridging-based ranking aims to surface content that builds trust and understanding across different perspectives. It looks for agreement among users who, judging by their past ratings, have tended to disagree with one another, and it rewards notes that help opposing sides understand each other. Rebelo pointed out that in many instances the platform did not display notes on outright false tweets because they did not receive enough ‘helpful’ ratings, and she cites experts who have critiqued the fully crowdsourced and automated model for fact-checking used in Community Notes.
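To illustrate the bridging idea, here is a minimal sketch in Python. It is a hypothetical simplification, not X’s actual Community Notes scorer (the production system is considerably more complex and operates over the full rating history); the names, data, and thresholds below are invented for illustration. A note is surfaced only if at least one pair of raters who usually disagree with each other both rate it helpful.

```python
# Hypothetical sketch of bridging-based ranking, NOT X's actual algorithm.
# Idea: a note is surfaced only when raters who usually disagree with each
# other nonetheless both rate it "helpful".

from itertools import combinations

def rating_agreement(history_a, history_b):
    """Fraction of co-rated notes on which two raters gave the same rating."""
    shared = set(history_a) & set(history_b)
    if not shared:
        return None  # no overlap, so we cannot tell whether they disagree
    agree = sum(history_a[note] == history_b[note] for note in shared)
    return agree / len(shared)

def note_is_surfaced(ratings, histories, agreement_threshold=0.5):
    """Surface a note only if at least one pair of 'helpful' raters has
    historically agreed less often than the threshold (i.e. they usually disagree)."""
    helpful_raters = [r for r, value in ratings.items() if value == "helpful"]
    for a, b in combinations(helpful_raters, 2):
        agreement = rating_agreement(histories[a], histories[b])
        if agreement is not None and agreement < agreement_threshold:
            return True  # raters from opposing 'camps' both found the note helpful
    return False

# Invented example: alice and bob have disagreed on every past note, yet both
# rate this note helpful, so a bridging-based scorer would surface it.
histories = {
    "alice": {"n1": "helpful", "n2": "not_helpful", "n3": "helpful"},
    "bob":   {"n1": "not_helpful", "n2": "helpful", "n3": "not_helpful"},
    "carol": {"n1": "helpful", "n2": "not_helpful", "n3": "helpful"},
}
ratings = {"alice": "helpful", "bob": "helpful", "carol": "helpful"}
print(note_is_surfaced(ratings, histories))  # True
```

An engagement-based system would surface whichever notes attract the most ratings overall; a bridging check like the one above instead conditions visibility on cross-camp agreement, which is why a note endorsed only by one side can fail to appear even on an outright false tweet.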

Reporters Without Borders (RSF) and the Network of Women in Media, India (NWMI) have trained 60 women journalists across India in election coverage. The training focused on fighting disinformation, practicing solutions journalism, and adopting an inclusive approach to ensure diverse and reliable reporting on the current general elections. The workshop taught fact-checking techniques, including the use of artificial intelligence (AI) to identify viral fake news and deepfakes. Participants were also given technical training in processing, analyzing, and presenting quantitative information on the elections, with a focus on investigating election campaign funding.

Additional Reads

  • In The Walrus, Mitali Mukherjee writes in detail about the use of deepfakes and AI in Indian elections.
  • Srishti Jaiswal analyzes the BJP’s use of WhatsApp groups in electioneering for the Rest of World.

Please share your feedback. Are there stories or important developments that we have missed? You can write to us at contributions@techpolicy.press with your suggestions, reflections, or critiques.
