
Meta Has a Strategy To ‘Protect’ India’s State Elections, but Will It Work?

Meta has used a version of this strategy since 2018 for major elections across the world.

Edited By: Tejas Harad

Meta, in a statement on Thursday, 10 February, said that it is "prepared to protect the upcoming state elections in India."

"With the upcoming elections in Uttar Pradesh, Punjab, Uttarakhand, Goa and Manipur starting February 10, we are sharing an update on how Meta is prepared to protect people and our platform during this period."

It said it has a comprehensive strategy to limit hate speech, content that incites violence, and misinformation; to make political advertising more transparent; and to help people make their voices heard through voting.

Given Meta's history, will this strategy work? Here's a breakdown.


What Is Facebook’s Strategy?

Nothing new: Meta has used a version of this strategy since 2018 for major elections across the world.

Elections Operations Centre

Meta said it will be activating its Elections Operations Centre to watch out for "potential abuses that could emerge across the platform related to these elections. That way we can respond to them in real time."

"It brings together subject matter experts from across the company to give us more visibility of emerging threats. That way we can respond quickly before they become larger," it said.

For WhatsApp

Meta is relying on its spam detection technology to ban accounts engaging in automated or bulk messaging. “We banned over 2 million accounts in the month of December 2021 alone,” it said.

“Ahead of all elections, we train political parties about the responsible use of WhatsApp and party-workers are cautioned about the possibility of their accounts getting banned if they send WhatsApp messages to people without prior user-consent.”

Meta says it will also run awareness campaigns.

Hate Speech

The tech giant says that it has invested more than $13 billion in teams and technology to limit hate speech and harmful content, allowing it to triple the size of the global team working on safety and security to over 40,000. For India, Meta has reviewers in 20 Indian languages.

Misinformation

Meta partners with 10 independent fact checkers in India (including The Quint) to review content in 11 Indian languages. Whenever they rate a piece of content as false, Meta says it “significantly reduces” its distribution, adds a label, and notifies people who have interacted with the post.

Political Ads

Meta will require “Paid for by” disclaimers for ads about elections, politics, and “important” social issues. Advertisers on Facebook and Instagram will have to be authorised, and their names will be publicly visible.

Civic Engagement

Meta says it will remind users to activate two-factor authentication to protect their accounts against online threats. It will also send people election day reminders with “accurate information” and ask them to share it with Facebook friends.


Meta Reportedly Lobbied the EC

Internal Meta documents leaked by whistle-blower Frances Haugen indicated that just before the 2019 general elections in India, the company managed to convince the Election Commission of India to settle for a voluntary code of ethics and abandon its original plan of introducing strict social media regulations, Hindustan Times reported.

These documents came to be known as the Facebook Papers.

Meta, multiple sources confirmed to the publication, aggressively pushed back against the EC’s original plan to require it to disable ads during the silence period, that is, 48 hours before polling.

Under the voluntary code, Meta created a high-priority channel to receive content-related escalations and to restrict content that violates local law after receiving valid legal orders. This code is applicable for these elections as well.

This shifts the onus of flagging content to the Election Commission, while Meta merely carries out its instructions, avoiding additional legal obligations.

Notably, Meta disabled fresh political ads on its platforms for a seven-day period before the 2020 US presidential election.

Apart from this, the documents revealed a 2019 Facebook study in which the company created a fake account based in India and tracked what type of content the account was shown and interacted with, The New York Times reported.

The study reportedly showed that within three weeks, the fake account's newsfeed was filled with pornography, polarising and graphic content, hate speech, and misinformation.


Facebook’s Failures in the US

Meta (then Facebook) came under the scanner after the 2016 US presidential election, which Donald Trump won, following charges that fake news and misinformation on its platforms had influenced the outcome.

In an attempt to avoid a repeat in 2020, Facebook introduced a host of measures to curb the spread of fake news and disinformation on its platforms, ranging from encouraging voter participation and rooting out false information to banning political advertisements and introducing new labelling systems.

Experts at the time were worried about these changes being “potentially too little and certainly too late.” However, these are largely the tools that Meta continues to employ.

A report by the online advocacy group Avaaz found that had Facebook tweaked its algorithms earlier, the company could have prevented an estimated 10.1 billion views on the 100 most prominent pages that repeatedly shared misinformation on the platform ahead of the election.

After the election, Facebook rolled back many of the emergency policies, returning to the algorithmic status quo that allowed conspiracy movements like QAnon and Stop the Steal to flourish, the report said.

(At The Quint, we question everything. Play an active role in shaping our journalism by becoming a member today.)
