
Fake Accounts Calling for 'Pro-Sikh Protests' Were Created in China, Meta Finds

The network of fake accounts linked to China shared poster images that were likely AI-generated.


A network of fake Facebook and Instagram accounts that sought to create a fictitious pro-Sikh movement in many countries was traced back to China, according to Meta's Quarterly Adversarial Threat Report published in May 2024.

In its report analysing coordinated inauthentic behaviour (CIB) across the world during the first quarter of 2024, Meta said, "This network (of accounts) originated in China and targeted the global Sikh community, including in Australia, Canada, India, New Zealand, Pakistan, the UK, and Nigeria."

"They appeared to have created a fictitious activist movement called Operation K which called for pro-Sikh protests, including in New Zealand and Australia. We found and removed this activity early, before it was able to build an audience among authentic communities," Meta said in its report.

What was the ultimate goal of the Chinese social media influence operation? How many social media handles made up the deceptive network? And how did the threat actors involved make use of AI-generated content?

Here are the main points from Meta's report.


'Operation K': China-Based Influence Op Targeting Sikhs

Scale: Meta said that it took down 37 Facebook accounts, 13 Pages, five Groups, and nine Instagram accounts for displaying coordinated inauthentic behaviour promoting 'Operation K'.

Reach: "About 2,700 accounts followed one or more of these Pages, about 1,300 accounts joined one or more of these Groups, and under 100 accounts followed one or more of these Instagram accounts," the report said.

Other platforms: Besides Facebook and Instagram, Meta said it had detected clusters of fake accounts displaying similar activity on Telegram and X (formerly Twitter). One such cluster, with "links to an unattributed CIB network from China targeting India and the Tibet region", was found to have sprouted up again despite being taken down by the big tech company in early 2023.

The intent: "Some of these clusters amplified one another with most of their engagement coming from their own fake accounts, likely to make this campaign [Operation K] appear more popular than it was," the report further said.

Modus operandi: According to the report, the operatives took over compromised accounts and created fake ones to pose as Sikhs, posting content and managing Pages and Groups.

Type of content: "They posted primarily in English and Hindi about news and current events," Meta's report said.

This content included "images likely manipulated by photo editing tools or generated by AI, in addition to posts about floods in the Punjab region, the Sikh community worldwide, the Khalistan independence movement, the assassination of Hardeep Singh Nijjar, a pro-Khalistan independence activist in Canada, and criticism of the Indian government," the report added.

However, Meta also said, "We have not seen threat actors use photo-realistic AI-generated media of politicians as a broader trend at this time."

When Does Meta Flag Something as 'Coordinated Inauthentic Behaviour'?

Meta defines coordinated inauthentic behaviour (CIB) as "coordinated efforts to manipulate public debate for a strategic goal, in which fake accounts are central to the operation."

In simpler words, it is when "people coordinate with one another and use fake accounts to mislead others about who they are and what they are doing."

"When we investigate and remove these operations, we focus on behaviour, not content – no matter who’s behind them, what they post or whether they’re foreign or domestic," Meta said, adding that it uses both automated and manual detection methods to remove accounts and Pages that are part of deceptive networks on its radar.

What About the Fake Account Networks Allowed To Stay Up?

In October 2023, an investigation by The Washington Post found that the Indian Army's Chinar Corps deployed in Kashmir had operated a network of fake Facebook accounts that promoted content about problems in Pakistan.

This was one of the first reported instances of the Indian Army conducting covert influence operations on social media.

"Posts from different accounts came in bursts, using similar words. Often, they praised the Indian military or criticised India’s regional rivals – Pakistan and its closest ally, China," the US-based daily reported.

And what did Facebook do about it?

The report said that the big tech company had discovered the deceptive network operated by the Indian Army but held off on taking it down immediately over fears of "antagonising the government."

When the Chinar Corps' network of fake handles was eventually removed from the platform, the report said, Facebook did not disclose the takedown in its Quarterly Adversarial Threat Report, breaking with its usual practice.

In another report in the same series, The Washington Post highlighted how the Bharatiya Janata Party (BJP) had allegedly spread divisive content through a network of WhatsApp groups in the run-up to the Karnataka Assembly elections.

Taking note of the explosive revelations in these reports, Congress chief Mallikarjun Kharge wrote a letter to Meta CEO Mark Zuckerberg (as well as Google CEO Sundar Pichai), alleging "blatant partisanship and bias towards one political formation by a private foreign company" that is "tantamount to interfering in India's democracy."

