A network of fake Facebook and Instagram accounts that sought to create a fictitious pro-Sikh movement in many countries was traced back to China, according to Meta's Quarterly Adversarial Threat Report published in May 2024.
In its report analysing coordinated inauthentic behaviour (CIB) across the world during the first quarter of 2024, Meta said, "This network (of accounts) originated in China and targeted the global Sikh community, including in Australia, Canada, India, New Zealand, Pakistan, the UK, and Nigeria."
What was the ultimate goal of the Chinese social media influence operation? How many social media handles made up the deceptive network? And how did the threat actors involved make use of AI-generated content?
Here are the main points from Meta's report.
Scale: Meta said that it took down 37 Facebook accounts, 13 Pages, five Groups, and nine Instagram accounts for coordinated inauthentic behaviour promoting the fictitious pro-Sikh movement 'Operation K'.
Reach: "About 2,700 accounts followed one or more of these Pages, about 1,300 accounts joined one or more of these Groups, and under 100 accounts followed one or more of these Instagram accounts," the report said.
Target: Besides Facebook and Instagram, Meta said it had detected clusters of fake accounts displaying similar activity on Telegram and X (formerly Twitter). One such cluster, with "links to an unattributed CIB network from China targeting India and the Tibet region", had resurfaced despite being taken down by the company in early 2023.
The intent: "Some of these clusters amplified one another with most of their engagement coming from their own fake accounts, likely to make this campaign [Operation K] appear more popular than it was," the report further said.
Modus operandi: The report said that the operatives took over compromised accounts and created fake ones, posing as Sikhs to post content and to manage Pages and Groups.
Type of content: "They posted primarily in English and Hindi about news and current events," Meta's report said.
On the use of artificial intelligence, however, Meta said, "We have not seen threat actors use photo-realistic AI-generated media of politicians as a broader trend at this time."
Meta defines coordinated inauthentic behaviour (CIB) as "coordinated efforts to manipulate public debate for a strategic goal, in which fake accounts are central to the operation."
In simpler words, it is when "people coordinate with one another and use fake accounts to mislead others about who they are and what they are doing."
"When we investigate and remove these operations, we focus on behaviour, not content – no matter who’s behind them, what they post or whether they’re foreign or domestic," Meta said, adding that it uses both automated and manual detection methods to remove accounts and Pages that are part of deceptive networks on its radar.
In October 2023, an investigation by The Washington Post found that the Indian Army's Chinar Corps deployed in Kashmir had operated a network of fake Facebook accounts that promoted content about problems in Pakistan.
This was one of the first reported instances of the Indian Army allegedly conducting covert operations on social media.
"Posts from different accounts came in bursts, using similar words. Often, they praised the Indian military or criticised India’s regional rivals – Pakistan and its closest ally, China," the US-based daily reported.
And what did Facebook do about it?
When the Chinar Corps' network of fake handles was removed from the platform, the report said, Facebook did not disclose the takedown in its Quarterly Adversarial Threat Report, breaking with its usual practice.
In another report in the same series, The Washington Post highlighted how the Bharatiya Janata Party (BJP) had allegedly spread divisive content through a network of WhatsApp groups in the run-up to the Karnataka Assembly elections.
Taking note of the explosive revelations in these reports, Congress chief Mallikarjun Kharge penned a letter to Meta CEO Mark Zuckerberg (as well as Google's Sundar Pichai) on the "blatant partisanship and bias towards one political formation by a private foreign company" that is "tantamount to interfering in India's democracy."