The leaders of the Soviet Union understood the idea of proxy wars well during the Cold War, a conflict the United States and the Soviet Union fought indirectly on different grounds.
However, the idea has taken a new shape in modern times. When Russia invaded Ukraine, it started a direct war between the two countries.
But, Russia has also been fighting a proxy war.
A war of disinformation, a war of narratives, and a war on social media platforms.
You may ask: how is this possible? Well, here's the answer.
On 3 April, The Quint's WebQoof received a mail from a user who requested us to check several Telegram links related to the ongoing Russia-Ukraine war.
While we normally receive requests to verify claims that are viral on the internet, this was a bit different.
The mails kept coming; more Telegram groups were flagged, and more 'information' was sent to be verified.
As of 22 July this year, we had received around 317 claims that peddled misinformation against countries such as Ukraine, France, Germany, and the United States, and drove pro-Russian narratives.
These claims included fabricated news reports, and narratives that targeted a few countries.
However, we found that only 246 claims were fact-checkable; the rest (71) were shared as satire or could not be verified.
Historically, art has been used as a tool to express dissent, oppose authority, or start a movement against the political status quo. So, when we started receiving claims of political graffiti appearing in different parts of the world, it seemed like a callback to the Renaissance era. However, there was one difference: unlike in that period, these graffiti were not real.
The Quint’s WebQoof received around 39 claims of political graffiti made across the world, which targeted prominent political figures like US President Joe Biden, former UK PM Rishi Sunak, Ukrainian President Volodymyr Zelenskyy, and others.
Sample this: On 4 July, we received a mail from a ‘user’ that carried several images of graffiti and links to different Telegram channels. The artwork targeted US President Biden after his below-par performance in the CNN debate against former President Donald Trump. The location was identified as London, the United Kingdom.
However, we did not find any news reports or information that corroborated such an artwork being made against President Biden.
Team WebQoof, using the images, was able to identify the exact location as ‘10, Queen’s Head Passage’.
Considering the popularity of the location, it is unlikely that any such graffiti went unnoticed by people and media organisations.
See the street view for yourself.
Further, we ran the image through an AI detection tool named ‘TrueMedia’, which showed substantial evidence that the image was fake.
Taken together, both factors indicated that the claim was false.
Here's another one that establishes this pattern:
A day earlier on 3 July, a user had sent an email that carried several images along with a claim that said graffiti showing Zelenskyy as a Nazi was recently spotted in Munich, Germany.
When we searched the image on Google Lens, we found an article published on a Russian website that amplified the same claim.
With the help of Google Lens, we were able to find the exact place where the artwork was supposedly seen. Had such an artwork been painted outside an Apple office, it would surely have received nationwide media coverage. However, we couldn’t find any credible news reports.
Did you notice a common link?
The United States, United Kingdom, and Germany are all members of the North Atlantic Treaty Organization (NATO), which imposed sanctions on Russia after it waged war against Ukraine. This is a major reason why pro-Russian websites and channels keep pushing propaganda against these countries.
Recently, on 18 July, we received an email carrying screenshots from French news website ‘Le Monde’ which claimed that the World Anti-Doping Agency (WADA) prepared anti-doping relaxations for the Ukrainian national team.
This claim, as one can imagine, questioned the legitimacy and authenticity of the Olympic standards and targeted both France and Ukraine. The claim was also shared on a Telegram channel named ‘The Other Ukraina’.
A thorough search of Le Monde’s official website and a Google search showed that the screenshot was fabricated and that no such report was published by the news organisation.
The claim was also debunked by a website named ‘Disinfo Detector’ which clarified that no such piece was published by Le Monde.
Another example of concocted stories being made to appear credible by using popular media channels’ logos was a video carrying the BBC logo, uploaded on a Telegram channel named ‘World Wide Number’.
Now, you may ask why this is important.
The claim that the Paris Metro was emptied due to a terrorism threat directly insinuates that the country is not safe to host the Olympics. Several similar claims of terror attacks and various ‘warnings’ issued by the French police also went viral to further these narratives.
The video claiming the Paris Metro was emptied was never published by the BBC. Stop Fake, a fact-checking website, also debunked the claim, calling the video false.
Of the 317 claims we received on email, The Quint found that around 243 purported to be reports by credible news media organisations.
The claims came in different formats, such as screenshots of articles, video news reports, and others.
Among the 317 pieces of content we received for verification, Telegram links comprised 90% of the total claims.
Over one-third of the Telegram links redirected to just five Russian-language channels.
We explored the messages we received for ‘verification’ from each of these channels and found overarching themes. The most common themes, which overlapped in several instances, revolved around targeting NATO member countries and international sporting events.
These claims, which ranged from slightly misleading to completely fictitious, attempted to build several narratives.
1. Paris and the Olympics
2. Ukraine and Euro 2024
3. The USA
This operation did not target a specific country, but sent these queries to fact-checkers everywhere.
The European Fact-Checking Standards Network’s communications manager Samantha Lee told us that their partner organisations started getting these emails at the end of 2023, and continued to receive them even after the campaign was exposed.
Guillaume Kuster, co-founder of CheckFirst, a Finnish software and methodologies company, sent us their collaborative research on the same operation, which they named “Operation Overload.”
The operation dated back “at least to August 2023” and mirrored a similar Russian-led disinformation campaign on X (formerly Twitter) called “Matryoshka”. Their research identified 100 X accounts that had shared or amplified this disinformation since October 2023.
“The analysed accounts exhibit clear markers of coordinated inauthentic behaviour,” it stated, mentioning that they showed specific behavioural patterns and calling them “a tactic likely designed to minimise detection by the platform.”
Both organisations found that platforms had not succeeded in restricting the spread, or even identifying this pattern.
“A lot of content is spread via the same Telegram channels,” CORRECTIV’s Max Bernhard told us, which went in line with these reporters’ findings. “The platform doesn’t appear to conduct any content moderation, especially when it comes to pro-Russian content.”
On X, however, CORRECTIV saw “limited action” to remove accounts that were part of the campaign. Bernhard opined that X “clearly isn’t doing enough since the campaign, and similar campaigns, have only expanded in scope.”
CheckFirst’s report, created in collaboration with 20+ fact-checking organisations, mentioned that Germany’s CORRECTIV had engaged with one of the senders.
CORRECTIV received a response to its feedback on the claims, which initially expressed “respect and trust” in its work, but made the goal of the campaign clear by asking: “Is it possible for your work to be seen by as many people as possible?”
Myth Detector Georgia had initially “engaged and responded to” the senders, telling them that they would work on the disseminated posts to verify whether they were true. “I don’t recall them replying after our response,” a member of their team wrote.
Bernhard, Lee, and Myth Detector Georgia shared the opinion that the goal might be to ‘waste time’ or ‘to tie up the resources’ of fact-checking organisations.
“Another reason may be that, if media report on the campaign’s content, they may see that as a sign of the campaign’s success that they can show their possible clients,” Bernhard said, adding that he believed the campaign was “not so successful.”
“Because of fact-checkers’ crucial work against disinformation, they are often targeted by rogue actors, such as those you are referring to, who do not like to be called on their lies,” said Lee.
Kuster, however, said that CheckFirst was not “directly impacted” as they do not publish fact-checks. “However, we’ve listed (until June 4th) over 200 debunks or fact-checks published by media outlets.”
“We interpret the purpose of this operation as two-fold: target newsrooms and fact-checkers with fabricated pro-Russian fake content and try to have them fact-check content which has not been widely spread, exposing it to larger audiences through the publication of fact-checks or debunks,” Kuster told The Quint, referring to the Streisand effect, in which attempting to conceal, censor, or remove a piece of information only brings more attention to it.
Kuster’s statement aligned with our findings. For many of these purported “claims,” we struggled to find other posts sharing the disinformation on social media platforms. Debunking such a piece of disinformation would inevitably grab more attention than not publishing a fact-check at all.
How does it impact the average social media user? It might, or it might not. Bernhard believes that it is difficult to assess the impact of such campaigns. “The campaign may also give the impression to users on these platforms that there is higher support for pro-Russian narratives than there actually is.”
He stressed the need for people to be aware and to know that “malicious actors try to spread false content by for example impersonating established media such as the BBC.
X and Telegram are not a source for reliable information - readers should seek out trusted media outlets and get their information directly at the source.”
Bernhard recommended conducting “a simple Google search” whenever one comes across these claims, to look for coverage by other media outlets or fact-checks by newsrooms.
(At The Quint, we question everything. Play an active role in shaping our journalism by becoming a member today.)