Facebook Begins ‘Fact-Checking’ Photos and Videos

The fact-checking began in France with assistance from news organisation Agence France-Presse.

David Ingram
World

Facebook CEO and co-founder Mark Zuckerberg. (Photo: iStock)

Facebook Inc said on Thursday, 29 March, it had begun "fact-checking" photos and videos to reduce the hoaxes and false news stories that have plagued the world's largest social media network.

Facebook has, for months, faced an uproar among users whose complaints range from the spread of fake news to the use of the network to manipulate elections and the harvesting of 50 million people's Facebook data by the political consultancy Cambridge Analytica.

Manipulated photos and videos are another growing problem on social media.

The fact-checking began on Wednesday in France with assistance from news organisation Agence France-Presse (AFP) and will soon expand to more countries and partners, Tessa Lyons, a product manager at Facebook, said in a briefing with reporters.

Lyons did not say what criteria Facebook or AFP would use to evaluate photos and videos, or how much a photo could be edited or doctored before it is ruled fake.

The project is part of efforts to fight false news around elections.
Tessa Lyons, Product Manager, Facebook

A representative for AFP could not immediately be reached for comment.

Shares of Facebook closed up 4.4 percent at $159.79 on Thursday after a tumultuous two weeks. The stock remained down more than 13 percent from 16 March, when Facebook disclosed the Cambridge Analytica data leak and sparked fears of stricter regulation.

Facebook has tried other ways to stem the spread of fake news. It has used third-party fact-checkers to identify false stories, and then given those stories less prominence in the Facebook News Feed when people share links to them.

In January, Chief Executive Mark Zuckerberg said Facebook would prioritise "trustworthy" news by using member surveys to identify high-quality outlets.

Samidh Chakrabarti, another Facebook product manager, said in the briefing that the company had begun to "proactively" look for election-related misinformation rather than waiting for reports from users, helping it to move more quickly.

Alex Stamos, Facebook's chief security officer, said in the briefing that the company was concerned not only about false news but also other kinds of fakery.

He said Facebook wanted to reduce "fake audiences", which he described as using "tricks" to artificially expand the perception of support for a particular message, as well as "false narratives", such as headlines and language that "exploit disagreements".

(This article has been published in an arrangement with Reuters)


