Facebook on Thursday, 16 April, announced that it had displayed warnings on about 40 million COVID-19-related posts on the platform in March alone, aided by fact-check articles published by the organisations it partners with.
In its effort to rate and review content on its platform, Facebook is working with over 60 fact-checking organisations covering more than 50 languages, a community that is still growing. The Quint is also part of this fact-checking community, which aims to stop the spread of misinformation.
So, what does Facebook do when a fact-checker rates a piece of content as false? The platform reduces the distribution of the post and adds warning labels with additional context.
According to Facebook, in 95 percent of cases users who encountered these warnings on the 40 million posts in March did not go on to view the original content. In addition, the platform removes posts spreading misinformation about COVID-19 that could lead to physical harm.
Reacting to the current situation and the flood of misinformation, Facebook has also decided to display messages in News Feed to users who have liked, reacted to, commented on or otherwise engaged with posts spreading harmful misinformation about COVID-19 that have since been removed.
Additionally, Facebook has added a section to its COVID-19 Information Center called ‘Get the Facts’, where fact-checked articles from its partners debunking misinformation about the coronavirus can be accessed.
(Not convinced of a post or information you came across online and want it verified? Send us the details on WhatsApp at 9643651818, or e-mail it to us at webqoof@thequint.com and we'll fact-check it for you. You can also read all our fact-checked stories here.)
(At The Quint, we question everything. Play an active role in shaping our journalism by becoming a member today.)