Meta on Thursday added a tool that allows Facebook group administrators to automatically filter content identified as false information in the run-up to the US midterm elections, which are fueling waves of misinformation on the social network. “To make content more reliable (…), group administrators can automatically suspend posts that contain information deemed false by third-party fact-checkers, so they can review them before deleting them,” explained Tom Ellison, Facebook’s director of applications, in a statement on Thursday.
The platform had already given group administrators tools to better moderate content, but it continues to be accused by many NGOs and authorities of not doing enough to combat disinformation. More than 1.8 billion people use Facebook groups every month. Parents of students, fans of artists and neighbors meet there to share news and organize events, but also to discuss politics.
Meta has come under fire for insufficient policing, which has contributed to the political radicalization of certain individuals, especially during the 2020 US election. AFP has participated in “Third-Party Fact-Checking,” a verification program developed by Facebook, in about thirty countries since 2016. About sixty media outlets around the world, general-interest or specialized, are also part of the program. If a piece of content is rated false or misleading by one of these outlets, Facebook users are less likely to see it in their News Feed. And if they do see it or try to share it, Facebook suggests reading the fact-checking article.
Source: Le Figaro
