Facebook is introducing a new system on its platform to control the spread of misinformation. It will downrank posts shared by people and pages that repeatedly spread misinformation. Pop-up labels will warn new users before they follow a page flagged for such practices, and the labels will include a link to fact-checkers so users can verify the information.

Facebook Fighting Misinformation

Social platforms like Facebook have long been plagued by misinformation spread by users, whether intentionally or unintentionally. Courts and law enforcement agencies have repeatedly held that it is Facebook's responsibility to stop such misinformation from spreading.

This led the company to introduce new warning tools, such as pop-ups that appear whenever a user is about to follow a page flagged for spreading misinformation. Further, users who share controversial posts will now see a warning that their posts will be downranked in the News Feed, along with a fact-checking label attached to the content.

Facebook warning tools

Down-ranking unverified posts makes them less visible to the community, reducing their reach and impact. Facebook introduced a similar fact-checking system last year, after a surge of misinformation about COVID-19, the presidential election, and COVID-19 vaccines. Users who act on unverified information can end up in harmful situations, so its spread needs to be controlled.
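Facebook has not published the details of how this demotion works, but the general idea of downranking can be illustrated with a toy feed-scoring sketch. The penalty values, post fields, and function names below are assumptions made purely for illustration, not Facebook's actual ranking logic.

```python
from dataclasses import dataclass

# Hypothetical penalty for posts from accounts that repeatedly share
# content rated false by fact-checkers (value chosen for illustration).
REPEAT_SHARER_PENALTY = 0.5

@dataclass
class Post:
    base_score: float          # engagement-based relevance score
    from_repeat_sharer: bool   # author/page repeatedly shared misinformation
    fact_check_rating: str     # "false", "partly_false", or "unrated"

def rank_score(post: Post) -> float:
    """Toy example: lower a post's feed score if it comes from a repeat
    misinformation sharer or has been rated by fact-checkers."""
    score = post.base_score
    if post.from_repeat_sharer:
        score *= REPEAT_SHARER_PENALTY
    if post.fact_check_rating == "false":
        score *= 0.3   # assumed stronger demotion for rated-false content
    elif post.fact_check_rating == "partly_false":
        score *= 0.7
    return score

# Posts with lower scores end up further down the feed, i.e. less visible.
feed = [Post(10.0, False, "unrated"), Post(12.0, True, "false")]
feed.sort(key=rank_score, reverse=True)
```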

Twitter, Instagram, and other platforms have introduced similar tools to fight misinformation, adding fact-checking labels and warning pop-ups to controversial posts. Announcing the new tools, Facebook said, "Whether it's false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we're making sure fewer people see misinformation on our apps."
