Facebook to Take Action Against Users Who ‘Repeatedly Share Misinformation’

Ian Patrick, FISM News


A blog post from Facebook details new actions the company will take against people and pages that “repeatedly share misinformation,” in the post’s own words. The actions apply to misinformation on any topic, but the post especially highlights COVID-19, vaccines, climate change, and elections.

The first action is providing “more context.” Before a user likes or follows a page, Facebook will display a prompt notifying the user that the page has shared information that fact-checkers have deemed false. The prompt will also link to Facebook’s fact-checking program.

Facebook launched its fact-checking program back in 2016, and the post proudly cites the “stronger action” the company has taken against users who share misinformation. The company said it will now start imposing new penalties on these users:

Starting today, we will reduce the distribution of all posts in News Feed from an individual’s Facebook account if they repeatedly share content that has been rated by one of our fact-checking partners. We already reduce a single post’s reach in News Feed if it has been debunked.

The tech giant also announced in the blog post that it will clarify its notifications on posts that fact-checkers have reviewed. These notifications will now include an article debunking the information.

In addition to these penalties and notifications, accounts that frequently share misinformation “may have their posts moved lower in News Feed so other people are less likely to see them.”