Facebook will proactively shut down ‘fake’ groups and pages, even if they have not individually been found in violation of its Community Standards, the company has announced.
In a blog post on Wednesday, the social media giant said when a page or group is removed for violating policies, “we may now also remove other Pages and Groups even if that specific Page or Group has not met the threshold to be unpublished on its own”.
Facebook also listed other steps to handle Page content that goes against its policies.
“People who manage a Page will see a new tab that shows when we remove certain content that goes against our Community Standards and when we reduce the distribution of posts that have been rated false by a third-party fact-checker,” said Facebook.
The move is designed to keep page managers from skirting Facebook bans by using pages they already manage to re-post the content Facebook removed from their shuttered pages and groups. Facebook’s crackdown on misinformation networks comes one week after it announced the removal of 364 pages that originated in Russia and were engaged in “coordinated, inauthentic behavior.”
The tab includes two sections: content Facebook recently removed for violating a subset of its Community Standards and content recently rated “False,” “Mixture” or “False Headline” by third-party fact-checkers.
“To start, we’re including content removed for policies like hate speech, graphic violence, harassment and bullying, and regulated goods, nudity or sexual activity, and support or praise of people and events that are not allowed to be on Facebook,” said the post.
Facebook rules don’t allow people to recreate pages or groups that look similar to ones that have been banned; however, bad actors have managed to stay one step ahead by converting other pages in their network into vehicles for fake news and other content that violates Facebook’s Community Standards.
“We hope this will give people the information they need to police bad behaviour from fellow Page managers, better understand our Community Standards, and let us know if we’ve made an incorrect decision on content they posted,” it added.