Meta’s Oversight Board suggests changes to Facebook’s “cross-check” content moderation system

Ian Patrick, FISM News

Meta’s Oversight Board issued recommendations for Facebook’s system of content moderation and removal, saying it affords certain high-profile users more protection than ordinary users and that the program itself is opaque and poorly defined.

The so-called “cross-check program” works by implementing “additional layers of human review for certain posts initially identified as breaking [Facebook’s] rules,” according to the board’s report.

This would be a daunting task given the number of Facebook users and the volume of posts made every day. The board reported that Meta “was performing about 100 million enforcement attempts on content every day.”

“At this volume, even if Meta were able to make content decisions with 99% accuracy, it would still make one million mistakes a day,” the board acknowledged. “In this respect, while a content review system should treat all users fairly, the cross-check program responds to broader challenges in moderating immense volumes of content.”

However, in light of Meta’s commitment to human rights and other values, the board says “cross-check is flawed in key areas which the company must address.”

First and foremost, the cross-check system grants “greater protection” to certain users than to “ordinary users.” While human review of a flagged post from one of these protected users is underway, the post remains live on the platform.

Ordinary users do not receive the same treatment: their posts are less likely to get a human review, and therefore face a greater chance of being removed.

Meanwhile, the board says that for Meta, “it can take more than five days to reach a decision on content from users on its cross-check lists.”

“This means that, because of cross-check, content identified as breaking Meta’s rules is left up on Facebook and Instagram when it is most viral and could cause harm,” the report added.

The board also noted that the metrics Meta uses to track cross-check are not wholly reliable, and that the company has shared only “limited information” about the program with the general public.

The board says Meta should “improve its content moderation for all users” and “prioritize expression that is important for human rights.” It also suggests removing or hiding posts that have been identified as violating Meta’s rules while they are under review, and increasing the transparency of the cross-check system.
