Meta oversight board slams Facebook’s special treatment of VIPs, demands content moderation overhaul

Meta’s internal oversight board criticized the company’s policies that give VIP users, including celebrities, politicians and business partners, preferential treatment on Facebook and Instagram.

If you’re a regular user of either platform, your speech is subject to the tech giant’s often-controversial content moderation policies. However, if your name is Donald Trump or Kim Kardashian, or you simply have a very large following, you have more freedom to share and say things that violate the rules.

Known as cross-checking, Facebook and Instagram’s internal program shields celebrities and other high-profile users from having their content automatically removed by the company’s algorithms.

The revelations of whistleblower Frances Haugen, who testified before Congress about the program in detail, appear to have informed the oversight board’s assessment. Haugen has said the company chooses “profits over safety.”

“The board is concerned about how Meta has prioritized commercial interests in content moderation,” the report stated. The program, it said, “provided additional protection for the speech of certain users.”

When the oversight board began its review of the cross-check program, Meta was making a staggering 100 million content enforcement attempts every day.

So even if the company were able to make such decisions with 99% accuracy, an impossible standard, it would still make a million mistakes a day.

Meta chose ‘profits over safety,’ a whistleblower testified before Congress. Above: Meta CEO Mark Zuckerberg

Among the board’s key findings, detailed in a 57-page report, is that content that breaks Meta’s own rules is often left up for more than five days when the user who posted it is a VIP.

The company run by Mark Zuckerberg currently doesn’t provide much transparency to the public about how cross-checking works.

“Meta does not currently inform users that they are on cross-check lists and does not publicly share its procedures for creating and auditing these lists,” the board wrote in a summary of its work, which began in October 2021.

“It is not clear, for example, whether entities that continually post violating content are kept on cross-check lists based on their profile.”

The board recommends publicly marking the pages and accounts of all entities receiving list-based protection from Meta in the following categories: all state actors and political candidates, all business partners, all media actors, and all other public figures included because they benefit the company.

The board also wrote that Meta’s cross-checking systems operate with a “constant backlog of cases.”

“Meta told the Board that, on average, it can take more than five days to make a decision about user content on its cross-check lists,” the report noted. “This means that, due to cross-check, content identified as violating Meta’s rules is left up on Facebook and Instagram when it is most viral and could cause harm.”

In many cases, that delay has real-world consequences.

For example, in 2019, Brazilian soccer star Neymar posted a video showing photos of a nude woman who had accused him of sexual assault.

Because of the cross-check program, the post stayed up for more than a day and received more than 100 million views before it was finally removed.

In its report, the board questioned why the athlete was not suspended, noting that the incident only came to light as a result of Haugen’s revelations.

In all, the board recommended 32 different actions and gave Meta 90 days to respond. However, since the board is advisory, the company is under no obligation to implement any of its suggestions.
