
Facebook's moderation systems for "high-profile users" favour Meta

07 December 2022


No priority for free speech and civil rights

A policy that shields "high-profile" Facebook and Instagram users from moderation was not structured to prioritise free speech and civil rights, but was instead designed to protect the social notworking giant's own interests.

The oversight board, which scrutinises moderation decisions on Facebook and Instagram, said the platforms’ “cross-check” system appeared to favour “business partners”, including celebrities who generate money for the company. Journalists and civil society organisations had “less clear paths” to access the programme.

The board said that while Meta claimed cross-check was intended to advance its human rights commitments, the programme appeared more directly structured to satisfy business concerns.

It was about as transparent as Donald Trump's tax returns. 

It said cross-check grants certain users greater protection than others because content from users on the cross-check list is allowed to stay up while it is vetted by human moderators applying the “full range” of content policies. Meta described it as a “mistake-prevention strategy” that protected important users from erroneous content takedowns.

Ordinary users, by contrast, are much less likely to have their content reach reviewers who can apply the full range of Meta’s content guidelines.

The board said a user’s “celebrity or follower count” should not be the sole criterion for receiving the special protection offered by the programme. Meta admitted to the board that criteria for including “business partners” on the list included the amount of revenue they generated.

Meta told the board that it exempts some content from takedowns. The company described this system as “technical corrections” and said it carried out about 1,000 a day. The board recommended that Meta conduct audits of enforcement actions that are blocked under the system.

The board added that the technical corrections system is viewed as an “allow list” or “whitelist”. In September last year the Wall Street Journal, using documents disclosed by whistleblower Frances Haugen, reported that Brazilian footballer Neymar had responded to a rape accusation in 2019 by posting Facebook and Instagram videos defending himself, which included showing viewers his WhatsApp correspondence with his accuser. The clips from WhatsApp – also owned by Meta – included the accuser’s name and nude photos of her.

Moderators were blocked for more than a day from removing the video and the normal punishment of disabling his accounts was not implemented. Neymar’s accounts were left active after “escalating the case to leadership”. Neymar denied the rape allegation and no charges were filed against the footballer.

Citing the Neymar example, the board said that despite Meta saying it had a system for prioritising content decisions, some content still remained online for “significant periods” while this happened.

“In the Neymar case, it is difficult to understand how non-consensual intimate imagery posted on an account with more than 100 million followers would not have risen to the front of the queue for rapid, high-level review if any system of prioritisation had been in place,” said the board.

The board made 32 recommendations. These included removing special protection for commercially important accounts that frequently break content rules; prioritising moderation of posts that are important for human rights; and removing or hiding “high severity” violating content from cross-check users while reviews are under way.

Meta’s president of global affairs and former UK deputy prime minister, Nick Clegg, said the company would respond to the board’s recommendations within 90 days in order to “fully address” them.

 
