Meta has been told that its preferential treatment of high-profile users, such as former US President Donald Trump, has left dangerous content online, serving business interests at the expense of the company’s human rights obligations.
A damning report released on Tuesday by the company’s oversight board – a “Supreme Court”-style panel created by the parent company of Facebook, Instagram and WhatsApp to rule on sensitive moderation issues – urged the social media giant to make “significant” changes to its internal system for reviewing content from politicians, celebrities and business partners.
The panel, which began evaluating cases last year, provides independent judgments on high-profile moderation cases, as well as recommendations on specific policies. Meta’s policy chief, former UK Deputy Prime Minister Sir Nick Clegg, referred the system to the board for review.
The board was asked to investigate the system after the Wall Street Journal and whistleblower Frances Haugen first revealed its existence last year, raising concerns that Meta gave preferential treatment to elite figures.
Clegg also has until Jan. 7 to decide whether to allow Trump back on the platform following a separate board recommendation.
After a lengthy investigation spanning more than a year, the board has requested that Meta scrutinize who is on the so-called “cross-check” list and be more transparent about its vetting procedures.
The report is one of the most thorough investigations into Meta’s moderation practices yet, as the independent panel, made up of 20 journalists, academics and politicians, sought to address concerns that it has little power to hold the company accountable.
It puts further pressure on CEO Mark Zuckerberg to ensure Meta’s content is fairly moderated, a month after he announced plans to lay off 11,000 employees amid falling revenue and slowing growth.
Meta has already started to overhaul the system. In a Tuesday blog post, Clegg said it was originally designed to “doublecheck cases where there might be a higher risk of failure or when the potential impact of failure is particularly severe.” He added that the company has now developed a more standardized system with further controls and annual reviews.
How many people are currently on the secret list remains unclear. The Wall Street Journal, which first reported on the list, estimated there were 5.8 million users on it as of 2020. Meta has previously said the figure was 666,000 as of October 2021.
The system meant content posted by well-known figures like Trump and Elizabeth Warren would remain on the platforms until human moderators reviewed it, even if the messages would have been automatically removed had they been posted by a regular user.
This human review took an average of five days, with the content remaining on the platform during that time; in one case, a decision took more than seven months.
Meta’s “own understanding of the practical implications of the program was lacking,” the board said, adding that the company had not assessed whether the system was working as intended.
The board also accused the company of providing “inadequate” responses to its inquiries during the probe, some of which took months to answer.
One case highlighted by the board, first detailed by the Wall Street Journal, involved Brazilian footballer Neymar, who posted non-consensual intimate pictures of another person to his Facebook and Instagram accounts; the images were viewed more than 50 million times before they were removed. According to Meta, this was due to a “delay in reviewing the content due to a backlog at the time.”
Thomas Hughes, director of the oversight body, said the Neymar incident was an example of how business partnerships could impact moderation processes.
“That raises concerns. . . about the relationships between individuals in the company and whether that might influence decision-making,” he said.
“There was probably a mingling of different interests within this cross-check process,” he added.
The report follows previous public tensions between the board and Meta, after the board accused the social media company of withholding information about the system in September 2021. Many see the oversight board as an attempt to create distance between company executives and difficult decisions surrounding free speech.
Meta now has 90 days to act on the recommendations.