Oversight Board’s annual report shows how often Meta follows its advice

The advisory group that reviews Facebook and Instagram’s content moderation decisions released its first annual report Wednesday, capping its first year of operation.

The Oversight Board received more than a million appeals from Facebook and Instagram users in 2021. Most of those appeals asked the Board to restore content that Meta’s apps had removed for violating rules against hate speech, violence and bullying. The board issued decisions and explanations in 20 cases it describes as “significant,” and in 70% of the cases it reviewed, it reversed Meta’s original decision.

“There was clearly huge pent-up demand among Facebook and Instagram users for a means to appeal content moderation decisions made by Meta, to an organization independent of Meta,” the board wrote in the report.

The Oversight Board’s most significant decision to date concerned the potential reinstatement of former President Donald Trump, who was removed from Facebook after encouraging the insurrection at the US Capitol. The board responded by asking Meta to clarify the rules it had used to remove the former president from the platform in the first place. “In applying this penalty, Facebook did not follow a clear, published procedure,” the board wrote at the time, adding that Facebook had no rule covering “indefinite” suspensions like the one issued to Trump.

Beyond its case decisions, which set a kind of precedent for future policy enforcement, the board also makes broader recommendations to Meta about how the company should approach particular aspects of content moderation and what rules it should put in place.

In lower-profile cases, the board recommended that Meta tighten Facebook and Instagram’s rules against doxxing, asked the company to release a transparency report detailing how well it enforced COVID-19 rules, and asked it to prioritize fact-checking for governments that share misleading health information through official channels.

The Oversight Board made 86 policy recommendations in its first year. Meta has implemented some of the board’s suggestions for better moderation transparency, including giving users more information when they violate the platform’s hate speech rules and telling them whether AI or human review led to an enforcement decision, while ignoring others. Those outcomes are tracked in the annual report, which sheds light on how much impact the group really has and how often Meta implements or disregards its recommendations.

The Oversight Board reviews content moderation cases from around the world, sometimes sorting out linguistic and cultural nuances that Meta itself failed to incorporate into its moderation decisions, automated or otherwise. Facebook whistleblower Frances Haugen has repeatedly sounded the alarm about the company’s inability to adequately monitor its social platforms in non-English-speaking markets. According to the report, half of the Oversight Board’s decisions concerned countries in the Global South, including cases from Latin America and Africa.

Initially, the board only reviewed cases in which users asked for content to be restored on Instagram and Facebook, but a few months later it expanded to review cases requesting that content be removed. Still, the Oversight Board’s purview is limited to questions about individual posts, not the many other features people use on Instagram and Facebook.

The board writes that it wants to expand its remit to advise Meta on moderation affecting accounts and groups on its platforms, not just individual posts. The board is currently “in dialogue” with the company, which still has the final say over what the semi-independent advisory group can actually do.