An appeal is received by Facebook’s Supreme Court every 24 seconds.
Facebook’s “Supreme Court,” formally known as the Oversight Board, was established in 2020 as an independent body that reviews and makes binding decisions on content moderation disputes on the platform. The board comprises 20 members from a range of countries and backgrounds, including legal experts, human rights advocates, journalists, and former politicians.
Since its inception, the Oversight Board has received a staggering number of appeals from users around the world. According to the board’s latest report, an appeal arrives roughly every 24 seconds, which works out to about 3,600 appeals per day. That volume is expected to grow as more users become aware of the board’s existence and its role in content moderation on Facebook.
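For readers who want to check that figure, the conversion is simple arithmetic. The short sketch below is illustrative only (the constants are the headline rate from this article, not values taken from the board’s report) and just turns the per-second rate into a daily count.

```python
# Illustrative sketch: convert "one appeal every 24 seconds" into a daily total.
SECONDS_PER_DAY = 24 * 60 * 60   # 86,400 seconds in a day
SECONDS_PER_APPEAL = 24          # headline rate: one appeal roughly every 24 seconds

appeals_per_day = SECONDS_PER_DAY / SECONDS_PER_APPEAL
print(f"Approximate appeals per day: {appeals_per_day:.0f}")  # -> 3600
```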
The appeals received by the Oversight Board cover a wide range of issues, including hate speech, misinformation, nudity, and violence. Users who feel that their content has been unfairly removed or that they have been wrongly penalized can submit an appeal to the board. The board then reviews the case and makes a decision based on Facebook’s community standards and international human rights law.
The process of reviewing an appeal can take up to 90 days, during which the board may seek input from experts, civil society organizations, and other stakeholders. Once a decision is made, it is binding on Facebook for the content in question, and the company must act on it within a specified timeframe; the board’s broader policy recommendations are advisory, though Facebook has committed to responding to them publicly.
The Oversight Board’s decisions have been closely watched by policymakers, civil society organizations, and the media, as they have the potential to set precedents for content moderation on social media platforms. Some of the board’s most high-profile cases include the decision to uphold Facebook’s suspension of former US President Donald Trump, while faulting the indefinite nature of the penalty, and the ruling that Facebook must reinstate a breast cancer awareness post that had been removed under the platform’s nudity policy.
While the Oversight Board has been praised for its transparency and independence, some critics argue that it is not a substitute for stronger regulation of social media platforms. They argue that the board’s decisions are limited to individual cases and do not address broader issues such as the spread of disinformation and the impact of social media on democracy.
In response to these criticisms, the Oversight Board has emphasized that its role is to provide accountability and transparency in content moderation on Facebook, rather than to replace regulation. The board has also called on Facebook and other social media platforms to be more transparent about their content moderation policies and to engage in more meaningful dialogue with stakeholders.
In conclusion, the fact that an appeal is received by Facebook’s Supreme Court every 24 seconds highlights the scale of the content moderation challenge facing social media platforms. While the Oversight Board has made significant strides in providing accountability and transparency in content moderation on Facebook, it is clear that more needs to be done to address the broader issues of disinformation, hate speech, and the impact of social media on democracy.