
Conducting Content Analysis on Oversight Board Cases on the Meta Transparency Center Website

Updated: Sep 20

The Meta Transparency Center website publishes cases reviewed by the Oversight Board, referred either by Meta directly or by users of Facebook, Instagram, or Threads who disagree with Meta's initial content moderation decisions. The website gives users of these platforms a rare opportunity to understand how Meta moderates content. It also offers researchers a close-up view of how Meta's platforms handled particular pieces of controversial content and the reasoning behind those moderation decisions.



This research conducted a content analysis of the cases that the Oversight Board selected for review between January and September 2024, as listed on the Meta Transparency Center website. More of the selected cases involved content from North America, Asia Pacific, and the Middle East than from other parts of the world. More than half of the cases that had been removed in Meta's initial content moderation were reinstated through the Oversight Board's review. In addition, around one third of the cases that had been left up after the initial moderation process were removed after the Oversight Board reviewed them.
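To illustrate the kind of tallying this analysis involves, the sketch below counts reviewed cases by region and computes the share of removals that were reinstated. The field names and sample records are hypothetical stand-ins, not the actual coded dataset from the Transparency Center.

```python
from collections import Counter

# Hypothetical coding of reviewed cases: region, Meta's initial decision,
# and the Oversight Board's final decision. Illustrative records only.
cases = [
    {"region": "North America", "initial": "removed", "final": "reinstated"},
    {"region": "Asia Pacific", "initial": "removed", "final": "removed"},
    {"region": "Middle East", "initial": "left up", "final": "removed"},
    {"region": "Europe", "initial": "removed", "final": "reinstated"},
]

# How many selected cases came from each region.
cases_by_region = Counter(case["region"] for case in cases)

# Among cases Meta initially removed, the share the Board reinstated.
removed = [c for c in cases if c["initial"] == "removed"]
reinstated_share = sum(c["final"] == "reinstated" for c in removed) / len(removed)

print(cases_by_region)
print(f"Share of removals reinstated after review: {reinstated_share:.0%}")
```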



Furthermore, this project calculated how many times each policy of the Meta-owned platforms was involved in the cases selected for review by the Oversight Board. The Dangerous Organizations and Individuals, Hate Speech, and Violence and Incitement policies were involved in the selected cases far more frequently than any other policy governing Meta's platforms.
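A minimal sketch of that calculation is shown below, assuming each case has been coded with the list of Meta policies it cites; the records here are illustrative, not drawn from the actual dataset.

```python
from collections import Counter

# Hypothetical coding: each reviewed case lists the Meta policies involved.
case_policies = [
    ["Dangerous Organizations and Individuals", "Violence and Incitement"],
    ["Hate Speech"],
    ["Hate Speech", "Bullying and Harassment"],
    ["Dangerous Organizations and Individuals"],
]

# Count how many cases each policy appears in.
policy_counts = Counter(policy for policies in case_policies for policy in policies)

for policy, count in policy_counts.most_common():
    print(f"{policy}: {count}")
```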




