Social Media Content Moderation
Conducting Text Analysis on Selected News Articles and Interviews with People Who Worked as Content Moderators for Social Media Platforms or Researched Related Fields
This project also applied a qualitative research method, conducting text analysis as part of the research. While the data processing and visualization provided me with an opportunity to explore the general landscape of the current content moderation mechanisms of social media platforms, the qualitative research offered me a chance to learn more about the job of a content moderator and how the characteristics of this job significantly affect the people who do it. Therefore, five news articles about content moderation, and interviews with people who worked as content moderators or researched the content moderation industry, were selected for text analysis using the qualitative analysis software “ATLAS.ti”. In general, most of the articles and interviews emphasized the harmful effects of this job on moderators’ mental health, its significant negative influence on moderators’ daily lives, the poor working conditions, the kinds of content moderators are asked to review, the practice of social media giants outsourcing content moderation work to other countries, and potential future technical solutions to the problem of content moderation.
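As an illustration only (the coding in this project was carried out manually in ATLAS.ti, not in code), the sketch below shows one way the frequency of thematic codes across the five selected texts could be tallied programmatically; the theme keyword lists, the "articles" folder, and the file layout are hypothetical placeholders rather than part of the actual workflow.

```python
# Illustrative sketch only: the project's actual coding was done in ATLAS.ti.
# The theme keyword lists and the "articles" folder are hypothetical placeholders.
from collections import Counter
from pathlib import Path

THEMES = {
    "mental_health": ["trauma", "ptsd", "wellness", "psychological"],
    "daily_life": ["nightmare", "home", "family", "friends"],
    "working_conditions": ["hours", "salary", "contractor", "queue"],
    "outsourcing": ["outsource", "philippines", "india"],
    "automation": ["artificial intelligence", "machine learning", "algorithm"],
}

def count_theme_mentions(text: str) -> Counter:
    """Count how often the keywords for each theme appear in one text."""
    lowered = text.lower()
    return Counter({theme: sum(lowered.count(k) for k in keywords)
                    for theme, keywords in THEMES.items()})

if __name__ == "__main__":
    totals = Counter()
    for path in Path("articles").glob("*.txt"):  # one .txt file per selected article
        totals += count_theme_mentions(path.read_text(encoding="utf-8"))
    for theme, n in totals.most_common():
        print(f"{theme}: {n} keyword mentions")
```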
Almost all five of the news articles and interviews mentioned that this job can inflict significant psychological trauma on moderators. A former Facebook moderator said that the psychological and wellness support offered by her employer was far from enough, making it impossible for her to unwind and enjoy life after work. She also mentioned that the wellness coaches provided by her employer were not qualified psychiatrists, so she was unable to get effective support after work. Worse, she said that because she was employed by one of Facebook’s contractors rather than directly by Facebook, she was required to work in the office, which exposed her to more graphic content in high-priority queues, including “the graphic violence, the child stuff, the exploitation and the suicides” (Criddle, 2021). In addition, people who worked as content moderators for social media platforms were required to sign non-disclosure agreements, which prevented them from sharing what they saw at work with their families or friends. Moreover, she mentioned that content moderators were asked to sign a post-traumatic stress disorder (PTSD) disclaimer that shields employers from liability for the potential harm the job causes to moderators’ mental health. There is no doubt that the lack of adequate and effective psychological and wellness support, together with the requirements of the non-disclosure agreement and the PTSD disclaimer, profoundly deepens the dilemma of moderators sinking into the swamp of mental health problems.
Additionally, working as a content moderator for a social media platform has a significantly negative influence on moderators’ personal daily lives. For example, the former Facebook moderator described how the nightmare of the content reviewed for moderation would follow her home: she “could just be watching TV at home and think back to one of the horrible, really graphic tickets” (Criddle, 2021). The content that moderators review can affect how they understand and perceive things and people, and indirectly affect their behavior in daily life. For instance, a professor of information studies at U.C.L.A. recounted a case in which a content moderator suddenly stiff-armed his girlfriend and shoved her away during an intimate moment because “the image of something” he had reviewed “at work that day popped into” his mind (Chotiner, 2019). No method has yet been found to completely remove this job’s influence on content moderators’ daily lives, but social media giants could mitigate it by providing better mental health support.
Moreover, content moderation is usually classified as an entry-level job in the technology industry, so moderators often work under relatively poor conditions. Specifically, they are usually required to work long hours and are exposed to an outrageous amount of content awaiting review. Besides, the wellness time they are given to relax or talk to a wellness coach is very limited considering the pressure and working-hour demands of this job. For example, the former Facebook moderator mentioned that “sub-contracted staff are given 1.5 hours of ‘wellness’ time a week”, “which can be used for speaking to a wellness coach, going for walks or taking time out when feeling overwhelmed”, but she felt this was not enough (Criddle, 2021). In addition, most content moderators earn relatively low salaries compared with other people working in the technology industry. As for what the job entails, content moderators are required to “have an encyclopedic knowledge of the rules that the platforms have set for user engagement and for users uploading content that they create” (Radu & Aznar, 2019), and to review content that might include “graphic violence, exploitation, extremism, abuse and suicide” to decide whether it violates the Terms of Service or other relevant policies of the social media platforms (Criddle, 2021).
Furthermore, with the development of globalization, many social media giants in Silicon Valley have begun to outsource content moderation work to developing countries, especially the Philippines and India, for the reasons explained below. First, labor costs in developing countries are generally lower than in developed countries. Second, there are fewer regulations, policies, and laws requiring social media corporations to offer better working conditions or to pay for content moderators’ mental health support. Both of these reasons reflect the logic of capitalism: reduce costs and create more profit. Third, English is one of the official languages of the Philippines and India, so both countries have a large population that understands English. Nonetheless, there “would be things like the dissonance and distance culturally and linguistically, contextually, and politically, for a group of people that are being asked to adjudicate and make decisions about the materials” that were created and disseminated by people from very distant places and different cultural backgrounds (Chotiner, 2019).
The development and application of artificial intelligence and machine learning were frequently mentioned in the news articles and interviews as a potential technical solution for content moderation on social media platforms, one that could reduce or even eliminate the psychological trauma and other harm this job inflicts on human content moderators. However, the possibility of artificial intelligence and machine learning algorithms completely replacing human moderators is widely doubted, because current artificial intelligence technology is still unable to take full responsibility for moderating all the content that violates the relevant policies of those platforms. Therefore, human moderators will remain part of the content moderation mechanism of social media platforms in the foreseeable future, even though some social media giants have already started to use a combination of machine learning algorithms and human moderators to review content for moderation.
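As a minimal, hypothetical sketch of how such a combination is commonly described, the example below routes content with a high model-predicted violation score to automatic removal, sends uncertain cases to a human review queue, and publishes low-score content; the thresholds and the single “violation score” are assumptions for illustration and are not drawn from the cited sources.

```python
# Minimal sketch of a hybrid machine/human moderation triage.
# The thresholds and the idea of a single "violation score" are hypothetical
# simplifications; real platform pipelines are far more complex.
from dataclasses import dataclass

AUTO_REMOVE_THRESHOLD = 0.95   # hypothetical: almost certainly violating
HUMAN_REVIEW_THRESHOLD = 0.40  # hypothetical: uncertain, needs a human decision

@dataclass
class Decision:
    action: str    # "remove", "human_review", or "publish"
    score: float

def triage(violation_score: float) -> Decision:
    """Route one piece of content based on a model's predicted violation probability."""
    if violation_score >= AUTO_REMOVE_THRESHOLD:
        return Decision("remove", violation_score)
    if violation_score >= HUMAN_REVIEW_THRESHOLD:
        return Decision("human_review", violation_score)
    return Decision("publish", violation_score)

if __name__ == "__main__":
    for score in (0.97, 0.55, 0.10):
        print(f"score={score:.2f} -> {triage(score).action}")
```

Under this kind of split, human moderators still handle the most ambiguous, and often the most disturbing, material, which is one reason the sources doubt that automation alone will remove the psychological burden of the job.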
References
Radu, S., & Aznar, J. (2019, August 22). Social Media's Dark Tasks Outside of Silicon Valley. U.S. News & World Report. https://www.usnews.com/news/best-countries/articles/2019-08-22/when-social-media-companies-outsource-content-moderation-far-from-silicon-valley
Chotiner, I. (2019, July 5). The Underworld of Online Content Moderation. The New Yorker. https://www.newyorker.com/news/q-and-a/the-underworld-of-online-content-moderation
Criddle, C. (2021, May 12). Facebook moderator: ‘Every day was a nightmare’. BBC. https://www.bbc.com/news/technology-57088382