Social Media Content Moderation

Literature Review
With the development and popularization of the internet and social media, more and more people receive information, especially news, through social media platforms. The widespread presence of social media platforms has broken the monopoly of traditional mass media and changed communication from a one-way mode to a two-way or multi-way mode. From the perspective of gatekeeping theory, journalists and editors of mass media once served as gatekeepers who controlled what information and content could spread through mass media channels, whereas social media users and the platforms themselves now take over that gatekeeping role (Shoemaker & Vos, 2009). Therefore, social media platforms play an important role in stopping misinformation, fake news, and harmful content from spreading through their content moderation mechanisms.
​
With the widespread use of social media in politics and the rise of nationalism and even populism, people have gradually stepped into a post-truth era. Bufacchi's article explored the differences between post-truth, lies, and bullshit. A person who tells a lie knows that there is a truth but is determined to tell a story different from it. Post-truth, by contrast, "doesn't simply deny or question certain facts, but it aims to undermine the theoretical infrastructure that makes it possible to have a truth" (Bufacchi, 2021, p. 349). People who produce bullshit show disregard for the truth, yet they will accept it when it "serves them well" (Bufacchi, 2021, p. 349). People who engage in post-truth, however, feel threatened by the truth, so they try to delegitimize it in order to remove that threat. The arrival of the post-truth era has attracted researchers' interest, since the widespread dissemination of misinformation and fake news on social media platforms is considered to have played an essential role in the 2016 United States presidential election and the 2016 United Kingdom European Union membership referendum, alongside the rise of nationalism and even populism (Bufacchi, 2021; Marshall & Drieschova, 2018).
Many governments around the world have recognized the importance of taking measures to limit the dissemination of misinformation and fake news on social media platforms, since such content can significantly affect people's perception and understanding of an issue, especially when they rely on particular platforms for information. Uses and gratifications research found that people tend to seek information from channels that offer content consistent with their existing views on an issue (McQuail, 1997). Meanwhile, social media corporations want users to spend more time on their platforms, so they design recommendation algorithms that surface content shared by users who hold similar views on certain issues (Seaver, 2019), as the sketch below illustrates.
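Neither Seaver (2019) nor the platforms themselves disclose production recommendation code, so the following is only a minimal, hypothetical sketch of the similarity-based logic described above: the recommend_for function, the toy engagement matrix, and every name in it are illustrative assumptions rather than any platform's actual system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two engagement vectors (0.0 if either is all zeros)."""
    norm = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / norm) if norm else 0.0

def recommend_for(user: int, engagement: np.ndarray, top_n: int = 2) -> list[int]:
    """Rank posts the user has not seen by how heavily similar users engaged with them.

    engagement[u, i] is 1 if user u interacted with post i, else 0.
    """
    similarities = np.array([
        0.0 if other == user else cosine_similarity(engagement[user], engagement[other])
        for other in range(engagement.shape[0])
    ])
    # Score each post by the similarity-weighted engagement of the other users.
    scores = similarities @ engagement
    scores[engagement[user] > 0] = -np.inf  # never re-recommend posts already seen
    ranked = np.argsort(scores)[::-1]
    return [int(i) for i in ranked[:top_n] if scores[i] > 0]

# Toy engagement matrix: rows are users, columns are posts.
engagement = np.array([
    [1, 1, 0, 0, 1],
    [1, 1, 1, 0, 0],
    [0, 0, 1, 1, 0],
])
print(recommend_for(0, engagement))  # -> [2]: the post favored by user 0's most similar peer
```

Even in this toy example, user 0 is only ever shown a post already favored by the user whose engagement pattern most resembles their own, which is the self-reinforcing dynamic that both the uses and gratifications tradition and Seaver's account of "captivating" recommender systems point to.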
​
References
Ahn, S., Baik, J. S., & Krause, C. S. (2022). Splintering and centralizing platform governance: How Facebook adapted its content moderation practices to the political and legal contexts in the United States, Germany, and South Korea. Information, Communication & Society, 0-20. https://doi.org/10.1080/1369118X.2022.2113817
Bufacchi, V. (2021). Truth, lies and tweets: A Consensus Theory of Post-Truth. Philosophy & Social Criticism, 47(3), 347-361. https://doi.org/10.1177/0191453719896382
Gillespie, T. (2018). Platforms are not intermediaries. Georgetown Law Technology Review, 2(2), 198-216. Retrieved December 13, 2023, from https://georgetownlawtechreview.org/wp-content/uploads/2018/07/2.2-Gilespie-pp-198-216.pdf
Killion, V. L. (2024, March 28). The First Amendment: Categories of Speech. Congressional Research Service In Focus. https://sgp.fas.org/crs/misc/IF11072.pdf
Marshall, H., & Drieschova, A. (2018). Post-Truth Politics in the UK’s Brexit Referendum. New Perspectives, 26(3), 89-105. https://doi.org/10.1177/2336825X1802600305
McQuail, D. (1997). Audience Analysis. SAGE Publications.
Seaver, N. (2019). Captivating algorithms: Recommender systems as traps. Journal of Material Culture, 24(4), 421-436. https://doi.org/10.1177/1359183518820366
Shoemaker, P. J., & Vos, T. (2009). Gatekeeping Theory. Routledge.