The Dark Side of TikTok: A Call to Action Against Extremist Content Online

Evan W. Flynn

Abstract

This article investigates the rise of bigotry on TikTok and calls for immediate regulatory action from the European Commission. It highlights the alarming rise of hate on the app, and how the app reinforces said hate. It closes by calling for stricter regulation of the app to protect the minds of European youth.

Over the past several years, TikTok has become one of the most popular social media platforms. The app lets users view short videos, often only seconds long, and it is highly addictive because its recommendation algorithm tends to serve users content they already find agreeable. The app is also particularly popular among children and young teens (Weimann & Masri, 2023, p. 752).

Young people are immensely impressionable; they often struggle to distinguish what they see or read on the internet from reality. The popularity of this app therefore provides a perfect festering ground for hate speech and bigoted disinformation. That impressionability is compounded by the echo chambers that online spaces such as TikTok often create (Ionescu & Licu, 2023, p. 1). In other words, if a young user downloads TikTok with pre-existing hateful attitudes, TikTok’s algorithm will reinforce them. Because of the importance of the younger generation and because of their impressionability, action must be taken to secure the hearts and minds of the youth.

The prominence of hate speech on TikTok “largely went unnoticed until December 2019, when Motherboard reported that it had found examples of ‘blatant, violent white supremacy and Nazism,’ including direct calls to kill Jews and black people” (Weimann & Masri, 2022, p. 170). Comparing two studies they conducted, Weimann & Masri (2022) found that antisemitic TikTok posts increased by 41%; they also found a 912% increase in antisemitic comments and a 1,375% increase in antisemitic usernames (Weimann & Masri, 2022, p. 172). In addition, in 2019, “it was reported that ISIS recruitment content was discovered on TikTok”; these videos showed ISIS militants posing with guns and cadavers, as well as decapitations (Weimann & Masri, 2023, p. 756). Moreover, “some accounts signaled support for Atomwaffen, a violent neo-Nazi group linked to the murders of several Jewish people across the United States” (Weimann & Masri, 2023, p. 756). This issue requires immediate attention: TikTok’s descent into antisemitic vitriol cannot be allowed to continue unabated.

In addition to antisemitic content, the app also serves as a breeding ground for racist content, much of which incites violence. Racist posts are often accompanied by antisemitic content, especially when a post espouses neo-Nazi ideals. For example, some posts read, verbatim, “kill all [n-word],” “all Jews must die,” and “kill [n-word]” (Weimann & Masri, 2023, p. 756). One would think, vainly, of course, that TikTok would be able to censor such heinous language. Alas, despite its advanced algorithm, the platform has thus far proven incapable of curbing such language.

The ubiquity of such language is all the more alarming given that young people and adolescents widely use the app. In some respects, however, the presence of bigoted posts may not be unexpected. Many people join violent, radical groups at a young age, regardless of the movement’s political leanings. In Europe’s case, the radical right often recruits members from the younger population (Koehler, 2020, p. 456). It has also been found that between 1990 and 1995, 53% of German right-wing arson offenders were between the ages of 17 and 19, and 19% were between 14 and 16 (Koehler, 2020, p. 456). Young people are especially susceptible to radicalization and are consequently more likely to commit violent crimes. This fact necessitates an immediate change to Europe’s regulatory framework for dealing with TikTok and online radicalization.

To deal with this alarming threat, the European Commission must mandate the use of “counter-narratives” on TikTok to challenge extremist content on the app. These narratives should be “targeted campaigns to discredit the ideologies of violent extremists” (Liang & Cross, 2020, p. 16), taking the form of, for example, the “dissemination of counter-narrative products” (Liang & Cross, 2020, p. 17). Specifically, a state-sanctioned entity could spread anti-extremist content on the app; the goal, in other words, is to turn extremists’ digital tools against them. Alternatively, the EU could sanction the dissemination of content with a similar message on other social media platforms popular among young people.

I also recommend that the EU implement broader forms of digital disruption. For example, to counter hateful rhetoric on TikTok, I contend that the EU must “[implement] racial sensitivity and diversity training through public service announcements, peer-to-peer dialogue workshops, or films that provide opportunities for youth and adults to self-reflect and learn about historical oppression, people of color, women, and the LGBTQIA+ community from credible sources” (Windisch et al., 2021, p. 3). Implementing these policies would help foster an informed European population, one less likely to spread biased information on social media. The logic holds that if young Europeans are informed about oppression and bigotry, they will be less likely to spread content online that contradicts what they have learned.

Finally, I would also encourage the European Commission to call upon all member states to censor posts on TikTok that directly call for violence. A strict framework of this kind would limit the amount of hateful content on TikTok and thereby mitigate the chances of the platform being used as a medium of violent radicalization. At the same time, it would preserve free speech, within limits, of course.

Overall, left unchallenged, TikTok presents a severe threat to democracy. The platform has become a center for hateful, bigoted content, which often incites violence (Weimann & Masri, 2022, 2023). Because so many of the app’s users are young, it is disconcerting that they are being exposed to such content. The EU must therefore implement policies like those outlined above: it must engage in a digital disruption or counter-narrative campaign undermining extremist content on TikTok, and it must work to mitigate the amount of violent rhetoric on the app.

References

Ionescu, C. G., & Licu, M. (2023). Are TikTok Algorithms Influencing Users’ Self-Perceived Identities and Personal Values? A Mini Review. Social Sciences, 12(8), 465. https://doi.org/10.3390/socsci12080465

Koehler, D. (2020). Violent extremism, mental health and substance abuse among adolescents: Towards a trauma psychological perspective on violent radicalization and deradicalization. The Journal of Forensic Psychiatry & Psychology, 31(3), 455–472. https://doi.org/10.1080/14789949.2020.1758752

Liang, C. S., & Cross, M. J. (2020). White Crusade: How to Prevent Right-Wing Extremists from Exploiting the Internet (11; pp. 1–26).

Weimann, G., & Masri, N. (2022). New Antisemitism on TikTok. In M. Hübscher & S. V. Mering, Antisemitism on Social Media (1st ed., pp. 167–180). Routledge. https://doi.org/10.4324/9781003200499-11

Weimann, G., & Masri, N. (2023). Research Note: Spreading Hate on TikTok. Studies in Conflict & Terrorism, 46(5), 752–765. https://doi.org/10.1080/1057610X.2020.1780027

Windisch, S., Wiedlitzka, S., & Olaghere, A. (2021). PROTOCOL: Online interventions for reducing hate speech and cyberhate: A systematic review. Campbell Systematic Reviews, 17(1), e1133. https://doi.org/10.1002/cl2.1133