TikTok was launched by ByteDance, a Chinese company, in September 2016 (initially in China under the name Douyin). The app quickly gained popularity, especially among young people, for its short music videos, viral challenges and humorous content. With over one billion monthly active users, TikTok has become one of the most influential social media platforms in the world.
TikTok’s core audience is teenagers and young adults, who are particularly vulnerable to the negative effects of social media. The pressure to create viral content and the obsession with likes and followers can amplify these problems, but one of the most worrying dangers stems from the social media algorithm itself, which is designed to learn from user behaviour and maximise engagement and retention on the platform.
A 2022 experiment by the Center for Countering Digital Hate, using accounts that mimicked the behaviour of 13-year-olds, found that TikTok’s algorithm quickly recommended videos about suicide and eating disorders: the app’s ‘For You’ feed surfaced suicide-related content within an average of 2.6 minutes and eating-disorder content within eight minutes.
In addition, accounts with ‘loseweight’ in the username – which the researchers called ‘vulnerable teen accounts’ – were shown far more harmful material: three times as many harmful videos overall, and twelve times as many self-harm and suicide videos, as the standard accounts.
Amnesty International conducted a similar study in collaboration with the Algorithmic Transparency Institute and AI Forensics. Salvatore Romano, a researcher at AI Forensics, said: ‘The suggested videos were distressing even for the researchers. This kind of material, especially when shown in an endless loop, can have very dangerous effects.’
‘The results demonstrate that TikTok’s design practices are manipulative and addictive,’ said Lisa Dittmer, a researcher at Amnesty International. According to the report, the content recommendation system – the engine of the app’s global rise – exposes children and young people with mental health problems to serious risks.
When asked to explain, TikTok reiterated its commitment to protecting its community. In a letter to Amnesty, it stated: “We have developed and implemented systems that restrict content on topics that may be acceptable when viewed occasionally, but potentially problematic when presented on an ongoing basis. We continue to work to expand and implement these systems, including the incorporation of other mental health issues”.
Marie’s story and the first complaint against TikTok
Marie died by suicide in September 2021, at the age of 15. She had been bullied at school because of her weight and had spoken about it a few days earlier in a video on TikTok. According to the complaint, that post triggered an avalanche of negative videos in her feed. Two years later, Marie’s parents sued TikTok for ‘incitement to suicide’ and ‘failure to assist a person in danger’.
The viral challenge phenomenon
Another worrying aspect of TikTok is the phenomenon of viral challenges. Often dangerous, they spread quickly, enticing users to take part in order to gain visibility; some have led to serious injuries and, in a few cases, suicide. The ‘Blue Whale Challenge’, for example, originated on other platforms but found fertile ground on TikTok, setting participants a series of self-destructive tasks that culminated in suicide.
Avoiding moderation
The platform has taken several measures to protect minors, but moderation is largely automated. ‘Platforms choose this type of moderation, as opposed to moderation carried out entirely by humans, for reasons of cost but also of speed. An algorithm, no matter how sophisticated, will always make mistakes – not least because the first to try to circumvent it are the users, who know the rules.’
A global war on TikTok
Last March, the US House of Representatives and the Senate passed, in record time, a bill that paves the way for a ban on TikTok in the US. The bill requires ByteDance to sell TikTok to a US company within six months or face a ban.
China’s response was swift: it called the measure ‘an act of intimidation that will backfire on the United States itself’.
All this, in the midst of an election campaign, played into the hands of former president Donald Trump, who had previously declared himself against TikTok but quickly changed his mind: ‘If you ban TikTok, Facebook and others, but especially Facebook, will benefit.’
In fact, there is more to it: Trump does not want to alienate a crucial portion of the youth electorate, nor Jeff Yass, a billionaire Republican financier with ties to Trump’s entourage whose firm owns 15% of ByteDance.
Measures have also been announced in Italy, where public administration minister Zangrillo asked civil servants to delete the app by 15 March.
Other countries have moved against the app too. In 2020, days after 20 of its soldiers were killed on a Himalayan border, the Indian government banned TikTok on security grounds. Taiwan, following an FBI warning, has banned its use on government devices, and the EU Parliament, Commission and Council have banned TikTok from their employees’ devices. Pakistani authorities have temporarily banned the app several times for disseminating ‘immoral content’, while in Afghanistan it has been banned for leading the younger generation astray.
The role of parents and educators
Parents and educators play a crucial role in mitigating the risks associated with TikTok. They should be aware of the content their children are viewing and establish an open dialogue about online safety. Promoting digital awareness is essential to help young people navigate social networks safely, recognise risky content and warning signs, and know where to find help. Schools can contribute by integrating digital literacy into the curriculum and providing support to students.
TikTok, with its immense popularity and influence, has the potential to be a positive force among young people, but the risks associated with dangerous content cannot be ignored. Only through a collective effort can we create a safe and supportive online environment that reduces the risks of self-harm and suicide among the young.