News Analysis
The Chinese-owned video-sharing platform TikTok has come under increasing scrutiny over reports that it censors anti-Beijing content, tracks users, and hands user information to authorities in Beijing.
A number of governments have escalated efforts to restrict access to the video app, citing security concerns. Lawmakers in the United States, Australia, Canada, and the United Kingdom have passed legislation banning TikTok on government-issued devices, in schools, or, in one case, across an entire U.S. state.
However, the problems associated with TikTok’s powerful algorithm may be far more severe than data security.
Tesla CEO Elon Musk on May 13 voiced concern about the ill effects of TikTok after sharing on Twitter a screenshot of a report documenting how the Chinese-owned social media app pushes harmful content involving self-harm, eating disorders, and suicide into children’s feeds.
In the Twitter post, Musk added, “extremely destructive if accurate,” referring to a recent study released by the Center for Countering Digital Hate (CCDH), a British nonprofit.
Study: Harmful Content Algorithmically Pushed
In the study (pdf), the CCDH established accounts posing as 13-year-olds in the United States, United Kingdom, Canada, and Australia.
One account in each nation was assigned a traditional female name. A second account was created in each country with a username containing the characters “loseweight” in addition to the name. Researchers chose these characters after finding that people with conditions such as body dysmorphia often signal their situation through their usernames.
The team then examined the first 30 minutes of content recommended by TikTok in these accounts’ “For You” feeds. When videos with potentially dangerous content about disordered eating, self-harm, or mental health issues appeared, the researchers would pause on the video and like it, mimicking the behavior of a typical teenager.
Every 39 seconds on average, the accounts were served videos related to body image and mental health. Content referencing suicide was shown to one account within two and a half minutes. Another account received eating disorder content within eight minutes.
The accounts with “loseweight” in their usernames were served three times more harmful content than the other accounts and were exposed to 12 times more suicide and self-harm videos. According to the CCDH, an eating disorder community hosted on TikTok had amassed more than 13.2 billion video views.
Lingering Harms
After the researchers published their study, some videos they had flagged appeared to have been taken down from TikTok, but many of the accounts that posted the material remained while retaining other similar content, according to the Wall Street Journal (WSJ).
Users can filter out videos containing words or hashtags they don’t want to see, but the content can still slip through.
According to WSJ, some users have developed creative ways to skirt TikTok’s content filters, such as using a sound-alike “sewerslide” when referencing suicide or just writing “attempt,” leaving the rest to the viewer’s imagination.
TikTok claimed in March that it had 150 million users in the United States, with a predominantly teenage audience. A Pew Research study in April found more than two-thirds of U.S. teens report using TikTok, while 16 percent are on it almost constantly, showing signs of addiction.
Imran Ahmed, CEO of the CCDH, pointed out that TikTok was designed to influence young users into giving up their time and attention. The research proved that the app is “poisoning their minds” as well.
“It promotes to children hatred of their own bodies and extreme suggestions of self-harm and disordered, potentially deadly, attitudes to food,” he said, according to the report.
“Parents will be shocked to learn the truth and will be furious that lawmakers are failing to protect young people from Big Tech billionaires, their unaccountable social media apps, and increasingly aggressive algorithms.”
Abnormal Physical and Mental Changes
The ubiquity of TikTok has been accompanied by a host of deleterious effects, chief among them privacy and mental health concerns.
During the COVID-19 pandemic, adolescents spent a tremendous amount of time at home using electronic devices. Subsequently, symptoms similar to tic disorders began to increase among teens.
Tic disorders, the best known of which is Tourette syndrome, involve uncontrollable repetitive movements or unwanted sounds (tics), such as repeatedly blinking the eyes, shrugging the shoulders, or blurting out offensive words.
According to a 2021 WSJ report, experts at top pediatric hospitals in the United States, Canada, Australia, and the UK found that most of the affected teens had one thing in common: repetitive viewing of TikTok videos.
The report also cited medical journal articles and statistics from doctors and specialists that observed surges of tic-like disorders linked to TikTok usage.
In addition to physical disorders, TikTok is reportedly pushing videos about borderline personality disorder, bipolar disorder, and multiple-personality disorder toward teens, according to another WSJ report.
The videos encouraged viewers to self-evaluate, and many recognized themselves in the symptoms and became convinced that they had the disorders, leaving them distressed and upset.
In reality, only about 1.4 percent of the U.S. adult population is estimated to experience borderline personality disorder, according to the National Alliance on Mental Illness, a nonprofit mental-health advocacy organization. Multiple-personality disorder, also known as dissociative identity disorder, is even rarer, affecting less than 1 percent of the population, according to the Cleveland Clinic.
The report said TikTok videos containing the hashtag #borderlinepersonalitydisorder had been viewed hundreds of millions of times.
Growing Adverse Effects on Youth
Social media is driving children and young adults to have a low sense of self-worth and be dissatisfied with their appearances, according to a study published by London-based mental health charity stem4 on Jan. 3.
“Social media is definitely negatively affecting me. As young people, we constantly compare ourselves to good-looking people online. On sites like TikTok, the only people you see are gorgeous due to the algorithms, and that makes us feel really bad about ourselves,” said a young person who was quoted in the study (pdf).
The study, which surveyed 1,024 children and young adults aged between 12 and 21 in the UK, found that 97 percent of them were on social media, spending an average of 3.65 hours a day on smartphone apps, such as TikTok, Snapchat, Instagram, YouTube, and WhatsApp.
It found that 77 percent of the respondents were unhappy about how they looked, with some saying that they were “embarrassed” by their bodies.
Nearly half of those surveyed said they had received negative and hateful comments about their appearance. As a result, 24 percent of them responded by becoming withdrawn, 22 percent began to exercise excessively, 18 percent stopped socializing, 18 percent chose to drastically restrict their food intake, and 13 percent inflicted self-harm.
The survey also found that 42 percent of the respondents—51 percent of females and 31 percent of males—said they were in mental health distress.
In light of social media’s increasingly adverse effects on adolescents, the American Psychological Association in April recommended parental monitoring and appropriate limit-setting on social media use, especially for those in early adolescence.
Japanese electronics engineer Lee Ji-Shin told The Epoch Times on May 17 that parents should keep their children away from TikTok and guide them to pursue something meaningful.
“TikTok’s algorithm is traffic-focused, its primary purpose is to make money, and it does not consider whether the content pushed will damage or affect the immature minds of teenagers, making them anxious or impulsive.”
Naveen Athrappully, Frank Fang, and Ellen Wan contributed to this report.