Within ten minutes of a 13-year-old creating an account, TikTok, the Chinese-owned app, can begin pushing suicide videos to them.
According to research by Eko, the app's recommendation algorithm can serve up suicide videos within ten minutes if a young user indicates that he is experiencing sexual frustration.
According to VICE News, the researchers created nine new TikTok accounts and listed their ages as 13.
The researchers found that after the accounts viewed only ten videos on "incel" topics, their "For You" pages were filled with similar content.
One test account was shown a clip featuring Jake Gyllenhaal, in which the actor holds a rifle in his mouth and says, "Shoot me. Shoot me in the face."
Overlaid text read, "Get shot or see him with someone else?"
The majority of commenters endorsed the suicide suggestion; others lamented their loneliness, with some saying they would kill themselves within four hours.
The Gyllenhaal clip has since been deleted. Before its removal, it had received 2.1 million views, over 440,000 likes, 7,200 comments, and more than 11,000 shares.
Maen Hammad, an Eko campaigner and coauthor of the research, said that it takes only ten minutes on TikTok to fall into a rabbit hole of some of the most dangerous and harmful content online.
Hammad said the algorithm can drive users into a downward spiral of despair, hopelessness, and self-harm. Once the algorithm thinks it knows what you want to see, he said, the spiral is very difficult to break out of, and it is alarming how easily children can fall into this vicious circle.
TikTok has overtaken Facebook and Instagram to become the most popular social media platform among American teens. It is also notorious for promoting content that is harmful to children and young people, and that can even lead to injury or death.
Earlier this month, the University of Massachusetts had to warn its students about a new TikTok drinking trend that led to 28 ambulances being dispatched to off-campus parties. Students fill a gallon jug, dubbed a "blackout rage gallon," with alcohol, flavoring, and other ingredients.
Earlier this year, a 12-year-old girl in Argentina died after taking part in the "choking challenge," which was first popularized on the Chinese app. She died during a video call as her classmates watched and tried to stop her from completing the dangerous challenge.
Last summer, a TikTok challenge was allegedly fatal for a 14-year-old and a 12-year-old in the UK.
Last September, the FDA warned parents about a TikTok trend that involves children cooking chicken in NyQuil, "presumably to eat."
In 2020, a TikTok challenge encouraged users to take large amounts of the allergy medication Benadryl, or diphenhydramine, to induce hallucinations. Teens were reportedly hospitalized as a result, and some even died.