Why is TikTok allowing dangerous trends to be pushed on teens?

TikTok has come under fire for exposing teenagers to dangerous trends including ‘rape culture’, how to hotwire new cars and a vile ‘blackout challenge’ that has killed youngsters.

The social media giant has failed to take down misogynistic comments by former kickboxer Andrew Tate, footage of thugs stealing Kias and Hyundais, and content encouraging children to asphyxiate themselves.

These all remain despite the tech giant having extensive community guidelines spelled out on its website, including rules against dangerous acts and challenges, hateful behavior, and the promotion of suicide or self-harm.

But it could soon be held more accountable across the US for content on the site under the Combating Harmful Actions with Transparency on Social Act.

The bill seeks to combat dangers to children and boost transparency into how the apps are being used for crimes.

TikTok is not the only outlet to battle dangerous content on its site, with Facebook, Instagram and Twitter all being hauled over the coals in recent months for failing to moderate themselves effectively.

TikTok has taken the social media world by storm since its global launch in 2017, allowing users to share short bursts of content that range from innocent dance routines to perilous challenges.

The China-based firm has also proven popular with celebrities and the media, with both groups using it to expand their reach.

What is being done to hold TikTok accountable and what is the tech giant doing?

In previous years, social media giants appeared to have free rein to leave these shocking challenges up while governments figured out a way to crack down on them.

But now legislation is being brought in across the world to hold them accountable, with the bipartisan Combating Harmful Actions with Transparency on Social Act leading the charge.

The bill seeks to direct the FBI and Justice Department to collect and report data on crimes involving social media.

It looks to establish data collection guidelines so when local law enforcement files a police report they would check a box stating whether a social media platform is suspected to have been involved in the crime.

It also wants to direct the Attorney General to publish an annual statistical report detailing which internet platforms are connected to which crimes.

And this data is sought to pay special attention to offenses committed against or by children, allowing us to better understand the role of social media in crime and its impact on children.

One of those pushing the act is Democratic Rep and attorney Josh Gottheimer, who last week on Twitter reemphasized its importance.

He wrote: ‘The lack of transparency and accountability for social media companies has led to grave consequences for our kids, families, and national security. That’s why I introduced the bipartisan CHATS Act.

‘The Combating Harmful Actions with Transparency on Social Act takes steps to hold social media companies accountable, combat dangers to our kids, and boost transparency into which apps are being used for which crimes.

‘Social media platforms allow for drugs to be ordered on demand — like Amazon Prime or a pizza delivery. Kids don’t even have to leave their neighborhoods, let alone their houses.

‘We also know that China and TikTok have access to our kids’ and Americans’ private data. I worked on this bill with Dr. Laura Berman & Samuel Chapman, the parents of Sammy Chapman, a 16-year-old who died from an overdose in his bedroom after buying fentanyl-laced Xanax from a drug dealer on Snapchat, and Marc Berkman, CEO of @socialmedsafety.’

Meanwhile, TikTok is among the social media firms to bring in rigorous community guidelines to supposedly crack down on the dangerous trends peddled on its site.

Its guidelines page is split into 14 sections, including dangerous acts and challenges, suicide and self-harm, and violent and graphic content.

The first says: ‘We do not permit users to share content depicting, promoting, normalizing or glorifying dangerous acts that may lead to serious injury or death.

‘We also do not allow content which promotes or endorses collective participation in dangerous or harmful activities that violate any aspect of our Community Guidelines.

‘We define dangerous acts or other dangerous behavior as activities conducted in a non-professional context or without the necessary skills and safety precautions that may lead to serious injury or death for the user or the public. This includes amateur stunts or dangerous challenges.’

In its suicide and self-harm section, TikTok says: ‘We care deeply about the health and well-being of the individuals who make up our community.

‘We do not allow content depicting, promoting, normalizing, or glorifying activities that could lead to suicide, self-harm, or disordered eating.

‘However, we do support members of our community sharing their personal experiences with these issues in a safe way to raise awareness and find community support.

‘We also encourage individuals who are struggling with thoughts of suicide or self-harm, or who know someone who is seriously considering suicide, to immediately contact local emergency services or a suicide prevention hotline.

‘In the event that our intervention could help a user who may be at risk of harming themselves, TikTok may also alert local emergency services.’

And in the violent and graphic content part, it adds: ‘TikTok is a platform that celebrates creativity but not shock value or violence.

‘We do not allow content that is gratuitously shocking, graphic, sadistic, or gruesome or that promotes, normalizes, or glorifies extreme violence or suffering on our platform.

‘When it is a threat to public safety, we ban the account and, when warranted, we will report it to relevant legal authorities.’

TikTok has been approached for further comment.
