The Dark Side of TikTok

In Bethany, Oklahoma, a 12-year-old boy died of strangulation, with marks still visible on his neck. According to police, this was not a murder or a suicide but a TikTok challenge gone horribly wrong. The challenge goes by a variety of names, including the Blackout Challenge, the Pass Out Challenge, Speed Dreaming, and the Fainting Game. In it, kids asphyxiate themselves, either by choking themselves by hand or with a rope or a belt, to experience a brief euphoria when they regain consciousness.

Even when the challenge does not end in death, medical professionals warn that it is extremely dangerous: every moment the brain is deprived of oxygen or blood flow risks irreversible damage.

Unfortunately, the main goal on social media is to gain as many views as possible, regardless of the danger or expense.

Because of the pandemic, kids have been spending a lot of time alone and bored, which has led preteens to participate in social media challenges.

Some social media challenges are harmless, such as the 2014 Ice Bucket Challenge, which raised millions of dollars for ALS research.

However, there has also been the Benadryl Challenge, which began in 2020 and urged people to overdose on the drug in an effort to hallucinate. People were also urged to lick surfaces in public as part of the so-called coronavirus challenge.

One of the latest “challenges” on the social media app TikTok could have embarrassing consequences users never imagined possible. The idea of the Silhouette Challenge is to shoot a video of yourself dancing as a silhouette with a red filter covering up the details of your body. It started out as a way to empower people but has turned into a trend that could come back to haunt you. Participants generally start the video in front of the camera fully clothed. When the music changes, the user appears in less clothing, or nude, as a silhouette obscured by a red filter. But the challenge has been hijacked by people using software to remove that filter and reveal the original footage.

“If these filters are removed, that can certainly create an environment where kids’ faces are being put out in the public domain, and their bodies are being shown in ways they didn’t anticipate,” said Mekel Harris, a licensed pediatric and family psychologist. Young people who participate in these types of challenges aren’t thinking about the long-term consequences.

These challenges reveal a darker aspect to the app, which promotes itself as a teen-friendly destination for viral memes and dancing.

TikTok said it would remove such content from its platform. In an updated post to its newsroom, TikTok said:

“We do not allow content that encourages or replicates dangerous challenges that might lead to injury. In fact, it’s a violation of our community guidelines and we will continue to remove this type of content from our platform. Nobody wants their friends or family to get hurt filming a video or trying a stunt. It’s not funny – and since we remove that sort of content, it certainly won’t make you TikTok famous.”

TikTok urged users to report videos containing the challenge. And it told BBC News that text now reminds users not to imitate or encourage public participation in dangerous stunts and risky behavior that could lead to serious injury or death.

While these challenges may seem funny or rack up views on social media platforms, they can have long-lasting health consequences.

Because the First Amendment strongly protects freedom of speech, liability for content shared online generally falls on its publishers and authors, not on the platforms that host it. Section 230(c)(1) of the Communications Decency Act of 1996 states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This provision gives social media companies immunity for content published by other authors on their platforms, as long as intellectual property rights are not infringed. Although the law does not require social media sites to regulate their content, they may still remove content at their discretion. Guidelines on discretionary content moderation are sparse. Because the government does not regulate this speech, the power has fallen into the hands of social media giants like TikTok. Inevitably, the personal agendas of these companies are shaping conversations, highlighting the need to debate the place of social media platforms in the national media landscape.


Social media is unique in that it offers a huge public platform, instant access to peers, and measurable feedback in the form of likes, views, and comments. This creates strong incentives to get as much favorable peer evaluation and approval as possible. Social media challenges are particularly appealing to adolescents, who look to their peers for cues about what’s cool, crave positive reinforcement from their friends and social networks, and are more prone to risk-taking behaviors, particularly when they’re aware that those whose approval they covet are watching them.

Teens won’t necessarily stop to consider that laundry detergent is a poison that can burn their throats and damage their airways. Or that misusing medications like diphenhydramine​ (Benadryl) can cause serious heart problems, seizures and coma.​ What they will focus on is that a popular kid in class did this and got hundreds of likes and comments.


Children are biologically primed to become much more susceptible to peer influence during puberty, and social media has amplified those peer-influence processes, making them more dangerous than ever before. Teens may find these activities entertaining and even thrilling, especially if no one gets hurt, which increases their likelihood of participating. Teens are already less capable than adults of evaluating danger, so when friends reward them for taking risks, through likes and comments, that reward can act as a disinhibitor. These influences operate on young people at an unconscious level, and they are so pervasive online that children cannot realistically avoid them without parental engagement.


Because they did not grow up with these pressures themselves, parents today often struggle to address the risks of social media use with their children.

Even so, parents should address viral trends with their children. They should review their children’s social media history, talk with them about their online activities, block certain social media sites where appropriate, and educate themselves about what may be lurking behind their child’s screen.

When it comes to viral trends, gauge your child’s familiarity with any trend you may have heard about before soliciting their opinion. You might ask why they think others follow the trend and what they believe are some of the risks of doing so. Use this opportunity to explain why a particular trend concerns you.


It’s important to keep in mind that taking a break is completely appropriate. You are not required to join every discussion, and disabling your notifications may provide some breathing room. You can also set regular reminders to keep track of how long you’ve been using a certain app.

If you’re seeing a lot of unpleasant content in your feed, consider muting or blocking particular accounts or reporting it to the social media company.

If anything you read online makes you feel anxious or frightened, communicate your feelings to someone you trust. Assistance may come from a friend, a family member, a teacher, a therapist, or a helpline. You are not alone, and seeking help is completely OK.

Social media is a natural part of life for young people, and although it may have a number of advantages, it is essential that platforms like TikTok take responsibility for harmful content on their sites.

I welcome the government’s plan to create a regulator to ensure that social media companies tackle cyberbullying and posts encouraging self-harm and suicide.

Additionally, we must ensure that schools teach children what to do if they come across upsetting content online, as well as how to use the internet in a way that benefits their mental health.

To reduce the likelihood of misuse, protections must be implemented.


How can social media companies improve their moderation so that children are not left to fend for themselves online? What can they do to improve their in-app security?

Trapped in Virtual Reality!

Millions of people went crazy for Pokémon GO in 2016, venturing into private and public locations to catch Pokémon characters that were visible only to them. Pokémon GO was the first game to introduce the public to the concept of augmented reality (AR).

AR users see the real world as it is, but with digital images overlaid so that they appear to be part of the real environment.

There’s also virtual reality (VR), which goes beyond augmented reality. By wearing a headset, users can enter a virtual environment and move around and interact with it as if it were the real world.

“Around 25 million people in the United States consider themselves to be active video gamers. The sector is worth $30 billion in the United States and $90 billion globally. It has its own popular television network, and in 2015, the finals of a League of Legends tournament drew more viewers than the NBA basketball finals. In the last year, over $1 billion in income was produced by Pokémon Go alone.”

AR and VR, however, raise legal issues for courts, businesses, and users. People will kill and die while using AR and VR, and some already have. They will harm themselves as well as others: players have already fallen off a cliff or walked into oncoming traffic while playing Pokémon GO. Some will exploit the technology to threaten or scam others. To determine who is to blame, courts will need to grasp the technology and how it differs from what came before.

CRIMES. In the real world, people sexually harass strangers and expose themselves indecently; there is no reason to think they won’t do the same in virtual reality. They are all the more likely to if they believe it will be difficult for law enforcement to apprehend them, and prosecution may indeed be difficult: even where an offense is easier to prove, the added hurdles of extradition are likely to outweigh that advantage. As a result, traditional police forces may effectively ignore many VR street crimes, and the most likely consequence for offenders will be suspension or exclusion from the virtual reality environment. Participants who have been kicked off can simply re-enter by creating a new user ID.

If this happened in real life, the exhibitionist would almost certainly be charged with indecent exposure or public lewdness. Can the same law be applied in virtual reality? Would police forces welcome the prospect of extraditing a person from another state or country simply because their internet avatar was nude? Because the interactions may span multiple physical jurisdictions, they will be harder to regulate effectively. Arrests and prosecutions will become more expensive, and law enforcement will be less willing to intervene, especially in circumstances where there appears to be no “real” harm. As a result, police will be less likely to take the issue seriously, leaving VR users to fend for themselves.

We may see crimes and other problems occur in VR without the legal system doing anything about them, since enforcement will be too difficult for the less serious offenses most likely to arise in VR and AR. To the layperson, virtual reality is merely a game, and courts and police departments may decide that wrongdoing inside a game or server is a private matter. The VR data will be owned by commercial corporations, which will impose terms of use that bind users and disclaim liability for harm, making police even more hesitant to act. The ability of VR and AR operators to contractually waive liability, together with 47 U.S.C. § 230, will likely deter lawsuits against them.

Virtual reality and augmented reality will also test our understanding of what constitutes speech, which the First Amendment protects, and what constitutes non-speech conduct that can be regulated. Is nudity on a drive-in movie screen (speech) the same as indecent exposure (conduct)? In the physical world, the basic distinction between words and actions makes sense because we believe the harm words can inflict at a distance is generally smaller and easier to avoid than the harm physical contact can cause.

Virtual reality and augmented reality, on the other hand, are designed to make conveyed pictures and sounds feel as real as possible. They challenge our perception of reality because they blur the cognitive boundaries between imagery and physical existence. People react as if they’ve been slapped in the face when they receive a virtual slap. The reaction is intuitive; it is not based on actual physical contact, but it seems real in a way that words or images outside of VR do not.

With respect to injury in the physical and virtual worlds, VR and AR will pose legal challenges that may require adjusting existing doctrines or rewriting laws. Now, I’d like to pose a question to you. Virtual reality isn’t “real” in the traditional sense: we see data stitched together to create artificial audio and video. It does, however, feel real in a way that is difficult to explain until you’ve experienced it. The same might be said of augmented reality, if it can overlay vibrant, lifelike representations of people and objects onto the real world we experience. If a VR/AR misconduct experience feels genuine and has significant emotional and physiological consequences, should we punish certain types of conduct? How would you differentiate between virtual and physical wrongdoing in terms of punishment?
