A few years ago, lost in the wormhole that is YouTube, I stumbled upon a family channel, “The ACE Family.” They had posted a video in which the mom pranks her boyfriend by dressing their infant daughter in a tiny crochet bikini. I didn’t think much of it at the time; it seemed innocent and cute. But then I considered: I had stumbled onto this video with no ill intent, yet how easy would it be for someone to seek out content like this deliberately, with far darker intentions?
When you Google “social media child pornography,” you get many articles from 2019. That year, a YouTuber using the name “MattsWhatItIs” posted a video titled “YouTube is Facilitating the Sexual Exploitation of Children, and it’s Being Monetized (2019)”; it has 4,305,097 views to date and has not been removed from the platform. The video describes a potential child pornography ring on YouTube that was being facilitated by a glitch in the recommendation algorithm. The author demonstrates how, with a brand-new account on a VPN, it takes only two clicks to end up in this ring. He starts with a search for “bikini haul.” After two clicks in the recommended-videos section, he lands on an innocent-looking homemade video. The video itself seems harmless, but the comments expose the dark side: multiple random accounts post time stamps, each linked to a moment in the video where a child is in a compromising or sexually suggestive position. The most disturbing part is that once you enter this wormhole, the algorithm keeps recommending similar videos, effectively trapping you among them. Following the vast attention the video received, YouTube built an algorithm intended to catch this predatory behavior; at the time the video was posted, it did not appear to be doing much.
YouTube has since implemented a “Child Safety Policy,” which details the kinds of content the platform restricts in order to protect minors. It also includes recommended steps for parents or agents posting content in which children are the focus. “To protect minors on YouTube, content that doesn’t violate our policies but features children may have some features disabled at both the channel and video level. These features may include:
- Live chat
- Live streaming
- Video recommendations (how and when your video is recommended)
- Community posts”
Today, when you look up news on this topic, you don’t find much. There are, however, forums exposing the many methods these predators use to get around the detection algorithms the platforms have set up. Many predators leave links to child pornography in the comments sections of specific videos. Others use the initials “C.P.,” a common abbreviation for “child pornography,” or code phrases like “caldo de pollo,” Spanish for “chicken soup.” Many dedicated, concerned parents have formed online communities that scour the Internet for this disgusting content and report it to the platforms. But if volunteer communities can do this, why haven’t social media platforms created dedicated departments for the problem? Most technology companies use automated tools to detect images and videos that law enforcement has already categorized as child sexual exploitation material; still, they struggle to identify new, previously unknown material and rely heavily on user reports.
The Child Rescue Coalition has created the “Child Protection System” software. The tool maintains a growing database of more than a million hashed images and videos, which it uses to find computers that have downloaded them. The software can track I.P. addresses, which are shared by people on the same Wi-Fi network, as well as individual devices. According to the Child Rescue Coalition, the system can follow devices even if their owners move or use virtual private networks (VPNs) to mask their I.P. addresses. Last year the organization expressed interest in partnering with social media platforms to combine resources in cracking down on child pornography. Unfortunately, some oppose this, as it would give social media companies access to an unregulated database of suspicious I.P. addresses. Thankfully, many law enforcement departments have partnered with the coalition and used the software; as the president of the Child Rescue Coalition said: “Our system is not open-and-shut evidence of a case. It’s for probable cause.”
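The core idea behind this kind of automated detection, matching file hashes against a database of already-identified material, can be sketched in a few lines. This is a simplified illustration only: the hash set, file paths, and function names below are hypothetical, and the placeholder “known” hash is just the SHA-256 of the bytes “abc,” used for demonstration. Real systems typically use perceptual hashes (such as Microsoft’s PhotoDNA) rather than exact cryptographic hashes, so that re-encoded or slightly altered copies still match.

```python
import hashlib

# Placeholder database of hashes of known, already-categorized material.
# In a real system this would be populated from a law-enforcement-maintained
# database; here it holds only the SHA-256 of the bytes b"abc" for illustration.
KNOWN_HASHES = {
    "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad",
}

def sha256_of_file(path):
    """Return the SHA-256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_material(path):
    """True if the file's hash appears in the known-hash database."""
    return sha256_of_file(path) in KNOWN_HASHES
```

The key limitation, noted earlier, follows directly from this design: an exact hash only matches byte-identical copies, which is why such tools catch previously categorized material but cannot recognize new, unknown content.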
The United States Department of Justice publishes a “Citizen’s Guide to U.S. Federal Law on Child Pornography.” The first line on that page reads, “Images of child pornography are not protected under First Amendment rights, and are illegal contraband under federal law.” Federal jurisdiction commonly applies when a child pornography offense occurs in interstate or foreign commerce, and in today’s digital era federal law applies to almost any such offense committed over the Internet. The United States has enacted multiple laws defining child pornography and the crimes related to it.
Whose job is it to protect children from these predators? Should social media platforms be required to police this? Should parents be held responsible for contributing to the distribution of this material?
“Unfortunately, we’ve also seen a historic rise in the distribution of child pornography, in the number of images being shared online, and in the level of violence associated with child exploitation and sexual abuse crimes. Tragically, the only place we’ve seen a decrease is in the age of victims.
This is – quite simply – unacceptable.”
-Attorney General Eric Holder Jr., speaking at the National Strategy Conference on Combating Child Exploitation in San Jose, California, May 19, 2011.