Unwiring Tech

Episode Name:

Self Generated CSAM on Social Media Platforms, Instagram at the Top

Host:

Ranjana Kushwaha – Policy Research Associate; her interests lie in the social dimensions of the digital ecosystem.

Episode Summary:

In this episode of "Unwiring Tech," the host explores the dark side of Instagram, revealing its role in hosting a vast network of pedophiles sharing illegal underage content. The platform's recommendation algorithms unintentionally boost commercial SG-CSAM accounts, making such content challenging to track and eradicate. The episode emphasizes the need to hold tech giants accountable, strengthen regulations, and proactively detect this content to protect children. It also calls for education and awareness campaigns to empower minors and highlights the slow response of the organizations responsible for removing CSAM.

Transcript:

Host: Welcome to another episode of Unwiring Tech, the podcast where we delve into global digital issues and explore the 'why' behind each topic and incident. I am your host, Ranjana. Today, we are going to explore a dark and disturbing side of social media platforms, with a particular focus on Instagram.

Host: Instagram, the popular photo and video sharing app, has taken the world by storm, especially among the younger generation. A recent report titled "Cross-Platform Dynamics of Self-Generated CSAM" by The Wall Street Journal together with Stanford University has uncovered a shocking truth: Instagram hosts a vast network of pedophiles sharing, promoting, recommending, and monetizing illegal underage sex content.

Host: It's hard to believe, isn't it? We often associate CSAM with adult offenders, but these investigations reveal a different side of the story. Self-generated CSAM refers to images or videos created by minors themselves. This challenges our common notion of who creates, distributes, and monetizes CSAM. Instagram takes the top spot in boosting this network; it's the platform where nightmares come to life, fueled by its content discovery and recommendation algorithms.

Host: But here's the shocking part: Instagram's most severe issue lies in its commercial SG-CSAM accounts. Their operators use hashtags and keywords to attract newcomers, and the platform's recommendation algorithm unintentionally boosts their network. It is interesting to note that when someone searches for CSAM-related terms, Instagram displays the warning 'These results may contain images of child sexual abuse' but strangely offers two options: "Get resources to report content" or "See results anyway."

Host: SG-CSAM content is often posted as Instagram stories, which are transient in nature, making it challenging to track down. Moreover, these accounts provide links to off-site platforms such as Carrd that host content menus and details, where users can explore a vast variety of disturbing content, from self-harm videos to imagery of minors engaged in sexual acts with animals.

To maintain anonymity, minors involved in marketing and advertising these accounts often accept gift cards as payment. It's a dark and troubling world that exists right under our noses, and it's time we shed light on it.

Host: This study also brings attention to the various policies adopted by social media platforms and global organizations. Meta, the parent company of Instagram, has a comprehensive content policy across all its platforms. It strictly prohibits the sexualization of children, the advertising of CSAM, sexual conversations with minors, and obtaining sexual material from minors.

The report also revealed disturbing facts about Telegram: its Terms of Service state only that publicly posting illegal pornographic content is not allowed, which means it implicitly allows CSAM to be shared in private groups or direct messages.

Host: So, where do we go from here? It's abundantly clear that the giants of tech must be held accountable for the content they harbor. This issue demands a multi-faceted and uncompromising approach, not only focused on the detection and monitoring of CSAM but also on strengthening laws and regulations. It is imperative that platforms proactively detect and eradicate this repulsive content, particularly when it concerns the lives and innocence of children. We must also prioritize education, sensitization, and awareness campaigns that empower minors and adolescents with the knowledge and understanding of the catastrophic consequences of SG-CSAM, such as the insidious world of sextortion.

Host: Time is of the essence. We must call upon organizations dedicated to the prevention of child sexual abuse material online to take swift and decisive action. During the study, it was discovered that the CyberTipline, the global clearinghouse through which platforms report CSAM for removal, along with other social media platforms, has proven disappointingly sluggish in action. Reports went unanswered for an entire month, a glaring testament to the lack of enforcement that perpetuates this industry.

Host: With this, I conclude today's episode. Stay informed and stay safe.

Additional Readings:
  1. Instagram Connects Vast Pedophile Network – The Wall Street Journal
  2. Instagram’s recommendation algorithms are promoting pedophile networks – The Verge
  3. Cross-Platform Dynamics of Self-Generated CSAM – Stanford University: stacks.stanford.edu/file/druid:jd797tp7663/20230606-sio-sg-csam-report.pdf
