Self-Generated CSAM on Social Media Platforms: Instagram at the Top


Meta's Instagram, a photo and video sharing app, is hugely popular among the younger generation in India and worldwide; a major portion of its user base belongs to this age group. A study commissioned by the National Commission for Protection of Child Rights (NCPCR) found that 24.3% of children aged ten have Instagram accounts, even though the minimum age to sign up for the platform is 13. This large presence of children and adolescents makes it pertinent to ask whether they are safe on these platforms. The fun of discovering content related to one's interests or previous searches makes it easy for this generation to spend excessive time scrolling.

But this is only half the reality, as Instagram is once again mired in controversy over the type of content it hosts. A joint report by The Wall Street Journal, Stanford University and the University of Massachusetts revealed that Instagram hosts a 'vast network of pedophiles sharing, promoting and recommending illegal underage sex content or Child Sexual Abuse Material (CSAM)'. The investigation brought to light some little-known facts about social media platforms and the vast landscape of the self-generated CSAM (SG-CSAM) industry. SG-CSAM is an image or video that is, or appears to be, created by the minor who is its subject. This challenges the common notion that CSAM is always created, distributed and monetized by adult offenders.


The report, titled 'Cross-Platform Dynamics of Self-Generated CSAM' and released in June 2023, investigates the prevalence of CSAM on social media platforms such as Instagram, Twitter, Telegram and Snapchat.

According to the report: 'The overall size of the seller network examined appears to range between 500 and 1,000 accounts at a given time, with follower, like and view counts ranging from dozens to thousands.'

The report's key findings include:


  1. Instagram is the platform with the largest number of channels and accounts that promote, advertise and monetize SG-CSAM. Instagram's content discovery and recommendation algorithms are the main driver of this whole network; Snapchat and Telegram, unlike Instagram, do not have this feature.
  2. On Instagram, commercial SG-CSAM accounts are a severe problem, and CSAM-related keywords are prevalent and can easily guide a user into the pedophile network.
  3. These account operators occasionally use hashtags and keywords to attract newcomers, and the recommendation algorithm inadvertently boosts the network by suggesting related accounts to users.
  4. When a user searches for CSAM-related terms on Instagram, it displays a warning, 'These results may contain images of child sexual abuse,' but strangely provides two options:
    1. Get resources to report content
    2. See results anyway
  5. Sellers post 'menus' of this type of content and also use transient platform features such as 'stories', making the material difficult to track down.
  6. Content menus and details are often hosted off-site, for example on Carrd, rather than on Instagram; a seller's Carrd page also lists their accounts on other social media platforms. These off-site links lead users to discover a wider variety of content, for instance self-harm videos and imagery of minors performing sexual acts with animals.
  7. To remain anonymous, the minors marketing and advertising this material usually accept payment in gift cards.

The study also highlighted the various policies adopted by platforms and global organizations. Still, the lack of enforcement is a significant concern: Meta has a comprehensive content policy, applied across all its platforms, that prohibits the sexualization of children, the advertising of CSAM, sexual conversations with minors and obtaining sexual material from minors.

Telegram is another platform that lacks enforcement and, in effect, implicitly allows the promotion, marketing and selling of SG-CSAM. According to the report:

'Telegram's Terms of Service states that posting illegal pornographic content is not allowed on publicly viewable channels, implicitly allowing CSAM on its platform, provided it is shared in private groups or direct messages. It further states that, "All Telegram chats and group chats are private amongst their participants. We do not process any requests related to them," presumably even if reported by a user. They further state, "To this day, we have disclosed 0 bytes of user data to third parties, including governments." '


This report highlights why big tech companies need to be held accountable for the content they host. The study explores the dark underbelly of this industry, where offenders are not only adults but minors themselves, which poses a serious question about how to handle the issue. The problem requires a holistic approach that focuses not only on detecting and monitoring CSAM but also on strengthening laws and rules to make it mandatory for platforms to proactively detect this illegal content, especially when the content is as serious as CSAM. Sensitization, education and awareness regarding SG-CSAM are also a must, so that minors and adolescents understand its dangerous consequences, such as sextortion.

There is an urgent need for timely action by the organizations working to prevent child sexual abuse material online. The study found that both the CyberTipline, the centralized mechanism for reporting CSAM, and the platforms' own removal mechanisms are slow: despite reports being made, no action was taken for a month, which again points to a lack of enforcement.
