
Alarming Prevalence of Child Sexual Abuse Material on Social Media and Instant Messengers

Updated: Dec 1, 2023

STATEMENT



PRELIMINARY FINDINGS

Alarming prevalence of child sexual abuse material on the surface web: social media, instant messengers, and pornography platforms


It is more urgent than ever to turn the tide on the child sexual abuse material epidemic. Children are being subjected to sexual abuse and exploitation in every corner of the internet, and without robust legislation, we cannot keep them safe.


Statement from Protect Children, September 2023 (PDF, 216 KB)

Preliminary findings from our latest research on undetected child sexual abuse material (CSAM) users on the dark web reveal the alarming prevalence of child sexual abuse material on the surface web. The results uncover the widespread use of social media, instant messaging, and pornography platforms for viewing and sharing CSAM, as well as for directly contacting children.


81% of respondents to our ReDirection survey of people searching for child sexual abuse material on dark web search engines say that they have encountered child sexual abuse material, or links leading to CSAM, on the surface web, mostly on social media and pornography sites.[i]

34% have actively used social media or messaging apps to search for, view or share CSAM. The platforms they use are the same platforms that children use every day.[ii]


  • 81% have encountered child sexual abuse material on the surface web

  • 34% have actively used social media to search for, view or share CSAM

  • 70% of those who attempted to contact a child did so via social media or a messaging app


Perpetrators who view CSAM are likely to seek direct contact with a child afterwards: 37% of respondents say that they have sought contact, most of them using platforms on the open web to do so.[iii] 40% of those who have contacted a child say that they have done so via social media, 30% via a messaging application, and 26% through an online game.[iv]


[Figure: How have you attempted to establish the first contact with a child? (N=209)]


As demonstrated by our research, perpetrators search for, view, and share CSAM on popular social media and messaging platforms. Internet service providers that are currently facilitating this harm must be legally obliged to identify, report, and remove CSAM from their platforms.


To effectively protect children from falling victim to sexual violence online, material depicting sexual violence against children must be urgently removed, and grooming attempts must be prevented. This cannot be achieved without regulation mandating it; therefore, Protect Children believes that the proposed EU Regulation 2022/0155 to prevent and combat child sexual abuse provides a proportionate long-term solution to enable the effective use of technology to better prevent sexual violence against children online.


We strongly urge policymakers to support the proposed EU regulation to keep children safe online.



Read more about our ReDirection and Primary Prevention to Protect Children projects.


This research is conducted by Suojellaan Lapsia, Protect Children ry., with the support of the Tech Coalition Safe Online Research Fund.






[i] 32% (N=237 of 747 who answered the question) reported that they have encountered CSAM on social media. 32% (N=236 of 747) reported that they have encountered CSAM on pornography sites. In total, 81% (N=602 of 747) reported that they have encountered CSAM or links to CSAM somewhere on the surface web.

[ii] 34% (N=264 of 766 who answered the question) reported having used social media to search for, view or share CSAM. 30% (N=204 of 672) reported having used messaging apps to search for, view or share CSAM.

[iii] 37% (N=3,271 of 8,789 who answered the question) report having sought contact with a child after viewing CSAM.

[iv] 39% (N=56 of 142 who answered the question) report contacting a child via social media. 30% (N=43 of 142) report contacting a child via a messaging app. 25% (N=36 of 142) report contacting a child via an online game.
