
New #ReDirection report on Spanish-speaking CSAM users published


Child sexual abuse material: 70% of Spanish-speaking viewers first exposed before the age of 18, study finds

A combination of Protect Children's innovative research methods and the invaluable expertise of the Colombian NGO RedPaPaz has led to unprecedented research on Spanish-speaking child sexual abuse material (CSAM) users. New findings on the individuals who search for and view CSAM on the dark web highlight the need for immediate action to introduce adequate child protection mechanisms on the internet and prevent sexual violence against children.

The research, conducted by the Finnish child rights non-profit organisation, Suojellaan Lapsia, Protect Children ry, gathered responses from over 23,300 participants in 21 languages, including over 2,700 Spanish-language participants.

Protect Children, a Helsinki-based NGO, takes a research-based, holistic approach to preventing all forms of sexual violence against children. In the ReDirection project, the organisation collected data from anonymous CSAM users on the dark web: when users searched for CSAM on dark web search engines using relevant keywords, they were prompted to complete one of two surveys, ‘Help us to help you’ or ‘No need for help’. A total of 2,769 respondents answered the surveys in Spanish, making Spanish the second-largest language group among respondents.

In 2022, the National Center for Missing and Exploited Children received over 32 million reports of suspected online child sexual exploitation, and the Internet Watch Foundation received 255,588 reports containing CSAM: images, videos, livestreams, and any other material that depicts sexual violence against a child. With the rapid development of technology, crimes of sexual violence can be committed without the need for physical proximity, and a single offender can victimise tens or hundreds of children in an instant.

A particular concern is that over half of Spanish-speaking respondents were first exposed to CSAM accidentally: 70% before the age of 18 and 40% before the age of 13. They often justified their CSAM use by its easy accessibility on the internet: “Reasons why I haven't managed to end this vice: pornography on social media; ease of access on both normal and illegal pornographic sites.” “I was looking for normal porn and something related to child porn came up.”

“This finding demonstrates the urgent need to introduce stronger legislation that obligates service providers to facilitate the removal of CSAM and to set sufficient child safety measures,” underlines Valeriia Soloveva, Junior Specialist at Protect Children.

The research results highlight that all children can be at risk of victimisation: 25% of Spanish-speaking CSAM users search for CSAM depicting boys aged 4-14 years, and 37% search for CSAM depicting girls in the same age bracket. Although CSAM depicting girls is searched for more often overall, the Internet Watch Foundation reports that the availability of CSAM depicting boys has grown since 2021. “We must ensure that all children, regardless of their gender, have a non-violent childhood. To enhance child protection, more research is needed to better understand the relationship between gender and vulnerability to particular types of crime,” stresses Tegan Insoll, Head of Research at Protect Children.

Only 15% of Spanish-speaking respondents reported that they have sought help to stop viewing child sexual abuse material, and only 3% have received help. “The low number of respondents who either sought or received help signals the need for low-threshold services for people who want to change their harmful behaviour. We have to offer help to those willing to accept it,” states Nina Vaaranen-Valkonen, Executive Director of Protect Children.

Based on the research findings, Protect Children and RedPaPaz recommend the following:

  1. Strengthen legislation to respect the rights of the child and adopt regulations and mechanisms to enforce mandatory detection and removal of CSAM.

  2. Invest in widespread implementation of perpetration prevention initiatives and encourage help-seeking behaviour of potential perpetrators, to prevent harm before it occurs.

  3. Increase the availability of effective help resources for victims and survivors.

  4. Amplify the voices of victims and survivors in public and policymaking and break the silence and the taboo around crimes of sexual violence against children.

  5. Introduce age-appropriate digital safety skills and sexual education.

  6. Raise research-based awareness about the prevalence and devastating effects of child sexual abuse.

The report continues a series of studies based on ReDirection project data, e.g. Insoll, Soloveva, Ovaska & Vaaranen-Valkonen, “Russian Speaking CSAM Users in the Dark Web: Findings from Russian Language Respondents to ReDirection Surveys of CSAM Users on Dark Web Search Engines”; Insoll, Ovaska, Nurmi, Aaltonen & Vaaranen-Valkonen, “Risk Factors for Child Sexual Abuse Material Users Contacting Children Online”; and Insoll, Ovaska & Vaaranen-Valkonen, “CSAM Users in the Dark Web: Protecting Children Through Prevention”.

Suojellaan Lapsia, Protect Children ry. is the only organisation in Finland dedicated to ending all forms of sexual violence against children. The new research is part of the ReDirection project, which is funded by the Safe Online Initiative at End Violence. Safe Online has invested over US $71M in 89 projects around the world to create a safer internet for children.

For more information, contact:

Tegan Insoll, Head of Research, Specialist, +358 40 081 0020

Valeriia Soloveva, Junior Specialist, +358 40 081 0044

Nina Vaaranen-Valkonen, Executive Director, Senior Specialist, Psychotherapist, +358 40 747 8829
