BLOG POST
#ReDirection 2023 Blog Post Series 04
The creation, use, and dissemination of child sexual abuse material (CSAM) is an urgent public health issue. CSAM use entails the continuous revictimization of the children depicted and has a profound negative impact on victims. Trauma from experiencing sexual violence may manifest daily, affecting every aspect of a victim’s life: an inability to keep an ordinary routine or cope with stress, difficulty in maintaining close relationships, and challenges in completing education or pursuing a desired career. These negative effects may also persist into adulthood. Victims have to cope with the awareness that the material depicting their abuse might still be available on the internet or stored by the perpetrator. They often fear being recognized by the perpetrator or by other CSAM users (Canadian Centre for Child Protection).
Our research demonstrates that CSAM use endangers not only the children depicted in the abuse material: many CSAM users try to contact a child online. Establishing contact with a child may result in further offences: CSAM users can try to lure the child to an in-person meeting or manipulate them into live-streaming or producing sexual images or videos.
In the ReDirection project, we have developed the ‘Help us to help you’ survey, which collects data about the experiences, emotions and thoughts of CSAM users. When we asked CSAM users about seeking direct contact with a child, we received the following results:
52% of users have felt afraid that viewing CSAM might lead to sexual acts against a child
44% of users have thought about seeking direct contact with a child online after watching CSAM
37% of users have sought direct contact with a child online after watching CSAM
Crimes of sexual violence against children must be addressed before they occur. Protect Children conducted innovative research to learn more about effective prevention. Using the data collected via the ‘Help us to help you’ survey, we analysed potential risk factors for CSAM users contacting children online.
“Risk Factors for Child Sexual Abuse Material Users Contacting Children Online: Results of an Anonymous Multilingual Survey on the Dark Web”
Together with Professor of Criminology Mikko Aaltonen and cybercrime expert Dr Juha Nurmi, Protect Children explored risk factors for CSAM users contacting children online. In our research, we examined whether there is a link between contacting a child online and the following factors: age at first exposure to CSAM, nature of first exposure, intensity of CSAM use, type of material, thoughts before CSAM use, emotions before CSAM use, disclosure of CSAM use, and contact with other CSAM users.
We studied a sample of 1,546 individuals who answered the ‘Help us to help you’ survey on the dark web. The sample consists of people who search for, view or share CSAM on the dark web. We did not collect any identifiable data about the respondents and do not know their age, gender or nationality.
After running the analysis, we identified four risk factors:
1. Older age at first exposure to CSAM
Users who first saw CSAM when they were 26–35 years old appeared to be the most likely to have already sought direct contact with a child online. An increased probability is also associated with first exposure to CSAM after the age of 35.
2. Viewing CSAM depicting toddlers and infants
Users who search for, view and share CSAM depicting toddlers and infants are more likely to seek direct contact with a child online. These users also report the highest level of fear that their CSAM use will lead to a direct sexual offence against a child.
3. Having thoughts of self-expression prior to viewing CSAM
Our results revealed that having thoughts of self-expression before CSAM use significantly raises the likelihood of a CSAM user contacting a child online. Similar to the previous finding, these users are the most likely to fear that their CSAM use will lead to a direct sexual offence.
4. Being in contact with other CSAM users
CSAM users who are in contact with like-minded individuals are more likely to seek contact with children online. Communication between CSAM users may reinforce cognitive distortions related to CSAM use: support received from others can serve to justify and encourage further unlawful behaviour, including seeking contact with children.
These findings demonstrate how vital it is to facilitate the removal of CSAM from the internet. To reduce the availability of CSAM, Protect Children takes part in Project Arachnid, an innovative tool that uses hashing technology to detect CSAM on the internet. After potential CSAM is detected, it is shared with trained specialists who verify the content of the image. If the image indeed depicts child sexual abuse, Project Arachnid issues removal notices until the content becomes unavailable.
Since 2020, Protect Children specialists have analyzed
1,348,750 CSAM images
There is a child in every image. We need to unite our efforts and work together to prevent and stop the distribution of CSAM.
Our research “Risk Factors for Child Sexual Abuse Material Users Contacting Children Online: Results of an Anonymous Multilingual Survey on the Dark Web” was published in the Stanford Internet Observatory’s Journal of Online Trust and Safety in 2022. The full article with our findings is available open-access via the link below.
To protect children from all forms of sexual violence, we need to develop stronger prevention. CSAM use is a grave crime and may lead to even more serious contact offences.
Read more about Protect Children’s strong offender-focused prevention work on the following pages: