Indicator 3: Global Nature of Online Child Sexual Abuse Material (CSAM)

Introduction

% of online CSAM hosted/uploaded across world regions

Percentages shown in the map are based on the number of CSAM reports from all data sources combined.


World regions are based on UNICEF regional classifications.

For 3.6% of detected CSAM, the hosting location could not be determined.

The internet penetration rate is affected by users who misrepresent their physical location so as to appear to be accessing the internet from another country; as a result, the apparent number of users may exceed the population of a given country.


The following charts show the regional breakdown of total sent reports/notices of CSAM by UNICEF World Region. The cumulative total number of reports/notices was calculated by compiling the data from IWF, NCMEC and C3P.
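The compilation step described above can be sketched as a simple aggregation. This is illustrative only: the per-source counts below are hypothetical placeholders, not figures from the IWF, NCMEC or C3P reports.

```python
from collections import defaultdict

# Hypothetical report/notice counts per source, keyed by UNICEF world region.
# The real figures come from the IWF, NCMEC and C3P public reports.
source_counts = {
    "IWF":   {"Europe and Central Asia": 100, "East Asia and Pacific": 40},
    "NCMEC": {"Europe and Central Asia": 80,  "North America": 60},
    "C3P":   {"Europe and Central Asia": 50,  "East Asia and Pacific": 20},
}

def cumulative_by_region(counts):
    """Combine report/notice counts from all sources into regional percentages."""
    totals = defaultdict(int)
    for per_region in counts.values():
        for region, n in per_region.items():
            totals[region] += n
    grand_total = sum(totals.values())
    # Each region's share is expressed as a percentage of all reports combined.
    return {r: round(100 * n / grand_total, 1) for r, n in totals.items()}
```

A region's percentage therefore reflects its share of the pooled total, not its share within any single organisation's data.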


Public reports by organisations working to detect and act on CSAM


Sex of victims shown in detected CSAM online, by reporting organisation

Showing the sex of victims as identified by CSAM analysts.



Age of victims shown in detected CSAM online, by reporting organisation

Showing the age of victims as identified by CSAM analysts. Age groups have been harmonised across data sources by Childlight.

Ages were grouped into:

  • Infant and Toddler: 0-2 years of age
  • Prepubescent: 0-13 years of age (unless otherwise represented in the previous category)
  • Pubescent/Post-pubescent: 14-17 years of age
  • Mixed Ages: When multiple victims from differing age categories are involved in the same abuse
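The grouping above can be expressed as a small classification function. This is a minimal sketch of the harmonised categories as described, not any organisation's internal coding scheme.

```python
def age_group(ages):
    """Map victim ages (in years) to the harmonised age categories above.

    `ages` is a list with one entry per victim depicted in the same material.
    Illustrative only: names and cut-offs follow the grouping described in
    the text, not a published coding manual.
    """
    def single(age):
        if 0 <= age <= 2:
            return "Infant and Toddler"
        if age <= 13:
            return "Prepubescent"
        if age <= 17:
            return "Pubescent/Post-pubescent"
        raise ValueError("not a child age: %r" % age)

    groups = {single(a) for a in ages}
    # Multiple victims from differing age categories in the same abuse
    # fall into the 'Mixed Ages' category.
    return groups.pop() if len(groups) == 1 else "Mixed Ages"
```

For example, material depicting a 2-year-old and a 15-year-old together would be classified as Mixed Ages.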


Showing severity of abuse as classified by CSAM analysts. Childlight has grouped classifications into two categories:

  • CSAM: notices and reports sent that meet the widely accepted definition of illegal child sexual abuse material, and
  • CSEM, harmful and exploitative material: content that may not meet the internationally understood threshold for illegality but may be illegal in certain countries and regions, and is nevertheless harmful to the specific child depicted or to children in general.
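Harmonising the sources amounts to mapping each organisation's own severity labels onto the two categories above. The labels in this sketch are hypothetical stand-ins; each organisation's real classification scheme differs (see the Technical Note and Companion Report).

```python
# Hypothetical source labels mapped to Childlight's two harmonised
# severity categories; real IWF/NCMEC/C3P labels differ.
HARMONISED = {
    "Category A": "CSAM",
    "Category B": "CSAM",
    "Category C": "CSAM",
    "Harmful/exploitative": "CSEM, harmful and exploitative material",
}

def harmonise(label):
    """Return the harmonised severity category for a source-specific label."""
    return HARMONISED[label]
```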



Each of the organisations has a slightly different way of differentiating the severity of abuse depicted in the CSAM detected. Readers are encouraged to view this graph and table alongside the Technical Note (found in the introduction box) and the Companion Report in order to understand the differences in the data between sources and Childlight's harmonisation.


Removal of CSAM content

None of the data sources could indicate precisely when all the reported content was removed; some CSAM may therefore still have been in the process of removal at the time the organisations published their reports (2018-2022). Removal is a continuous process that requires hosts and organisations to work together toward the goal of removing all CSAM from their platforms. All of the organisations continue to send notices until the content is removed.



Removal lag: the time between a request to remove CSAM being sent and the content actually being removed.
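As a worked illustration of the metric, removal lag is simply the elapsed time between two dates. The per-item request and removal dates here are hypothetical; the reporting organisations do not publish item-level timestamps in the data used for this indicator.

```python
from datetime import date

def removal_lag_days(requested, removed):
    """Days between a removal request being sent and the content coming down.

    Illustrative calculation only: `requested` and `removed` are hypothetical
    per-item dates, not fields available in the published source data.
    """
    lag = (removed - requested).days
    if lag < 0:
        raise ValueError("removal date precedes the request date")
    return lag
```

For example, content requested for removal on 1 January and taken down on 4 January has a removal lag of three days.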