WHAT IS CSAM?

CSAM stands for Child Sexual Abuse Material: videos and/or images depicting scenes of child sexual abuse, created and circulated mostly via the internet and digital devices, and often used by child sex offenders to further exploit children. Although “child pornography” is the most commonly used term for such material, experts in the child protection sector state that it lessens the gravity of these crimes against children and recommend the terms “child sexual abuse material (CSAM)” or “child sexual exploitation material (CSEM)” instead. It is important to remember that a child cannot consent to their own abuse.

The Optional Protocol to the Convention on the Rights of the Child (CRC) on the Sale of Children, Child Prostitution, and Child Pornography (OPSC) defines Child Pornography in Article 2 (c) as “any representation, by whatever means, of a child engaged in real or simulated explicit sexual activities or any representation of the sexual parts of a child for primarily sexual purposes.”

According to the Wall Street Journal, the online CSAM trade is a multibillion-dollar business involving millions of images and videos, all produced through the sexual exploitation and torture of children. Even as leading law enforcement organizations around the world struggle to curb the rise of CSAM, perpetrators exploit improvements in technology to find new avenues for committing and spreading their crimes.

Today’s offenders use underground internet sites to share CSAM, while recording their victims on ordinary, inconspicuous digital devices such as mobile phones, tablets and laptops. In particular, a growing volume of CSAM depicting very young children, including babies and toddlers, has come to the attention of law enforcement agencies – pointing to an increase in the number of offenders involved in such crimes.

STATISTICS

INTERPOL’s Child Sexual Exploitation database holds more than 1.5 million images and videos and has helped identify 19,400 victims worldwide (INTERPOL, May 2019). According to the Internet Watch Foundation’s worldwide report, its analysts removed 105,047 webpages showing the sexual abuse of children in 2018. Of these:

  • 39% of the victims were aged 10 years and under
  • 1% of the victims were under 2 years
  • 78% of the victims were girls
  • 17% of the victims were boys
  • 44% of the material showed sexual activity between adults and children, including rape, sexual torture, or less extreme sexual activity
  • 82% of the material was from image hosting sites
  • 5% of the imagery was hosted in Asia

The National Child Protection Authority reported 15 instances of Obscene Publications, 15 instances of Soliciting a Child, and 7 instances of Gross Indecency in 2018. However, these figures represent only a minute portion of what truly occurs. Many cases of CSAM in Sri Lanka are not reported to the authorities due to societal pressure, or are not recorded properly after being reported due to a lack of knowledge. PEaCE is taking action to raise awareness and train the relevant state organizations.