If your organization serves consumers in person, whether in your facilities or through off-site programs, online sexual abuse may seem distant from your day-to-day operations and safety protocols. Unfortunately, as online and in-person relationships become increasingly intertwined, organizations have a growing responsibility to be aware of and guard against online abuse.
While there is no single official definition of online child sexual abuse (online CSA), also called technology-facilitated CSA, researchers, law enforcement, and nongovernmental organizations (NGOs) use these terms to describe multiple categories of interactions through online channels such as social media, gaming platforms, texting, and other messaging and image-sharing services.1
In a recent keynote address for abuse prevention practitioners, researchers, and advocates, Simon Bailey, CBE, QPM, DL, put it succinctly: “There’s never been, to our shame, a better time to abuse.”2 To date, Child Rescue Coalition has identified 72.5 million unique IP addresses worldwide that have shared or downloaded child sexual abuse material (CSAM), and reports of online abuse are increasing every year. According to the National Center for Missing and Exploited Children (NCMEC), reports of online enticement increased by more than 300% from 2021 to 2023.
Online CSA may look and feel different from conventional CSA because of the perceived distance and anonymity of an online encounter compared with an in-person one. However, online CSA depends on the same three key factors you guard against when designing your abuse prevention practices: access, privacy, and control. Online CSA offenses are also associated with many of the same victimization risk factors as in-person sexual abuse, “including parental maltreatment, bullying, other forms of victimization as well as female gender and sexual minority identity.”3
And while stereotypes persist about anonymous online predators, there is evidence that so-called stranger danger may not be the most prevalent concern. Recent studies provide “evidence that a significant amount [of] online CSA may be perpetrated by an individual who was known to the child offline.”4
The most recent findings from the National Juvenile Online Victimization (NJOV) Study indicate that the majority of perpetrators (62%) were not strangers. A 2023 study investigating the role of technology in the perpetration of in-person abuse found that 23.5% of those who experienced online solicitation for sexual images or in-person sexual activity knew the perpetrator. That number grew to an alarming 79.5% among those who reported experiencing online grooming – nearly eight out of ten online grooming victims were groomed by someone they knew. And these perpetrators are not all adults.
An estimated 30–50% of all child sexual abuse cases are perpetrated by youth under 18. Additionally, a 2022 study of online CSA in the US found nearly identical results for abuse that occurs online: in cases where the age of the perpetrator was known, between 32% and 52% were under 18.4
Additional research highlights the complexity of this problem. A 2023 study reported that among its respondents who had experienced online CSA:
But these findings can be misleading: victims are not willingly engaging in their own abuse. While such images are sometimes created under duress, they are often shared voluntarily in the context of a consensual intimate relationship and then misused or distributed without consent.
As we shared in our 2024 Praesidium Report, the increasing sophistication of artificial intelligence (AI) is compounding this already thorny issue. Generative AI tools are now being used to create CSAM that is often indistinguishable from authentic imagery.
In its 2023 report How AI is being abused to create child sexual abuse imagery, the Internet Watch Foundation (IWF) indicates significant potential for the rapid growth of AI-generated CSAM. Text-to-image technology allows individuals to enter a text description of what they want to see into an online generator; AI software then creates the image or images. According to the IWF, “technology is fast and accurate… many images can be generated at once – you are only limited by the speed of your computer.”
The analysis also provides “reasonable evidence that AI CSAM has increased the potential for re-victimization of known child sexual abuse victims,” because new abusive images can be generated from existing CSAM depicting those victims, which can result in additional exploitation, bullying, extortion, and harassment.
Your programs and spaces naturally allow relationships to form with and between consumers. You can’t control what happens outside your programs, but you can limit opportunities for inappropriate online contact between your employees, volunteers, and consumers by consistently enforcing appropriate policies and raising awareness.
Online CSA is a complex and pervasive risk that may seem too daunting for any organization to tackle alone. Praesidium can help you take action. Contact us for support with policy creation, risk assessment, implementation guidance, or additional resources.
References