Smoothwall Insights

AI-Enabled CSAM: Is Your School Equipped to Mitigate the Risks?

Written by Smoothwall | Dec 19, 2024 3:46:49 PM

The role of AI in education is widely discussed, but comparatively little is known about the significant risk this technology poses in facilitating the proliferation of child sexual abuse material (CSAM).

To gain a better understanding of the issue, Qoria, Smoothwall’s parent company, surveyed over 600 schools across the UK, USA, Australia and New Zealand. The survey explored what schools are experiencing regarding AI-enabled CSAM and how they are addressing this complex challenge.

A glimpse into some key findings

The AI tactics used to facilitate CSAM and explicit content sharing are evolving rapidly, requiring schools to remain vigilant and continuously update their safety protocols.

Around one-third of US, UK and Australian respondents said they were now experiencing incidents of students possessing, sharing or requesting nude images every month, with the average age of those involved reported as 11–13 years.

More than 30% of respondents were unfamiliar with the online grooming tactics perpetrators commonly use.

Increasingly, teachers and school staff are also finding themselves victims of image-based manipulation through the misuse of AI technology.

Download the full report

This first-of-its-kind report empowers schools to understand and address: 

  • How AI technology is facilitating CSAM and impacting student safety
  • AI techniques perpetrators are using to target students 
  • How to improve staff, parent and student understanding of these risks 
  • Mitigation strategies your school can implement now

To discuss any of the issues raised in the report, or learn more about solutions to mitigate the risks of AI-enabled CSAM, get in touch with us at enquiries@smoothwall.com. We’re ready to help.