The role of AI in education is widely discussed, but comparatively little is known about the significant risk this technology poses in enabling the proliferation of child sexual abuse material (CSAM).
To gain a better understanding of the issue, Qoria, Smoothwall’s parent company, surveyed over 600 schools across the UK, USA, Australia and New Zealand. The survey explored what schools are experiencing regarding AI-enabled CSAM and how they are addressing this complex challenge.
The AI tactics used to facilitate CSAM and explicit content sharing are evolving rapidly, requiring schools to remain vigilant and continuously update their safety protocols.
Around one third of respondents in the US, UK and Australia said they now experience incidents of students possessing, sharing or requesting nude images every month, with the students involved typically aged between 11 and 13.
More than 30% of respondents were unfamiliar with the common online grooming tactics perpetrators use.
Increasingly, teachers and school staff are also finding themselves victims of image-based manipulation through the misuse of AI technology.
This first-of-its-kind report empowers schools to understand and address these risks.
To discuss any of the issues raised in the report, or learn more about solutions to mitigate the risks of AI-enabled CSAM, get in touch with us at enquiries@smoothwall.com. We’re ready to help.