AI technology can be used to enhance education, but it’s also creating new threats for schools to mitigate. For example, perpetrators are utilising AI tools to personalise manipulation strategies and produce child sexual abuse material (CSAM) and other explicit content. Rather than tackling this serious issue alone, it’s imperative for schools to engage parents in addressing AI-enabled CSAM, particularly as much of students’ digital behaviour takes place at home.
This article acknowledges the difficulty educators can face in engaging parents in digital safety measures, and provides strategies schools can use to recruit this vital community in the fight against AI-enabled CSAM.
The challenge of engaging parents
In Qoria’s latest Global Insights Report, 77% of UK schools surveyed said one of their most significant challenges in addressing issues of AI, CSAM, and explicit content is “lack of awareness among parents”.
Not My Child (NMC) Syndrome
‘Not My Child’ Syndrome refers to the tendency of some parents to believe that their children are not susceptible to the risks and dangers present in the online world, despite evidence suggesting otherwise.
The NMC mindset can lead parents to:
- Underestimate the risks their children encounter online
- Not monitor or supervise their children's online activities adequately
- Not educate their children about online safety and responsible technology use
- Dismiss warning signs or concerning online behaviour exhibited by their children
Without parental involvement in reinforcing online safety, schools may struggle to address the growing threat of CSAM and protect students from exploitation.
While the process can be difficult, schools should persist and innovate in engaging and enabling parents to safeguard their children.
Strategies to educate and engage parents
A comprehensive strategy that includes consistent communication, workshops, and educational resources can help to build awareness amongst parents, resulting in a more informed and vigilant community.
Provide regular resources and communication channels
Resources that provide helpful, ‘snackable’ content through centrally located online platforms (such as Smoothwall’s Online Safety Hub) have proven both popular and effective with parents.
Regularly communicating the availability of these platforms is key, along with the establishment of additional open communication channels. This facilitates interactivity and advice sharing, and provides parents with agency and a sense of ownership over their child's online world.
Share regular updates
Use parent year group communication channels to share information from teachers or leaders directly with parents. Content such as AI App of the Week overviews, scam alerts, or even incident management tips can help reinforce effective and consistent safety messaging, especially if there has been a high-profile incident in the media that could impact students.
Family technology nights
Host events where families can come together to learn about technology and its impact on education. These nights can include demonstrations, discussions, and activities designed to engage both parents and students.
Host parent workshops
Organise workshops or information sessions focused on AI, CSAM, avoiding exploitation, and how to enhance digital safety. These can also delve into crossover topics such as online privacy, identifying misinformation, and understanding the implications of technology for children’s education and wellbeing.
Collaborate with experts
The digital world is evolving quickly, and it is unrealistic to expect teachers to provide all the answers. Schools can invite guest speakers or experts in child digital safety, child psychology, or law enforcement to speak to parents about online safety.
These professionals can provide diverse and valuable insights, real-life examples, and actionable advice to help parents understand the risks and equip them with effective strategies to safeguard their children.
Establish policies around AI use
Before implementing strategies to educate parents on AI-enabled CSAM, it’s important for schools to have clear policies in place regarding the use of AI tools. These should be informed by risk assessments which incorporate considerations such as GDPR and data protection laws.
Further guidance on how to create AI policies can be found in the Generative artificial intelligence (AI) in education paper from the Department for Education. School leaders and DSLs may also find this AI policy template from the South West Grid for Learning Trust (SWGfL) useful when establishing AI policies.
Increasing parental education and engagement can be an ongoing challenge for schools, but it's worth the sustained effort, given the additional layer of protection it provides children.
To explore the Online Safety Hub, or to learn how our Training and Consultancy service could support your school in engaging parents in digital safety, contact our experts at enquiries@smoothwall.com. We’re ready to help.

Discover further strategies to prevent the spread of AI-enabled Explicit Content
Download Qoria’s latest Global Insights Report: Addressing the Risks of AI-enabled CSAM and Explicit Content in Education, for information on strategies to help parents, students and school staff fight the rise of AI-enabled CSAM.