The Department for Education states that it is “essential that children are safeguarded from potentially harmful and inappropriate online material” (KCSIE 2024). While many settings have safeguarding solutions in place to address text-based digital content that poses a risk, identifying harmful imagery is not as straightforward.
This article explores the challenges of protecting students from harmful images, and introduces Smoothwall Cloud Scan, a brand-new solution that addresses this problem and enhances digital safeguarding - without adding to staff workloads.
Digital safeguarding tools such as web filters and digital monitoring software can offer exceptional protection from harmful content and behaviours on websites and apps. However, the nature of images and the way they are accessed and saved mean they can sometimes evade a school’s existing digital safeguarding infrastructure.
For example, an inappropriate image that a student receives via an online chat at home and then uploads to a shared folder on the school drive may not be caught by these solutions.
Such content can then remain on the school drive, unbeknownst to designated safeguarding leads (DSLs) and IT staff. This risk can arise accidentally or deliberately. In the latter case, students may rename, compress or even convert images to a different format in order to disguise them as harmless material.
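This is one reason checks based on filenames or extensions alone are easy to defeat. The short sketch below is illustrative only - it is not Smoothwall's implementation - and shows how a renamed file still reveals its true format when its contents, rather than its name, are inspected:

```python
# Illustrative sketch only (not Smoothwall's code): why filename checks are weak.
# The first bytes of a file ("magic bytes") identify its real format, regardless
# of what the filename or extension claims.
from pathlib import Path

IMAGE_SIGNATURES = {
    b"\xff\xd8\xff": "jpeg",
    b"\x89PNG\r\n\x1a\n": "png",
    b"GIF87a": "gif",
    b"GIF89a": "gif",
}

def real_image_type(path: Path) -> str | None:
    """Return the actual image format based on file contents, not the extension."""
    header = path.read_bytes()[:16]
    for signature, fmt in IMAGE_SIGNATURES.items():
        if header.startswith(signature):
            return fmt
    return None

# A JPEG renamed to "homework.docx" still begins with the JPEG signature,
# so a content-based scan will still treat it as an image to analyse.
```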
As busy staff often lack the time or resources to regularly sift through their school’s cloud storage, high-risk images can go unnoticed for days, weeks or even months. During this time they can be shared and accessed by students.
Learn more: Detecting Unsafe Images in Cloud Storage: The Challenge and Solution
Cloud Scan is a detection tool that uses AI technology to identify harmful or inappropriate imagery in a school’s cloud storage. It automatically scans the school’s cloud drive every 24 hours, flags content deemed potentially harmful or a child safeguarding risk, and sends a summary email to the DSL.
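For readers who want a sense of the mechanics, the sketch below outlines a workflow of this general shape - scan, classify, flag, notify. It is a simplified illustration using assumed placeholder names (classify_image, send_email, the DSL address), not Smoothwall's actual code or API:

```python
# Hypothetical sketch of a daily scan-and-notify loop; placeholder functions stand
# in for the AI classifier and the school's email integration.
import time
from pathlib import Path

SCAN_INTERVAL_SECONDS = 24 * 60 * 60          # scan once every 24 hours
DSL_EMAIL = "dsl@example-school.sch.uk"       # placeholder address

def classify_image(path: Path) -> str:
    """Placeholder for an AI model returning a risk category for an image."""
    raise NotImplementedError

def send_email(to: str, subject: str, body: str) -> None:
    """Placeholder for the school's email system."""
    raise NotImplementedError

def scan_drive(root: Path) -> list[tuple[Path, str]]:
    """Walk the cloud drive, classify each file and collect anything flagged."""
    flagged = []
    for path in root.rglob("*"):
        if path.is_file():
            category = classify_image(path)
            if category != "safe":
                flagged.append((path, category))
    return flagged

def run_daily_scan(root: Path) -> None:
    """Scan once a day and email the DSL a summary of any flagged images."""
    while True:
        flagged = scan_drive(root)
        if flagged:
            summary = "\n".join(f"{p} - {cat}" for p, cat in flagged)
            send_email(DSL_EMAIL, "Cloud Scan: flagged images", summary)
        time.sleep(SCAN_INTERVAL_SECONDS)
```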
When an unsafe image is identified, the DSL is alerted and can review the flagged content and take action in line with school policy.
By combining the power and speed of AI with the accuracy and contextual understanding of the human eye, Cloud Scan helps schools maintain drives that are free from images that pose a risk to students.
Learn more: Detecting Unsafe Images in Cloud Storage: Combine AI and Human Insight
Reducing the risk of unknowingly storing harmful imagery not only protects students and networks, but also supports a school’s compliance goals. Keeping Children Safe in Education (KCSIE) requires that schools minimise student exposure to “illegal, inappropriate, or harmful content”, including content that could be categorised as pornography or self-harm.
The UK Safer Internet Centre’s (UK SIC) Appropriate Monitoring for Schools guide expands on this to include categories such as violence. Cloud Scan applies aspects of appropriate monitoring to visual content, as images in many of these categories, such as those depicting weapons or nudity, can quickly be flagged.
The data Cloud Scan provides to DSLs can also serve as evidence for senior leadership teams, governing bodies and Ofsted, demonstrating that clear actions are being taken in line with school policy. Cloud Scan stores the entire timeline of an image event, enabling DSLs to show active risk management and compliance with child safeguarding policies.
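As an illustration of what such a timeline might look like in data terms, the sketch below models a per-image audit record. The structure and field names are assumptions made for the example, not Smoothwall's data model:

```python
# Hypothetical audit record: a timeline of events attached to each flagged image,
# the kind of evidence trail that can be shown to governors or Ofsted.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ImageEvent:
    timestamp: datetime
    action: str   # e.g. "flagged", "reviewed by DSL", "deleted"
    actor: str    # e.g. "Cloud Scan", "DSL"

@dataclass
class FlaggedImageRecord:
    file_path: str
    category: str                                  # e.g. "nudity", "weapons"
    timeline: list[ImageEvent] = field(default_factory=list)

    def add_event(self, action: str, actor: str) -> None:
        """Append a timestamped entry to the image's event timeline."""
        self.timeline.append(ImageEvent(datetime.now(), action, actor))
```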
Learn more: Detecting Unsafe Images in Cloud Storage: How it Supports Compliance
Harmful or inappropriate images in school cloud storage can easily evade safeguarding systems and may be overlooked as a potential source of risk. Smoothwall Cloud Scan ensures that no unsafe image sits unnoticed for more than 24 hours before being flagged for review and removal.
By running in the background and providing DSLs with the information they need to make quick decisions, Cloud Scan strengthens digital safeguarding without the need for IT intervention. This delivers peace of mind while saving valuable staff time.
When you introduce Cloud Scan: