Cloud storage offers schools significant benefits in scalability and accessibility, but it also presents a digital safeguarding challenge. Harmful and inappropriate images can sit unnoticed in school drives for days, weeks or even months, and the sheer capacity of cloud storage makes manual checks burdensome for time-poor IT staff. As a result, students and networks may be at risk of exposure to unsafe material.
This article outlines the various challenges of identifying harmful content in cloud storage, and introduces a solution that can help schools keep their drives free from unsafe images.
Filtering and monitoring solutions are essential safeguarding tools for schools, but their image recognition capabilities are limited. Web filters manage access to online content, and digital monitoring addresses potentially harmful digital behaviours. Neither is designed to inspect images held in cloud storage, so they cannot be expected to catch all harmful content stored on a school’s drive.
The specific content of images can be difficult to decipher, as visual data doesn’t follow the same structural patterns as text-based material. In addition, images are open to manipulation techniques that can further obscure their content.
For example, in an effort to store inappropriate images on school drives without detection, students may rename images or temporarily convert them to another format so they appear harmless. Harmful images may also be compressed for storage, further disguising the unsafe material within.
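As a rough illustration of why content-based checks matter, the short Python sketch below flags files whose leading bytes identify them as images even though their filename extension suggests otherwise. It is purely illustrative and not a description of any particular product; the folder name and the signature list are assumptions chosen for the example.

```python
# Illustrative sketch: flag files that contain image data but carry a
# non-image extension (e.g. a .jpg renamed to .docx). The folder name
# "synced_drive" is a placeholder for a locally synced copy of a drive.
from pathlib import Path

# Leading-byte signatures ("magic bytes") for common image formats.
IMAGE_SIGNATURES = {
    b"\xff\xd8\xff": "jpeg",
    b"\x89PNG\r\n\x1a\n": "png",
    b"GIF87a": "gif",
    b"GIF89a": "gif",
}
IMAGE_EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif"}


def detect_image_type(path: Path) -> str | None:
    """Return the image type implied by the file's first bytes, if any."""
    with path.open("rb") as f:
        header = f.read(16)
    for signature, image_type in IMAGE_SIGNATURES.items():
        if header.startswith(signature):
            return image_type
    return None


def find_disguised_images(root: Path) -> list[Path]:
    """List files whose content is an image but whose extension is not."""
    return [
        p
        for p in root.rglob("*")
        if p.is_file()
        and p.suffix.lower() not in IMAGE_EXTENSIONS
        and detect_image_type(p) is not None
    ]


if __name__ == "__main__":
    for path in find_disguised_images(Path("synced_drive")):
        print(f"Possible disguised image: {path}")
```

Even a simple check like this will catch a renamed file, although it cannot judge whether the image itself is harmful; that still requires image analysis or human review.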
To ensure students are thoroughly protected from harmful content, schools should at least perform periodic manual checks of cloud storage. Unfortunately, given the large volume of data often stored on school drives, this can be a time-consuming process.
The laborious nature of manual checks means problematic content may be missed. Furthermore, because busy school staff rarely have time to accommodate such tasks, the number of checks that can realistically be conducted is often insufficient.
The first step in preventing harmful content from being stored on school drives is educating students and staff about their digital responsibilities. All network users should understand the consequences of storing or sharing harmful imagery, and the school policies that apply.
Smoothwall Cloud Scan is a digital safeguarding solution that integrates with Google Drive or OneDrive to quickly detect harmful images in storage. When potential risks are identified, an alert containing full contextual data is created, and Designated Safeguarding Leads (DSLs) receive a summary email of the images flagged in the latest scan. They can then review the alerts and decide whether to remove images or mark them as safe, with no intervention required from IT staff.
For example, if a student has uploaded an image depicting a violent gang attack to the school drive, it will be detected by Cloud Scan. The DSL will be able to view details of the incident and the risk categories identified, which in this case could be “weapons” and “gore”, and then decide whether to remove the image or mark it as safe.
Cloud Scan automatically assesses cloud storage every 24 hours, allowing staff to maintain control over drive data with minimal disruption to their day.
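For IT teams who want a sense of what cloud-side image discovery involves in principle, the sketch below uses Google's official Drive API client to list image files by MIME type rather than by filename. It does not describe how Cloud Scan works internally; the service account key file and the query are assumptions made for the example.

```python
# Illustrative only: enumerate image files in a Google Drive using the
# official google-api-python-client, discovering them by MIME type rather
# than by filename. "service-account.json" is a placeholder credential file.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
drive = build("drive", "v3", credentials=creds)

page_token = None
while True:
    response = drive.files().list(
        q="mimeType contains 'image/' and trashed = false",
        fields="nextPageToken, files(id, name, mimeType, modifiedTime)",
        pageToken=page_token,
    ).execute()
    for item in response.get("files", []):
        print(item["modifiedTime"], item["mimeType"], item["name"])
    page_token = response.get("nextPageToken")
    if page_token is None:
        break
```

Listing files is only the discovery step; judging whether a flagged image is actually harmful requires image classification and human review, which is where a dedicated safeguarding tool adds value.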