In our latest whitepaper, “Who’s Safeguarding the Safeguarders”, we surveyed designated safeguarding leads (DSLs) and teachers about their role in protecting students. The data revealed that one in ten DSLs feel unable to identify a child suffering from mental health issues, while more than one in five (22%) neither agreed nor disagreed that they could.
In this blog, we look at why that might be, and how safeguarding technology can help DSLs spot children suffering from mental health concerns who might otherwise go unnoticed, or be noticed too late.
Whilst our report found that some safeguarding leads don’t feel confident identifying mental health issues in children, over one third admitted that, in the past year, they have seen something in their role that distressed or upset them.
The Health Organisation has reported that the number of children and young people with a mental health condition has risen dramatically. If left untreated, mental health problems can disrupt a child’s functioning at home, at school and in the wider community.
Additionally, the rise in mental health cases comes at a time when children’s exposure to harmful content online has increased. In fact, Smoothwall data shows that a child was found to have been involved in a serious sexual incident every 12 minutes, and in a serious cyberbullying, bullying or violent incident every 22 minutes.
Effectively safeguarding children has never been more important – or more challenging, given the rise in digital use and the additional pressures and anxieties since the pandemic. Because of this, it is unfair to place the entire burden of responsibility on a DSL’s ‘eyes and ears’ approach alone.
Depending on staff to notice negative behaviours with few tools at their disposal risks missing a child suffering from mental health issues, whilst also placing additional stress on DSLs.
Smoothwall’s own digital monitoring data shows that a seriously vulnerable child was identified every 37 minutes. This means detected activity included references to, or indicators of, self-harm, suicidal ideation and other mental health issues.
Dedicated human moderators are on hand 24/7, so concerning alerts can be raised with DSLs immediately. This real-time technology means any worrying digital behaviour can be identified and dealt with appropriately, without DSLs having to sift through the automated alerts themselves. Instead, DSLs have more time to deal with a risk effectively, giving the child all the attention and support they need.
Additionally, DSLs can spend more time analysing the findings to ensure their safeguarding processes are up to date and effective.