
Safeguarding in Schools – What’s new in 2023


Safeguarding remains a recurring issue on which both our School Support and Schools HR teams continue to advise. Our overview of safeguarding issues is available here. We have also produced a note on the new Online Safety Bill, which is currently at the Report stage (where MPs can discuss and amend it).

The Wider Context: the Online Safety Bill

The media coverage following the inquest into Molly Russell’s death again highlighted the need for social media platforms to be held accountable for the content that children and young people can access online. The coroner found that the harmful online content on Pinterest and Instagram relating to self-harm and suicide which Molly had been accessing “contributed to her death in a more than minimal way.” He made a number of recommendations but it remains to be seen what changes may be made to the Online Safety Bill which is passing through Parliament.

The Bill, currently at the Report stage (where MPs can discuss and consider amendments), introduces new rules for firms which host user-generated content, i.e. those which allow users to post their own content online, such as images, videos and comments, or to interact with each other through messaging, comments and forums. It also proposes tighter rules for search engines, which will have tailored duties focussed on minimising the presentation of harmful search results to users. The legislation will apply to companies including social media platforms, sites such as forums and messaging apps, some online games, cloud storage and the most popular pornography sites, as well as search engines, which play a significant role in enabling users to access harmful content.

Key provisions of the Bill include:

  • Tech platforms will need to remove illegal content, including child sexual abuse material, revenge pornography, the sale of illegal drugs or weapons, and terrorism content;
  • Platforms likely to be accessed by children will have a duty to protect young people using their services from legal but harmful material such as self-harm or eating disorder content;
  • Providers who publish or place pornographic content on their services will be required to have robust processes in place to prevent children from accessing that content;
  • Age limits (typically 13 years old) will need to be enforced, and platforms will need to explain how they will do this;
  • Platforms will need to carry out risk assessments concerning illegal content and the risks posed to children, showing how those risks will be mitigated. The process will be overseen by Ofcom, which will have the power to issue fines and enforcement notices;
  • If a child does encounter harmful content or activity, parents and children will be able to report it easily;
  • Platforms will have a duty to report any child sexual exploitation and abuse content that they encounter to the National Crime Agency; and
  • The Bill will introduce new criminal offences, including encouraging people to self-harm, sharing pornographic “deep fake” images, taking and sharing “downblousing” images, sending an unsolicited sexual image (so-called “cyberflashing”), and sending or posting a message that conveys a threat of serious harm.

Recent proposed amendments to the Bill include criminal sanctions (up to two years’ imprisonment) for senior managers at tech firms whose platforms persistently ignore Ofcom enforcement notices or have contributed to the serious harm, abuse or death of a child, or who hinder an Ofcom investigation or a request for information. However, some have argued that the scope of the Bill needs to be amended to specifically target major commercial operators: in its current state, many smaller platforms run by hobbyists (for example, those hosting multi-player games) and volunteers (for example, sites such as Wikipedia) would potentially be caught by the legislation too.

Online safety in schools

Schools have a responsibility to ensure that their pupils are safe online. The 2021 inquest into the death of Frankie Thomas, a 15-year-old with SEN, found that she took her own life after reading graphic online content accessed through an iPad while on school premises. The school’s internet security system had allegedly failed to block a storytelling platform on which Frankie had accessed stories relating to suicide and self-harm, and no alerts were generated to warn staff that inappropriate material was being accessed. Online safety statutory guidance for schools is contained within Keeping Children Safe in Education 2022 (KCSIE), which stipulates that:

  • All staff should have safeguarding and child protection training (including online safety) at induction, and this should be regularly updated;
  • All staff should be aware that technology is a significant component in many safeguarding and wellbeing issues. Children are at risk of abuse and other risks online as well as face to face, and in many cases abuse and other risks will take place concurrently both online and offline. Children can also abuse other children online; this can take the form of abusive, harassing and misogynistic/misandrist messages, the non-consensual sharing of indecent images (especially in chat groups), and the sharing of abusive images and pornography with those who do not want to receive such content;
  • Children should be taught how to keep themselves and others safe online;
  • It is essential that governing bodies and proprietors ensure that appropriate filtering and monitoring systems are in place;
  • An effective whole school approach is required to educate and protect pupils, and to establish mechanisms to identify, intervene in and escalate any concerns, including harmful content;
  • Schools need to communicate with parents regarding online learning, including which sites are being used and what filtering and monitoring the school has in place;
  • There should be annual reviews of online safety and risk assessments; and
  • Staff should be aware of the particular implications for children with SEND who are likely to be more vulnerable to abuse.

KCSIE links to the UK Safer Internet Centre and many other resources, including sites which can test filters, but much of the advice tends to be centred around the Prevent duty and child abuse rather than self-harm/suicide content. Schools may need to take a more proactive approach to ensure that pupils are kept safe from these other types of harmful online content.

Policies

Schools must ensure that they regularly review their Child Protection and Online Safety policies. Schools should have a policy for the use of mobile and smart technology, as well as an Online Safety policy, and these should link to the school’s Child Protection Policy (KCSIE paragraph 138). Schools should also have a policy which deals with child-on-child abuse, and should ensure that staff are aware of the policy and understand that this type of abuse can happen inside or outside school, and online.

Primary aged children

It is important to note that the number of children being targeted for online sexual abuse increased most dramatically in the 7-10 age group in the period January to June 2022 (source: Internet Watch Foundation article dated 8 August 2022). Much of this material is self-generated imagery, where child victims have been manipulated into producing and sharing images using a smartphone, tablet or webcam from the “safety” of their own bedrooms, without the knowledge of the adults present in the house. This surge in self-generated abuse material is particularly evident in boys aged 7-13. Primary schools therefore need to pay particular attention to their online safety curriculum planning and ensure that their policies are up to date and understood by staff. There is a wealth of resources for primary schools, including those produced by LGfL and the NSPCC, which tackle the sexual abuse of young children, including online abuse.

How we can help

Our team can advise on and undertake reviews of policies. We also have expertise in advising on a range of issues concerning safeguarding in schools and related settings. More information on our safeguarding expertise and experience can be found here.
