by
Bec May
The 2025 edition of Keeping Children Safe in Education (KCSiE), published in July "for information" ahead of coming into force, has finally arrived on our desks. While it doesn't deliver the sweeping changes many safeguarding leads were bracing for, especially in relation to technology, the statutory guidance does tighten expectations in several core areas.
Building on the foundations of KCSiE 2024, the guidance in the 2025 edition (published July 9 and coming into effect from September 1, 2025) refines rather than reinvents the existing statutory framework. It introduces tighter expectations around online safety, expands on the Four Cs of digital risk, and reinforces the importance of proactive monitoring, mental health awareness, and digital accountability across the whole school community.
Let’s take a look at the changes and what they mean for UK schools.
“Governing bodies and proprietors should ensure that their school or college has appropriate filtering and monitoring systems in place and regularly review their effectiveness.”
(KCSiE 2025, Part 2, para 138)
“Schools and colleges should consider how they monitor online activity… and whether that monitoring is effective and proportionate.”
(para 139)
It also recommends schools use the DfE's Plan Technology for Your School service to assess provision.
Along with the continued expectation to filter and monitor online activity on school devices and school networks, schools are now being asked to demonstrate the effectiveness of their approach. There's a greater emphasis on understanding how your system works, what it flags, and how you respond to alerts. Passive setups will no longer cut it.
History has shown that filtering and monitoring often fall short when schools either outsource the setup or lack the necessary technical knowledge to assess what is actually working. Fastvue is built for non-technical staff — giving DSLs, pastoral teams, and leadership a clear, accessible view of what’s happening across the school network, without needing IT to interpret it.
Our monitoring and reporting solution:
Sends real-time alerts to the right people when it matters most — so if a student searches for something worrying, like suicide, self-harm, bullying, or abuse at home, your wellbeing team can act fast. Fastvue also flags when students access high-risk platforms known for grooming or predatory behaviour, giving you the chance to step in before harm escalates.
“All staff should receive safeguarding and child protection training… including online safety, harmful online content, and the risks associated with social media.”
(KCSiE 2025, Part 1, para 14)
A new emphasis is placed on:
Self-generated sexual content
Misinformation and disinformation (fake news)
Online conspiracy theories
(KCSiE 2025, Part 2, para 137)
Schools are now expected to do more than just give students vague warnings about 'online dangers'. There is now a push to tackle real issues that young people confront almost every time they go online: manipulation, fake news, radicalisation, and social media harms. Staff must proactively help students navigate these risks, with digital literacy education becoming a standard part of PSHE. Designated safeguarding leads also need network visibility to identify when students are engaging with these topics online.
Fastvue gives your DSLs and wellbeing teams:
A Safeguarding report showing searches and web visits around topics such as suicide, radicalisation, and eating disorders
Real-time alerts for new threats via custom keyword libraries
Evidence to drive student conversations and lesson planning with real-world examples
“Schools should take appropriate action to meet the cybersecurity standards and consider the DfE’s guidance on generative AI.”
(KCSiE 2025, paras 143–144)
Cybersecurity is now front and centre in safeguarding. Schools are expected to maintain secure digital infrastructure, respond to evolving cyber threats, and stay informed about AI platforms, including how students are using (or misusing) them. This isn't optional: schools should now be able to demonstrate how they detect, manage, and respond to risks such as malware, phishing, botnets, and the use of tools like ChatGPT or Grok, and to apply lessons learned from past incidents.
Fastvue helps schools see what's really happening on their school network, from AI tool usage to intrusion events, without needing a cybersecurity degree to interpret the data.
AI Platform Tracking: Fastvue reports on students accessing platforms like ChatGPT, Grok and other generative AI tools, allowing schools to review usage and alignment with digital learning policies.
Cyber Threat Alerts When It Counts: Fastvue can be set up to send instant alerts when your firewall blocks a threat, whether it’s malware, a phishing attempt, a botnet callout, or a banned site. You’ll know who triggered it, what they were trying to do, and when — no digging through logs required.
AI Monitoring with Context (Fastvue Reporter for FortiGate Only): If students are using ChatGPT or similar tools, Fastvue Reporter for FortiGate can display the exact prompt content, allowing staff to determine whether it's for research purposes or something that requires closer examination.
Firewall & VPN Reporting: See attempts to bypass controls (VPNs, proxies) and visits to blocked categories like hacking, crypto mining, or inappropriate file sharing, so you know who's trying to get to what, and how often.
“Schools and colleges should have clear policies regarding the acceptable use of technologies... and staff behaviour online, including communication with pupils.”
(KCSiE 2025, para 155)
The statutory guidance includes increased clarity on:
Use of personal devices
Staff messaging pupils outside approved platforms
Risks of unmonitored digital communication
It’s a sad but true reality: it’s not just students that safeguarding staff need to be watching out for. The 2025 guidance reinforces the need for clear boundaries, enforceable policies, and the ability to spot red flags when staff behaviour crosses a line, especially online, to protect against child sexual abuse.
Tracks App & Website Use by User: Fastvue helps schools identify if staff members are accessing apps or websites not covered by policy, including unauthorised messaging tools or social platforms during school hours.
Flags Suspicious Activity: If someone is spending an unusual amount of time on messaging apps or attempting to access student profiles via social media, Fastvue can flag it in daily safeguarding reports or be picked up in custom alerting rules.
Builds a Clear Timeline: In more than one real-world case, schools have used Fastvue to build a detailed timeline of staff online behaviour, reporting on the apps accessed, when they were accessed, for how long, and in what context. That timeline can then be used for internal review, escalation, or disciplinary processes.
“Mental health problems can, in some cases, be an indicator that a child has suffered or is at risk of suffering abuse, neglect or exploitation.”
(KCSiE 2025, para 48)
What it means for schools:
Schools are now expected to notice when digital behaviours might point to mental health issues. This means spotting trends, not just isolated incidents, and responding early. Fastvue supports this in several ways:
Flags repeat patterns of worrying searches (for example, suicide methods, self-harm videos)
Makes it easier to triage and prioritise students showing signs of distress
Empowers DSLs and wellbeing staff to step in early, not after the fact
KCSiE 2025 Reference: Part 2, paragraph 222
The guidance clarifies that Virtual School Heads (VSHs) now have a statutory responsibility to promote the educational outcomes of all children who have been assigned a social worker, not just those in care. This change reinforces the importance of proactive, joined-up support between schools and local authorities.
KCSiE 2025 Reference: Part 4, Section 2, paragraph 379
KCSiE 2025 continues to emphasise a transparent safeguarding culture by encouraging schools to record all low-level concerns, even those that don't meet the threshold for formal allegations, and to reassess how these records are maintained and managed.
KCSiE 2025 Reference: Part 4, Section 1, paragraphs 362–368
There’s updated advice on managing concerns about staff and volunteers, including those related to online conduct, inappropriate communication, or the use of personal devices. The guidance emphasises that allegations must be addressed promptly and consistently.
KCSiE 2025 Reference: Part 2, paragraph 153
The guidance reminds schools that safeguarding doesn’t end at the gates. Staff must be aware of risks that occur outside regular school hours, off-site, or via digital platforms, and DSLs should ensure policies reflect this broader duty of care.
KCSiE 2025 Reference: Part 2, paragraphs 109–116
Multi-agency working continues to be a statutory expectation. DSLs are reminded of their role in escalating concerns through the appropriate channels and collaborating with local safeguarding partners (LSPs), early help services, and children’s social care where necessary.
KCSiE 2025 Reference: Part 1, paragraphs 14, 48, 49, 52, 53
These updates put more emphasis on frontline staff understanding the link between online exposure and mental health, being alert to signs of distress, and treating harmful online content — including child sexual abuse and manipulation — as core safeguarding concerns.
KCSiE 2025 Reference: Part 2, paragraph 137
The Four Cs framework is clarified with new, modern examples of online harm. Schools are now expected to educate and monitor students for exposure to issues including conspiracy theories, grooming in games, disinformation, and financial manipulation. We've created a Mastering the 4 Cs of Digital Safeguarding checklist to make this easier for your school.
Fastvue Reporter isn’t a filter, and that’s the point. Your firewall is already world-class at blocking dodgy sites and dangerous traffic. Fastvue steps in where filters stop: showing you what’s being accessed, searched, and attempted, by which user, and when, across your entire school network.
It can also help you evaluate whether your current filtering setup is actually doing its job and identify any gaps.
Identify Filter Gaps: See which blocked categories users are still trying to access, and where your filter might need adjusting.
Policy Refinement: Use actual activity data to evolve your filtering rules and digital behaviour policies.
Need to prove you’re meeting your safeguarding obligations? Fastvue gives you the documentation and evidence to do just that.
Audit Trails: Keep a clear record of activity across users, categories, and periods — ready for inspection.
Evidential Reporting: Generate reports that support internal reviews, external audits, or disciplinary action when needed.
Fastvue Reporter is built for safeguarding teams, not just IT, so you don’t need to speak sysadmin to use it. You get:
Clear, daily reports tied to real safeguarding risks
Custom alerts for key terms and emerging trends
Real-time visibility across users, devices, and behaviours
Detailed activity timelines for investigating concerns
Evidence-ready logs for audits, reviews, and referrals
By letting your world-class firewall handle the filtering, and Fastvue take care of the monitoring, you get a cost-effective, granular, and accurate safeguarding solution that’s privacy-conscious and fit for purpose.
Whether you’re a DSL, IT lead, or headteacher, that means less guesswork and the confidence that you’re protecting your students in the ways that matter most.
Ready to see how Fastvue looks in your school environment? Book a meeting with our UK safeguarding specialists and get better visibility today.
Download Fastvue Reporter and try it free for 14 days, or schedule a demo and we'll show you how it works.