The Online Safety Act is now in force in the UK. It’s been designed to protect children from inappropriate and harmful online content.
For educators, this raises important questions. Are schools expected to enforce the Act? And could legitimate educational content be restricted under its rules?
This blog answers these questions. It explains how the Act works and how it may affect online learning. It also highlights how students are already finding ways to get around restrictions.
Key Takeaways
- The Online Safety Act (OSA) regulates online platforms to protect children from harmful or inappropriate content.
- The OSA does not directly apply to schools, but it may affect the online resources pupils and staff rely on.
- Some legitimate educational yet sensitive content may be restricted under the OSA’s rules.
- Children can bypass online restrictions using VPNs and other tools, so digital literacy and supervision remain essential.
What Is the Online Safety Act?
The Online Safety Act (OSA) is the UK’s new legal framework for regulating online content and services. It was passed into law in October 2023, and its main duties are now being enforced following a phased implementation period.
The OSA was a response to growing concerns about the impact of harmful online content on children. It applies to social media sites, messaging services, search engines and online forums.
At a high level, the OSA requires these platforms to:
- Protect children from content that is harmful or inappropriate for their age, such as pornography or content promoting self-harm
- Prevent the sharing of illegal content, such as depictions of child sexual abuse
- Give users clear ways to report harmful content, and respond to those reports quickly
- Provide greater transparency on moderation policies and enforcement actions
Does It Apply to Schools?
The Online Safety Act does not directly apply to education providers, but it will undoubtedly impact schools and pupils.
The new law focuses on sites and services that are popular with, or likely to be accessed by, under‑18s. That includes platforms young people regularly use for legitimate learning and research, such as YouTube, Wikipedia and some educational sites with content‑sharing features.
For educators, this means the online resources you rely on may change how they operate, filter or display content. Some platforms may introduce stricter moderation, automated content filters or age‑verification measures to comply with the OSA.
While these steps are designed to protect young users, they could also result in certain legitimate educational materials being harder to access if they’re flagged as potentially harmful. You may need to adapt lesson plans, find alternative resources or prepare to explain why some online materials are no longer available.
Are Schools Expected to Enforce the OSA?
No. The responsibility for complying with the Online Safety Act lies with the online platforms and services it regulates, not with schools.
However, schools still have safeguarding duties under existing child protection laws. This includes promoting lawful and responsible online behaviour among pupils. And while the OSA has faced some legitimate criticisms, it was implemented to address real risks to children online.
You should be prepared to explain the OSA in age‑appropriate terms and help pupils understand why it was passed into law. This can support wider digital literacy aims and encourage young people to think critically about their behaviour online and the content they encounter.
Does the Online Safety Act Affect School Websites?
In most cases, no. The Online Safety Act is aimed at online platforms that allow users to generate, share or interact with content. A typical school website that provides information, term dates and contact details won’t be within the OSA’s scope.
However, if your school’s website or online portal includes interactive features – such as forums or comment sections – it may be considered a regulated service under the Act. In such cases, you would need to ensure these features meet the OSA’s safety requirements.
Does the Online Safety Act Affect Learning Apps and Sites?
Almost certainly. The OSA applies to any online service accessible in the UK that lets users share, generate or interact with content created by other users, regardless of whether the platform’s primary purpose is educational.
This means certain learning management systems, virtual classrooms or learning apps could fall under the OSA’s scope.
For platforms aimed at or used by under‑18s, compliance may involve implementing age‑appropriate content controls, stronger moderation policies and transparent reporting systems. These changes could, in turn, influence how teachers and pupils access and use the platform day‑to‑day.
Could the Online Safety Act Restrict Access to Legitimate Content?
Yes. There have already been multiple cases of legitimate content being restricted for fear of breaching the OSA.
For instance, social media platforms have recently blocked posts related to the wars in Gaza and Ukraine because this content could be loosely interpreted as potentially harmful.
It’s also possible that children will be denied access to information on sensitive topics such as mental health, suicide prevention, consent and LGBTQ+ support, even if the material is age-appropriate and helpful.
But the Online Safety Act is still in its early days. Industry experts expect online platforms to refine their content moderation systems over time, getting better at distinguishing genuinely harmful material from legitimate yet potentially triggering content.
Can Children Get Around Online Controls?
While the OSA sets out strong requirements for platforms to restrict harmful content, it cannot fully prevent children from accessing it.
Many young people are tech‑savvy and can bypass safeguards using proxy sites, anonymous browsers and virtual private networks (VPNs). (VPN apps exploded in popularity in the days after the OSA came into force.)
Under-18s can use these tools to mask their location or identity, letting them sidestep age verification and content filters.
Teachers and school staff should be aware that this behaviour is common among teenagers. It means that no online safety law, however robust, can replace the need for digital literacy, supervision and open conversations about safe and responsible internet use. These efforts are grouped under the term e-safety – but what is e-safety? It means protecting children from harm online by helping them recognise the risks and teaching them how to act responsibly on the internet.
Strengthening Safeguarding Through Training
The Online Safety Act should help protect children online, but it has flaws. Determined under-18s can still seek out harmful content, and online bullying won’t disappear. School staff must still know how to respond to safeguarding concerns involving technology.
Our online Safeguarding Courses provide clear, practical guidance on recognising safeguarding concerns, including those that arise online. You will learn how to:
- Spot the warning signs of abuse, neglect or exploitation
- Follow correct reporting procedures in line with statutory requirements
- Respond appropriately if a child discloses they’ve seen something harmful
- Work with safeguarding leads and external agencies to protect pupils
These CPD‑certified, fully online courses ensure you and your colleagues are equipped to act quickly and effectively, whether the concern stems from the classroom, the playground or online.