
The Truth About The New Online Safety Act: Will Children Really Be Protected?

  • Writer: Leah East
  • Jul 14
  • 5 min read


No parent ever imagines losing their child to something they couldn’t see coming, something hidden behind a screen.


And yet, parents such as Esther Ghey, Ian Russell and Ellen Roome have spoken out after losing their children in heartbreaking and often complex circumstances. In each case, the online world played a damaging and influential role - from the normalisation of violence and exposure to self-harm material, to the tragic loss of vital messaging data that could have helped explain what happened.


These parents are campaigning through their grief. They’ve found the strength to turn pain into purpose - demanding stronger regulation, better safeguards, and greater accountability from tech platforms. 


Their voices are a powerful reminder that behind every policy or headline are real children, and real families, calling for change.


And theirs is not a niche issue; it's everyday life for nearly all children. According to Ofcom research, children aged 8 to 17 now spend between two and five hours online each day. Nearly every child over 12 owns a mobile phone, and most regularly watch videos on platforms such as YouTube or TikTok.


This is the digital world our children are growing up in - one that’s fast-moving, ever-present, and often far beyond the watchful eye of adults.


The stories of loss, and the courage of families who've chosen to speak out, have helped drive long-overdue action. From 25th July 2025, the UK's new Online Safety Act, a law that's been years in the making, will begin to take effect.


But the question many of us are still asking is: 


"Will it be enough?"


So, What's Changing?


Tech platforms operating in the UK are now legally required to comply with new child safety duties. Platforms had until 24th July to complete their children's risk assessments, and full enforcement of these duties begins on 25th July.


Here's an overview of the core requirements:


  • Prevent children from accessing harmful content, such as pornography, self-harm material, eating disorder content, hate, and violence.

  • Use robust age verification - such as facial age estimation ('selfie' checks), bank verification, or photo-ID checks - to ensure under-18s can't access adult material.

  • Remove illegal content quickly, with Ofcom now enforcing these requirements.

  • Introduce user reporting tools and have clear processes for reviewing and responding to concerns.

  • Appoint a named individual responsible for children’s safety on the platform.


Failure to meet these statutory obligations carries substantial penalties. Ofcom can impose fines of up to £18 million or 10% of a company's global annual turnover, whichever is higher. In serious cases, senior managers of non-compliant organisations may also face criminal prosecution, particularly if they fail to comply with information requests from Ofcom.


This is huge - it shows a shift in how seriously digital safeguarding is now being taken at a national level. And for parents and professionals alike, it’s an acknowledgment that the online world needs stronger regulation.


But Are There Still Gaps?


The changes are significant, but they’re not quite enough.


The new rules don’t cover private messaging apps, where harmful peer-to-peer sharing can happen undetected. They also overlook in-app purchases, loot boxes (chance-based rewards in games that can cost families a fortune and introduce gambling-like behaviours), and the gamified pressure so many young people face.


And while some content will be filtered, the Act doesn’t cover legal-but-harmful material aimed at adults, nor does it fully tackle the influence of online extremism, misinformation, or the ever-evolving risks children face in the digital world.


These laws have been a long time coming, but technology hasn’t waited. In fact, it’s developing so quickly that parts of the Act may already feel out of date before they even come into force. The digital world has evolved far beyond what many safeguarding systems are currently equipped to handle.


What About Encryption and AI?


Some of the most difficult risks to regulate remain, especially end-to-end encrypted messaging and AI-generated content.


Apps such as WhatsApp and Signal allow harmful content to be shared privately, out of sight. Ofcom can now require these services to scan for child sexual abuse material, but only after a strict independent review, and the debate around privacy and protection is still ongoing.


AI-generated content, from chatbots to deepfakes, is technically covered by the Act, but many experts say the guidance is unclear and already lagging behind! Children are often interacting with these platforms before the adults around them even know they exist.


My Perspective - As A Parent And A Safeguarding Professional


At Cornerstone Safeguarding, we spend much of our time supporting DSLs (Designated Safeguarding Leads), youth workers, and organisations committed to protecting children.


But as a parent, this issue isn't just a professional one for me; it's personal.

My 10-year-old son is growing up in a digital world that looks nothing like the one my older sons, now 20 and 25, experienced. Back then, 'online safety' meant setting screen-time limits, using parental controls (which were very limited at the time), and checking their browser history.


But now, it's about whether our son is being exposed to harmful content at his fingertips, often through platforms that let harmful material slip through as 'safe'.


As a mum, I know the worry of wondering what my child might stumble across. As a safeguarding professional, I see first-hand how overwhelming it is for DSLs trying to manage all of this with limited time, limited resources, and often little support.


These new laws are a start, but they don’t change the reality: safeguarding in the digital age can’t just be about policies or filters. 


It requires conversations with young people to help raise awareness, ongoing education for us as adults and parents, and a shift in culture - especially within tech companies.


But What Can We Do?


Here are some steps that organisations AND parents can take now, alongside these law changes:


  • Review your online safety policies in light of the new Act.

  • Have open conversations with children and young people about what they're seeing and doing online.

  • Create a culture where reporting is safe and supported, especially in settings supporting young people.

  • Access supervision and support, as safeguarding can be emotionally draining and DSLs need space to process.

  • Keep learning - tech changes quickly, and staying informed is part of staying protective.


Final Thoughts


Whether you’re a parent, in a safeguarding role, or someone working with children and young people, the Online Safety Act marks important progress - but it’s just the beginning.

It will take all of us working together, staying curious, and keeping conversations going to actually create a safer online world for the next generation.


At Cornerstone Safeguarding, we’re here to help with that. From safeguarding training to supervision and policy reviews, we’re committed to supporting you - not just in response to legislation, but in building a safeguarding culture that protects, educates and creates safer spaces for all children and young people.
