Ofcom unveils robust new rules to strengthen online child safety in the UK

In a landmark move to create a safer digital environment for young users, Ofcom has finalized a comprehensive set of more than 40 practical measures that tech firms operating in the UK must implement by July 2025.
Mandated under the Online Safety Act, these new rules represent a significant overhaul in how online platforms address child safety across social media, search engines, and gaming services.
Following extensive consultation and research involving tens of thousands of children, parents, companies, and experts, Ofcom’s new Codes of Practice demand a “safety-first” approach from tech companies. The core objective is to prevent children from being exposed to a wide range of harmful content, including material related to suicide, self-harm, eating disorders, and pornography.
Additionally, online services will be compelled to actively protect children from misogynistic, violent, hateful, or abusive content, as well as online bullying and dangerous challenges.
Dame Melanie Dawes, Ofcom Chief Executive, emphasized the transformative nature of these changes, stating: “These changes are a reset for children online. They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content.”
She warned that companies failing to adhere to these new duties would face enforcement action, including potential fines and even court orders to block UK access.
A key aspect of the new regulations focuses on “safer feeds.” Recognizing that personalized recommendation algorithms often lead children to harmful content, Ofcom is requiring providers that operate such systems and pose a medium or high risk of harmful content to configure their algorithms so that this material is filtered out of children’s feeds.
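How that filtering is wired in will vary from service to service, and Ofcom’s codes set outcomes rather than implementations. As a purely illustrative sketch, assuming a recommender that receives candidate posts already labelled by upstream moderation systems and a user already identified as a child through age assurance, a pre-ranking filter might look something like this (all names and categories below are hypothetical):

```python
from dataclasses import dataclass

# Illustrative labels only; not Ofcom's formal content taxonomy.
HARMFUL_FOR_CHILDREN = {
    "suicide", "self_harm", "eating_disorder", "pornography",
    "violent", "hateful", "misogynistic", "dangerous_challenge",
}

@dataclass
class Candidate:
    item_id: str
    labels: set[str]  # moderation labels attached by an upstream classifier
    score: float      # relevance score from the recommender

def filter_feed_for_child(candidates: list[Candidate], user_is_child: bool) -> list[Candidate]:
    """Drop candidates carrying harmful labels before ranking a child's feed."""
    if not user_is_child:
        return candidates
    return [c for c in candidates if not (c.labels & HARMFUL_FOR_CHILDREN)]

# Example: one ordinary post and one flagged as self-harm content, for a child account.
feed = filter_feed_for_child(
    [Candidate("a1", {"sports"}, 0.90), Candidate("a2", {"self_harm"}, 0.95)],
    user_is_child=True,
)
print([c.item_id for c in feed])  # ['a1']
```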
Furthermore, the new rules place a strong emphasis on “effective age checks.” Higher-risk online services will be required to employ robust age assurance mechanisms to accurately identify child users, enabling them to apply appropriate safeguards, whether by restricting access to the entire site or app or by limiting access to certain types of content. Platforms with minimum age requirements but without strong age verification will be required to assume that younger children are present and provide an age-appropriate experience.
Beyond preventative measures, Ofcom’s new Codes also emphasize the need for “fast action” in addressing harmful content once it is identified. All sites and apps must have clear processes in place to review, assess, and swiftly tackle such material. Moreover, children will be empowered with “more choice and support” online.
Platforms will be required to give children greater control over their online interactions, including options to indicate disliked content, manage group chat invitations, block and mute accounts, and disable comments on their own posts. Crucially, supportive information must be readily available for children who may have encountered or searched for harmful content.
To further enhance child safety, Ofcom is mandating “easier reporting and complaints” mechanisms. Children should find it straightforward to report inappropriate content or lodge complaints, and providers will be expected to respond effectively.
Terms of service must also be presented in a clear and understandable manner for young users. Finally, the new rules establish “strong governance” requirements, mandating that all services appoint a named individual accountable for children’s safety and that a senior body annually reviews the management of risks to children.
Providers now have until July 24th, 2025 to finalize their risk assessments for child users and begin implementing the necessary safety measures, which must be fully in place from July 25th, 2025. Ofcom has made it clear that these new Codes of Practice mark the beginning of a new era in online child safety regulation, with further consultations on additional protective measures expected in the coming months.
Lina Ghazal, Head of Regulatory and Public Affairs at Verifymy, says:
“Today is a coming-of-age moment for the internet, as Ofcom’s Protection of Children Codes give thousands of online platforms, big or small, a newfound responsibility to protect the youngest and most vulnerable in society.
“As a result of the big changes announced today, age checks should become the norm on some of the most popular websites and online services. Large or high‑risk platforms will have to deploy highly effective age assurance to block children from entire services or specific material like pornography, self-harm and hate content.
“Knowing the age of your users is no longer optional – it is the baseline. Without this, platforms are effectively flying blind and hugely exposed to risk. The good news is that readily available technology, such as email-based age checks, can allow platforms to determine the age of their users quickly and effectively while also preserving their privacy.”
However, Rachael Annear, a partner in the International Data Protection and Cyber Practice at Freshfields, believes that implementing age verification presents certain challenges, not least when it comes to protecting individual privacy.
“The new guidance highlights the close collaboration between Ofcom and the Information Commissioner’s Office on the crucial intersection of data protection and age assurance for online services. Although this joined up approach offers welcome clarity, implementing robust age checks will still present significant challenges for services.
“A range of age assurance tools will be available, but choosing the most appropriate method will require careful consideration. Services must strike a delicate balance between ensuring children’s safety and respecting an individual’s privacy. For instance, whilst photo ID matching may be appropriate in more high-risk contexts, it could represent an unnecessary intrusion for adults accessing some lower-risk services with legal (but potentially harmful to children) content.”
Ofcom Statement: Protecting children from harms online