
As digital platforms continue to evolve rapidly, so too do the threats that come with them. In response to growing concerns over illegal content, child safety, and livestream-related harms, UK media regulator Ofcom has proposed a new wave of online safety measures. The proposed rules, outlined in a fresh consultation, aim to strengthen protections for children and prevent the viral spread of illegal or harmful material. These initiatives build on the UK’s existing Online Safety Act but seek to address emerging risks with greater urgency and accountability.

I. Ofcom’s Enhanced Online Safety Proposals
1. Addressing the Spread of Illegal Content
Ofcom’s new recommendations call on tech companies to implement stronger tools to prevent illegal material from going viral. Under the proposal, platforms would be expected to intervene more effectively to stop the rapid dissemination of harmful content—including terrorist propaganda and graphic violence—particularly in formats such as livestreams or user-generated videos. This move would significantly raise the bar for moderation standards across the industry.
2. Prioritizing Child Protection
One of the most pressing aspects of the proposal is the added focus on safeguarding children. Among the new guidelines is a call to limit the ability of users to send virtual gifts to minors during livestreams or to record those livestreams without consent. These actions, if enforced, could greatly reduce opportunities for grooming and exploitation on platforms that support real-time interactions.
II. Tackling Harm at the Source
1. Proactive Detection of High-Risk Content
Ofcom is considering requiring large platforms to proactively detect potentially harmful content before it spreads. For example, platforms might be obligated to develop technology that can recognize livestreams showing an imminent risk of physical harm. This proactive model would apply primarily to larger tech firms, as they are considered to pose the highest risk due to their scale and reach.
2. Livestream Reporting Mechanisms
The proposals include a requirement for all user-to-user platforms that enable livestreaming to provide a mechanism for viewers to report content that appears to show illegal activity or a risk of serious harm. These reporting systems would allow platforms to intervene before dangerous or abusive content gains traction.
3. Limiting Risky Platform Features
Some features on social platforms—such as open livestreaming to large audiences—have been flagged by experts as potential risks to children. Ofcom’s new measures aim to curtail the availability of such features to younger users unless they are demonstrably safe and moderated appropriately.
III. Reactions from Advocacy Groups and Industry Experts
1. Mixed Response from Child Protection Advocates
While some child protection organizations have praised the proposals, others argue they don’t go far enough. Rani Govender from the NSPCC noted that enhanced safeguards for livestreaming could provide meaningful protection in high-risk digital spaces. However, other advocacy groups criticized the measures as insufficient to tackle the deeper, systemic flaws within the UK’s Online Safety Act.
2. Call for Holistic Safety Integration
Leanda Barrington-Leach, executive director of children’s rights charity 5Rights, urged a broader rethink of how safety is embedded into the technology itself. She called for companies to incorporate child safety features into product design from the ground up, rather than relying on fragmented, reactive policies bolted on later.
3. Urgency for Legislative Reform
Ian Russell, chair of the Molly Rose Foundation, expressed concern that the new measures function more like temporary fixes than comprehensive solutions. Russell, who founded the organization after the tragic death of his daughter following exposure to harmful online content, emphasized that “sticking plaster” regulations are not enough. He called on the Prime Minister to step in and overhaul the Online Safety Act to mandate real, systemic change.
IV. Enforcement and Accountability
1. Ofcom’s Enforcement Powers
According to Oliver Griffiths, Ofcom’s online safety group director, the organization is already taking enforcement action where necessary and plans to hold platforms accountable for failures. However, Griffiths acknowledged the ever-changing nature of online risks, highlighting the need for constant updates to safety frameworks.
2. Ongoing Consultation and Stakeholder Feedback
The proposals are part of a formal consultation process that remains open until October 20, 2025. Ofcom is seeking input from service providers, tech companies, law enforcement, civil society groups, and members of the public. The regulator aims to finalize a roadmap that balances innovation with safety, particularly for vulnerable users like children and teens.
V. Industry Practices and Recent Changes
1. Platform Policy Updates
Some major platforms have already begun implementing safety improvements. For instance, TikTok raised its minimum age for livestreaming from 16 to 18 following a BBC investigation that exposed children live-streaming from refugee camps and asking for donations. YouTube has also announced that, starting in July 2025, users must be at least 16 years old to livestream—up from the previous minimum of 13.
2. The Limits of Voluntary Measures
Despite these efforts, critics argue that voluntary changes by platforms are inconsistent and often reactive. As content-related dangers evolve quickly, regulators and lawmakers are under pressure to create more enforceable rules that proactively prevent harm, rather than relying on platforms to self-police.
Conclusion
Ofcom’s latest proposals mark a significant step in addressing the growing risks posed by online platforms, particularly regarding the safety of children and the prevention of illegal content going viral. By pushing for stricter livestream moderation, improved detection tools, and clearer mechanisms for reporting harmful activity, the regulator is aiming to build a safer digital space. However, as voices from the advocacy community suggest, technological protections alone are not enough. Lasting change will require a combination of bold legal reforms, platform responsibility, and societal engagement. Whether the UK can rise to meet that challenge remains to be seen—but the dialogue has clearly begun.
