Story Highlights
– Lords voted to ban under-16s from social media.
– Bereaved mothers urge swift action for children’s safety.
– Ofcom acknowledges more work needed for online protection.
– Consultation on social media ban to begin soon.
– 500 children referred to mental health services daily.
Full Story
Two mothers who have experienced devastating losses are expressing deep concern over the potential delays in implementing a ban on social media for users under the age of 16. This urgent call to action comes on the heels of a recent vote in the House of Lords, where a decisive majority supported a cross-party proposal to restrict access to platforms such as TikTok, Snapchat, Instagram, and Facebook for younger users. The vote resulted in 261 in favour and 150 against, representing a significant defeat for the government.
Esther Ghey and Ellen Roome, who both suffered the tragic loss of their children, shared their perspectives during an interview on the BBC’s Sunday with Laura Kuenssberg. They argued that the current measures, enforced by the UK’s communications regulator Ofcom, are inadequate to protect children.
Since last year, Ofcom has introduced new regulations aimed at shielding minors from exposure to harmful online content. However, the organisation itself acknowledges the substantial work still required to ensure the safety of younger users. An Ofcom representative stated that they are “under no illusion there is still much more to do.”
The recent developments have propelled the Children’s Wellbeing and Schools Bill into a critical phase as it moves towards consideration in the House of Commons. The bill’s progress coincides with a three-month consultation launched by Technology Secretary Liz Kendall, which will weigh the merits and drawbacks of enforcing a social media ban. Among the options under review are overnight curfews for young people on social media and measures to curb the excessive scrolling often referred to as “doom-scrolling.” The government is expected to receive recommendations from the consultation in the summer.
Ghey, whose 16-year-old daughter Brianna was tragically murdered in a park in February 2023, has been vocal about the perils of social media usage among teenagers. Brianna was reportedly engaged heavily with social media platforms, seeking fame on TikTok, which her mother believes led to her emotional distress and isolation prior to her untimely death. Ghey emphasised, “While we’re waiting, more and more children are being harmed,” underscoring the urgency she feels regarding this issue. She also shared alarming statistics, stating that “every single day 500 children are being referred to mental health services” and highlighted that “97% of 12-year-olds own smartphones.”
In December, Australia introduced strict measures requiring social media companies to restrict accounts for users under the age of 16. This action has gained international attention, with various governments, including the UK, observing the outcomes of this policy closely. The Australian government advocates for this ban as a vital step towards safeguarding children from harmful online interactions and content, a stance echoed by many parents and campaigners.
Major tech companies, including Meta, agree on the need for stronger safety measures for young users, but argue that a blanket ban may not be the most effective solution. Several experts echo this caution, warning that simply prohibiting access might not address the complex issues at hand.
Ellen Roome, whose son Julian “Jools” Sweeney died in circumstances his family believes may be linked to a dangerous social media challenge, joined fellow bereaved parents at a meeting with the technology secretary. Roome, from Gloucestershire, expressed her frustration at the unregulated access children have to these platforms. “How much longer are we going to let children have unregulated access?” she asked. “They have access to everything, and I just really think it needs to go, this whole thing of waiting and seeing and waiting.” She argued that if social media were a tangible product, its dangers would have seen it scrutinised and withdrawn from the market immediately.
Roome has also become part of a group of parents who are pursuing legal action against TikTok in a Delaware court, seeking accountability for the harms they believe the platform contributes to young users.
In response to growing concerns, Ofcom has undertaken numerous investigations into over 90 different online platforms since acquiring new regulatory powers last summer. They have also imposed financial penalties on various services deemed non-compliant with age verification measures designed to protect children from inappropriate content. An Ofcom spokesperson noted that their actions have led to widespread implementation of age checks for preventing access to pornographic and harmful materials and that high-risk sites are increasingly being blocked.
Despite the progress, Ofcom remains aware of the long-standing issues within an internet landscape that has been largely unregulated for over two decades. They acknowledged that there is still considerable work ahead. “We will continue to work with urgency to drive change so that children in the UK enjoy a safer life online,” they stated. They also expressed appreciation for the insights shared by victims, survivors, and bereaved families, emphasising their importance in shaping effective policies.
As discussions surrounding the protection of children online continue, the experiences of parents like Ghey and Roome underscore the human impact of these ongoing debates. Their calls for immediate action highlight the pressing need for regulation and oversight in the rapidly evolving digital landscape, with many hoping the forthcoming consultations will yield decisive and meaningful change.
Our Thoughts
The tragic events highlighted in the article point to multiple failures in safeguarding young people from the risks of unregulated social media use. To mitigate such incidents, a more proactive regulatory approach is essential, building on the duties that the UK’s Online Safety Act 2023 already places on platforms to protect their users.
Key measures that could have prevented the circumstances described include stricter enforcement of age verification on social media platforms to prevent underage access. The Online Safety Act 2023 could be applied more rigorously to require the removal or restriction of harmful content and to promote safer online environments.
The consultation period mentioned should lead to prompt action rather than prolonged discussion. Sensible steps, such as implementing curfews and limiting access to potentially harmful challenges, could reduce the opportunity for risky behaviour.
Additionally, social media companies have arguably breached their duty of care by allowing minors largely unregulated access, which can contribute to mental health problems and exposure to harmful content. Industry accountability is vital to ensuring a safer digital landscape for children. Similar incidents could be prevented through better regulation, industry compliance, and education for both parents and children about online safety.