Story Highlights
– UK government pressured to ban under-16s from social media.
– Kemi Badenoch advocates raising “age of consent” online.
– Labour Party considers following Australia’s social media ban.
– NASUWT union poll shows 88.7% support for ban.
– NSPCC urges focus on tech companies, not blanket bans.
Full Story
The UK government is under mounting pressure over a potential ban on social media access for under-16s. Growing concern about children's exposure to harmful content online, including hate speech and sexualised imagery, has fuelled calls for regulation. This follows alarming reports that Grok, an artificial intelligence tool on the platform X, has been generating inappropriate images of minors.
In a recent statement, Kemi Badenoch, leader of the Conservative Party, voiced support for raising the “age of consent” for online content, aiming to better protect children from the wide range of harmful material available on the internet. Meanwhile, Labour leader Keir Starmer told a meeting of party members that the party is considering a strategy similar to Australia’s, which became a pioneer in December by prohibiting social media access for under-16s.
There is heightened anxiety that children in this age group can readily access disturbing material that may be sexualised, violent, or racially charged, with potentially detrimental effects on their mental health. Research cited in connection with the Online Safety Act, which has recently come into effect, indicates that minors’ cognitive development suffers from exposure to toxic online environments, including harmful ideation and explicit content. Jonathan Hall KC, the UK’s Independent Reviewer of Terrorism Legislation, emphasised that regulating the digital landscape is of utmost importance, asserting, “Children should be offline because their brains are not capable of dealing with this hatred.”
Despite the absence of an immediate decision regarding a legal ban, reports indicate that the government is contemplating legislative action within months. Adding to the discussions, Health Secretary Wes Streeting has tasked officials with investigating the specifics of Australia’s online restrictions. He has also sought the insights of Jonathan Haidt, a well-known advocate for limiting social media access for minors, who is vocal about the correlation between increased mental health issues among teenagers and the proliferation of social media and smartphones.
The call for restrictions is not confined to government officials; the National Association of Schoolmasters Union of Women Teachers (NASUWT), the largest teachers’ union in the UK, recently stated its position in favour of a ban. The union argues that prolonged exposure to social media detrimentally impacts children’s attention spans, concentration, and overall capacity for sustained learning. A survey of NASUWT members revealed a staggering 88.7% support for statutory restrictions on social media access for those under 16. Haidt has reiterated the belief that children should not be permitted to engage in social media until they reach at least this age, linking the rise in mental health issues among youth to unsupervised online experiences.
In a demonstration of public sentiment, a parliamentary e-petition advocating for a ban on under-16s accessing social media has garnered over 127,000 signatures, signifying widespread support for the initiative.
Contrasting views, however, come from the National Society for the Prevention of Cruelty to Children (NSPCC), which advocates for a different approach. The NSPCC believes that instead of imposing a blanket ban, the emphasis should shift towards ensuring that social media companies take accountability for the content shared on their platforms. Sir Peter Wanless, NSPCC’s chief executive, noted, “Children deserve to have age-appropriate experiences online rather than being cut off from it altogether,” highlighting that social media plays an integral role in contemporary youth engagement and communication.
The Molly Rose Foundation has cautioned that a complete ban might represent a “retrograde step,” suggesting it could create “a cliff edge of harm” where children face unregulated exposure upon turning 16.
Currently, UK law does not explicitly forbid those under 16 from using social media platforms, yet the Online Safety Act mandates that these platforms take protective measures for minors. This includes confirming user ages for adult content and providing tools for reporting and addressing harmful material. Generally, most social media sites, such as Facebook and Instagram, have a minimum user age of 13, but any legal changes could incorporate stricter measures including varying degrees of access or time restrictions for younger users.
Options for implementing a ban are under consideration, ranging from a comprehensive prohibition to more selective limits on specific platforms. The government is also observing initiatives such as TikTok’s 10pm curfew for users under 16, as noted by Peter Kyle, Secretary of State for Science, Innovation, and Technology. Measures could include technology that allows parents to control or limit their children’s access to social media.
Examining the effectiveness of Australia’s model offers insights but also raises questions about applicability within the UK context. Australia identified multiple social media platforms affected by its law and mandated the removal of pre-existing under-16 accounts following its enactment. Companies reported escalating measures, including requesting user identification and implementing AI-driven age assessments.
There are concerns about the unintended consequences of an outright ban in the UK. Andy Burrows, Chief Executive of the Molly Rose Foundation, asserted that the approach should focus on creating safer online environments rather than pushing youth towards hazardous spaces. Moreover, a report from the UK Parliament’s youth select committee deemed a ban both impractical and ineffective, urging tech companies to be held accountable for the content they host.
A study by the University of Manchester examined the link between social media use and mental health among children, finding no definitive evidence that increased screen time directly correlates with heightened anxiety or depression. The study followed 25,000 participants aged 11 to 14 over three academic years and found no significant effect of social media use on mental health outcomes.
As discussion unfolds regarding how best to approach children’s safety in the digital age, the balance between enabling beneficial online experiences while protecting them from potential harm remains an ongoing challenge for policymakers and society alike.
Our Thoughts
To prevent the issues highlighted in the article regarding children’s exposure to harmful content on social media, several measures could be taken. Firstly, enforcement of the Online Safety Act, which mandates that social media companies protect children from harmful content, could be strengthened. This includes ensuring robust age-verification processes and the prompt removal of harmful content.
Key safety lessons include the importance of proactive measures by tech companies to create safer online environments for children rather than reactive legislative measures that may lead to unintended consequences. Regular audits of social media platforms to ensure compliance with safety regulations could also mitigate risks.
Regulations potentially breached include the duties of care under the Online Safety Act 2023, which require platforms likely to be accessed by children to protect them from harmful content, including malicious or abusive material.
To prevent similar incidents, a collaborative approach involving government, educational institutions, parents, and tech companies is essential. Social media platforms must incorporate safer design features specifically targeting under-16 users, enhancing monitoring and reporting mechanisms in line with current legislation.