Story Highlights
– Jury verdict deemed historic victory for child safety.
– Meta accused of prioritising profits over children’s welfare.
– Undercover investigation documented dangers of social media.
– Meta criticized for enabling addiction and mental health issues.
– Company plans to appeal the jury’s decision.
Full Story
In a landmark ruling, a jury in New Mexico has found social media giant Meta culpable for its failure to safeguard children using its platforms. The decision has been characterised as a significant win for families and advocates concerned about the safety of young users online.
Raúl Torrez, the Attorney General of New Mexico, remarked on the verdict, describing it as “a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety.” Torrez emphasised that the company’s leadership was aware of the potential harm their products posed to young users, stating, “Meta executives knew their products harmed children, disregarded warnings from their own employees and lied to the public about what they knew.” He added, “Today the jury joined families, educators, and child safety experts in saying enough is enough.”
The case drew significant attention and was bolstered by an undercover inquiry that saw state agents establish social media accounts mimicking minors. Through these accounts, they documented incidents of adults making inappropriate advances, demonstrating the platform’s failure to protect vulnerable users. Additionally, prosecutors highlighted Meta’s lack of action regarding the addictive elements of its platforms, such as infinite scrolling and autoplay videos, which they argued contribute to issues such as depression, anxiety, and self-harming behaviours among young users.
In her closing statements, attorney Linda Singer called for accountability, claiming, “Over the course of a decade, Meta has failed over and over again to act honestly and transparently. It’s failed to act to protect young people in this state. It is up to you to finish this job.” The jury’s decision reflects growing concerns regarding social media’s impact on mental health and child safety, areas that have attracted considerable scrutiny in recent years.
In its defence, Meta maintained that it is committed to user safety. Attorney Kevin Huff, representing Meta, asserted that “evidence shows not only that Meta invests in safety because it’s the right thing to do but because it is good for business.” Huff contended that the design of the company’s social media apps focuses on helping people connect with friends and family, rather than enabling predatory behaviour.
Following the verdict, a spokesperson for Meta expressed strong disagreement with the jury’s findings, stating, “We respectfully disagree with the verdict and will appeal.” They asserted that the company works diligently to ensure user safety and contended that identifying and removing harmful behaviour at scale is a complex challenge. The spokesperson added, “We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online.”
The implications of this ruling may resonate beyond New Mexico, potentially influencing other jurisdictions addressing similar concerns about the responsibilities of social media companies regarding user safety, especially for minors. The case is seen as a critical step in the ongoing dialogue about corporate responsibility, child protection in digital spaces, and the broader societal implications of social media consumption.
Experts in child safety and digital ethics have underscored the significance of this ruling, viewing it as a potential catalyst for more rigorous regulations governing the conduct of social media platforms. With increasing awareness of the detrimental effects of social media on young people’s mental health, movements advocating for stricter measures are gaining momentum.
As public awareness regarding the harms of social media continues to grow, this case may spur further investigations into the practices of large tech firms and their accountability in ensuring safe online environments. Organisations and individuals advocating for child safety are likely to intensify their efforts, leveraging the jury’s verdict as a reason to push for change at a legislative level.
This verdict stands as a testament to ongoing concerns regarding the ethical obligations of corporations in the digital age, particularly as they pertain to minors. With children growing up in an increasingly connected world, the stakes of ensuring their safety online have never been higher. The outcome of this case may serve as a vital part of the broader conversation on how society prioritises safeguarding its youngest members in the face of rapid technological advancement.
In conclusion, the New Mexico jury’s verdict against Meta marks a critical moment in the realm of digital responsibility, signalling a shift towards greater accountability for tech giants in protecting the wellbeing of young users. As the appeal proceeds, observers will be closely monitoring Meta’s response and the potential ripple effects across the industry and beyond.
Our Thoughts
Although the verdict was reached under US law, the case highlights the kinds of child safety failings that UK legislation increasingly seeks to address. The Online Safety Act 2023, for example, places duties on platforms to assess and mitigate risks of harm to children. The broader principle underpinning the Health and Safety at Work etc. Act 1974, that organisations must proactively protect those affected by their operations, particularly vulnerable groups, applies by analogy to online platforms and the young people who use them.
To prevent similar harm, Meta could have implemented more stringent safety measures, including robust monitoring systems to detect and mitigate harmful interactions, and algorithms that prioritise user safety over engagement metrics. Transparency about the risks associated with its platforms is equally crucial.
Key safety lessons include the importance of acting on feedback about product safety, particularly warnings from employees and external experts. Compliance with online safety guidance, including thorough risk assessments and engagement with child protection agencies, could also have reduced the likelihood of harm.
Ultimately, reliable reporting mechanisms and a proactive approach to mitigating the risks associated with social media technologies can help avert similar situations in the future.