Meta, the parent company of Facebook, Instagram and WhatsApp, is facing a jury trial in the United States that could reshape how social media companies are held accountable for harm to children. The case opened in New Mexico, where state authorities accuse the tech giant of knowingly exposing minors to sexual exploitation and abuse through its platforms.
The lawsuit, brought by New Mexico Attorney General Raul Torrez, alleges that Meta created and maintained online environments that enabled predators to target children with alarming ease. Prosecutors argue that the company failed to act decisively, even when evidence pointed to sexual solicitation, sextortion and trafficking facilitated through its social networks.
Allegations Of Platforms Built Without Child Protection
At the centre of the case is the claim that Meta’s systems, including its recommendation and engagement algorithms, pushed harmful content into the feeds of young users. According to the state, these systems rewarded interaction and time spent online, without sufficient safeguards for minors navigating adult spaces.
Torrez, a former prosecutor, told the court that Meta’s platforms were not neutral tools but active environments shaping user behaviour. The lawsuit claims the company was aware that its products could expose children to exploitation, yet continued to prioritise growth and engagement over meaningful protection.
Undercover Investigation Fuels Legal Action
The trial follows an undercover operation conducted by the New Mexico Attorney General’s office in 2023. Investigators posed as minors on Meta platforms and documented interactions that, according to prosecutors, demonstrated how easily children could be targeted by adults seeking sexual contact.
State lawyers argue that these findings show a systemic failure rather than isolated incidents. They maintain that Meta had the technical ability to intervene earlier and more aggressively, but chose measures that were insufficient to stem abuse and prevent real-world harm.
Meta Pushes Back Against Claims Of Negligence
Meta has strongly denied the allegations, insisting it has invested heavily in child safety tools, content moderation and reporting systems. The company argues that it works closely with law enforcement and child protection organisations to combat abuse across its platforms.
In its defence, Meta attempted to have the lawsuit dismissed, citing free speech protections and online immunity laws that shield platforms from liability for user-generated content. A judge rejected that argument, ruling that the case raised substantive questions that should be tested before a jury.
Part Of A Wider Legal Reckoning For Tech Giants
The New Mexico case is not unfolding in isolation. It is the second major lawsuit in 2026 accusing Meta of harming minors, following a separate high-profile trial in Los Angeles. In that case, families and schools allege that major social media platforms were deliberately designed to be addictive for children, with damaging effects on mental health.
Those proceedings mark the first product liability claims of their kind against social media companies, placing Meta alongside other global platforms accused of putting commercial interests ahead of child wellbeing.
Global Pressure Mounts Over Online Child Safety
Beyond the United States, Meta is facing intensifying regulatory scrutiny. The company has clashed with governments across Europe over competition, data protection and advertising practices, while relations with some states have deteriorated sharply in recent years.
Concerns about child safety online have driven governments to consider or implement stricter age limits for social media use. Several countries have announced plans to restrict access for younger users, reflecting a growing international consensus that existing safeguards may not be enough to protect children in digital spaces.
A Trial That Could Redefine Accountability
Legal experts say the outcome of the New Mexico trial could have far-reaching implications for how social media companies operate. A verdict against Meta may embolden other states and countries to pursue similar actions, increasing pressure on the industry to redesign platforms with child safety at their core.
As testimony unfolds, the case is being closely watched by parents, regulators and technology firms alike, all aware that the verdict could signal a turning point in the long-running debate over responsibility, regulation and the true cost of online engagement for children.