A jury in New Mexico has ruled that Meta must pay $375 million in civil damages over allegations that it failed to adequately protect children on its platforms. The case, centered on the safety of minors using Facebook and Instagram, marks one of the most consequential verdicts yet in the growing wave of litigation against social media companies.
The jury reached its decision after about a day of deliberations, concluding that Meta had knowingly violated the state’s consumer protection laws. The damages were calculated based on multiple violations identified during the trial, bringing the total penalty close to $400 million.
The outcome is being viewed as a turning point in the broader debate over the responsibility of tech companies to ensure user safety—particularly when it comes to younger audiences.
The lawsuit, filed in 2023 by New Mexico Attorney General Raúl Torrez, stemmed from an undercover investigation conducted by his office. Investigators created a fake account posing as a teenage girl to test how Meta’s platforms handled interactions involving minors.
According to evidence presented during the trial, the account quickly began receiving a large volume of inappropriate content, including explicit images and messages from individuals believed to be engaging in predatory behavior. Prosecutors argued that this demonstrated a systemic failure in Meta’s ability to detect and prevent harmful interactions targeting underage users.
The trial, held in Santa Fe, examined whether the company had misrepresented the safety of its platforms and failed to take sufficient action despite being aware of potential risks.
Jurors ultimately sided with the state, determining that Meta’s actions were not merely negligent but intentional. This distinction played a key role in the size of the financial penalty imposed.
State attorneys had pushed for even higher damages, arguing that the scale of harm warranted a penalty that could reach into the billions. While the final amount fell short of that figure, the ruling still represents a strong condemnation of Meta’s conduct.
Meta has rejected the verdict and indicated it will appeal. The company maintains that it has invested heavily in safety tools and continues to develop measures aimed at protecting younger users. Throughout the proceedings, Meta disputed the claims made by prosecutors and defended its approach to content moderation and user protection.
Attorney General Raúl Torrez has framed the verdict as a broader statement about accountability in the tech industry. He has argued that the case underscores increasing concern among families, educators, and policymakers about how social media platforms impact children.
The state’s legal team emphasized that the trial brought forward internal concerns within the company, suggesting that risks to young users were known but not fully addressed. The verdict, they argue, signals that companies can no longer ignore such warnings without facing consequences.
Legal experts say the case could influence similar lawsuits across the country, particularly as regulators and prosecutors look for new ways to hold tech firms accountable.
Despite the jury’s decision, the case is not yet finished. A second phase of the proceedings is scheduled to begin on May 4, in which a judge will evaluate whether Meta’s conduct amounts to a public nuisance.
If the court rules against the company again, it could be required to take additional steps beyond financial penalties. These may include funding public initiatives designed to address harms linked to social media use and implementing changes to how its platforms function.
Proposals put forward by the state include stricter age verification systems, stronger enforcement against predatory accounts, and adjustments to features that may enable harmful behavior to go undetected. Such measures could reshape how Meta designs and operates its services, especially for younger users.
A major point of discussion during the trial involved internal company communications relating to privacy and encryption. Prosecutors presented documents indicating that employees had raised concerns about the potential impact of expanding end-to-end encryption in messaging services.
The issue dates back to 2019, when CEO Mark Zuckerberg outlined plans to make private messaging more secure. While encryption is widely seen as a tool for protecting user privacy, prosecutors argued that it could also make it more difficult to detect and report illegal activity, including child exploitation.
The internal discussions highlighted a tension between maintaining user privacy and ensuring platform safety. Meta, however, argued that the materials were presented without full context and reiterated that it continues to improve its safety systems.
The case also illustrates a broader shift in how legal challenges against social media companies are being framed. Rather than focusing solely on harmful content posted by users, prosecutors are increasingly targeting the design and functionality of platforms themselves.
This approach attempts to bypass legal protections such as Section 230 of the Communications Decency Act, which has traditionally shielded tech companies from liability for user-generated content.
Attorney General Raúl Torrez has applied similar strategies in other cases, including ongoing litigation against Snap Inc. These efforts reflect a growing push to hold companies accountable not just for what appears on their platforms, but for how those platforms are built and operated.