Meta Fined $375M Over Child Safety Violations

Photo: Meta chairman and chief executive Mark Zuckerberg

By Our Correspondent

National News – A New Mexico court has ordered Meta, the parent company of Facebook, Instagram, and WhatsApp, to pay $375 million for misleading users about the safety of its platforms for children.

The verdict follows a seven-week trial in which jurors reviewed internal Meta documents and employee testimony showing that the company knew minors were being exposed to sexual content and online predators.

New Mexico Attorney General Raul Torrez called the decision “historic,” marking the first successful state lawsuit against Meta for child safety violations.

The jury determined that Meta violated the state’s Unfair Practices Act by falsely claiming its platforms were safe for young users.

Former Meta engineer Arturo Béjar, who became a whistleblower, testified that Instagram experiments had shown underage users were served sexualized content and that his own daughter had been propositioned online.

State prosecutors highlighted internal research showing that, in a single week, 16% of Instagram users reported seeing unwanted nudity or sexual content.

Meta argued that it had taken steps to protect minors, including introducing Teen Accounts in 2024 and, last month, parental alerts for self-harm content.

The total penalty was calculated from thousands of individual violations of the act, each carrying a fine of up to $5,000.

Meanwhile, Meta faces additional lawsuits across the U.S., including a high-profile case in Los Angeles brought by a woman who alleges she became addicted to Instagram and YouTube as a result of design choices targeting children.

The New Mexico lawsuit, filed in 2023, claimed that Meta’s algorithms deliberately exposed minors to sexually explicit material and potential exploitation.

Torrez stated, “Meta executives knew their products harmed children, disregarded employee warnings, and misled the public. Families and experts say enough is enough.”

Meta has said it disagrees with the verdict and plans to appeal.

The case raises ongoing concerns about social media companies’ responsibility to protect children and the role of algorithms in exposing young users to harmful content.
