It happened this week: two landmark U.S. jury verdicts find Meta and Google negligent
March 26, 2026
Two jury verdicts delivered this week—one in New Mexico and one in California—illustrate a growing shift in the public's perception of social media companies and their responsibilities in keeping young people safe on their platforms.
For years, social media companies have disputed allegations that they harm children’s mental health through deliberate design choices that addict kids to their platforms and fail to protect them from sexual predators and dangerous content.
New Mexico state prosecutors argued that Meta prioritized profits over safety and violated parts of the state’s Unfair Practices Act. The jury found Meta liable for failing to protect young people from online dangers, including sexually explicit content, solicitation and human trafficking, determining that Meta knowingly harmed children’s mental health and concealed what it knew about child sexual exploitation on its platforms. The jury imposed a $375 million penalty.
A California jury determined that Google and Meta were negligent for failing to provide adequate warnings about the potential dangers of their products. What appeared to persuade the jury were features that Meta and YouTube built into their software, like infinite scroll, algorithmic recommendations, and autoplay videos, all designed to get young users to engage compulsively with the platforms. Internal company documents showed that Meta and YouTube executives knew of and discussed the negative effects of their products on children. The jury awarded the plaintiff $6 million in compensatory and punitive damages, with Meta responsible for 70 per cent of the total.
In Canada, legal action is being filed against social media companies, including a proposed class-action suit against Meta that could include thousands of children across the country. In the U.S., several state and federal court cases are currently heading to trial. While details vary, they all seek to hold Meta and other social media companies responsible for what happens on their platforms.
In December, Australia banned young people from using social media. Malaysia, Spain, and Denmark are considering similar rules.
So is Canada. On March 12th, The Honourable Marc Miller, Minister of Canadian Identity and Culture and Minister responsible for Official Languages, reconvened the expert advisory group on online safety to engage on new and emerging issues related to online harms. This is the same group that held a series of nine workshops on online safety in 2022. The panel will reunite this spring, bringing knowledge and experience from a variety of fields to the question of online platforms’ responsibility for creating a safer online environment for Canadians, especially kids.