Several wildly popular social media platforms—including Facebook, Instagram, and WhatsApp—intentionally target young users with addictive features, according to lawsuits brought by dozens of states. These suits claim that these features have played a significant role in a nationwide youth mental health crisis.
Let's take a look at these cases and the potential impact on social media platforms and their billions of users.
Over 40 states have filed a string of federal and state lawsuits against Meta, which operates Facebook, Instagram, Messenger, Threads, and WhatsApp, among other platforms. These cases allege that Meta's business model depends on intentionally addicting teenagers and children to its platforms while falsely asserting that the platforms are safe for young users.
The biggest of these lawsuits, brought by 33 states in all, claims that Meta has "profoundly altered the psychological and social realities of a generation of young Americans" with its conduct.
What's more, these states claim that Meta has gone out of its way to mislead consumers about the "substantial dangers" represented by Facebook, Instagram, and other platforms. They allege that, in the name of profits, the company has concealed the ways the platforms are designed to "exploit and manipulate" children and teenagers.
According to the lawsuits, the most harmful features of Meta's sites and apps include:
The 33-state lawsuit claims that Meta knew exactly what it was doing. The states charge that the company has been exploiting young users' developing brains while falsely claiming that its sites and apps are "safe."
The largest lawsuit also alleges that Meta's practices amount to "unfair and unconscionable conduct." Plus, it accuses the tech giant of illegally collecting personal information from children under the age of 13, in violation of the Children's Online Privacy Protection Act ("COPPA") and its implementing rule.
The 40-plus states bringing these kinds of lawsuits against Meta are asking the courts for several different kinds of legal remedies, including:
Meta hasn't shown any willingness to make meaningful changes to its social media platforms, or even to publicly acknowledge the dangers these platforms pose to young users. Governments have limited options for getting a company to change its potentially harmful ways, especially when those ways are enormously profitable.
Usually, the options for the federal government and state governments include:
In some sense, the states bringing these cases are asking the courts to do something Congress hasn't done: rein in Meta's and other social media companies' approach to young users.
It's too early to say. At least one attorney general (Colorado's Phil Weiser) has said that this lawsuit was filed only after Meta refused to compromise.
It's not clear what "success" might look like in the eyes of the states that have banded together to bring these claims. Any kind of settlement deal is much more likely to involve Meta paying a financial penalty rather than actually changing how it runs its social media platforms. That said, the states might be able to get Meta to change its advertising and marketing practices.
A number of the legal claims made against Meta are similar to those brought against Juul and other manufacturers of e-cigarettes and vaping products, whose tactics were also said to target young people. The multimillion-dollar settlement deals that came from those lawsuits limited the companies' marketing abilities, as with restrictions on in-store displays and ad campaigns aimed at people under 35.
This legal action against Meta also has a lot in common with the wide-ranging lawsuits filed against opioid manufacturers (such as OxyContin maker Purdue Pharma). Those suits have led to settlements in the billions of dollars and prompted large-scale funding to fight the nationwide opioid crisis.
The states suing Meta are likely hoping for similarly significant changes. A potentially key difference here, though, is that the fundamental issue is mental rather than physical health.
No. These lawsuits were filed by states, not individuals. The states filed them in the "public interest," to stop and punish behavior that affects a significant number of people in those states. These aren't "class actions" that individual consumers can join, and the states aren't asking for a legal remedy for any particular people.
As the biggest case against Meta progresses, and as others like it are filed, the legal landscape will start to come into clearer focus: What legal arguments have the most promise? Are social media companies at all willing to change their practices? At this stage, though, it's very much a "watch and wait" situation for young social media users and their parents.
Individuals and their families in the U.S. and abroad have filed several social media addiction-related personal injury cases in recent years. These lawsuits have tried to hold various platforms liable for mental health issues, suicides, and other harm to young users. In October 2022, a number of these cases were pulled together in federal court, in a "multi-district litigation" (MDL) lawsuit (called "In Re: Social Media Adolescent Addiction/Personal Injury Product Liability Litigation") looking to hold the companies behind Facebook, Instagram, Snapchat, TikTok, and YouTube responsible for addicting and harming kids and teens.
Lawsuits like this rely at least in part on the legal theory of "product liability," which is often used to hold companies responsible when an unreasonably dangerous or defective product causes injury. But personal injury and product liability lawsuits involve a legal hurdle that lawsuits brought by states don't.
It's one thing to investigate a corporation, get hold of its internal communications, and establish that it knew its product was almost certainly harming lots of its users. It's also not much of a stretch to intuitively conclude that teenagers' near-constant use of social media is likely to have negative effects on their mental health and well-being. But bringing a successful personal injury lawsuit (or product liability case) requires clear proof that that corporation's product was the actual cause of the injured person's specific harm.
So, the challenge is proving that a specific young person's development of a mental health disorder or other diagnosable injury was a direct result of their use of social media. And that's on top of showing that the company knew (or should've known) that its product was harmful. (Learn more about proving a product liability claim.)
Even though proving direct harm can be a challenge, it isn't impossible. In fact, the state lawsuits we've discussed in this article might help people (and potential jurors) understand that social media can cause clear, detectable harm to young people.
It's possible, but so far, tech companies' use of this defense has seen only limited success when it comes to claims of intentionally targeting and addicting young users.
Companies like Meta are mostly protected from liability when it comes to content that's posted on their platforms by third parties. This protection comes from a federal law commonly known as "Section 230" (part of the Communications Decency Act of 1996). So, for example, if harmful content is published on a company's platform (like a social media post that's defamatory), the person who posted the material can be sued for harm, but the platform itself can't usually be held liable. Tech companies can also use Section 230 to avoid most liability for moderating content that violates the law, or that goes against platform guidelines, as long as those restrictions and decisions are made in "good faith."
In response to the October 2022 multi-district litigation (MDL) we discussed in the previous section, Meta, Google, and other tech companies argued that the cases should be dismissed based on Section 230 and the First Amendment. But in November 2023, the judge in the case (mostly) disagreed, finding that those protections bar only a handful of the plaintiffs' product liability claims against the companies. The majority of the claims in the MDL are allowed to proceed. This MDL is a collection of cases filed by individuals, not by states, but the same Section 230 analysis could apply to both kinds of lawsuits.
Perhaps not quite yet. Experts have been shedding light on how addictive and harmful social media can be, but this area of law is unsettled. We don't yet know whether and how Meta and other social media companies will be held liable. Consumers, businesses, and legal professionals are closely watching to see what happens.
Having said that, if you believe you have a solid case, it might make sense to at least discuss your situation with a lawyer. Doing so could be especially important if the harm you'd be suing over occurred more than a year or two ago.
A law called a "statute of limitations" sets a deadline on your right to bring a case to court, and it can be tricky to figure out when the "clock" starts running on these kinds of cases. Depending on the law in your state, you might need to get your lawsuit filed sooner rather than later to preserve your legal options.