On October 24, 2023, a bipartisan coalition of 33 states’ attorneys general filed suit against Meta Platforms, Inc., alleging in a lengthy complaint that Meta’s social media platform features are unsafe and designed to induce young users’ compulsive and extended use.1 According to the complaint, which is currently heavily redacted, Meta engaged in a four-part “scheme” to exploit young users for profit. The alleged scheme involved the following:
1) through its development of Instagram and Facebook, Meta created a business model focused on maximizing young users’ time and attention spent on its platforms;
2) Meta designed and deployed harmful and psychologically manipulative product features to induce young users’ compulsive and extended platform use, while falsely assuring the public that its features were safe and suitable for young users;
3) Meta routinely represented to the public a deceptively low incidence of user harms; and
4) despite overwhelming internal research, independent expert analysis, and publicly available data that its social media platforms harm young users, Meta continued to offer harmful features.2
The states allege that Meta’s actions violate the federal Children’s Online Privacy Protection Act (COPPA) (Count I); constitute unfair and/or deceptive acts or practices under the states’ consumer protection statutes; and further constitute unlawful acts under common law principles (Counts II-LIV).
A. Unfair and/or Deceptive Acts or Practices
The complaint alleges that, while Meta represents publicly that its social media platforms are designed to support young users’ well-being, the company’s ad-based business model in fact prioritizes maximizing young users’ engagement over safety, since increased engagement results in higher advertising revenue for the company. Features such as Meta’s “dopamine-manipulating” recommendation algorithms, infinite scroll, autoplay, push notifications, and ephemeral content allegedly serve to increase young users’ time spent on the platform, and discourage their attempts to disengage.3
The complaint draws on Meta’s public statements—such as its blog posts, congressional testimony, and its executives’ public interviews—in which Meta has represented to the public that its platforms are designed to support young users’ well-being. The complaint alleges that, in fact, Meta’s platforms cause significant user harm by encouraging compulsive use in order to maximize profit. Meta’s recommendation algorithms, for instance, are alleged to present material to young users in an “unpredictable sequence rather than displaying posts chronologically,” which allegedly manipulates users into habitually checking and refreshing their social media feeds, leading to “addiction with dopamine implicated.”4
The complaint discusses other features, such as notifications, which are alleged to be addictive and interfere with young users’ education and sleep, and certain visual filters that are alleged to promote eating disorders and body dysmorphia in youth.
The complaint alleges that, “to assuage public concerns about harms to young users on Meta’s social media platforms, Meta routinely published reports purporting to show impressively low rates of negative and harmful experiences by users of its platforms.”5 According to the complaint, these reports are “profoundly misleading”; the states allege that, in fact, Meta knows its platforms cause young users significant physical and mental harm, but downplays or dismisses these negative effects in its public representations.6
Finally, the complaint points to “overwhelming internal research, independent expert analysis, and publicly available data” suggesting that increased use of social media platforms results in physical and mental health harms to young users.7 The complaint alleges that Meta is aware of this issue and is aware that its own platforms directly contribute to the crisis. Despite this knowledge, the complaint alleges, Meta continues to promote and market features that are designed to maximize engagement, and makes public statements that falsely downplay the harms of its platforms.
B. COPPA
The states also allege that Meta violates COPPA with respect to Instagram and Facebook by improperly collecting personal information from users under 13 without parental consent. The complaint further alleges that Meta has marketed its platforms to users under 13 and has actual knowledge of users under 13 on its platforms.
The states allege that Meta possesses actual knowledge of children on Instagram and Facebook and collects their personal information without obtaining verifiable parental consent. For example, Instagram did not require users to self-report their age in order to create an account until December 2019. The complaint alleges that for the seven years before an age gate was put in place, “under-13 users faced no practical obstacles to creating accounts on Instagram.”8 Further, the initial iteration of the age gate defaulted to an age over 13, which the complaint alleges impermissibly encouraged users to falsify their age.
Notably, in a significant departure from the requirements imposed by COPPA and Federal Trade Commission guidance concerning the COPPA rule, the states fault Meta for depending on “an under-13 user to correctly self-report their own age, without any verification.”9 The states emphasize that “Meta has access to, and chooses not to use, feasible alternative age verification methods that would significantly reduce or eliminate the number of underage users on Meta’s Social Media Platforms, for example, by requiring young users to submit student IDs upon registration.”10
Additionally, the states contend that Instagram and Facebook are directed to children based on such factors as third-party estimates of the number of underage users on its platforms, advertising directed to children that promotes Instagram and Facebook and appears on Instagram and Facebook, and certain child-oriented content and accounts on Instagram and Facebook. For example, the complaint references research that purportedly found that, among children ages 9-12, 45 percent used Facebook and 40 percent used Instagram daily.11 The complaint also includes screenshots of advertising campaigns for Instagram “featuring actors who appear to be children or teens,” and provides examples such as advertising promotions for children’s television shows that were run on Facebook and Instagram in July 2023 as evidence that Meta’s platforms promote content directed to children.12
C. Relief Sought
The states request injunctive and other relief.
Wilson Sonsini Goodrich & Rosati closely follows enforcement of and developments concerning COPPA and other children’s privacy and safety regulations. For more information, or if you need assistance with regulatory compliance regarding children’s privacy or safety or other privacy or consumer protection issues, please contact Libby Weingarten, Maneesha Mithal, Brett Weinstein, Dan Chase, or another member of Wilson Sonsini’s privacy and cybersecurity practice.
[1] The states included in the suit are Arizona, California, Colorado, Connecticut, Delaware, Georgia, Hawai’i, Idaho, Illinois, Indiana, Kansas, Kentucky, Louisiana, Maine, Maryland, Michigan, Minnesota, Missouri, Nebraska, New Jersey, New York, North Carolina, North Dakota, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Virginia, Washington, West Virginia, and Wisconsin. Attorneys general in Florida, Massachusetts, Mississippi, New Hampshire, Oklahoma, Tennessee, Utah, Vermont, and the District of Columbia have filed similar lawsuits.
[2] State of Ariz. v. Meta Platforms, Inc., C.A. No. 4:23-cv-05448, Compl. for Injunctive and Other Relief (“Compl.”) ¶ 2 (N.D. Cal. Oct. 24, 2023).