By design —

41 states sue Meta for allegedly addicting kids to Facebook and Instagram

Meta repeatedly chose not to design platforms safe for kids, states allege.

State attorneys general in 41 states and the District of Columbia sued Meta today. The move comes after the conclusion of a multistate probe launched in 2021, in which a bipartisan coalition of state enforcers began examining how Facebook and Instagram features are allegedly designed to addict and harm kids.

Back in 2021, the Massachusetts attorney general's office led the multistate probe investigating "Instagram's impacts on young people" after Facebook whistleblower Frances Haugen revealed that Facebook knew Instagram was "toxic" to teen girls but downplayed risks to the public. In a press release today, Massachusetts Attorney General Andrea Joy Campbell accused Meta of "deliberately" exploiting "young users' vulnerabilities for profit."

Eight states and Washington, DC, filed lawsuits against Meta in state and local courts, while 33 states filed a joint lawsuit in a federal court in California, The Washington Post reported.

According to the Post, these lawsuits together mark the "most significant effort" yet by states to force social media platforms to carefully weigh potential harms to children when designing product features.

Ahead of states' announcement, a Meta spokesperson told Reuters that Meta is "disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path."

Campbell's office said that Meta left states no choice but to sue, "because Meta has shown that it will not act responsibly unless it is required to do so by courts of law."

In addition to accusing Meta of harming kids, Campbell's office alleged that Meta's defective product designs have placed an "undue burden" on state school systems, which have allegedly had to increase mental health expenditures to "address the mental and physical health harms that Meta has contributed to" in youths.

"Meta preys on our young people and has chosen to profit by knowingly targeting and exploiting their vulnerabilities," Campbell said. "In doing so, Meta has significantly contributed to the ongoing mental health crisis among our children and teenagers."

Meta has argued that research is mixed when it comes to measuring the alleged harmful effects of social media on young users. The Post noted that the American Psychological Association released a report in May, concluding that "using social media is not inherently beneficial or harmful to young people."

But state enforcers seem convinced by research that they say shows steep declines in mental health for teens after just one hour of social media use per day, including decreases in happiness and self-esteem and increases in self-harm, depression, and behavioral challenges. Massachusetts' complaint also points to long-term psychological risks.

Massachusetts plans to argue that "Meta secretly utilizes design features that deliberately exploit and capitalize off young users’ unique vulnerabilities and overcome young people’s ability to self-regulate their time spent on its platform." Those features range from notifications and "infinite scroll," which keep users engaged, to auto-playing Reels and disappearing Stories, which Massachusetts has claimed "create a sense of 'FOMO' (fear of missing out)."

"These features were designed and deployed with the intent of hooking young users into spending as much time as possible on the platform, to lure them back when they try to stop, and to overwhelm their ability to control or regulate their own use, with significant and concerning negative impacts on the brain development and mental health of teen users," Campbell's press release said.

Since 2021, Meta has made changes to address feedback that its platforms may be harmful to kids. Parents now have tools to monitor their kids' activity, and young users are better protected with stronger default privacy settings and alerts nudging them to stop scrolling. But for states—which have largely failed in efforts to pass laws requiring social media platforms to design products responsibly for kids—these changes have not been enough.

"Instead of prioritizing young users’ well-being—as it publicly claimed—Meta repeatedly and deliberately chose not to implement measures and design changes it knew could reduce harms and improve young users’ well-being," Campbell's press release alleged.

Campbell said that until Facebook and Instagram are deemed safer for kids, states "will continue to push for meaningful changes to Meta’s platforms that protect our young people."
