By: Najla Alexander
New Jersey Attorney General officials announced that they, along with 41 other attorneys general across the country, have sued Meta in federal and state courts, alleging that the company knowingly designed and deployed harmful features on Instagram and Facebook to purposefully addict children and teens.
Attorney General Matthew J. Platkin and the Division of Consumer Affairs stated that, at the same time, Meta falsely assured the public that these features were safe and suitable for young users.
According to officials, New Jersey and 32 other states filed a joint complaint in federal court asserting that Meta's business practices violate the federal Children's Online Privacy Protection Act (COPPA) as well as state consumer protection laws, including the New Jersey Consumer Fraud Act (CFA).
Eight other attorneys general filed similar actions in their own state courts.
Officials stated these practices have harmed and continue to harm the physical and mental health of children and teens, fueling what the U.S. Surgeon General has deemed a "youth mental health crisis" that has ended lives, devastated families, and damaged the potential of a generation of young people.
"As New Jersey's chief law enforcement officer and as a parent, I feel strongly that there is nothing more important than ensuring the well-being of our children. And we know that in the era of social media, their mental health has never been more at risk," said Attorney General Platkin.
"That is why today, I join dozens of other Attorneys General to once and for all hold Meta and its CEO, Mark Zuckerberg, accountable for deceptive, manipulative practices on Instagram and Facebook that they knew were harmful. Profits – not people, not its most vulnerable users, children, and teens – drive the decision-making at Meta. That stops today."
"Meta knows its platforms are harming children and teens but continues to make every effort to keep kids addicted without even attempting to abide by federal laws meant to protect the most vulnerable," said Cari Fais, Acting Director of the Division of Consumer Affairs.
New Jersey began investigating Meta in 2020 and has co-led the nationwide investigation since 2021, spending thousands of hours investigating Meta's unlawful conduct, officials said.
Officials say the federal complaint, filed in the U.S. District Court for the Northern District of California, alleges that Meta knew that its platforms, including Facebook and Instagram, were harming young people.
Authorities say instead of taking steps to mitigate the psychological and health harms associated with using its platforms, Meta not only concealed these harms but also amplified them by employing features that fueled young users' addiction to its platforms.
The complaint further alleges that Meta knew that young users, including those under 13, were active on the platforms and knowingly collected data from these users without the parental consent that is required by federal law, according to officials.
Meta targeted these young users because, as a 2021 Wall Street Journal article reported, this user base was "valuable, but untapped," officials said.
At the same time, however, officials said, the company disavowed interest in this demographic, publicly maintaining that it did not allow kids under 13 on its platforms.
Officials said while much of the complaint relies on confidential material that is not yet available to the public, publicly available sources, including those previously released by former Meta employees, detail how Meta profited by purposely making its platforms addictive to children and teens.
Its platform algorithms push users to descend into "rabbit holes" to maximize engagement, officials say.
Features like infinite scroll and near-constant alerts were created with the express goal of hooking young users, officials said.
These manipulative tactics continually lure children and teens back onto the platform.
As Aza Raskin, the original developer of the infinite scroll concept, noted to the BBC about the feature's addictive qualities: "If you don't give your brain time to catch up with your impulses ... you just keep scrolling."
Meta knew these addictive features harmed young people's physical and mental health, including undermining their ability to get adequate sleep, but did not disclose or meaningfully try to minimize the harm. Instead, it claimed the platforms were safe for young users.
These choices, the complaint alleges, violate state consumer protection laws, including the CFA, as well as the federal COPPA, officials say.
The federal complaint seeks injunctive and monetary relief to rectify the harms caused by these platforms, officials said.
The multistate coalition that brought today's complaint is also investigating TikTok for similar conduct, authorities say.
According to officials, that investigation remains ongoing, and states have pressed in litigation for adequate disclosure of information and documents after TikTok failed to provide sufficient discovery in response to the multistate coalition's requests.
States joining the federal lawsuit are Arizona, California, Colorado, Connecticut, Delaware, Georgia, Hawaii, Idaho, Illinois, Indiana, Kansas, Kentucky, Louisiana, Maine, Maryland, Michigan, Minnesota, Missouri, Nebraska, New Jersey, New York, North Carolina, North Dakota, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Virginia, Washington, West Virginia, and Wisconsin, officials stated.
Florida is filing its own federal lawsuit in the U.S. District Court for the Middle District of Florida.
Authorities say the District of Columbia, Massachusetts, Mississippi, New Hampshire, Oklahoma, Tennessee, Utah, and Vermont are filing lawsuits in their own state courts, alleging that the company violated numerous state laws and bringing claims including deceptive trade practices, consumer fraud, unlawful trade practices, unjust enrichment, negligence, product liability, and public nuisance.