New York City filed a lawsuit on Wednesday (February 14) against TikTok, Instagram, Facebook, Snapchat and YouTube, accusing the social media platforms of contributing to a “nationwide youth mental health crisis.”
The lawsuit — filed in California Superior Court by the City of New York, its school district and the New York City Health and Hospitals Corporation — alleges that the companies knowingly designed their platforms to be addictive and harmful to children and teens, according to a press release from the office of NYC Mayor Eric Adams.
The city claims that the social media companies use algorithms that keep users engaged for extended periods, leading to “compulsive use.” Additionally, the lawsuit accuses the platforms of manipulating users by employing features like automatic “seen” notifications and push notifications, encouraging constant engagement and immediate responses.
“Over the past decade, we have seen just how addictive and overwhelming the online world can be, exposing our children to a non-stop stream of harmful content and fueling our national youth mental health crisis,” said Mayor Adams.
New York City Corporation Counsel Sylvia Hinds-Radix stressed that social media companies like TikTok, Snapchat, YouTube, and Meta “are fueling a national youth mental health crisis.”
“These companies have chosen profit over the wellbeing of children by intentionally designing their platforms with manipulative and addictive features and using harmful algorithms targeted to young people. Social media companies should be held accountable for this misconduct and for the harms they cause to our children, schools, and entire communities,” Hinds-Radix said.
Some of the social media companies have responded to the allegations, with Snap maintaining that its platform prioritizes communication with friends and minimizes passive scrolling compared to traditional social media.
“Snapchat was intentionally designed to be different from traditional social media, with a focus on helping Snapchatters communicate with their close friends. Snapchat opens directly to a camera – rather than a feed of content that encourages passive scrolling – and has no traditional public likes or comments,” Ashley Adams, a spokeswoman for Snap Inc., was quoted by ABC News as saying.
Meta, the owner of Instagram and Facebook, emphasized its extensive safety tools and features for teens and their parents.
“We want teens to have safe, age-appropriate experiences online, and we have over 30 tools and features to support them and their parents,” a Meta spokesperson reportedly said.
A TikTok spokesperson mentioned age-restricted features, parental controls, and time limits in place to safeguard young users, telling ABC News, “We regularly partner with experts to understand emerging best practices, and will continue to work to keep our community safe by tackling industry-wide challenges.”
José Castañeda, a Google spokesperson, said, “Providing young people with a safer, healthier experience online has always been core to our work. In collaboration with youth, mental health and parenting experts, we’ve built services and policies to give young people age-appropriate experiences, and parents robust controls. The allegations in this complaint are simply not true.”
The city is seeking financial compensation for the resources it has allocated to address mental health issues among students, which it attributes in part to the social media platforms’ influence. The lawsuit remains in its early stages, and the court will determine the validity of the city’s claims and potential consequences for the social media companies.
Adams’ office said the city spends more than $100 million on youth mental health programs and services each year. In a speech on Wednesday, he said, “We are filing litigation today demanding that companies be held accountable for their platform’s damage and influence on the mental health of our young people, and while seeking to recover the cost of addressing this ongoing public health threat as well.”
The development comes as TikTok also faces a potential probe and hefty fines in the European Union over child safety concerns. Bloomberg reported Sunday (February 11), citing “people familiar with the matter,” that the European Commission could launch a probe into TikTok in the coming weeks to examine the platform’s practices regarding child safety.
The EU already opened a probe into TikTok and YouTube in November 2023 to assess how the platforms ensure the safety of minors.
In April 2023, TikTok was slapped with a £12.7 million (approx. $16 million) fine in the UK for a number of data protection breaches, including ‘misusing children’s data’. And in September 2023, the Irish Data Protection Commission (DPC) hit TikTok with a €345 million (approx. $370 million) fine for violating regulations on children’s privacy.
Music Business Worldwide