By CAROLYN THOMPSON Associated Press
New York City, its schools and public hospital system announced a lawsuit Wednesday against the tech giants that run Facebook, Instagram, TikTok, Snapchat and YouTube, blaming their “addictive and dangerous” social media platforms for fueling a childhood mental health crisis that is disrupting learning and draining resources.
Children and adolescents are especially susceptible to harm because their brains are not fully developed, the lawsuit said.
“Youth are now addicted to defendants’ platforms in droves,” according to the 311-page filing in Superior Court in California, where the companies are headquartered.
The country’s largest school district, with about 1 million students, has had to respond to disruptions in and out of the classroom, provide counseling for anxiety and depression, and develop curricula about the effects of social media and how to stay safe online, according to the filing. The city spends more than $100 million on youth mental health programs and services each year, Mayor Eric Adams’ office said.
“Over the past decade, we have seen just how addictive and overwhelming the online world can be, exposing our children to a non-stop stream of harmful content and fueling our national youth mental health crisis,” Adams said.
The legal action is the latest of numerous lawsuits filed by states, school districts and others claiming social media companies exploit children and adolescents by deliberately designing features that keep them endlessly scrolling and checking their accounts.
Teenagers know they spend too much time on social media but are powerless to stop, according to the new lawsuit, filed by the city of New York, its Department of Education and New York City Health and Hospitals Corp., the country’s largest public hospital system.
The lawsuit seeks to have the companies’ conduct declared a public nuisance to be abated, as well as unspecified monetary damages.
In responses to the filing, the tech companies said they have developed, and continue to implement, policies and controls that emphasize user safety.
“The allegations in this complaint are simply not true,” said José Castañeda, a spokesman for YouTube parent Google, who said by email that the company has collaborated with youth, mental health and parenting experts.
A TikTok spokesperson cited similar regular collaborations to understand best practices in the face of industry-wide challenges.
“TikTok has industry-leading safeguards to support teens’ well-being, including age-restricted features, parental controls, an automatic 60-minute time limit for users under 18, and more,” an emailed statement said.
Virtually all U.S. teenagers use social media, and roughly one in six teens describe their use of YouTube and TikTok as “almost constant,” according to the Pew Research Center.
A spokesperson for Meta, which owns and operates Facebook and Instagram, said the company wants “teens to have safe, age-appropriate experiences online, and we have over 30 tools and features to support them and their parents. We’ve spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online.”
A statement from Snap Inc., the parent company of Snapchat, said its app is intentionally different from others in that it “opens directly to a camera – rather than a feed of content that encourages passive scrolling – and has no traditional public likes or comments.”
“While we will always have more work to do, we feel good about the role Snapchat plays in helping close friends feel connected, happy and prepared as they face the many challenges of adolescence,” the statement said.