TikTok is designed to be addictive to kids and causes them harm, US states’ lawsuits say

More than a dozen states and the District of Columbia filed lawsuits against TikTok on Tuesday, saying that the popular short-form video app is designed to be addictive to kids and harms their mental health.

The lawsuits stem from a national investigation into TikTok, which was launched in March 2022 by a bipartisan coalition of attorneys general from many states, including New York, California, Kentucky and New Jersey. All of the complaints were filed in state courts.

At the heart of each lawsuit is the TikTok algorithm, which powers what users see on the platform by populating the app’s main “For You” feed with content tailored to people’s interests. The lawsuits note TikTok design features that they say addict children to the platform, such as the ability to scroll endlessly through content, push notifications that come with built-in “buzzes” and face filters that create unattainable appearances for users.

“They’ve chosen profit over the health and safety, well-being and future of our children,” California Attorney General Rob Bonta said at a news conference in San Francisco. “And that is not something we can accept. So we’ve sued.”

The latest lawsuits come nearly a year after dozens of states sued Instagram parent Meta Platforms Inc. in state and federal courts for harming young people and contributing to the youth mental health crisis by knowingly and deliberately designing addictive features that keep kids hooked on their platforms.

Keeping people on the platform is “how they generate massive ad revenue,” District of Columbia Attorney General Brian Schwalb said in an interview. “But unfortunately, that’s also how they generate adverse mental health impacts on the users.”

The legal challenges, which also include Google’s YouTube, are part of a growing reckoning against social media companies and their effects on young people’s lives. In some cases, the challenges have been coordinated in a way that resembles how states previously organized against the tobacco and pharmaceutical industries.

TikTok, though, is facing an even bigger obstacle, as its very existence in the U.S. is in question. Under a federal law that took effect earlier this year, TikTok could be banned from the U.S. by mid-January if its China-based parent company, ByteDance, doesn’t sell the platform by then. Both TikTok and ByteDance are challenging the law at an appeals court in Washington. A panel of three judges heard oral arguments in the case last month and is expected to issue a ruling, which could be appealed to the U.S. Supreme Court.

In its filings Tuesday, the District of Columbia called the algorithm “dopamine-inducing,” and said it was created to be intentionally addictive so the company could trap many young users into excessive use and keep them on its app for hours on end. TikTok does this despite knowing that these behaviors will lead to profound psychological and physiological harms, such as anxiety, depression, body dysmorphia and other long-lasting problems, the district said.

TikTok is disappointed that the lawsuits were filed after the company had been working with the attorneys general for two years on addressing the issues, a spokesman said.

“We strongly disagree with these claims, many of which we believe to be inaccurate and misleading,” the TikTok spokesman, Alex Haurek, said. “We’re proud of and remain deeply committed to the work we’ve done to protect teens and we will continue to update and improve our product.”

The social media company does not allow children under 13 to sign up for its main service and restricts some content for everyone under 18. But Washington and several other states said in their filings that children can easily bypass those restrictions, allowing them to access the service adults use despite the company’s claims that its platform is safe for children.

The District of Columbia alleges TikTok is operating as an “unlicensed virtual economy” by allowing people to purchase TikTok Coins – a virtual currency within the platform – and send “Gifts” to streamers on TikTok LIVE, who can cash them out for real money. TikTok takes a 50% commission on these financial transactions but hasn’t registered as a money transmitter with the U.S. Treasury Department or authorities in the district, according to the complaint.

Officials say teens are frequently exploited for sexually explicit content through TikTok’s LIVE streaming feature, which has allowed the app to operate essentially as a “virtual strip club” without any age restrictions. They say the cut the company gets from the financial transactions allows it to profit from exploitation.

The 14 attorneys general say the goal of their lawsuits is to stop TikTok from using these features, impose financial penalties for the company’s alleged illegal practices and collect damages for users who have been harmed.

The use of social media among teens is nearly universal in the U.S. and many other parts of the world. Almost all teens ages 13 to 17 in the U.S. report using a social media platform, with about a third saying they use social media “almost constantly,” according to the Pew Research Center.

High school students who frequently use social media more commonly have persistent feelings of sadness or hopelessness, according to a new survey from the Centers for Disease Control and Prevention conducted last year in which about 20,000 teenagers participated.

Last week, Texas Attorney General Ken Paxton sued TikTok, alleging the company was sharing and selling minors’ personal information in violation of a new state law that prohibits these practices. TikTok, which disputes the allegations, is also fighting against a similar data-oriented federal lawsuit filed in August by the Department of Justice.

Several Republican-led states, including Nebraska, Kansas, New Hampshire, Iowa and Arkansas, also previously sued the company, some unsuccessfully, over allegations it is harming children’s mental health, exposing them to “inappropriate” content or allowing young people to be sexually exploited on its platform.