French Families Sue TikTok, Alleging Algorithm Promoted Harmful Content Leading to Teen Suicides

Seven French families have filed a lawsuit against TikTok, alleging that the platform’s algorithm exposed their teenage children to harmful content, which they claim contributed to the suicides of two 15-year-olds.

The lawsuit, submitted to the Créteil judicial court, argues that TikTok’s algorithm promoted videos encouraging suicide, self-harm, and eating disorders. According to Laure Boutron-Marmion, the families’ lawyer, this marks the first collective lawsuit in Europe seeking to hold TikTok legally accountable for such content.

“The parents want TikTok’s legal liability to be recognised in court,” Boutron-Marmion told *franceinfo*, asserting that, by offering its platform to a young audience, TikTok should bear responsibility for its harmful effects. The lawsuit claims that TikTok’s content moderation failed to protect underage users from dangerous material.

Social media platforms, including TikTok, have faced rising concerns over their influence on young users’ mental health. In the United States, hundreds of similar lawsuits have been filed against TikTok and Meta (parent company of Facebook and Instagram), accusing their algorithms of fostering addiction and negatively impacting children’s mental well-being.

TikTok has not yet responded to the lawsuit. However, the company has previously emphasized its commitment to addressing issues related to children’s mental health. Earlier this year, CEO Shou Zi Chew assured U.S. lawmakers that TikTok has invested in initiatives aimed at safeguarding young users.
