By: Benyamin Davidsons
Meta, the parent company of Instagram, Facebook and WhatsApp, is facing a new onslaught of lawsuits alleging disturbing social and mental-health effects of Instagram on children and teens.
As reported by the NY Post, experts say the recent slew of lawsuits uses a novel argument that could put a real damper on Mark Zuckerberg’s social-media empire. The suits include numerous harrowing stories of young people whose anorexia and depression were exacerbated by Instagram posts, leading to self-harm and even suicide. “In what universe can a company have a product that directs this kind of vile filth, this dangerous content to kids — and get away with it?” said Matthew Bergman, the founder of the Social Media Victims Law Center, which has filed more than a half-dozen of the lawsuits. “These products are causing grievous harm to our kids.”
The suits lean heavily on last year’s leaks by whistleblower Frances Haugen, who exposed internal Meta documents showing that Instagram worsens mental-health problems for many teens. The leaks prove, the suits claim, that Meta knew its platform was hurting children but chose to keep at it, prioritizing its own growth and profits over the safety of users. Some of the suits also name Snapchat and TikTok, which plaintiffs say likewise offer addictive products despite knowing the harmful outcomes.
The social media giants have been protected from such litigation by Section 230 of the Communications Decency Act, which protects internet users’ free speech and thereby shields web platforms from legal liability for content posted by users. Bergman, however, is pursuing a novel legal strategy based on Haugen’s leaks. As per the Post, Bergman argues that Instagram is not harmful merely because of comments made by third parties; harmful content is embedded in the app itself. Instagram’s design, he says, can purposely direct vulnerable users toward such content, as demonstrated in Haugen’s leaks. Therefore, he argues, Section 230 should not shield the company from lawsuits pinning liability for suicide or other harm on it. “It’s our belief that when you attack the platform as a product, that’s different than Section 230,” Bergman said. “230 has been a barrier and it’s something we take seriously and we believe we have a viable legal theory to get around it.”
One of the lawsuits focuses on 14-year-old Englyn Roberts of Louisiana, who died by suicide in 2020. Per the Post, the suit, filed in federal court, alleges that the more Roberts interacted with “harmful images and videos” on Instagram, Snapchat and TikTok, the more the apps suggested gory content to keep her hooked on “violent and disturbing content glorifying self-harm and suicide.” According to screenshots included in court documents, Roberts appeared to imitate one of the videos she had watched, of a woman hanging herself from a door with an extension cord.
In August 2020, Roberts similarly used an extension cord to hang herself from a door, leading to her death. “What became clear in September of 2021 is that Englyn’s death was the proximate result of psychic injury caused by her addictive use of Instagram, Snapchat, and TikTok,” the suit reads. Bergman’s firm is also handling two other lawsuits involving suicides, as well as suits filed by victims who suffered severe anorexia and other mental trauma stemming from their social media use.

