US teen girl sexually exploited on Snapchat drags Silicon Valley giant to court

A teenager has dragged instant video-messaging app Snapchat to court, alleging it did nothing to prevent the sexual exploitation of girls on its platform. In the lawsuit, filed before a US court on Monday, the girl, identified only as LW to protect her identity as a victim of sexual abuse, and her mother accuse Snapchat of failing to design a platform that could protect its users from “egregious harm”, The Washington Post reported.

The suit, filed in the state of California, seeks at least $5 million in damages and assurances that the company will invest more in protecting its users. “We cannot expect the same companies that benefit from children being harmed to go and protect them,” the girl’s attorney said in a statement. “That’s what the law is for.”

The events date back to when the girl was 12. A man she had met on Snapchat allegedly began demanding nude photographs from her, reassuring her that he was a friend and that he found her pretty. The man, a member of the armed forces convicted last year in a military court on charges related to child pornography and sexual abuse, allegedly saved the girl’s Snapchat photos and videos and shared them with others on the internet, an investigation found.

The girl, now 16, is leading a class-action lawsuit against Snapchat, which holds massive sway over American teenagers, claiming that its designers have done nothing to prevent the sexual exploitation of girls like her on the platform. A class action is a type of lawsuit in which one of the parties is a group of people represented collectively by a member or members of that group.


The case against Snapchat, which has over 300 million active users, exposes a haunting web of abuse and shame on an app that has managed to dodge scrutiny from authorities for years. Ironically, the app prides itself on its reputation as a safe space for users, especially the young, to share their most intimate images and thoughts, according to the report.

The lawsuit raises concerns over privacy and safety, and argues that the systems tech giants depend on to weed out child pornography are dangerously flawed.

“There isn’t a kid in the world that doesn’t have this app,” the girl’s mother told The Washington Post, “and yet an adult can be in correspondence with them, manipulating them, over the course of many years, and the company does nothing. How does that happen?”

Snapchat has defended its key features, self-deleting messages and instant video chats, as helping young people share openly about their lives.

In a statement to The Washington Post, Snap, the app’s parent company, said that it uses “the latest technologies” and develops software “to help us find and remove content that exploits or abuses minors”.

“While we cannot comment on active litigation, this is tragic, and we are glad the perpetrator has been caught and convicted,” Snap spokeswoman Rachel Racusen said. “Nothing is more important to us than the safety of our community.”

Last month, the 11-year-old company told investors that the app now has 100 million daily active users in North America, more than double Twitter’s following in the US, and that 90 percent of its users are between 13 and 24 years of age.
