Twitter rejected pleas to remove child porn from platform because it didn’t ‘violate policies,’ lawsuit claims
A new lawsuit has accused Twitter of turning a blind eye to child pornography on its platform, claiming that it snubbed repeat requests from an underage sex trafficking victim to remove explicit images obtained through blackmail.
The suit, filed by the teenage victim and his mother in the Northern District of California on Wednesday, argues that Twitter refused to pull the sexually graphic videos on the grounds that they did not violate its policies, allowing them to rack up well over 150,000 views.
The plaintiff in the case – identified only as “John Doe” in court records – says he was just 13 when he was manipulated into sharing nude images of himself with a Snapchat user he believed to be a 16-year-old classmate. After he did so, “the correspondence changed to blackmail,” the lawsuit claims, adding that the perpetrators threatened to share the photos with the victim’s “parents, coach, pastor, and others in his community” if he did not send additional material. He complied with the traffickers’ demands, sending sexually explicit videos of himself, some of which included another minor.
At some point in 2019, a “compilation video” featuring the footage extorted from John Doe surfaced on Twitter through at least two accounts, eventually making its way to the victim in January 2020 after “he learned from his classmates that [the] videos of him and another minor were on Twitter and that many students in the school had viewed them.”
Due to the circulation of these videos, he faced teasing, harassment, and vicious bullying, and became suicidal.
The victim – who by this time was 16 years old – immediately informed his parents of the situation, prompting his mother, named as “Jane Doe” in the suit, to take up the issue with school officials, local police, and Twitter directly. That followed at least one previous complaint from a concerned Twitter user in late 2019, who reported one of the accounts that shared footage of the victim. The company took no action and the videos remained live.
On January 21, the plaintiff filed his own complaint with Twitter, telling the platform: “These videos were taken from harassment and being threatened. It is now spreading around school and we need them taken down as we are both minors and we have a police report for the situation.” At Twitter’s request, he provided a photo of his driver’s license to confirm his identity.
Jane Doe filed two additional complaints with the company one day later, to which Twitter replied with identical automated messages promising to review the content in question.
After a full week without a response from the company, despite repeat attempts by the victim’s mother beyond her initial complaints, Twitter finally replied on January 28, stating that it found no problems with the sexually explicit videos and would do nothing to have them removed.
“Thanks for reaching out. We’ve reviewed the content, and didn't find a violation of our policies, so no action will be taken at this time,” Twitter said, while insisting without a hint of irony that “your safety is the most important thing.”
The victim replied the same day, outraged over the platform’s inaction, asking: “What do you mean you don’t see a problem?”
We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down.
While the company ignored John Doe’s final plea, within a few days his family “was able to connect with an agent of the US Department of Homeland Security” through a mutual contact, according to the suit.
“The federal agent also initiated contact with Twitter and at the request of the US federal government, the [explicit content] was finally removed from Twitter on or about January 30, 2020,” the lawsuit continues, adding that the offending accounts were also banned.
Twitter has come under fire in the past for its handling of child pornography, with the Canadian Centre for Child Protection finding in a review last year that the platform makes it “extremely difficult” to report such content, forcing users to locate a form separate from its “easily-accessible report function” found on every tweet.
As of March 2019, Twitter claims to enforce a “zero-tolerance child sexual exploitation policy,” and in its communications with John Doe and his mother said it forwards all reports of such material to the National Center for Missing and Exploited Children. However, despite aggressive efforts by the victim to have images of himself removed from the platform, the company only did so after being contacted by the US federal government, failing to relay John Doe’s case to the Center until that point.
The company’s apparently lax stance on child pornography stands in stark contrast to its forceful policing of political content it deems “hateful” or deems to spread “misinformation,” regularly purging thousands of posts and users – among them even former president Donald Trump – over technical policy violations.