Twitter vs ISIS: Social network suspends 125,000 accounts over terror activity
The move comes as the US government has increased calls for social media companies to do more to respond to abusive behavior online.
Twitter announced on Friday that it had suspended more than 125,000 accounts over the last eight months for threatening or promoting terrorist acts, mainly related to Islamic State (IS, formerly ISIS/ISIL) militants.
“Like most people around the world, we are horrified by the atrocities perpetrated by extremist groups. We condemn the use of Twitter to promote terrorism,” Twitter said in a statement released Friday. “As the nature of the terrorist threat has changed, so has our ongoing work in this area.”
Social media has been criticized as a tool for recruitment and radicalization used by the Islamic State group and its supporters. The Associated Press reported on accounts the terror group allegedly used to send “tens of thousands of tweets per day.”
“We have already seen results, including an increase in account suspensions and this type of activity shifting off of Twitter,” Twitter said in a blog post.
Since late 2015, the company has been using spam-fighting software to find accounts that might be violating its terms of service. It is also cooperating with law enforcement and online organizations to counteract abuse and terrorist recruitment online.
The White House on Friday said Twitter’s announcement was “very much welcome.” The Obama administration and the Department of Defense have repeatedly referred to the Islamic State’s use of social media as a recruitment tool.
This month, meanwhile, Google announced it had begun two pilot programs aimed at curbing Islamic State influence online. Under one effort, people using its search engine for extremist-related material will be diverted to anti-terror sites instead. The other program will focus on easier identification of extremist videos.
“We should get the bad stuff down, but it’s also extremely important that people are able to find good information, that when people are feeling isolated, that when they go online, they find a community of hope, not a community of harm,” said Anthony House, Google’s senior manager for public policy and communications, according to the Guardian.
House said 14 million videos were taken down from YouTube in 2014 for a number of reasons, including terrorist content. The company also noted some 100,000 “flags,” or signals that users find certain content inappropriate.
Twitter admitted there is no “magic algorithm” for finding terrorist content on its network, and that companies are forced to make challenging judgments based on very limited information and guidance.
Twitter also noted that, as a result of its anti-terror actions, it has observed an increase in terrorist activity on other online forums, CNN reported.
“As an open platform for expression, we have always sought to strike a balance between the enforcement of our own Twitter Rules covering prohibited behaviors, the legitimate needs of law enforcement, and the ability of our users to share their views freely – including views that some may disagree with or find offensive,” said Twitter in its blog.