Facebook, Twitter & YouTube ‘consciously failing’ to tackle online extremism – MPs

A man types on a keyboard in front of a computer screen on which an Islamic State flag is displayed. © Dado Ruvic
Social media giants Facebook, Twitter, YouTube, and others are “consciously failing” to combat extremist material promoting terrorism on their sites, an influential group of MPs claims.

The Home Affairs Select Committee has accused internet giants such as Google of “passing the buck” and allowing their websites to become “recruiting platforms for terrorism.”

Committee chair Keith Vaz MP described online forums, message boards, and social media platforms as “the lifeblood of Daesh,” also known as Islamic State (IS, formerly ISIS/ISIL).

His damning assessment comes a week after it emerged that UK authorities had faced difficulties when trying to get the internet posts of radical Islamist cleric Anjem Choudary taken down, even after he was arrested for supporting IS.

The parliamentary inquiry into tackling radicalization said that social media platforms have become the “vehicle of choice in spreading propaganda.”

Vaz said: “Huge corporations like Google, Facebook and Twitter, with their billion-dollar incomes, are consciously failing to tackle this threat and passing the buck by hiding behind their supranational legal status, despite knowing that their sites are being used by the instigators of terror.”

The Labour MP added: “The companies’ failure to tackle this threat has left some parts of the internet ungoverned, unregulated and lawless.”

Internet companies reacted strongly to the MPs’ condemnation, insisting that they take combating extremism seriously.

Twitter said it has suspended 235,000 accounts for promoting terrorism in the six months since February, while Facebook insisted it has dealt “swiftly and robustly” with reports of terrorist-related content.

Google told the committee it had removed more than 14 million videos worldwide in 2014 connected to all types of abuse.

MPs want the government to introduce measures that would force web providers to cooperate with the UK authorities by promptly investigating reported hate speech sites and closing them down, or providing an explanation for why they remain online.

Facebook director of policy Simon Milner said: “As I made clear in my evidence session, terrorists and the support of terrorist activity are not allowed on Facebook and we deal swiftly and robustly with reports of terrorism-related content.

In the rare instances that we identify accounts or material as terrorist, we’ll also look for and remove relevant associated accounts and content. Online extremism can only be tackled with a strong partnership between policymakers, civil society, academia and companies.”

A YouTube spokesman said: “We remove content that incites violence, terminate accounts run by terrorist organisations and respond to legal requests to remove content that breaks UK law. We’ll continue to work with government and law enforcement authorities to explore what more can be done to tackle radicalisation.”