British spooks delete 1,000 extremist videos weekly – still can’t keep up with propagandists
Extremist material is spreading online at such a fast rate that UK security services are unable to keep up, Home Secretary Amber Rudd has admitted.
The new minister blamed US tech giants Google, Facebook and Twitter for not doing enough to stop terrorists uploading propaganda to the internet.
Social media companies have previously defended their efforts in the online war on terror, insisting they are removing extremist content when it is discovered.
Rudd was grilled by Parliament’s Home Affairs Committee on Wednesday about how US internet firms could be made to do more in the battle against extremist propaganda.
The Home Secretary agreed that companies like Google and Facebook could take more responsibility because of the sheer speed at which such videos are uploaded.
“At the moment we are taking down 1,000 a week of these sites – [this] is too slow compared to the speed at which they are communicated,” she told MPs.
Asked what more could be done to force internet firms to take down websites “frequently and regularly” and also report content to the police, the Home Secretary said she was in discussions with the tech giants.
“We would like to see a form of industry standard board that they could put together in order to have an agreement of oversight and to take action much more quickly on sites which will do such damage to people in terms of making them communicating terrorist information.”
Rudd said the industry standards board could be similar to an existing body that protects children from exploitation online.
Last month the Home Affairs Select Committee accused internet giants such as Google of “passing the buck” and allowing websites to become “recruiting platforms for terrorism.”
Then-committee chair Keith Vaz MP described online forums, message boards, and social media platforms as “the lifeblood of Daesh,” also known as Islamic State (IS, formerly ISIS/ISIL).
YouTube defended its track record in tackling extremism, claiming: “We remove content that incites violence, terminate accounts run by terrorist organizations and respond to legal requests to remove content that breaks UK law. We’ll continue to work with government and law enforcement authorities to explore what more can be done to tackle radicalization.”