London: British MPs on Thursday rebuked global social media giants for failing to act against extremist material and allowing their sites to become "recruiting platforms for terrorism".
The UK's Home Affairs Select Committee, which is chaired by Indian-origin MP Keith Vaz, called on these companies to stop "passing the buck" and show a "greater sense of responsibility".
"Huge corporations like Google [which owns YouTube], Facebook and Twitter, with their billion-dollar incomes, are consciously failing to tackle this threat and passing the buck by hiding behind their supranational legal status, despite knowing that their sites are being used by the instigators of terror," Vaz said.
In its report, the influential House of Commons committee called for the industry to publish quarterly statistics showing the number of sites and accounts taken down and the reasons behind the action.
It added that anything that cannot legally appear in the print or broadcast media should also not be allowed on social media. The report said: "Networks like Facebook, Twitter and YouTube are the vehicle of choice in spreading propaganda and they have become the recruiting platforms for terrorism.
"They must accept that the hundreds of millions in revenues generated from billions of people using their products needs to be accompanied by a greater sense of responsibility and ownership for the impact that extremist material on their sites is having," it added.
The report criticised the companies for failing to be more specific about their counter-extremism efforts, including how many staff they had working on counter-extremism, and whether they had devoted sufficient resources from their vast revenues to developing systems that could automatically identify and remove content.
Twitter, Facebook and Google all gave evidence to the committee and claimed they took their responsibilities seriously and cooperated with security agencies.
Simon Milner, director of policy for Facebook UK, told the BBC the company had given extensive evidence to MPs about how it had been developing its counter-extremism strategy.
He said: "Terrorists and the support of terrorist activity are not allowed on Facebook and we deal swiftly and robustly with reports of terrorism-related content.
"In the rare instances that we identify accounts or material as terrorist, we'll also look for and remove relevant associated accounts and content."
A spokesperson for YouTube added: "We take our role in combating the spread of extremist material very seriously. We remove content that incites violence, terminate accounts run by terrorist organisations, and respond to legal requests to remove content that breaks UK law."
A Scotland Yard unit that works with social media companies is currently overseeing the removal of more than 1,000 pieces of extremist or illegal material a week.
Updated Date: Aug 25, 2016 16:15