MPs slam social media giants for failure to tackle hate online
Report accuses firms of being too slow to take down content
Last updated 1st May 2017
MPs have accused social media giants of a "shameful" failure to tackle online terrorist propaganda and hate speech.
A hard-hitting report accused the major firms of putting more effort into protecting their profits than into keeping the public safe online.
"Nowhere near enough is being done''
Ministers should consider forcing firms to contribute to the cost of policing social media and introducing a system of sanctions, including multimillion-pound fines, the Home Affairs Select Committee said.
The cross-party committee, which took evidence from Google, Facebook and Twitter, acknowledged that the technology giants had considered the impact that online hate, abuse and extremism can have on individuals and welcomed measures taken to tackle the problem.
But they said "nowhere near enough is being done".
The committee said it had found "repeated examples of social media companies failing to remove illegal content when asked to do so", including "dangerous" terrorist recruitment material, promotion of sexual abuse of children and incitement to racial hatred.
The report said: "The biggest companies have been repeatedly urged by Governments, police forces, community leaders and the public to clean up their act, and to respond quickly and proactively to identify and remove illegal content. They have repeatedly failed to do so.
"That should not be accepted any longer. Social media is too important to everyone - to communities, individuals, the economy and public life - to continue with such a lax approach to dangerous content that can wreck lives.
"And the major social media companies are big enough, rich enough and clever enough to sort this problem out - as they have proved they can do in relation to advertising or copyright.
"It is shameful that they have failed to use the same ingenuity to protect public safety and abide by the law as they have to protect their own income.''
In their strongly-worded report, the MPs accused Google, which owns YouTube, of making money from hatred because adverts appear alongside "inappropriate and unacceptable content, some of which were created by terrorist organisations" - with the creators of the extreme content also getting a share of the revenue.
Although some advertisers had withdrawn their business from the site and Google might suffer financially as a result, "the most salient fact is that one of the world's largest companies has profited from hatred and has allowed itself to be a platform from which extremists have generated revenue".
The MPs said it was "unacceptable" that social media companies relied on users to report content, claiming the firms were "outsourcing" the role "at zero expense" while expecting the police - funded by the taxpayer - to bear the costs of keeping their platforms and reputations "clean of extremism".
They suggested a model similar to that used in football, where clubs are obliged to pay for policing around their stadiums on match days, urging ministers to consult on requiring social media firms to contribute to the cost of the Metropolitan Police's Counter Terrorism Internet Referral Unit (CTIRU).
They also suggested considering "meaningful fines" for social media companies which fail to remove illegal content within a strict timeframe.
The committee's Labour chairwoman, Yvette Cooper, said: "Social media companies' failure to deal with illegal and dangerous material online is a disgrace.
"They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse. Yet repeatedly, they have failed to do so. It is shameful."
Home Secretary Amber Rudd said she expects to see social media companies "take early and effective action" and promised to study the committee's recommendations.
"We have made it very clear that we will not tolerate the internet being used as a place for terrorists to promote their vile views, or use social media platforms to weaponise the most vulnerable people in our communities,'' she said.
"Last month I convened a meeting with the social media companies to ask them to go further in making sure this kind of harmful material is not available on their platforms, and an industry-led forum has now been set up to more robustly address this.
"We will continue to push the internet companies to make sure they deliver on their commitments to further develop technical tools to identify and remove terrorist propaganda and to help smaller companies to build their capabilities. I expect to see early and effective action.''
NSPCC chief executive Peter Wanless said whoever wins the June 8 general election will need to introduce regulation of social networks to protect children online: "Online safety is one of the biggest risks facing children and young people today and one which government needs to tackle head on.
"The suffering experienced by children - often with devastating consequences for them and their families - shows that relying on voluntary regulations developed by internet companies is not enough."