Social media firms are “shamefully far” from tackling illegal and dangerous content, says a parliamentary report.
Hate speech, terror recruitment videos and sexual images of children all took too long to be removed, said the Home Affairs Select Committee report.
It called for a review of UK laws and stronger enforcement around illegal material.
And the government should consider making the sites pay to help police what people post, it said.
The cross-party committee took evidence from Facebook, Twitter and Google, the parent company of YouTube, for its report.
It said they had made efforts to tackle abuse and extremism on their platforms, but “nowhere near enough is being done”.
The committee said it had found “repeated examples of social media companies failing to remove illegal content when asked to do so”, including terrorist recruitment material, promotion of sexual abuse of children and incitement to racial hatred.
It said the largest firms were “big enough, rich enough and clever enough” to sort the problem out, and that it was “shameful” that they had failed to use the same ingenuity to protect public safety as they had to protect their own income.
The MPs said it was “unacceptable” that social media companies relied on users to report content, saying they were “outsourcing” the role “at zero expense”.
Yet the companies expected the police – funded by the taxpayer – to bear the costs of keeping them clean of extremism.
The report’s recommendations include:
- The government should consult on requiring social media firms to contribute to the cost of the police’s counter-terrorism internet referral unit
- It should also consult on “meaningful fines” for companies which failed to remove illegal content within a strict timeframe, highlighting proposals in Germany which could see firms fined up to €50m (£44m) and individual executives €5m
- Social media companies should urgently review their community standards and how they are being interpreted and implemented
“Social media companies’ failure to deal with illegal and dangerous material online is a disgrace,” said committee chairwoman Yvette Cooper.
“They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse.
“Yet repeatedly, they have failed to do so. It is shameful.”
Ms Cooper said the committee’s inquiry into hate crime more broadly was curtailed when the general election was called and their recommendations had to be limited to dealing with social media companies and online hate.
Home Secretary Amber Rudd said she expected to see social media companies take “early and effective action” and promised to study the committee’s recommendations.
Facebook, Twitter and Google did not respond to a BBC request for comment on the committee’s findings.
The firms had previously told the committee that they worked hard to make sure freedom of expression was protected within the law.
Child protection fines
Last week, the NSPCC called for fines for social networks which failed to protect children.
NSPCC chief executive Peter Wanless said social media sites should face penalties if children saw inappropriate material.
He also said the government should consider age-rating sites in the same way as the British Board of Film Classification rates films.
Internet companies’ voluntary regulations on child protection were “not up to scratch”, he said.
“Online safety is one of the biggest risks facing children and young people today and one which the government of the day needs to tackle head on,” he added.