Social media giants should “do a better job” to protect users from online hate speech, British MPs have said.
Executives from Facebook, Twitter and Google were asked by the Home Affairs select committee why they did not police their content more effectively, given the billions they made.
They were told they had a “terrible reputation” for dealing with problems.
The firms said they worked hard to make sure freedom of expression was protected within the law.
‘Money out of hate’
Labour MP Chuka Umunna focused his questioning on Google-owned YouTube, which he accused of making money from “videos peddling hate” on its platform.
A recent investigation by the Times found adverts were appearing alongside content from supporters of extremist groups, earning the people who posted the videos around £6 per 1,000 views, as well as generating revenue for the company.
Mr Umunna said: “Your operating profit in 2016 was $30.4bn.
“Now, there are not many business activities that somebody openly would have to come and admit… that they are making money and people who use their platform are making money out of hate.
“You, as an outfit, are not working nearly hard enough to deal with this.”
Peter Barron, vice president of communications and public affairs at Google Europe, told the committee the cash made from the videos in question was “very small amounts”, but added that the firm was “working very hard in this area” to stop it happening again.
Fellow committee member David Winnick said, when he heard Mr Barron’s answer, “the thought that came into my mind was the thought of commercial prostitution that you are engaged in,” adding: “I think that is a good and apt description.”
Yvette Cooper, the committee's chairwoman, turned her attention to Twitter.
Ms Cooper said she had personally reported a user who had tweeted a “series of racist, vile and violent attacks” against political figures such as German Chancellor Angela Merkel and London Mayor Sadiq Khan, but the user had not been removed.
Nick Pickles, head of public policy and government for Twitter in the UK, said the company acknowledged it was “not doing a good enough job” at responding to reports from users.
“We don’t communicate with the users enough when they report something, we don’t keep people updated enough and we don’t communicate back enough when we do take action,” he said.
“I am sorry to hear those reports had not been looked at. We would have expected them to have been looked at certainly by the end of today, particularly for violent threats.”
When the BBC checked the account after the committee session, it had been suspended.