Google will dedicate more than 10,000 staff to rooting out violent extremist content on YouTube in 2018, the video sharing website’s chief has said.
Writing in the Daily Telegraph, Susan Wojcicki said some users were exploiting YouTube to “mislead, manipulate, harass or even harm”.
She said the website, owned by Google, had used machine-learning technology that could find extremist videos.
More than 150,000 of these videos have been removed since June, she said.
In March, the UK government suspended its advertising on YouTube, following concerns that its adverts were appearing next to inappropriate content.
And in a speech at the United Nations general assembly in September, UK Prime Minister Theresa May challenged tech firms to take down terrorist material in two hours.
The prime minister has repeatedly called for an end to the “safe spaces” she says terrorists enjoy online.
Ms Wojcicki said that staff had reviewed nearly two million videos for violent extremist content since June.
This work is helping to train the company’s machine-learning technology to identify similar videos, she said, enabling staff to remove nearly five times as many videos as they could previously.
She said the company was taking “aggressive action” on comments, using technology to help staff find and shut down hundreds of accounts and hundreds of thousands of comments.
And its teams “work closely with child safety organisations around the world to report predatory behaviour and accounts to the correct law enforcement agencies”.
Meanwhile, police in the UK have warned that sex offenders are increasingly using live online streaming platforms to exploit children.
Earlier this year, Google announced it would give a total of £1m ($1.3m) to fund projects that help counter extremism in the UK.
And, in June, YouTube announced four new steps it was taking to combat extremist content:
- Improving its use of machine learning to remove controversial videos
- Working with 15 new expert groups, including the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue
- Tougher treatment for videos that are not illegal but have been flagged by users as potential violations of its policies on hate speech and violent extremism
- Redirecting people who search for certain keywords towards a playlist of curated YouTube videos that directly confront and debunk violent extremist messages
Calum Chace, author of Surviving AI and The Economic Singularity, said machine learning was developing fast.
“People are often unduly cynical about the prospects for AI because they judge it by what is possible today,” he said.
“They forget that our machines are on an exponential growth curve: they get twice as powerful every 18 months or so. This means that we are just at the beginning of their story.
“Although YouTube’s automated systems are probably among the best in the world since it is a subsidiary of Google, they need human support. For now.”