YouTube removing online terrorism content faster, aided by machine learning
Nearly a month after YouTube said it was going to combat online terrorist content on its platform, the video-sharing site said the process is going well, thanks to machines and humans alike.
In an August 1 blog post, the Google-owned site said its multi-pronged approach is being aided by machine learning, with computers detecting and removing the content faster.
"Our machine learning systems are faster and more effective than ever before," YouTube said in the post. "Over 75 percent of the videos we've removed for violent extremism over the past month were taken down before receiving a single human flag."
YouTube added that over the past month, machine learning has helped it more than double "both the number of videos we've removed for violent extremism, as well as the rate at which we've taken this kind of content down."
Over 400 hours of content are uploaded every minute to the platform, which has more than 1.5 billion logged-in users, according to YouTube CEO Susan Wojcicki.
In addition to using computers, YouTube is relying on human experts through its Trusted Flagger program. It has added 15 non-governmental organizations to the program, including the Anti-Defamation League, the No Hate Speech Movement and the Institute for Strategic Dialogue.
YouTube is also applying tougher standards to videos that users have flagged as potential hate speech or violent extremism but that may not actually be illegal.
"If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state," YouTube said.
The push comes after several terrorist attacks around the globe, including in the U.K. In June, YouTube said it would redirect people searching for extremist content to videos that confront and discredit those topics via the Redirect Method, developed by another team at Google and its parent company, Alphabet.
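At its core, the Redirect Method intercepts certain searches and surfaces curated counter-narrative content instead of what the user was looking for. The sketch below is a minimal illustration of that idea; the query list, playlist ID and function are invented for the example and do not reflect Google's implementation.

```python
# Conceptual sketch of the Redirect Method's core idea; the query list,
# playlist ID, and function name are invented for illustration.
COUNTER_NARRATIVE_PLAYLIST = "PL_counter_narratives"  # hypothetical playlist ID

# Hypothetical curated list of search terms associated with extremist recruiting.
FLAGGED_QUERIES = {"join isis", "isis recruitment video"}

def handle_search(query: str, default_results: list[str]) -> list[str]:
    """Return counter-narrative content for flagged queries, normal results otherwise."""
    if query.lower().strip() in FLAGGED_QUERIES:
        return [COUNTER_NARRATIVE_PLAYLIST]
    return default_results
```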
Despite the progress, YouTube said more needs to be done to combat extremist content that lives on its site.
"With the help of new machine learning technology, deep partnerships, ongoing collaborations with other companies through the Global Internet Forum, and our vigilant community we are confident we can continue to make progress against this ever-changing threat," YouTube wrote. "We look forward to sharing more with you in the months ahead."
Follow Chris Ciaccia on Twitter @chris_ciaccia