YouTube redirecting potential ISIS recruits to anti-terrorist content
Those searching for terrorist propaganda on YouTube are going to have a pretty hard time finding it.
In a Thursday blog post, YouTube said it's rolling out a new anti-terrorism feature that will redirect people searching for violent extremist content to videos that confront and discredit extremist messaging and mythology.
Now, "when people search for certain keywords on YouTube, we will display a playlist of videos debunking violent extremist recruiting narratives," the team wrote.
This approach, called the Redirect Method, was developed by Jigsaw, the tech incubator at Google parent company Alphabet, in collaboration with Moonshot CVE, a company that helps clients respond to violent extremism. Jigsaw, formed in 2012, "builds technology to tackle some of the toughest global security challenges facing the world today."
Google in June announced plans to increase its use of technology to help identify extremist and terrorism-related videos. At the time, the company said it planned to step up its efforts to redirect potential ISIS recruits to anti-terrorist content, and now it's making good on that promise.
YouTube on Thursday said it hopes the Redirect Method will help "change minds of people at risk of being radicalized."
Over the "coming weeks," YouTube plans to build on this effort by expanding the feature to "a wider set of search queries in other languages beyond English." The team also plans to use machine learning technology to "dynamically update the search query terms," work with expert NGOs to develop new anti-terrorism video content, and expand this feature to Europe.
"As we develop this model of the Redirect Method on YouTube, we'll measure success by how much this content is engaged," they wrote. "Stay tuned for more."
These changes come after the British government and several other big advertisers recently pulled their ads from YouTube because they had appeared alongside videos containing extremist, homophobic, or racist content.
This article originally appeared on PCMag.com.