YouTube struggled to remove New Zealand shooting videos. Here's why

Not long after a gunman wearing a body-mounted camera killed at least 50 worshipers inside two New Zealand mosques, recording the carnage as it unfolded, video of the attack spread rapidly on YouTube. It was repeatedly taken down and posted again, with uploaders able to work around the video platform's artificial intelligence detection tools.

A team of YouTube executives worked through the night to manage the crisis, identifying and removing tens of thousands of videos that were uploaded as quickly as one per second in the hours after the massacre, according to The Washington Post.

The massive and highly profitable Google-owned video platform, which sees 500 hours of content added every minute, was reportedly forced to take unprecedented steps in response to the shooting, including temporarily disabling certain search functions and bypassing human moderators so that videos flagged by automated systems could be removed immediately.

"This was a tragedy that was almost designed for the purpose of going viral," Neal Mohan, YouTube's chief privacy officer, told The Washington Post, adding that the volume of videos was much larger and came much faster than in previous similar incidents.

According to the Post, many uploaders made tiny modifications to the video, such as adding watermarks or logos, to evade YouTube's detection systems; others altered the footage so the people in it looked like video game animations.

The San Bruno, Calif., company, which has come under fire for not moving quickly enough to take down terrorist content and combat conspiracy theories, most recently was forced to remove hundreds of channels and disable comments on almost all videos involving minors because the comment sections were being exploited by child predators.

YouTube was not the only company that struggled in the wake of the New Zealand shooting.

Facebook announced that it removed 1.5 million videos depicting images from the incident in the first 24 hours after it happened, with 1.2 million of those blocked by software at the moment of upload. Still, that means roughly 300,000 copies made it onto the platform and could be seen by users before they were taken down.

"Out of respect for the people affected by this tragedy and the concerns of local authorities, we're also removing all edited versions of the video that do not show graphic content," Mia Garlick, a spokesperson for Facebook in New Zealand, said in a statement.

This frame from the video that was live-streamed Friday, March 15, 2019, shows a gunman, who used the name Brenton Tarrant on social media, in a car before the mosque shootings in Christchurch, New Zealand. (AP)

Besides the livestreamed video of the attack itself, the suspected gunman also apparently uploaded a 74-page manifesto that detailed his plans and railed against Muslims and immigrants.

Experts on online radicalization and terrorist content said that social media companies like Facebook, Twitter and Google must do more to combat such material.

"Reports say Facebook needed 17 minutes to remove the livestream. ... The technology to prevent this is available. Social media firms have made a decision not to invest in adopting it," Counter Extremism Project Director David Ibsen said in a statement.

Another expert said the underlying issue is that AI detection systems are still far from perfect.

"In a way, they're kind of caught in a bind when something like this happens because they need to explain that their AI is really fallible," Pedro Domingos, a professor of computer science at the University of Washington, told the Post. "The AI is really not entirely up to the job."

YouTube, which announced it was hiring 10,000 content moderators across all of Google to review problematic videos and other content flagged by users or AI, appears to have been outmatched in the days after the New Zealand shooting.

Mourners place flowers as they pay their respects at a makeshift memorial near the Masjid Al Noor mosque in Christchurch, New Zealand, Sunday, March 17, 2019, where one of the two mass shootings occurred. (AP)

According to the Post, engineers immediately "hashed" the video, creating a digital fingerprint that lets matching software spot carbon-copy uploads and delete them automatically. The same hashing technique is used to block re-uploads of copyrighted material and child sexual abuse imagery.
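The Post's description doesn't specify YouTube's actual matching pipeline, but the basic idea of hash-based duplicate detection can be sketched in a few lines. The toy example below uses a plain cryptographic hash (SHA-256) purely for illustration; the function names and data are made up. It shows both why fingerprinting catches byte-for-byte copies and why even a single altered byte produces a completely different digest:

```python
import hashlib

# Digests of files already identified as copies of the banned footage.
known_hashes = set()

def fingerprint(video_bytes: bytes) -> str:
    # A cryptographic hash maps a file to a fixed-length digest;
    # identical bytes always produce an identical digest.
    return hashlib.sha256(video_bytes).hexdigest()

def is_known_copy(video_bytes: bytes) -> bool:
    return fingerprint(video_bytes) in known_hashes

original = bytes(range(256))  # stand-in for the raw bytes of a flagged video
known_hashes.add(fingerprint(original))

print(is_known_copy(original))            # True:  a carbon copy is caught
print(is_known_copy(original + b"\x00"))  # False: one extra byte evades the match
```

That brittleness is exactly what uploaders exploited: adding a logo or re-encoding the file changes the underlying bytes, so the stored digest no longer matches.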

In this case, however, the hashing system was no match for the tens of thousands of permutations being uploaded in real time, Mohan told the Post.
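The standard way to tolerate such edits is perceptual hashing, which fingerprints what a frame looks like rather than its exact bytes. The report doesn't say which algorithms YouTube uses, so the toy "average hash" below, run on made-up pixel data, is only a sketch of the general technique: small edits flip only a few bits of the hash, so a lightly watermarked frame can still be matched within a similarity threshold.

```python
def average_hash(pixels):
    """Perceptual 'average hash' of an 8x8 grayscale thumbnail (64 values, 0-255)."""
    mean = sum(pixels) / len(pixels)
    # One bit per pixel: is it brighter than the frame's average?
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming_distance(a, b):
    # Number of differing bits between two 64-bit hashes.
    return bin(a ^ b).count("1")

# Synthetic brightness values standing in for a downscaled video frame.
frame = [(i * 37) % 256 for i in range(64)]

# The same frame with a small "watermark": four pixels brightened.
marked = frame[:]
for i in range(4):
    marked[i] = min(255, marked[i] + 40)

dist = hamming_distance(average_hash(frame), average_hash(marked))
print(dist)        # small: only a few bits flip
print(dist <= 10)  # True: still close enough to count as a match
```

The trade-off is a looser notion of "the same video," which is one reason a distance threshold, and ultimately human review, still matters.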

"Like any piece of machine learning software, our matching technology continues to get better, but frankly, it's a work in progress," Mohan said.

YouTube has not released statistics about exactly how many videos its systems removed in the wake of the shooting.