Facebook imposes new restrictions on livestreaming to prevent abuse following Christchurch attacks
Two months after the deadly white nationalist terrorist attacks in two New Zealand mosques, Facebook has announced new restrictions on who can livestream video on the social network.
According to a Tuesday blog post, the Mark Zuckerberg-led company will begin applying a "one strike" policy to Facebook Live, under which users who violate the platform's community standards even once will be barred from livestreaming for set periods of time.
The policy applies to content posted elsewhere on the platform, not just to livestreams. For example, a user who posted a link to a terrorist website would be banned from livestreaming.
"Following the horrific terrorist attacks in New Zealand, we’ve been reviewing what more we can do to limit our services from being used to cause harm or spread hate," Facebook VP of Integrity Guy Rosen said in the blog post.
The new restrictions also cover violations of the social network's Dangerous Individuals and Organizations policy, which recently led to a range of anti-Semitic and far-right figures being banned from Facebook and Instagram.
The social network, which is used by more than 2.3 billion people per month, also announced $7.5 million in new partnerships with researchers and universities to boost Facebook's "image and video analysis technology."
The New Zealand gunman's livestreaming of his attacks sparked a backlash against Big Tech firms, including Facebook and Google, which struggled to stop the horrifying footage from spreading online.