Facebook is using artificial intelligence to help prevent suicides

File photo: The logo of Facebook is pictured on a window at new Facebook Innovation Hub during a media tour in Berlin, Germany, February 24, 2016. (REUTERS/Fabrizio Bensch)

Facebook is turning to artificial intelligence to detect if someone might be contemplating suicide.

The social network already has mechanisms for flagging posts from people thinking about harming themselves. The new feature is intended to detect such posts before anyone reports them.

"We are starting to roll out artificial intelligence outside the US to help identify when someone might be expressing thoughts of suicide, including on Facebook Live," Guy Rosen, VP of Product Management, wrote in a blog post. Rosen added that the feature will eventually "be available worldwide, except the EU."


The service will scan posts and live video using a technique called "pattern recognition." For example, comments from friends such as "Are you OK?" can signal that the person who posted may be having suicidal thoughts.
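The blog post does not describe how the pattern recognition works under the hood. As an illustration only, this kind of detection can be thought of as scoring a post's text against labeled examples of concerning and non-concerning language. The sketch below, in Python with scikit-learn, is a hypothetical stand-in: the training phrases, labels, and choice of classifier are assumptions, not Facebook's actual model or data.

# Illustrative sketch: score a post's text against labeled examples.
# Training data, labels, and model choice are placeholders, not Facebook's system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples (1 = potentially at risk, 0 = not)
train_texts = [
    "I can't do this anymore, everything feels hopeless",
    "are you ok? please talk to me",
    "had a great day at the beach with friends",
    "can't wait for the game tonight",
]
train_labels = [1, 1, 0, 0]

# TF-IDF features feeding a logistic regression classifier
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

def risk_score(post_text: str) -> float:
    """Return the model's estimated probability that a post may indicate risk."""
    return float(model.predict_proba([post_text])[0][1])

print(risk_score("I just feel like giving up"))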

Rosen wrote that Facebook's Community Operations team includes "thousands of people around the world who review reports about content on Facebook," including a dedicated group with specific training in suicide and self-harm.

The company is also using AI to prioritize the order in which flagged posts are sent to its human reviewers, so that they can quickly alert local authorities when needed.
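As a rough sketch of what score-based triage can look like, the snippet below orders flagged posts so the items a model considers most urgent are reviewed first. The post IDs and risk scores are invented for illustration and do not reflect Facebook's internal systems.

# Illustrative triage: pop the highest-risk flagged posts first.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class FlaggedPost:
    priority: float                    # negated risk score, so the heap pops highest risk first
    post_id: str = field(compare=False)

def build_review_queue(scored_posts):
    """scored_posts: iterable of (post_id, risk_score) pairs."""
    heap = [FlaggedPost(-score, post_id) for post_id, score in scored_posts]
    heapq.heapify(heap)
    return heap

queue = build_review_queue([("a1", 0.42), ("b7", 0.91), ("c3", 0.08)])
while queue:
    item = heapq.heappop(queue)
    print(item.post_id, -item.priority)    # b7 first, then a1, then c3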

This is not the first time Facebook has used AI to help people with physical and mental health issues.

In April 2016, Facebook introduced automatic alternative text, which lets a description of a photo be read aloud. For example, a user could now hear, “Image may contain three people, smiling, outdoors.”

That technology is built on Facebook’s object recognition technology, "which is based on a neural network that has billions of parameters and is trained with millions of examples," Facebook said at the time.
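To make the idea concrete, the sketch below shows one way recognized concepts could be assembled into the kind of "Image may contain" description quoted above. The recognizer here is mocked with fixed confidence scores; the concept names, numbers, and threshold are placeholders, not Facebook's actual pipeline, which runs a trained neural network over the image.

# Illustrative sketch: turn detected concepts into an alt-text string.
from typing import Dict

def detect_concepts(image_path: str) -> Dict[str, float]:
    """Placeholder for an object-recognition model returning concept -> confidence."""
    return {"people": 0.97, "smiling": 0.88, "outdoors": 0.81, "car": 0.12}

def generate_alt_text(image_path: str, threshold: float = 0.5) -> str:
    """Keep concepts above the confidence threshold, most confident first."""
    concepts = detect_concepts(image_path)
    kept = [name for name, conf in sorted(concepts.items(), key=lambda kv: kv[1], reverse=True)
            if conf >= threshold]
    return ("Image may contain: " + ", ".join(kept)) if kept else "Image"

print(generate_alt_text("photo.jpg"))   # Image may contain: people, smiling, outdoors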

The Associated Press contributed to this report.

Follow Chris Ciaccia on Twitter @Chris_Ciaccia
