Facebook Taps AI in Suicide Prevention Efforts

— November 29, 2017

Facebook has announced that it will roll out artificial intelligence (AI) “to help identify when someone might be expressing thoughts of suicide, including on Facebook Live.”

The AI suicide prevention tool will be available worldwide, except in the European Union (EU), according to an announcement by Facebook Vice President of Product Management Guy Rosen.

The tool will not be available in the EU because “General Data Protection Regulation privacy laws on profiling users based on sensitive information complicate the use of the tech,” a TechCrunch report said.

Response

The move appears to be a response to several high-profile suicides broadcast on Facebook Live, a feature that reports have described as the site's "most celebrated yet most troubling."

Back in May, Facebook CEO Mark Zuckerberg revealed that the company would do more to try to stop suicides and killings on Facebook Live.

“We’ve seen people hurting themselves and others on Facebook – either live or in video posted later. It’s heartbreaking, and I’ve been reflecting on how we can do better for our community.”

Community Operations Team

Zuckerberg said that Facebook would add 3,000 more people to its “community operations” team, which already had 4,500 people reviewing live videos and other content that users have reported.

The team will look not only for people at risk of hurting themselves but also for banned content such as hate speech and child exploitation.

Rosen pointed out that “Facebook is a place where friends and family are already connected, and we are able to help connect a person in distress with people who can support them.”

Additional Work

He also outlined the company’s other suicide prevention efforts:

  • Looking into pattern recognition to identify posts or Facebook Live clips in which a user has expressed suicidal thoughts, and to respond to reports more quickly (see the illustrative sketch below)
  • Improving the identification of appropriate responders
  • Tapping more reviewers from Facebook’s community operations team to look into reports of suicide or self-harm
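Facebook has not published the details of its detection model. As a rough, hypothetical illustration of what pattern recognition over post text can look like, the sketch below trains a toy classifier on a few made-up phrases and flags anything that scores above an arbitrary threshold for human review. The example data, the threshold, and the model choice are all assumptions for illustration, not Facebook’s actual approach.

```python
# Illustrative sketch only -- NOT Facebook's actual system.
# Trains a tiny text classifier that flags posts resembling
# expressions of distress, using made-up example data.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples (1 = concerning, 0 = not concerning).
train_posts = [
    "I can't take this anymore, I want it all to end",
    "Nobody would miss me if I were gone",
    "Feeling hopeless and alone tonight",
    "Had a great dinner with friends tonight",
    "Excited about my new job starting Monday",
    "Watching the game, what a finish!",
]
train_labels = [1, 1, 1, 0, 0, 0]

# TF-IDF features + logistic regression: a simple pattern-recognition baseline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_posts, train_labels)

# Score a new post; anything above the (arbitrary) threshold would be
# routed to human reviewers, never acted on automatically.
new_post = "I just want it all to end, I can't take this"
risk_score = model.predict_proba([new_post])[0][1]

if risk_score > 0.5:  # threshold chosen for illustration only
    print(f"Flag for human review (score={risk_score:.2f})")
else:
    print(f"No flag (score={risk_score:.2f})")
```

In any real deployment, a score like this would only prioritize content for trained human reviewers, who decide whether and how to reach out to the person in distress.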

What’s Next?

Facebook is rolling out its AI suicide prevention tool to help identify and support users who express suicidal thoughts on the site.

What can you say about Facebook’s move? Share your thoughts below.

Author: Jimmy Rodela
