Facebook really doesn’t want you to leave their platform. Really, these guys are super serious about keeping you liking, sharing, screeching madly at your screen as you attempt to correct yet another person who was wrong on the internet about something.
They want to keep you so badly that they’ve developed an AI program to identify when you’re at risk of suicide, and they’re notifying others of your suicide risk in the hope that you’ll be saved before you ever attempt to leave Facebook, er, commit suicide.
**Facebook is using AI to predict when users may be suicidal**
About a year ago, Facebook added technology that automatically flags posts with expressions of suicidal thoughts for the company’s human reviewers to analyze. And in November, Facebook showed proof that the new system had made an impact.
“Over the last month, we’ve worked with first responders on over 100 wellness checks based on reports we received via our proactive detection efforts,” the company said in a blog post at the time.
Facebook now says the enhanced program is flagging 20 times more cases of suicidal thoughts for content reviewers, and twice as many people are receiving Facebook's suicide prevention support materials. The company has been deploying the updated system in more languages and improving suicide prevention on Instagram, though the tools there are at an earlier stage of development.
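The pipeline the article describes, an automated model scores posts and routes high-risk ones to human reviewers, can be sketched in miniature. This is purely illustrative and is not Facebook's actual system: the phrase weights, the `risk_score` function, and the threshold are all invented stand-ins for a trained text classifier.

```python
# Illustrative sketch only -- NOT Facebook's real system. A production
# system would use a trained classifier, not a hand-written phrase list.

# Hypothetical phrase weights standing in for model scores.
RISK_PHRASES = {
    "want to end it": 0.9,
    "can't go on": 0.7,
    "goodbye forever": 0.6,
}

def risk_score(post: str) -> float:
    """Crude stand-in for a classifier: highest matching phrase weight."""
    text = post.lower()
    return max(
        (weight for phrase, weight in RISK_PHRASES.items() if phrase in text),
        default=0.0,
    )

def triage(posts: list[str], threshold: float = 0.5) -> list[str]:
    """Flag posts scoring above the threshold for human review."""
    return [p for p in posts if risk_score(p) >= threshold]

review_queue = triage([
    "nice weather today",
    "I just can't go on anymore",
])
```

The key design point the article highlights is that the model never acts alone: everything the scorer flags still lands in a queue for the company's human reviewers, who decide whether to send support materials or contact first responders.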
On Wednesday, Facebook provided more details on the underlying technology.
“We feel like it’s very important to get people help as quickly as we possibly can and to get as many people help as we can,” said Dan Muriello, a software engineer on Facebook’s compassion team, which was formed in 2015 and deals with topics like breakups and deaths.