Facebook Added a New AI Feature to Detect Suicidal Messages

Photo: chamsitr/Shutterstock

Social media has been linked to a slew of mental health problems, including depression and anxiety. But it also makes mental health resources more accessible and can be used to spread awareness. (See Instagram's recent mental health awareness campaign.) And one of Facebook's latest efforts could save lives.

Mark Zuckerberg announced (in a Facebook post, naturally) that the company will now use an AI tool that can identify when a user sends a message with suicidal content and connect that user with a first responder. The tool can pick up on patterns such as repeated "are you okay?" comments, according to the post.
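Facebook hasn't described the underlying model, but the comment-pattern signal can be illustrated with a toy sketch in Python. The phrase list, threshold, and function names below are assumptions made for illustration, not Facebook's actual system.

# Toy illustration only: flag a post for human review when several comments
# contain concern phrases such as "are you okay?". The phrase list and the
# threshold are invented for this sketch, not drawn from Facebook's system.

CONCERN_PHRASES = [
    "are you ok",
    "do you need help",
    "i'm worried about you",
]

def count_concern_comments(comments):
    """Count how many comments contain at least one concern phrase."""
    count = 0
    for comment in comments:
        text = comment.lower()
        if any(phrase in text for phrase in CONCERN_PHRASES):
            count += 1
    return count

def should_flag_for_review(comments, threshold=3):
    """Flag the post when enough commenters express concern."""
    return count_concern_comments(comments) >= threshold

if __name__ == "__main__":
    comments = [
        "Are you okay? Please message me.",
        "thinking of you",
        "are you ok?? call me",
        "I'm worried about you, please reach out",
    ]
    print(should_flag_for_review(comments))  # True: three comments match

In practice, a system like the one Zuckerberg describes would rely on trained classifiers over the post itself as well as comment patterns, with flagged content routed to reviewers and first responders; the hard-coded phrase match above is only meant to make the "repeated concern comments" signal concrete.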

The same type of technology could eventually be used to pick up on other issues, Zuckerberg wrote. "In the future, AI will be able to understand more of the subtle nuances of language, and will be able to identify different issues beyond suicide as well, including quickly spotting more kinds of bullying and hate." (Instagram recently introduced features that make it easier to block hateful comments.) More than 44,000 people a year die by suicide in the U.S. alone, so connecting people with help is much needed.

AI can be helpful, despite its big-brother connotations, Zuckerberg insisted in a comment on his post. "The reality is, like any technology, AI can be used for good and bad," he wrote. "It's our responsibility as builders to amplify the good uses of any technology and make the bad uses harder so the overall effect on society is positive."
