
Can AI be used to help in suicide prevention? Exploring technology in the mental health space

Credit: Pixabay/CC0 Public Domain

From diagnosing cancers to the development of new drugs, artificial intelligence is helping reshape health care in transformative ways.

When it comes to mental health care, AI tools have the potential to help treat more people in a sector that has struggled to find enough workers to meet demand, says Annika Marie Schoene, a research scientist at Northeastern University's Institute for Experiential AI.

As a researcher at the institute working in its Responsible AI team and AI+Health group, Schoene's focus is to understand how companies are developing these tools, their shortcomings and their ethical implications.

Schoene's area of focus has been AI's use for suicide prevention. Users and developers of these technologies include social media companies, government agencies and clinics, and startups in Silicon Valley and other tech hubs, she says.

One of the best-known users of AI in suicide prevention is Meta, the parent company of Instagram and Facebook, she says. It uses machine learning to help detect posts containing language or images that might indicate someone intends to harm themselves.

"This technology uses pattern-recognition signals, such as phrases and comments of concern, to identify possible distress," the company says. Meta also uses AI to help content reviewers track and prioritize reported cases to provide them additional support from trained and emergency services.

However, as useful as these technologies can be, it's important to recognize their limits and shortcomings, Schoene says.

Earlier this year, she presented findings at the Society for Affective Science Conference (SAS 2024) that highlighted one big limitation of the AI models—they struggle to detect emotion.

For the study, the researchers—including Tomo Lazovich and Resmi Ramachandranpillai from the Institute for Experiential AI—wanted to see how AI could analyze a dataset of nearly 4,000 tweets that had been annotated by humans for potential suicide-related content.

Each piece of content had been assigned a specific emotion, such as anger, disgust, fear, joy, neutral, sadness and surprise.

To test the capabilities of current AI technologies, the researchers had three language models—a type of machine learning-based AI—assign an emotion to each tweet, then compared the models' annotations to the human ones.
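The comparison at the heart of the study can be illustrated with a short sketch. The seven emotion categories come from the article; the agreement metric, function names, and toy labels below are assumptions for illustration, not the study's actual evaluation pipeline.

```python
# Minimal sketch of comparing model emotion labels against human annotations.
# Categories come from the article; the data and metric are toy examples.
from collections import Counter

EMOTIONS = ["anger", "disgust", "fear", "joy", "neutral", "sadness", "surprise"]

def agreement(human_labels: list[str], model_labels: list[str]) -> float:
    """Fraction of tweets where the model's label matches the human one."""
    assert len(human_labels) == len(model_labels)
    matches = sum(h == m for h, m in zip(human_labels, model_labels))
    return matches / len(human_labels)

def label_distribution(labels: list[str]) -> Counter:
    """How often each emotion category is assigned, to reveal skew."""
    return Counter(labels)

# Toy data: humans lean toward 'neutral', this model over-predicts 'sadness'.
human = ["neutral", "neutral", "sadness", "fear", "neutral"]
model = ["sadness", "neutral", "sadness", "sadness", "joy"]
assert set(human + model) <= set(EMOTIONS)

print(agreement(human, model))    # 0.4
print(label_distribution(model))  # Counter({'sadness': 3, 'neutral': 1, 'joy': 1})
```

Comparing label distributions in this way is one simple means of spotting the kind of category bias the researchers observed.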

The results were mixed across the board. The human annotators had found that the majority of the tweets fell into the neutral emotional category, but that wasn't the case for any of the three AI models. In fact, it was challenging to find a consistent pattern among the models, though the researchers found that each appeared biased toward certain emotional categories.

"This led us to consider that language models for emotion predictions may not be capable of finding finer distinctions and granular suicide-related content," Schoene says.

"Our findings also led us to question how useful and credible such techniques are when you want to use emotion features in suicide-detection tasks, especially given that this is a really high-risk scenario and a high positive rate could cause real harm," she adds.

Current technologies alone are not enough to predict suicide, Schoene says, underscoring the importance of trained medical professionals.

But that's not to say AI cannot be used at all, she says. It can help health care professionals in "understanding the causes and factors of suicidal ideation and intent," and it can analyze large amounts of data at once.

"When you want information extraction or information summarization, AI can be very useful. No one would ever doubt that," she says. "The important part here is that the ultimate decision-making should never be left to the algorithm."

This story is republished courtesy of Northeastern Global News news.northeastern.edu.

