
AI reliably detects emotions based on facial expressions in psychotherapeutic situations

Artificial intelligence as therapeutic support
Design for image processing. 1 = to reduce error, a region of interest (ROI) can be defined where a face is detected with high likelihood, and is expanded if face detection fails; 2 = faces are detected and transformed into 48 × 48-pixel grayscale pictures; 3 = the convolutional neural net was trained using the FER-2013 dataset. Credit: Psychopathology (2023). DOI: 10.1159/000534811
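The steps in the caption can be read as a simple face-to-emotion pipeline. The following is a minimal sketch of that kind of pipeline in Python, assuming OpenCV for face detection and a Keras CNN trained on FER-2013; the model file name, label order and helper function are illustrative placeholders, not the authors' actual code.

```python
# Minimal sketch of the captioned pipeline: detect a face (optionally within a ROI),
# convert it to a 48 x 48 grayscale crop, and classify it with a FER-2013-style CNN.
# The model file "fer2013_cnn.h5" and the label order are assumptions, not the study's code.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
model = load_model("fer2013_cnn.h5")  # hypothetical pretrained emotion classifier


def classify_frame(frame, roi=None):
    """Return (emotion_label, probabilities) for the first face found, or None."""
    # 1) Restrict the search to the region of interest if one is given.
    search = frame if roi is None else frame[roi[1]:roi[3], roi[0]:roi[2]]
    gray = cv2.cvtColor(search, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None  # the caller can expand the ROI and retry, as in the caption
    # 2) Crop the face and scale it to a 48 x 48 grayscale image.
    x, y, w, h = faces[0]
    face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)).astype("float32") / 255.0
    # 3) Run the CNN and return the most likely emotion.
    probs = model.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
    return EMOTIONS[int(np.argmax(probs))], probs
```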

The face reflects a person's emotional state. The interpretation of facial expressions, for example as part of psychotherapy or psychotherapeutic research, is an effective way of characterizing how a person is feeling at that particular moment. In the 1970s, psychologist Paul Ekman developed a standardized coding system to assign basic emotions such as happiness, disgust or sadness to a facial expression in an image or video sequence.

"Ekman's system is very widespread, and represents a standard in psychological emotion research," says Dr. Martin Steppan, psychologist at the Faculty of Psychology at the University of Basel.

But analyzing and interpreting the facial expressions recorded as part of research projects or psychotherapy is extremely time-consuming, which is why psychiatry specialists often fall back on less reliable, indirect methods such as skin conductance measurements, which can also serve as a measure of emotional arousal.

"We wanted to find out whether AI systems can reliably determine the emotional states of patients in ," says Martin Steppan, who developed the study together with emeritus Professor Klaus Schmeck, Dr. Ronan Zimmermann and Dr. Lukas Fürer from the UPK. The researchers published their findings in the journal Psychopathology.

No facial expression can escape AI

The researchers used freely available artificial neural networks that were trained to detect six basic emotions (happiness, surprise, anger, disgust, sadness and fear) using more than 30,000 facial photos. This AI system then analyzed video data from therapy sessions with a total of 23 patients with borderline personality pathology; the analysis was run at the Center for Scientific Computing at the University of Basel. The high-performance computer had to process more than 950 hours of video recordings for this study.
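At that scale, the task amounts to looping over video frames and recording an emotion label per sampled frame. Below is a hedged sketch of such a loop, reusing the hypothetical classify_frame() helper from the earlier sketch; the sampling interval is an arbitrary choice, not a detail reported in the study.

```python
# Illustrative loop that turns a therapy video into an emotion time series.
# classify_frame() is the hypothetical helper sketched earlier; sample_every is arbitrary.
import cv2

def emotion_time_series(video_path, sample_every=5):
    """Return (timestamp_in_seconds, emotion_label) pairs sampled from a video file."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0  # fall back to 25 fps if metadata is missing
    series, frame_idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % sample_every == 0:
            result = classify_frame(frame)
            if result is not None:
                label, _ = result
                series.append((frame_idx / fps, label))
        frame_idx += 1
    cap.release()
    return series
```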

The results were astonishing: Statistical comparisons between the analysis of three trained therapists and the AI system showed a remarkable level of agreement. The AI system assessed the facial expressions as reliably as a human but was also able to detect even the most fleeting of emotions within the millisecond range, such as a brief smile or expression of disgust.

Micro-expressions of this kind can be missed by therapists, or may only be perceived subconsciously. The AI system is therefore able to measure fleeting emotions with greater sensitivity than trained therapists.
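The article does not spell out which agreement statistics were used in the comparison with the three therapists, but one standard, chance-corrected way to quantify agreement between human raters and an automatic classifier is Cohen's kappa. A small illustration with made-up labels, not data from the study:

```python
# Illustration only: Cohen's kappa as a chance-corrected measure of rater-AI agreement.
# The label sequences below are invented placeholders, not data from the study.
from sklearn.metrics import cohen_kappa_score

therapist_labels = ["happiness", "sadness", "anger", "happiness", "disgust", "fear"]
ai_labels        = ["happiness", "sadness", "anger", "surprise", "disgust", "fear"]

kappa = cohen_kappa_score(therapist_labels, ai_labels)
print(f"Cohen's kappa: {kappa:.2f}")  # values near 1.0 indicate strong agreement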

Interpersonal communication is still key

The AI analysis also uncovered something rather unexpected. Patients who demonstrated emotional involvement and smiled at the start of a therapy session went on to cancel their psychotherapy less often than people who seemed emotionally uninvolved with their therapist. This "social" smiling could therefore be a good predictor of therapy success in a person with symptoms of borderline personality pathology.
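As a purely hypothetical illustration of how such a predictor might be modeled (the numbers below are invented, and the study's actual analysis may well differ), an early-session "smile fraction" could be related to later dropout with a simple logistic regression:

```python
# Invented example: relating an early-session "smile fraction" to therapy dropout.
# Numbers are placeholders; this is not the study's data or analysis.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Fraction of sampled frames labeled "happiness" early in the first session.
smile_fraction = np.array([[0.02], [0.05], [0.15], [0.20], [0.30], [0.01], [0.25], [0.08]])
dropped_out = np.array([1, 1, 0, 0, 0, 1, 0, 1])  # 1 = cancelled psychotherapy early

clf = LogisticRegression().fit(smile_fraction, dropped_out)
print(clf.predict_proba([[0.10]])[0, 1])  # estimated dropout probability at a 10% smile fraction
```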

"We were really surprised to find that relatively simple AI systems can allocate facial expressions to their emotional states so reliably," says Steppan.

AI could therefore become an important tool in therapy and research. AI systems could be used in the analysis of existing video recordings from research studies in order to detect emotionally relevant moments in a conversation more easily and more directly. This ability could also help support the supervision of psychotherapists.

"Nevertheless, therapeutic work is still primarily about , and remains a human domain," says Steppan. "At least for the time being."

More information: Martin Steppan et al, Machine Learning Facial Emotion Classifiers in Psychotherapy Research: A Proof-of-Concept Study, Psychopathology (2023). DOI: 10.1159/000534811

