How AI could transform the future of the psychology profession

Credit: Pixabay/CC0 Public Domain

Psychologists are discovering the benefits of emerging artificial intelligence (AI) applications for mental health—and the ethical considerations that they present.

By now, most people will have heard of the term "digital mental health." Promising technological applications are enhancing—but crucially, not replacing—conventional mental health assessment and support services. With the advent of digital approaches to mental health, the application of AI to this domain is also being explored.

This new generation of digital tools can be divided into two forms. First, there are digital mental health interventions delivered through our devices: chatbots, apps and even virtual reality offering more personalized and intelligent assistance. Second, AI can be used by psychologists themselves to help facilitate and inform their practice.

Understanding digital phenotyping

By examining a client's digital footprint from their smartphone and wearable devices, experts can gain insights into their behavior, such as their movement patterns and where they spend most of their time.

This method is known as "digital phenotyping," and researchers are using it to understand individual behaviors that may assist in improving diagnostic processes and individual treatment plans for a range of problems, including addiction, mood disorders, sleep disorders and suicide prevention.

Digital phenotyping uses these data in combination with traditional emotional and behavioral observations to detect patterns and triggers that may not be readily apparent in traditional talk therapy, says Dr. Simon D'Alfonso, a senior lecturer in the School of Computing and Information Systems at The University of Melbourne.

"One simple example is getting somebody's raw geolocation coordinates, such as their GPS data, to construct [an idea] about things like where they are spending their time and what types of places these are. Are people suddenly much more sedentary than usual? That [information] could be an indicator of an emotional slump," says Dr. D'Alfonso, who will be speaking as part of the APS AI and Psychology Symposium on 13 October.

The psychologist can then use this as a starting point to talk through the client's circumstances. Were they placed in a stressful or unusual situation that elicited a particular emotional response? Or does their lack of movement indicate that they are in a depressive state?

If we add further information, such as measurements of communication levels via social app usage and phone/SMS record counts, we can begin to paint a picture of the way a client is connecting and interacting with the world, says Dr. D'Alfonso.

"The gist is, our interaction with an array of digital devices and data technologies is leaving behind data exhausts, or a digital footprint. So can we mine those to get an understanding of the person that could then be of use in a psychological context?"

Natural language processing

Last year, a research team led by scientists at the University of Manchester found that there has been an increase in mental illness detection research that utilizes natural language processing (NLP)—the machine learning technology that gives computers the ability to interpret human language.

Psychologists are already employing basic NLP tools, such as AI transcription, to gain new insights into psychology sessions.

"You've heard the phrase language is the window into the mind. Well, we can get a transcript from the session and mine psychotherapy dialogical exchange from two angles. [The first is] to get some insight into the client—the way they're using language, and things like that," says Dr. D'Alfonso.

"But you could also analyze what the [psychologist] is doing. Are they using the right words? Is their practice combined with the techniques that they're purportedly supposed to be using? [We can monitor] the fidelity and quality of the techniques that the psychologist is using."

Ethical considerations

AI raises a number of serious ethical concerns, and psychologists need to be aware of the risks and considerations involved. In August, the Australian Psychological Society (APS) issued a response to the Federal Government's Safe and Responsible AI in Australia discussion paper.

"We believe that AI presents considerable benefits for humanity, including improved health outcomes, but that safeguarding mechanisms must keep up with AI advancements to keep individuals and society safe," says APS CEO Dr. Zena Burgess.

Psychologists need to be at the forefront of issues such as informed consent for data collection. With data as sensitive and valuable as those concerning our mental health and well-being, privacy and transparency should always be the foremost considerations.

Dr. D'Alfonso says there are a few practical steps that psychologists can take to ensure data security.

"Make sure data is transmitted through secure channels [and] if it's relying on a central server, make sure it's as secure and encrypted as possible," he says.

Beyond that, working within a defined set of principles will help protect clients and their data.

"Only use as much data as needed," he says, "And discard the rest."

Incorporating AI into a psychological practice will also require a rethink of how informed consent is obtained from clients. Dr. D'Alfonso says it may no longer be enough to "have an initial consent from the client."

Clients may need to give their consent periodically, both to the use of AI technology and to the ongoing collection and storage of their confidential data.

Looking ahead

Although AI-powered psychology is still at an exploratory stage, practitioners should be aware of the possibilities for digital mental health and the new considerations for their practice. This is particularly the case as younger generations grow up with these tools as part of their everyday lives, bringing a high degree of digital savviness as well as new expectations of care.

"As a form of triage, AI may help address the significant burden on practitioners and improve the shortage of mental health professionals. In saying this, APS is a strong advocate for regulation of the technology to ensure client safety," says Dr. Burgess.

Most importantly, we caution that AI interventions are there to complement professionals, never to replace these incredibly important human interactions.

Provided by Australian Psychological Society