AI model can accurately diagnose and triage health conditions, without introducing racial and ethnic biases

Comparison of GPT-4's diagnostic and triage accuracy with that of physicians; the results showed no significant difference between the two. Credit: JMIR Medical Education (2023). DOI: 10.2196/47532

GPT-4, a conversational artificial intelligence (AI), can diagnose and triage health conditions with accuracy comparable to that of board-certified physicians, and its performance does not vary by patient race and ethnicity.

While GPT-4, a conversational artificial intelligence, "learns" from information on the internet, the accuracy of this form of AI for diagnosis and triage, and whether its recommendations include racial and ethnic biases possibly gleaned from that information, had not been investigated even as the technology's use in health care settings has grown in recent years.

The researchers compared GPT-4 with three board-certified physicians on 45 typical clinical vignettes, evaluating how accurately each provided the most likely diagnosis and decided which of the triage levels—emergency, non-emergency, or self-care—was most appropriate.

The study has some limitations. While based on real-world cases, the clinical vignettes provided only summary information for diagnosis, which may not reflect actual clinical encounters, where clinicians typically obtain more detailed information from patients. In addition, GPT-4's responses may depend on how the queries are worded, and GPT-4 may have learned from the clinical vignettes used in this study. Also, the findings may not be applicable to other conversational AI systems.

Health systems can use the findings to guide the introduction of conversational AI to make patient diagnosis and triage more efficient.

"The findings from our study should be reassuring for patients because they indicate that conversational AI models like GPT-4 show promise in providing accurate medical diagnoses without introducing racial and ethnic biases," said senior author Dr. Yusuke Tsugawa, associate professor of medicine in the division of general internal medicine and health services research at the David Geffen School of Medicine at UCLA.

"However, it is also important for us to continuously monitor the performance and potential biases of these models as they may change over time depending on the information fed to them."

The study is published in JMIR Medical Education.

More information: Naoki Ito et al, The Accuracy and Potential Racial and Ethnic Biases of GPT-4 in the Diagnosis and Triage of Health Conditions: Evaluation Study, JMIR Medical Education (2023). DOI: 10.2196/47532

Citation: AI model can accurately diagnose and triage health conditions, without introducing racial and ethnic biases (2023, November 9) retrieved 3 March 2024 from
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
