Doctors more likely to use negative language describing Black and Hispanic patients in electronic health records
A new study of patients' electronic health records found that doctors were more likely to use negative words when describing visits with Black and Hispanic patients than with white patients, a pattern that the researchers say could contribute to bias and unequal treatment.
The study, titled "Examining Linguistic Differences in Electronic Health Records for Diverse Patients With Diabetes: Natural Language Processing Analysis" and published in JMIR Medical Informatics, analyzed the medical records of Black, white and Hispanic or Latino patients who were seen by 281 physicians in a large metropolitan area. The researchers were interested in whether doctors showed bias in their language choices when describing patients in post-visit reports.
"Previous studies have shown that care providers' biases may be part of the reason for racial disparities in health," said Eden King, the Lynette S. Autrey Professor of Psychological Sciences at Rice University and one of the study's lead authors. "We wanted to know whether we could detect such biases in the language providers use in health records, and we did."
The summaries from doctors for Black and Hispanic patients contained significantly more negative adjectives (such as "unkind," "negative" or "stupid") and significantly more fear and disgust words (such as "intimidate," "attack," "cringe" and "criticize") than those for white, non-Hispanic patients.
The notes for Hispanic or Latino patients included significantly fewer positive adjectives (such as "supportive," "kind," "great" and "nice"), trust verbs (such as "affirm," "advise," "confide" and "cooperating") and joy words (such as "admiration," "elated," "glad" and "pleased") than those for white, non-Hispanic patients.
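The comparisons above rest on counting how often words from predefined emotion and sentiment categories appear in each note. A minimal sketch of that kind of lexicon-based counting is shown below; the tiny hand-picked word lists here are illustrative stand-ins for the validated NLP lexicons the study actually used, and the function name and note text are hypothetical.

```python
from collections import Counter
import re

# Hypothetical mini-lexicons for illustration only, seeded with example words
# from the article; the study relied on established, validated lexicons.
LEXICON = {
    "negative_adjectives": {"unkind", "negative", "stupid"},
    "fear_disgust": {"intimidate", "attack", "cringe", "criticize"},
    "positive_adjectives": {"supportive", "kind", "great", "nice"},
    "trust": {"affirm", "advise", "confide", "cooperating"},
    "joy": {"admiration", "elated", "glad", "pleased"},
}

def category_counts(note: str) -> Counter:
    """Count how many tokens in a note fall into each lexicon category."""
    tokens = re.findall(r"[a-z]+", note.lower())  # simple lowercase tokenizer
    counts = Counter()
    for token in tokens:
        for category, words in LEXICON.items():
            if token in words:
                counts[category] += 1
    return counts

# Hypothetical note text, purely for demonstration.
note = "Patient was kind and cooperating; follow-up visit was great."
print(category_counts(note))
# → Counter({'positive_adjectives': 2, 'trust': 1})
```

Per-note counts like these can then be aggregated by patient group and compared statistically, which is the general shape of the analysis the article describes.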
"Understanding that providers' language may indicate bias points to an opportunity to interrupt it," King said. "If we can perfect algorithms to detect such bias, we can raise awareness in the moment of the patient–provider conversation. That awareness may be enough to encourage more equitable health care."
King and her fellow researchers hope their work will enable physicians and other researchers to identify and mitigate bias in medical interactions, with the goal of reducing the health disparities it causes.
More information: Isabel Bilotta et al, Examining Linguistic Differences in Electronic Health Records for Diverse Patients With Diabetes: Natural Language Processing Analysis, JMIR Medical Informatics (2024). DOI: 10.2196/50428