How to Identify and Prevent Bias in EHRs

The researchers used machine learning to analyze more than 40,000 clinical notes from the University of Chicago academic medical center written between January 2019 and October 2020. Of the 18,500 patients included in the study, approximately 61% were Black, 30% were white, 6% were Hispanic or Latino and 3.5% were classified as "other." In all, 8.2% of patients had one or more negative descriptors in their medical records.

Fifteen common patient descriptors were used to identify stigmatizing language: nonadherent, aggressive, agitated, angry, challenging, combative, noncompliant, confrontational, uncooperative, defensive, exaggerating, hysterical, unpleasant, refusing and resistant. Sun said the prevalence of these terms in EHRs reflects a clinical culture in which patients' full circumstances often go unexpressed.
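The study itself used machine learning, but the core flagging step — checking notes against a fixed list of descriptors — can be illustrated with a simplified sketch. The matching logic below (case-insensitive stem search) is an assumption for illustration, not the study's actual pipeline, and the sample note is invented.

```python
import re

# Descriptor stems drawn from the article's list; the matching approach
# itself is a hypothetical simplification of the study's method.
DESCRIPTORS = [
    "nonadherent", "aggressive", "agitated", "angry", "challenging",
    "combative", "noncompliant", "confrontational", "uncooperative",
    "defensive", "exaggerat", "hysterical", "unpleasant", "refus", "resist",
]

def flag_stigmatizing_terms(note: str) -> list[str]:
    """Return the descriptor stems found in a clinical note, case-insensitively."""
    text = note.lower()
    # \b anchors each stem to a word start, so "refus" catches "refused"/"refusing"
    return [stem for stem in DESCRIPTORS if re.search(r"\b" + stem, text)]

note = "Patient was agitated and refused medication; described as noncompliant."
print(flag_stigmatizing_terms(note))  # → ['agitated', 'noncompliant', 'refus']
```

A real analysis would also need to handle negation ("not aggressive") and context, which is one reason the researchers turned to machine learning rather than plain keyword matching.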

“The patterns of words we use are shortcuts, and we’re hurting our patients by not providing the full context and the full story,” he said.

For example, physicians may label patients "non-compliant" when those patients in fact lack health knowledge and misunderstand what they should be doing. Recognizing that difference can help clinicians reformulate treatment plans that work for their patients and improve outcomes.

Sun said the next step for the research will be to explore the link between negative descriptors in patients' electronic health records and clinical outcomes. The report does not directly tie adverse medical outcomes to implicit bias, but it points to other studies, including one that found physicians with high levels of implicit bias dominated conversations with Black patients, and a report associating healthcare provider bias with lower patient adherence.


The report also explains how electronic health records can transmit bias and stigma among clinicians. The authors cite a 2018 study that found healthcare providers were more likely to have negative perceptions of a patient's pain when they were shown charts containing stigmatizing language, such as "frequent flyer."

“It’s not hard to imagine that they might have different types of interactions,” he said. “This is certainly an area of follow-up research for us, but we expect that these descriptors have some impact on the doctor-patient relationship and on the many provider-patient interactions that occur during a patient’s hospitalization.”

The researchers found that use of stigmatizing language decreased in 2020. Amid the start of the COVID-19 pandemic and the nationwide reckoning that followed the murder of George Floyd, clinicians were less likely to use negative descriptors in the EHR, Sun said. The findings suggest clinicians are able to examine their biases and hesitate before using negative descriptors in charts, especially when describing patients of color or patients with marginalized identities, he said. It may also reflect a growing interest among providers in addressing cultural incompetence in their operations.

“It was surprising to us at first because we thought the whole pandemic, as a stressful environment, would lead people to use more cognitive shortcuts or stereotypes, relying on bias more. It actually decreased during that time,” he said. “I want people to see this as an opportunity to tell a patient’s full story and provide them with more compassionate care. It’s certainly within our grasp; it just needs more intent.”
