Tools that help US doctors make key decisions on birth, heart surgery, kidney care are racially biased: Study

Researchers found many 'race-adjusted algorithms' guide decisions in ways that may direct more attention or resources to white patients than to members of racial and ethnic minorities



While our understanding of race has advanced considerably over the years, those insights have not translated into clear guidelines on the use of race in medicine, according to researchers. They found that the diagnostic algorithms that help US physicians individualize risk assessment and guide clinical decisions, from estimating the risk of death in heart-failure patients to determining who can opt for vaginal birth rather than cesarean section, are racially biased.

Many of these "race-adjusted algorithms" guide decisions in ways that may direct more attention or resources to white patients than to members of racial and ethnic minorities, says a US team from Massachusetts General Hospital, Harvard Medical School, Harvard University, and NYU Langone Medical Center.

"One subtle insertion of race into medicine involves diagnostic algorithms and practice guidelines that adjust or 'correct' their outputs on the basis of a patient’s race or ethnicity. Physicians use these algorithms to individualize risk assessment and guide clinical decisions. By embedding race into the basic data and decisions of healthcare, these algorithms propagate race-based medicine," says the team in their findings published in the New England Journal of Medicine (NEJM).

To illustrate the potential dangers of such practices, the researchers compiled a list of race-adjusted algorithms. They say that, given their potential to perpetuate or even increase race-based health inequities, these algorithms merit thorough scrutiny.

The analysis shows, for example, that the American Heart Association's (AHA's) "Heart Failure Risk Score", which predicts the risk of death in patients admitted to the hospital, assigns three additional points to any patient identified as "nonblack", thereby categorizing all black patients as being at lower risk. "The AHA does not provide a rationale for this adjustment. Clinicians are advised to use this risk score to guide decisions about referral to cardiology and the allocation of healthcare resources. Since 'black' is equated with lower risk, following the guidelines could direct care away from black patients," say researchers.
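The mechanics of such a point-based adjustment are easy to illustrate. The sketch below is hypothetical: the only detail taken from the article is the "+3 points for nonblack" term; the real score uses many more clinical variables, and the other inputs here are invented placeholders.

```python
# Hypothetical sketch of a point-based risk score with a race term.
# Only the "+3 points if nonblack" adjustment is taken from the article;
# the other point inputs are placeholders for illustration.

def heart_failure_points(age_points: int, bp_points: int, is_black: bool) -> int:
    """Sum clinical points, then apply the race adjustment described above."""
    score = age_points + bp_points
    if not is_black:
        score += 3  # the adjustment the article describes
    return score

# Two otherwise identical patients end up 3 points apart:
white_patient = heart_failure_points(age_points=10, bp_points=5, is_black=False)
black_patient = heart_failure_points(age_points=10, bp_points=5, is_black=True)
assert white_patient - black_patient == 3  # black patient scored as lower risk
```

Because referral thresholds are applied to the total score, a fixed 3-point offset can move one patient but not the other across a decision boundary, which is the dynamic the researchers warn about.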

Experts say that a 2019 study found that race may influence decisions in heart-failure management, with measurable consequences: black and Latinx patients who presented to a Boston emergency department with heart failure were less likely than white patients to be admitted to the cardiology service.

Cardiac surgeons also consider race. The study shows that the Society of Thoracic Surgeons produces elaborate calculators to estimate the risk of death and other complications during surgery. The calculators include race and ethnicity because of observed differences in surgical outcomes among racial and ethnic groups. The authors acknowledge that the mechanism underlying these differences is not known.

"An isolated coronary artery bypass in a low-risk white patient carries an estimated risk of death of 0.492%. Changing the race to 'black/African-American' increases the risk by nearly 20%, to 0.586%. Changing to any other race or ethnicity does not increase the estimated risk of death as compared with a white patient, but it does change the risk of renal failure, stroke, or prolonged ventilation. When used preoperatively to assess risk, these calculations could steer minority patients, deemed to be at higher risk, away from surgery," they explain.
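The "nearly 20%" figure is a relative increase over the baseline risk, which a two-line check makes explicit (both percentages are taken directly from the study's example):

```python
# Checking the relative increase quoted in the study's example.
baseline = 0.492   # estimated death risk (%) for a low-risk white patient
adjusted = 0.586   # same patient recoded as "black/African-American"

relative_increase = (adjusted - baseline) / baseline
print(f"{relative_increase:.1%}")  # 19.1%, i.e. "nearly 20%"
```

Note that the absolute risk difference is under a tenth of a percentage point; it is the relative framing that makes the race recoding look consequential to a preoperative reader.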


In obstetrics, the Vaginal Birth after Cesarean (VBAC) algorithm predicts the risk posed by a "trial of labor for someone who has previously undergone a cesarean section." It predicts a lower likelihood of success for anyone identified as African-American or Hispanic. The study used to produce the algorithm found that other variables, such as marital status and insurance type, also correlated with VBAC success. Those variables, however, were not incorporated into the algorithm, the analysis notes.

Stating that the health benefits of successful vaginal deliveries are well known, including lower rates of surgical complications, faster recovery time, and fewer complications during subsequent pregnancies, the researchers note that nonwhite US women continue to have higher rates of cesarean section than white US women. "Use of a calculator that lowers the estimate of VBAC success for people of color could exacerbate these disparities. This dynamic is particularly troubling because black people already have higher rates of maternal mortality," they add. 

The analysis further says that in nephrology, since it is challenging to measure kidney function directly, researchers have developed equations that determine the estimated glomerular filtration rate (eGFR) from an accessible measure, the serum creatinine level. These algorithms result in higher reported eGFR values (which suggest better kidney function) for anyone identified as black, they found.

"The algorithm developers justified these outcomes with evidence of higher average serum creatinine concentrations among black people than among white people. Explanations that have been given for this finding include the notion that black people release more creatinine into their blood at baseline, in part because they are reportedly more muscular. Analyses have cast doubt on this claim, but the 'race-corrected' eGFR remains the standard," the findings state.
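One widely cited example of such an equation is the 2009 CKD-EPI creatinine formula, which multiplies its output by a fixed constant when the patient is identified as black. The sketch below uses its commonly published coefficients; treat the constants as illustrative only (verify against the original publication before any real use), since the article itself names no specific equation.

```python
def ckd_epi_egfr(scr_mg_dl: float, age: int, female: bool, black: bool) -> float:
    """2009 CKD-EPI creatinine equation, with coefficients as commonly published.

    Illustrative only -- not for clinical use.
    """
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = 141 * min(ratio, 1) ** alpha * max(ratio, 1) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159  # the race "correction": same blood test, higher reported eGFR
    return egfr

# Identical labs and demographics; only the recorded race differs:
a = ckd_epi_egfr(scr_mg_dl=1.2, age=60, female=False, black=False)
b = ckd_epi_egfr(scr_mg_dl=1.2, age=60, female=False, black=True)
print(f"ratio: {b / a:.3f}")  # 1.159 -- about 16% higher reported eGFR
```

Because transplant referral and staging thresholds are defined on eGFR, a uniform 15.9% upward multiplier can keep a black patient's reported value above a cutoff that the same creatinine level would cross for a nonblack patient.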

Researchers caution that race adjustments that yield higher estimates of kidney function in black patients might delay their referral for specialist care or transplantation and lead to worse outcomes.

In a parallel development, a new model for predicting urinary tract infection (UTI) in children similarly assigns lower risk to children identified as "fully or partially black". According to the research team, this tool echoes UTI testing guidelines released by the American Academy of Pediatrics in 2011 that were recently criticized for categorizing black children as low risk.


The authors say similar adjustment practices affect kidney transplantation. The Kidney Donor Risk Index (KDRI), implemented by the National Kidney Allocation System in 2014, uses donor characteristics, including race, to predict the risk that a kidney graft will fail. The race adjustment is based on an "empirical finding" that black donors’ kidneys perform worse than nonblack donors’ kidneys, regardless of the recipient’s race.

They argue that since black patients are more likely to receive kidneys from black donors, anything that reduces the likelihood of donation from black people could contribute to the wait-time disparity, and the use of KDRI may do just that. "The developers of the KDRI do not provide possible explanations for this difference. If the potential donor is identified as black, the KDRI returns a higher risk of graft failure, marking the candidate as a less suitable donor. Meanwhile, black patients in the US still have longer wait times for kidney transplants than nonblack patients," says the study. 

Even in urology, the 'STONE' score predicts the likelihood of kidney stones in patients who present to the emergency department with flank pain. The “origin/race” factor adds 3 points (of a possible 13) for a patient identified as nonblack. "By assigning a lower score to black patients, the algorithm may steer clinicians away from thorough evaluation for kidney stones in black patients. The developers of the algorithm did not suggest why black patients would be less likely to have a kidney stone," the authors write.
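The effect of the "origin/race" factor on the STONE total can be sketched in a few lines. Only the "3 of a possible 13 points for nonblack" detail comes from the article; the other component values below are hypothetical placeholders.

```python
# Sketch of how the "origin/race" term shifts the STONE score.
# Only the "+3 of 13 points for nonblack" detail comes from the article;
# the remaining points are a hypothetical placeholder.

def stone_score(other_points: int, is_black: bool) -> int:
    """Combine the non-race components with the race term described above."""
    race_points = 0 if is_black else 3  # the "origin/race" factor
    return other_points + race_points

# Same presentation, different recorded race: a 3-point gap out of 13 possible.
assert stone_score(other_points=7, is_black=False) == 10
assert stone_score(other_points=7, is_black=True) == 7
```

On a 13-point scale, a fixed 3-point gap is large enough to move a patient between the score's risk bands, which is how the authors suggest it could steer clinicians away from a full workup in black patients.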

According to the research team, similar examples can be found throughout medicine. Some algorithm developers do not explain why racial or ethnic differences might exist. Others offer rationales, but when these are traced to their origins, they lead to outdated, suspect racial science or biased data, say experts. “To be clear, we do not believe that physicians should ignore race. Doing so would blind us to the ways in which race and racism structure our society. However, when clinicians insert race into their tools, they risk interpreting racial disparities as immutable facts rather than as injustices that require intervention,” says the team. 

The experts emphasize that researchers and clinicians must distinguish between the use of race in "descriptive statistics", where it plays a vital role in epidemiologic analyses, and in "prescriptive clinical guidelines," where it can worsen inequities. "Our understanding of race has advanced considerably in the past two decades. The clinical tools we use daily should reflect these new insights to remain scientifically rigorous. Equally important is the project of making medicine a more antiracist field. This involves revisiting how clinicians conceptualize race to begin with. One step in this process is reconsidering race correction to ensure that our clinical practices do not perpetuate the very inequities we aim to repair," the team recommends.

Disclaimer: This article is for informational purposes only and is not a substitute for professional medical advice, diagnosis, or treatment. Always seek the advice of your physician or other qualified health provider with any questions you may have regarding a medical condition.