
A Health Care Algorithm Offered Less Care to Black Patients

Care for some of the sickest Americans is decided in part by algorithm. New research shows that software guiding care for tens of millions of people systematically privileges white patients over black patients. Analysis of records from a major US hospital revealed that the algorithm used effectively let whites cut in line for special programs for patients with complex, chronic conditions such as diabetes or kidney problems.

The hospital, which the researchers didn’t identify but described as a “large academic hospital,” was one of many US health providers that employ algorithms to identify primary care patients with the most complex health needs. Such software is often tapped to recommend patients for programs that offer extra support—including dedicated appointments and nursing teams—to those with a tangle of chronic conditions.

Researchers who dug through nearly 50,000 records discovered that the algorithm effectively low-balled the health needs of the hospital’s black patients. Using its output to help select patients for extra care favored white patients over black patients with the same health burden.

When the researchers compared black patients and white patients to whom the algorithm assigned similar risk scores, they found the black patients were significantly sicker, for example with higher blood pressure and less well-controlled diabetes. This had the effect of excluding people from the extra care program on the basis of race. The hospital automatically enrolled patients above certain risk scores into the program, or referred them for consideration by doctors.
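
In outline, that audit amounts to grouping patients by their algorithmic risk score and comparing average health burden across races within each score band. The sketch below illustrates the idea in Python; the synthetic data and column names are assumptions for illustration, not the study’s records or code.

```python
# A minimal sketch of the audit idea: group patients by risk score,
# then compare average health burden by race within each score band.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 10_000
race = rng.choice(["black", "white"], size=n)
# Assume the pattern the researchers observed: at any given score,
# black patients carry more chronic conditions.
chronic_conditions = rng.poisson(lam=np.where(race == "black", 4.0, 3.0))
# A cost-flavored score that under-weights black patients' conditions.
risk_score = (chronic_conditions * np.where(race == "black", 8.0, 10.0)
              + rng.normal(0, 5, n))

df = pd.DataFrame({"race": race,
                   "chronic_conditions": chronic_conditions,
                   "risk_score": risk_score})
df["score_decile"] = pd.qcut(df["risk_score"], 10, labels=False)

# Within each risk decile, how sick are patients of each race on average?
audit = (df.groupby(["score_decile", "race"])["chronic_conditions"]
           .mean()
           .unstack("race"))
print(audit)  # black patients are sicker at the same score
```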

The researchers calculated that the algorithm’s bias effectively reduced the proportion of black patients receiving extra help by more than half, from almost 50 percent to less than 20 percent. Those missing out on extra care potentially faced a greater chance of emergency room visits and hospital stays.

“There were stark differences in outcomes,” says Ziad Obermeyer, a physician and researcher at UC Berkeley who worked on the project with colleagues from the University of Chicago and Brigham and Women’s and Massachusetts General hospitals in Boston.

The paper, published Thursday in Science, does not identify the company behind the algorithm that produced those skewed judgments. Obermeyer says the company has confirmed the problem and is working to address it. In a talk on the project this summer, he said the algorithm is used in the care of 70 million patients and was developed by a subsidiary of an insurance company. That suggests the algorithm may be from Optum, owned by insurer UnitedHealth, which says its product that attempts to predict patient risks, including costs, is used to “manage more than 70 million lives.” Asked by WIRED whether its software was the one examined in the study, Optum said in a statement that doctors should not use algorithmic scores alone to make decisions about patients. “As we advise our customers, these tools should never be viewed as a substitute for a doctor’s expertise and knowledge of their patients’ individual needs,” it said.

The algorithm studied did not take account of race when estimating a person’s risk of health problems. Its skewed performance shows how even putatively race-neutral formulas can still have discriminatory effects when they lean on data that reflects inequalities in society.

The software was designed to predict patients’ future health costs, as a proxy for their health needs. It could predict costs with reasonable accuracy for both black patients and white patients. But that had the effect of priming the system to replicate unevenness in access to healthcare in America—a case study in the hazards of combining optimizing algorithms with data that reflects raw social reality.
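
A toy illustration of that dynamic follows, using invented numbers and a deliberately simplified stand-in for the vendor’s model. It shows how a score trained to predict spending, combined with a race-blind enrollment cutoff, can rank equally sick black patients lower once access gaps depress their realized costs.

```python
# A toy sketch (assumptions, not the vendor's model): predict future cost
# from past spending, then auto-enroll the top scorers, as the hospital's
# threshold rule did.
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
black = rng.random(n) < 0.5
need = rng.gamma(shape=2.0, scale=1.0, size=n)       # true health need
access = np.where(black, 0.7, 1.0)                   # assumed access gap
prior_cost = need * access * 1_000 + rng.normal(0, 150, n)
future_cost = need * access * 1_000 + rng.normal(0, 150, n)

# "Model": predict next year's cost from last year's spending. It is
# accurate for both groups, because the label itself encodes access.
slope, intercept = np.polyfit(prior_cost, future_cost, 1)
risk_score = slope * prior_cost + intercept

# Auto-enroll the top 3 percent of scores.
cutoff = np.percentile(risk_score, 97)
enrolled = risk_score >= cutoff

sickest = need > np.percentile(need, 90)             # equally sick slice
for label, group in [("black", black), ("white", ~black)]:
    share = enrolled[group & sickest].mean()
    print(f"{label}: {share:.1%} of the sickest decile auto-enrolled")
```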

When the hospital used risk scores to select patients for its complex care program, it was choosing the patients likely to cost the most in the future, not those in the worst health. People with lower incomes typically run up smaller health costs because they are less likely to have the insurance coverage, free time, transportation, or job security needed to easily attend medical appointments, says Linda Goler Blount, president and CEO of nonprofit the Black Women’s Health Imperative.

Because black people tend to have lower incomes than white people, an algorithm concerned only with costs sees them as lower risk than white patients with similar medical conditions. “It is not because people are black, it’s because of the experience of being black,” she says. “If you looked at poor white or Hispanic patients, I’m sure you would see similar patterns.”

Blount recently contributed to a study that suggested there may be similar problems in “smart scheduling” software used by some health providers to increase efficiency. The tools try to assign patients who previously skipped appointments into overbooked slots. Research has shown that approach can maximize clinic time, and it was discussed at a workshop held by the National Academies of Sciences, Engineering, and Medicine this year about scheduling for the Department of Veterans Affairs.

The analysis by Blount and researchers at Santa Clara University and Virginia Commonwealth University shows this strategy can penalize black patients, who are more likely to have transportation, work, or childcare constraints that make attending appointments difficult. As a result, they are more likely to be given overbooked appointments and to wait longer when they do show up.
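
In simplified form, that scheduling logic looks something like the sketch below; the fields and the number of overbooked slots are assumptions for illustration, not any vendor’s actual system.

```python
# A toy sketch of no-show-based overbooking: patients with the highest
# historical no-show rate are placed in double-booked slots.
from dataclasses import dataclass

@dataclass
class Patient:
    name: str
    prior_no_shows: int
    prior_visits: int

    @property
    def no_show_rate(self) -> float:
        return self.prior_no_shows / max(self.prior_visits, 1)

patients = [Patient("A", 3, 10), Patient("B", 0, 10), Patient("C", 5, 12)]
# Patients facing transportation, work, or childcare barriers tend to have
# higher historical no-show rates, so they absorb the overbooked slots.
overbooked = sorted(patients, key=lambda p: p.no_show_rate, reverse=True)[:2]
print([p.name for p in overbooked])
```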

Obermeyer says his project makes him concerned that other risk scoring algorithms are producing uneven results in the US healthcare system. He says it’s difficult for outsiders to gain access to the data required to audit how such systems are performing, and that this kind of patient prioritization software falls outside the purview of regulators such as the Food and Drug Administration.

It is possible to craft software that can identify patients with complex care needs without disadvantaging black patients. The researchers worked with the algorithm’s provider to test a version that predicts a combination of a patient’s future costs and the number of times a chronic condition will flare up over the next year. That approach reduced the skew between white patients and black patients by more than 80 percent.
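
A minimal sketch of that kind of blended training target appears below; the 50/50 weighting and the names are assumptions for illustration, not the published formula.

```python
# Blend standardized future cost and chronic-condition flare-up counts
# into a single training target, rather than training on cost alone.
import numpy as np

def blended_label(future_cost: np.ndarray, flare_ups: np.ndarray,
                  weight_cost: float = 0.5) -> np.ndarray:
    """Combine standardized cost and flare-up counts into one target."""
    def z(x):
        return (x - x.mean()) / x.std()
    return weight_cost * z(future_cost) + (1 - weight_cost) * z(flare_ups)

# Example: two patients with equal flare-ups but unequal spending land
# closer together than they would under a cost-only label.
cost = np.array([2_000.0, 9_000.0, 4_000.0])
flares = np.array([3.0, 3.0, 0.0])
print(blended_label(cost, flares))
```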

Blount of the Black Women’s Health Imperative hopes work like that becomes more common, since algorithms can have an important role in helping providers serve their patients. However, she says that doesn’t mean society can look away from the need to work on the deeper causes of health inequalities through policies such as improved family leave, working conditions, and more flexible clinic hours. “We have to look at these to make sure people who are not in the middle class get to have going to a doctor’s appointment be the everyday occurrence that it should be,” she says.
