
Senators Protest a Health Algorithm Biased Against Black People


In October, a bombshell academic study found that widely used software can introduce racial bias into US health care: an algorithm some providers use to prioritize access to extra help with conditions such as diabetes systematically favors white patients’ needs over those of black patients. Senator Cory Booker (D-New Jersey), a Democratic presidential candidate, and his Senate colleague Ron Wyden (D-Oregon) are now demanding answers.

On Tuesday, Booker and Wyden released letters to the Federal Trade Commission and Centers for Medicare and Medicaid Services asking the agencies how they look for and prevent bias in health care algorithms. They asked the FTC to investigate whether decision-making algorithms discriminate against marginalized communities. The lawmakers also wrote to five of the largest health care companies asking about their internal safeguards against bias in their technology.

“In using algorithms, organizations often attempt to remove human flaws and biases,” Booker and Wyden wrote. “Unfortunately, both the people who design these complex systems, and the massive sets of data that are used, have many historical and human biases built in.” The letters were sent to health companies Blue Cross Blue Shield, Cigna Corporation, Humana, Aetna, and UnitedHealth Group.


The study that prompted Booker and Wyden’s letters found racial bias in the output of patient management software from UnitedHealth subsidiary Optum. It is used to predict the health care needs of 70 million patients across the US, but data from a major hospital showed that it understated the severity of black patients’ disease, assigning them lower scores than white patients with the same medical conditions.

That skew could have serious, even fatal, consequences, because some health systems use the scores to determine who gets access to special programs for people with complex chronic conditions such as diabetes or kidney disease. In the large academic hospital where the study was conducted, the authors calculated that the algorithm’s bias effectively reduced the proportion of black patients receiving extra help by more than half, from almost 50 percent to less than 20 percent.

Booker and Wyden are not the first to suggest those results should stir government action. Last month, New York state’s health and financial services regulators wrote a joint letter to UnitedHealth warning that “these discriminatory results, whether intentional or not, are unacceptable and are unlawful in New York.” The agencies asked the company not to use any algorithms or data analysis unless it could show they were free from racially disparate impacts. In a statement provided after this article was initially published, UnitedHealth Optum said that the study focused on just one of the scores its software can calculate, and that health providers should use all the scores to identify and address gaps in access to care.


In April, Booker and Wyden introduced a Senate bill called the Algorithmic Accountability Act that would require organizations using automation in decision-making to evaluate their technology for discrimination. US representative Yvette Clarke (D-New York) introduced a version in the House.

Ziad Obermeyer, a UC Berkeley researcher and lead author of the study dissecting the Optum algorithm, says its results should trigger a broad reassessment of how such technology is used in health care. “The bias we identified is bigger than one algorithm or one company—it's a systematic error in the way we as a health sector have been thinking about risk prediction,” he says. The algorithm’s skew sprang from the way it used health costs as a proxy for a person’s care requirements, making its predictions reflect economic inequality as much as health needs.
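The mechanism Obermeyer describes can be seen in a minimal sketch. The code below uses entirely synthetic data and a one-feature least-squares model; the group sizes, distributions, and enrollment cutoff are illustrative assumptions, not a description of Optum's actual software. It shows how a race-blind model trained to predict health costs, rather than health needs, can assign lower scores to a group whose spending is depressed by unequal access to care:

```python
# Synthetic illustration of cost-as-proxy bias (not Optum's model).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
group = rng.integers(0, 2, n)        # 0 = white, 1 = black (synthetic labels)
illness = rng.gamma(2.0, 2.0, n)     # true care needs, same distribution for both groups

# Assumption: at equal illness, unequal access to care lowers realized spending.
access = np.where(group == 1, 0.6, 1.0)
past_cost = illness * access + rng.normal(0.0, 0.3, n)
future_cost = illness * access + rng.normal(0.0, 0.3, n)

# A race-blind "risk score": least-squares prediction of future cost from past cost.
X = np.column_stack([np.ones(n), past_cost])
beta, *_ = np.linalg.lstsq(X, future_cost, rcond=None)
score = X @ beta

# Enroll the top 3 percent by score, as a care-management program might.
enrolled = score >= np.quantile(score, 0.97)
for g, name in ((0, "white"), (1, "black")):
    sel = group == g
    print(f"{name}: enrolled {enrolled[sel].mean():.2%}, "
          f"mean illness of enrollees {illness[enrolled & sel].mean():.2f}")
```

In this toy setup, far fewer black patients clear the enrollment cutoff, and those who do are, on average, sicker than their white counterparts, echoing the study's finding that black patients received lower scores than equally sick white patients. Swapping the cost label for a direct measure of health needs removes the skew, which is the kind of fix the study's authors propose.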

Obermeyer published his results with researchers from the University of Chicago and Brigham and Women’s and Massachusetts General hospitals in Boston. The authors have offered to work pro bono with health systems and others who want to detect and remove bias from health care algorithms.

“We've had an enormous response from health systems, hospital associations, insurers, and state commissions,” Obermeyer says—suggesting that the health industry is waking up to a problem that could be widespread.

Updated, 12-3-19, 5pm ET: This article has been updated to include a statement from UnitedHealth Optum.
