The equity of the healthcare industry's population health management programs has been called into question after researchers found that a widely used algorithm sold by Optum dramatically underestimated the health needs of the sickest black patients.
The unexpected results for Optum have put a spotlight on the entire industry, which relies heavily on programs that predict which patients may benefit from additional and more comprehensive care.
Risk-prediction algorithms like Optum’s, which use cost as an indicator to pinpoint high-risk patients, are ubiquitous in the industry, said Cynthia Burghard, a research director in value-based IT transformation strategies at IDC Health Insights, a division of market research firm International Data Corp.
In fact, these types of algorithms analyze data from roughly 200 million patients in the U.S. each year, according to industry estimates cited in the study in the journal Science. And they’re only becoming more popular with the shift to risk-based contracts, according to Dr. Shaun Grannis, vice president of data and analytics and director of the center for biomedical informatics at the Regenstrief Institute.
“Folks want to understand where their costs are going to be,” he said. He was not affiliated with the study.
There’s nothing wrong with predicting, and proactively intervening in, care for high-cost patients. But when these tools are relied upon uncritically, they can exacerbate existing healthcare disparities.
“The overall goal of the algorithms can be very positive and helpful if directed toward improving patient health outcomes,” said Dr. Marshall Chin, a primary-care physician at UChicago Medicine and associate director of the health system’s MacLean Center for Clinical Medical Ethics. “The choice of cost reflects perhaps misplaced priorities.”
That decision to predict patients’ risk scores based on cost is what led to the racial disparities for Optum’s algorithm, which was found to assign healthier white patients the same risk score as black patients who had poorer lab results, according to the recent research published in Science. The algorithm had specifically excluded race as a variable.
The algorithm is part of an analytics tool called Impact Pro, which Optum markets as a way to help healthcare organizations identify individuals who will benefit most from population health management programs.
But the algorithm in question doesn’t predict patients’ future health conditions; it predicts how much patients will cost the hospital in the future, using that as a proxy for who would benefit from additional care-management services. That created a disparity, since black patients generally use healthcare services at lower rates than white patients.
Annual care for black patients with chronic conditions cost about $1,800 less than that for comparable white patients, according to the study, which looked at a patient population at one unnamed academic health system using the algorithm. Essentially, that meant “healthier white patients were ‘cutting in line’ ahead of sicker black patients” to get more intensive care management, said Dr. Ziad Obermeyer, the study’s lead author and acting associate professor in health policy and management at the University of California at Berkeley.
There is a fix, according to the study authors—one that hits at the heart of the question over how to design and use clinical algorithms. Rather than having an algorithm predict how much a hospital would spend on patients, the researchers adjusted it to predict health conditions. As another alternative, they tweaked the algorithm to predict patients’ avoidable costs, rather than total costs.
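The fix the researchers describe is a relabeling, not a new model. A minimal sketch (synthetic data and hypothetical feature names; this is not Optum's actual model) shows that the same features and the same fitting procedure yield a different risk ranking when the prediction target changes from total cost to a health-based label:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical patient features, e.g. age, prior ER visits, prior-year spend.
X = rng.normal(size=(500, 3))

# Two candidate labels for the SAME patients:
# total cost (shaped by disparities in access and utilization) versus
# a health-based label such as a count of active chronic conditions.
total_cost = X @ np.array([0.2, 0.5, 1.5]) + rng.normal(scale=0.1, size=500)
chronic_conditions = X @ np.array([0.8, 0.9, 0.3]) + rng.normal(scale=0.1, size=500)

def fit_risk_model(X, y):
    """Ordinary least squares: the algorithm is identical; only the label changes."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

cost_model = fit_risk_model(X, total_cost)      # what the deployed tool predicted
health_model = fit_risk_model(X, chronic_conditions)  # the study's adjustment

# The two models weight the same features differently, so they rank
# different patients as "high risk" -- the bias lives in the label choice.
```

The point of the sketch is Obermeyer's observation: nothing inside the "black box" changes; the disparity enters through what the model is told to predict.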
“Both of those alternative algorithms actually had far less bias,” Obermeyer said. “The problem we found wasn’t anything to do with what’s going on in the black box of the algorithm. The problem was what the algorithm was told to do.” Obermeyer said the research team has been in communication with Optum to experiment with possible versions of the algorithm. Optum hasn’t commented on whether it will add the researchers’ adjustments to its product.
Optum, for its part, has stressed that its algorithm fulfills its intended purpose. A company spokesman highlighted that Impact Pro has multiple features, only one of which is an algorithm that forecasts costs. Optum’s tool also identifies gaps in care, which are often driven by social determinants of health.
“We appreciate the researchers’ work, including their validation that the cost model within Impact Pro was highly predictive of cost, which is what it was designed to do,” he wrote in an email. Optum did not respond to a request for comment on how much it charges healthcare organizations for Impact Pro.
Burghard, the analyst with IDC Health Insights, agreed. The algorithm did what it was designed to do: it “spit out a list of people based on cost,” she said.
If a hospital decides those high-cost patients are the population to target, “of course there’s bias. You miss all the people who didn’t get care, or who don’t have access to care,” she said. “To me, at this point in my musings, it’s not about the algorithm or the tool. It’s about the human decision (and how) to use that tool.”
On the surface, using cost as a proxy to predict health risk can seem reasonable—health is complex, and there’s no one variable to measure someone’s health, Obermeyer noted. But while cost may sound like a race-neutral measure on paper, there are social and historical disparities that shape it, such as, in this case, black patients generally using healthcare services at lower rates.
So even if the algorithm worked as intended, it raises ethical questions, according to Chin. “I think that healthcare organizations and hospitals have to look in their hearts and ask: What is their mission?” he said. “If you are thinking about the ultimate mission of patient care, you’ll be directed to metrics being high-quality care and the best possible patient health outcomes.”
He acknowledged that it’s understandable why a healthcare organization might decide to focus efforts on tackling high-cost patients. To truly encourage a focus on high-quality patient care, he said, the industry would need to think about realigning incentives so that patient outcomes are rewarded, rather than cost savings. “Hospitals are under a lot of cost pressures,” Chin said. “The financial margins for a lot of hospitals are small.”
A population health team at Partners HealthCare System in Boston was confronted with a decision on how to address disparities when it tested Optum’s algorithm a few years ago.
The team had mapped patients who were getting high risk scores from the algorithm, and found many were concentrated in some of the region’s wealthier neighborhoods. “That made us uncomfortable with just using the tool,” said Christine Vogeli, director of evaluation and research for population health at Partners HealthCare and a study co-author. While Partners continued to use the tool as part of its care-management program, it supplemented the algorithm’s findings with additional clinical information.
And while the system used tools like Optum’s algorithm to serve as a resource to help corral an initial list of patients who might benefit from enhanced care management, it decided primary-care physicians would be responsible for determining which patients would be offered enrollment in the program.
For the care-management program, all patients designated as high risk by Optum’s algorithm would be flagged as candidates. But they also took into account factors like whether a patient had multiple chronic conditions and their patterns of healthcare utilization, such as if they had missed appointments or frequently visited the emergency department.
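The screening logic Partners describes can be sketched roughly as follows (the thresholds and field names are illustrative assumptions; the article does not give Partners' actual cutoffs):

```python
from dataclasses import dataclass

@dataclass
class Patient:
    risk_score: float          # output of the vendor risk algorithm
    chronic_conditions: int    # count of active chronic conditions
    missed_appointments: int   # no-shows in the past year
    ed_visits: int             # emergency department visits in the past year

# Illustrative thresholds -- not Partners' actual values.
HIGH_RISK_SCORE = 0.8
MULTI_CHRONIC = 2
FREQUENT_ED = 3

def is_candidate(p: Patient) -> bool:
    """Flag care-management candidates using the risk score OR clinical and
    utilization signals, so high-need, low-spend patients aren't missed."""
    if p.risk_score >= HIGH_RISK_SCORE:       # all algorithm-flagged patients
        return True
    if p.chronic_conditions >= MULTI_CHRONIC:  # multiple chronic conditions
        return True
    # Utilization pattern: missed appointments plus frequent ED use.
    return p.missed_appointments > 0 and p.ed_visits >= FREQUENT_ED
```

Per the article, a flag like this only produces the candidate list; primary-care physicians make the final enrollment decisions.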
Developing that process required a conscious focus on patient needs, according to Vogeli. “We’re not just interested in patients who are high-cost, we’re interested in patients who have a need for more intensive care-management services,” she said.
Partners HealthCare stopped using Optum’s algorithm to inform its care-management program earlier this year, switching to a different tool that only uses information about patients’ chronic conditions.
Even when the system had used Optum’s algorithm, only about 15% of patients designated as possible candidates for the care-management program were identified solely based on having a high risk score, Vogeli said. The program serves about 14,000 patients at a given time, the bulk of whom were flagged via Partners HealthCare’s review of their chronic conditions and healthcare use.
“Healthcare organizations need to be very savvy about how they use these tools,” Vogeli said.