19 December 2023 at 00:00 GMT+13

Doctors can be slow to talk about the end of the traditional medical road. When they’ve been trying to manage a life-threatening illness or keep a terminal patient alive, bringing up palliative or hospice care can feel like giving up. But these options can radically improve quality of life, or of its end, when traditional medicine hasn’t helped enough—if patients and their doctors figure it out in time. Some providers just don’t recognize when the end is near until it’s very near, says Mihir Kamdar, a doctor at Massachusetts General Hospital in Boston who heads clinical health at a palliative-care startup called Tuesday Health. "When someone is actively declining, you can see it, but being able to predict before that happens is hard."

Can artificial intelligence software do a better job than humans of picking that moment? That’s the idea behind Serious Illness Care Connect, a software tool that about 150 doctors are testing in a pilot program at New Jersey’s largest health-care network, Hackensack Meridian Health. Developed by an in-house team of data scientists and hospice providers, SICC is a statistical model trained on a year’s worth of anonymized Hackensack Meridian patient data. It is built into the software doctors use to review and update patient records, and it calculates the likelihood that a patient will die within six months, a common medical benchmark for these kinds of decisions.

If the chance of death in that window is 70% or higher, SICC recommends an evaluation for end-of-life hospice care. It’s a cold calculation, but one that doctors hope can help bring greater relief to suffering patients: Hospice often includes psychological and spiritual services as well as physical care. If the chance is between 30% and 69%, the software suggests a conversation about palliative care, a holistic approach to living with chronic illness.
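The thresholds described above amount to a simple decision rule layered on top of the model's risk score. A minimal sketch in Python (the function name, labels, and boundary handling are illustrative assumptions, not Hackensack Meridian's actual code):

```python
def sicc_style_flag(p_death_6mo: float) -> str:
    """Map a six-month mortality probability to a suggested action,
    per the thresholds reported in the article. Names are invented."""
    if p_death_6mo >= 0.70:
        return "recommend hospice evaluation"
    if p_death_6mo >= 0.30:
        return "suggest palliative-care conversation"
    return "no flag"

# e.g. sicc_style_flag(0.75) -> "recommend hospice evaluation"
```

The real tool surfaces these flags inside the patient-record software rather than making any decision itself.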

The team behind SICC says it doesn’t yet have meaningful data on the software’s performance; the current test phase began in June. Other forms of predictive technology in medicine have run into challenges: Insurers UnitedHealth, Cigna and Humana are facing lawsuits alleging that they used algorithms to wrongfully deny coverage. (In response, UnitedHealth and Humana have said people are always involved in coverage denials; Cigna has said the purpose of its AI tool has been mischaracterized.) The Hackensack Meridian team stresses that the tool isn’t making decisions. "Think of this as a ‘check engine’ light," says Lauren Koniaris, the chief medical informatics officer at Hackensack Meridian. "It’s a gentle nudge to help us take the best care of our patients." A larger pilot phase is planned for early 2024.

In the meantime, the doctors and developers are trying to stay mindful of the bias that’s regularly, often unwittingly, built into AI algorithms. The tools are only as good as the data that goes into them, so developers aim to use data from a diverse patient population to avoid skewing results toward a certain race, age, gender or other characteristic. Disparities are already abundant in palliative care, study after study has revealed. In 2020, 48% of Medicare patients who died were enrolled in hospice at the time of death, according to a recent report from the National Hospice and Palliative Care Organization. Broken down by race, however, more than half of White patients used hospice, whereas only about a third of Black and Hispanic patients did.

Hackensack Meridian’s senior vice president and chief data and analytics officer, Sameer Sethi, says the development team’s analysis of SICC showed the model had no specific biases as of May 2023. But the team continues to test whether the software disproportionately recommends a particular outcome to particular populations, Sethi says.

If the industry’s engineers can navigate these challenges, they’ll find plenty of customers among America’s overworked doctors. Kamdar says he expects to see a wide range of AI products join the field in the next several years, such as tools that allow patients to more easily track their symptoms. As uncomfortable as it can be for doctors and patients to talk about end-of-life options, putting off the conversation is even worse. Digesting data to deliver a gentle nudge could be a good spot for AI in health care. But only humans can ensure equitable outcomes.