Healthcare may be bracing for a major shortage of providers and services in the coming years, but even now the industry is straining to meet an ever-growing demand for personalized, patient-friendly care. Artificial intelligence has often been touted as the panacea for this challenge, with many pointing to finance, retail and other industries that have embraced automation.
But the consumerism adopted by other sectors doesn't always translate cleanly into healthcare, says Nagi Prabhu, chief product officer at Solutionreach. Whereas people may be ready to trust automation to handle their deliveries or even manage their finances, they still prefer the human touch when it comes to their personal health.
"That's what makes it challenging. There's an expectation that there's an interaction happening between the patient and provider, but the tools and services and resources that are available on the provider side are insufficient," Prabhu said during a HIMSS20 Virtual Webinar on AI and patient experience. "And that's what's causing this big disconnect between what patients are seeing and wanting, compared to other industries where they have experienced it.
"You have got to be careful in terms of where you apply that AI, particularly in healthcare, because it must be in use cases that enrich human interaction. Human interaction is not replaceable," he said.
Despite the challenge, healthcare still has a number of "low-hanging fruit" use cases where automation can reduce the strain on healthcare staff without harming overall patient experience, Prabhu said. Chief among these are patient communications, scheduling and patient feedback analysis, where the past decade's investments in natural language processing and machine learning have yielded tools that can handle straightforward requests at scale.
But even these implementations need to strike the balance between automation and a human touch, he warned. Take patient messaging, for example. AI can handle simple questions about appointment times or documentation. But if the patient asks a complex question about their symptoms or care plan, the tool should be able to gracefully hand off the conversation to a human staffer without major interruption.
"If you push the automation too far, from zero automation ... to 100% automation, there's going to be a disconnect because these tools aren't perfect," he said. "There needs to be a good balancing ... even in those use cases."
These types of challenges and automation strategies are already being considered, if not implemented, among major provider organizations, noted Kevin Pawl, senior director of patient access at Boston Children's Hospital.
"We've analyzed why patients and families call Boston Children's – over 2 million phone calls to our call centers each year – and about half are for non-scheduling matters," Pawl said during the virtual session. "Could we take our most valuable resource, our staff, and have them work on those most critical tasks? And could we use AI and automation to improve that experience and really have the right people in the right place at the right time?"
Pawl described a handful of AI-based programs his organization has deployed in recent years, such as Amazon Alexa skills for recording personal health information and flu and coronavirus tracking models to estimate community disease burden. In the patient experience space, he highlighted self-serve kiosks placed in several Boston Children's locations that guide patients through the check-in process – but that still encourage users to walk over to a live receptionist if they become confused or simply are more comfortable speaking to a human.
For these projects, Pawl said that Boston Children's needed to design its offerings around unavoidable hurdles like patients' fear of change, as well as broader system interoperability and security concerns. For others looking to deploy similar AI tools for patient experience, he said that programs must keep in mind the need for iterative pilots, the value of walking providers and patients alike through each step of any new experience, and how the workflows and preferences of these individuals will shape their adoption of the new tools.
"These are the critical things that we think about as we are evaluating what we are going to use," he said. "Err on the side of caution."
Prabhu punctuated these warnings with his own emphasis on the data-driven design of the models themselves. These systems need to have enough historical information available to understand and answer the patient's questions, as well as the intelligence to know when a human is necessary.
"And, when it is not confident, how do you get a human being involved to respond – but at the same time from the patient perspective [the interaction appears] to continue?" he asked. "I think that is the key."
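The escalation pattern Prabhu describes can be sketched in a few lines of code. Everything below is illustrative, not a description of any vendor's actual product: the keyword "model" stands in for a real NLP intent classifier, and the 0.8 confidence threshold is an assumed cutoff that a production system would tune against real traffic.

```python
from dataclasses import dataclass

CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff; a real deployment would tune this

# Toy intent table: keyword matching stands in for a trained NLP classifier.
# Each entry pairs a canned answer with the model's confidence in that intent.
ROUTINE_INTENTS = {
    "appointment": ("Your next appointment is listed in the patient portal.", 0.95),
    "paperwork": ("Intake forms are available under 'Documents' in the portal.", 0.90),
}

@dataclass
class Reply:
    text: str
    handled_by: str  # "bot" or "human"

def respond(message: str) -> Reply:
    """Answer routine questions automatically; escalate everything else.

    The handoff message keeps the same conversation thread, so from the
    patient's perspective the interaction simply continues.
    """
    lowered = message.lower()
    for keyword, (answer, confidence) in ROUTINE_INTENTS.items():
        if keyword in lowered and confidence >= CONFIDENCE_THRESHOLD:
            return Reply(answer, handled_by="bot")
    # No confident match (e.g. a question about symptoms or a care plan):
    # route to a human staffer rather than guess.
    return Reply("Let me connect you with a member of our care team.",
                 handled_by="human")
```

In this sketch a scheduling question stays automated (`respond("When is my appointment?")` is handled by the bot), while a clinical question falls through to a human, which is the "balancing" both speakers call for.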