The following is a guest article by Bevey Miner, Executive Vice President, Healthcare Strategy and Policy at Consensus Cloud Solutions
Artificial intelligence (AI) is everywhere, and with so much hype around it, healthcare organizations are rightfully cautious about deploying AI solutions.
Although AI shows promise in its ability to transform healthcare through more efficient and accurate data capture and management, there is still much that we as an industry and society need to understand before scaling its use, especially when it comes to patient care. During a recent panel discussion at HIMSS24 in Orlando, Fla., held March 11–15, three panelists representing different types of healthcare technology shared their perspectives on the role of AI in healthcare and provided a real-world look at how, in a controlled setting, AI can be deployed and monitored to handle the complexity of care delivery.
Today, AI is at the heart of many technologies. A large subset of it deals with software that can imitate human behavior through functions like pattern recognition, natural language processing (which encompasses generative AI, one of the more popular applications of AI today), and machine learning, according to panelist Jeffrey Sullivan, Chief Technology Officer at Consensus Cloud Solutions.
Clinical and Administrative Uses of AI
AI isn’t new to healthcare, but how we’re using it continues to evolve, according to another panelist, Madelaine Yue, Vice President of Solutions Delivery at Experis Health Solutions. For instance, we are capturing data in a more systematic way that can be used for machine learning in both clinical and administrative settings.
On the clinical side, this can include analyzing data from patients’ EHRs to provide decision support interventions, a term introduced by the U.S. Department of Health and Human Services’ Office of the National Coordinator for Health Information Technology in its recently released HTI-1 final rule to account for how AI can inform clinical decisions. On the administrative side, AI can be used to automate manual processes, such as writing patient communications (which studies have found to be more empathetic than what clinicians draft) and consolidating information for “more efficient use of your human factor,” Yue said.
Population Health Insights and Patient-Centered Care
AI is also being used to gain insights about population health and provide more patient-centered, holistic care, according to panelist Mason Ingram, Director of Payer Policy at Premier. For example, some providers are using ambient AI to record provider-patient interactions during exams or telehealth appointments, enabling them to capture more specific and discrete information, which helps drive better outcomes. Meanwhile, predictive AI can help providers forecast costs and figure out how to deploy scarce resources.
Tangential to its use in population health is AI’s application in the clinical trial space. Clinical trials are often delayed by patient recruitment issues, and Ingram shared that they are now leveraging AI to match the most appropriate patients to studies.
One of Yue’s biggest concerns around the vast amount of data being collected is whether human input is being paired with AI output to make it actionable. Pointing to a study that compared the diagnostic accuracy of AI with that of human radiologists, Yue said the research found that AI flagged more false positives, whereas humans detected more complex conditions that warranted immediate treatment. Although AI helped maximize the efficiency of the radiologists, a human factor was still needed to catch the cases that required more rapid intervention.
“With AI-augmented intelligence supporting human intervention, clinicians can use predictive analytics to see which patients are at a higher risk of being readmitted to the hospital or falling, and based on those patterns, make decisions regarding types of treatment and therapy,” Yue explained.
These insights are also helping provider organizations, especially critical access and rural hospitals that may lack adequate staffing, remotely monitor patients and deploy appropriate interventions when they’re needed.
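As a rough illustration of the kind of predictive analytics Yue described, the sketch below fits a simple readmission-risk classifier. The feature names, synthetic data, and labels are illustrative assumptions, not a clinical model; the point is that a risk score can flag which patients a care team should review first, with a clinician still making the call.

```python
# Minimal sketch of a readmission-risk model of the kind described on the panel.
# All features and data here are hypothetical placeholders, not clinical guidance.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical features: age, prior admissions in past year, length of stay (days), fall-risk score
X = np.array([
    [72, 2, 5, 0.8],
    [55, 0, 2, 0.1],
    [81, 3, 7, 0.9],
    [44, 1, 3, 0.2],
    [67, 2, 4, 0.6],
    [38, 0, 1, 0.1],
    [79, 4, 9, 0.7],
    [59, 1, 2, 0.3],
])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])  # 1 = readmitted within 30 days (synthetic labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Risk scores help prioritize follow-up; a clinician still reviews each case.
for patient, risk in zip(X_test, model.predict_proba(X_test)[:, 1]):
    print(f"features={patient.tolist()} readmission_risk={risk:.2f}")
```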
Reducing Provider Burnout with Turnkey Solutions
The question of whether and where AI will be used in healthcare has shifted since last year to conversations around how it’s being implemented, according to Sullivan.
He pointed to the real and practical benefits of AI as a tool to lessen providers’ administrative workload, allowing them to spend a higher percentage of their time doing high-impact clinical work with their patients.
On average, physicians spend 1.84 hours per day beyond work hours completing EHR documentation, which adds up to 9.2 hours of after-hours work over a five-day week.
“Your strategy on AI must be much more nuanced and involved now because it’s everywhere,” Sullivan said. “When you think about how you’re bringing AI into use in your daily work, it’s about helping you be more efficient, more effective, not about doing your work for you or displacing you.”
Understanding which areas would be most impacted by AI will help end users optimize the technology. For example, ambient listening to transcribe clinicians’ notes is one way AI offers users a clear return on investment.
There are a lot of turnkey AI solutions that can unlock value for users, especially around administrative processes, according to Ingram. For example, AI can also play a role in formulating care plans for cases with “clear-cut clinical requirements,” such as appropriate use criteria for advanced diagnostic imaging.
Some healthcare organizations are using intelligent data extraction, which combines AI software with natural language processing and tools like digital fax to gather shareable information more fully. These solutions pull information from unstructured documents like handwritten notes, PDFs, scans, and images, and send it to clinicians and staff directly within their workflows, speeding access to care and avoiding delays in treatment that could potentially impact outcomes.
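As a rough sketch of how such an extraction step might hang together, the example below pulls a few fields out of unstructured referral text and hands them to a routing step. It assumes the faxed or scanned document has already been converted to plain text; the field patterns, sample text, and `route_to_ehr_inbox` function are hypothetical stand-ins, not any specific vendor’s API.

```python
# Minimal sketch of intelligent data extraction from unstructured referral text.
# Assumes OCR has already converted a faxed/scanned document to plain text;
# field patterns and the routing function are hypothetical placeholders.
import re

SAMPLE_TEXT = """
Referral for Jane Doe, DOB 04/12/1961.
Reason: suspected atrial fibrillation, requests cardiology consult.
Referring provider: Dr. A. Smith, fax 555-0123.
"""

FIELD_PATTERNS = {
    "patient_name": r"Referral for ([A-Z][a-z]+ [A-Z][a-z]+)",
    "date_of_birth": r"DOB (\d{2}/\d{2}/\d{4})",
    "reason": r"Reason: (.+)",
}

def extract_fields(text: str) -> dict:
    """Pull structured fields out of unstructured document text."""
    fields = {}
    for name, pattern in FIELD_PATTERNS.items():
        match = re.search(pattern, text)
        if match:
            fields[name] = match.group(1).strip()
    return fields

def route_to_ehr_inbox(fields: dict) -> None:
    """Placeholder for delivering extracted data into a clinician's workflow."""
    print("Routing to clinician inbox:", fields)

route_to_ehr_inbox(extract_fields(SAMPLE_TEXT))
```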
Despite the abundance of low-hanging fruit for AI use cases, Yue cautioned the healthcare industry not to be afraid of further progress. “For example, some EHR companies are beginning to leverage generative AI to more easily craft communications that are more patient friendly,” she said.
The Need for AI Governance Amid Concerns
It’s clear that AI is contributing to healthcare in meaningful ways. However, concerns abound around its use and the need for guidelines. Some of the primary concerns that panelists discussed involve the veracity of data used to train AI and whether that data introduces bias, as well as privacy and security issues.
Although the panelists expressed a need for greater clarity in what’s currently an “extremely nebulous regulatory environment,” they also cautioned that regulation, even if well-meaning, could have unintended consequences in what is a very nuanced and complex space.
“A really interesting policy/ethical question is, when is the greater good and the regulatory thing in tension with each other? How do we advance the greater good in a way that also preserves things like commercial interest or privacy rights?” Sullivan asked. “We must think about how to advance the state of healthcare in a way that’s both responsible and considers those limitations.”