The future of healthcare demands better AI data quality and a radical revision of the EHR, said Dr. John Halamka, President of Mayo Clinic Platform, as he described the challenges the healthcare industry faces with AI and EHR innovation and the next steps forward.
According to Halamka, AI has a credibility problem in healthcare. Physicians buying an AI algorithm do not know whether it was developed on data from people like their patient population. Without an understanding of an algorithm's utility, bias, or likelihood of performing as promised, credibility suffers. Halamka suggests that addressing AI transparency is necessary for physicians to have algorithms that fit their patient populations.
Halamka advocates for meaningful AI testability. Testability, he explained, ensures transparency and requires that AI fit specific purposes. A national and international framework is needed for this testability, so that any dataset or algorithm can be brought together for evaluation.
Explainability is a key, and much-debated, aspect of AI. However, most algorithms are probabilistic, multi-tiered mathematical equations that resist easy explanation, so full explainability may not be attainable.
Algorithms come from various sources, including academic health centers, companies, and big tech. Academic centers usually build on their own internal data for internal use, while companies create algorithms within a monetized bubble, aiming to provide value-generating services across large populations.
Currently, the industry sees government, academia, and industry stakeholders producing AI algorithms, but there are no national guidelines or guardrails for how these processes should proceed, Halamka explained.
As developers continue to create new algorithms, he emphasized the need to accompany advancements with guidance.
Data variability within communities must be addressed when using EHR data. When data sets are not adjusted to the populations they serve, applications produce errors. According to Halamka, there are three specific problems with EHR data sets.
The first of these is the fidelity of the data. When data elements of unknown provenance are constantly written into workflows, the accuracy of the recorded information comes into question.
The granularity of data elements is a challenge, particularly regarding race and ethnicity. For example, categorizing an individual as "Asian" rather than noting a specific country of origin results in a less accurate algorithm.
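A toy sketch of the granularity problem, using made-up patient records (the field names and categories here are illustrative, not from any real EHR schema): once origins are collapsed into one umbrella code, the subgroup differences an algorithm might need are no longer recoverable from the data.

```python
from collections import Counter

# Hypothetical patient records carrying a granular country-of-origin field
patients = [
    {"id": 1, "origin": "Japanese"},
    {"id": 2, "origin": "Indian"},
    {"id": 3, "origin": "Filipino"},
    {"id": 4, "origin": "Japanese"},
]

# Coarse coding maps every distinct origin to a single umbrella category
COARSE = {"Japanese": "Asian", "Indian": "Asian", "Filipino": "Asian"}

granular = Counter(p["origin"] for p in patients)
coarse = Counter(COARSE[p["origin"]] for p in patients)

print(granular)  # three distinct subgroups an algorithm could stratify on
print(coarse)    # one bucket; the subgroup signal is gone
```

The mapping is lossy by construction: no downstream model can recover from the coarse column what the granular column still held.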
Another concern is the significant amount of noise in the EHR data signal. When notes are recorded through scripts and templates, it can be hard for algorithms to separate the signal from the noise, Halamka explained.
According to Halamka, while the EHR aspires to be a platform, it is evolving as a transactional system. With FHIR and CDS Hooks, various aspects of interoperability, and information blocking provisions, an ecosystem of partners is forming at its core, contributing and using data for novel purposes.
EHRs are driven primarily by policy, such as the 21st Century Cures Act, yet policy change alone only goes so far in forcing players to take this next step. Halamka describes this situation as the perfect storm for innovation: aligning technology, policy, and culture.
Developing FHIR, APIs, RESTful transactions, and supporting policies is necessary. However, there is also a need for demand: a cultural expectation of value and utility.
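To make the RESTful pattern concrete, here is a minimal sketch of what FHIR's resource-oriented API looks like. The base URL is hypothetical and the Patient resource is a truncated, hand-written sample, not output from any real server; a real client would fetch the JSON over HTTPS rather than hold it inline.

```python
import json

# FHIR exposes resources at RESTful URLs: GET {base}/{resourceType}/{id}
BASE = "https://example.org/fhir"  # hypothetical server base URL
url = f"{BASE}/Patient/123"        # a real client would GET this over HTTPS

# A truncated FHIR R4 Patient resource, as such a server might return it
response_body = """
{
  "resourceType": "Patient",
  "id": "123",
  "name": [{"family": "Doe", "given": ["Jane"]}]
}
"""

patient = json.loads(response_body)
name = patient["name"][0]
print(" ".join(name["given"]), name["family"])  # prints: Jane Doe
```

The design point is that every resource type (Patient, Observation, Medication, and so on) follows the same uniform URL and JSON structure, which is what lets an ecosystem of third-party apps consume any conformant server.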
"I think we are approaching a point where there are enough apps and algorithms; maybe even your insurer will offer you a discount or special benefits if there are certain kinds of data you contribute...When that cultural change occurs, that is what is going to get us adoption," Halamka said.
Systems can be engineered to achieve a specific result. In the Meaningful Use era, Halamka recalled, physician burnout followed from the emphasis on population health: the system required physicians to record 141 pieces of information while simultaneously interacting with patients. That is the result the system was engineered for, he explained.
Currently, EHRs have FHIR interfaces and increased decision support, but that is not enough. A complete paradigm shift in how medical records are recorded is needed. Using NLP and machine learning to transcribe deep human conversations into discrete data elements is not enough either, he emphasized.
One of the challenges for this EHR rebuilding is regulatory complexity. There are requirements to ensure data integrity, prevent fraud and abuse, and meet meaningful use. Despite extraordinary strides in ambient listening and natural language processing, it is difficult to comply with the current regulatory framework.
"It’d be tough. We have to take a careful look at what we want to achieve and engineer what we want to achieve. That’ll require regulatory change," he said.
Now that there is market acceptance for EHR adoption, a radical revision of how it works is the next step, Halamka explained.