Healthcare Solved Interoperability. It Hasn't Solved Data Readiness. And Your Policies Are the Last Thing Anyone Is Cleaning Up.
HIMSS26 surfaced a hard truth: connecting healthcare data systems didn't clean what flowed through them. The document layer—clinical policies, compliance docs, protocols—is still broken.
At HIMSS26 last week, 40,000 health IT leaders arrived in Las Vegas and collectively admitted something: a decade of interoperability investment didn't solve the problem. It moved the mess. The phrase circulating through the conference: "Interoperability without data readiness is incomplete." Everyone nodded. Almost nobody extended it to where the problem actually hides.
The take
Healthcare spent fifteen years building pipes. Turns out nobody cleaned what flowed through them.
Christy Bricker, VP of Strategic Operations at Murj, put the data on the table at HIMSS26: in some cardiac clinic databases, error rates reach 50%. Only 40–60% of listed patients in those systems are truly active (Healthcare IT News). Her point was specifically about structured cardiac device data. But the logic extends everywhere.
When you connect two systems that both contain wrong information, you get connected wrong information. AI doesn't change that math. It scales it.
Here's what the HIMSS26 coverage is not saying: this is not only a structured data problem. Clinical policies, compliance documents, procedure manuals, consent forms, regulatory guidance — these are the unstructured layer that AI reads before it acts. And nobody has a cleanup workflow for them. They accumulate over years. Sections get amended and the old version stays. Policies contradict each other across departments. The compliance team's documentation references a regulation that was updated eighteen months ago.
When AI is grounded in that documentation, the model is not the problem.
What HIMSS26 actually revealed
The numbers from this conference are striking in combination. 57% of health systems rank AI as their top technology priority (Becker's Hospital Review, March 24, 2026). At the same time, only 15% report having their data fully prepared for large-scale AI use (Med Tech Solutions). That gap — 57% wanting AI, 15% ready for it — is what HIMSS26 was about.
AI governance awareness in healthcare jumped from 40% to 70% in the past year (HFMA). That's the industry recognizing it has a governance problem. What's still missing from the conversation is which layer governance needs to reach.
The HIMSS26 coverage is dominated by EHR data quality, claims data, cardiac device records. Real problems, all of them. But the conversations happening on the clinical floor are guided by something else: policy documents, patient protocols, onboarding materials, department-level compliance guidance. Clinicians look up procedures. AI chat interfaces return answers grounded in whatever documentation the health system has uploaded. If that documentation is three years out of date, or contradicts itself across floors, the AI is accurate about the wrong thing.
We've written before about what happens when clinical AI reads from ungoverned documentation. ECRI named AI the number one patient safety risk this year. The model behavior is the symptom. The inputs the model trusts are the cause.
The document layer nobody is cleaning
Every health system deploying AI should be asking two questions. Most are only asking one.
The first: is our EHR data accurate, normalized, and ready? HIMSS26 pushed this hard. Some organizations are finally starting that work.
The second: are the policies and compliance documents our AI is grounded in accurate, contradiction-free, and current?
That question almost never comes up. It should, because document decay in healthcare is structural. Staff turns over and nobody updates the onboarding manual. A department revises its infection control protocol but the old version stays on the shared drive. Regulatory updates arrive; the compliance team notes the change internally but the guidance document on the portal sits untouched. Over time, a health system's document layer accumulates quiet inaccuracies — none catastrophic on their own, all of them problems when an AI is answering clinical queries against them at scale.
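The staleness problem described above is mechanical enough to audit. Here is a minimal sketch, assuming a hypothetical inventory where each policy document records when it was last reviewed and which regulations it cites (all names, IDs, and dates below are invented for illustration — no real health system schema is implied):

```python
# Hypothetical sketch: flag policy documents whose referenced regulation
# was revised after the document's last review. All field names, document
# titles, and regulation IDs are invented examples.
from dataclasses import dataclass
from datetime import date

@dataclass
class PolicyDoc:
    name: str
    last_reviewed: date
    references: list[str]  # regulation IDs the document cites

# Assumed lookup: regulation ID -> date of its most recent revision
regulation_updated = {
    "REG-482": date(2024, 9, 1),
    "REG-164": date(2023, 2, 15),
}

def stale_documents(docs: list[PolicyDoc]) -> list[tuple[str, str]]:
    """Return (document, regulation) pairs where the cited regulation
    changed after the document's last review date."""
    flags = []
    for doc in docs:
        for reg in doc.references:
            updated = regulation_updated.get(reg)
            if updated and updated > doc.last_reviewed:
                flags.append((doc.name, reg))
    return flags

docs = [
    PolicyDoc("Infection Control Protocol", date(2023, 1, 10), ["REG-482"]),
    PolicyDoc("Consent Form Guidance", date(2025, 3, 1), ["REG-164"]),
]
print(stale_documents(docs))
# Flags the infection control protocol: its regulation was revised
# eighteen months after the document's last review.
```

The point of the sketch is not the code but the prerequisite it exposes: you can only run a check like this if someone is tracking review dates and regulatory references in the first place, which is exactly the metadata most document repositories lack.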
Platforms designed for this — governed, auditable knowledge bases with contradiction detection and automated maintenance — address what HIMSS26 is calling the "data readiness wall" at the document level. Mojar AI provides that infrastructure for the unstructured layer: scanning for contradictions across uploaded policies, flagging outdated content, maintaining source attribution on every answer. This is also the thread running through what HIMSS26 surfaced around Epic and ECRI — clinical AI infrastructure is maturing fast, and governance is still scrambling to keep up.
The question for this week
Healthcare CIOs are heading home from Las Vegas with data readiness on their agendas. Good. But the full agenda item should read: we have a plan for EHR data readiness. What's our plan for clinical policy and compliance document readiness?
Most teams don't have an answer. That's the problem HIMSS26 didn't name.