Wednesday, February 11, 2026

< + > Epic Hosting in the Public Cloud

The question of where to host Epic is one of the most important decisions a hospital or health system makes. It's hard to argue that any system is more important to the operations of a healthcare organization than its EHR. For the longest time, the Epic hosting decision was easy. Everyone hosted Epic in their own data centers. As time has gone on, many organizations started moving Epic to various private cloud environments, including an Epic-hosted cloud option. Now we have some organizations choosing to host Epic in the public cloud.

A little over a year ago, we hosted an episode on our Healthcare CIO Podcast discussing the Epic Cloud Migration along with Michigan Medicine’s decision to go all public cloud.  Our guest on that podcast episode was Dr. Tim Calahan.  He recently decided to leave his position as CTO at Michigan Medicine to work full time as the Founder and Managing Member at EHC Consulting which focuses on hosting healthcare solutions like Epic in the public cloud.

In the interview below, we learn more from Dr. Calahan about his decision to work at EHC Consulting full time.  Plus, we dive into hosting Epic on the public cloud along with his experience moving other applications to the public cloud.

Tell us a little about yourself and EHC Consulting.

Dr. Tim Calahan: I began my technology career in the U.S. Marine Corps within the Judge Advocate General’s Office, where I was famously told to “fix the computer.” That directive launched a career spanning more than three decades in healthcare technology. Over that time, I’ve been privileged to witness — and help lead — the evolution from on-premises computing to modern cloud-based healthcare ecosystems. That constant state of change is what continues to energize and motivate me.

We founded EHC Consulting to address a clear gap we’ve seen across the healthcare industry over the past decade. Many health systems recognize the strategic advantages of the public cloud — agility, resilience, scalability, and innovation — but lack a clear roadmap, governance model, and execution playbook to migrate complex clinical workloads safely and effectively.

At EHC, our core focus is helping organizations move Epic EHR to the public cloud, but our expertise extends well beyond that. We also support migration and modernization of Epic third-party applications, imaging platforms, analytics environments, and general enterprise workloads. Our goal is not just to move technology, but to help organizations transform how they deliver care through modern infrastructure.

Why did you decide to leave your position as CTO at Michigan Medicine and go full time at EHC Consulting?

Dr. Tim Calahan: I am immensely proud of what our team accomplished at Michigan Medicine. During my tenure, we defined a comprehensive cloud strategy, began executing it at scale, and — most importantly — successfully migrated Epic to the public cloud, which is a significant milestone for any academic medical center.

Equally important, we built a strong, capable technology leadership team that I trust deeply to continue this journey. The organization is in excellent hands.

Ultimately, my decision comes down to impact. While I was able to drive meaningful change at Michigan Medicine, EHC Consulting allows me to bring that same experience, expertise, and approach to multiple health systems nationwide. By focusing full time on EHC, I can help accelerate cloud transformation across the broader healthcare ecosystem — which I believe is where I can have the greatest positive effect.

What are some of the big lessons learned while CTO at a health system when it comes to IT infrastructure?

Dr. Tim Calahan: There are three key lessons that stand out from my time as CTO at Michigan Medicine:

First, traditional on-premises infrastructure is increasingly inadequate for modern healthcare. Legacy architectures struggle to support real-time analytics, interoperability, AI, and the scale of data that today’s clinical and research environments demand. These limitations don’t improve over time — they compound.

Second, cloud transformation is not a quick project; it is a multi-year journey that requires disciplined leadership, patience, and trust in the long-term value of cloud. Organizations that lack consistent executive sponsorship or strategic clarity often stall or backslide. At Michigan Medicine, we were fortunate to have strong, unwavering leadership — particularly from Dr. Marshall Runge — which was critical to our success.

Third, your partners matter. Selecting the right cloud provider and systems integrators is one of the most consequential decisions a health system can make. You need partners who deeply understand both healthcare and large-scale cloud transformation, not just generic IT migration.

As CTO you decided to go all in on public cloud — what were the pros and cons of that decision?

Dr. Tim Calahan: Going all in on the public cloud was a straightforward decision for me, and one I would make again without hesitation.

The benefits are extensive: improved reliability, faster innovation, better security posture, elastic scaling, and the ability to integrate modern data and AI capabilities that are simply impractical in traditional data centers.

The primary “con” isn’t technical — it’s cultural. A transformation of this magnitude inevitably creates resistance. Some stakeholders are understandably cautious, and some incumbent vendors are invested in preserving the status quo. Throughout our journey, we encountered skepticism, fear, uncertainty, and doubt from various corners of the organization and industry.

Strong leadership and a clear vision were essential. We had to consistently remind people why we were doing this: to modernize care delivery, improve resilience, and position Michigan Medicine for the future of digital health.

Why are you so bullish on Epic in the public cloud?

Dr. Tim Calahan: I’ve been working on moving Epic to the cloud for nearly a decade, and I’ve seen firsthand how transformative it can be.

At a surface level, Epic performs well in the cloud, can be more cost-effective to operate, and allows for more flexible capacity planning. But the deeper benefit is organizational.

When Epic is delivered via cloud and managed services, IT teams are freed from routine operational maintenance and can focus more on innovation, clinical collaboration, and strategic initiatives that directly impact patient care. I’ve seen this shift dramatically change how IT functions within health systems — from infrastructure caretakers to strategic enablers.

That track record of real, measurable transformation is why I remain so confident in Epic’s future in the public cloud.

How have you seen Epic evolve in its approach to the public cloud?

Dr. Tim Calahan: When we first explored moving Epic out of traditional data centers, it was considered nearly impossible — and Epic initially told us as much.

Over time, through collaboration, engineering investment, and persistence across the industry, that mindset changed. What began as experimental architecture evolved into a validated, scalable model.

Today, Epic has fully embraced the public cloud. They’ve developed strong engineering capabilities, standardized best-practice architectures, infrastructure-as-code frameworks, and operational playbooks tailored specifically for cloud environments.

A great example of this evolution is Epic’s Cogito platform on Microsoft Fabric, which represents a truly cloud-native analytics strategy. This shift reflects a broader recognition that the consistency, scalability, and innovation velocity of the public cloud align better with Epic’s long-term roadmap than fragmented on-premises environments.

Are you concerned about cloud vendors raising prices once organizations are locked in? What can be done to mitigate that risk?

Dr. Tim Calahan: One of the advantages of the public cloud market is that it is competitive. There are three major cloud providers, and none can dramatically raise prices without risking significant customer migration.

While moving Epic between clouds is complex, it is absolutely feasible — and can be done without clinical disruption. That reality keeps pricing in check. If one provider acted in bad faith, others would quickly step in with incentives to attract customers.

Health systems can also hedge risk by designing architectures that are not overly dependent on proprietary services, negotiating strong contracts, and maintaining strategic flexibility. The key is thoughtful cloud governance, not avoidance of cloud altogether.

How do you see “AI at the edge” fitting with a public cloud-first strategy?

Dr. Tim Calahan: There is a lot of discussion about AI today — much of it still theoretical. What we do know is that effective AI depends fundamentally on data. Organizations that have centralized their data in the cloud are far better positioned to take advantage of AI at scale.

As AI tools mature, they will require ongoing monitoring, governance, and validation to ensure outputs remain accurate, ethical, and clinically reliable. That kind of oversight is far easier to manage from a centralized, cloud-based platform than from fragmented on-premises environments.

I do believe we’ll see more AI user interfaces and applications deployed at the edge — for example, in clinical workstations or medical devices. But the underlying data, analytics, and computational infrastructure will remain best suited for the cloud. In that sense, “AI at the edge” complements — rather than contradicts — a cloud-first strategy.



< + > Data That’s Timely and Insightful from PointClickCare

Avoiding readmissions after acute care is just one manifestation of the move to value-based care, according to Shweta Shanbhag, Director of Product Management at PointClickCare, in a recent interview with Healthcare IT Today. She points out that at least one out of every five Medicare acute stays results in admission to a skilled nursing facility, which makes it important for the different care teams involved to work together.

She recommends that the various providers who are partnering in value-based care agree on a small set of shared metrics. These feed into shared goals of reducing readmissions and improving care.

Shanbhag has two major recommendations regarding data sharing. First, it must be timely so that care teams have time to understand which patients to prioritize and what risk factors to examine. If it arrives weeks or even days later, she calls it “documentation” rather than a tool for improving care. Data on readmissions is a “lagging metric” because it comes too late to change the outcome for that patient.

Second, data must be presented in a way that generates insights—not just “dumped on staff that is already stretched thin.” Data must capture the whole episode, not just information from one setting or point in time.

PointClickCare offers a service that helps catch problems early, called the Predictive Return to Hospital Model.

She also recommends that partners meet every one or two weeks to look at what has happened and which providers had the most positive results. The term “visibility” came up repeatedly in the interview to summarize effective data sharing.

In general, Shanbhag sees value-based institutions moving from reactive to proactive care and making post-acute care more strategic, not just an afterthought.

Check out our interview with Shweta Shanbhag from PointClickCare to learn more.

Learn more about PointClickCare: https://pointclickcare.com/

Listen and subscribe to the Healthcare IT Today Interviews Podcast to hear all the latest insights from experts in healthcare IT.

And for an exclusive look at our top stories, subscribe to our newsletter and YouTube.

Tell us what you think. Contact us here or on Twitter at @hcitoday. And if you’re interested in advertising with us, check out our various advertising packages and request our Media Kit.

PointClickCare is a proud sponsor of Healthcare Scene.



< + > Health Plan AI Has a Data Problem, and It’s Costing CFOs More Than They Think

The following is a guest article by Megan Schmidt, President and Chief Executive Officer at Madaket

Health plans are pouring capital into automation, analytics, and artificial intelligence, betting that smarter systems will lower administrative costs and improve operational performance. Yet a growing body of evidence suggests many of those investments are stalling, not because the technology is immature, but because the underlying provider data is fundamentally unreliable.

The situation has been described as a “house of cards,” with inconsistent and outdated provider data preventing health plan AI initiatives from ever reaching scale. That description reinforces a basic truth that has long haunted enterprise technology: AI systems cannot outperform the quality of the data they rely on.

What has received far less attention, however, is the financial implication.

For payer CFOs, unreliable provider data is not simply an IT challenge. It is a persistent, compounding cost center that quietly inflates administrative expenses, drives claims rework, and undermines the return on digital transformation investments.

The Invisible Cost Behind Provider Data Chaos

Every health plan maintains provider information across a sprawling ecosystem of systems, including enrollment, credentialing, directories, claims, network management, delegated entities, and external vendors. When a provider updates a specialty, location, tax ID, or affiliation, that change often must be verified and re-entered multiple times across disconnected platforms.

This work rarely appears as a discrete line item on a budget. Instead, it is embedded across departments and absorbed as routine operations. Teams spend time reconciling conflicting records, correcting downstream errors, responding to directory inaccuracies, and managing avoidable provider and member disputes.

Because these costs are distributed, they are rarely measured directly, even though they recur continuously. For CFOs under pressure to reduce SG&A while supporting growth initiatives, provider data represents a blind spot with real margin impact.

Why AI Is Exposing the Problem, Not Solving It

AI initiatives are forcing health plans to confront a long-standing operational reality: provider data is not standardized, synchronized, or authoritative.

Machine learning models can optimize workflows, flag anomalies, and automate decisions, but they cannot compensate for conflicting versions of provider truth. When underlying data varies across systems, AI outputs become unreliable or misleading. In practice, many organizations find that automation initiatives stall in pilot mode, staff revert to manual validation to double-check results, and new tools are layered on top of broken data foundations.

From a finance perspective, this dynamic is particularly painful. Capital is deployed, vendors are onboarded, and internal teams are mobilized, yet the expected efficiency gains fail to materialize. The issue is not that AI does not work. It is that the economics collapse when the data feeding it is fragmented.

The CFO Angle: Provider Data as an “Admin Tax”

Most payer CFOs can track medical cost trends, utilization, and claims performance in granular detail. Far fewer can quantify how much their organization spends each year maintaining provider data accuracy.

Yet poor provider data drives tangible financial consequences that directly affect margins:

  • Preventable claims rework and denials that increase administrative expense
  • Delayed provider onboarding that slows revenue realization and network growth
  • Directory inaccuracies that create compliance exposure and reputational risk
  • Labor costs that scale with volume because manual correction never disappears

In effect, fragmented provider data functions as an “admin tax,” a recurring operational expense that grows as plans add members, expand networks, or acquire new entities. Unlike medical costs, however, this tax is not inevitable.

From Fixing Data to Fixing the System

Industry attention is beginning to shift away from point solutions that clean up provider data after problems surface and toward infrastructure approaches designed to prevent fragmentation at the source.

This emerging model treats provider data less like static records and more like a continuously updated supply chain, where validated updates are normalized once and automatically synchronized across participating systems. Instead of multiple teams chasing the same information, organizations operate from a shared, authoritative provider record that stays current over time.
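This normalize-once, fan-out pattern can be sketched in a few lines of code. The snippet below is purely illustrative (the class, field, and method names are invented for this example, not drawn from any vendor's actual API): a single hub validates each update one time, stores it as the authoritative record, and pushes it to every subscribed downstream system.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ProviderRecord:
    npi: str          # National Provider Identifier
    specialty: str
    location: str
    tax_id: str

class ProviderDataHub:
    """Single authoritative store; downstream systems subscribe to changes."""

    def __init__(self):
        self._records: dict[str, ProviderRecord] = {}
        self._subscribers: list[Callable[[ProviderRecord], None]] = []

    def subscribe(self, callback: Callable[[ProviderRecord], None]) -> None:
        # Credentialing, claims, and directory systems each register once,
        # instead of every team re-keying the same change by hand.
        self._subscribers.append(callback)

    def apply_update(self, record: ProviderRecord) -> None:
        # Normalize once (trivial trimming/title-casing stands in for real
        # validation), then fan the change out to all participating systems.
        record.specialty = record.specialty.strip().title()
        self._records[record.npi] = record
        for notify in self._subscribers:
            notify(record)

    def lookup(self, npi: str) -> ProviderRecord:
        return self._records[npi]
```

The design choice worth noting is that correction happens at the source, not downstream: every subscriber receives the same already-validated record, so there is no second system of truth to reconcile later.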

For CFOs, the appeal is straightforward. Fewer manual touches reduce operating expense. Auditability improves. Compliance confidence increases. Most importantly, the infrastructure scales without requiring proportional headcount growth.

Just as critically, this approach restores the economics of AI by giving automation a stable and trustworthy foundation on which to operate.

Why This Matters Now

With margins tightening, regulatory scrutiny increasing, and AI budgets under closer examination, health plans are being forced to justify not just innovation, but measurable outcomes.

Provider data may not be the most visible challenge in payer operations. But as AI initiatives continue to expose its weaknesses, it is becoming increasingly clear that fragmented provider data is not a technical inconvenience.

It is a financial liability hiding in plain sight.

For CFOs looking for sustainable efficiency gains, the question may no longer be whether to invest in AI, but whether their provider data infrastructure is capable of supporting it at all.

About Megan Schmidt

Megan Schmidt is based out of Grand Rapids, Michigan, and is the President and Chief Executive Officer at Madaket. With deep experience at some of the nation’s largest integrated care delivery systems, Schmidt previously held senior executive leadership roles at HealthPartners and Corewell Health, which includes healthcare providers and its own health plan, Priority Health.



< + > This Week’s Health IT Jobs – February 11, 2026

It can be very overwhelming scrolling through job board after job board in search of a position that fits your wants and needs. Let us take that stress away by finding a mix of great health IT jobs for you! We hope you enjoy this look at some of the health IT jobs we saw healthcare organizations trying to fill this week.

Here’s a quick look at some of the health IT jobs we found:

If none of these jobs fit your needs, be sure to check out our previous health IT job listings.

Do you have an open health IT position that you are looking to fill? Contact us here with a link to the open position and we’ll be happy to feature it in next week’s article at no charge!

*Note: These jobs are listed by Healthcare IT Today as a free service to the community. Healthcare IT Today does not endorse or vouch for the company or the job posting. We encourage anyone applying to these jobs to do their own due diligence.



Tuesday, February 10, 2026

< + > Is it a Tech Problem or a Policy Problem in Value-Based Care?

We’ve been working to move towards value-based care for a while now, but there are still some kinks we need to work out to have it run the way we all want it to. One such kink is determining whether the problems we are having in value-based care are on the tech side of healthcare or whether they are policy issues.

We reached out to our brilliant Healthcare IT Today Community to ask — how much of value-based care is a tech problem and how much is a policy problem? The following are their answers.

Jay Ackerman, CEO at Reveleer
Policy sets the direction for value-based care (VBC), but the real barriers are largely operational and technological. While the vision is clearly policy-driven, most challenges come down to fragmented systems, inconsistent data practices, and outdated processes that make execution difficult. In that sense, it’s less a question of motivation and more a matter of building the right tools and workflows to make the policy goals achievable. Policy defines what will happen, but technology determines whether it can happen at scale.

Kevin Riley, Co-CEO at Tendo
Value-based care is fundamentally both a technology and a policy challenge—and success requires progress on both fronts. Policy establishes the incentives and frameworks that define what “value” means, but technology determines whether those goals can be operationalized at scale. Policies may mandate coordinated, outcome-driven care, but without interoperable systems, data liquidity, and user-friendly workflows, even the best-designed models falter.

Sanjeev Menon, Head of Provider Solution at Ubie
As much as I’d love to say technology can solve everything, policy is the biggest barrier to moving to a true value-based system. Technology evolves to meet needs, and needs are defined by incentives. Today’s healthcare ecosystem has too many misaligned incentives, so in many ways, tech is exacerbating the biggest challenges in VBC. Payers and providers are locked in the Coding Wars with payers’ upcoding & revenue cycle management tools forcing insurers to adopt ever more draconian downcoding tools – and vice versa.

Ultimately, VBC comes down to paying for quality, which depends on two policy questions: 1) who pays? and 2) what’s quality? Tech only helps once we answer those questions.

Taylor Beery, Co-Founder and Chief Innovation & Administrative Officer at Imagine Pediatrics
Value-based care is ultimately about the impact of improving outcomes and experiences for the patients we serve. Across the US health system, we find ourselves in a moment where providers and health plans can achieve much better outcomes than at any other time in history with the right technology.

The role of policy here is to ensure the right incentives and structure for tech-enablement, dismantling the fragmentation that has stood in the way of access to integrated and personalized care for far too long. Policy changes should start with vulnerable populations, like children with special healthcare needs. Those policy changes should align the systems, data, and incentives around the lived experience of patients and opportunities for impact, not just codes and transactions.

Susan Lofton, MPT, VP, Outcomes and Clinical Transformation at WebPT
Value-based care is roughly 60% policy and 40% tech. While policy mandates the shifts and defines the rules and requirements, technology is what enables providers to execute, measure, and report under those rules. Why 60% policy and 40% tech? You can’t ‘tech’ your way around bad policy, but even good policy fails without the right tools.

Steve Holt, Vice President, Government Affairs at PointClickCare
Value-based care (VBC) is less a single challenge and more a policy–technology alignment problem. The policy framework defines what outcomes and incentives matter, while technology determines whether and how those outcomes can be measured, shared, and acted upon in real time. Many healthcare organizations today, especially those that received HITECH funding, are technologically equipped to participate in value-based care.

However, the roadblocks to VBC come in the form of inconsistent alignment of federal and state definitions of “value,” varied data-sharing standards and requirements across payers, and the administrative complexity of participation. For example, post-acute and long-term care providers are largely excluded from the incentive programs that would fund the connectivity they need to participate fully in VBC. Revisiting state and federal quality incentive programs that focus on technology adoption and technology-driven outcomes would be a significant catalyst for increasing VBC adoption across the care continuum.

Linda Leigh Brock, Vice President of Product Management at NASCO
Value-based care (VBC) started as a policy-driven innovation, but its biggest barriers today are technological. Policy can mandate change, yet true progress depends on having the right data, analytics, and workflows to make value-based models work. Adoption has been slow due to the lack of longitudinal data, action-ready clinical workflows, and transparent economics.

Technology, especially predictive analytics, must enable better patient engagement, proactive interventions, and standardized outcome tracking. However, applying advanced tools to outdated fee-for-service (FFS) processes risks reinforcing the wrong system. Real success requires using technology to build new, purpose-built value-based care workflows rather than optimizing legacy ones.

Julie Sacks, CEO at Home Centered Care Institute
In my opinion, value-based care needs serious policy change. While technology can support care coordination, data sharing, and remote monitoring, it’s policy that determines who can participate, how care is reimbursed, and whether incentives are aligned across the continuum. Smaller practices, especially those serving frail, elderly, and homebound patients, often lack the scale or infrastructure to thrive under current models. Without inclusive payment pathways (like High Needs ACO REACH), these providers risk being left out of the value-based care movement.

To truly realize the promise of value-based care, we need policy reform that prioritizes flexibility, inclusivity, and outcomes that matter to patients.

Lucienne Ide, Founder & Chief Executive Officer at Rimidi
Value-based care is both a policy design challenge and an implementation and execution problem, and it’s not possible to succeed without addressing both. Policy sets the incentives and guardrails, while technology and workflow redesign make them operational at the point of care. Over the past few years, CMS policy has clearly moved in the right direction with decisions to encourage care management activities — such as RPM and CCM — and to introduce models that reward outcomes-focused care rather than episodic encounters.

Where organizations stumble isn’t a lack of policy; it’s the translation layer. Many core barriers stem from technology gaps: fragmented data, limited interoperability across EHRs and HIEs, poor care coordination workflows between specialists and primary care, and underdeveloped analytics that don’t operationalize risk.

Policy is perhaps 30-40% of the problem today. The remaining 60-70% is execution, largely getting the right data in the right workflow for the right clinician at the right time, and doing it consistently enough to improve readmissions, adherence, and total cost of care. You can count on policy to open the door, but technology determines whether providers actually walk through it.

Frank Vega, CEO at The Efficiency Group
Value-based care breaks down when policy goals and operational reality don’t meet. In these situations, technology is often deployed to address the “disconnect” when the real issue is the operational process executing the policy. The policy – and the process – need to be clear in order for the technology to be effective. The intent is there, but without clean workflows, structured data, and automation, even the best policies and technologies stall before they reach the patient.

Mary Sirois, Senior Vice President, Strategic Solutions at Nordic
Getting value-based care right is not a matter of whether tech or policy matters more, but rather, which must come first. Value-based care (VBC) succeeds or fails on an organization’s ability to understand and manage risk, and that can be heavily—but not entirely—influenced by policy. A health system’s ability to absorb shifts in policy (coverage, performance metrics, stability, tariffs, payment models, etc.) while maintaining responsible stewardship for quality of care, outcomes management, and understanding cost across the continuum of care activities is a prerequisite for VBC success.

Tech, on the other hand, is an accelerator and an enabler. It can speed up and enable progress, but it doesn’t fix weak cost discipline, poor quality data, or suboptimized workflows that fail to capture or utilize data to best measure and perform against the contract, and its efficiency gains alone aren’t a substitute. If a health system can’t manage unit costs and operational complexity, it won’t survive in a risk-bearing model, full stop.

VBC is good in theory, but it can be a losing bet for many mid-to-small hospitals right now as payer mix shifts toward Medicare, Medicaid churn pushes people into the uninsured bucket, and supply tariffs drive up input costs. Technology can absolutely help, but only after leaders confront the macroeconomics.

David Snow, CEO at Cedar Gate Technologies
On the policy side, the first mandatory alternative payment model is launching in 2026 (the Transforming Episode Accountability Model, or TEAM) and experts agree that additional mandates are likely on the horizon. Until now, value-based care has been largely voluntary, and organizations that didn’t see it as profitable enough could simply continue to operate in fee-for-service models. As the government shifts toward mandatory participation, commercial insurers are expected to follow CMS’s lead.

On the technology side, success in value-based care really hinges on the ability to bring data together in a cohesive, interoperable way—a challenge that has long plagued healthcare, but health IT systems are now capable of addressing in a meaningful and effective way.

Shitang Patel, VP, Payers at CitiusTech
Value-based care is less a pure technology or policy problem and more a systemic operating-model problem. Technology has contributed to fragmentation. Organizations have built “a quilt of patchwork” across Stars, HEDIS, readmissions, risk adjustment, and population health, each with its own tools, dashboards, and workflows that don’t always talk to each other.

At the same time, policy complexity from CMS and private payers has created overlapping and sometimes contradictory requirements, such as demanding value-based outcomes while still measuring physicians on RVU productivity. Prior authorization and payer-specific barriers further impede standardized care pathways and add friction to clinical workflows.

The real divide is not tech vs. policy; it’s that both have evolved independently without a unified governance model. We built scaffolding for a new care paradigm, but forgot the concrete. Until incentives, workflows, and data flows are aligned, both technology and policy will continue to underdeliver.

What great insights here! Huge thank you to everyone who took the time out of their day to submit a quote to us! And thank you to all of you for taking the time out of your day to read this article! We could not do this without all of your support.

How much of value-based care do you think is a tech problem, and how much do you think is a policy problem? Let us know over on social media; we’d love to hear from all of you!



< + > How Nordic Modularity is Rebuilding the Healthcare AI Stack

The following is a guest article by Andreas Cleve, Co-Founder and CEO at Corti

Nordic-Style Modularity is Rebuilding Healthcare AI with Shared, Trusted Building Blocks; This Foundation Accelerates Deployment and Strengthens Care at Scale

In Denmark, we grow up with a simple lesson: systems matter.

It’s why our turbines tolerate winter storms, why our insulin pens deliver such quiet precision, and why small plastic bricks still snap together flawlessly six decades on. Danish engineering has long favored modularity – designing parts that interlock cleanly, so the next layer becomes easier, not harder, to build.

That same mindset is reshaping how we think about healthcare AI.

Unblocking the Problem

Healthcare faces a structural challenge: demand for care is outpacing the supply of trained clinicians. The world is aging, chronic disease is rising, and every system is struggling to staff adequately.

AI is often presented as the solution. Yet those building AI for healthcare encounter the same obstacle: the infrastructure required to use it safely is far harder to build than the AI itself.

Clinical data is scattered across formats and systems. Regulations are essential but heavy. And general-purpose AI still fails to grasp the nuance of clinical language and reasoning. Too often, teams spend months constructing the regulatory scaffolding and clinical context that should already exist before they can even begin solving the actual problem at hand.

This bottleneck slows innovation in an industry that simply cannot afford delay.

The Modular Approach

What healthcare needs is not more standalone AI tools, but a foundation – a set of reliable building blocks that let teams construct applications without reinventing the entire technical and regulatory stack each time.

Imagine the healthcare AI ecosystem built more like a LEGO set: speech recognition that truly understands clinical dialogue; models trained for medical reasoning; privacy, governance, and compliance handled at the core; deployment options that meet stringent local requirements. A system where these components connect through interoperable standards, allowing builders to combine them however a particular workflow or clinical setting demands.

Such modularity would let teams focus on their specific use case – whether that’s documentation, specialty-specific decision support, or operational efficiency – instead of rebuilding the same foundations repeatedly. And as more applications are deployed, the underlying platform becomes stronger, safer, and more efficient for the next wave of builders.

This is how infrastructure compounds: not by chasing novelty, but by refining the parts so others can assemble them into something new.

Making the Possible Practical

The shift underway in healthcare AI is a move from hand-tooled pilots to composable systems that can be deployed broadly and safely. Today, it’s not unusual for an AI healthcare application to take a year or more to reach production. A modular foundation can compress that timeline dramatically – not by cutting corners, but by eliminating redundant work.

Speed matters because delays don’t just stall innovation; they ultimately affect patients. When clinicians have access to tools that are trustworthy, contextually intelligent, and able to scale across diverse patient populations, infrastructure becomes something closer to a public-health intervention than a technical convenience.

Europe’s Industrial Advantage

Europe has a long tradition of building regulated, precision infrastructure – from energy grids to medical devices – that other industries rely on. In Europe, regulation is sometimes framed as a barrier to innovation, but it can also be an advantage: a shared language for safety, trust, and interoperability.

In healthcare AI, that foundation matters. Systems built to meet Europe’s highest standards naturally extend well to other regions, proving that safety and scalability don’t have to be opposing forces.

Building Forward

AI will not replace clinicians. But strong infrastructure can extend their reach – the way great design tools expand what’s possible for architects or engineers. It can help close the widening gap between the care people need and the capacity health systems can provide.

The next generation of healthcare innovation won’t be defined by who builds the flashiest application or the largest model. It will be shaped by those who invest in the underlying systems: dependable, interoperable, thoughtfully engineered building blocks that unlock a thousand different possibilities.

For Denmark – and for Europe – this isn’t a new idea. It’s a continuation of a tradition: solving complex problems by designing systems that work beautifully together.

About Andreas Cleve

Andreas Cleve is Corti’s Co-Founder and CEO. After spending nearly a decade working as a multi-entrepreneur in AI, Andreas founded Corti with Lars Maaløe, pioneering a safe and effective Generative AI platform for healthcare. Corti’s AI not only takes notes but also quality assures, journals, codes, nudges, prompts, and documents every patient interaction. With significant research findings in speech processing, dialectic challenges, medical coding, and language understanding, Corti’s artificial intelligence enhances real-time consultations across the entire patient journey throughout the United States and Europe.



< + > Healthcare Triangle, Inc. Signs Definitive Agreement with Teyame AI LLC | Kodiak Solutions Acquires BESLER

Check out today’s featured companies who have recently completed an M&A deal, and be sure to check out the full list of past healthcare IT M&A.


Healthcare Triangle, Inc. Signs Definitive Agreement with Teyame AI LLC, Which is Forecasted to Generate $38M in Incremental NTM Revenue and Incremental NTM EBITDA of $5M in Addition to Expanding its SaaS Footprint in Europe and Latin America

Healthcare Triangle, Inc., a leader in digital transformation solutions for healthcare and life sciences, today announces that it has entered into a Definitive Agreement with Teyame AI LLC, a St Kitts and Nevis corporation, as part of its planned acquisition of the shares of Teyame 360 SL and Datono Mediacion SL, companies incorporated in Spain, which are run together as a Spain-based leader in AI-powered omnichannel customer experience (CX) solutions. This acquisition would position the Company as a global force in AI-powered customer and patient engagement.

The proposed transaction contemplates up to approximately $50 million of total consideration, consisting of a combination of cash, shares of the Company’s common stock, shares of non-voting convertible preferred stock, and contingent earnout-based equity consideration, and anticipates closing the transaction on January 29, 2026, subject to the required shareholder approval, and other customary closing conditions. Notwithstanding the closing timeline, the parties agreed that the transaction contemplated by this Agreement shall be deemed effective as of January 1st, 2026. This communication does not constitute a solicitation of any proxy, vote, or approval.

Based on financial information the Company has received from Teyame, the Assets generated approximately $32 million in incremental annual revenue and approximately $3.6 million in incremental EBITDA for fiscal year 2025. The planned acquisition represents a pivotal moment in HCTI’s evolution from healthcare IT provider to comprehensive digital innovator and is expected to significantly enhance HCTI’s financial performance and shareholder value.

“The transaction will bring real-world lived experience of Agentic Gen AI and is about to change the game for HCTI. It’s where the rubber meets the road in AI,” added David Ayanoglou, Chief Financial Officer at HCTI.

“We are pleased to take this decisive step with the signing of the Definitive agreement. Integrating these AI-powered engagement platforms with HCTI’s healthcare technologies positions us to deliver a next-generation, intelligent ecosystem for patients, providers, and expanding SaaS Footprint into Europe and Latin America,” said Sujatha Ramesh, Chief Operating Officer, Principal Executive Officer, and Director, Board of Directors at HCTI.

This planned acquisition is slated to be a critical step in HCTI’s broader strategy focused on…

Full release here, originally announced January 22nd, 2026.


Kodiak Solutions Acquires BESLER to Enhance Kodiak’s Revenue Integrity and Reimbursement Services for Hospitals, Health Systems, and Medical Practices

Adding BESLER Enables Provider Organizations to Accomplish Even More with their Data through the Kodiak Platform

Kodiak Solutions completed its acquisition of Besler & Company, LLC (BESLER), a market leader in revenue recovery and hospital reimbursement solutions, on January 22, 2026. Terms of the deal are not being disclosed.

Acquiring BESLER further enhances the Kodiak platform that hospitals, health systems, and medical practices rely on for net revenue reporting, revenue cycle benchmarking, analytics, and healthcare business office automations.

“Adding BESLER’s people to our teams and its products to the Kodiak platform serve our mission of simplifying complex business problems for healthcare leaders,” said Kodiak Solutions CEO Derek Bang. “Bringing Kodiak and BESLER together enables health system CFOs to accomplish more with their data and gain actionable insights that inform critical decisions they make every day.”

BESLER brings new automation and managed service options to Kodiak Solutions customers. For example, BESLER’s Transfer Diagnosis Related Group (DRG) software for detecting Medicare underpayments, other processes for validating DRGs, and other revenue integrity and reimbursement services will soon be available through Kodiak’s platform.

“For four decades, BESLER has supported hospitals in their efforts to be paid appropriately for the care they deliver,” said Jonathan Besler, who most recently served as President and CEO of the company…

Full release here, originally announced January 26th, 2026.


