
Transforming Health Payer Interoperability with Real-Time Data Streaming


The provision of healthcare in the U.S., as well as in many other countries, relies on the interaction of three groups: patients, healthcare providers, and healthcare payers. Interoperability is the ability to communicate data across these separate entities in a standard way to maximize positive healthcare outcomes. If a patient sees multiple healthcare providers, for example, interoperability enables those providers to share information about the patient’s health records, medications, and treatment plans, which can help avoid duplicative tests and treatments. 

While interoperability is important for each group within the healthcare ecosystem, in this blog we’ll focus on its specific role for healthcare payers (i.e., healthcare insurers). We’ll start by outlining how interoperability helps payers deliver quality services to patients and providers before exploring some of the technical challenges in implementing it in a modern, scalable way. Drawing on our extensive experience in the sector, we’ll then explain how multiple organizations have used real-time data streaming with Confluent to address these challenges.

By the end of this blog, we hope you’ll have a solid understanding of how data streaming forms a fundamental part of interoperability for healthcare payers. 

Why Is Interoperability So Important? 

Interoperability between healthcare systems is mandated by many countries and regions across the world. In the U.S., the Health Information Technology for Economic and Clinical Health (HITECH) Act was passed in 2009 and includes provisions to promote the adoption and meaningful use of electronic health records (EHRs) by healthcare providers and hospitals. This was followed in 2016 by the 21st Century Cures Act, which requires healthcare providers, payers, and healthtech platforms to use standard application programming interfaces (APIs) to allow for the secure exchange of health information between systems.

Other programs around the world, such as the U.K.’s NHS Digital Programme, the European Union’s eHealth Action Plan, Australia’s “My Health Record System,” and Canada’s “Health Infoway” initiative have all included provisions for improved interoperability between healthcare systems; and they share the same essential objective—to improve healthcare outcomes for patients. 

For healthcare payers specifically, interoperability plays a significant role across multiple business areas:

Improved care coordination

Interoperability allows healthcare payers to access and share patient health information with providers, which can help improve care coordination and avoid duplicative tests and treatments.

Increased efficiency

Interoperability can help streamline administrative processes, such as claims processing and prior authorizations, which can reduce administrative costs and improve efficiency.

Better data analytics

Interoperability enables healthcare payers to collect and analyze data from different sources, which can help identify trends, patterns, and opportunities for cost savings.

Fraud and abuse prevention

Interoperability can help healthcare payers identify and prevent fraud and abuse by allowing them to more easily track and analyze healthcare data across different systems and providers.

Improved member experience

Interoperability can help healthcare payers provide a better member experience by allowing members to access their health information and communicate with their providers more easily.

The Challenges in Delivering Interoperability

Many healthcare payers aren’t fully realizing the benefits associated with interoperability, due in large part to the limitations of their data infrastructure. These are the two most common issues we see when considering healthcare system interoperability across payer organizations:

1) Technical challenge: Siloed data 

The IT infrastructure of many healthcare payers has evolved organically over the years to meet the immediate needs of the business at particular points in time. As a result, a complex web of different technologies and systems sits behind their interoperability platforms: networks of proprietary, monolithic applications (mostly on-premises) that keep data siloed, often with little consistency in how they implement open standards like Health Level Seven (HL7) and Fast Healthcare Interoperability Resources (FHIR).

Business impact: Higher administrative costs (due to greater manual intervention on claims processes); risk of limited visibility into member health; and inaccurate assessment of member risk. 

2) Technical challenge: Legacy communication methods

Traditionally, the interoperability platforms of healthcare payers have relied on point-to-point connections, using batch transfers or SOAP (Simple Object Access Protocol) calls to interact with other healthcare systems. While these methods worked for earlier applications that sent a relatively small number of requests to the core ledger (e.g., balance inquiries, ICD code updates, and prescription requests), they’re not suitable for high-volume, mobile-based applications.

Business impact: Limited ability to deliver real-time, customer-facing applications (e.g., claim or payment status); limited ability to deliver mobile applications, due to lack of support for larger volumes of data; higher maintenance costs (due to the constraints of managing multiple point-to-point connections).

Evolving Interoperability Platforms with Data Streaming 

We’ve helped a number of healthcare payers address these challenges by integrating data streaming into their interoperability platforms. 

Data streaming is a method of transferring data continuously and in real time from one system to another. Unlike batch processing, which involves processing large volumes of data in batches at specific intervals, data streaming involves processing data as it is generated or received, and making it immediately available for analysis or use.

Apache Kafka® is the leading data streaming technology, used by over 70% of Fortune 500 companies. Many healthcare payers rely on Kafka as a way to decouple proprietary monolithic applications, and unlock real-time data for use in event-driven microservices. 
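
To make the pattern concrete, here is a minimal sketch of a Java service publishing a claim-status event to Kafka the moment it happens, rather than exporting it in a nightly batch. The broker address, topic name, and payload are placeholders, and a production deployment would typically use Avro or JSON Schema with Schema Registry instead of a raw JSON string.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ClaimStatusPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address; point this at your Kafka or Confluent Cloud cluster.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Topic name and payload are illustrative only.
            String claimId = "CLM-1001";
            String payload = "{\"claimId\":\"CLM-1001\",\"status\":\"APPROVED\"}";
            producer.send(new ProducerRecord<>("claim-status-events", claimId, payload),
                    (metadata, exception) -> {
                        if (exception != null) {
                            exception.printStackTrace();
                        } else {
                            System.out.printf("Published to %s-%d@%d%n",
                                    metadata.topic(), metadata.partition(), metadata.offset());
                        }
                    });
        }
    }
}
```

Any number of downstream consumers, such as a member-facing mobile backend or a fraud-detection service, can subscribe to the same topic independently, which is what decouples them from the monolithic core system.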

Real-Time Interoperability – Example Architecture

Confluent, based on Apache Kafka and powered by the Kora engine, is a complete, cloud-native data streaming platform, and is used by healthcare payers to deliver real-time, scalable interoperability platforms. 

Here’s a common architecture we’ve helped our healthcare customers to deploy (with an explanation below). 

Cognizant’s TriZetto Facets is a core claims administration system that is widely adopted across the health insurance industry. This architecture demonstrates two common integration patterns with Confluent:

  1. Kafka Produce-Consume: TriZetto Facets natively supports a Kafka producer, which generates deeply nested Avro events (5,000+ fields). We have helped customers flatten these incoming data streams using Kafka Streams (KStream), ksqlDB, or custom Java code; additional processing includes filtering and the removal of null values, to give a few examples. As a result, events are available in real time on the Kafka brokers as they happen in the backend systems, as well as to downstream data stores (e.g., Snowflake, Azure Delta Lake, Databricks), processing systems (e.g., Salesforce), and custom event-driven microservices (see the first sketch after this list).

  2. CDC pattern: Most core systems in production today run on Oracle or similar database technologies. The change data capture (CDC) pattern leverages native integration with Oracle CDC providers such as GoldenGate, or Confluent’s own premium Oracle CDC Source connector, to capture a real-time stream of database changes and make it available on the Kafka brokers and to downstream consuming applications (see the second sketch after this list).
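
As a minimal sketch of the first pattern, the Kafka Streams application below (using Confluent’s GenericAvroSerde) reads the nested Avro events from a raw topic, drops records that are missing the sections it needs, and projects a handful of nested fields into a flat output record. The topic names, field names, and output schema are hypothetical; real Facets events carry far more fields, and ksqlDB or custom Java consumers could achieve the same result.

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;

public class ClaimEventFlattener {

    // Hypothetical flat output schema; real events carry far more fields.
    private static final Schema FLAT_SCHEMA = SchemaBuilder.record("FlatClaimEvent")
            .fields()
            .requiredString("claimId")
            .requiredString("memberId")
            .requiredString("claimStatus")
            .endRecord();

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "claim-event-flattener");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // Assumes both key and value are Avro-encoded and registered in Schema Registry.
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
        props.put("schema.registry.url", "http://localhost:8081");

        StreamsBuilder builder = new StreamsBuilder();

        // Topic names are placeholders; the raw topic holds the nested events
        // emitted by the core administration system's native Kafka producer.
        KStream<GenericRecord, GenericRecord> raw = builder.stream("facets.claims.raw");

        raw
            // Drop events missing the nested sections needed downstream.
            .filter((key, value) -> value != null
                    && value.get("claimHeader") != null
                    && value.get("member") != null)
            // Project a handful of nested fields into the flat record.
            .mapValues(value -> {
                GenericRecord header = (GenericRecord) value.get("claimHeader");
                GenericRecord member = (GenericRecord) value.get("member");

                GenericRecord flat = new GenericData.Record(FLAT_SCHEMA);
                flat.put("claimId", String.valueOf(header.get("claimId")));
                flat.put("memberId", String.valueOf(member.get("memberId")));
                flat.put("claimStatus", String.valueOf(header.get("status")));
                return flat;
            })
            .to("facets.claims.flattened");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```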
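
For the second pattern, a self-managed connector is typically registered through the Kafka Connect REST API. The sketch below posts a configuration for Confluent’s Oracle CDC Source connector from Java; the connection details and table filter are placeholders, and the property names shown should be verified against the connector documentation for the version you run (a GoldenGate-based integration would be configured differently).

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterOracleCdcConnector {
    public static void main(String[] args) throws Exception {
        // Illustrative configuration: host, credentials, and table filter are placeholders,
        // and property names may differ by connector version (check the official docs).
        String connectorJson = """
            {
              "name": "oracle-cdc-claims",
              "config": {
                "connector.class": "io.confluent.connect.oracle.cdc.OracleCdcSourceConnector",
                "tasks.max": "1",
                "oracle.server": "oracle-host.example.com",
                "oracle.port": "1521",
                "oracle.sid": "ORCLCDB",
                "oracle.username": "cdc_user",
                "oracle.password": "********",
                "table.inclusion.regex": "CLAIMS[.]CLAIM_.*",
                "start.from": "snapshot"
              }
            }
            """;

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8083/connectors")) // Kafka Connect REST endpoint
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(connectorJson))
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```

Once the connector is running, the change events land on Kafka topics and can be consumed by the same downstream data stores, processing systems, and microservices described above.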

Real-Time Interoperability for Better Patient Outcomes

Interoperability is fundamental to the provision of modern healthcare. When delivered with real-time data streaming, it enables all parts of the healthcare ecosystem to function better. For healthcare payers, this means: 

  • Faster, more efficient claims management and care coordination 

  • Improved member experiences (with the ability to provide real-time mobile applications) 

  • Reduced fraud risk, due to availability of real-time streams

If you’d like to see how data streaming can help your business deliver better patient outcomes, try Confluent Cloud for free today.

About the Authors

  • Qi Yang works with healthcare customers across the Northeast of the U.S. and has years of experience in improving operational efficiency through technology innovation.

  • Ananda Bose works with customers across the Northeast U.S. and Canada. His core capabilities include application modernization and database modernization, with a focus on the financial services industry.
