
How an EV Manufacturer Drives Operational Efficiency With Real-Time Streaming


Manufacturers of electric vehicles operate a complex estate of manufacturing systems and integration technologies across heterogeneous environments. As in many manufacturing enterprises, this architectural complexity makes it difficult to standardize data ingestion, share information across multiple systems, and optimize business processes for efficiency.

While preparing to ramp up the production of its vehicles, an EV manufacturer realized that it needed an integration platform that could provide visibility across every point of the supply chain. The manufacturer turned to systems integrator Allata to find a solution to this challenge.

Allata explains the complex data landscape of modern enterprises

Without real-time insights into its design, manufacturing, production, and distribution processes, the company would be unable to predict future demand. As a result, the EV manufacturer would risk missing deadlines and frustrating customers, or overspending on unused materials and excess labor.

Learn how Allata used Confluent Cloud to build an event-driven data mesh architecture with streaming data pipelines, an initiative that ultimately allowed the EV manufacturer to integrate its critical internal systems.

Want to learn how companies like this EV manufacturer have unlocked their data’s full potential with streaming pipelines?

Why Allata Proposed a Data Mesh to Solve an EV Manufacturer’s Integration Challenges

The EV manufacturer needed an integration platform that would serve as a bridge between three key systems:

  • Product lifecycle management (PLM) tool responsible for creating 3D models of EV parts

  • Enterprise resource planning (ERP) system that contained the list of all parts involved in building a new EV

  • Manufacturing execution system (MES) that monitors and controls the entire production lifecycle for automotive parts

Each of these systems dealt with different components and data sets related to the design, manufacturing, and production of the company's vehicles. But despite their interconnected functions, the EV manufacturer couldn't easily share information between these systems.

This left the company unable to add details on newly designed parts to its Oracle ERP instance, which meant that data remained unavailable to suppliers and the rest of the downstream supply chain. The EV manufacturer also struggled to send parts data to the manufacturing system when it was time to start producing those parts.

The company's director of infrastructure wanted consumers across the organization to be able to access all of this valuable information, when and where they needed it. The team reached out to Allata to take advantage of its opinionated approach to integration with streaming data.

After assessing the company’s technical pain points and business needs, Allata proposed implementing a data mesh to integrate the three systems with each other.

Adopting this kind of architecture was what the EV manufacturer needed to democratize data access and standardize data governance, both of which would help the company turn its valuable information into ready-to-use data products. Unlike other pitches—which focused on system-to-system integration rather than data integration—this proposal aligned with the EV manufacturer's vision for becoming a data company that produces vehicles.

Building a Data Mesh With Confluent Cloud Streaming Pipelines

As a leading provider of digital transformation services, Allata brought its experience in enterprise software development, data infrastructure, and analytics to help the EV manufacturer use real-time insights to become more efficient.

Traditionally, similar manufacturing companies would use a middleware platform to connect the PLM tool to the ERP platform. However, the Allata team knew that the resulting spaghetti architecture would have disastrous effects on their EV manufacturer client’s long-term productivity and reliability.

How Allata migrates its clients from spaghetti to data mesh architectures

The challenge with these kinds of fragile spaghetti architectures is that when one issue arises, the entire system can fail and require significant effort and time to bring back online.

In contrast, an event-driven data mesh allows organizations to stream, process, govern, and share data across the operational and analytical planes. Those capabilities allow businesses to turn any of their data into data products that are ready to power multiple use cases.
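
To make that flow concrete, here is a minimal sketch, in Python with the confluent-kafka client, of how a source system such as the PLM tool might publish a part-design event to a topic in an event-driven data mesh. The topic name, event fields, and credential placeholders are illustrative assumptions, not details of the actual implementation.

```python
# Minimal sketch: a source system (e.g., the PLM tool) publishes a
# "part designed" event to a Kafka topic in Confluent Cloud.
# Topic name, event fields, and credentials are illustrative only.
import json
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "<confluent-cloud-bootstrap>:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<api-key>",
    "sasl.password": "<api-secret>",
})

# Hypothetical event emitted when a new part design is released in the PLM tool.
event = {
    "part_number": "EV-12345",
    "revision": "B",
    "released_at": "2024-01-15T10:30:00Z",
}

def on_delivery(err, msg):
    # Report delivery success or failure for each published event.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}]")

producer.produce(
    topic="plm.part-designs",   # hypothetical topic name
    key=event["part_number"],
    value=json.dumps(event),
    on_delivery=on_delivery,
)
producer.flush()
```

Downstream domains can then subscribe to the same topic independently, which is what makes the event a reusable data product rather than a point-to-point integration.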

How Allata builds operational and analytical data products within a data mesh

All of the systems the EV manufacturer wanted to integrate had existing connectors for Apache Kafka®—so Allata began evaluating Kafka and managed Kafka services for this use case.

Ultimately, Allata chose Confluent Cloud to implement this solution to give the EV manufacturer an integration layer that could keep up with its storage and scalability needs. Additionally, this fully managed, cloud-native service would make it easier for the EV manufacturer to adopt a single data streaming paradigm across the entire organization and its hybrid cloud environment.

Allata’s approach to building data products with Apache Kafka®

Technical Benefits and Business Impact of Building a Data Mesh With Confluent

With Confluent Cloud and Kafka connectors, the Allata team was able to connect to Snowflake in hours and have the first PLM-to-ERP integration up and running in just 2-3 months. From there, it was easy to stream data from these systems to any number of downstream consumers.
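
For illustration, the self-managed version of the Snowflake sink connector is typically registered with a Kafka Connect cluster using a configuration along these lines. The topic, database, and credential values are placeholders rather than the client's actual setup, and a fully managed version of this connector is also available in Confluent Cloud.

```python
# Sketch: registering a Snowflake sink connector with a self-managed
# Kafka Connect cluster via its REST API. All values are placeholders.
import requests

connector = {
    "name": "snowflake-sink-parts",
    "config": {
        "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
        "topics": "plm.part-designs",   # hypothetical topic
        "snowflake.url.name": "<account>.snowflakecomputing.com",
        "snowflake.user.name": "<user>",
        "snowflake.private.key": "<private-key>",
        "snowflake.database.name": "MANUFACTURING",
        "snowflake.schema.name": "PUBLIC",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "com.snowflake.kafka.connector.records.SnowflakeJsonConverter",
        "tasks.max": "1",
    },
}

# Submit the connector definition to the Connect worker.
resp = requests.post("http://localhost:8083/connectors", json=connector)
resp.raise_for_status()
print(resp.json())
```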

Many automotive manufacturers need to integrate system data into existing mobile apps. The challenge is that the relevant data often has to be shared by an IT group in another area of the business, which can add significant overhead. That was the case at the EV manufacturer, too, before the company adopted data streaming.

Now, with Confluent, whenever the EV manufacturer's digital apps team needs to get data out of a specific system, it can do so in a matter of days rather than months, and then readily incorporate that data into the apps it is building.
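
A rough sketch of what such a downstream consumer could look like follows, again using the confluent-kafka Python client. The topic name, consumer group, and event fields are illustrative assumptions.

```python
# Sketch: a downstream consumer (e.g., a digital apps team) reading a
# data product topic from Confluent Cloud. Names are illustrative only.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "<confluent-cloud-bootstrap>:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<api-key>",
    "sasl.password": "<api-secret>",
    "group.id": "digital-apps",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["plm.part-designs"])  # hypothetical data product topic

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1s for a record
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # Hypothetical event fields matching the producer sketch above.
        part = json.loads(msg.value())
        print(f"Received part {part['part_number']} rev {part['revision']}")
finally:
    consumer.close()
```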

Being able to reliably retrieve data from various systems helped the EV manufacturer gain visibility across its business, implement consistent processes, and significantly lower overhead. With these lower operational costs, the EV manufacturer recouped its investment within the first 18 months of the 3-year initiative.

These results were one of the key advantages of building data products with streaming pipelines on Confluent Cloud. Teams at the EV manufacturer no longer had to build custom integrations between systems, and the extensibility of these data products lowered the barrier to entry for valuable use cases, like predictive analytics on vehicles.

Each business unit was responsible for the data streamed and processed through a specific pipeline, and each team had the domain knowledge to define its data product using Confluent Cloud's built-in governance tooling.
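
As a rough illustration of that practice, a domain team might publish its data product's contract by registering an Avro schema with Schema Registry. The subject name and fields below are assumptions made for the example, not the manufacturer's actual schemas.

```python
# Sketch: a domain team publishing the contract for its data product by
# registering an Avro schema with Schema Registry. Names are illustrative.
from confluent_kafka.schema_registry import SchemaRegistryClient, Schema

client = SchemaRegistryClient({
    "url": "https://<schema-registry-endpoint>",
    "basic.auth.user.info": "<sr-api-key>:<sr-api-secret>",
})

part_design_schema = Schema(
    schema_str="""
    {
      "type": "record",
      "name": "PartDesign",
      "namespace": "com.example.plm",
      "fields": [
        {"name": "part_number", "type": "string"},
        {"name": "revision", "type": "string"},
        {"name": "released_at", "type": "string"}
      ]
    }
    """,
    schema_type="AVRO",
)

# Register the schema under the subject that matches the data product's topic.
schema_id = client.register_schema("plm.part-designs-value", part_design_schema)
print(f"Registered schema id: {schema_id}")
```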

Starting this practice with the initial PLM-to-ERP integration helped set the stage for data governance to be implemented more broadly at the EV manufacturer. In the future, this promises to unlock additional data streaming use cases that will further increase the company's operational efficiency and even drive new customer experiences.

Learn More About the Power of Streaming Data Pipelines

Innovators like Allata and its EV manufacturer client are using streaming data pipelines to maximize the value of their data and unlock limitless operational and analytical use cases.

Dive into the challenges associated with legacy data pipelines and learn how your organization can overcome them in the ebook, Transform Your Data Pipelines, Transform Your Business: 3 Ways to Get Started.

  • Lydia Huang is a Sr. Product Marketing Manager at Confluent, working with RSI partners to create joint GTM strategies. Prior to Confluent, she worked in product marketing at IBM, managing cloud security.

