In software systems, the architecture that served the enterprise in the pre-cloud era is no longer sufficient. The need for a cohesive customer experience is forcing the modernization of information and data supply chains. To enable next-generation application architecture, organizations are shifting from traditional data silos and batch processing to a streaming-first approach. Event streaming paradigms, as seen in Kafka-based architectures, are becoming the de facto standard for such solutions. This session dives deep into an engineering methodology, inspired by domain-driven design, for modeling business functions in an events-first approach, and provides a practical way to implement an event streaming architecture.
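To make the events-first modeling idea concrete, here is a minimal sketch of a business fact captured as an immutable domain event rather than a mutable database row; the OrderPlaced name and its fields are illustrative assumptions, not details from the session.

```java
import java.time.Instant;

// A domain event is an immutable record of something that has already happened
// in the business. Downstream services react to the event instead of sharing
// a mutable data store. (Event name and fields are hypothetical.)
public record OrderPlaced(String orderId, String customerId, Instant occurredAt) {}
```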
We’ll walk through an application modernization journey. First, we explore business domain modeling with the Swift methodology. Next, we work on a business flow that satisfies the business stakeholders’ most important functional requirements and addresses pain points surfaced during Event Storming. Finally, we focus on the technical challenges of satisfying the architecture’s non-functional requirements. One non-functional requirement described in this session is achieving high availability across multiple data centers while meeting data integrity service level objectives. We accomplished this by moving business-critical workloads from legacy middleware to a Kafka-native solution built with the Kafka Streams library.
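As a rough illustration of what a Kafka-native replacement for legacy middleware can look like, the sketch below wires up a minimal Kafka Streams topology with exactly-once processing enabled to help preserve data integrity; the topic names, application id, and filtering logic are assumptions for the example, not details from the talk.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class OrderEventsApp {
    public static void main(String[] args) {
        // Basic configuration; the bootstrap server and topic names are placeholders.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-events-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        // Exactly-once processing helps maintain data integrity when
        // business-critical workloads move off legacy middleware.
        props.put(StreamsConfig.PROCESSING_GUARANTEE_CONFIG, StreamsConfig.EXACTLY_ONCE_V2);

        StreamsBuilder builder = new StreamsBuilder();
        // Read raw business events, drop malformed records, and publish the
        // validated stream to a downstream topic for other services to consume.
        KStream<String, String> orders = builder.stream("orders-raw");
        orders.filter((key, value) -> value != null && !value.isBlank())
              .to("orders-validated");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        streams.start();
    }
}
```

In a multi-data-center deployment, running multiple instances of an application like this against a replicated cluster is what provides the high availability mentioned above; the stream processing logic itself stays unchanged.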