
Bringing real-time financial risk management to legacy trading transactions

Written By Lydia Huang

In the world of investment banking, there are thousands of trades happening every second. This means that real-time risk and margining are critical for clearing organizations to operate efficiently, and at scale.

Ness Digital Engineering is a regional system integrator (RSI) that aggregates best-of-breed technologies to custom-build solutions for customers with unique problems. For one customer, a large global equity derivatives clearing organization (DCO), Ness is building a data streaming architecture to re-orchestrate a market risk and valuation model in response to a recent explosion of volume in the equities market.

“Risk and valuation systems, many of which were built twentyish years ago on relational databases, are beginning to show stress. They’re having a tough time meeting throughput requirements in terms of valuing large baskets on high-volume days. Increasingly, risk managers and regulators want a wider array of scenarios to be evaluated, and latency requirements and the time-value expectancies of results are all decreasing.”

— Subir Grewal, Global Head, AWS Practice, Ness Digital Engineering

This DCO sits at the center of the vast majority of derivatives contracts in the U.S. that are required to be cleared. It focuses on promoting stability and market integrity by delivering top-tier clearing and settlement services for options, futures, and securities lending transactions, clearing billions of options contracts per year across 16 exchanges worldwide. It's the buyer to every seller, and the seller to every buyer, in the U.S. listed options markets, responsible for maintaining liquidity and efficient trade flow so that transactions happen quickly and users don't have to wait. The DCO's book has grown tremendously, and along with it, the regulatory demands it must meet to manage risk.

Replacing the existing batch-based system was no small project. The system sits at the very center of the business, and overhauling it was an enormous IT lift requiring commitment across the organization. The DCO dubbed the effort the Renaissance Initiative, and in partnership with Ness, placed Confluent at its center.

"The Renaissance Initiative is perhaps the most important project undertaken by {the company] in the past 20 years.”

— Executive Chairman and Chief Executive Officer of the DCO

How to walk away from historic data limitations

Since the onset of options trading in the ’70s, the trading volume of equities derivatives has increased exponentially. Even before the pandemic shifted everything for the financial markets, the DCO had already embarked on a multi-year technology modernization initiative to strengthen its foundational capabilities and better serve users. When the pandemic set in and brought more financial unpredictability, both trading volume and volatility increased sharply across listed equities and equities derivatives markets worldwide, requiring systems that could scale to levels uncontemplated even a few years ago. The surge in trading put tremendous stress on the DCO’s existing legacy technology and threatened to hinder future growth. That legacy model, built on batch-oriented risk valuations that by their very nature could not happen in real time, put the firm at risk in times of volatile market conditions and compromised timeliness for its customers overall.

“The domain itself has become a lot more complicated, and the requirements have gone up, and all of these demands are being met by systems showing their age,” explains Subir Grewal, Global Head, AWS Practice, Ness Digital Engineering.

“The other thing that’s changing the game,” adds Jim Zucker, AVP Delivery, Financial Services, Ness, “is the requirement to calculate more data each day as regulatory demands increase. If your batch now has 2x, 3x, or even 4x the amount of data to calculate every day, you either have to buy more CPUs or let it run longer.” In the meantime, adverse market movements or counterparty failure could expose the firm to millions of dollars in losses. On top of that, a firm with an inadequate risk system can be fined, or even prevented from trading, by regulators.

On the availability side of the tech stack, SLAs are getting shorter and shorter, and there is pressure to provide customized reporting to smaller business units and individual regulators in specific markets, and to aggregate that reporting at a global level. Many of the biggest firms have expanded their footprints, through acquisitions and otherwise, into multiple markets.

The company needed to shift to an event-driven data solution that could reduce its reliance on slow batch processing and make this timeliness possible.

The three biggest pressures it faced were to:

  • Replace the company’s core clearing, data, and risk platforms with a highly modular system

  • Create an ability to scale and stay agile even during times of market volatility 

  • Deliver near-real-time risk management

Confluent makes real-time valuation and margining possible 

“Streaming means that when an event occurs, you process it as it flows past you. Imagine counterparty changes, market data changes, and curve updates as water flowing down a stream, processed as it passes, rather than letting water build up behind a dam and processing it all at the end of the day.”

— Jim Zucker, AVP Delivery, Financial Services, Ness

To power the multi-year Renaissance Initiative, the DCO partnered with Ness to rebuild and modernize with microservices and stream processing enabled by Confluent Cloud, built on the foundation of Apache Kafka. Unlike relational databases, Kafka is, by design, a distributed system intended to run on inexpensive hardware. Technology teams don’t have to invest in a single expensive machine to buy throughput, because throughput comes from scaling horizontally. They can achieve significant scale, even on the highest-volume days, and get away from the transaction limits that are stressing legacy systems. In short, data throughput issues go away.
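To make that concrete, here is a minimal sketch of how trade events might be published to a partitioned Kafka topic. The topic name, instrument key, payload, and broker address are illustrative assumptions rather than the DCO's actual setup; the point is that keying by instrument spreads load across partitions while preserving per-instrument ordering.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class TradePublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical broker address; in Confluent Cloud this would be the cluster endpoint
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        // "all" waits for full replication, trading a little latency for durability
        props.put(ProducerConfig.ACKS_CONFIG, "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying by instrument ID (hypothetical key format) routes all events
            // for one contract to the same partition, preserving per-instrument
            // ordering while overall throughput scales with the partition count.
            producer.send(new ProducerRecord<>(
                    "trades", "SPX-20240119-C-4700", "{\"qty\":100,\"price\":12.35}"));
        }
    }
}
```

Adding partitions and consumers raises throughput without any change to producer code like this, which is what scaling horizontally means in practice.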

This reimagining of the DCO’s margining and valuation system would provide a solid, secure environment for same-day risk management and computations, plus pricing and re-valuation. Ness selected Confluent as a core part of the new architecture, building data streaming platforms that allow real-time integration of trading activity and pricing to drive real-time risk management. With this selection, Ness was able to create streaming analytics and near-real-time computation capabilities, so the DCO can now manage risk and margin in real time.

The new platform that Ness built enhances efficiency and the speed of margin, stress-testing, and back-testing calculations. It also increases transparency and insight for clearing members into exposures, allowing ad hoc queries and real-time processing. 
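As an illustration of what real-time margining can look like in stream-processing terms, the sketch below uses Kafka Streams to maintain a continuously updated exposure figure per clearing member. The topic names, the member-ID key, and the simple summed-notional logic are assumptions made for the example; the DCO's actual valuation models are far richer.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;

import java.util.Properties;

public class MemberExposure {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "member-exposure");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.Double().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Hypothetical topic: key = clearing member ID, value = signed trade notional.
        // Each trade event updates the member's running exposure immediately,
        // instead of waiting for an end-of-day batch run.
        KTable<String, Double> exposure = builder
                .stream("trade-notionals", Consumed.with(Serdes.String(), Serdes.Double()))
                .groupByKey(Grouped.with(Serdes.String(), Serdes.Double()))
                .reduce(Double::sum, Materialized.as("exposure-store"));

        // Downstream margin services subscribe to the change stream.
        exposure.toStream().to("member-exposure");

        new KafkaStreams(builder.build(), props).start();
    }
}
```

The design choice worth noting is that the aggregate lives in a continuously maintained, queryable state store, so "current exposure" is always a read away rather than the output of a nightly job.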

Confluent is not the only technology in the mix for the DCO. The solution Ness built also spans AWS and Snowflake, including the AWS services S3, Redshift, and Athena. Data warehouses are connected directly to Confluent, and the architecture looks like this:

Kafka connects the data center to the cloud. It creates a data fabric and enables multiple data sinks optimized for different use cases. Data is encrypted and can be anonymized.
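The glue between Kafka and sinks like S3 is typically a Kafka Connect sink connector. Below is a minimal sketch of a configuration for Confluent's S3 sink connector; the connector class and property names come from that connector, while the topic, bucket, region, and sizing values are illustrative assumptions.

```properties
name=exposure-s3-sink
connector.class=io.confluent.connect.s3.S3SinkConnector
tasks.max=2

# Hypothetical topic and bucket names, for illustration only
topics=member-exposure
s3.bucket.name=dco-risk-analytics
s3.region=us-east-1

storage.class=io.confluent.connect.s3.storage.S3Storage
format.class=io.confluent.connect.s3.format.json.JsonFormat

# Write a new S3 object every 10,000 records
flush.size=10000
```

A configuration like this is all it takes to fan the same event stream out to another sink optimized for a different use case, which is the data-fabric point above.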

An investing market sped way, way up

The new architecture will enable the DCO to stay current as markets change and shift, and help the company speed up development of new methodologies, analytics, and products on a backbone of data in motion. The ability to tap into real-time data streaming makes it easy to benchmark and migrate analytical libraries, both proprietary and third-party. And the single streaming architecture that Confluent enables, combined with the ability to tap into historical data, makes all kinds of use cases possible: real-time, end-of-day, and on-demand. Confluent’s production-quality environment increases the reliability of testing and model calibrations, and proactive operational intelligence ensures the stability and dependability of risk computation.

The DCO now has access to data streaming and real-time analytics on 10,000 messages a second, including near-real-time calculations on price changes and re-aggregations in 5-, 20-, and 60-minute cycles. The team knows the provenance of every number, every calculation, and all source data, and can introduce new listed equities and equities derivatives models in weeks, with full back-testing support and very high redundancy.
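Those 5/20/60-minute re-aggregation cycles map naturally onto windowed aggregations. As a sketch, the 5-minute case might look like the following Kafka Streams fragment, reusing the serde defaults from the earlier example; the topic names and summing logic are again assumptions, and the 20- and 60-minute cycles would simply be parallel topologies with wider windows.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.TimeWindows;

import java.time.Duration;

public class FiveMinuteReaggregation {
    // Builds the core of a topology that rolls price updates into 5-minute
    // tumbling windows; plug this into a KafkaStreams app as shown earlier.
    public static StreamsBuilder topology() {
        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("price-updates", Consumed.with(Serdes.String(), Serdes.Double()))
                .groupByKey(Grouped.with(Serdes.String(), Serdes.Double()))
                .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(5)))
                .reduce(Double::sum)
                // Flatten the windowed key so the result can be re-published
                .toStream((windowedKey, total) ->
                        windowedKey.key() + "@" + windowedKey.window().startTime())
                .to("price-aggregates-5m");
        return builder;
    }
}
```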

When the Renaissance Initiative is completed by the end of 2023, the DCO will be able to manage risk and margin in real time, thanks to data streaming and Confluent’s role in the new architecture. The DCO will be able to scale and stay stable at all times, and deliver near-real-time risk management with a much stronger ability to meet regulatory data demands. It’s a new era for investing, built on a backbone of data in motion.

Lydia Huang is a Sr. Product Marketing Manager at Confluent, working with RSI partners to create joint GTM strategies. Prior to Confluent, she worked in product marketing at IBM, managing cloud security.

