Welcome to LSEG Insights.
Hi everyone.
Today we're doing a deep dive into solving market data challenges with Real-Time Optimized - four use cases. And if you want to get into the source materials yourself or maybe speak to one of our specialists, just search for real-time optimized on lseg.com, you'll find loads more there.
Okay, and as we, you know, unpack this whole topic today, market data optimization, especially in this cloud native world, we really want to hear from you. What are some of the biggest hurdles, the biggest challenges you're actually facing when you try to integrate real-time data across your trading and risk systems? Let us know, share your thoughts in the comments or, you know, send us a message. Okay, let's get into it.
Financial services firms, well, they're fundamentally rethinking how they deal with market data. A lot of this, maybe most of it, is driven by this huge shift to the cloud. And LSEG's Real-Time Optimized, it's this award-winning solution, right? It conflates and normalizes data, all the trades and quotes. It's really become central to this shift. I mean, over 600 customers already.
Yeah, that's significant uptake.
It is. So today, we're really zeroing in on four key use cases. These really highlight its power, its impact, especially for anyone out there, building, optimizing, deploying financial tech.
And what's really transformative here, I think, it's not just about, faster data or making operations a bit more efficient. It's really about how robust cloud-based data can fundamentally change how institutions operate.
Right.
All the way from the front office with its, you know, high-speed demands right through to the complexities of risk and compliance. It's really a foundational shift we're seeing, not just incremental tweaks.
Okay, so let's kick things off with a market that's got everyone's attention. Cryptocurrency. I mean, despite the volatility, the global market cap is still around, what, US $3 trillion?
Roughly, yeah.
And we're definitely seeing more and more hedge funds diversifying, moving into this asset class.
Which naturally raises a pretty critical question, right? For anyone diving into a market that's this new, often fragmented, how do these funds make sure they actually have trusted, robust data for their critical operations? Especially when liquidity, it can be spread thin across dozens of exchanges, sometimes with wildly different prices.
Spot on.
For quants, for traders, just identifying genuine liquidity, finding the absolute best price for a trade, that's everything. And it's super challenging in crypto. Then there's the whole challenge of integrating this trusted price data seamlessly into their EMS, their execution management systems. And crucially, leveraging cloud computing for advanced analytics, think pre-trade analysis, maybe rapid back testing, to really slash those processing times. And after the trade, you still need precise analytics to verify best execution. That's key for compliance, for performance.
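To make that best-price challenge concrete, here's a minimal sketch, purely illustrative and not LSEG code, of consolidating a best bid and offer from quotes spread across fragmented venues. The venue names and prices are invented for the example:

```python
# Illustrative sketch (not LSEG code): finding the best executable price
# when crypto liquidity is fragmented across many exchanges.
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str
    bid: float
    ask: float

def consolidated_bbo(quotes):
    """Return the venue quotes holding the best bid and best ask."""
    best_bid = max(quotes, key=lambda q: q.bid)   # highest price to sell into
    best_ask = min(quotes, key=lambda q: q.ask)   # lowest price to buy at
    return best_bid, best_ask

# Hypothetical snapshot of the same pair across three venues:
quotes = [
    Quote("ExchangeA", bid=67100.0, ask=67130.0),
    Quote("ExchangeB", bid=67120.0, ask=67150.0),
    Quote("ExchangeC", bid=67090.0, ask=67160.0),
]
bid, ask = consolidated_bbo(quotes)
```

In practice a normalized feed across all 35 exchanges is what makes this kind of comparison trivial; without it, each venue's own symbology and formats would need mapping first.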
Yeah, and if you sort of zoom out a bit, the impact goes way beyond the trading desk. The middle office, they need that same consistent, trusted crypto data for accurate P&L marking, portfolio evaluation, and the risk management and compliance teams. They absolutely depend on this exact same data to meet their, well, their growing regulatory obligations. Any mismatch, any discrepancy between the data sets used by different teams that can introduce significant operational risk, firms are working really hard to stamp that out.
So real-time optimized tackles this head-on. It provides that robust, trusted crypto data, which is just vital in these fast-moving, sometimes kind of opaque markets.
And the cloud-based nature, that's really the key, isn't it?
Oh, absolutely. It's a game changer. It means front, middle, back office, they're all working with the exact same data right across the entire trade life cycle.
No more silos.
Exactly. It boosts efficiency massively and critically cuts operational risk by just smashing those old data silos. Forget those painful reconciliation processes. Everyone's literally on the same page.
And the coverage itself is, well, pretty impressive for such a relatively new asset class. We're talking over 12,000 crypto instruments, real-time data streaming in from 35 exchanges, and historical data going back in some cases over five years. That gives you the depth you need for proper quantitative analysis.
That's crucial depth.
Yeah.
As Matt Eddy, he's the global head of real-time quant and economic data solutions, as he puts it.
“By using trusted data, these firms can significantly reduce many of the operational risks, enabling them to concentrate on generating alpha.” So for any financial software developer or quant listening, that means less time fighting data quality issues and more time actually building models and strategies.
Okay, so moving beyond just the raw price data, let's talk about the power of unstructured information.
Yeah.
How are firms using news, you know, to enhance their trade signal detection? This feels particularly relevant for anyone in, say, event-driven strategies or just looking for that informational edge.
Yeah, this is a fascinating area. What's really insightful here is the synergy, the combination of real-time optimized with machine-readable news. When you put those two together, especially with sophisticated cloud-based analytics, firms can start uncovering tradable news opportunities.
Think about super fast event-based trading on like key economic releases.
Or maybe even more critically, spotting those exclusive M&A news breaks that can just move markets in milliseconds. The speed and the integration of these streams, that's absolutely paramount for HFT, for algo strategies.
The machine-readable news feed itself, it's incredibly comprehensive, isn't it?
Oh, absolutely. It draws from Reuters news. That's over 2.6 million stories a year.
Huge scale.
And it doesn't stop there. It also pulls in content from over 130 third-party sources. So this raw, unstructured, real-time news stream gets transformed into a machine-readable feed.
Okay, so how does that work in practice for, say, developers or quants.
This is where it gets really powerful. That raw news, it isn't just dumped on you. It gets enriched with 90 additional fields of metadata through a mix of automation and human processes, things like index data. Then it's processed further with automated NLP (natural language processing) analytics. That gives you crucial context: sentiment scores, significance, relevance, confidence scores.
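As a rough sketch of how a quant might use that kind of metadata, here's a filter that keeps only high-conviction, strongly-signed items. The field names here are hypothetical, not the actual machine-readable news schema:

```python
# Illustrative sketch (field names are hypothetical, not the real
# machine-readable news schema): screening enriched news items using
# NLP metadata like sentiment, relevance, and confidence.
def is_tradable_signal(item, min_relevance=0.8, min_confidence=0.9):
    """Keep only strongly-signed, high-relevance, high-confidence items."""
    return (
        item["relevance"] >= min_relevance
        and item["confidence"] >= min_confidence
        and abs(item["sentiment"]) > 0.5   # strongly positive or negative
    )

news = [
    {"ric": "ABC.N", "sentiment": 0.82, "relevance": 0.95, "confidence": 0.97},
    {"ric": "ABC.N", "sentiment": 0.10, "relevance": 0.40, "confidence": 0.99},
]
signals = [n for n in news if is_tradable_signal(n)]
```

The point is that because the scoring is already done upstream, the strategy code reduces to threshold logic rather than running its own NLP pipeline.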
Okay. So it's not just the headline.
Exactly. But the key thing for seamless integration into trading systems is that machine-readable news is aligned to the very same data model as real-time optimized. That makes it incredibly straightforward to work with both data sets together. No complex mapping needed.
That data model consistency, that must save a lot of headaches.
Huge time saver. And that alignment, it also extends to backtesting. Firms can easily use the news archive combined with tick history data, which again shares that same common data model to perform really rigorous back testing on event trading models. Minimal effort compared to trying to stitch different data sources together.
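The backtesting workflow described above can be sketched very simply. This is an illustrative event study on made-up records, with hypothetical field layouts; the real value of the shared data model is that the join key (the instrument code) is the same in both data sets:

```python
# Illustrative sketch (simplified, invented records): because news events
# and tick history share one symbology, an event study joins them directly
# on the instrument code with no mapping layer in between.
ticks = [  # (ric, timestamp, price)
    ("ABC.N", 100, 50.0),
    ("ABC.N", 101, 50.5),
    ("ABC.N", 102, 51.0),
]
events = [("ABC.N", 100, 0.8)]  # (ric, timestamp, sentiment)

def post_event_return(ric, t0, horizon=2):
    """Price return over the ticks within `horizon` of the news event."""
    window = [p for r, t, p in ticks if r == ric and t0 <= t <= t0 + horizon]
    return window[-1] / window[0] - 1.0

results = [(ric, sent, post_event_return(ric, t)) for ric, t, sent in events]
```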
Right, reducing that development cycle time.
Precisely. And this, you know, it brings up an important point about the future of FinTech. How can all this rich, integrated data actually be leveraged for AI and machine learning?
Good question.
The fact that the data is already cloud native means it's immediately available to cloud-based analytics services and other cutting-edge tools. This gives you so much more agility for model development and deployment compared to traditional on-prem setups.
Yeah, makes sense.
Matt Eddy forecasts this too. He says, “We think that more and more financial firms will want to use Real-Time Optimized for their cloud-based AI and ML use cases going forward.” This is where the industry is headed. So for anyone building those advanced models, this ease of access, this consistency, it's genuinely transformative.
Are you finding value from what you hear? Get the latest insights from lseg.com slash insights. All right, let's shift gears now. Let's focus on an area that's often, well, let's be honest, plagued by decades-old systems, stubborn data silos. The middle and back office.
Yes, the engine room.
Exactly. For years, these operations have often run on completely different data, different tech platforms than the front office.
Yeah.
I mean, picture your finance team and your risk team basically speaking different languages because they're working off totally different data. That's a data silo, right?
Yeah. And that disconnect.
It leads to massive inefficiencies, increased operational risk. I mean, even serious losses sometimes from things like settlement failures.
Yeah, traditionally, the middle and back office, they often just lacked real-time data access entirely. They operated on tech stacks, on models that simply didn't talk well with their front office systems.
Right. Lots of manual workarounds.
Exactly. Which means delays, errors. And then compounding all this, you've got new regulations coming in. Things like the Basel Committee's Fundamental Review of the Trading Book, FRTB.
FRTB.
Yeah, it's a complex set of rules, basically designed to make sure banks hold enough capital for their trading risks. But these rules, they demand unprecedented connection, unprecedented transparency, right across the entire trade life cycle. You need a unified view of data that, frankly, legacy systems were never designed to provide.
Okay, so how does Real-Time Optimized fit into solving these deep-seated problems?
Well, it's increasingly being deployed specifically for these middle and back office use cases. We're seeing it used by major banks, broker dealers, asset managers. And its cloud-based nature is particularly popular here because data can be accessed securely from anywhere. That massively improves data governance, ensures consistency across the whole organization, and that, in turn, reduces operational risk by getting rid of those dangerous data discrepancies.
And you mentioned the data model earlier.
Yeah.
That's key here too.
Absolutely critical.
Real-Time Optimized is strategically aligned to LSEG's data model. What that means is the data is inherently consistent with other LSEG real-time feeds, like Real-Time Direct, Real-Time Full Tick,
The whole spectrum.
The whole spectrum. This seamless connection across latency ensures that the middle and back office are using the exact same data as the front office, all using a common symbology. That unified data landscape is absolutely essential for complex risk calcs, for accurate regulatory reporting under things like FRTB.
Can you give us an example of this integration in practice?
Sure. A really compelling one is Murex's MX.3 platform. It can connect directly with Real-Time Optimized when MX.3 is hosted on AWS, Amazon Web Services. That's a powerful combination for firms. And MX.3 clients can also get the data via the real-time distribution system or the real-time managed distribution service. So what you end up with is this single open platform providing consistent data, analytics, and calculations across front, middle, and back office. It truly streamlines operations end to end.
Right, breaking down those walls.
Yeah, and we've actually seen this deliver tangible results. There was a large North American bank, for instance, they implemented MX.3 a few years back. Now, their front office and their risk management teams both access the same pricing library.
Unlike before.
Totally different from their old setup, where they used different models, leading to constant reconciliation nightmares. This change eliminated the need for manual reconciliation, drastically cut their operational risk, and made it so much easier to comply with market risk regulations like FRTB.
Makes sense.
As Matt Eddy emphasizes, “Refreshing the data as well as the technology is essential. Market data in the cloud for the middle and back office can help ensure these teams are aligned to the front office, boosting efficiency and reducing risk.” It's really a clear blueprint for achieving that kind of operational excellence.
Okay, final use case. Let's explore how Real-Time Optimized is empowering payment and money transfer companies globally with high-quality FX data. These companies, they're all about delivering innovative, fast, frictionless services across borders. Shopping, traveling, sending money.
Absolutely. And fundamental to their entire business model, really, is the need for top quality FX data. Data that accurately reflects the true nature of international currency markets.
Exactly.
Without robust real-time data, it's just impossible for these companies to make sure their clients are making the best decisions, executing transactions efficiently. Accuracy, immediacy, they're completely non-negotiable when you've got millions of transactions flying around every second.
So where do they actually use this real-time optimized FX data?
Well, several key areas. First, obviously, setting precise exchange rates and margins for their customers. That might be powering a web interface clients use directly, or maybe updating the screens their customer service reps use.
Right, front-end stuff.
Yeah, but beyond customer-facing apps, they deploy this critical data internally too. Treasury teams use it, risk management, compliance. They all need it for managing currency positions, developing hedging strategies, and of course, meeting all those extensive regulatory obligations. It's really an enterprise-wide solution for them.
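The rate-setting use case above boils down to simple arithmetic on a trusted real-time mid. Here's a minimal sketch, assuming a margin expressed in basis points; the pair, mid, and margin values are invented for illustration:

```python
# Illustrative sketch (not vendor code): deriving customer-facing buy and
# sell rates from a real-time interbank mid by applying a margin in bps.
def customer_rates(mid, margin_bps):
    """Return (customer_buy, customer_sell) rates around the interbank mid."""
    margin = margin_bps / 10_000          # 50 bps -> 0.0050
    return mid * (1 + margin), mid * (1 - margin)

# Hypothetical EUR/USD mid of 1.0850 with a 50 bps margin:
buy, sell = customer_rates(mid=1.0850, margin_bps=50)
```

The hard part isn't the formula, it's keeping `mid` accurate and current, which is exactly what the real-time feed is for.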
Okay, so talk about the technical side. What's the advantage here for, say, the financial software developers building these platforms?
Right, this is where the tech advantage really shines. Real-Time Optimized delivers a zero-footprint stream of real-time FX prices directly from the public cloud.
Zero footprint, meaning?
Meaning you don't need to install any heavy software or hardware on your own premises. The data just streams directly to you from the cloud, ready to integrate and use. Super efficient. And companies can pick and choose exactly what they need from LSEG's really extensive foreign exchange data catalog. We're talking pricing for 175 currencies, over 500 currency pairs, sourced from more than 2,000 contributors. That includes everything: FX spot, crosses, forwards, swaps, fixings, NDFs, those are non-deliverable forwards.
Right, for restricted currencies.
Exactly. Plus OTC options, volatility surfaces, currency indices just unparalleled depth for pretty much any FX strategy or requirement.
And you mentioned exclusive venues earlier. Does that apply to FX too?
It does, and this is really compelling for developers working on FX platforms. LSEG's FX data is further supported by exclusive pricing from two powerhouse trading venues, FX Matching and FXall.
Okay, tell us about those.
So FX Matching is a central limit order book, a CLOB. It offers real-time credit screening, great price discovery, concentrated liquidity, and efficient execution, really geared towards active FX traders. FXall, on the other hand, gives active traders access to really deep FX liquidity from over 200 providers, serving about 2,400 buy-side institutions. It offers choice and execution across multiple leading liquidity venues. So having data integrated from both gives you a really comprehensive, deep view of the whole FX market.
Got it. And how easy is it to actually connect internal systems to this data feed?
The setup? Well, it can be surprisingly quick. Connecting internal systems to real-time optimized FX data, whether it's for setting rates, managing risk, trading it can take as little as just a few hours.
Really? Just a few hours?
Yeah, because it uses industry-standard open APIs, things like WebSockets, and popular languages like Python. There are also high-performance development kits for Java and C++ if you need that extreme speed.
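To give a flavor of what a WebSocket integration involves, here's a sketch that just constructs the kind of JSON login and subscription messages such a session would send. The exact field names here are assumptions for illustration, loosely modeled on common WebSocket JSON market data protocols; the real schema is in the Real-Time Optimized API documentation:

```python
# Illustrative sketch: building the JSON payloads a WebSocket market data
# session might exchange. Field names/values are hypothetical examples,
# not the confirmed Real-Time Optimized wire format.
import json

def login_message(user, app_id="256", position="127.0.0.1"):
    """Hypothetical login request sent once the socket is open."""
    return {
        "ID": 1,
        "Domain": "Login",
        "Key": {
            "Name": user,
            "Elements": {"ApplicationId": app_id, "Position": position},
        },
    }

def subscribe_message(ric, msg_id=2):
    """Hypothetical streaming request for a single instrument code."""
    return {"ID": msg_id, "Key": {"Name": ric}}

# Serialized payloads that would be written to the socket after connecting:
login = json.dumps(login_message("machine-id"))
request = json.dumps(subscribe_message("EUR="))
```

The "few hours" claim in the transcript is plausible precisely because the work reduces to opening a standard WebSocket, sending JSON like this, and parsing JSON updates back, rather than installing a proprietary on-prem stack.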
Okay, makes sense.
And, given the nature of FX markets, these companies really appreciate having robust operational resilience baked right into Real-Time Optimized. Plus, support services that operate around the clock, matching their own demanding schedules. Matt Eddy highlights this too, saying, “Our data enables these companies to support their clients in the best way possible, while at the same time managing risk, compliance, and their margins.”
Okay, so wrapping this all up, what does this really mean for you listening, whether you're a quant, a high-frequency trader, a financial software developer, market data engineer, or an active trader? What's the big picture here?
Well, I think Real-Time Optimized being built in the cloud clearly supports a huge range of critical use cases right across the trade life cycle. It genuinely streamlines complex operations. The key takeaways are pretty clear, I think. First, you get access to high-quality, normalized real-time data: over 100 million instruments from a single feed, covering the widest array of exchange and OTC markets.
That unified view again.
Exactly. And the data is aligned to LSEG's data model. That ensures seamless connectivity, front, middle, back office, and links easily to other valuable data sets like machine-readable news. It provides that truly unified data environment.
And it fully enables using all those constantly evolving cloud computing and analytics tools, right?
Which dramatically cuts the time, the resources needed just to work with the data, speeds up development cycles.
Definitely. Plus, the machine IDs, they're flexible. They're not tied to a physical site. That means you can easily switch applications to new sites or move them to the cloud, giving you unparalleled agility.
That's huge.
And importantly, critically, you get support from a global team. Plus, there's a professional services team that really gets the specific needs of firms who are either transitioning to the cloud or already operating in a cloud-native way.
Right.
Look, the fundamental shift to cloud-native data solutions, it's not just about efficiency gains anymore. It's really about enabling a level of agility of integrated insight that was frankly just unattainable with traditional market data infrastructure.
Okay, so that brings us to a final thought, a question for you to ponder after this deep dive. As market data becomes increasingly consolidated, increasingly normalized across that entire trade lifecycle, how do you think this will reshape traditional departmental roles across the front office, middle office, back office, and risk? And how will it foster maybe new forms of collaboration within financial institutions over, say, the next five years? Something to think about.
Indeed.
And that wraps up today's deep dive here on LSEG Insights.
Yeah.
Thanks for tuning in. If you enjoyed the discussion, please don't forget to subscribe and maybe share this with your colleagues in the trading and tech community.
Thanks for joining us.