Data has historically been considered simply a by-product of business systems. At financial institutions, and at global banks in particular, the focus with data, and especially reference data, has largely been on how to make it work, not necessarily on how to make it right.
Reference data, whether it’s legal entity data, instrument data, pricing, corporate actions or standing settlement instructions (SSIs), has become mission-critical for banks. It underpins almost any process or activity in the core areas of accounting, trading and risk management. It is also essential for auditing and reporting, as well as ensuring compliance and transparency.
Challenges including intersystem connectivity, outdated distribution methods and a diversity of product reference data sets make it extremely difficult for banks to optimize their reference data without adequate data management software.
This is why a growing number of banks are looking to centralize the management of their reference data, eliminating organizational and technological silos to achieve seamless integration of reference data.
However, a high proportion of data management initiatives fail because banks underestimate the scale and complexity of the challenge and adopt a technology-led approach rather than treating the program as a business-led initiative. Without the buy-in of key sponsors across the breadth of the institution, a sub-optimal outcome is inevitable.
With 25 years’ experience on the banking frontline, Asset Control’s Managing Director for Global Markets, Paolo Mittiga, will be writing a series of blogs examining in detail the seven critical steps to optimizing a reference data management change project:
1. Engage business, prove the business case and use specialized resources
2. Establish data governance with clear responsibilities and accountabilities
3. Create vision, ensure it is socialized and sold to the business and data governance
4. Establish team, clear accountabilities and define the program office
5. Create architecture, decide buy vs. build and create roadmap
6. Establish a clear partnership with vendor
7. Ensure execution
These steps are based on Paolo’s real-world experiences in successfully delivering firm-wide data management initiatives at global banks. He has learnt firsthand the success that sticking to them can bring.
Paolo will therefore provide a practical guide to optimizing a data management delivery based on these ‘seven data wins’ and demonstrate how large banks can improve data quality, service levels and technology management.
Those on the trading front line have wised up to the art of managing the vast amount of data the financial markets now throw at them. And let’s face it, they had to: downgrades and debt crises, shrinking trade sizes, and shorter windows for processing, pricing and reconciliation are just a few of the factors that have created more data to access, analyze and act upon.
However, as we highlighted on Tabb Forum recently, anyone who thinks this is a sell-side only problem is missing the point. Buy-sides, with their multi-broker strategies and their own alphabet soup of TCA, DMA, EMS and ECN, have been getting closer to the trade for years. What’s more, no regulator, auditor, investor or custodian is going to fall for the ‘broker ate my homework’ excuse – as choppy markets make compliance with client mandates that much harder.
Fund managers, like everyone else, need to make sense of and meaningfully use the increased information available to them and, importantly, understand the impact of sudden market movements on the shape of their portfolios – if only to make sensible decisions about which brokers are delivering alpha and which are relying on the reputation of their star traders. In these volatile times, which show no respect for traditional institutions, counterparty risk stalks the markets and leaves vulnerable anyone who does not have an accurate and immediate handle on the state of play.
So it’s time for buy-side firms to examine how data works for them, not just from a risk and compliance perspective but also in terms of gaining a competitive edge. If they don’t, and their course is anything but smooth sailing, investors and regulators will no doubt be queuing up to find out why.
At Asset Control we’re always looking for innovative ways to ensure we are serving our client base to the best possible standard, really getting into their heads and anticipating new requirements to build into our product roadmap.
So, in the spirit of the old adage about the poacher turned gamekeeper, Asset Control has developed a new initiative: Customers in Residence, in which individuals from key clients come to work with us. We are doing it so we can tap into their expertise and utilise their on-the-ground experience, bringing a whole new meaning to KYC.
Why now, you may be asking? Well, it goes without saying that the markets are undergoing phenomenal change, and this shows no sign of stopping. So our customers need to ensure they are ready for impending uncertainties and a less well-understood future. At the same time, they must ensure they are in a position to take advantage of all the opportunities a multi-asset, multi-geographical trading environment presents.
This all means firms have big technology investment decisions to make and our customers need to know that when it comes to data management, we have thought of everything.
We’ve always had our finger on the pulse, but there’s nothing like hearing the challenges and opportunities straight from the horse’s mouth. That’s why we have a new addition to the team, Paolo Mittiga, who has previously worked at the likes of Credit Suisse and Citadel. He has extensive experience across the tier-one buy-side and sell-side community and will work with our existing team to strengthen our technology roadmap, give guidance on where to hone our focus and provide advice on how to develop current strategies.
And if you haven’t heard enough, watch this space for Paolo’s perspective on his work with us as he contributes his very own blogs to our site.
In our previous blog post we talked about ending the over-reliance on the ratings agencies as the sole source of information for making critical investment decisions. To gain a comprehensive, 3D view of a market, a sector or even an individual asset – which, let’s face it, is exactly what clients are paying for – we believe it’s important the industry looks at a broad set of data in addition to the AAA rating.
But with the volume of data now available and the required speed to market, taking a multi-source approach can cause quite the operational headache.
Manually compiling and filtering data for these purposes is very costly and time-consuming. And it’s not a problem that can be overcome simply by adding more people to the process – it’s about arming the people you have with tools that allow them to harness more information more effectively.
Not surprisingly, technology can bring relief: a solution that consolidates, compares and contrasts a multitude of disparate data sources and delivers the results in a consistent, validated form to multiple consuming systems. This is the only way to arm market participants with the relevant and necessary information.
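To make the idea concrete, here is a minimal sketch in Python of how such a solution might compare prices from several vendors, flag outliers for review and publish a single validated value. The vendor names, prices and the simple tolerance rule are illustrative assumptions, not a description of any particular product.

```python
from statistics import median

# Hypothetical vendor quotes for a single instrument; source names and prices are invented.
quotes = {
    "vendor_a": 101.42,
    "vendor_b": 101.45,
    "vendor_c": 99.10,   # looks like an outlier
}

TOLERANCE = 0.01  # flag anything more than 1% away from the cross-vendor median

def consolidate(vendor_quotes, tolerance=TOLERANCE):
    """Return a validated consensus price plus any exceptions for a data steward to review."""
    mid = median(vendor_quotes.values())
    exceptions = {src: px for src, px in vendor_quotes.items()
                  if abs(px - mid) / mid > tolerance}
    accepted = {src: px for src, px in vendor_quotes.items() if src not in exceptions}
    gold_copy = median(accepted.values()) if accepted else None
    return gold_copy, exceptions

price, breaks = consolidate(quotes)
print(price)   # consensus value distributed to consuming systems
print(breaks)  # outliers routed for review, e.g. {'vendor_c': 99.1}
```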
Indeed, the fundamental things that deliver value in the long term, such as research, analysis and due diligence, are fuelled by access to accurate, accessible and actionable information…Triple A, but not just in the conventional sense.
These are the components that ultimately drive value across a trading or investment business to create unique intelligence that generates alpha. However, the capability to efficiently pull a wide variety of data from more than one source demands investment in appropriate infrastructure. Without it, market participants are trading with limited intelligence.
The recent financial crisis has made credit rating agencies the target of greater scrutiny. There is no denying that closer attention needs to be paid to the business models of the big three agencies, and the push for greater transparency around these organizations is certainly justified.
However, by placing the blame solely on credit agencies, the industry is ignoring a much bigger issue: how financial services firms have become dependent on credit ratings as a single source of data for making key trading and investment decisions.
Over-reliance can be a dangerous game in any situation, let alone when the markets can move drastically at the drop of a hat (or a rating), and there is increasing pressure from regulators and investors to obtain the best possible outcomes for clients under best execution requirements.
So we suggest looking at a credit rating assessment as just one piece of the puzzle. Comparing multiple data sources and importantly, critically analyzing the results, is the only way to achieve a truly accurate assessment of investments.
Indeed, credit rating analysis should be checked against sales data, historical data and pre-payment information as well as data from niche providers to create well-rounded, validated intelligence that can properly inform decision-making.
It’s time for firms to consider the steps that need to be taken to end the over-reliance on the ratings agencies that has characterized the industry for too many years.
I remember when the first ATM cards came about and there were two extreme types of users. First, there was the young Friday-night reveler who marveled at the accessibility, convenience, ‘always-on’ nature (before ‘always-on’ was even coined as a phrase) and general ‘cool’ factor of this money-on-demand facility. They shared cards, used them until the money ran out and showed off their new-found sophistication to their friends.
Then there were the older, established people, like my parents, who felt it was an uncontrolled, unsecure, provocative feature that bordered on recklessness on the part of the banks. The whole sorry business would lead to accounts being drained mysteriously, armies of middle-class people shuffling around like bankrupt zombies, and the fall of democracy and the fee-world as we know it. Needless to say, my parents continued to live under their own roof – at least until that dreadful mix-up over the ‘herb garden’ in the backyard, but that’s a story for another day!
Between these two extremes lay the majority of people who saw the convenience factor but worried about the potential for fraud or theft, and therefore started to tip-toe their way towards adoption. This middle-ground group understood the pros and cons and, importantly, lobbied for enhancements and changes that closed the loopholes and narrowed the risks. Out of this arose technology changes that maximized the benefits and mitigated the negatives.
When I look at cloud computing today I see a similar pattern. We have those who have embraced it and pushed the boundaries, removing the shackles of discrete boxes or on-site packages and taking advantage of the huge reach, the elastic capabilities and the ‘pay for what you use’ model.
When one looks at music and video storage, streaming and sharing, at accessibility through tablets and smartphones, and at the heart-stopping innovation and creativity of those who have used the vast internet as a canvas, it is very impressive. And at this point, most of the middle-ground equivalents have also gotten comfortable. The last frontier – the final step, I believe – sits with those who, like a novice parachutist standing in the doorway of a plane at 2,000 feet, simply have to take just one step…
In today’s world, in my world of data management, these are the institutions that just cannot contemplate the idea of letting their data go beyond their physical firewall. Rather than be paralyzed by fear, I think we have to choose to accept that the cloud is here to stay and look to repeat the revolutionary success it has had in other industries.
We should advocate the elasticity and multi-locational, super-redundant nature of the global network of data centers, which means that, with some foresight, the same set of data can be found in more than one place. We should celebrate the database technologies that allow replication, mirroring, gridding and splitting. We should marvel at the processing technologies that transfer huge volumes of data in small packets so that every second of uptime is maximized. We should revel in the new security technologies that protect both stored and in-flight data. And we should look for ways to combine snapshots synchronized on-site and stored in the cloud with bitemporal (valid-time and transaction-time) data management, so that we can recreate any moment in time…at any time, both for the data itself and for the provenance of that data.
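As a small illustration of the bitemporal idea, the sketch below (in Python, with an invented entity, ratings and dates) separates the date a fact became true in the real world from the date the system learned about it, so that any past view can be reproduced exactly.

```python
from dataclasses import dataclass
from datetime import date

# Each fact carries a valid-time range (when it was true in the real world) and a
# transaction-time range (when the system knew about it). Names and dates are invented.
@dataclass
class RatingRecord:
    entity: str
    rating: str
    valid_from: date      # when the rating took effect
    valid_to: date        # when it stopped being effective
    recorded_from: date   # when this version was first stored
    recorded_to: date     # when this version was superseded in the system

OPEN = date(9999, 12, 31)

history = [
    RatingRecord("ACME Corp", "AA", date(2010, 1, 1), date(2011, 7, 1), date(2010, 1, 2), OPEN),
    RatingRecord("ACME Corp", "A",  date(2011, 7, 1), OPEN,             date(2011, 7, 3), OPEN),
]

def as_of(records, valid_on, known_on):
    """Recreate what was true on valid_on, as the system knew it on known_on."""
    return [r for r in records
            if r.valid_from <= valid_on < r.valid_to
            and r.recorded_from <= known_on < r.recorded_to]

# The downgrade took effect on 1 July 2011 but was only loaded on 3 July:
print(as_of(history, date(2011, 7, 1), date(2011, 7, 2)))  # [] - not yet known to the system
print(as_of(history, date(2011, 7, 1), date(2012, 1, 1)))  # the 'A' record, with hindsight
```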
As software solution providers, and as an industry, we need to provide solutions that make use of these capabilities, embrace innovation and are optimized for use in such a world. This way, we can perhaps create the right atmosphere and a safety net that will allow the last step to be a baby step rather than a giant leap.
‘Is the value of your assets based on art rather than science – and how can you prove it?’ As sovereign debt crises continue to dominate the headlines, it’s a question worth asking, because, like most fixed income assets, the value of government bonds is based on a combination of verifiable facts and informed assumptions. The more competent your people are, the more accurate the assumptions which underpin your pricing models will be. But with plenty of incentives to game the numbers and produce higher valuations, it’s not unknown for the tail to wag the dog when it comes to pricing fixed income assets, and for people to find ways of creating the price they want.
We’ve all seen the consequences of that, but today you need to be able to justify those assumptions to regulators, auditors, investors and managers after the event. And that’s almost impossible if assumptions are recorded in various spreadsheets, random electronic files and post-it notes stuck to monitors. It’s also pretty hard if you have an impeccably controlled, technology-enabled environment in one department and a complete free-for-all in another. The right data management solution will bring discipline and transparency to the art of valuation without cramping the style of those doing the valuing. It will ensure consistency across the enterprise and enable every department to create, record, monitor and audit the valuations they need. No technology should stop you valuing any instrument or any asset in the ways you see fit – but at some point you will need to explain that decision. It’s time to think about a flexible, transparent, consistent and repeatable approach that lets you do just that.
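As a rough illustration of what ‘create, record, monitor and audit’ can mean in practice, here is a minimal Python sketch that stores each valuation alongside the assumptions that produced it. The instrument, field names and figures are made up, and a real solution would persist this in a governed database rather than an in-memory list.

```python
import json
from datetime import datetime, timezone

# A running log of valuations and the assumptions behind them, so each figure can be
# reproduced and justified later instead of living in spreadsheets and post-it notes.
valuation_log = []

def record_valuation(instrument, value, assumptions, valued_by):
    """Store the value together with its inputs, its owner and a timestamp."""
    entry = {
        "instrument": instrument,
        "value": value,
        "assumptions": assumptions,   # e.g. curve used, spread, liquidity haircut
        "valued_by": valued_by,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    valuation_log.append(entry)
    return entry

record_valuation(
    instrument="XYZ 4.5% 2030 government bond",   # illustrative instrument
    value=97.35,
    assumptions={"discount_curve": "sovereign_eur", "illiquidity_spread_bps": 35},
    valued_by="desk_a",
)
print(json.dumps(valuation_log, indent=2))  # an auditable trail for regulators and auditors
```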
News of the huge loss as a result of unauthorized trading at UBS this week immediately impacted everyone holding positions with the firm, and has placed the need for a single common legal entity identifier (LEI) beyond debate. For traders, their clients and regulators, an immediate, consolidated view of counterparty risk across asset classes, desks and geographies is now a bare necessity.
LEI standards are being developed to replace the intricate patchwork of identifiers currently used for the counterparties and ownership structures behind each transaction. But introducing standards on a global basis across the financial services sector has never been easy, and discussions on what the impact would be from a practical perspective continue to vex the data management industry.
Nobody yet knows what the final LEI standards will look like or how they will be implemented in practice. One thing that is certain, however, is that standardization around legal entities will create a huge data management headache for firms running on creaky proprietary systems. Firms simply cannot afford to try to accommodate the onslaught of regulatory change, of which the LEI is only one element, from what is, essentially, a standing start.
Indeed, the LEI isn’t just another box to tick on an audit or compliance form; it goes right to the heart of a firm’s counterparty risk management and, at the risk of sounding melodramatic, being able to respond rapidly is essential to minimize losses, or even to ensure survival.
Getting your house in order and putting the right system in place now is essential. Markets move too quickly for firms to respond via manual processes, and those relying on spreadsheets alone will be left behind. Moreover, if you invest in the infrastructure to spot these issues and take appropriate action quickly, the shape, size and format of the LEI won’t be a cause for concern, leaving you free to focus on the even more complex regulatory requirements that continue to cloud the horizon.
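To show why a single identifier matters operationally, here is a minimal Python sketch of rolling counterparty exposure up by LEI across desks. The trade records and identifiers are invented; the point is simply that a consistent key makes firm-wide aggregation a one-line roll-up rather than a reconciliation project.

```python
from collections import defaultdict

# Illustrative trades, each tagged with the counterparty's legal entity identifier.
# The LEIs, desks and exposure amounts below are invented for the sketch.
trades = [
    {"counterparty_lei": "LEI-EXAMPLE-0001", "desk": "rates",    "exposure": 12_500_000},
    {"counterparty_lei": "LEI-EXAMPLE-0001", "desk": "equities", "exposure": 4_200_000},
    {"counterparty_lei": "LEI-EXAMPLE-0002", "desk": "credit",   "exposure": 7_800_000},
]

def exposure_by_entity(trade_list):
    """Roll exposures up to one figure per legal entity, across desks and asset classes."""
    totals = defaultdict(float)
    for trade in trade_list:
        totals[trade["counterparty_lei"]] += trade["exposure"]
    return dict(totals)

print(exposure_by_entity(trades))
# With a single, consistent identifier this roll-up is trivial; with proprietary codes
# per system, the same entity fragments into several apparently unrelated totals.
```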
Far too many people in the data management industry think that there is a one-size-fits-all, static solution that will solve all data-related problems – and they spend a lot of their time promoting this idea. In reality there’s no such thing. This tired old fallacy has been hauled out for far too long, and unfortunately there are too many organizations that are just beginning to realize that what they’ve bought isn’t a solution – it’s just another problem.
So let’s go back to basics. Any data management infrastructure has to be appropriate for the size of the firm and the type of operation. The solution that is right for a 40-person hedge fund is very different from the solution needed by a global custodian with thousands of customers and tens of thousands of employees.
Most firms have multiple business units, product lines and investment strategies – all of which require different data sets used in different ways. Accounting and risk management will need different data sets than the trading desk. Operations want data on actual holdings, while analysts need data they can use for modeling ‘what if’ scenarios. The idea that you can impose a monolithic, inflexible data management structure, with a single data set and a single management tool, onto the modern business with all its complexities is manifestly false.
So let’s have a more realistic conversation about data management. And let’s start by calling out old-fashioned ideas about data management, and exposing them for the myths that they really are.
We all know that data volumes have gone intergalactic in the past few years. Businesses have to get more data, do more with it, more often and in a shorter timeframe. There is much greater demand for real-time understanding of valuations, exposures and risk. Both investors and regulators want more transparency and proof that management has put adequate operational procedures, controls and risk checks in place. Regulatory arbitrage is out of the question: demonstrating a consistent approach is unavoidable.
That’s a huge increase in operational complexity – and it is no longer something that can be avoided, ignored, or delegated down the chain of command. This is more than data management – this is data governance. And just like corporate governance, it goes all the way to the top of the organization. It might be operationally complex, but that doesn’t mean that ownership should remain in operational departments.