Data has historically been treated as simply a by-product of business systems. At financial institutions, and at global banks in particular, the approach to data, and especially reference data, has largely focused on how to make it work, not necessarily on how to make it right.
Reference data, whether it’s legal entity data, instrument data, pricing, corporate actions or standing settlement instructions (SSIs), has become mission-critical for banks. It underpins almost any process or activity in the core areas of accounting, trading and risk management. It is also essential for auditing and reporting, as well as ensuring compliance and transparency.
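To make ‘reference data’ concrete, here is a minimal, hypothetical sketch of the kind of instrument record these processes consume; every field name and identifier below is illustrative, not any vendor’s schema:

```python
# A deliberately minimal, hypothetical instrument reference record.
# Field names and identifier values are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class InstrumentReference:
    isin: str            # instrument identifier
    issuer_lei: str      # legal entity identifier of the issuer
    currency: str
    asset_class: str
    settlement_ssi: str  # standing settlement instruction reference

bond = InstrumentReference(
    isin="XS0000000000",                # placeholder ISIN
    issuer_lei="54930000000000000000",  # placeholder 20-character LEI
    currency="EUR",
    asset_class="corporate_bond",
    settlement_ssi="SSI-EUR-001",
)
```

Accounting, trading and risk systems all consume variants of this same record, which is why inconsistent copies held in separate silos are so damaging.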
Challenges including intersystem connectivity, outdated distribution methods and a diversity of product reference data sets make it extremely difficult for banks to optimize their reference data without adequate data management software.
This is why a growing number of banks are looking to centralize the management of their reference data, eliminating organizational and technological silos to achieve seamless integration.
However, a high proportion of data management initiatives fail because banks underestimate the scale and complexity of the challenge and adopt a technology-led approach rather than addressing the program as a business-led initiative. Without the buy-in of key sponsors across the breadth of the institution, a sub-optimal outcome is inevitable.
With 25 years’ experience on the banking frontline, Asset Control’s Managing Director for Global Markets, Paolo Mittiga, will be writing a series of blogs examining in detail the seven critical steps to optimizing a reference data management change project:
1. Engage business, prove the business case and use specialized resources
2. Establish data governance with clear responsibilities and accountabilities
3. Create vision, ensure it is socialized and sold to the business and data governance
4. Establish team, clear accountabilities and define the program office
5. Create architecture, decide buy vs. build and create roadmap
6. Establish a clear partnership with the vendor
7. Ensure execution
These steps are based on Paolo’s real-world experiences in successfully delivering firm-wide data management initiatives at global banks. He has learnt firsthand the success that sticking to them can bring.
Paolo will therefore provide a practical guide to optimizing data management delivery based on these ‘seven data wins’ and demonstrate how large banks can improve data quality, service levels and technology management.
At Asset Control we’re always looking for innovative ways to ensure we are serving our client base to the best possible standard, really getting into their heads and anticipating new requirements to build into our product roadmap.
So, harnessing the old adage of poacher turned gamekeeper, Asset Control has developed a new initiative: Customers in Residence, in which individuals from key clients come to work alongside us. We are doing it so we can tap into their brains and utilise their on-the-ground experience, bringing a whole new meaning to KYC.
Why now, you may be asking? Well, it goes without saying that the markets are undergoing phenomenal change, and this shows no sign of stopping. Our customers need to be ready for impending uncertainties and a less well-understood future, while also positioning themselves to take advantage of all the opportunities a multi-asset, multi-geographical trading environment presents.
This all means firms have big technology investment decisions to make and our customers need to know that when it comes to data management, we have thought of everything.
We’ve always had our finger on the pulse, but there’s nothing like hearing the challenges and opportunities straight from the horse’s mouth. That’s why we have a new addition to the team: Paolo Mittiga, who has previously worked at the likes of Credit Suisse and Citadel. He has extensive experience across the tier one buy-side and sell-side community and will work with our existing team to strengthen our technology roadmap, give guidance on where to hone our focus and provide advice on how to develop current strategies.
And if you haven’t heard enough, watch this space for Paolo’s perspective on his work with us as he contributes his very own blogs to our site.
Financial firms remain at a familiar crossroads when it comes to the technology they implement: build something proprietary in-house or buy in a solution from sector specialists.
However, as highlighted on this very blog, one of the things that differentiates the current position of the financial services industry is the sheer amount of regulatory change taking place. On what sometimes feels like a daily basis, regulators change rules, adapt them, tweak them and, in some cases, create entirely new rules to which the entire industry must react and adapt, or face the consequences.
For in-house technologists at these firms this presents a problem. Just like the Dodo, proprietary solutions must evolve with the changing landscape around them or face extinction. A Darwinist evaluation of in-house teams will reveal a graveyard of solutions which simply couldn’t stand the pace of the industry in which we work. Too often, market, regulatory and client-driven changes arrive too quickly for such solutions to keep pace.
This situation, combined with the sweeping pace of change within capital markets, is prompting many firms to look more closely at the viability of their current operations at an application level.
The keyword here is future-proofing. A great deal of IT and resource investment is spent on overly complex and expensive systems to address this issue. All too often such systems, built in-house, are no longer fit for purpose and are uneconomical and time-consuming to maintain and expand as new instruments come on board, as trading volumes increase and as requirements change with the acquisition of new business lines.
So what’s the alternative? A vendor supported and maintained commercial product. But before diving into the vendor space, ask yourself what such a product must look like.
Obviously anything you implement must cover today’s requirements, but your proprietary solutions were already built with those requirements in mind. The real question is, what might the future look like? For example, what asset classes might you be trading, and what regulatory changes are coming into view on the horizon?
Stare into your crystal ball for a moment: you’ll need something that can integrate seamlessly and easily with a broad array of applications across your firm, and that is scalable enough to meet current business needs and future performance requirements, which will differ according to the various consuming business units.
The industry is constantly evolving; if your technology isn’t prepared to do the same, then you face a Dodo-esque future.
Stress: it affects us all, and as we’ve seen over the past few years, the financial world is no different. Luckily for European financial institutions, help is at hand in the form of the regulators’ very own stress reliever: Basel III. Designed to enhance banks’ loss-absorbing capacity, and thus their resilience to crisis situations, the big brother of Basel I and II is a regulatory game changer and the cornerstone of the industry’s attempt to control risk.
However, Basel III exists because Basel II fundamentally didn’t work. By allowing financial institutions to create their own sophisticated risk-weighted models, Basel III hands the initiative back to the institutions, allowing them to justify their own risk weightings.
The question then becomes how firms can justify the lowest capital holdings possible. And the answer? By having the utmost confidence in the data on which they base their decisions, as risk infrastructure and calculations are nothing without the data that feeds them. In other words, banks will need to demonstrate how and why they have arrived at their risk weightings, and the only way to do this is through their data.
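To see why the underlying data matters so much, here is a deliberately simplified sketch of a risk-weighted capital calculation; the exposures, weights and figures are hypothetical illustrations, not any prescribed Basel model:

```python
# Illustrative risk-weighted assets (RWA) calculation.
# All exposure amounts and risk weights below are hypothetical;
# real Basel III internal models are far more granular.
exposures = [
    ("sovereign_bond", 100_000_000, 0.00),  # assumed 0% risk weight
    ("corporate_loan",  50_000_000, 1.00),  # assumed 100% risk weight
    ("mortgage_book",   80_000_000, 0.35),  # assumed 35% risk weight
]

rwa = sum(amount * weight for _, amount, weight in exposures)  # 78,000,000
tier1_capital = 10_000_000

# Capital ratio = eligible capital / risk-weighted assets.
print(f"Tier 1 ratio: {tier1_capital / rwa:.2%}")  # 12.82%
```

Reclassify the corporate loan under a lower weight and the reported ratio improves immediately, which is exactly why regulators will ask you to prove the data behind every classification.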
So, when it comes to Basel III compliance, if you are going to set your own risk model:
- Your data had better be accurate
- You had better be able to prove it
- And, you had better be able to prove the validity of your data’s sources
These are the commandments of Basel III; stick to them, and you’ll quickly see your stress-related stress vanish.
Are you sitting comfortably? Are you ready to be served another regulatory delight?
In the spotlight this week: tax regulation. The Foreign Account Tax Compliance Act, or as it’s more commonly known, FATCA, requires foreign financial institutions (FFIs) to gather sensitive data, including balances, receipts and withdrawals, on US account holders and to identify accounts for reporting to the Internal Revenue Service (IRS). As is commonplace with regulatory changes of all flavours, the data requirements go far beyond the current mandatory tax obligations on financial institutions.
However, it’s not just more data that binds together the plethora of new regulations facing the financial markets, but also the increasing focus on the quality of data held and analysed by financial institutions. Indeed, as regulators strive for transparency, quality data is the fundamental ingredient in ensuring that new regulations, from FATCA to Form PF, effectively measure and manage what they set out to.
Although investment is required to achieve triple-A-rated data, the cost of not having it is even greater. Under FATCA, for example, a withholding tax of 30% will be applied to payments made to FFIs if they fail to report or inaccurately classify clients, and that’s without even mentioning the reputational damage that can far outweigh any fines or other financial penalties.
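To make that penalty concrete, here is a trivially simple illustration of the withholding exposure; the payment amount is a hypothetical figure:

```python
# FATCA prescribes 30% withholding on certain US-source payments
# to non-compliant FFIs. The payment amount below is illustrative only.
WITHHOLDING_RATE = 0.30

payment_to_ffi = 25_000_000  # hypothetical US-source payment
withheld = payment_to_ffi * WITHHOLDING_RATE

print(f"Withheld under FATCA: {withheld:,.0f}")  # 7,500,000
```

A single misclassified client relationship can therefore cost millions on one payment flow alone, dwarfing the cost of getting the data right in the first place.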
So, if your systems are feeling the stress with FATCA compliance then the indigestion will only persist when it comes to stomaching the full menu of regulatory initiatives being served to the global financial markets. Firms must ensure they have the key compliance ingredient in stock if they are to confidently manage the onslaught of regulatory changes they face in 2012 and beyond.
‘Is the value of your assets based on art rather than science, and how can you prove it?’ As sovereign debt crises continue to dominate the headlines, it’s a question worth asking, because like most fixed income assets, the value of government bonds is based on a combination of verifiable facts and informed assumptions. The more competent your people are, the more accurate the assumptions which underpin your pricing models will be. But with plenty of incentives to game the numbers and produce higher valuations, it’s not unknown for the tail to wag the dog when it comes to pricing fixed income assets, and for people to find ways of creating the price they want.
We’ve all seen the consequences of that, but today you need to justify those assumptions to regulators, auditors, investors and managers after the event. And that’s almost impossible if assumptions are recorded in various spreadsheets, random electronic files and post-it notes stuck to monitors. It’s also pretty hard if you have an impeccably controlled, technology-enabled environment in one department and a complete free-for-all in another.
The right data management solution will bring discipline and transparency to the art of valuation without cramping the style of those doing the valuing. It will ensure consistency across the enterprise and allow every department to create, record, monitor and audit the valuations they need. No technology should stop you valuing any instrument or any asset in the ways you see fit, but at some point you will need to explain that decision. It’s time to think about a flexible, transparent, consistent and repeatable approach that lets you do just that.
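As a concrete illustration of how much a single ‘informed assumption’ can move a valuation, consider a minimal bond-pricing sketch; the coupon, maturity and yields below are hypothetical, not any firm’s model:

```python
# Present value of a fixed-coupon bond. The verifiable facts are the
# coupon, face value and schedule; the informed assumption is the yield.
def bond_price(face: float, coupon_rate: float, ytm: float, years: int) -> float:
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + ytm) ** t for t in range(1, years + 1))
    pv_face = face / (1 + ytm) ** years
    return pv_coupons + pv_face

# Hypothetical 10-year 5% annual-coupon bond:
print(f"{bond_price(100, 0.05, 0.050, 10):.2f}")  # 100.00 at a 5.0% yield
print(f"{bond_price(100, 0.05, 0.045, 10):.2f}")  # 103.96 at a 4.5% yield
```

A 50 basis point change in the assumed yield moves this valuation by almost 4%, which is precisely why the assumption, its source and its approval trail need to be recorded somewhere more durable than a post-it note.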
The new IOSCO standards are designed to ‘beef up’ the essential infrastructure supporting global financial markets and better position institutions to foresee, withstand and avoid financial shocks. The hope is that events such as the 2008 Lehman Brothers collapse can be avoided. All good news so far!
But this additional security will come at a compliance cost as voluntary reporting becomes a thing of the past. Data-wise, centrally collecting and reporting data in trade repositories (TRs) will have huge implications for data management infrastructure and governance processes, especially since the majority of current systems weren’t built to cope with the onslaught of requests for greater transparency the markets are currently witnessing.
TRs are recognized throughout the financial regulatory community for their ability to bring transparency to previously opaque markets. The Dodd-Frank Act has identified repositories as one of the “three pillars” of its new infrastructure requirements.
Indeed, in the post-crisis era, there is no getting away from the fact that transparency and reporting have become the hallmarks of the financial services industry. And, what’s more, regulatory pressures and operational complexities will only continue to multiply.
This is no time for inadequate solutions and temporary palliatives. Truly preparing for IOSCO and the plethora of other regulatory changes on the horizon requires a fresh and strategic approach, one that delivers regulatory compliance not only for today but also for the future.