Data Management Blog
Following the 2008 financial crisis, Basel III was created to enhance the capacity of banks
to withstand market shocks and is considered core to a number of reforms within the global banking industry. Yet the views of politicians and the banking sector with regard to the new rules appear to be on an ever-divergent path. Cabinet ministers’ commitment to the rules was underlined last month when Germany’s government agreed to move forward with introducing stricter capital requirements, calling on its European counterparts to do the same.
However, banks have clear misgivings about the rules, citing some key unintended consequences, especially in the shipping and aircraft financing sectors. A senior Bank of England official, Andy Haldane, has even gone so far as to suggest that Basel III has become too complicated to be effective, proposing a move back to the drawing board to design a more minimalist approach. In reality, this type of U-turn is not something we are used to seeing from the regulatory community. Therefore, despite their disquiet, banks must begin preparations in earnest to ensure they are not caught off guard. A wait-and-see approach will not do.
While Haldane’s suggestion that “less is more” might curry favor with the banking community, the same cannot be said for financial institutions’ approach to compliance in light of the myriad new regulations they face. Effective and efficient data management sits at the heart of this, and without the right infrastructure in place, banks will struggle to deal with the increased regulatory scrutiny resulting from, but by no means limited to, Basel III. After all, the quality of information management relies heavily on the data that feeds it.
However, ripping up the rule book in this instance does not mean ripping out current systems and a total overhaul of technology. What it does mean is that banks must take a strategic and considered approach to data management across people, processes and practices. Meeting the new requirements head-on with accurate, accessible and actionable information is the only way to instill confidence in the analysis and decision-making that underpin compliance, not least with Basel III.
The financial services industry is currently facing very challenging conditions. Cost pressures and reams of new regulations mean financial institutions are continually reassessing their processes to ensure they are delivering not only compliance but also value to the bottom line.
It’s fair to say that most regulations come with a pretty substantial implementation price tag, and hardly a day goes by when the cost of compliance does not feature in the headlines. It’s another way for the laissez-faire among us to criticise the regulators. Responses to new rules, from EMIR and MiFID II to OTC derivatives reporting and clearing requirements, are testament to that.
However, it is interesting to see the proposed Legal Entity Identifier (LEI) programme has, in contrast to the usual regulation bashing, been welcomed by market participants. Many are likening it to the ‘golden chalice’ the industry has been waiting for and most can’t deny that the initiative (if implemented correctly) will considerably decrease the chances of history (2008/Lehman Brothers) repeating itself.
Some believe that the LEI will be an opportunity to reduce operational costs as reporting becomes more efficient through a streamlined ID process and fewer resources are required to unpick the currently tangled and murky world of exposures and counterparty risk. Others take it a step further and suggest revenue-generating opportunities will be created as cash and manpower are freed up to focus on more lucrative business activities.
For any firm hoping to make the most of these opportunities, we have just one piece of advice: it’s what’s on the inside that counts. If any of this positive thinking is to be realised, firms must ensure that all internal systems fit the bill – in today’s environment, that’s the only way this story can end happy-LEI ever after.
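For the technically minded, the “streamlined ID process” is easy to picture. The format the LEI initiative has converged on (ISO 17442) is a 20-character alphanumeric code whose final two characters are check digits verified with the ISO 7064 MOD 97-10 scheme – the same family of checksum used for IBANs. A minimal Python sketch of that verification (the function name is ours, not from any official toolkit):

```python
def is_valid_lei(lei: str) -> bool:
    """Validate length, character set and ISO 7064 MOD 97-10 check digits of an LEI."""
    lei = lei.strip().upper()
    if len(lei) != 20 or not lei.isalnum():
        return False
    # Map letters to two-digit numbers (A=10 ... Z=35); digits map to themselves.
    numeric = "".join(str(int(char, 36)) for char in lei)
    # The whole code, check digits included, must reduce to 1 modulo 97.
    return int(numeric) % 97 == 1

print(is_valid_lei("NOT-AN-LEI"))  # False: wrong length and illegal characters
```

One identifier, one deterministic test – which is precisely what makes untangling counterparty exposures cheaper than reconciling a patchwork of internal codes.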
Stress: it affects us all, and as we’ve seen over the past few years, the financial world is no different. Luckily for European financial institutions, help is at hand in the form of the regulators’ very own stress reliever: Basel III. Designed to enhance banks’ loss-absorbing capacity, and thus their resilience in crisis situations, the big brother of Basel I and II is a regulatory game changer and the cornerstone of the industry’s attempt to control risk.
True to trilogy form, Basel III raises the stakes on its earlier instalments, boasting a comprehensive set of reform measures, developed by the Basel Committee on Banking Supervision, to strengthen the regulation, supervision and risk management of the banking sector. This move has sparked calls from Germany and France to water down some important elements of the Basel III guidelines on capital requirements in order to mitigate any “negative effect” on growth.
However, Basel III exists because Basel II fundamentally didn’t work. By allowing financial institutions to create their own sophisticated risk-weighted models, the new regime hands the initiative back to the institutions, allowing them to justify their own risk weightings.
The question then becomes how firms can justify the lowest capital holdings possible. And the answer? By having the utmost confidence in the data on which they base their decisions, as risk infrastructure and calculations are nothing without the data that feeds them. In other words, banks will need to demonstrate how and why they have arrived at their risk weightings, and the only way to do this is through their data.
So when it comes to Basel III compliance, if you are going to set your own risk model:
- Your data had better be accurate
- You had better be able to prove it
- And, you had better be able to prove the validity of your data’s source
These are the commandments of Basel III; stick to them, and you’ll quickly see your stress-related stress vanish.
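In practice, those commandments translate into checks that a data management layer can actually enforce. Here is a minimal Python sketch – every name, threshold and feed below is invented for illustration, not drawn from the Basel texts:

```python
from dataclasses import dataclass
from datetime import datetime

# Assumed whitelist of approved feeds -- illustrative only.
APPROVED_SOURCES = {"vendor_feed_a", "exchange_direct"}

@dataclass
class PricePoint:
    instrument: str
    value: float
    source: str            # where the figure came from
    received_at: datetime  # when it arrived, so the claim can be evidenced

def validate(point: PricePoint, reference: float, tolerance: float = 0.01) -> list:
    """Return the rule violations for one data point; an empty list means it passes."""
    issues = []
    # Commandment 1 -- accuracy: agree with an independent reference within tolerance.
    if reference and abs(point.value - reference) / reference > tolerance:
        issues.append("accuracy: deviates from the reference price")
    # Commandment 2 -- provability: every figure must carry evidence of its arrival.
    if point.received_at is None:
        issues.append("provability: no audit timestamp")
    # Commandment 3 -- source validity: only approved feeds may drive risk weightings.
    if point.source not in APPROVED_SOURCES:
        issues.append("source: '%s' is not an approved feed" % point.source)
    return issues

# Hypothetical instrument code, for illustration only.
point = PricePoint("EXAMPLE-BOND-1", 101.40, "vendor_feed_a", datetime.now())
print(validate(point, reference=101.35))  # [] -- passes all three checks
```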
Are you sitting comfortably? Are you ready to be served another regulatory delight?
In the spotlight this week: tax regulation. The Foreign Account Tax Compliance Act, or as it’s more commonly known, FATCA, requires foreign financial institutions (FFIs) to gather sensitive data, including balances, receipts and withdrawals, on US account holders and to identify accounts for reporting to the Internal Revenue Service (IRS). As is commonplace with regulatory changes of all flavours, the data requirements go far beyond the current mandatory tax requirements on financial institutions.
However, it’s not just more data that binds together the plethora of new regulations facing the financial markets, but also an increasing focus on the quality of the data held and analysed by financial institutions. Indeed, as regulators strive for transparency, quality data is the fundamental ingredient in ensuring that new regulations, from FATCA to Form PF, effectively measure and manage what they set out to.
At Asset Control, the importance of quality data – what we call Triple A rated data (data with no chance of being downgraded) – has never been underestimated.
Although investment is required to achieve Triple A rated data, the cost of not having it is even greater. Under FATCA, for example, a withholding tax of 30% will be applied to payments made to FFIs that fail to report or that inaccurately classify clients – and that’s without even mentioning the reputational damage, which can far outweigh any fines or other financial penalties.
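To make that 30% concrete, here is a deliberately simplified sketch in Python. Compliance is reduced to two boolean flags; the real FATCA determination is far more involved:

```python
WITHHOLDING_RATE = 0.30  # FATCA withholding on payments to non-compliant FFIs

def fatca_withholding(payment: float, reported: bool, correctly_classified: bool) -> float:
    """Amount withheld from a US-source payment under the simplified rule above."""
    compliant = reported and correctly_classified
    return 0.0 if compliant else payment * WITHHOLDING_RATE

# A $10m payment to an FFI that misclassified a single client forfeits $3m.
print(fatca_withholding(10_000_000, reported=True, correctly_classified=False))
```

One bad classification field and nearly a third of the payment is gone – data quality priced in the bluntest possible terms.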
So, if your systems are feeling the stress with FATCA compliance then the indigestion will only persist when it comes to stomaching the full menu of regulatory initiatives being served to the global financial markets. Firms must ensure they have the key compliance ingredient in stock if they are to confidently manage the onslaught of regulatory changes they face in 2012 and beyond.
Regarded as the “Basel of the insurance world,” Solvency II will impose new obligations on insurers and reinsurers operating in Europe. And with 2012 fast approaching, it is just two years until C-Day (compliance day): January 2014 is the latest amended date from the Financial Services Authority (FSA). If this deadline isn’t in their diaries already, insurers and reinsurers will have a Solvency II-shaped New Year’s resolution come January 1, 2012.
Solvency II fundamentally reviews the capital adequacy regime for the European insurance industry. It aims to establish a revised set of EU-wide capital requirements and risk management standards that will replace the current solvency mandates.
Meeting these new requirements won’t be possible without enterprise risk management frameworks that demand clean and timely data for portfolio valuation – data capable of standing up to close scrutiny. The data-centric implications of an overhaul of this magnitude are huge, from both a collection and a reporting standpoint.
The recent Solvency II: Internal Model Approval Process (IMAP) Thematic Review Findings report from the FSA highlights serious concerns regarding insurers’ capabilities to deliver accurate data under Solvency II. According to the report, “Few firms provided sufficient evidence to show that data used in their internal model was accurate, complete and appropriate.”
But what exactly does it mean for data to be “accurate, complete and appropriate”?
Let’s start with appropriateness. The language of the regulator states that insurance companies must ensure that their data is “suitable for the intended purpose.” In this case, “appropriate” is an industry-specific term. Insurers must prove they are able to cover any and all claims made. In short, their data must demonstrate the ability to cover the risks associated with potential claims.
Accuracy is a measure of the confidence that can be placed in the data. It must therefore be error-free, consistent, and demonstrably so. Firms need to prove that they are using and applying this data to their undertakings. It sounds like a simple enough ask, but the fact of the matter is that for many firms, data is simply not up to the required standard.
And finally, completeness. A complete data set combines sufficient historical data, giving insurers the full picture of risk, with the necessary quantities of current risk data.
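Taken together, the three criteria lend themselves to automated checks. Below is a minimal Python sketch; the thresholds and risk categories are our own illustrative assumptions, not figures from the directive or the FSA:

```python
from datetime import date, timedelta

MIN_HISTORY_YEARS = 5  # assumed depth for "complete"; the right figure is model-specific
REQUIRED_RISK_COVERAGE = {"market", "credit", "underwriting"}  # assumed for "appropriate"

def assess_series(observations: dict, risk_coverage: set) -> dict:
    """Score one data series against the accurate/complete/appropriate criteria."""
    dates = sorted(observations)
    # Complete: enough history to give the full picture of risk.
    complete = bool(dates) and (dates[-1] - dates[0]).days >= 365 * MIN_HISTORY_YEARS
    # Accurate: no missing values, and none that are obviously broken (negative here).
    accurate = all(v is not None and v >= 0 for v in observations.values())
    # Appropriate: the series covers the risks the internal model claims to measure.
    appropriate = REQUIRED_RISK_COVERAGE <= risk_coverage
    return {"accurate": accurate, "complete": complete, "appropriate": appropriate}

# Roughly six years of monthly observations -- enough to pass the history test.
history = {date(2007, 1, 1) + timedelta(days=30 * i): 100.0 + i for i in range(70)}
print(assess_series(history, {"market", "credit", "underwriting"}))
```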
Here at Asset Control, we’ve had a similar mantra that reflects the data requirements of Solvency II compliance: providing financial organizations with accurate, accessible and actionable data. But however you interpret the language of Solvency II, it isn’t going anywhere, and it looks set to transform the way the insurance sector records and reports its data.
Accurate information is the data management industry’s Holy Grail. It’s fundamental to what we do and what our clients do, and its importance is undeniable. But there is more to data than accuracy.
Accuracy by itself is useless if your people can’t get hold of the data they need when they need it. What was accurate this morning may no longer be by the end of the day, so accessibility is critical. And there’s no point in being accurate and accessible if your people can’t then act on that data.
In today’s world, most firms have multiple business units, product lines, and investment strategies – all of which require different data sets used in different ways. Compliance and risk management will need different data sets than the trading desk. Operations want data on actual holdings, while analysts use it for modeling, stress-testing and ‘what if’ scenarios. Clearly there’s no single solution for these distinct requirements.
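There is, however, a well-worn pattern for serving those distinct requirements from one place: a single validated “golden copy” record projected into consumer-specific views. A minimal Python sketch, with record and field names invented for illustration:

```python
# One "golden copy" record, served through consumer-specific views.
GOLDEN_COPY = {
    "identifier": "EXAMPLE-BOND-1",  # hypothetical instrument code
    "price": 101.25,
    "issuer": "Example Corp",
    "rating": "A",
    "position": 1_000_000,
    "volatility": 0.18,
}

VIEWS = {
    # Each consumer sees the same underlying record, shaped for its own use.
    "risk":       ["identifier", "rating", "volatility", "position"],
    "trading":    ["identifier", "price"],
    "operations": ["identifier", "position", "issuer"],
}

def view_for(consumer: str) -> dict:
    """Project the golden copy onto the fields one business unit actually needs."""
    return {field: GOLDEN_COPY[field] for field in VIEWS[consumer]}

print(view_for("risk"))     # risk sees ratings and volatility, not just price
print(view_for("trading"))  # the desk sees only the tradable fields
```

The point is not the toy code but the shape: one source of truth, many fit-for-purpose views – accurate, accessible and actionable, whichever desk is asking.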