Launched with Risk Managers in mind, AC Risk Data Manager seeks to bring consistency and control, quality and visualisation, and centralisation and seamless integration to clients’ risk data.
In a similar vein to the launch of Data Service Manager in April, Asset Control held a webinar to introduce Risk Data Manager to both current and prospective customers. The live broadcast took place on Tuesday 9th June at 2.30pm.
Product Consultant Cornelius Nandyal took viewers on a comprehensive tour of the ideas and motivations behind the product’s development, and how it could improve risk environments and bring sound governance to critical risk data.
This was followed by an in-depth product demonstration from Product Trainer Seán Barry, with useful tips and tricks on how to get the best from the new solution. With risk data visualisation and overall improved visibility across the board at the core of its development, Seán explained in detail how Risk Data Manager would fit smoothly within clients’ enterprise data management operations.
Following the results of the Comprehensive Capital Analysis and Review (CCAR), it’s fair to say that risk and capital management have evolved significantly since the global financial crisis of 2008. The positive CCAR results confirm that financial institutions are showing true commitment to robust, forward-looking capital planning processes that take account of environment- and firm-specific risks in order to operate through stressful conditions. This is supported by the recent DTCC report, which indicates that systemic risk protection is becoming firmly embedded in corporate culture and standard business practices.
However, it’s important to keep in mind that the job is far from done. The Senior Supervisors Group’s report on counterparty data singles out data quality as a particular concern. While aggregation and automation capabilities have improved, data errors have not diminished proportionally: because of “data aggregation issues and breakdowns in controls and governance”, many firms cannot measure and monitor the accuracy of their data or rectify data quality issues in a timely manner.
A robust data infrastructure, embedded data management processes, and dedicated data and reporting teams can help implement the workflows that will make compliance and reporting more effective and efficient for financial institutions.
By working with an experienced specialist to automate some of the core processes involved in stress testing, organizations can recover some of the time, capital and resources that go into ensuring compliance. Without this, inaccurate data can have an exponential effect on a bank’s results and capital requirements.
Take the stress out of stress testing – by developing a data supply chain that gets the right quality data to the right place at the right time, banks can concentrate on adopting a holistic and forward-looking view across the organization.
In advance of the Dodd-Frank Act Stress Test (DFAST) results, due today, and the Comprehensive Capital Analysis and Review (CCAR) results, due next week, there has been much media and analyst speculation. The concern is that poor results could cost banks’ investors billions in the dividend payments that banks might otherwise be able to make post-results.
Underpinning these billion-dollar dividends, and the corresponding fluctuations in stock prices, are small decisions around the collection and treatment of data. The foundations of the stress test results reside in data management. Sound data management is the first, and most important, step toward meeting any stress test with confidence. Poor data management practices create thousands of extra hours of information gathering, allow inaccurate data into the calculations, and create uncertainty with regulators. Once inaccurate data is in the calculations, it has an exponential effect on the results, and ultimately the capital requirements, of a financial firm.
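To make the point concrete, here is a minimal sketch of the kind of automated quality gate that keeps inaccurate data out of stress-test calculations. The records, field names and thresholds below are hypothetical illustrations, not drawn from any particular bank’s process:

```python
from datetime import date

# Hypothetical position records as they might arrive from upstream systems.
positions = [
    {"id": "P1", "notional": 1_000_000.0, "rating": "AA", "as_of": date(2015, 3, 2)},
    {"id": "P2", "notional": -500.0,      "rating": "BB", "as_of": date(2015, 3, 2)},
    {"id": "P3", "notional": 750_000.0,   "rating": None, "as_of": date(2014, 1, 6)},
]

def validate(record, run_date, max_age_days=5):
    """Return a list of data-quality issues found in one record."""
    issues = []
    if record.get("rating") is None:
        issues.append("missing rating")
    if record.get("notional", 0) < 0:
        issues.append("negative notional")
    age = (run_date - record["as_of"]).days
    if age > max_age_days:
        issues.append(f"stale as-of date ({age} days old)")
    return issues

run_date = date(2015, 3, 4)
report = {p["id"]: validate(p, run_date) for p in positions}
clean = [pid for pid, issues in report.items() if not issues]
print(clean)         # only P1 passes every check
print(report["P3"])  # missing rating and a stale as-of date
```

Checks like these, run before the data ever reaches a risk calculation, are what turns “thousands of extra hours gathering the information” into an automated, auditable step.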
From our vantage point in the world of data management, we have had a front-row seat to the effort and investment that banks have undertaken to meet the requirements for compiling, modeling and reporting data. The technology, infrastructure and resources needed to meet these stress tests on time have impressed us time and time again. Firms are showing true commitment to the process, and the results are expected to be positive.
But the question remains: has the financial sector moved on from the systemic risks of 2008?
We’ll be watching with interest as the results unfold, and wondering whether, for some institutions, all the manual effort was for naught.
Stress: it affects us all, and as we’ve seen over the past few years, the financial world is no different. Luckily for European financial institutions, help is at hand in the form of the regulators’ very own stress reliever: Basel III. Designed to enhance banks’ loss-absorbing capacity, and thus their resilience in crisis situations, the big brother of Basel I and II is a regulatory game changer and the cornerstone of the industry’s attempt to control risk.
However, Basel III exists because Basel II fundamentally didn’t work. By allowing financial institutions to create their own sophisticated risk-weighted models, it hands the initiative back to the institutions, allowing them to justify their own risk weightings.
The question then becomes how firms can justify the lowest capital holdings possible. And the answer? By having the utmost confidence in the data on which they base their decisions, as risk infrastructure and calculations are nothing without the data that feeds them. In other words, banks will need to demonstrate how and why they arrived at their risk weightings, and the only way to do this is through their data.
So when it comes to Basel III compliance, if you are going to set your own risk model:
- Your data had better be accurate
- You had better be able to prove it
- And you had better be able to prove the validity of the data’s source
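One way to make those requirements concrete is to capture every input together with its source and a tamper-evident hash, so a firm can later demonstrate where a risk weighting came from and that it hasn’t changed since capture. The sketch below is an illustration with hypothetical function names and values, not a description of any particular product:

```python
import hashlib
import json

def capture(value, source, captured_at):
    """Record a value together with its source and a tamper-evident hash."""
    payload = {"value": value, "source": source, "captured_at": captured_at}
    digest = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    return {**payload, "sha256": digest}

def verify(record):
    """Re-derive the hash and confirm the record is unchanged."""
    payload = {k: record[k] for k in ("value", "source", "captured_at")}
    digest = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    return digest == record["sha256"]

# A hypothetical probability-of-default estimate feeding a risk weighting.
pd_estimate = capture(0.0125, "internal-ratings-model-v2", "2015-06-30T17:00:00Z")
print(verify(pd_estimate))  # True: the record matches its recorded hash

tampered = {**pd_estimate, "value": 0.0025}
print(verify(tampered))     # False: the value no longer matches the hash
```

Provenance records of this kind are one simple way to show a regulator not just what a number is, but where it came from and that it is still the number that was originally captured.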
These are the commandments of Basel III: stick to them, and you’ll quickly see your stress-related stress vanish.