DFAST results: the untold story

In advance of the Dodd-Frank Act Stress Test (DFAST) results, due today, and the Comprehensive Capital Analysis and Review (CCAR), due next week, there has been much media and analyst speculation. There is concern and conjecture that poor results could cost banks’ investors billions in the potential dividend payments that banks might otherwise be able to make once the results are in.

Underpinning these billion-dollar dividends, and the corresponding fluctuations in stock prices, are small decisions about the collection and treatment of data. The foundations of the stress test results reside in data management. Sound data management is the first, and most important, step toward meeting any stress test with confidence. Poor data management practices add thousands of extra hours to gathering the information, allow inaccurate data into the calculations, and create uncertainty with regulators. Once inaccurate data enters the process, it has an exponential effect on the results, and on the ultimate capital requirements, of a financial firm.

From our vantage point in the world of data management, we have had a front-row seat to the effort and investment that banks have undertaken to meet the requirements for compiling, modeling and reporting data. The technology, infrastructure and resources needed to meet these stress tests on time have impressed us time and time again. Firms are showing true commitment to the process, and the results are expected to be positive.

But the question remains: has the financial sector moved on from the systemic risks of 2008?

We’ll be watching with interest as the results unfold, and wondering whether, for some institutions, all the manual effort was for naught.

Basel III: Helping the financial services industry cope with stress

Stress: it affects us all, and as we’ve seen over the past few years, the financial world is no different. Luckily for European financial institutions, help is at hand in the form of the regulators’ very own stress reliever: Basel III. Designed to enhance banks’ loss-absorbing capacity, and thus their resilience in crisis situations, the big brother of Basel I and II is a regulatory game changer and the cornerstone of the industry’s attempt to control risk.

True to trilogy form, Basel III raises the stakes on its earlier instalments, boasting a comprehensive set of reform measures, developed by the Basel Committee on Banking Supervision, to strengthen the regulation, supervision and risk management of the banking sector. This has sparked calls from Germany and France to water down some important elements of the Basel III guidelines on capital requirements in order to mitigate any “negative effect” on growth.

However, Basel III exists because Basel II fundamentally didn’t work. By allowing financial institutions to create their own sophisticated risk-weighted models, Basel III hands the initiative back to the institutions, allowing them to justify their own risk weightings.

The question then becomes how firms can justify the lowest capital holdings possible. And the answer? By having the utmost confidence in the data on which they base their decisions, because risk infrastructure and calculations are nothing without the data that feeds them. In other words, banks will need to demonstrate how and why they have arrived at their risk weightings, and the only way to do this is through their data.
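To make that concrete, here is a minimal, purely illustrative sketch in Python of how exposure data and risk weightings feed a Basel-style risk-weighted asset calculation, and how a single misclassified data point moves the capital a bank must hold. The asset classes, amounts and risk weights below are hypothetical; the 8% figure is the Basel minimum total capital ratio before buffers.

```python
# Illustrative only: hypothetical exposures (in EUR millions) and risk weights.
exposures = [
    ("residential mortgages", 5_000, 0.35),
    ("corporate loans",       3_000, 1.00),
    ("sovereign bonds",       2_000, 0.00),
]

MIN_CAPITAL_RATIO = 0.08  # Basel minimum total capital ratio, excluding buffers

# Risk-weighted assets: each exposure scaled by its risk weight.
rwa = sum(amount * weight for _, amount, weight in exposures)
required_capital = rwa * MIN_CAPITAL_RATIO
print(f"Risk-weighted assets: EUR {rwa:,.0f}m")
print(f"Minimum capital required: EUR {required_capital:,.0f}m")

# Now suppose poor data quality causes the mortgage book to carry a
# higher risk weight than it should (0.50 instead of 0.35).
misclassified = [
    ("residential mortgages", 5_000, 0.50),
    ("corporate loans",       3_000, 1.00),
    ("sovereign bonds",       2_000, 0.00),
]
rwa_bad = sum(amount * weight for _, amount, weight in misclassified)
extra_capital = (rwa_bad - rwa) * MIN_CAPITAL_RATIO
print(f"Extra capital driven by one bad risk weight: EUR {extra_capital:,.0f}m")
```

The point of the sketch is simply that the capital number is only as good as the exposure and risk-weight data behind it: change one input and the required capital changes with it.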

So when it comes to Basel III compliance, if you are going to set your own risk model:

- Your data had better be accurate
- You had better be able to prove it
- And you had better be able to prove the validity of the data’s source

These are the commandments of Basel III; stick to them, and you’ll quickly see your stress-related stress vanish.