Stress testing – what’s that coming over the hill?


A report published by the Adam Smith Institute on 18th June questions the effectiveness of the Bank of England's stress testing programme.

The report, penned by Kevin Dowd, Senior Fellow of the Adam Smith Institute and Professor of Finance and Economics at Durham University, suggests that the stress tests are flawed for four key reasons:

1. A single, questionable stress scenario

2. Inadequate data

3. Poor metrics

4. Unreliable models – especially risk models

Professor Dowd also claims that stress tests are creating systemic instability by forcing banks to standardise towards the models set out by the Bank of England.

The report pays special attention to the issue of bad data and points out the age-old problem that no model is of any use if poor data is fed into it. Most stress test exercises, he claims, involve stresses applied to a spreadsheet-based valuation model, and these are prone to a number of problems, including a tendency to underestimate the risks of complicated derivative trading positions and unquantifiable risks such as litigation or regulatory fines.

Mr Dowd also claims that a bank is likely to have thousands of different spreadsheet models and that there is no straightforward method of combining or standardising the information they provide across the institution as a whole. Data fed into any models will therefore vary in quality and be susceptible to error. The report claims that the Bank of England itself has acknowledged these issues and stated that data quality varied considerably across banks.

The report portrays the Bank’s stress tests as highly unreliable – “worse than useless”, in fact – because they paint a false picture of resilience. So far the Bank of England has not commented on the criticism, but informally banks have expressed some frustration with the effort involved in preparing what critics would call a box-ticking exercise of limited practical use to those involved in the process.

Meanwhile, in the US, Federal Reserve Governor Daniel Tarullo met with several US banks on 29th May to discuss possible changes to their stress testing programme. Those familiar with the meeting report that the banks were concerned with the ‘one size fits all’ nature of the tests, and urged the Fed not to incorporate a complicated set of rules for how bigger banks must calculate their capital requirements. The Fed has so far deferred including these “advanced approaches” in the test, but officials are yet to make a permanent decision on the matter.

For its part, the Fed also has concerns about how stress tests are being conducted. In a speech on 3rd June, the President and CEO of the Federal Reserve Bank of Chicago, Charles Evans, described how some institutions are failing in their efforts to produce reliable stress tests under CCAR and DFAST.

Some firms, according to Mr Evans, distribute the initial scenarios across business lines but do not have the controls or management engagement needed to ensure that the tests are conducted in a systematic way, with checks around data and assumptions. Without proper guidance and controls there is a risk that scenarios are inconsistently interpreted and produce results that are overly optimistic – or just plain wrong. Poor documentation and audit trails make it difficult to identify these issues, and ultimately such data and procedural weaknesses can undermine the credibility of the stress test modelling framework and the pro forma financial results.

While Mr Evans did not say in his speech whether he expected any changes to the stress testing regime, Mr Tarullo and the four banking executives he met in May plan to meet again this summer to discuss the issue further, along with other regulatory measures.

So what is coming over the hill? 

Despite the call from critics like the Adam Smith Institute to abolish the tests altogether, and the difficulty regulators admit that banks have in producing stress testing submissions, the public appetite for financial services regulation and the perceived failure of ‘light touch’ pre-2007 regulatory regimes mean that stress testing will not be scrapped anytime soon.

Moreover, given that we are in the early stages of a major change to how financial market stability and industry solvency are monitored, we can expect further change as regulators refine their stress testing approaches and gain insight from the information provided by participants. Industry feedback may yet prompt further changes as regulators look to build acceptance for their stress testing regimes.

Another powerful argument that stress testing will continue is that regulators see evidence it is working. Since the establishment of CCAR, DFAST, the UK and EU stress tests and other risk data regulations such as BCBS 239 and the upcoming BCBS 265, firms have made sizeable improvements.

These improvements have been seen in areas such as data collection, risk analytics and elements of corporate governance. All this suggests that, regardless of whether institutions believe the scenarios being tested are realistic, we are seeing the start of a fundamental shift in risk culture, which is what the regulations seek to achieve in the long run.

Banks on both sides of the Atlantic are faced with two common challenges:

1.     How to make stress testing into a transparent, repeatable and cost-efficient process

2.     How to cope with stress testing changes in the future

Regulators will remind banks to focus on the intent behind regulation, rather than on what the specific tests will be this year or next.

The intent of the regulation, in both its American and European forms, is to encourage institutions to build a capital planning and stress testing framework that identifies firm-specific risks and ultimately informs the decisions that management take. Therefore, when it comes to stress scenarios, banks should not merely try to comply with regulatory stress tests, but instead build a framework to understand and monitor the specific risks in their own markets.

A proactive, rather than reactive, approach to stress testing and risk reporting in general is viewed as the key to increasing the resilience of individual firms, to improving the overall stability of the financial system and – ultimately – to securing regulatory approval.

How is this done? 

Just as critics of stress testing say a ‘one size fits all’ approach can be too blunt for the wide variety of business models in financial markets, there is no single method (or off-the-shelf solution) to help banks manage the challenge of annual stress testing. That said, there are common approaches that institutions – including clients of Asset Control – adopt, and that regulators recommend as best practice.

From an organisational and systems perspective, these recommendations include:

1.     Governance

When conducting stress tests, strong internal controls (and the involvement of audit and risk) are critically important to the translation of stress scenarios into financial impacts. At large, complex organisations, such translations often involve multiple models and lines of business. Firms that have assembled strong stress testing approaches also have strong governance functions whereby all parties work together to implement the stress scenarios and coordinate assumptions, and the senior management of each business line is intimately engaged in the process.

Well-articulated documentation, standards, process orchestration and underlying data ensure that scenarios are implemented consistently across business functions: in the understanding of the intent of the scenario, in the portfolios and business lines affected, and in the accuracy and consistency of reported results.

In addition, a sound governance process identifies gaps in an enterprise’s stress testing framework and has the authority to work with business owners and stakeholders to address them.

2.     Data Control Points for Scenario Generation

Data models are at the core of each stress test. Crucial assumptions used to estimate losses, risk-weighted assets and revenues must be clearly documented so that the detail of each scenario is fully parameterised.

To achieve consistency in preparing scenarios it is essential that common data inputs (e.g. securities data, historical market prices) and derived data (e.g. curves and surfaces) are sourced, validated and distributed from a single function. This is to make sure common data has assured lineage and provenance, and any changes can be quickly propagated throughout the organisation. High quality data is needed for understanding the historical relationship between scenario parameters (GDP, interest rates, unemployment, stock markets etc.) so that scenarios can be correctly discussed and challenged before they are released internally.
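
As a concrete illustration, here is a minimal sketch of the kind of check described above, assuming pandas and illustrative column names (GDP, rates, unemployment and so on) rather than any regulator's actual templates: common macro inputs are validated once, centrally, and their historical co-movements are computed so a proposed scenario can be challenged before release.

```python
import pandas as pd

def validate_macro_inputs(df: pd.DataFrame) -> list[str]:
    """Basic quality checks on centrally sourced scenario parameters
    (e.g. GDP, interest rates, unemployment, equity index levels)."""
    findings = []
    for col in df.columns:
        series = df[col]
        missing = series.isna().sum()
        if missing:
            findings.append(f"{col}: {missing} missing observations")
        changes = series.dropna().diff().dropna()
        if len(changes) > 1 and changes.std() > 0:
            # Flag outsized period-on-period moves that may be data errors.
            outliers = changes[(changes - changes.mean()).abs() > 4 * changes.std()]
            if not outliers.empty:
                findings.append(f"{col}: {len(outliers)} suspicious jumps")
    return findings

def parameter_relationships(df: pd.DataFrame) -> pd.DataFrame:
    """Historical correlations between scenario parameters, used to sanity-check
    whether a proposed combination of shocks is internally consistent."""
    return df.pct_change().corr()
```

Both functions operate on a single validated dataset supplied by the central data function, so every consuming model and business line sees the same history.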

Where scenarios require price or volatility shocks to trading books these should also be defined from a validated central source and provided to models that are simulating their impact on positions for the stress test. For the results to be convincing to regulators, it is important that banks have a deep understanding of the data and models used in their scenarios so that their explanatory materials meet the required standards.
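
To illustrate that last point, the sketch below shows one simple way centrally defined price shocks could be applied to a set of positions. It is a hypothetical example: the instrument names, shock sizes and position fields are illustrative assumptions, not part of any actual scenario or system.

```python
from dataclasses import dataclass

@dataclass
class Position:
    instrument: str
    quantity: float
    price: float  # validated end-of-day price from the central data function

def shocked_pnl(positions: list[Position], shocks: dict[str, float]) -> float:
    """Estimate the P&L impact of relative price shocks, e.g. {'EQUITY_INDEX': -0.30}."""
    pnl = 0.0
    for pos in positions:
        shock = shocks.get(pos.instrument, 0.0)  # no shock defined -> position unchanged
        pnl += pos.quantity * pos.price * shock
    return pnl

book = [Position("EQUITY_INDEX", 1_000, 250.0), Position("CORP_BOND", 5_000, 98.5)]
print(shocked_pnl(book, {"EQUITY_INDEX": -0.30, "CORP_BOND": -0.05}))  # -99625.0
```

Because the shock definitions come from one validated source, the same set of shocks can be handed to every model that needs it, and the explanatory materials can point to a single, auditable set of inputs.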

3.     Workflow Control Points

In order to establish a robust stress testing framework – and to make the process both repeatable and auditable – institutions must first understand the necessary workflows and the data integration challenges they face.

At such firms, control points have been established both to promote repeatable and automated practices within the modelling framework and to verify data and model quality throughout the process. These control points ensure that corrections or enrichments of the data used are propagated consistently both upstream and downstream of the control point, and that – wherever possible – the opportunity for manual input of data is eliminated, along with the risk of human error.
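
As a rough sketch of such a control point (the field names, validation rule and correction store are hypothetical), the idea is that data passes a validation gate and any approved correction is recorded once, so it can be propagated to every consumer rather than re-keyed by hand:

```python
from datetime import date

def control_point(records: list[dict], corrections: dict[tuple, float]) -> list[dict]:
    """Validate incoming price records and apply centrally approved corrections."""
    approved = []
    for rec in records:
        key = (rec["instrument"], rec["as_of"])
        if key in corrections:
            # Apply the approved correction and mark the record as corrected,
            # so every downstream and upstream consumer receives the same value.
            rec = {**rec, "price": corrections[key], "corrected": True}
        if rec.get("price") is None or rec["price"] <= 0:
            raise ValueError(f"Rejected at control point: {key}")
        approved.append(rec)
    return approved

raw = [{"instrument": "CORP_BOND", "as_of": date(2015, 6, 30), "price": 98.5}]
print(control_point(raw, corrections={}))
```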

In short, governance, data management, model management and system integration are the keys to banks building a robust stress testing framework that is capable of both adapting to future changes in stress testing requirements and, more importantly, becoming the decision-making tool that regulators hope will inform both strategic and routine decisions.

In my opinion, making the process repeatable and cost-effective will be made possible by automation and integration, particularly with regard to the common data sets required for scenario calculations across the various departments and divisions running them. This moves the process from what is currently a highly manual task, led by risk and audit experts, to an automated and scalable environment that leverages existing technology investments.

I believe financial institutions will ultimately be successful in creating stress testing environments that meet regulators’ needs and indeed institutionalise risk-weighted decision-making. After all, the single greatest engineering skill western financial institutions have shown over the last 50 years is the ability to take physical processes and make them virtual.

The question today is whether this transformation can be done at a pace that keeps regulators satisfied.

 

References

http://www.adamsmith.org/wp-content/uploads/2015/06/No-Stress.pdf

“The Adam Smith Institute has an open access policy. Copyright remains with the copyright holder, but users may download, save and distribute this work in any format provided: (1) that the Adam Smith Institute is cited; (2) that the web address adamsmith.org is published together with a prominent copy of this notice; (3) the text is used in full without amendment [extracts may be used for criticism or review]; (4) the work is not re–sold; (5) the link for any online use is sent to info@ adamsmith.org.”

http://www.bankofengland.co.uk/financialstability/Documents/fpc/keyelements.pdf

https://www.chicagofed.org/publications/speeches/2015/06-03-call-for-proactive-risk-culture

 

Asset Control serves over 60 financial institutions worldwide, including 45% of G-SIBs. Established in 1991, we play a central role in guaranteeing enterprise-wide data quality for our clients, as well as helping them comply with stress testing and other data-centric regulatory requirements such as BCBS 239. Asset Control provides these institutions with an essential foundation for a risk management stack, delivering ultra-consistent, auditable and SLA-driven data – including time series, end-of-day prices, static data and derived data – both to the risk function and to the entire enterprise.

In June 2015 Asset Control released its all-new Risk Data Manager module to automate the process of preparing price, curve and surface shocks for stress testing frameworks.

Webinar: AC Risk Data Manager

ON TUESDAY 9TH JUNE, ASSET CONTROL LAUNCHED AC RISK DATA MANAGER

 


 

Launched with Risk Managers in mind, AC Risk Data Manager seeks to bring consistency and control, quality and visualisation, and centralisation and seamless integration to clients’ risk data.

In a similar vein to the launch of Data Service Manager in April, Asset Control held a Webinar to introduce Risk Data Manager to both current and potential customers. The live broadcast took place on Tuesday 9th June at 2.30pm.

Product Consultant Cornelius Nandyal took viewers on a comprehensive tour of the ideas and motivations behind the product’s development, and how it could improve risk environments and bring sound governance to critical risk data.

This was followed by an in-depth product demonstration from Product Trainer Seán Barry, with useful tips and tricks on how to get the best from the new solution. With risk data visualisation and overall improved visibility across the board at the core of its development, Seán explained in detail how Risk Data Manager would fit smoothly within clients’ enterprise data management operations.

Watch the Webinar here (registration required) or visit the Risk Data Manager product page to find out more.

 

 

CCAR Results: Taking the stress out of stress testing

Following the results of the Comprehensive Capital Analysis and Review (CCAR), it’s fair to say that risk and capital management have evolved significantly since the global financial crisis of 2008. The positive CCAR results confirm that financial institutions are showing true commitment to robust, forward-looking capital planning processes that take into account both the wider environment and firm-specific risks in order to operate through stressful conditions. This is supported by the recent DTCC report indicating that systemic risk protection is becoming firmly embedded in corporate culture and standard business practices[1].

However, it’s important to keep in mind that the job is far from done. The Senior Supervisors Group on Counterparty Data states that data quality is of particular concern. While aggregation and automation capabilities have improved, data errors have not diminished proportionally. “Data aggregation issues and breakdowns in controls and governance result in that many firms cannot measure and monitor the accuracy of the data or rectify data quality issues in a timely manner”[2].

A robust data infrastructure, embedded data management processes, and dedicated data and reporting teams can help implement the workflows that will allow compliance and reporting to become more effective and efficient for financial institutions.

By working with an experienced specialist to automate some of the core processes involved in stress testing, organizations can be relieved of some of the time, capital and resources that go into ensuring compliance. Without this, inaccurate data can have an exponential effect on the results and the capital requirements of a bank.

Take the stress out of stress testing – by developing a data supply chain that gets the right quality data to the right place at the right time, banks can concentrate on adopting a holistic and forward-looking view across the organization.

DFAST results: the untold story

In advance of the Dodd-Frank Act Stress Test (DFAST) results, due today, and the Comprehensive Capital Analysis and Review (CCAR) results next week, there has been much media and analyst speculation. There is concern and conjecture that poor results could cost banks’ investors billions in the dividend payments that banks might otherwise be able to make post-results.

Underpinning these billion-dollar dividends and corresponding fluctuations in stock prices are small decisions around the collection and treatment of data. The foundations of the stress test results reside in data management. Sound data management is the first, and most important, step towards meeting any stress test with confidence. Poor data management practices create thousands of extra hours of work gathering the information, allow inaccurate data into the calculations, and create uncertainty with regulators. Once there, errors have an exponential effect on the results, and on the ultimate capital requirements, of a financial firm.

From our vantage point in the world of data management, we have had a front-row seat to the effort and investment that banks have undertaken in order to meet the requirements for compiling, modeling and reporting data. The technology, infrastructure and resources necessary to meet these stress tests on time have impressed us time and time again. Firms are showing true commitment to the process and the results are expected to be positive.

But the question remains: has the financial sector moved on from the systemic risks of 2008?

We’ll be watching with interest as the results unfold, and wondering whether, for some institutions, all the manual effort was for naught.

Basel III: Helping the financial services industry cope with stress

Stress: it affects us all, and as we’ve seen over the past few years, the financial world is no different. Luckily for European financial institutions, help is at hand in the form of the regulators’ very own stress reliever: Basel III. Designed to enhance loss-absorbing capacity and thus resilience to crisis situations, the big brother of Basel I and II is a regulatory game changer and the cornerstone of the industry’s attempt to control risk.

True to trilogy form, Basel III raises the stakes on its earlier instalments, boasting a comprehensive set of reform measures, developed by the Basel Committee on Banking Supervision, to strengthen the regulation, supervision and risk management of the banking sector. This move has sparked calls from Germany and France to water down some important elements of the Basel III guidelines on capital requirements in order to mitigate any “negative effect” on growth.

However, Basel III exists because Basel II fundamentally didn’t work. By allowing financial institutions to create their own sophisticated risk-weighted models, it hands the initiative back to the institutions, allowing them to justify their own risk weightings.

The question then becomes how firms can justify the lowest capital holdings possible. And the answer? By having the utmost confidence in the data they are using to base their decisions on, as risk infrastructure and calculations are nothing without the data that feeds them. In other words, banks will need to demonstrate how and why they have arrived at their risk weightings, and the only way to do this is through their data.

So when it comes to Basel III compliance, if you are going to set your own risk model:

- Your data had better be accurate

- You had better be able to prove it

- And you had better be able to prove the validity of the data’s source

These are the commandments of Basel III; stick to them, and you’ll quickly see your stress-related stress vanish.