At a Glance: 2015 Stress Testing Results from the Bank of England


Designed to test the resilience of UK banks to the effects of potential crises with hypothetical ‘shocks’ and scenarios, stress testing is becoming an increasingly public discussion point across the globe.


Today’s stress test results from the Bank of England emphasise the critical nature of data quality for the financial services industry.

“For the majority of banks, the overall data quality and accuracy of results submitted to the Bank also represented an improvement relative to the 2014 exercise. For those banks that performed best, data submissions contained no material omissions, were accurate and required fewer clarifications.”

In confirming that the most successful banks in their latest exercise had clearly prioritised the accuracy of their submissions, the BoE is corroborating the importance of a bank’s data foundation. Establish solid data integrity and you are giving yourself the confidence to make the right decisions.

Implementing internal stress testing and data governance strategies around new regulations (such as the incoming Fundamental Review of the Trading Book) does not simply stand banks in good stead for completing stress tests successfully; it also sets them on the road to improving company-wide processes.

This exercise is, after all, aiming to stabilise the economy against future storms – and as such it cannot be undertaken to merely tick the right boxes.

The BoE – as with regulators, central banks and committees worldwide – wants to engender systematic changes in the way banks operate, to ensure we do not see a repeat of the 2007/2008 crash.



Scope for improvement in regard to data is also specifically outlined for the Traded Risk area.

“While overall data quality was generally improved, there were three areas where banks’ data quality was generally poorer: net interest income, traded risk and structured finance. Given the materiality of net interest income projections, the variability in data quality was notable. In addition, as the Bank highlighted in 2014, methodologies used to support assumptions and modelling decisions were less good than in other areas, such as credit risk, for example.”

As the report goes on to mention, the relative dip in data quality for traded risk compared to 2014 can be put down to changes in the data required, as the exercise moved on from the EBA templates used previously.

This shows a renewed, specific data focus from the Bank of England and, as such, is something banks – in particular those that performed less impressively this time around – will need to concentrate on in the months and years to come.

By undertaking traded risk scenario work in a centralised risk data system, firms can make vast improvements to their data and documentation and subsequently assist with regulatory submissions.

At Asset Control, our latest developments to AC Risk Data Manager are focused on exactly this – giving users the ability to handle all their risk data efficiently in one central system.





To address the vital roles of data governance and stress testing in the financial services industry, Asset Control assembled a panel of data experts to identify major challenges, share best practices, and troubleshoot compliance issues.

Building Stress Testing Models




Opinion: Post-Stress Testing Europe 2015




“Stress Testing is here to stay”

On Tuesday 29th September, Asset Control’s VP Product Management Martijn Groot joined stress testing heads from a variety of banks – including BNP Paribas, Morgan Stanley, Barclays, UBS, Credit Suisse and Santander – for Stress Testing Europe 2015.

Taking place at London’s The Tower Hotel, under the rather spectacular shadow of Tower Bridge, CFP Events’ 3rd annual Stress Testing Europe event brought together almost 150 of the industry’s top players to discuss strategies, share ideas and formulate plans for the future.


Stress testing has truly come into its own as an independent discipline in the last five years, moving on from “a couple of paragraphs in the Basel framework”. Where five years ago a bank may have had a stress testing team of four or five people to deal with incoming regulation-based scenarios – such as the FSA anchor 1 scenario – today that team would number 100 or more at the major G-SIBs in Europe.

Though the build-up has been gradual, it still takes institutions by surprise how much time is required to keep pace with regulation and stress testing requirements. This is before you even take into consideration the intent behind the regulation: to embed stress testing in an institution’s mentality and ultimately have it become part of everyday business planning and ‘what if’ risk analysis.

Stress testing may now be firmly entrenched as an element of risk management, but it is also no longer limited to risk. It now takes in Treasury and Finance, for which many institutions have overlapping organisational frameworks. The quest? To develop an intuitive narrative of severe yet plausible future scenarios, in granular detail, to test a bank’s ability to cope in any given situation. The challenge after creating and implementing these scenarios is to have one integrated view of them all, as one large assessment – being thoroughly prepared for the future.








Overcoming the Challenges of Reporting and Submitting Under Different Jurisdictions

Martijn’s panel discussion, alongside Cecilia Gejke, Executive Director, Liquidity Risk & Stress Testing at Mizuho Securities, and chaired by Ian Wilson of Deloitte, addressed ‘Overcoming the Challenges of Reporting and Submitting Under Different Jurisdictions’. This focused on how banks are dealing with the multitudinous regulations around the world, making specific reference to Mizuho’s need to report for European, US and Japanese regulators; all in different ways and with different inputs.

There is generally no synergy between regulators, meaning those involved in risk and stress testing become ‘mini IT project managers’ themselves in order to take on the various tasks involved – aiming for strategic solutions, often while working with people far away and across language differences.







Data aggregation and reporting

With regard to data aggregation and reporting, banks often spend a huge amount of time simply avoiding and correcting errors in Excel, such as automatic rounding. For regulations such as BCBS 239, data quality and infrastructure are paramount, and therefore finding and correcting errors may be an onerous task, but an essential one.
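As a minimal, hypothetical sketch (not a depiction of any specific bank’s tooling), a simple reconciliation check can catch the kind of silent spreadsheet rounding described above before figures are aggregated:

```python
# Hypothetical sketch: flag silent rounding introduced during aggregation
# by reconciling exported figures against the original source values.

def find_rounding_errors(source, aggregated, tolerance=1e-9):
    """Return keys whose aggregated value drifted from the source value."""
    errors = []
    for key, true_value in source.items():
        reported = aggregated.get(key)
        if reported is None or abs(reported - true_value) > tolerance:
            errors.append(key)
    return errors

# Example: a spreadsheet export silently rounded exposures to 2 decimals.
source = {"desk_a": 1052.3371, "desk_b": 98.0001}
exported = {"desk_a": 1052.34, "desk_b": 98.00}

print(find_rounding_errors(source, exported))  # both desks flagged
```

In practice the tolerance and the key set would come from the firm’s own data dictionary; the point is simply that the check is automated rather than eyeballed in a spreadsheet.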

As Cecilia alluded to in the discussion, consolidation into one central data warehouse is key, though developing this to envelop both input and output is not always possible; input is the most important to ‘slice and dice’ initially, with output delineation often taking a back seat due to time and manpower constraints.







Scenario Generation

Despite the resource constraints that still engulf many banks’ stress testing departments, developing scenarios that can be aligned across an enterprise is seen as a logical step for the future. At present, however, it is a case of “trying your best”, with the linking of shocks and the augmenting and interpolating of variables proving immensely time-consuming.
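To illustrate one of those time sinks, the sketch below (a hypothetical example, not any vendor’s implementation) linearly interpolates a sparsely specified scenario shock onto a finer time grid – the sort of variable interpolation that today is often done by hand:

```python
# Hypothetical sketch: interpolate a sparsely specified scenario shock
# (e.g. quarterly GDP shocks) onto a finer monthly grid.

def interpolate_shocks(points, grid):
    """Linearly interpolate (time, shock) pairs onto the requested grid."""
    result = []
    for t in grid:
        # Find the pair of scenario points bracketing this grid time.
        for (t0, v0), (t1, v1) in zip(points, points[1:]):
            if t0 <= t <= t1:
                w = (t - t0) / (t1 - t0)
                result.append(v0 + w * (v1 - v0))
                break
    return result

quarterly = [(0, 0.0), (3, -2.0), (6, -3.5)]   # (month, % GDP shock)
monthly = interpolate_shocks(quarterly, [0, 1, 2, 3, 4, 5, 6])
print(monthly)
```

A real scenario engine would of course support more interpolation schemes and validate the grid, but automating even this simple step removes one source of manual, error-prone work.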

The parameters for models are not currently clearly defined, which leaves banks with a lot – potentially too much – freedom to model as they wish. This freedom could make it easy for banks to tweak models, which could prove dangerous –  highlighting the integral need for verification, for clear governance and for cascaded clarity around the purpose and meaning of the models.

With the rules around stress testing still being open to interpretation, clear, transparent communication across hierarchies and geographies is vital, as banks have to be able to make decisions of their own accord; and stand by these decisions if questions arise. Achieving this is possible through creating ‘working groups’ to manifest collaboration between teams, in particular the overlap of risk and finance. Each team and individual must understand their own limitations, while investing time in open dialogue – but this is a challenge to maintain, at all levels.







The need for a central data layer

Best practice says you should have a central data layer which branches out into different scenarios, through construction, timing, mapping and workflow. The scenarios should be run past Front Office operations early on, to assist with research and strategy. Though this can prove a large workload, it improves stakeholder engagement and governance and gets all teams on board.

Recent events in China have only served to reinforce the importance of stress testing not simply as a lot of separate forecasts but as dynamic, scalable scenarios modelling potential responses to future stress. As delegates at the event confirmed, the same stress test scenarios should not be repeated year on year, but continually developed in line with company and market conditions to ensure the most valid and useful results and insight – as well as alignment with the spirit of the regulation (in the USA, the Fed is insisting that banks start afresh with their stress testing scenarios each year, and roll them up and down an organisation).

Beyond insight, on a practical level, realism in your stress tests – true-to-life applications – is required to gain executive buy-in. Being able to articulate both what you are doing in relation to stress tests and why will help you avoid falling into a “too model-based” approach.

The fact that stress testing takes a lot of time away from day-to-day activities means that banks should seek to create value from the exercise. Ultimately, stress tests should be a powerful tool for building investor and management confidence – but, even with the exponential growth of banks’ stress testing departments, earmarking sufficient time and manpower to fully roll out stress testing across an organisation is still proving difficult for many.







Getting to grips with new regulations and their implications

By way of a conclusion, it is clear that institutions across the board have yet to fully get to grips with regulatory stress testing requirements.

Certain larger banks are blazing a trail, building robust, repeatable and automated stress testing architectures, but many in the field are still carrying out tasks on an ad-hoc, somewhat tactical basis.

Consensus was reached amongst attendees that as regulators start to develop interoperability and standardise their stress testing requirements, the industry will follow suit. The institutions that face the greatest challenges are those which have activities spanning across jurisdictions.

Leading banks see centralised, data-based solutions – whether internally built or vendor-maintained – as market best practice; disparate tactical solutions such as spreadsheets are a hindrance and should start to be replaced if the industry is to progress.

Whatever the implications, and however institutions choose to deal with the inherent challenges, as Leif Boegelein addressed in his opening remarks – stress testing is here to stay.




Stress testing – what’s that coming over the hill?



A report published by the Adam Smith Institute on 18th June questions the effectiveness of the Bank of England’s stress testing programme.

The report, penned by Kevin Dowd, Senior Fellow of the Adam Smith Institute and Professor of Finance and Economics at Durham University, suggests that the stress tests are flawed, due to four key reasons:

1. A single, questionable stress scenario

2. Inadequate data

3. Poor metrics

4. Unreliable models – especially risk models

Professor Dowd also claims that stress tests are creating systemic instability by forcing banks to standardise towards the models set out by the Bank of England.

The report pays special attention to the issue of bad data and points out the age-old issue that no model is of any use if poor data is fed into it. Most stress test exercises, he claims, involve stresses to a spreadsheet-based valuation model, and these are prone to a number of problems, including a tendency to underestimate the risks of complicated derivative trading positions and unquantifiable risks such as litigation or regulatory fines.

Mr Dowd also claims that a bank is likely to have thousands of different spreadsheet models and that there is no straightforward method of combining or standardising the information they provide across the institution as a whole. Data fed into any models will therefore vary in quality and be susceptible to error. The report claims that the Bank of England itself has acknowledged these issues and stated that data quality varied considerably across banks.

The report portrays the Bank’s stress tests as highly unreliable – “worse than useless” in fact – because they paint an unreliable picture of resilience. So far the Bank of England has not commented on the criticism, but informally banks have shown some frustration with the effort involved in preparing what critics would call a box-ticking exercise of limited practical use to those involved in the process.

Meanwhile, in the US on 29th May, Federal Reserve Governor Daniel Tarullo met with several US banks to discuss possible changes to their stress testing programme. Those familiar with the meeting report that the banks were concerned with the ‘one size fits all’ nature of the tests, and urged the Fed not to incorporate a complicated set of rules for how bigger banks must calculate their capital requirements. The Fed has so far deferred including these “advanced approaches” in the test, but officials are yet to make a permanent decision on the matter.

For its part, the Fed also has concerns about how stress tests are being conducted. In a speech on 3rd June, Federal Reserve Bank of Chicago CEO Charles Evans described how some institutions are failing in their efforts to produce reliable stress tests under CCAR and DFAST.

Some firms, according to Mr Evans, distribute the initial scenarios across business lines, but do not have proper controls or management engagement to ensure that the tests are conducted in a systematic way with controls around data and assumptions. Without proper guidance and controls there is a risk that scenarios are inconsistently interpreted and produce results that are overly optimistic – or just plain wrong. Poor documentation and audit trails of information make it difficult to identify these issues and ultimately such data and procedural weaknesses can undermine the credibility of the stress test modelling framework and the pro forma financial results.

While Mr Evans did not say in his speech whether he expected any changes to the stress testing regime, Mr Tarullo and the four banking executives he met in May plan to meet again this summer to discuss the issue, along with other regulatory measures.

So what is coming over the hill? 

Despite the call from critics like the Adam Smith Institute to abolish the tests altogether, and the difficulty regulators admit that banks have in producing stress testing submissions, the public appetite for financial services regulation and the perceived failure of ‘light touch’ pre-2007 regulatory regimes mean that stress testing will not be scrapped anytime soon.

Moreover, given that we are in early stages of a major change to how financial market stability and industry solvency is monitored, we can expect further change as regulators refine their stress testing approaches, as they gain insight from the information provided by participants. Industry feedback may yet prompt further changes, as regulators look to build acceptance for their stress testing regimes.

Another powerful argument that stress testing will continue is that regulators see evidence it is working. Since the establishment of CCAR, DFAST, the UK and EU stress tests and other risk data regulations such as BCBS 239 and the upcoming BCBS 265, firms have made sizeable improvements.

These improvements have been seen in areas such as data collection, risk analytics and elements of corporate governance.  All this suggests that, independent of whether institutions believe the scenarios being tested are realistic or not, we are seeing the start of a fundamental shift in risk culture which is what the regulations seek to achieve in the long run.

Banks on both sides of the Atlantic are faced with two common challenges:

1. How to make stress testing into a transparent, repeatable and cost-efficient process

2. How to cope with stress testing changes in the future

Regulators will remind banks to focus on the intent behind regulation, rather than what the specific tests will be this year or next.

The intent of the regulation, in both its American and European form, is to encourage institutions to seek the building of a capital planning and stress testing framework that identifies firm-specific risks and ultimately informs the decisions that management take. Therefore, when it comes to stress scenarios, banks should not merely try to comply with regulatory stress tests, but instead build a framework to understand and monitor the specific risks in their own markets.

A proactive, rather than reactive, approach to stress testing and risk reporting generally is viewed as the key to increasing the resilience of the individual firms, the overall stability of the financial system, and – ultimately – regulatory approval.

How is this done? 

Just as critics of stress testing say a ‘one size fits all’ approach can be considered too blunt for the wide variety of business models in financial markets, there is no single method (or off-the-shelf solution) to help banks manage the challenge of annual stress testing. That said, there are common approaches that institutions – including clients of Asset Control – adopt, that are being recommended by regulators as best practice.

From an organisational and systems perspective, these recommendations include:

1. Governance

When conducting stress tests, strong internal controls (and the involvement of audit and risk) are critically important to the translation of stress scenarios into financial impacts. At large, complex organisations, such translations can involve multiple models and lines of business. Firms that have assembled strong stress testing approaches also have strong governance functions, whereby all parties work together to implement the stress scenarios and coordinate assumptions, and the senior management of each business line is intimately engaged in the process.

Well-articulated documentation, standards, process orchestration and underlying data ensure that scenarios are implemented consistently across business functions – in terms of the understanding of the intent of the scenario, the portfolios and business lines affected, and the accuracy and consistency of reported results.

In addition to this, a sound governance process identifies and addresses gaps in an enterprise’s stress testing framework and has the authority to work with business owners and stakeholders to address them.

2. Data Control Points for Scenario Generation

Data models are the core of each stress test.  Crucial assumptions used to estimate losses, risk weighted assets, and revenues must be clearly documented so that the detail of each scenario is fully parameterized.

To achieve consistency in preparing scenarios it is essential that common data inputs (e.g. securities data, historical market prices) and derived data (e.g. curves and surfaces) are sourced, validated and distributed from a single function. This is to make sure common data has assured lineage and provenance, and any changes can be quickly propagated throughout the organisation. High quality data is needed for understanding the historical relationship between scenario parameters (GDP, interest rates, unemployment, stock markets etc.) so that scenarios can be correctly discussed and challenged before they are released internally.
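As a small illustration of checking that historical relationship (an illustrative sketch with made-up numbers, not real market data), a plain Pearson correlation between two macro series gives a first sanity check on a proposed scenario:

```python
# Hypothetical sketch: quantify the historical relationship between two
# scenario parameters (e.g. GDP growth and unemployment change) with a
# plain Pearson correlation, so a proposed scenario can be challenged.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative (made-up) annual series: GDP growth vs unemployment change.
gdp = [2.1, 1.8, -0.5, -4.2, 1.5, 2.0]
unemployment = [-0.2, 0.1, 0.8, 2.5, -0.1, -0.3]

r = pearson(gdp, unemployment)
print(round(r, 2))  # strongly negative, as history would suggest
```

A scenario that posits sharply falling GDP alongside falling unemployment would immediately stand out against a relationship like this, prompting the internal challenge the text describes.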

Where scenarios require price or volatility shocks to trading books these should also be defined from a validated central source and provided to models that are simulating their impact on positions for the stress test. For the results to be convincing to regulators, it is important that banks have a deep understanding of the data and models used in their scenarios so that their explanatory materials meet the required standards.
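A minimal sketch of that idea, with hypothetical book and shock names, might distribute a single validated shock definition and apply it uniformly across positions:

```python
# Hypothetical sketch: apply a centrally defined price shock to trading
# positions and report the stressed P&L impact per book.

# A single validated shock definition, distributed from the central source,
# expressed as a fractional price move per asset class.
SHOCK = {"EQUITY": -0.30, "CREDIT": -0.10, "RATES": 0.02}

def stressed_pnl(positions):
    """Sum the mark-to-market impact of the shock per book."""
    impact = {}
    for book, asset_class, market_value in positions:
        move = SHOCK.get(asset_class, 0.0)  # unshocked classes stay flat
        impact[book] = impact.get(book, 0.0) + market_value * move
    return impact

positions = [
    ("book_1", "EQUITY", 1_000_000),
    ("book_1", "RATES", 500_000),
    ("book_2", "CREDIT", 2_000_000),
]
print(stressed_pnl(positions))
```

Because every model reads `SHOCK` from the same validated source, two desks simulating the same scenario cannot silently diverge on the size of the shock itself.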

3. Workflow Control Points

In order to establish a robust stress testing framework – and to make the process both repeatable and auditable – institutions must first understand the necessary workflows and the data integration challenges they face.

At such firms, control points have been established both to promote repeatable and automated practices within the modelling framework and to verify data and model quality throughout the process. These control points ensure that corrections or enrichments of data are propagated consistently both upstream and downstream from the control point and that – wherever possible – the opportunity for manual input of data is eliminated, along with the risk of human error.
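A toy sketch of such a control point (hypothetical class and field names, assuming a simple key-value data set) might apply corrections once, centrally, log them for audit, and refuse to release data that breaches a check:

```python
# Hypothetical sketch of a data control point: every record passes through
# validation before release, and corrections are applied once, centrally,
# so every downstream consumer sees the same corrected value.

class ControlPoint:
    def __init__(self, checks):
        self.checks = checks          # list of (name, predicate) pairs
        self.corrections = {}         # key -> centrally corrected value
        self.audit_log = []           # (key, value, reason) audit trail

    def correct(self, key, value, reason):
        """Record a central correction, with the reason kept for audit."""
        self.corrections[key] = value
        self.audit_log.append((key, value, reason))

    def release(self, data):
        """Apply central corrections, then fail loudly on any breach."""
        cleaned = {**data, **self.corrections}
        for key, value in cleaned.items():
            for name, ok in self.checks:
                if not ok(value):
                    raise ValueError(f"{key} failed check '{name}'")
        return cleaned

cp = ControlPoint(checks=[("non-negative", lambda v: v >= 0)])
cp.correct("bond_price", 101.25, reason="vendor feed sign error")
released = cp.release({"bond_price": -101.25, "fx_rate": 1.10})
print(released)  # the corrected price reaches every consumer identically
```

The audit log is the point: when a regulator asks why a number changed, the correction, its value and its reason are all in one place rather than scattered across spreadsheets.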

In short, governance, data management, model management and system integration are the keys to banks building a robust stress testing framework that is capable of both adapting to future changes in stress testing requirements and, more importantly, becoming the decision-making tool that regulators hope will inform both strategic and routine decisions.

In my opinion, making the process repeatable and cost-effective will be made possible by the use of automation and integration, particularly with regard to the common data sets required for scenario calculations across the various departments and divisions running them. This moves the process from what is currently a highly manual task, led by risk and audit experts, to an automated and scalable environment that leverages existing technology investments.

I believe financial institutions will ultimately be successful in creating stress testing environments that meet regulators’ needs and indeed institutionalise risk-weighted decision making. After all, the single greatest engineering skill western financial institutions have shown over the last 50 years is the ability to take physical processes and make them virtual.

The question today is whether this transformation can be done at a pace that keeps regulators satisfied.





Asset Control serves over 60 global financial institutions worldwide, including 45% of G-SIBs. Established in 1991, we play a central role in guaranteeing enterprise-wide data quality for our clients, as well as helping them comply with stress testing and other data-centric regulatory requirements such as BCBS 239. Asset Control provides these institutions with an essential foundation for a risk management stack, delivering ultra-consistent, auditable and SLA-driven data – including time series, end-of-day prices, static data and derived data – both to the risk function and to the entire enterprise.

In June 2015 Asset Control released its all-new Risk Data Manager module to automate the process of preparing price, curve and surface shocks for stress testing frameworks.

Webinar: AC Risk Data Manager





Launched with Risk Managers in mind, AC Risk Data Manager seeks to bring consistency and control, quality and visualisation, and centralisation and seamless integration to clients’ risk data.

In a similar vein to the launch of Data Service Manager in April, Asset Control held a Webinar to introduce Risk Data Manager to both current and potential customers. The live broadcast took place on Tuesday 9th June at 2.30pm.

Product Consultant Cornelius Nandyal took viewers on a comprehensive tour of the ideas and motivations behind the product’s development, and how it could improve risk environments and bring sound governance to critical risk data.

This was followed by an in-depth product demonstration from Product Trainer Seán Barry, with useful tips and tricks on how to get the best from the new solution. With risk data visualisation and overall improved visibility across the board at the core of its development, Seán explained in detail how Risk Data Manager would fit smoothly within clients’ enterprise data management operations.

Watch the Webinar here (registration required) or visit the Risk Data Manager product page to find out more.



CCAR Results: Taking the stress out of stress testing

Following the results of the Comprehensive Capital Analysis and Review (CCAR), it’s fair to say that risk and capital management have significantly evolved since the global financial crisis of 2008. The positive CCAR results confirm that financial institutions are showing true commitment to having robust, forward-looking capital planning processes that take into account environment and firm-specific risks in order to operate through stressful conditions. This is supported by the recent DTCC report indicating that systemic risk protection is becoming firmly embedded in corporate culture and standard business practices[1].

However, it’s important to keep in mind that the job is far from done. The Senior Supervisors Group on Counterparty Data states that data quality is of particular concern. While aggregation and automation capabilities have improved, data errors have not diminished proportionally. “Data aggregation issues and breakdowns in controls and governance result in that many firms cannot measure and monitor the accuracy of the data or rectify data quality issues in a timely manner”[2].

A robust data infrastructure, embedded data management processes, and dedicated data and reporting teams, can help implement the workflows that will allow compliance and reporting to become more effective and efficient for financial institutions.

By working with an experienced specialist to automate some of the core processes involved within stress testing, organizations can be relieved of some of the time, capital and resources that go into ensuring compliance. Without this, inaccurate data can have an exponential effect on the results and the capital requirements of a bank.

Take the stress out of stress testing – by developing a data supply chain that gets the right quality data to the right place at the right time, banks can concentrate on adopting a holistic and forward-looking view across the organization.

DFAST results: the untold story

In advance of the Dodd-Frank Act Stress Test (DFAST) results, due today, and the Comprehensive Capital Analysis and Review (CCAR) results, due next week, there has been much media and analyst speculation. There is concern and conjecture that poor results could cost banks’ investors billions in the potential dividend payments that banks might be able to make post-results.

Underpinning these billion-dollar dividends and corresponding fluctuations in stock prices are small decisions around the collection and treatment of data. The foundations of the stress test results reside in data management. Sound data management is the first, and most important, step in meeting any stress test with confidence. Poor data management practices create thousands of extra hours gathering information, allow inaccurate data into the calculations, and create uncertainty with regulators. Once there, these issues have an exponential effect on the results, and the ultimate capital requirements, of a financial firm.

From our vantage point in the world of data management, we have had a front-row seat to the effort and investment that banks have undertaken in order to meet the requirements for compiling, modeling and reporting data. The technology, infrastructure and resources necessary to complete these stress tests on time have impressed us time and time again. Firms are showing true commitment to the process and the results are expected to be positive.

But the question remains: has the financial sector moved on from the systemic risks of 2008?

We’ll be watching with interest as the results unfold, and wondering whether, for some institutions, all the manual effort was for naught.

Basel III: Helping the financial services industry cope with stress

Stress: it affects us all, and as we’ve seen over the past few years, the financial world is no different. Luckily for European financial institutions, help is at hand in the form of the regulators’ very own stress reliever: Basel III. Designed to enhance loss-absorbing capacity, and thus resilience to crisis situations, the big brother of Basel I and II is a regulatory game changer and the cornerstone of the industry’s attempt to control risk.

True to trilogy form, Basel III raises the stakes on its earlier instalments, boasting a comprehensive set of reform measures, developed by the Basel Committee on Banking Supervision, to strengthen the regulation, supervision and risk management of the banking sector. This move has sparked calls from Germany and France to water down some important elements of the Basel III guidelines on capital requirements in order to mitigate any “negative effect” on growth.

However, Basel III exists because Basel II fundamentally didn’t work. By allowing financial institutions to create their own sophisticated risk-weighted models, it hands the initiative back to the institutions, allowing them to justify their own risk weightings.

The question then becomes how firms can justify the lowest capital holdings possible. And the answer? By having the utmost confidence in the data they base their decisions on, as risk infrastructure and calculations are nothing without the data that feeds them. In other words, banks will need to demonstrate how and why they have arrived at their risk weightings, and the only way to do this is through their data.

So when it comes to Basel III compliance, if you are going to set your own risk model:

- Your data had better be accurate

- You had better be able to prove it

- And you had better be able to prove the validity of the data’s source

These are the commandments of Basel III; stick to them, and you’ll quickly see your stress-related stress vanish.