
FRTB and Optimal Data Management

The challenges banks have to cope with are ramping up all the time. They face increased regulatory scrutiny on data quality. They must achieve consistency between risk and front office to meet modellability and attribution tests. They have to navigate legacy systems that do not scale to the required data volumes and lack crucial capabilities such as data lineage and bi-temporality. At the same time, they are having to transition to the cloud to drive efficiencies and support business user enablement by delivering better access to data.

FRTB raises the stakes further by introducing three critical changes to market data management: how risk is measured, how risk factors are assessed, and how risk factors are classified.

  • Measuring risk
    The introduction of Expected Shortfall (ES) as a replacement for Value at Risk (VaR) means that outliers in the tail feed directly into regulatory capital. Most banks currently use one or two years’ history, so the requirement for ten years’ worth of data is not easily retrofitted, particularly where teams are working with legacy systems. Beyond that, banks need to be able to determine the most stressful 12-month period over the last ten years. This means banks require reliable and flexible storage for time series data, with the ability to consolidate data sources – and fill any gaps. i
  • Assessment of risk factors: focus on real prices
    The second significant change FRTB introduces is a refreshed form of assessment for risk factors, based on their ‘modellability’. The use of real prices is required to prove modellability and to identify ‘non-modellable’ risk factors (NMRFs). Banks can use data vendors to supply this real price data, provided the vendors can be audited by regulators.
  • Classification of risk factors
    Risk factors are also subject to a new method of categorisation. This includes a need for model support behind risk factor classifications, the ability to extend reference data mappings, and the integration of the liquidity horizon categorisation.
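The measurement change above can be made concrete with a small sketch. Assuming a daily P&L series, ES at the 97.5% level averages the losses beyond the tail quantile, and the stressed period is the rolling window with the worst ES. The 250-day window, the 2.5% tail, and the synthetic data below are illustrative assumptions, not a production calibration:

```python
import numpy as np

def expected_shortfall(pnl, alpha=0.025):
    """Average loss in the worst alpha tail of a daily P&L series."""
    losses = np.sort(-np.asarray(pnl))[::-1]       # largest losses first
    k = max(1, int(np.ceil(alpha * len(losses))))  # tail size
    return losses[:k].mean()

def most_stressed_window(pnl, window=250):
    """Start index of the rolling ~12-month window with the highest ES."""
    es = [expected_shortfall(pnl[i:i + window])
          for i in range(len(pnl) - window + 1)]
    return int(np.argmax(es))

rng = np.random.default_rng(0)
pnl = rng.normal(0.0, 1.0, 2500)  # ~10 years of synthetic daily P&L
pnl[1200:1450] -= 2.0             # embed a stressed episode
start = most_stressed_window(pnl)
```

Running this over ten years of daily data is cheap; the hard part in practice is exactly what the text describes – having ten years of clean, gap-filled time series to feed it.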
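The modellability assessment is, at its core, a count-and-gap test over real price observation dates. A minimal sketch, assuming the draft criterion of at least 24 real price observations in the preceding 12 months with no gap longer than roughly one month between consecutive observations; the thresholds and function names are assumptions for illustration:

```python
from datetime import date, timedelta

def is_modellable(obs_dates, as_of, min_obs=24, max_gap_days=31,
                  lookback_days=365):
    """Count-and-gap test over real price observation dates (thresholds assumed)."""
    start = as_of - timedelta(days=lookback_days)
    window = sorted(d for d in obs_dates if start <= d <= as_of)
    if len(window) < min_obs:
        return False                   # too few real price observations
    gaps = ((b - a).days for a, b in zip(window, window[1:]))
    return max(gaps) <= max_gap_days   # no gap longer than ~one month

# A factor observed weekly passes; one observed quarterly fails the count.
as_of = date(2016, 12, 30)
weekly = [date(2016, 1, 4) + timedelta(weeks=i) for i in range(52)]
quarterly = [date(2016, 1, 4) + timedelta(days=91 * i) for i in range(4)]
```

The test itself is trivial; the audit requirement in the text is what matters – each observation date must trace back to a verifiable real price, whether sourced internally or from a vendor.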
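The liquidity horizon categorisation described above is in essence a reference data mapping from risk factor class to one of the prescribed horizon buckets (10, 20, 40, 60 or 120 days). A minimal sketch; the class names and their assignments below are illustrative assumptions, not the regulatory table:

```python
# Illustrative mapping from risk factor class to liquidity horizon in days.
# The buckets {10, 20, 40, 60, 120} follow FRTB; the keys and assignments
# here are placeholders, not the regulatory categorisation.
LIQUIDITY_HORIZONS = {
    "ir_major_ccy": 10,
    "fx_liquid_pair": 10,
    "equity_price_large_cap": 10,
    "commodity_energy": 20,
    "credit_spread_ig_corp": 40,
}

def liquidity_horizon(risk_factor_class, default=120):
    """Look up the liquidity horizon, defaulting to the longest bucket."""
    return LIQUIDITY_HORIZONS.get(risk_factor_class, default)
```

A lookup like this only works if the underlying reference data mappings are extensible and governed – which is precisely the capability requirement the classification change imposes.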

These changes will place extra pressure on banks. Essentially, however, FRTB adheres to the same notion of a solid data foundation as many other regulations – and therefore should be seen in that context.

Ultimately, the best way to manage regulatory change is through the accurate collection, controlled sourcing, cross-referencing, and integration of data. This can address common regulatory “asks” around taxonomies, classifications, unambiguous identification, additional data context, links between related elements, and requirements on audit and lineage. These capabilities pave the way to insight-driven data management, early warnings on market data issues and their implications, and business user enablement that supports users in risk, valuation, finance, operations, and the front office in increasingly data-intensive jobs.

Compliance with today’s financial services regulation cannot be a tick in the box. To avoid being overwhelmed by regulatory change management that crowds out everything else, and to avoid falling victim to increasingly onerous penalties, firms urgently need to get their data management capabilities in order.

To learn more about our Managed Services for Financial Data program, please download our AC Pass guide.

i ISDA, “FRTB – Modellable Risk Factors: Business Requirements for Observation Data Supporting the Modellability Test”, Draft v0.4, 2016.