Understanding the data challenges of the Fundamental Review of the Trading Book

Sponsored by Asset Control and taking place on 24-25 May at One America Square, alongside remnants of the original London Wall, Risk EMEA covered all areas of risk management in financial services, with a particular and timely focus on regulation: from BCBS 239 to IFRS 9 and on to the Fundamental Review of the Trading Book (FRTB) itself.
The report, commissioned in late 2015 and based on a six-month survey of 13 banks, from global investment banks to large and small regional banks, presents some clear findings, including:

  • Banks are investing in data management for both regulatory and internal reasons
  • Banks as a whole are moving from a siloed to a centralised approach to data management
  • 85% of banks surveyed view regulation as a ‘highly significant’ driver of change in data management, yet only 31% currently deploy an enterprise data management (EDM) solution
  • In risk management, data management is more centralised than across the enterprise as a whole, and the management of market data is considerably more centralised than that of other data types
  • The ‘gold standard’ to which banks are aspiring is maximum centralisation, with recognition that total centralisation may not be practical

The report is now available for free download here.

Following on from Michael Bryant’s introduction to the report, Martijn Groot continued the conversation on the overlaps between risk management and data management with his presentation ‘The Joined-Up Data Imperative: Data & Systems’.

Looking into the detail behind data integration and manipulation, and focusing on the transparency imperative inherent within the majority of new regulations – in particular the Fundamental Review of the Trading Book, BCBS 239 and stress testing – Martijn discussed how regulation is the thread that stitches together the fabric of risk management.

Over the last ten years, regulation has brought more taxonomies, new codes and new flags, and has placed greater importance on the strength of processes.

Sourcing clean, reliable financial data continues to be a challenge for organisations of all shapes and sizes, and the arrival of a ‘zero tolerance’ regulatory approach has cemented high-quality data as the foundation to build upon. The number of regulatory alerts has risen exponentially in the last few years: as noted in Thomson Reuters’s ‘Financial crime in MENA 2016: the need for forward planning’ report, there were 8,704 regulatory alerts in 2008; by 2015, that figure had jumped to over 43,000.

That is a piece of regulatory news, whether a standards update, a QIS, a policy document or a consultation, roughly every 12 minutes: spread across the 525,600 minutes in a year, 43,000 alerts work out to about one every 12 minutes.

But responses can no longer be ‘sticking plaster’ quick fixes: far-reaching, proactive changes to banking operations and culture are required.

As a result of this rise, establishing consistency of terminology and classification across an organisation has never been more important. With a strong data infrastructure, institutions are better prepared to deal with whatever is thrown at them next, and addressing the market data requirements of new regulations as early as possible means not having to fix far harder and more expensive problems further down the line.

The aspects of Risk EMEA’s hot topic, the Fundamental Review of the Trading Book, most relevant to data management include:

  • Stricter eligibility criteria for the internal model approach
    • Selection of in-scope desks for model approval, with monthly P&L attribution and back-testing for each desk, and the separation of risk factors into modellable and non-modellable
  • A new base metric, expected shortfall (ES), the distinction between the internal models approach (IMA) and the sensitivities-based approach (SBA) at desk level and, for IMA desks, the modellability criterion and a possible stressed capital add-on
    • The expected shortfall measure must be calibrated to a period of stress, replicating an expected shortfall charge that would be generated on a bank’s current portfolio
  • A focus on real prices (a minimal sketch of the ES and modellability checks follows this list)
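
To make the desk-level mechanics above a little more concrete, the sketch below works through two of them in Python: a 97.5% expected shortfall computed on a vector of simulated P&L scenarios, and a simple check of the commonly cited ‘real price’ modellability criterion (at least 24 real price observations in the trailing 12 months, with no gap of much more than a month between consecutive observations). The function names, data shapes and exact thresholds are illustrative assumptions, not anything prescribed by the report or the rules text.

```python
from datetime import date, timedelta
from typing import Sequence


def expected_shortfall(pnl_scenarios: Sequence[float], confidence: float = 0.975) -> float:
    """Average loss in the worst (1 - confidence) tail of the P&L distribution.

    `pnl_scenarios` would typically be revaluations of the current portfolio
    under a stressed historical period (illustrative inputs only).
    """
    losses = sorted((-p for p in pnl_scenarios), reverse=True)  # worst losses first
    tail_size = max(1, int(round(len(losses) * (1.0 - confidence))))
    return sum(losses[:tail_size]) / tail_size


def is_modellable(observation_dates: Sequence[date], as_of: date,
                  min_observations: int = 24, max_gap_days: int = 31) -> bool:
    """Rough 'real price' check: enough observations in the last year, no long gaps."""
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if window_start <= d <= as_of)
    if len(obs) < min_observations:
        return False
    return all((later - earlier).days <= max_gap_days
               for earlier, later in zip(obs, obs[1:]))
```

On 1,000 scenarios, for example, the 97.5% tail is simply the average of the 25 worst outcomes.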

Third-party financial technology (fintech) has a paramount role to play in assisting the banking industry with regulatory compliance, and data management is just one of many areas where it comes into play, whether through:

  • Common dictionaries
  • Data governance
  • Cross-referencing and mapping strategy (see the sketch after this list)
  • Capacity to create joined-up data
  • Smart sourcing
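
As a rough illustration of what the common-dictionary and cross-referencing points can mean in practice, the hypothetical Python sketch below maps vendor-specific identifiers onto a single internal ‘golden’ instrument record. The field names, vendor codes and identifiers are invented for the example and are not tied to any particular product.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class InstrumentRecord:
    """One entry in a common data model: a golden copy keyed by an internal ID."""
    internal_id: str
    isin: Optional[str] = None
    vendor_ids: Dict[str, str] = field(default_factory=dict)  # vendor name -> vendor's identifier


def cross_reference(universe: List[InstrumentRecord],
                    vendor: str, vendor_id: str) -> Optional[InstrumentRecord]:
    """Resolve a vendor-specific identifier to the internal golden record, if any."""
    for record in universe:
        if record.vendor_ids.get(vendor) == vendor_id:
            return record
    return None


# Hypothetical usage: the same instrument arriving under two different vendor codes.
universe = [
    InstrumentRecord(internal_id="EQ-000001", isin="XS0000000001",
                     vendor_ids={"vendor_a": "A-12345", "vendor_b": "B-98765"}),
]
match = cross_reference(universe, "vendor_b", "B-98765")  # -> the EQ-000001 record
```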

If you are already using third party data management technology, are you using it to its full effect?

In particular for FRTB and MiFID II, there are several key elements:

  • Data collection, controlled data sourcing, cross-referencing and integration capabilities
  • Workflow to control and track proxies and the links between indices/constituents and derivatives/underlyings (see the sketch after this list)
  • A common data model with agreed naming conventions, definitions and domains
  • Distribution and reporting capabilities to slice and dice the data and zoom in on trouble spots
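
To show what the proxy-and-links point might look like in data, the sketch below records a proxy assignment together with its rationale, approver and review date, so that proxies can be controlled, tracked and reported on. Every name and field here is an assumption made purely for the illustration.

```python
from dataclasses import dataclass
from datetime import date
from typing import List


@dataclass(frozen=True)
class ProxyAssignment:
    """A controlled, auditable link between a risk factor and the proxy used for it."""
    risk_factor: str      # e.g. an illiquid single-name credit spread
    proxy: str            # e.g. a liquid sector curve or index used in its place
    rationale: str        # why the proxy is considered representative
    approved_by: str
    approved_on: date
    review_by: date       # forces periodic re-approval


def overdue_reviews(assignments: List[ProxyAssignment], as_of: date) -> List[ProxyAssignment]:
    """A simple 'trouble spot' report: proxies whose approval has lapsed."""
    return [a for a in assignments if a.review_by < as_of]
```

A record like this also gives the distribution and reporting layer something concrete to slice and dice when zooming in on trouble spots.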

…and 2019 will come around in no time, so make sure you start the process now.