Solutions

Our consulting and services teams are experts in financial data, with practical experience in over 65 banking and buy-side implementations around the world.

From post-trade processing to independent price verification and stress testing, Asset Control provides insight-driven financial data management solutions for any application. A solid data sourcing and mastering process is combined with easy ways to distribute, integrate, discover and explore the data.

Asset Control provides a standard data model that tracks regulatory developments and keeps integration with content providers up to date. Its business rules capabilities can be used for risk factor classification, taxonomies, proxy management, risk factor preparation and eligibility tests.

Products

Our range of products helps financial organisations deliver high-quality reference, market and risk data to the people and applications that need it – on time, all the time

Meet the specific data requirements of risk management and new regulation

An award-winning data mastering engine with AC Plus

A comprehensive file management and scheduling system through AC Connect

The industry's largest managed data lakes for seamless data acquisition

The growth in data volumes and diversity, coupled with increasingly data-intensive jobs and reporting requirements, means firms need to improve their market data access and analytics capabilities.

Asset Control offers flexible integration options with a firm’s existing infrastructure for greater speed, scalability and performance

Services

Asset Control helps financial organisations deliver high-quality reference, market and risk data to the people and applications that need it – on time, all the time

Helping our customers succeed through client engagements while building personal connections

We are committed to empowering our customers to use our products to their full potential through our support services, individually tailored classroom-based training, easy-to-access e-learning modules and our state-of-the-art Customer Portal.

Optimise operational costs and process change cost-effectively using our AC PaSS managed data services.

Insights

The latest data management research and commentary from Asset Control

Industry comment and analysis from the Asset Control team

Data management applications and best practice videos

Insight on industry trends, regulatory challenges and solutions

About Us

Delivering high-quality reference, market and risk data to the people and applications that need it – on time, all the time

Proven for unrivalled adaptability, reliability and efficiency

Stay up to date with all our latest news and company announcements

All the latest events we are hosting, sponsoring or attending

Meet our management team

Asset Control partners with data, implementation and managed service firms to enhance clients’ experience of our data management solutions

All the latest vacancies at Asset Control

Contact Us

Get in touch with the Asset Control team - wherever you are in the world

A comprehensive, responsive programme for day-to-day support

We are a truly global company with offices around the world

Fill out our contact form and we'll get in touch as soon as possible

Creating a shared service for FRTB compliance
17 August 2016

Examining the opportunity for a mid-office shared service to address the complexity of FRTB

Financial institutions are increasingly leveraging shared services, from enabling Know Your Customer (KYC) compliance to post-trade reference data management, in order to reduce both cost and compliance resources. And, as the new data requirements associated with the Fundamental Review of the Trading Book (FRTB) become clearer, whether it is the new risk models or the depth of historical information requirements, there is growing industry concern regarding the challenges ahead and the tight timescales.

From quote collection to risk factor approval, organisations are beginning to question the viability of institution-specific compliance activity. While there are without doubt challenges to address in areas such as instrument classification and determining the modellability of risk factors, the potential upsides of a single-service approach that leverages data pooling and data sharing to mutualise risk factor creation and modellability approval are compelling.

Early Collaboration

It has become patently clear over the past decade that early collaboration with regulators is now an essential part of the compliance process. As organisations progressively look for commonalities in regulatory data requirements, it is the industry’s feedback and input into the procedures and standards needed to realise each specific requirement that are now underpinning the necessary change management programmes.

The Fundamental Review of the Trading Book (FRTB) is a prime example.

Since its finalisation in January, organisations have started to get to grips with the data requirements associated with the new approach to calculating and reporting market risk and the refreshed risk modelling methodology. FRTB’s replacement of Value-at-Risk (VaR) with expected shortfall (ES) as the standard risk measure has very significant data implications.

Most notably, the concept of non-modellable risk factors (NMRF) will mandate that banks demonstrate the data going into their risk models is real and derived from actual transactions or committed quotes. The expected shortfall measure itself will be calibrated on a history of 10 years. Regulators have become more prescriptive, not only on the content of the data (length of history and modellability) but also on enterprise-wide integration and the explicit links to P&L and Prudent Valuation.
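To make the modellability requirement concrete, the sketch below shows the kind of eligibility test this implies: counting ‘real’ price observations over the past year and checking the gaps between them. The 24-observation and one-month-gap thresholds reflect a common reading of the January 2016 Basel text, and the data structures are purely illustrative.

```python
from datetime import date, timedelta

# Illustrative FRTB-style modellability check: a risk factor is treated as
# modellable if it has at least 24 "real" price observations (actual
# transactions or committed quotes) in the past 12 months, with no gap of
# more than roughly one month between consecutive observations.
MIN_OBSERVATIONS = 24
MAX_GAP_DAYS = 31

def is_modellable(observation_dates, as_of):
    """Return True if the observation history passes the count and gap tests."""
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if window_start <= d <= as_of)
    if len(obs) < MIN_OBSERVATIONS:
        return False
    gaps = [(later - earlier).days for earlier, later in zip(obs, obs[1:])]
    return all(g <= MAX_GAP_DAYS for g in gaps)

# Example: 26 roughly fortnightly committed quotes pass both tests.
quotes = [date(2016, 1, 4) + timedelta(days=14 * i) for i in range(26)]
print(is_modellable(quotes, as_of=date(2016, 12, 30)))  # True
```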

The depth, range, volume and quality of information now required is unprecedented. Where Basel II risk engines could work with relatively simple price histories, FRTB requires those histories to be managed as risk factors, which implies an understanding of their behaviour and relationships. Beyond modellability, data quality matters more broadly: the increased computation requirements mean that data errors become harder and more expensive to correct.
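As a rough illustration of the change in risk measure, the sketch below computes a historical-simulation expected shortfall at the 97.5% confidence level FRTB prescribes from a series of returns. It deliberately omits the liquidity horizons and stressed calibration the standard also requires; the simulated data simply stands in for a long price history.

```python
import numpy as np

def expected_shortfall(returns, confidence=0.975):
    """Historical-simulation ES: the average loss beyond the VaR quantile."""
    losses = -np.asarray(returns)            # positive values represent losses
    var = np.quantile(losses, confidence)    # historical VaR at the 97.5th percentile
    tail = losses[losses >= var]             # losses at or beyond VaR
    return tail.mean()

# Simulated daily returns standing in for roughly ten years of history.
rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, size=2500)
print(round(expected_shortfall(returns), 4))
```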

Historical Data

From a data management perspective, this will demand the collection, analysis, validation and reporting of information across multiple product silos, organisational entities and risk areas. And it raises two key issues: the need for a common data foundation and access to a depth of historical time series information. However, FRTB is just one component of a reinvigorated focus on historical data.

From identifying gaps in history to flagging history that doesn’t qualify for use due to inaccuracy and adding external data sources and proxies, institutions need to create a strong information management architecture to support the growing regulatory focus on historical time series data.
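A minimal sketch of one such check, flagging gaps in a risk-factor history that exceed a tolerance, is shown below; it assumes pandas, and the five-business-day threshold and field names are illustrative only.

```python
import pandas as pd

# Flag gaps in a price history longer than a tolerance (here, 5 business
# days), the kind of check a time-series validation step might run before
# history is released to risk engines.
def find_gaps(observations: pd.Series, max_gap_bdays: int = 5) -> pd.DataFrame:
    dates = observations.dropna().sort_index().index
    gaps = []
    for earlier, later in zip(dates[:-1], dates[1:]):
        missing = pd.bdate_range(earlier, later)[1:-1]  # business days strictly between
        if len(missing) > max_gap_bdays:
            gaps.append({"gap_start": earlier, "gap_end": later,
                         "missing_bdays": len(missing)})
    return pd.DataFrame(gaps)

# Example: a daily price series with a three-week hole in March 2016.
idx = pd.bdate_range("2016-01-04", "2016-06-30")
prices = pd.Series(100.0, index=idx).drop(pd.bdate_range("2016-03-07", "2016-03-25"))
print(find_gaps(prices))
```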

Does it, however, make sense for each and every institution to collect transactional data, identify gaps, introduce new sources and validate ten years of history across every single risk factor? Few, if any, institutions routinely store real price data, so collaboration will be required at some level to fill the gaps. If each bank seeks to close this data gap separately, not only will costs rise, but there will still be a risk of data gaps and inconsistency.

There is clearly an opportunity for a shared service model, where one provider undertakes to consolidate this information and provide it as a service to the market.

Data Challenge

The challenges with creating this unified model will be in defining a common understanding of risk factors and then mapping and cross referencing this data.

The role of EDM will be key – enabling the collection and reconciliation of quotation data in multiple formats from numerous banks, and cross-referencing different instrument classes and the alternative ways of labelling the same financial product types.

With a common data foundation and a common basis upon which to create or derive the various risk factors, the contribution of quotes to the shared service by multiple organisations will resolve the data acquisition problem. There should be no gaps, and hence no need for complex estimates. The shared service can then leverage the data foundation and data resource to undertake risk factor mapping and provide proof of modellability. The resultant ‘on-demand’ service would deliver institutions a cost-effective risk data foundation, overcoming all the traditional data collection and data supply chain costs and integration issues.
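As an illustration of the cross-referencing step described above, the hypothetical sketch below maps each contributor's local instrument identifiers onto a shared risk-factor key before pooling observations; the identifiers, mapping table and field names are invented for the example.

```python
from collections import defaultdict

# Hypothetical cross-reference table: (contributor, local identifier) -> shared key.
XREF = {
    ("BANK_A", "EUR-SWAP-5Y"): "IR.EUR.SWAP.5Y",
    ("BANK_B", "EURIRS5Y"):    "IR.EUR.SWAP.5Y",
    ("BANK_A", "GOV-DE-10Y"):  "IR.DE.GOV.10Y",
}

def pool_quotes(contributions):
    """Group contributed quotes by the shared risk-factor key."""
    pooled = defaultdict(list)
    for bank, local_id, obs_date, price in contributions:
        key = XREF.get((bank, local_id))
        if key is None:
            continue  # unmapped instruments would go to an exceptions queue
        pooled[key].append((obs_date, price))
    return pooled

contributions = [
    ("BANK_A", "EUR-SWAP-5Y", "2016-08-01", 0.0120),
    ("BANK_B", "EURIRS5Y",    "2016-08-02", 0.0118),
]
print(pool_quotes(contributions)["IR.EUR.SWAP.5Y"])
```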

The benefits would extend beyond financial institutions: regulators would have to approve this shared facility but, once risk factors and definitions are agreed, only the shared service would require audit, not each individual bank, significantly reducing the burden on each regulator.

Proven Approach

The way the market has responded to other regulatory requirements – such as KYC – with new, consolidated data providers clearly demonstrates the industry’s appetite for shared services. Given the challenges now faced by financial institutions in meeting the FRTB reporting requirements, there is a strong case for collaboration in the middle office.

With the time constraints associated with FRTB, is it really viable for each institution to source and validate the required data from multiple internal and external sources, map that data to risk factors and prove that it has sufficient market data to be deemed modellable?

By sharing the data collection burden and creating a single, audited model for data structure and risk definition, a shared service will enable institutions to significantly reduce the financial and resource overhead associated with FRTB compliance.

The onus is now on the industry to engage in communication with regulatory bodies and embark on a collaborative process to realise the benefits of this shared service approach.

About the author

As VP - Product Management, Martijn Groot steers Asset Control's strategy for innovation and directs product investment and communications. Martijn has unrivalled financial and risk data experience, as well as extensive knowledge of our customers, having held Market Strategy and Business Development roles here prior to rejoining the company in 2015. A published author, with an MBA from INSEAD and an MSc from VU University Amsterdam, Martijn's career history spans a variety of areas, including software development, financial analytics, risk, product and consultancy, at firms such as ABN AMRO, Euroclear and IGATE.
