Solutions

Our consulting and services teams are experts in financial data, with practical experience in over 65 banking and buy-side implementations around the world.

From post-trade processing to independent price verification and stress testing, Asset Control provides insight-driven financial data management solutions for any application. A solid data sourcing and mastering process is combined with easy ways to distribute, integrate, discover and explore the data.

Asset Control provides a standard data model that tracks regulatory developments and keeps integration with content providers up to date. Its business rules capabilities can be used for risk factor classification, taxonomies, proxy management, risk factor preparation and eligibility tests.

Products

Our range of products helps financial organisations deliver high-quality reference, market and risk data to the people and applications that need it – on time, all the time

Meet the specific data requirements of risk management and new regulation

Providing an award-winning data mastering engine with AC Plus

A comprehensive file management and scheduling system through AC Connect

The industry's largest managed data lakes for seamless data acquisition

Continually innovating to stay on top of new customer requirements and technologies in data storage and processing

A comprehensive library of APIs and direct feeds to downstream systems

Services

Asset Control helps financial organisations deliver high-quality reference, market and risk data to the people and applications that need it – on time, all the time

Helping our customers succeed through client engagements while building personal connections

We are committed to empowering our customers to use our products to their full potential through our support services, individually tailored classroom-based training, easy-to-access e-learning modules and our state-of-the-art Customer Portal.

Optimize operational costs by utilizing both our expertise and that of our partners, from business processing right through to technology operations

Insights

The latest data management research and commentary from Asset Control

Industry comment and analysis from the Asset Control team

Data management applications and best practice videos

Insight on industry trends, regulatory challenges and solutions

About Us

Delivering high-quality reference, market and risk data to the people and applications that need it – on time, all the time

A proven track record of unrivalled adaptability, reliability and efficiency

Stay up to date with all our latest news and company announcements

All the latest events we are hosting, sponsoring or attending

Meet our management team

Asset Control partners with data, implementation and managed service firms to enhance clients’ experience of our data management solutions

All the latest vacancies at Asset Control

Contact Us

Get in touch with the Asset Control team - wherever you are in the world

A comprehensive, responsive programme for day-to-day support

We are a truly global company with offices around the world

Fill out our contact form and we'll get in touch as soon as possible

The valuation imperative

Raising the bar in market data management

Valuation is probably the most fundamental capability in financial services. Whether you are an investment manager, a bank, an insurance company or another type of financial institution, you need to know the value of your assets and liabilities. Valuation comes up not just in fund valuation or trading profit and loss, but in financial reporting, regulatory reporting, investor reporting, collateral management, capital allocation and risk-taking decisions.

Yet simple as it may sound, valuation practices and their resulting workflow and data needs have undergone substantial development in recent years to account for (il)liquidity and different costs associated with holding a position. Terms such as ‘fair value’, ‘prudent valuation’ and ‘additional valuation adjustments’ have come up in regulation impacting different parts of the industry.

In our latest blog series, we will explore the changes to the valuation landscape, the operational impact coming from that and market solutions that can help firms effectively cope with valuation challenges. We start with the changing valuation landscape.

The changing valuation landscape

Independent price verification (“IPV”), generally speaking, is the process of comparing front office or portfolio manager marks and valuations to a set of independently verifiable external prices in order to establish an accurate price at which to revalue positions. Accurate in this context means a price at which a willing buyer and a willing seller would have entered into the transaction freely. The term ‘fair value’ is used to denote rational and unbiased market prices.
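To make the mechanics concrete, the sketch below shows what a simple IPV tolerance check could look like; the identifiers, prices and threshold are purely illustrative and not taken from any particular implementation.

```python
# Minimal illustration of an IPV tolerance check (hypothetical data and threshold):
# front-office marks are compared to an independent price and breaks beyond a
# relative tolerance are flagged for review.

FRONT_OFFICE_MARKS = {"XS1234567890": 101.25, "US912828XG55": 99.80}
INDEPENDENT_PRICES = {"XS1234567890": 101.10, "US912828XG55": 98.90}
TOLERANCE = 0.005  # a 0.5% relative difference triggers an exception

def ipv_exceptions(marks, independent, tolerance):
    """Return instruments whose mark deviates from the independent price."""
    breaks = {}
    for isin, mark in marks.items():
        ref = independent.get(isin)
        if ref is None:
            breaks[isin] = "no independent price available"
            continue
        diff = abs(mark - ref) / ref
        if diff > tolerance:
            breaks[isin] = f"{diff:.2%} deviation (mark {mark}, independent {ref})"
    return breaks

print(ipv_exceptions(FRONT_OFFICE_MARKS, INDEPENDENT_PRICES, TOLERANCE))
```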

These fair value prices are subsequently distributed to other departments (such as Finance and Risk) and used to value an organization’s portfolios, calculate P&L impact and assess required Tier 1 capital. Fair value processes have developed to provide internal stakeholders as well as investors, customers and regulators with reliable valuations. Accounting standards have kept pace with changing markets and both US GAAP and IFRS have evolved[1] to incorporate fair value measurement techniques.

However, IPV and fair value can have different meanings depending on whether the context is buy-side or sell-side and whether the regulation is American or European. These contexts and requirements can result in different market data management aspects and potentially significant market data challenges.

Fair value in the buy-side

In the buy-side context the notion of fair value was introduced in the early 2000s following mutual fund mispricing scandals. The main driver was consistent valuation of securities’ closes across different time zones. Procedures and correction factors were introduced to adjust close prices from markets that had been closed for some time before the point at which the mutual fund invested in those assets was itself revalued.

For example, in the US the Net Asset Value (“NAV”) of a mutual fund is usually calculated at 4pm EST. To avoid stale prices and to reflect market movements after the close of a non-US exchange (say, stocks listed only on the London Stock Exchange or Deutsche Boerse), adjustment factors are needed to account for material events after the exchange close time that would affect the prices.
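As a purely illustrative sketch, an adjustment of this kind could be modelled as scaling the stale local close by a factor derived from a correlated proxy (such as a US-listed future or ADR) observed up to the 4pm EST valuation point; the figures and the beta sensitivity below are hypothetical.

```python
# Hypothetical sketch of a fair-value adjustment for a stale non-US close:
# the local close is scaled by a factor estimated from a correlated proxy
# observed between the local exchange close and the 4pm EST NAV point.

def fair_value_adjust(local_close, proxy_at_local_close, proxy_at_4pm_est, beta=1.0):
    """Adjust a stale close for market moves after the local exchange closed.

    beta is the assumed sensitivity of the security to the proxy's return.
    """
    proxy_return = proxy_at_4pm_est / proxy_at_local_close - 1.0
    adjustment_factor = 1.0 + beta * proxy_return
    return local_close * adjustment_factor

# A London-listed stock closed at 250p; a correlated US proxy rose 1.2% afterwards.
print(fair_value_adjust(250.0, 100.0, 101.2, beta=0.8))  # ~252.4
```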

European rules for asset managers such as AIFMD for alternative investment funds or UCITS for mutual funds also stress the need for valuation as an independent function and stipulate appropriate policies and procedures around the valuation of assets.

Added market data requirements for the banking book: IFRS9

Valuation complications are not limited to the trading book: IFRS9 introduced a more dynamic valuation treatment of the banking book. Its main point is to recognize credit impairments earlier via an ‘Expected Credit Loss’ measure.

IFRS9, too, has been designed to value assets and liabilities in a more risk-sensitive manner. Previously, risk and accounting data was aggregated separately and obeyed different rules and norms. Under IFRS9, firms need to calculate the risk of loss on an asset up to maturity with proper recovery estimates and a forward-looking stance. The significant impact of IFRS9 on market data management lies in the increased amount of market data required to evaluate different scenarios.

In terms of a move to greater granularity, this trend is similar to the new methodologies now being employed in stress testing and risk management in the trading book. Large volumes of historical credit data will need to be maintained, including loan classification, measurement and allocation, and client taxonomies. Banks often use their own, different scenarios and need a process to get from macro-economic scenarios to risk parameters. This covers the definition of macro-economic data, firm-specific risk profiles, sector/portfolio-specific and global variables, analysis of the relationship to historical defaults, the move from a holistic PD to a point-in-time PD, the stage allocation decision and the ECL calculation.
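For illustration only, the sketch below shows a heavily simplified ECL calculation of the kind referred to above, using per-period point-in-time PDs, a loss given default, an exposure at default and discounting, with the stage allocation deciding whether a 12-month or lifetime horizon applies; all inputs are hypothetical and this is not a prescribed IFRS 9 model.

```python
# Simplified illustration of an Expected Credit Loss calculation under IFRS 9:
# 12-month ECL for stage 1, lifetime ECL for stages 2 and 3, built from
# per-period point-in-time PDs, an LGD, an EAD and discounting.

def expected_credit_loss(pit_pds, lgd, ead, discount_rate, stage):
    """ECL = sum over periods of marginal PD * LGD * EAD, discounted.

    pit_pds: marginal (per-year) point-in-time probabilities of default to maturity.
    stage: 1 -> only the first year contributes; 2 or 3 -> full lifetime.
    """
    horizon = 1 if stage == 1 else len(pit_pds)
    ecl = 0.0
    for t, pd_t in enumerate(pit_pds[:horizon], start=1):
        ecl += pd_t * lgd * ead / (1 + discount_rate) ** t
    return ecl

pds = [0.01, 0.015, 0.02]  # hypothetical marginal PDs for years 1-3
print(expected_credit_loss(pds, lgd=0.45, ead=1_000_000, discount_rate=0.03, stage=1))
print(expected_credit_loss(pds, lgd=0.45, ead=1_000_000, discount_rate=0.03, stage=2))
```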

Similar to FRTB alerts on changes in liquidity horizon and risk factor modellability (“NMRF”), a data collection, integration and verification process can cater for and monitor these data needs and create triggers for both banking and trading book requirements, including alerts on rating changes and, more generally, on banking and trading book valuation changes caused by a market or credit data movement.
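A trigger of this kind can be as simple as a rule comparing successive snapshots of mastered data; the sketch below is a minimal, hypothetical illustration (field names and thresholds are made up) rather than a description of any specific product feature.

```python
# Minimal rule-based trigger on incoming data changes: compare yesterday's and
# today's records and emit alert messages for rating changes and large price moves.

def detect_triggers(previous, current, price_move_threshold=0.05):
    """Return alert messages for rating changes and price moves above the threshold."""
    alerts = []
    for instrument_id, today in current.items():
        yesterday = previous.get(instrument_id)
        if yesterday is None:
            continue
        if today["rating"] != yesterday["rating"]:
            alerts.append(f"{instrument_id}: rating change {yesterday['rating']} -> {today['rating']}")
        move = abs(today["price"] / yesterday["price"] - 1.0)
        if move > price_move_threshold:
            alerts.append(f"{instrument_id}: price moved {move:.1%}")
    return alerts

prev = {"DE0001102580": {"rating": "AA", "price": 98.2}}
curr = {"DE0001102580": {"rating": "AA-", "price": 92.7}}
print(detect_triggers(prev, curr))
```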

Making sense of the complex market data landscape

The changing accounting treatment of prices, the calculation of different valuation adjustments and the requirement not to rely solely on front office, portfolio manager or trader marks have led not only to an increase in market data sources but also to greater scrutiny of the processes used to compute the required prices and adjustments.

Moreover, in addition to the increased variety in market data sources, pricing models will vary in the interpolation and fitting methods they use and in the inferences and conversions they make (e.g. converting a bond price to a yield, or an option price to a volatility).
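To illustrate one of these conversions, the sketch below backs a yield to maturity out of a clean bond price by bisection; the bond terms and quoted price are invented for the example.

```python
# Converting a bond price to a yield: price the bond at a trial yield and
# bisect until the model price matches the observed price.

def bond_price(face, coupon_rate, years, ytm, freq=1):
    """Price of a plain-vanilla fixed-coupon bond at yield ytm."""
    coupon = face * coupon_rate / freq
    periods = years * freq
    rate = ytm / freq
    pv = sum(coupon / (1 + rate) ** t for t in range(1, periods + 1))
    return pv + face / (1 + rate) ** periods

def yield_from_price(price, face, coupon_rate, years, lo=0.0, hi=1.0, tol=1e-8):
    """Solve bond_price(ytm) = price by bisection (price is decreasing in yield)."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if bond_price(face, coupon_rate, years, mid) > price:
            lo = mid  # model price too high -> the yield must be higher
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

# A 5-year 4% annual coupon bond quoted at 97.50 per 100 face value.
print(f"{yield_from_price(97.50, 100.0, 0.04, 5):.4%}")
```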

Constituent instrument prices may need to be collated and condensed into summary statistics (risk factors) such as a bond curve, a forward curve or a credit default swap curve. Instrument market data, risk factor and calibration information on hedging instruments is also needed. Finally, there are the model parameters and model hedges; sensitivities such as Greeks can also be considered market data.
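As a stripped-down illustration of condensing constituent quotes into a risk factor, the sketch below interpolates a yield for an arbitrary tenor from a handful of hypothetical quotes; real curve construction would involve bootstrapping and the fitting choices mentioned above.

```python
# Condensing constituent quotes into a curve risk factor: linear interpolation
# of a yield for any tenor from a small set of (tenor, yield) quotes.

BOND_QUOTES = [(1.0, 0.031), (2.0, 0.033), (5.0, 0.036), (10.0, 0.039)]  # hypothetical

def curve_yield(tenor, quotes):
    """Linearly interpolate a yield for an arbitrary tenor from constituent quotes."""
    quotes = sorted(quotes)
    if tenor <= quotes[0][0]:
        return quotes[0][1]
    if tenor >= quotes[-1][0]:
        return quotes[-1][1]
    for (t0, y0), (t1, y1) in zip(quotes, quotes[1:]):
        if t0 <= tenor <= t1:
            w = (tenor - t0) / (t1 - t0)
            return y0 + w * (y1 - y0)

print(f"{curve_yield(3.0, BOND_QUOTES):.4%}")  # 3y point interpolated between 2y and 5y
```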

The major upgrade in capabilities needed to comply with these regulations requires the introduction of best practices in operations as well as the infrastructure to support them; achieving this in a cost-effective and scalable way will separate competitive banks from laggards.

 

In our next blog, we will explore the operational impact of the changing valuation landscape, and examine why a common market data source for finance, risk, scenario management, stress testing, product control and the quant group will be a major step forward in lowering the cost of change and making for a faster cycle time in developing and deploying valuation and risk models.

 

[1] In the case of US GAAP there is ASC 820; IFRS has IFRS 13 on fair value measurement.

About the author

As VP - Product Management, Martijn Groot steers Asset Control's strategy for innovation and directs product investment and communications. Martijn has unrivalled financial and risk data experience, as well as extensive knowledge of our customers, having held Market Strategy and Business Development roles here prior to rejoining the company in 2015. A published author, with an MBA from INSEAD and an MSc from VU University Amsterdam, Martijn has a career history spanning a variety of areas, including software development, financial analytics, risk, product and consultancy, at firms such as ABN AMRO, Euroclear and IGATE.
