Our consulting and services teams are experts in financial data, with practical experience in over 65 banking and buy-side implementations around the world.

From post-trade processing to independent price verification and stress testing, Asset Control provides insight-driven financial data management solutions for any application. A solid data sourcing and mastering process is combined with easy ways to distribute, integrate, discover and explore the data.

Asset Control provides a standard data model that tracks regulatory developments and keeps integration with content providers up to date. Its business rules capabilities can be used for risk factor classification, taxonomies, proxy management, risk factor preparation and eligibility tests.

Our range of products helps financial organisations deliver high-quality reference, market and risk data to the people and applications that need it – on time, all the time

Meet the specific data requirements of risk management and new regulation

An award-winning data mastering engine, AC Plus

A comprehensive file management and scheduling system through AC Connect

The industry's largest managed data lakes for seamless data acquisition

Continually innovating to stay on top of new customer requirements and technologies in data storage and processing

A comprehensive library of APIs and direct feeds to downstream systems


Asset Control helps financial organisations deliver high-quality reference, market and risk data to the people and applications that need it – on time, all the time

Helping our customers succeed through client engagements while building personal connections

We are committed to empowering our customers to use our products to their full potential through our support services, individually tailored classroom-based training, easy-to-access e-learning modules and our state-of-the-art Customer Portal.

Optimize operational costs by drawing on both our expertise and that of our partners, from business processing right through to technology operations


The latest data management research and commentary from Asset Control

Industry comment and analysis from the Asset Control team

Data management applications and best practice videos

Insight on industry trends, regulatory challenges and solutions

About Us

Delivering high-quality reference, market and risk data to the people and applications that need it – on time, all the time

A proven record of unrivaled adaptability, reliability and efficiency

Stay up to date with all our latest news and company announcements

All the latest events we are hosting, sponsoring or attending

Meet our management team

Asset Control partners with data, implementation and managed service firms to enhance clients’ experience of our data management solutions

All the latest vacancies at Asset Control

Contact Us

Get in touch with the Asset Control team - wherever you are in the world

A comprehensive, responsive programme for day-to-day support

We are a truly global company with offices around the world

Fill out our contact form and we'll get in touch as soon as possible

From Lehman to Amazon, Rethinking Financial Data Management
May 2014

Asset Control examines how reconfiguring old models of data management could help the financial services sector meet evolving global regulatory requirements. The trick is to shift emphasis away from building huge data repositories, and to concentrate on developing a data supply chain that gets the right data to the right place at the right time. The Amazon-ization of financial data management is upon us.

The tide of regulation is rising inexorably, swamping the financial services sector with ever more prescriptive disclosure requirements; from Dodd-Frank to Basel III and Solvency II, the regulatory response to the enduring financial crisis continues to evolve, but the direction of change is constant. It’s widely acknowledged that the sector’s pre-crisis data architecture failed to support the management of financial risks. Banks’ inability to report risk exposures and identify concentrations quickly and accurately undermined the stability of the system and left financial institutions vulnerable. Institutions and regulators have since looked to strengthen IT infrastructure to help mitigate risk – but the belief that the sector’s problems can be solved by yet more data misses the point. It’s time to rethink financial data management.

The solution lies not in mighty infrastructure and huge repositories of data, but in treating the management of market and risk information as a matter of logistics. By shifting the focus away from accumulation and onto delivery, the development of a data supply chain model can help ensure that financial services organizations receive accurate risk information, reliably and on time – all the time.

Organizations need to develop a framework that begins with the ends, not the means. Progress requires the ‘Amazon-ization’ of financial data management, where activity is focused on ensuring the right package of data is delivered to the customer’s inbox at the right time – with everything working backwards from that primary objective.

Need for speed
The Principles for Effective Risk Data Aggregation and Risk Reporting included in Basel III mandate banks to impose strong data governance to assure the secure organization, assembly and production of risk information. The principles, similar to Dodd-Frank in the US, begin with traditional notions of soundness: risk reporting should be transparent, and the sourcing, validation, cleansing and delivery of data should be tightly controlled and auditable. But the new regulatory model also makes timeliness and adaptability fundamental requirements. This is a significant change from Basel II, which addressed the formulation of risk models in detail but, in retrospect, failed to identify the need for speed.

The data supply chain approach is a challenge to incumbent models that largely focus on the aggregation and organization of huge volumes of data. In a dynamic environment complicated by the seemingly boundless diversity of financial instruments, multiple data sources, dozens of downstream systems and myriad reporting requirements, institutions have to look beyond ‘big data’ to the dynamics of their organization and information needs. They want relevant, consistent and accurate data that can provide them with reliable positions, based on the timely and appropriate delivery of reference, pricing and volatility profiles, under consistently defined risk scenarios. End-of-day reporting is the minimum standard – in a global marketplace stretched across multiple time zones, several daily snapshots are needed. The challenge is monumental – but fighting size with size is impractical.

The Solution
The old approach of building a vast bucket of ‘golden data’ is a static concept that’s no longer fit for purpose. On its own, the golden data set is worthless. The value lies not in the volume, but in how it is put to use. To derive maximum value, financial data management systems need reorienting: dynamic concepts must replace static frameworks.

The core components of data management – capture, validation and delivery – remain the same. But to regard data aggregation and cleansing as the primary objective and justification of the system is to start at the wrong place. The process should begin from the end-user’s perspective, with Chief Data Officers considering two key questions: who am I delivering this data to? And under what Service-Level Agreement (SLA)? By adopting an SLA-led approach and focusing on the end-game of delivery, it becomes much easier to work backwards and align performance (and costs) with business needs. With the overarching SLA as the start-point, data management becomes a logistics exercise whose primary objective is to get the right data, to the right people in time to meet their local SLAs – in effect, a data supply chain.
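The SLA-led approach described above can be sketched in a few lines. This is purely an illustrative model under assumed names (`Consumer`, `latest_start` and the example deadlines are hypothetical, not part of any Asset Control product): each downstream consumer carries its own local SLA deadline, and the schedule is planned backwards from that deadline rather than forwards from data capture.

```python
from dataclasses import dataclass
from datetime import time

@dataclass
class Consumer:
    """A downstream data consumer with its own local SLA (illustrative)."""
    name: str
    deadline: time           # local SLA: data must arrive by this time
    processing_minutes: int  # time needed to prepare this consumer's feed

def latest_start(consumer: Consumer) -> time:
    """Work backwards from the SLA deadline to the latest permissible start."""
    total = consumer.deadline.hour * 60 + consumer.deadline.minute
    start = total - consumer.processing_minutes
    return time(start // 60, start % 60)

# Hypothetical consumers: a morning risk engine and end-of-day reporting.
consumers = [
    Consumer("risk_engine", time(7, 0), 45),
    Consumer("eod_reporting", time(18, 30), 90),
]

# Ordering work by latest start time is the "begin with the ends" planning
# step: delivery deadlines drive the schedule, not data arrival.
for c in sorted(consumers, key=latest_start):
    print(f"{c.name}: start no later than {latest_start(c)}")
```

The point of the sketch is the direction of the computation: the deadline is the input and the start time is the output, which is the inversion of perspective the SLA-driven model calls for.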

The Amazon model of delivery does not start at the warehouse – it begins, as it ends, with the customer. The entire supply chain is optimized to deliver the best possible customer experience.  Financial data management must adopt the same model.

The second, critical shift in perspective is to recognize that change is a constant.  In a vibrant market, products, processes and organizations are always subject to innovation and evolution.  Financial data models need to be dynamic, adjusting quickly to capture new products created to solve client needs in new ways.  The patterns of information distribution need to evolve too, as organizations adapt to changing market opportunities across asset classes and geography. 

Increasingly, proactive organizations are deploying strategies that do indeed regard data management as a dynamic logistics activity. The most effective have placed a data management platform at the center of the complex multi-source, multi-system distribution process – taking inputs from vendor feeds and departmental sources, testing them for quality and routing them through the platform to downstream systems and users.  As data flows through the system, the platform provides the framework for auditing activity and monitoring performance against critical SLAs.
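The hub-and-spoke pattern described above can be illustrated with a minimal sketch. All names here (`Platform`, `validate`, the record fields) are assumptions for illustration, not any vendor's real interface: inputs from any source pass a quality gate, clean records are routed to every downstream subscriber, and every decision is written to an audit trail that SLA monitoring can inspect.

```python
def validate(record: dict) -> bool:
    """Minimal quality gate (illustrative): a price must be present and positive."""
    return record.get("price") is not None and record["price"] > 0

class Platform:
    """Central routing hub: validate, distribute, and audit every record."""

    def __init__(self, subscribers):
        self.subscribers = subscribers  # downstream systems keyed by name
        self.audit_log = []             # every routing decision is recorded

    def ingest(self, source: str, record: dict):
        ok = validate(record)
        self.audit_log.append({"source": source, "record": record, "accepted": ok})
        if ok:
            # Clean data fans out to all downstream systems from one place,
            # replacing many point-to-point connections.
            for handler in self.subscribers.values():
                handler(record)

# Hypothetical usage: one downstream risk engine subscribing to the hub.
received = []
platform = Platform({"risk_engine": received.append})
platform.ingest("vendor_feed", {"isin": "US0378331005", "price": 191.5})
platform.ingest("vendor_feed", {"isin": "US0378331005", "price": -1})
```

Only the first record reaches the subscriber, but both ingestions appear in the audit log: the hub is simultaneously the distribution point and the evidence trail for compliance.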

Such systems simplify the technical challenges significantly. Because they eliminate potentially hundreds of point-to-point connections, they make the administration, control and delivery of reference, market and risk data much more manageable. Moreover, workflows become more efficient, enabling organizations to save time and money. Crucially, the centralized approach built around the effective development of a data supply chain, is helping companies mitigate risk and meet the growing demands of regulatory compliance.

Although we may have survived the consequences of regulatory and information failures that characterized the financial crisis, organizations cannot afford to be complacent. A reliance on inefficient legacy models will no longer suffice.

To progress, Chief Risk Officers and Chief Data Officers must drive the reconfiguration of financial data management – and establish it as a logistical exercise. By adopting an SLA-driven approach, the sector can make the journey from Lehman to Amazon. It’s time, quite literally, to deliver.