The New Data Model
As the financial industry starts to explore the growing range of alternative data sources, there is a significant opportunity to gain new insights that can drive revenue, speed up time to market for new initiatives and reduce overall costs. Unlocking the value in this new data diversity is not only fundamentally changing the sourcing and mastering step of the traditional Financial Data Management (FDM) model; it also requires efficient data exploration and easy access to, and distribution of, this data in actionable form across the business.
From investment assessment to Know Your Customer (KYC), the way in which financial institutions approach decision making is set to change fundamentally over the next year as organizations begin to on-board and explore a raft of new data sources.
While this new data management model may initially have been driven by regulatory demands for change, the sheer depth of information now created and collected globally is extraordinary – and is set to take the industry far beyond the traditional catalog of price and reference data sources.
No longer will organizations be limited to published financial statements and earnings calls. From social media sentiment analysis to full earnings-call transcripts that reveal who is asking specific questions and how CEOs and CFOs respond, there is now a much broader and deeper data set that can be analyzed to deliver fast, actionable investment insight. Similarly with KYC: the ability to deep-dive rapidly through multiple diverse data sources offers a chance to address the escalating overhead associated with customer on-boarding and reduce the cost of doing business.
But where does this leave traditional FDM solutions that play a critical role in acquiring and mastering traditional data sources?
The mastering process must still provide a 360-degree version of the truth that can be used across the organization, from valuation to risk and financial reporting; the addition of new data sources reinforces the need for structured processes that compare sources, find discrepancies and deliver that golden source. But this process must now also deliver seamless integration, with organizations looking for robust Application Programming Interfaces (APIs) that enable end users to stitch together and explore these new data sources quickly.
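The comparison step described above can be sketched in a few lines. The following is a minimal, illustrative example only: the vendor names, field names, tolerance and "most trusted vendor wins" priority rule are all assumptions, not a description of any specific FDM product.

```python
# Minimal sketch of a source-comparison step in a mastering pipeline.
# Vendors, tolerance and the priority rule are illustrative assumptions.

PRICE_TOLERANCE = 0.005  # flag differences above 0.5%

def master_price(instrument_id, vendor_prices, priority):
    """Compare vendor prices, flag discrepancies, and pick a golden value.

    vendor_prices: mapping of vendor name -> quoted price
    priority: vendors ordered from most to least trusted
    """
    quotes = [vendor_prices[v] for v in priority if v in vendor_prices]
    if not quotes:
        raise ValueError(f"no prices for {instrument_id}")

    reference = quotes[0]  # most trusted available vendor wins by default
    discrepancies = {
        vendor: price
        for vendor, price in vendor_prices.items()
        if abs(price - reference) / reference > PRICE_TOLERANCE
    }
    return {"golden": reference, "discrepancies": discrepancies}

result = master_price(
    "XS0123456789",
    {"vendor_a": 101.25, "vendor_b": 101.30, "vendor_c": 99.80},
    priority=["vendor_a", "vendor_b", "vendor_c"],
)
```

Here vendor_c's quote deviates by well over the tolerance and is flagged for review, while the golden value is taken from the highest-priority source; a production pipeline would also record lineage and route the exception to a data steward.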
In addition to adding new depth to traditional information, these data sources also change the emphasis of the mastering process. Rather than focusing on error detection to achieve consistency and accuracy, these sources enable organizations to undertake pattern discovery, leveraging new techniques to spot correlations or reveal unusual activity. Speed is critical; intelligent data mastering is at the heart of this new model.
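One simple form of the unusual-activity detection mentioned above is a trailing z-score over a time series, such as daily social media mention counts. The sketch below is purely illustrative; the window size, threshold and sample data are assumptions.

```python
# Hedged sketch: flag unusual points in a series via a trailing z-score.
# Window size, threshold and the sample data are illustrative assumptions.
import statistics

def unusual_points(series, window=5, threshold=3.0):
    """Return indices whose value deviates strongly from the trailing window."""
    flagged = []
    for i in range(window, len(series)):
        trail = series[i - window:i]
        mean = statistics.fmean(trail)
        stdev = statistics.pstdev(trail)
        if stdev and abs(series[i] - mean) / stdev > threshold:
            flagged.append(i)
    return flagged

# Steady daily mention counts with one spike at index 8.
mentions = [10, 11, 9, 10, 12, 11, 10, 11, 60, 10]
spikes = unusual_points(mentions)  # flags the spike at index 8
```

Real deployments use far more sophisticated techniques, but the principle is the same: score each new observation against recent history and surface the outliers fast enough to act on them.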
Moreover, to maximize the value of these data sources, organizations also need to reconsider access and utilization. Making these new data sets easily accessible, not only to new algorithms and data scientists but also to end users within risk, investment, operations and compliance, will mark a significant step change in data exploitation.
Ensuring the data integrates easily with the languages adopted by data scientists is fundamental; but to deliver its full potential value to end users, data analysis must evolve beyond the traditional technical requirement of writing SQL queries. Offering end users self-service access via enterprise search, a browser, Excel and easy-to-understand interaction models, rather than via proprietary APIs and custom symbologies, will open up these new data sources to deliver even greater corporate value.
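To make the self-service idea concrete, a mastered data set can be exposed as plain CSV over HTTP, which both a browser and Excel can consume directly with no SQL involved. This is a toy stdlib sketch under stated assumptions: the sample data, port and endpoint are invented for illustration, and a real distribution layer would add authentication, entitlements and caching.

```python
# Hedged sketch: serve a mastered data set as CSV over HTTP so browser and
# Excel users can pull it without SQL. Data, port and layout are assumptions.
import csv
import io
from http.server import BaseHTTPRequestHandler, HTTPServer

GOLDEN_PRICES = [
    {"instrument": "XS0123456789", "price": 101.25, "source": "vendor_a"},
    {"instrument": "US0378331005", "price": 172.10, "source": "vendor_b"},
]

def to_csv(rows):
    """Render a list of dicts as CSV text, ready for Excel or a browser."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

class PriceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = to_csv(GOLDEN_PRICES).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/csv")
        self.end_headers()
        self.wfile.write(body)

# To serve locally (blocks until interrupted):
# HTTPServer(("localhost", 8000), PriceHandler).serve_forever()
```

Excel's "From Web" import, or simply opening the URL in a browser, would then give end users the golden data directly.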
These new data sources are radically different from traditional data resources, and their potential value to an organization remains largely untapped. The key for financial institutions over the next year or so is to move beyond traditional data management models and embrace the new mastering and distribution services that will enable essential exploitation of data across the business.