Best practices for creating a data quality framework for your organisation
Asset Control Sales Team
Bad data adds time and cost, degrades customer service, cripples decision-making and reduces firms’ ability to comply with regulations. With so much at stake, how can financial services organisations improve the accuracy, completeness and timeliness of their data? What approaches and technologies are available to ensure data quality meets regulatory requirements as well as their own data quality objectives?
This webinar discusses how to establish a business focus on data quality, how to develop metrics, and lessons learned from rolling out data quality enterprise-wide. It examines fixing data quality problems in real time and how dashboards and data quality remediation tools can help. Lastly, it explores new approaches to improving data quality using AI, machine learning, NLP and text analytics tools and techniques.
Key topics discussed:
- Limitations of an ad hoc approach to data quality
- Where to start, the lessons learned and how to roll out a comprehensive data quality solution
- How to establish a business focus on data quality and develop effective data quality metrics
- Using new and emerging technologies to improve data quality and automate data quality processes
Speakers:
- Neil Sandle, Head of Product Management, Asset Control
- James Whale, Head of Data Quality Management, Deutsche Bank
- Alex Brown, Chief Technology Officer, Datactics
- Ellen Gentile, Director of Enterprise Data Quality & Data Quality Incident Management, Sumitomo Mitsui Banking Corporation (SMBC)
- Moderator: Andrew Delaney, Chief Content Officer
Download this webinar now.
Asset Control is delighted to have sponsored this A-Team webinar discussing best practices for creating a data quality framework for your organisation.