Not long ago The Economist published an article that blamed, in part, poor data quality at banks for the 2008 financial meltdown. In essence, data silos and poor data quality had exacerbated the problem of accurately determining risk exposure at that time.

Recently BIIA came across a white paper, provided courtesy of Informatica, that deals specifically with the importance of data quality in risk assessment. Its author, David Loshin, is president of Knowledge Integrity, Inc., and a recognized thought leader and expert consultant in the areas of data quality, master data management, and business intelligence.

Information is used to anticipate risks and to develop methods for mitigating them; errors in the data can therefore snowball into issues with risk analysis, assessment, and management. Low-quality data can have undesirable effects, and risk management can benefit from identifying and assessing those risk factors that are magnified when organizational data sets do not meet business expectations.

The attached white paper, targeted at technical analysts who seek to understand the connection between information and optimal business performance, categorizes the risk dimensions of business value drivers, their corresponding performance measures, and a process for evaluating the link between acceptable performance and quality information. These support the practitioner’s development of quantifiable performance objectives for data quality improvement.

To access this white paper, click on the attachment: 1542_KnowledgeIntegRiskdq