Most Indian organizations suffer from poor data quality. Unfortunately, for most organizations, operational challenges have become part of their day-to-day activities, forcing them to devise both simple and complex workarounds to compensate for insufficient data quality.
However, for organizations exposed to financial risk, the issue goes beyond poor data quality alone. Though these companies suffer from poor data quality, they are nonetheless able to function with apparent efficiency. For example, a data quality assessment revealed that a particular bank had over €2 billion in corporate loan exposures without maturity dates. (Source: Informatica)
This case is a classic example of poor data quality and inadequate business process. While the repercussions of such a scenario can be appalling, the lack of data quality did not seem to affect the bank's business until the most recent credit crunch. So while in the past the bank would not have prioritized this data deficiency, deeming other issues more worthy of attention and budget, today its senior executives recognize data quality as a critical factor in supporting a range of banking reports.
Driving these attempts to improve data quality is Basel II, the revised international framework for capital adequacy in banking. Since Basel II is used to assess risk, the underlying quality of data is critical to delivering a report with any level of confidence. Today, financial institutions adopt Basel II not simply because it is a compliance directive but also because, for many, it embodies best practice.
Basel II (and particularly Pillar II of the Basel Accord) puts responsibility on financial institutions in the area of data quality and data management. Banks must look at the accuracy of their risk exposure calculations throughout the entire business. For many, this encompasses the exposures from businesses in many different countries and the resulting data quality.
In the West, banks are required to self-certify the accuracy, completeness, and appropriateness of Basel-critical data, which helps ensure data quality. This is a practice that regulators such as the Financial Services Authority (FSA) in the United Kingdom, the Federal Reserve in the United States, and the Bundesbank in Germany have mandated to maintain data quality. Indian banks therefore need to tailor their data management strategy to this requirement. Last year, the RBI (Reserve Bank of India) issued a notification laying down a time schedule for all scheduled commercial banks operating in the country to implement the advanced approaches for regulatory capital measurement under the Basel II framework.
An example of the explicit requirements for data quality, as per these regulations, is highlighted in the FSA’s application pack for internal ratings-based (IRB) approvals: “Describe how the firm ensures that IRB data standards are met, and in particular how it ensures the accuracy, completeness, and appropriateness of the data underlying the firm’s regulatory capital calculations.” This criterion has now pushed data quality from “would be nice to have” to “should have” – an issue that must be addressed to comply with banking regulations.
Banks can test data accuracy and overall data quality by establishing quantified, documented targets and robust processes. This can be done in the following ways:
• Establish key risk indicators to monitor and ensure data accuracy or data quality
• Fully document processes for business and IT infrastructure
• Develop a comprehensive quantitative audit program
• Reconcile inputs and outputs of capital calculation with accounting systems
• Assign every exposure a probability of default (PD), loss given default (LGD) and, if applicable, a credit conversion factor (CCF)
• Set clear and documented standards on ownership and timeliness of data
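As an illustration of the first and fifth points, the sketch below shows how a key risk indicator over exposure data might be computed in Python. The record fields ("maturity_date", "pd", "lgd") are hypothetical and stand in for whatever attributes a bank's own systems use.

```python
# Illustrative key risk indicator (KRI) over Basel-critical exposure data.
# Field names are hypothetical, not from any specific banking system.

def exposure_kri(exposures):
    """Return simple data quality indicators for a list of exposure records."""
    total = len(exposures)
    missing_maturity = sum(1 for e in exposures if not e.get("maturity_date"))
    missing_risk_params = sum(
        1 for e in exposures if e.get("pd") is None or e.get("lgd") is None
    )
    return {
        "total_exposures": total,
        "pct_missing_maturity": 100.0 * missing_maturity / total if total else 0.0,
        "pct_missing_risk_params": 100.0 * missing_risk_params / total if total else 0.0,
    }

exposures = [
    {"maturity_date": "2012-06-30", "pd": 0.02, "lgd": 0.45},
    {"maturity_date": None, "pd": 0.01, "lgd": 0.40},   # missing maturity date
    {"maturity_date": "2013-01-15", "pd": None, "lgd": 0.50},  # missing PD
]
print(exposure_kri(exposures))
```

Tracked over time, indicators like these give the quantified, documented targets the checklist calls for; the €2 billion in exposures without maturity dates cited earlier is exactly what such a KRI would surface.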
To operationalize these priorities for maintaining data quality, banks will require consolidated data collection across the institution: data from all business units is brought together into a single source, typically a data warehouse, from which reports are generated for risk and Basel II-related decisions.
Data quality and Basel II
Maintaining data quality and adhering to Basel II are two key priorities that go hand in hand. Most banks address them by investing in data infrastructure: data warehouses, risk engines, business intelligence (BI) layers, and data integration software.
Unfortunately, at no point in the data stream is data quality managed as a full-fledged function. Instead, organizations make do with tools not designed specifically for the purpose. Since data quality is the converging point of infrastructure and the business, this is an important oversight. More importantly, data quality is an explicit requirement for Basel II compliance.
Score-carding: When the FSA's consultation paper CP189 proposed score-carding as an external audit point, score-carding became a focal point for data quality in Basel II.
Data quality firewalls: The selected solutions must extend the compliance score-carding approach to ensure that “data quality firewalls” are applied in front of the risk engines. This is irrespective of whether these are home-grown solutions or acquired from third parties.
Identifying poor data quality before it enters the engine is the firewall's main function. This removes the need for manual remediation of the risk engine's log files and ensures that only high-quality data enters the risk engines. Firewalls perform both automated and manual tasks.
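A minimal sketch of such a firewall, assuming hypothetical rule names and record fields (a real firewall would load its rules from the bank's documented data quality standards):

```python
# Illustrative "data quality firewall" placed in front of a risk engine.
# Rules and fields are hypothetical examples, not a vendor's actual rule set.

RULES = {
    "maturity_date_present": lambda rec: rec.get("maturity_date") is not None,
    "pd_in_range": lambda rec: rec.get("pd") is not None and 0.0 <= rec["pd"] <= 1.0,
    "exposure_positive": lambda rec: (rec.get("exposure") or 0) > 0,
}

def firewall(records):
    """Pass clean records through to the risk engine; quarantine the rest
    along with the names of the rules each record failed."""
    clean, quarantined = [], []
    for rec in records:
        failed = [name for name, rule in RULES.items() if not rule(rec)]
        if failed:
            quarantined.append((rec, failed))
        else:
            clean.append(rec)
    return clean, quarantined

records = [
    {"maturity_date": "2015-03-31", "pd": 0.03, "exposure": 1000000},
    {"maturity_date": None, "pd": 1.7, "exposure": 500000},
]
clean, quarantined = firewall(records)
```

Here only the first record reaches the risk engine; the second is quarantined with its failed rules attached, which is the automated half of the firewall's work. The manual half is the remediation of quarantined records by data stewards.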
The ideal risk solution, implemented to maintain data quality, should analyze all types of master data: customer and counterparty data, market and credit data, and financial, reference, and transactional data. This includes key data related to PD, LGD, and exposure at default.
Risk and Basel II data quality management
The chosen solution should provide a data quality management framework that gives the business full assurance in the following respects:
• Act on areas identified for improvement without threatening the quality of existing data
• Handle change requests and new developments without compromising the quality of existing data
• Manage data quality on an aligned and integrated basis, in keeping with best practices for legacy data management and new business development
• Measure and monitor data quality using existing and newly created internal reference data sources, third-party reference data sources, and the solution's own reference data
• Assure senior management and the board of the accuracy of the data being stored, generated, and used for decisions
• Identify incidences of non-conformance, and monitor and cleanse gaps in data accuracy on an ongoing basis
Data quality starter pack: Risk & Basel II management
When deciding on a starter pack for risk and Basel II management, the solution chosen to banish poor data quality should provide the following benefits:
• Framework data quality rules in specific areas, covering key attributes for: PD calculation, risk-weighted asset calculation, obligors (dates, basic address), exposures (dates, amounts, and limits), ratings (obligor and product), and securitization.
• A BI-vendor-independent reporting schema that supports: drill-down by multiple dimensions, high-level aggregated data quality metrics for senior management, and detailed results, including indicators of potential loss on a per-business-rule basis.
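The score-carding idea behind that second point can be sketched as follows. This is a toy aggregation, not a vendor schema: rule results are assumed to arrive as (business unit, rule, passed) tuples, giving both the high-level per-rule figure for senior management and a drill-down by business unit.

```python
# Illustrative score-carding: aggregate per-rule data quality metrics,
# with drill-down by business unit. Rule and unit names are hypothetical.
from collections import defaultdict

def scorecard(results):
    """results: iterable of (business_unit, rule_name, passed) tuples.
    Returns (% passed per rule, % passed per business unit)."""
    by_rule = defaultdict(lambda: [0, 0])   # rule -> [passed, total]
    by_unit = defaultdict(lambda: [0, 0])   # drill-down dimension
    for unit, rule, passed in results:
        for bucket in (by_rule[rule], by_unit[unit]):
            bucket[0] += int(passed)
            bucket[1] += 1
    pct = lambda p, t: 100.0 * p / t
    return (
        {rule: pct(p, t) for rule, (p, t) in by_rule.items()},
        {unit: pct(p, t) for unit, (p, t) in by_unit.items()},
    )

results = [
    ("corporate", "maturity_date_present", True),
    ("corporate", "maturity_date_present", False),
    ("retail", "maturity_date_present", True),
    ("retail", "pd_assigned", True),
]
rule_scores, unit_scores = scorecard(results)
```

The same pass/fail facts roll up two ways, which is what makes the scorecard useful both as an aggregated board-level metric and as an audit trail down to individual business rules.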
About the author: Suganthi Shivkumar is the managing director of Informatica, South Asia.