Banks are under pressure to comply with the Basel Committee on Banking Supervision's (BCBS) 239 principles for effective risk data aggregation and risk reporting (RDARR), but it is not easy to get the policy and technical elements, such as governance and accuracy, right.
Risk data loads at financial institutions are growing as firms use more powerful analytical technologies to spot opportunities before their rivals, and regulators demand more stress tests, transparency and reporting than ever before.
These technological and regulatory forces are having an impact on investment, retail and corporate banks, asset managers and many other financial services players. They are evident in the European Union’s (EU) Markets in Financial Instruments Directive (MiFID) II, the US Dodd-Frank rules and the global Basel III capital adequacy regime, among others.
Collectively, the regulations mean financial institutions have to tag, aggregate and share risk data much better than before. At any time, they may be called upon to report their intraday liquidity under stressed conditions. The ‘new normal’ market conditions also mean more frequent margin calls, and business opportunities for institutions able to offer intraday pricing at the drop of a hat.
In the past, internal silos and external borders at large multinational banks have hampered the aggregation of data. Big banks often have diverse retail, investment or corporate banking units and standalone IT infrastructures left over from earlier merger and acquisition activity. It is not easy for them to get control of their own data, and sometimes that of counterparties, to prove that know your customer (KYC) and sanctions rules have been followed.
Connectivity to financial utilities, such as the Eurosystem’s TARGET2-Securities (T2S) single securities settlement engine, further complicates the data journey. Such utilities demand greater standardisation across financial services, which in turn underpins RDARR and aids governance and compliance.
Standardisation should also increase data flexibility and cut costs for financial institutions: data cannot be aggregated, reported and shared across silos without tags, common messaging protocols and underpinning standards.
Data aggregation vital
Aggregation is becoming a necessity in the more risk-averse environment evident since the 2008 financial crash, with new market and regulatory obligations apparent. The BCBS 239 principles are the recommended way of achieving aggregation and reporting to regulators, and can also provide a pathway to aligning financial institutions’ data.
As Michael Atkin, managing director of the Enterprise Data Management (EDM) Council trade body, says: “Data management among the Tier 1 banks is being driven by BCBS 239. It is a priority. It is funded and has the attention of executive management. The governance objective (principle 1 out of 14) is needed to ensure the bank adheres to the other stipulations because governance drives change.
“Please understand, however, that BCBS 239 is not regulation. These are mandatory principles applied to a host of regulations, including transparency-related ones such as the European Market Infrastructure Regulation (EMIR), which covers over-the-counter (OTC) derivatives; and resolution-related regulations (such as SR14-1); best execution ones (MiFID, for example); and stress testing (under Basel III).
“The 14 principles are being used by bank examiners when conducting audits.”
The principles are designed to encourage effective RDARR and, more broadly, to enhance the stability of the global financial system. The aim is to strengthen a bank’s risk data aggregation capabilities and internal and external risk reporting practices through 14 principles to which banks are expected to adhere [see below, The 14 BCBS 239 principles]. These should, in turn, enhance risk management and decision-making processes across the enterprise.
BCBS 239: technical elements
The Basel Committee wants banks to be able to prove they adhere to the technical principles in BCBS 239, which cover data clarity, timeliness, accuracy, comprehensiveness and so on, because these underpin effective reporting to the relevant supervisory authorities. This technical compliance duty generally falls to a bank’s chief information or technology officer (CI/TO).
Wise organisations also treat BCBS 239 compliance as a business project, using it to achieve fast, flexible cross-silo data that can improve the bottom line. The compliance aspect merely releases technology investment budget, which may otherwise be hard for a CIO to obtain from the board at a time when banks’ cost of operation is rising and return on equity is falling.
“You should map to BCBS 239 for business reasons, such as to offer competitive intraday market pricing, rather than as just a compliance exercise,” says Mark Davies, a managing director at DTCC Europe and head of its Avox reference data unit, which provides legal entity identifier (LEI) and other data services.
Standardisation work is also under way in the LEI field to underpin RDARR across financial services, and aid straight through processing (STP), under the auspices of the Global Legal Entity Identifier Foundation (GLEIF).
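The LEI itself is straightforward to sanity-check in code. As a rough illustration, the sketch below validates the ISO 17442 format: 20 uppercase alphanumeric characters whose ISO 7064 MOD 97-10 checksum (computed IBAN-style, with letters mapped to the values 10-35) leaves a remainder of 1.

```python
import re

def is_valid_lei(lei: str) -> bool:
    """Validate an ISO 17442 Legal Entity Identifier: 20 uppercase
    alphanumeric characters, ending in two check digits, where the
    ISO 7064 MOD 97-10 checksum of the whole string equals 1."""
    if not re.fullmatch(r"[A-Z0-9]{18}[0-9]{2}", lei):
        return False
    # Map letters to numbers (A=10 ... Z=35), as in IBAN checking.
    numeric = "".join(str(int(ch, 36)) for ch in lei)
    return int(numeric) % 97 == 1

print(is_valid_lei("5493001KJTIIGC8Y1R12"))  # example identifier; prints True
```

A single transposed or mistyped character changes the remainder, so the check catches most of the manual-entry errors that plague reference data.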
BCBS 239: policy elements and timeframe
The Basel Committee also wants banks to prove they have effective data policies – it is not just a technical exercise for CIOs. These policy elements are outlined in the data governance and architecture principles in BCBS 239, which are usually the responsibility of a bank’s chief risk officer (CRO).
But to have a successful overhaul, there has to be some co-operation between the people in these operational silos, just as there does for the data itself. Alignment of people, process and technology is a crucial aspect of any successful project. The CRO, CI/TO and chief data officer (CDO) must all co-operate to allow data to flow enterprise-wide across a financial institution.
The BCBS 239 principles formally took effect for the 30 global systemically important banks (G-SIBs) identified by the Basel Committee on 1 January 2016. In reality, though, many banks have struggled to meet the deadline, or have done so only by running a narrow compliance project instead of the hoped-for enterprise-wide data transformation.
“It’s quite tough,” says a spokesperson at a large European bank. “The cost of compliance in terms of time and resources is high.”
This is a view supported by Tom Spellman, a partner at Deloitte, who says: “Some banks are struggling to define appropriate strategies to cope with stress conditions. One of the most significant challenges, from a systems and IT infrastructure perspective, is the identification of appropriate fall-back solutions to enable the production of reporting using limited calculation capabilities.
“The identification of these ‘workaround strategies’ is challenging due to the impossibility of properly forecasting and modelling the impact that stressed market conditions will have on data-sourcing activities.”
Nick Bouch, financial services data leader at the PwC consultancy, says there are “many facets” to BCBS 239. “Don’t forget, it’s a wide-ranging project, covering policy imperatives such as data governance and technical elements covering integrity, and so on,” he says.
“It’s fair to say that some non-technical aspects, such as governance and control structures, are easier – or at least faster – to comply with than others. Some banks hit some of the 14 principles, but not necessarily all of them.”
Ed Royan, chief operating officer (COO) for EMEA at risk management supplier AxiomSL, says: “BCBS 239 will only become a priority when there are clear penalties for non-compliance.”
Read more about risk data governance in financial services
- What is the Basel Committee on Banking Supervision (BCBS)?
- Why European banks need to improve their handling of data if they are to recover from the financial crisis and comply with demanding legislation.
- Find out about Basel III.
- With big data in financial services requiring various skills, a chief data officer may be ready to step up. A Capgemini expert discusses this evolving role.
Slow progress is being made, but the principles are a useful RDARR framework for a bank to follow. It is also likely that smaller Tier 1, 2 and 3 banks and other financial institutions, which are not G-SIBs, will still map to BCBS 239 as a de facto standard within the industry – even if they are not actually forced to do so by regulators that are currently more concerned with systemically important banks. The timeframe is likely to run for the rest of this decade.
As PwC’s Bouch says, data projects are hard and take time. “You first need to work out what data you’ve got [collect, clean and possibly correct it to ensure accuracy], then aggregate it across silos, and deliver it in a timely manner, within a sound governance framework that updates to ensure ongoing relevance,” he says. It’s not a quick process.
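As a toy illustration of the aggregation step Bouch describes, the sketch below (the feed names, second LEI and figures are hypothetical) sums exposures per counterparty across two silo feeds, using the LEI as the common tag:

```python
from collections import defaultdict

# Hypothetical per-silo exposure feeds: each record tags an exposure
# with the counterparty's LEI so records can be joined across silos.
retail_feed = [
    {"lei": "5493001KJTIIGC8Y1R12", "exposure_eur": 1_200_000},
    {"lei": "815600F9719F29744419", "exposure_eur": 300_000},
]
investment_feed = [
    {"lei": "5493001KJTIIGC8Y1R12", "exposure_eur": 4_500_000},
]

def aggregate_exposures(*feeds):
    """Sum exposures per counterparty LEI across all silo feeds."""
    totals = defaultdict(int)
    for feed in feeds:
        for record in feed:
            totals[record["lei"]] += record["exposure_eur"]
    return dict(totals)

print(aggregate_exposures(retail_feed, investment_feed))
```

The hard part in practice is not the summing but everything before it: ensuring every silo actually carries the common tag, and that the tagged records are clean enough to trust.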
DTCC Europe’s Davies adds: “The big challenge in financial institution data governance is the transformation of data as it goes through the ‘chain’. Data handoffs happen all the time as clients are on-boarded, passed through KYC and go on to the buy or sell side of capital markets, or settlement.
“There are lots of departments, processes, counterparties and handoffs on this data journey – involving risk, credit, financial, legal, trade and other such activities – all of which can slightly alter data. Minor actions, such as the addition of capital letters, spaces, country codes and so on can all have massive impacts, making data lineage [history] a key issue.
“You can see that data governance and management is an important controlling influence in this complicated FI environment.”
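Davies’s point about minor formatting differences is easy to demonstrate. In the hypothetical sketch below, two silo records for the same counterparty fail a naive string comparison, but match once trivially normalised:

```python
import re

def normalise(name: str) -> str:
    """Canonicalise an entity name so trivial formatting differences
    (case, punctuation, runs of spaces) do not break matching."""
    name = name.upper()
    name = re.sub(r"[^\w\s]", "", name)       # drop punctuation
    name = re.sub(r"\s+", " ", name).strip()  # collapse whitespace
    return name

# Two silo records for the same counterparty, differing only in formatting.
kyc_record = "Acme  Holdings Ltd."
trade_record = "ACME HOLDINGS LTD"

print(kyc_record == trade_record)                        # prints False
print(normalise(kyc_record) == normalise(trade_record))  # prints True
```

Real entity-matching pipelines go much further (legal-form dictionaries, country-code mapping, fuzzy matching), but even this minimal step shows why unmanaged handoffs silently break aggregation.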
Davies points out the huge number of handoffs faced by G-SIBs because of their legacy IT architectures and cross-border and sector operations.
RDARR is useful to financial institutions in its own right, because it improves opportunity spotting in a ‘big data’ world where both data volumes and oversight demands are escalating. BCBS 239 can provide the impetus, but aggregating data is valuable anyway. Bank technologists can use other frameworks, or leverage pre-existing ones, to fit the compliance duty.
For instance, the EDM Council has a Data Management Capability Assessment Model (DCAM), which is a ‘how to’ guide and benchmark for data professionals and financial institutions seeking to improve their operations. The technical methodology synthesises data management best practice, but crucially also maps to the data architecture, quality and governance concepts expressed in BCBS 239, so if a bank has followed it already, it should have a head start.
“BCBS 239 is a set of guidelines that can provide a basic framework and a sense of direction for financial institution risk data aggregation and management,” says Chris Pickles, a Bloomberg adviser and member of the Open Symbology FIGI effort. “It can act as a basis for developing standards for the financial sector, but it isn’t the be-all and end-all. Banks have to work out for themselves how to adhere to the principles.”
The Financial Instrument Global Identifier (FIGI) is an attempt to introduce better instrument and corporate identification tags in a multi-asset trading and investment environment.
The Object Management Group (OMG) non-profit standards consortium supports the standardisation drive and adopted a 12-character metadata-based identifier as the basis for its FIGI specification in September 2014. FIGIs should make it easier to identify securities such as equities, fixed income, indices, derivatives, currency and structured instruments across various trading venues. Such standardisation is essential if the BCBS 239 principles are to be met.
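As a rough sketch (a structural check only, not the full OMG specification, whose trailing check-digit calculation is omitted here), the pattern below captures the published FIGI shape: 12 characters drawn from consonants and digits, with ‘G’ fixed in the third position and a final check digit:

```python
import re

# Structural FIGI check: consonants and digits only (vowels are excluded
# by the spec so identifiers cannot form words), 'G' in position three,
# and a trailing check digit. The check-digit arithmetic itself is not
# verified in this sketch.
FIGI_PATTERN = re.compile(
    r"[B-DF-HJ-NP-TV-Z0-9]{2}"  # two-character prefix
    r"G"                        # third character is always 'G'
    r"[B-DF-HJ-NP-TV-Z0-9]{8}"  # eight consonants/digits
    r"[0-9]"                    # trailing check digit
)

def looks_like_figi(identifier: str) -> bool:
    return FIGI_PATTERN.fullmatch(identifier) is not None

print(looks_like_figi("BBG000B9XRY4"))  # an example FIGI; prints True
print(looks_like_figi("US0378331005"))  # an ISIN, not a FIGI; prints False
```

A 12-character ISIN fails the pattern immediately, which is one practical benefit of the FIGI design: the two identifier families cannot be silently confused in a data feed.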
But Pickles cautions: “The requirements of global regulatory bodies such as the Bank for International Settlements or Financial Stability Board, and of national regulators such as the UK Prudential Regulation Authority, are not the same as the needs of financial institutions themselves. This is one of many reasons why BCBS 239 should not merely be a compliance exercise.
“In many ways, the regulators require a subset of the data that the financial institution itself needs to operate its business successfully and to manage risk effectively. This is the best way to look at it and approach the data aggregation and alignment challenge inherent in BCBS 239.”
Pickles concludes: “Making major changes to data management and governance at financial institutions is now cost-justifiable and unavoidable. The need to aggregate data across multiple activities, and related multiple regulations for risk management and reporting purposes, is pushing firms to re-evaluate and significantly change their current approaches.”
Neil Ainger is a freelance journalist covering financial services, treasury and FinTech. He was previously editor-in-chief of the gtnews and bobsguide online titles, covering treasury and financial technology respectively. He has also worked as deputy editor at Informa's Banking Technology and edited Financial Sector Technology (FSTech) magazine.
The 14 BCBS 239 principles
Principle 1: Governance
Principle 2: Data architecture and IT infrastructure
Principle 3: Accuracy and integrity
Principle 4: Completeness
Principle 5: Timeliness
Principle 6: Adaptability
Principle 7: Accuracy
Principle 8: Comprehensiveness
Principle 9: Clarity and usefulness
Principle 10: Frequency
Principle 11: Distribution
Principle 12: Review
Principle 13: Remedial actions and supervisory measures
Principle 14: Home/host co-operation