
New FSA rules push financial data quality management efforts at banks

Tracey Caldwell, Contributor

The financial services sector in the UK is under pressure to get its data quality management practices in order as it works to comply with a variety of new government regulations, and analysts say it looks likely that some banking firms will struggle to meet the demands.

For example, banks are now expected to submit an increasing number of reports based on clean and consistent data in order to meet new liquidity rules that the Financial Services Authority (FSA) put into effect last December as well as emerging single-customer-view regulations that the FSA plans to implement by the end of this year.

The growing financial data quality management requirements are having a measurable impact on the IT budgets at financial services firms, according to Rachel Hunt, an EMEA banking industry analyst at IDC Financial Insights.

“It is particularly true in the UK that banks are going to be increasingly stretched to meet new regulatory requirements,” Hunt said. “From an IT budget perspective, regulatory compliance used to be around 15% of the IT budget and we have seen it grow to 20-25%.”

Hunt added that the cost of complying with regulations “is usually vastly underestimated.” For example, on the single-customer-view rules, “the FSA had very low investment requirements forecasts for the institutions,” she said.

“They were saying, 'This should have very little impact on your infrastructure because you should have already worked on an integrated view of the customer.' But the reality is that you have huge legacy systems [and] data sets in various applications, and it is very time-consuming to get the [required] data.”
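
Hunt’s point is easier to see with a concrete example. The sketch below is a minimal, hypothetical illustration of the matching-and-merging work a single customer view involves; the field names, record layout and the naive deterministic matching rule are illustrative assumptions, not anything prescribed by the FSA:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CustomerRecord:
    source_system: str   # e.g. a legacy savings or mortgage platform
    customer_id: str     # ID local to that system
    name: str
    date_of_birth: str   # ISO date, assumed already cleansed
    postcode: str
    balance_gbp: float

def match_key(rec: CustomerRecord) -> tuple:
    """Naive deterministic matching: normalised name + DOB + postcode.
    Real single-customer-view projects typically need fuzzy matching
    and manual review queues on top of a rule like this."""
    return (
        rec.name.strip().lower(),
        rec.date_of_birth,
        rec.postcode.replace(" ", "").upper(),
    )

def build_single_customer_view(records):
    """Group records from all source systems by match key and
    aggregate the balances held per customer."""
    view = {}
    for rec in records:
        entry = view.setdefault(
            match_key(rec), {"sources": [], "total_balance_gbp": 0.0}
        )
        entry["sources"].append((rec.source_system, rec.customer_id))
        entry["total_balance_gbp"] += rec.balance_gbp
    return view

if __name__ == "__main__":
    # The same customer, recorded inconsistently in two legacy systems
    legacy_records = [
        CustomerRecord("savings", "S-001", "Jane Smith", "1970-01-01", "SW1A 1AA", 30000.0),
        CustomerRecord("mortgage", "M-778", "jane smith", "1970-01-01", "sw1a1aa", 1500.0),
    ]
    for key, entry in build_single_customer_view(legacy_records).items():
        print(key, entry)
```

Exact-match keys like this break down quickly across inconsistent legacy data sets, which is why real projects layer fuzzy matching and manual review on top, and why the work is so time-consuming.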

Meeting the FSA’s schedules for upgrading systems and data quality processes could also be a challenge for banks. Last January, the FSA sent a 'Dear CEO' letter to financial sector firms, setting out how it would monitor compliance with the new liquidity regime. The letter also asked the CEOs to confirm in writing by mid-February that their firms were complying with the new systems and controls requirements, or else outline any remaining actions that needed to be taken in order to become compliant.

Banks: More to do on improving financial data quality management
An FSA spokeswoman said: “It is fair to say that the majority of responses were, ‘Nearly there, but there are more things to do.’ That is what we expected, and we are following up with firms as you would expect us to.”

The spokeswoman added that the FSA began consulting with banks on the new liquidity rules back in December 2007, “so there has been a long lead-in time to this”. She also noted that the agency asked some of the UK’s larger banks to provide similar information on a regular basis during the credit crisis that resulted in the nationalisation of Northern Rock PLC in 2008. “So some will be better placed than others for providing the information or for the changes they need to make to their systems,” she said.

PJ Di Giammarino, CEO of JWG, a London-based analyst firm that focuses on financial services regulations, said he expects the liquidity rules to create a “data tsunami”. He also thinks that UK officials have given banks little understanding of what constitutes good data and that the data quality improvement initiatives could be undermined by inconsistent demands from different regulatory teams within government.

“There is an awful lot of naivety in the policy making,” Di Giammarino said. “It is all fine and good asking for lots of information. But at the end of the day, you have to ingest it [from] tens of thousands of players in the industry who need to have a common understanding and agreed definition of what they think they are sending you, and what good information looks like.”

The liquidity reporting requirements took effect on 1 June for large financial services firms. However, after discussions earlier this year with bank IT departments and the FSA’s own systems developers, agency officials decided in March to begin with a ‘soft launch’ of the new online reporting system that firms will use to submit their liquidity data each week.

As part of the soft launch, participating banks will use an Excel-based workbook to submit two of the new liquidity reports through to 3 September. The FSA also said it wouldn’t launch a thematic investigation of firms’ data quality during the soft-launch period, although the agency pointedly advised banks to submit test reports via the new GABRIEL online system on a weekly basis in order to prove out their data quality process and systems.
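
To give a sense of what proving out a data quality process might involve, the sketch below shows the kind of pre-submission validation a firm could run over its report rows before uploading them. The field names, currency whitelist and checks are hypothetical examples, not FSA-specified requirements:

```python
def validate_liquidity_report(rows):
    """Run basic completeness and consistency checks on report rows
    before submission. Returns a list of human-readable issues;
    an empty list means the checks passed."""
    required_fields = ("instrument_id", "currency", "maturity_bucket", "value_gbp")
    issues = []
    for i, row in enumerate(rows):
        # Completeness: every required field must be present and non-empty
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append(f"row {i}: missing {field}")
        # Consistency: reported values should be non-negative
        value = row.get("value_gbp")
        if isinstance(value, (int, float)) and value < 0:
            issues.append(f"row {i}: negative value_gbp ({value})")
        # Validity: currency must come from an agreed reference list
        if row.get("currency") not in {"GBP", "USD", "EUR"}:
            issues.append(f"row {i}: unexpected currency {row.get('currency')!r}")
    return issues

if __name__ == "__main__":
    report = [
        {"instrument_id": "GILT-2025", "currency": "GBP",
         "maturity_bucket": "1W", "value_gbp": 5_000_000},
        {"instrument_id": "", "currency": "JPY",
         "maturity_bucket": "1M", "value_gbp": -100},
    ]
    for issue in validate_liquidity_report(report):
        print("DATA QUALITY ISSUE:", issue)
```

Weekly test submissions are, in effect, a chance to exercise checks like these while the results still carry no regulatory consequences.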

Meanwhile, the FSA has indefinitely delayed the implementation of the quantitative aspects of the liquidity regime because of the continuing problems with the economy as a whole.

Adding to the uncertainty about the rules is the fact that the new Conservative government is acting on its promised plans to scrap the FSA and replace it with a regulatory authority reporting directly to the Bank of England, while also introducing a new Financial Policy Committee and a Consumer Protection and Markets Authority. Following on from an announcement made by Chancellor George Osborne on 16 June, more details on the plan were expected in late July, with legislation abolishing the FSA due by 2012.

Nevertheless, government officials say the regulatory focus will continue to be on improving trust and confidence in the financial sector.

As part of that effort, the single-customer-view requirements could also pose compliance challenges for banks, based on survey results released on 23 July. The survey, commissioned by SAS Institute Inc’s DataFlux data quality software subsidiary and conducted by JWG, focused on data management professionals within the financial services industry. Only 59% of the respondents said they had heard of the new requirements, according to JWG and DataFlux.

Financial data quality management, regulatory compliance go hand in hand
The UK financial sector does recognise the importance of good data quality to regulatory compliance efforts, according to a multi-country survey of senior IT managers and data pros at large companies, which was jointly carried out in late 2009 by London-based BDRC Continental, Paris-based PAC and US-based Lodestar Research, also for DataFlux.

A full 100% of the UK respondents said data quality was either "very important" or "extremely important" for compliance projects. More than 90% said they expected to see more data-related regulations in the near future, and 73% reported that investments in data management projects at their firms were mainly driven by compliance requirements.

According to IDC’s Hunt, some UK banks are ahead of the curve when it comes to data quality. “Data warehousing and data cleansing have been for some banks one of the key areas where they have spent a lot of money,” she said. “But for a lot of other banks, the data cleansing issue still has to be improved.”

There is, however, one small bright spot for data management pros at UK financial services firms: at least they don’t have it as bad as some of their European counterparts when it comes to complying with financial data quality management rules.

“The FSA has really ramped up its regulatory requirements,” Hunt said. “But compared to some of the other European countries, it is still fairly light touch.”

Tracey Caldwell is a freelance writer based in the UK.

