News

Solvency II data management confronts regulatory uncertainty

Kristina West, Contributor

The extension of the Solvency II compliance deadline from November of this year to January 2014 can be seen as both reassuring and concerning from a data management perspective. 

The European Union directive, aimed at regulating insurance firms within the EU, now affords insurance companies and their counterparties some much needed breathing room. But it also raises concerns about how preparations can be finalised when the legislation itself is still so uncertain.


This is particularly true for the data management function, which faces challenges both from the legislation itself and from the fact that the legislation has yet to take final form.

There is concern that even the new deadline may be missed, due to the continuing scarcity of detail on data regulations. 

“Most organisations are still far from being ready," says Juergen Weiss, a research vice-president at Gartner. "They are not postponing their data projects as such, but there is so much still to do on Pillar 2 [for the governance and risk management of insurers] and they want to avoid double work.”

Minimum compliance tactic for Solvency II data management

The current effect of the uncertainty is that insurance companies, in particular, appear to be aiming for what is being called "minimum compliance": doing only as much as is needed to ensure that their data is compliant, in order to avoid wasted effort and costly errors of anticipation.

Chris Gullick, director of data assurance at PricewaterhouseCoopers (PwC), says: “As it is closer to the deadline, and people are still not sure, they want the simplest solution. People know there may be an interim solution, so they are looking to rationalise and industrialise later.”

Under these constraints, the data issue that most insurers are flagging as key is data quality, especially with regard to preparations for Pillar 3 [for disclosure and transparency], which are now ramping up. According to Gartner’s Weiss, insurers initially under-invested in this area, but they now realise that data quality is central to the regulation. 

The process of ensuring data quality requires the establishment of internal standards: standardising and harmonising systems to ensure consistency of data, often combining that data in a central repository, and ensuring completeness of data and lineage from the repository to the reports.
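As an illustration only (none of the firms quoted describe their actual tooling, and the field names below are hypothetical), a data-quality gate of the kind described above might be sketched in Python as a completeness and consistency check applied before records reach a reporting repository:

```python
# Minimal sketch of a pre-load data-quality gate (hypothetical schema).
REQUIRED_FIELDS = {"isin", "asset_class", "market_value", "valuation_date"}

def validate_record(record):
    """Return a list of data-quality failures for one asset record."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "market_value" in record and record["market_value"] < 0:
        errors.append("negative market value")
    return errors

def validate_batch(records):
    """Split a batch into loadable records and rejects."""
    accepted, rejected = [], []
    for i, rec in enumerate(records):
        errs = validate_record(rec)
        # Tag each record with its source position so lineage back to
        # the original feed can be reconstructed from the repository.
        entry = {"row": i, "record": rec, "errors": errs}
        (rejected if errs else accepted).append(entry)
    return accepted, rejected
```

In practice such checks would be driven by the firm's internal data standards rather than hard-coded rules, but the principle is the same: records that fail validation are quarantined with their errors attached, preserving an audit trail.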

Laura Winter, associate partner at IBM, says it is also a matter of: “Acquiring data which is of sufficient quality, especially from external parties, and establishing what is considered to be ‘sufficient’ quality."

Nearly all firms will depend on fund managers, global custodians and other external firms for data, and controls need to be in place to ensure quality, timeliness and completeness of that data.

According to Maxime Gibault, head of insurance companies at BNP Paribas: “The quality of data depends on the information available. Prices must be completely accurate, and insurance firms and asset managers need system architecture, monitoring, cross-checks – there is a long procedure in place.” 

Issues including pricing, market data, frequency and exchange of data must be monitored to ensure a 100% straight through processing (STP) environment.
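The controls described above — on quality, timeliness and completeness of externally sourced data — lend themselves to simple automated monitoring. The following Python sketch (the feed thresholds are hypothetical, invented for illustration) flags late or incomplete daily deliveries from an external data supplier:

```python
from datetime import datetime

# Hypothetical control thresholds for one external custodian feed.
EXPECTED_POSITIONS = 1200        # expected record count per delivery
DELIVERY_DEADLINE_HOUR = 7       # feed due by 07:00 local time
COMPLETENESS_TOLERANCE = 0.01    # accept up to a 1% shortfall

def check_delivery(received_at, record_count):
    """Return a list of control breaches for one daily delivery."""
    breaches = []
    deadline = received_at.replace(hour=DELIVERY_DEADLINE_HOUR,
                                   minute=0, second=0, microsecond=0)
    if received_at > deadline:
        breaches.append("late delivery")
    shortfall = 1 - record_count / EXPECTED_POSITIONS
    if shortfall > COMPLETENESS_TOLERANCE:
        breaches.append(
            f"incomplete: {record_count}/{EXPECTED_POSITIONS} records")
    return breaches
```

A breach list like this would typically feed an exceptions workflow rather than block processing outright, so that the STP rate can be measured and reported alongside the data itself.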

Firms should also consider their liability in the case of errors in externally-collected data, and where this falls between themselves and their data suppliers. With a data chain containing three or more parties – such as the global custodian/asset manager/insurance firm chain – the possibility for error can increase exponentially.

Solvency II places demands on IT departments

PwC’s Gullick says: “Liability would be considered on a case-by-case basis, depending on the nature of the organisation providing the data. If an insurance firm is getting equities data from providers such as Reuters or Bloomberg, those companies trade on data, so the regulators would worry about them if that data is wrong. 

"If the data is from an agent, just providing data to one firm, there would be less of an argument. Firms need to show that they have thought about it and come up with a response, so that even if the regulators don’t like it, firms can show why they have done what they have done.”

The demands of Solvency II in terms of data are leading to particular challenges for the IT department, in terms of both technology and the associated budget. 

While many firms have already made significant improvements and investments in IT as part of their preparations for Pillar 1 [on quantitative requirements], the demands of Pillar 3 still require the provision of high quality information and the ability to report and publish this data according to regulatory requirements. This will place demands on IT in terms of investment in reporting systems.

Firms also need to identify a number of work streams, according to Gartner's Weiss, such as data quality, data governance, enterprise architecture, re-engineering for a full audit trail and sign-off procedures from, for example, the actuarial departments. 

However, IT costs account for less than might be expected, with Weiss noting that they account for around one-third of the total Solvency II costs. Some forward-looking firms are also spreading the costs across other departments with the intent to reuse tools in a wider context once the initial stage of Solvency II is complete.

This trend is particularly interesting when set against the terms in which Solvency II is usually discussed: beneath the concern over incomplete regulation lies the fact that Solvency II is not a project that will simply be finished in January 2014. 

Not only is there the expectation that Solvency III, or some other variation of the regulation, will follow hot on its predecessor’s heels with new data requirements, but many firms are not yet fully aware of how much work it will require for the day-to-day needs of Solvency II to be embedded in their business. 

The establishment of a common metadata model and further optimisation of processes are just two examples of future projects that will need to be considered, says Weiss. And it will be those firms who are looking beyond the "minimum compliance" approach that will be best placed to answer the future challenges and reap the competitive advantage that fuller preparation may offer.

