Analyst advice

Gain a single view of your customer

There are regulations and then there are regulations. For example, there is the Dangerous Dogs Act – but enough said about that. In practice, regulations that impact on IT tend to be of two types: those that you have to comply with simply because you have to, and those that represent some form of best practice that the organisation can, or should, exploit to improve its business processes.

In the former category are regulations such as anti-money laundering (AML) legislation. It can be argued that this represents social, legal or moral best practice, but for banks, treated purely as profit-making enterprises, AML actually hurts: it requires additional technology and processes (and therefore costs) and reduces their revenue-generating possibilities.


In the latter category are regulations such as Basel II, Basel III and Solvency II. These are potentially onerous in terms of compliance requirements, but they represent, or should represent, the sorts of things that banks and insurance companies ought to be doing anyway: it is difficult to argue that they should not know what their exposure and risk are, or that they should not have adequate capital to cover those risks.

Regulating the single customer view

In terms of the banking community there are two sets of regulations that are, or should be, exercising minds at present. One is Basel III (which is not discussed here) and the other, which is UK-specific but certainly represents best practice, is the FSCS/FSA (Financial Services Compensation Scheme/Financial Services Authority) regulation on the single customer view (SCV). In addition, there is the Independent Commission on Banking to consider, whose Issues Paper was published in September 2010 and which has been specifically asked to ‘consider the complex issue of separating retail and investment banking’.

In so far as SCV is concerned, the intention is that the FSCS can readily identify the potential compensation required in the event of the failure of a bank or other financial institution. The regulation mandates that a deposit taker (bank, building society and so forth), of which there are over 800 in the UK, must be able to provide SCV details to the authorities given 72 hours’ notice. To test the system, deposit takers will have to provide the authorities with 10,000 SCV records, or 10% of their customer base, whichever is lower, early in 2011.

An SCV is defined under the legislation as ‘a single, consistent view of an eligible claimant’s aggregated protected deposits …’. There is a standard format for providing the data, which will not concern us here, and nor will the fact that some data (such as national insurance number) is recommended for inclusion but not mandatory. Data, naturally, is expected to be accurate, but the regulations make allowance for the fact that, for example, you might call a customer on his or her last known phone number and get no response.

The problem, and what SCV is designed to resolve, is that many financial institutions take deposits from clients under multiple guises: a customer may have multiple deposits, of different types, with the same institution and institutions typically treat these as completely separate, with different departments and divisions each holding separate details about the customer’s name, contact details and so forth. The idea behind a single customer view is that you should be able to see a cohesive, accurate record of the customer’s details across all of these silos.
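To make that concrete, here is a deliberately simple sketch in Python (the field names are hypothetical and this is not the FSCS record format) of aggregating one customer’s protected deposits across product silos:

    from collections import defaultdict

    # Deposits as held in separate product silos, with inconsistent
    # renderings of the same customer's name
    deposits = [
        {"customer_id": "C001", "name": "J. Smith",   "product": "current account", "balance": 1200.00},
        {"customer_id": "C001", "name": "John Smith", "product": "fixed-term bond", "balance": 25000.00},
        {"customer_id": "C002", "name": "A. Patel",   "product": "savings account", "balance": 8400.00},
    ]

    # Aggregate protected deposits per customer to form one view
    aggregated = defaultdict(float)
    for d in deposits:
        aggregated[d["customer_id"]] += d["balance"]

    for customer, total in aggregated.items():
        print(f"{customer}: £{total:,.2f} in aggregated deposits")

In practice, of course, the hard part is that the silos rarely share a reliable common key such as the customer_id above; establishing that key across systems is precisely what the MDM and data quality work discussed below provides.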

This raises three questions: how should this be achieved in the first place, how should it be maintained on an ongoing basis and, given that you have to do this anyway, what business benefits can be derived from your investment? You may also want to consider whether it is worth investing slightly more to get significantly more business benefit.

SCV + BI = £

Let me start with the last of these: the potential benefit. First, banks will be forced to have a view of exactly how much money is deposited with them. I would bet a pound to a penny that most of them don’t know that right now, primarily because of data quality issues and the lack of a single customer view. And that, of course, will impinge on capital adequacy requirements. However, perhaps the most obvious advantage comes when you combine the single customer view with details of what products each customer has bought: add some relevant business intelligence capabilities and you are now in a better position to up-sell and cross-sell to that customer. And this applies to any industry, not just banks and building societies.
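At its simplest, that BI capability is a gap analysis between the product range and each customer’s holdings. The following is a deliberately naive sketch (product names and data structures are invented for the example; a real system would weight offers with propensity models), but the principle is no more complicated than this:

    customers = {"C001": "John Smith", "C002": "Anita Patel"}

    # What each customer already holds, visible only once an SCV exists
    holdings = {
        "C001": {"current account", "mortgage"},
        "C002": {"current account"},
    }

    product_range = {"current account", "savings account", "mortgage", "ISA"}

    # Products a customer does not yet hold are cross-selling candidates
    for cid, owned in holdings.items():
        gaps = product_range - owned
        print(f"{customers[cid]}: candidate offers -> {sorted(gaps)}")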

Unfortunately, it does not look as if most companies are thinking along these lines. According to a recent survey published by JWG, only 14% of organisations subject to SCV had recognised the strategic potential of having a single customer view and were planning to use it for marketing purposes. Of course, this means that those companies will gain a significant advantage over the bulk of organisations that plan to use SCV only for tactical purposes, and over the 29% that simply viewed it as a tick-box requirement and a cost.

Registry-style master data management

When it comes to implementing suitable technology to create a single customer view and then maintain it on an ongoing basis, the simplest and quickest solution will be to implement a registry-style master data management (MDM) solution. The way this works is that record keys and core attribute data are held centrally. Whenever an update (a change of address, say) is notified to one of the participating applications, that information is passed to the MDM registry, which in turn notifies all the other applications that hold data on the same customer, so that all of that customer’s data remains aligned. There are various ways that this information can be synchronised, ranging from real-time to batch-based systems.
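The following is a minimal sketch of the registry pattern, assuming a simple register-and-notify design; the class and method names are illustrative rather than any vendor’s actual API:

    class MDMRegistry:
        def __init__(self):
            # master key -> applications holding records for that customer
            self.subscribers = {}

        def register(self, master_key, application):
            """An application declares that it holds this customer."""
            self.subscribers.setdefault(master_key, []).append(application)

        def notify_update(self, master_key, source, attribute, value):
            """One application reports a change; all the others are told."""
            for app in self.subscribers.get(master_key, []):
                if app is not source:
                    app.apply_update(master_key, attribute, value)

    class Application:
        def __init__(self, name):
            self.name = name
            self.records = {}  # local copy, keyed by master key

        def apply_update(self, master_key, attribute, value):
            self.records.setdefault(master_key, {})[attribute] = value
            print(f"{self.name}: {master_key}.{attribute} -> {value}")

    registry = MDMRegistry()
    savings = Application("savings")
    mortgages = Application("mortgages")
    registry.register("CUST-42", savings)
    registry.register("CUST-42", mortgages)
    # A change of address reported by the savings system reaches all others
    registry.notify_update("CUST-42", savings, "address", "1 New Street, Leeds")

In this sketch propagation is immediate, corresponding to the real-time end of the spectrum; a batch-based variant would queue the updates and apply them on a schedule instead.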

The advantage of this registry style is that it can be implemented very quickly. I have known systems to be in place within six weeks, which means that if you are a late starter on SCV (and over 40% of respondents in JWG’s survey were unaware of the regulation) then it is still possible to catch up. Of course, you will also have to do some de-duplication of records and data cleansing in order to ensure accuracy. You could do this manually, but I would recommend the use of data quality tools, not only because I think they are more cost effective but also because, left to its own devices, data quality deteriorates over time. Making sure that you have good data quality is not a one-time job, and ongoing monitoring and remediation of data is much easier using automated tools. Both MDM and data quality are also, of course, crucial parts of data governance.
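By way of illustration, and with the caveat that commercial data quality tools use far more sophisticated phonetic, probabilistic and address-aware matching, a crude duplicate detector can be written with nothing more than the Python standard library:

    from difflib import SequenceMatcher
    from itertools import combinations

    def similarity(a: str, b: str) -> float:
        """String similarity in [0, 1], case-insensitive."""
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    records = [
        ("R1", "John Smith, 4 High St"),
        ("R2", "J. Smith, 4 High Street"),
        ("R3", "Anita Patel, 22 Mill Lane"),
    ]

    THRESHOLD = 0.7  # tune against a sample of known duplicates
    # Naive pairwise comparison; real tools block/index to avoid O(n^2)
    for (id_a, rec_a), (id_b, rec_b) in combinations(records, 2):
        score = similarity(rec_a, rec_b)
        if score >= THRESHOLD:
            print(f"Possible duplicate: {id_a} / {id_b} (score {score:.2f})")

Flagged pairs would then go to a steward for review or be merged automatically above a stricter threshold; it is this ongoing monitoring that automated tools make practical.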

The other thing to bear in mind is the Independent Commission on Banking’s recommendations. These will not be known until September 2011, though responses to the Issues Paper have been published, and it is widely expected that the commission will recommend the so-called ‘HSBC model’, in which the organisation is structured along divisional and/or geographic lines, with each division effectively acting as a single entity. If this happens then it may be necessary, or advantageous for marketing or other reasons, to be able to get a single customer view across these divisions. This should be allowed for in the design needed to meet the current SCV requirements, particularly if you intend to migrate from a registry to a hub-based approach to MDM in the future.

In conclusion, SCV represents an opportunity for better data governance and improved leverage of customer information. The fact that a significant number of companies do not seem to see it that way is disappointing, and they will be left behind by competitors that do see this value. Indeed, it is precisely those organisations that see SCV as simply a box to be ticked whose customers are most likely to need the FSCS’s services in the future.

Philip Howard is a research director focused on data management for Bloor Research. He tracks technologies and processes such as databases, data integration, data quality and master data management. Howard has worked as a Bloor analyst since 1992; he also writes frequently for IT publications and websites and is a regular speaker at conferences and other industry events.



This was first published in February 2011
