When the credit crunch wiped billions off the balance sheets and share values of financial services companies, the hunt for a scapegoat began. Leaders in the financial services sector surely considered pointing the finger at unreliable financial risk data generated by their sophisticated software systems.
"If they could have blamed IT, they would have," says Bob McDowall, senior analyst at the Tower Group, a specialist research firm in financial services and IT.
But blaming IT was never going to let management and policy makers off the hook. "There have been no allegations that computer systems did not do what they were programmed to do. This was not a malfunction, but there was a lack of qualitative judgement," McDowall says.
This does not mean that IT will escape a review of policy and practice following the collapse of credit markets and the subsequent liquidity crisis. In 2004, the Basel Committee on Banking Supervision introduced an international accord, often called Basel II, to help banks manage financial risk better. This was a response, in part, to the collapse of Barings Bank in 1995. To comply with the accord, banks invested millions in IT systems to gather data quickly enough to make their judgement of financial risk meaningful.
Clearly, the credit crunch shows that the Basel II plan to ensure banks accurately quantify and properly manage operational risk and credit risk has its shortcomings. "The whole thing has been undone by the credit crunch," McDowall says.
A review of Basel II is already under way in response, and IT departments in banks and other financial institutions will need to be adapted to support the amended policy, he says.
With the cost of the credit crunch to the world's banking system put at about £500bn by the International Monetary Fund, the stakes are high. The Basel committee's consultation on new regulation will finish in June, and new guidance on the management of financial and operational risk in banks will be issued next year.
Significant IT investment will be needed to support this guidance, McDowall says, because Basel II left liquidity out of its risk-management equations on financial markets. Liquid assets are investments that banks can readily convert to cash without losing value.
The problem is that although banks can measure the liquidity of the market in the recent past, forecasting the liquidity of a financial institution is more difficult.
More sophisticated software models are necessary to capture market behaviour and assess market players' attitude to liquidity, McDowall says.
"Liquidity is the most emotional of risks," he says. "You need to capture a lot of behavioural data, such as all the bids on offer in the market.
"There are a number of software providers looking at this, as they see a commercial opportunity. If you look at emotional measures direct from pricing patterns before a crisis takes hold, you can see the warning signs. Banks need to strengthen the applications that monitor their liquidity."
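One way to read McDowall's point about warning signs in pricing patterns is to monitor bid-ask spreads, which often widen sharply as market liquidity drains away. The sketch below is a hypothetical illustration, not any vendor's product: it flags days where the spread jumps well above its recent trailing average.

```python
# A minimal sketch of one liquidity warning signal: flag days where the
# bid-ask spread jumps far above its trailing average. Data and the
# alert threshold are illustrative assumptions.
from statistics import mean, stdev

def spread_alerts(spreads, window=20, threshold=3.0):
    """Return indices where the spread exceeds the trailing mean
    by more than `threshold` standard deviations."""
    alerts = []
    for i in range(window, len(spreads)):
        trailing = spreads[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        if sigma > 0 and (spreads[i] - mu) / sigma > threshold:
            alerts.append(i)
    return alerts

# Stable spreads for 30 days, then a sudden widening on day 30
history = [0.02 + 0.001 * (i % 5) for i in range(30)] + [0.09]
print(spread_alerts(history))  # → [30]
```

A production system would of course draw on far richer behavioural data, such as the full order book McDowall mentions, but the monitoring principle is the same.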
Mark Elkins, strategy manager in financial services for business intelligence software provider SAS, says much of the technology already used in fraud detection could be applicable to these problems. "Behaviour analysis is already used to detect insider trading and market abuse. These systems model the expected behaviour of each particular trader. These ideas could be applied to market liquidity."
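The per-trader profiling Elkins describes can be sketched in a few lines: build a statistical baseline of each trader's normal activity, then flag orders that deviate sharply from it. Trader IDs, order sizes and the deviation threshold below are illustrative assumptions, not SAS's method.

```python
# A sketch of per-trader behavioural profiling: build a baseline per
# trader, then flag orders far outside that trader's norm.
from collections import defaultdict
from statistics import mean, stdev

def build_baselines(orders):
    """orders: list of (trader_id, order_size) tuples.
    Returns {trader_id: (mean_size, stdev_size)}."""
    by_trader = defaultdict(list)
    for trader, size in orders:
        by_trader[trader].append(size)
    return {t: (mean(s), stdev(s)) for t, s in by_trader.items() if len(s) > 1}

def flag_anomalies(orders, baselines, threshold=4.0):
    """Return orders more than `threshold` std devs from the trader's norm."""
    flagged = []
    for trader, size in orders:
        if trader in baselines:
            mu, sigma = baselines[trader]
            if sigma > 0 and abs(size - mu) / sigma > threshold:
                flagged.append((trader, size))
    return flagged

history = [("T1", s) for s in (100, 110, 95, 105, 90)]
baselines = build_baselines(history)
print(flag_anomalies([("T1", 5000)], baselines))  # → [('T1', 5000)]
```

Applying the same idea to market liquidity would mean profiling the expected bidding behaviour of market participants rather than individual traders.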
A report by the Financial Stability Forum - which is made up of national banks from around the world, regulators and international financial institutions - shows what banks can expect from the revamped Basel II regulations. IT systems will be required to help banks comply in several ways.
Banks will need to disclose their securitisation exposures, particularly exposures held in the trading book and related to re-securitisation. Essentially, securitisation is the process of repackaging debt into other financial products - a process that many blame for the spread of bad debt from the sub-prime mortgage market in the US throughout the world's financial markets. Software tools will be necessary to help banks "stress test" various scenarios across their securities assets to ensure they are not taking too much risk.
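At its core, this kind of stress testing means applying hypothetical loss scenarios to a book of positions and checking whether capital reserves could absorb the result. The sketch below is a simplified illustration with made-up asset classes, shock factors and capital buffer, not regulatory parameters.

```python
# A minimal sketch of scenario stress testing across a book of
# securitised assets. Asset names, shock factors and the capital
# buffer are hypothetical illustrations.
def stress_test(positions, scenarios, capital):
    """positions: {asset: market_value}; scenarios: {name: {asset: shock}},
    where shock is the fractional loss applied to that asset class.
    Returns the scenarios whose total loss exceeds the capital buffer."""
    breaches = {}
    for name, shocks in scenarios.items():
        loss = sum(value * shocks.get(asset, 0.0)
                   for asset, value in positions.items())
        if loss > capital:
            breaches[name] = round(loss, 2)
    return breaches

book = {"rmbs": 400.0, "cdo": 250.0, "corporate_bonds": 350.0}
scenarios = {
    "mild_downturn": {"rmbs": 0.05, "cdo": 0.10},
    "subprime_collapse": {"rmbs": 0.40, "cdo": 0.60, "corporate_bonds": 0.10},
}
print(stress_test(book, scenarios, capital=100.0))  # → {'subprime_collapse': 345.0}
```

Real stress tests model correlated losses across asset classes rather than independent shocks, which is where the serious software investment lies.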
Computer testing will also become more important in managing liquidity risk. In May, the Financial Services Authority (FSA) found banks were reviewing their stress testing scenarios and contingency funding plans in line with lessons learnt over the past year, according to its industry feedback on liquidity requirements.
Greater requirement for stress testing of liquidity and financial instruments is also likely to extend to the insurance industry. Speaking at the Institute of Economic Affairs' Future of Life Assurance conference in May, FSA director and insurance sector leader Sarah Wilson said product providers had to consider whether their models for stress testing against market and credit risk were up to scratch.
The authority said it would consult further on all aspects of the new regime later this year, including setting out proposals on sound practices for managing liquidity risk with a strong focus on stress-testing. These enhanced qualitative requirements would reflect the work currently under way in the Basel committee and would form the centrepiece of the new liquidity policy, it said.
Applications developers and business analysts within banks will need to work with line-of-business management to ensure effective systems are in place to meet these requirements for stress testing. Although infrastructure staff will be less involved in the development phase, they will need to be ready for application roll-out, McDowall says.
Although meeting more stringent regulations following the credit crunch will require significant IT investment, it will not be an exercise in ripping and replacing applications, according to John Eggleston, IT director of CallCredit, a credit reference agency that provides information services to banks.
"From a technology point of view, there are a lot of legacy systems you can adapt to a new model using a service oriented architecture. This more modular approach to architecture will help keep costs down," Eggleston says.
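The modular approach Eggleston describes typically means putting a stable service interface in front of a legacy system, so new risk applications can call it without replacing it. The sketch below is a generic adapter-style illustration with invented names, not CallCredit's architecture.

```python
# A sketch of wrapping a legacy calculation behind a service facade:
# modern callers pass structured data; the adapter converts it into
# the fixed-width record the old system expects. All names are
# illustrative assumptions.
def legacy_credit_score(raw_record: str) -> int:
    """Stand-in for an existing system that reads a fixed-width record;
    here the last four characters hold the score."""
    return int(raw_record[-4:])

class CreditScoreService:
    """Service facade over the legacy system. In a real SOA deployment
    this interface would be exposed over the network."""
    def score(self, customer_id: str, balance: int) -> int:
        record = f"{customer_id:<8}{balance:08d}"
        return legacy_credit_score(record)

svc = CreditScoreService()
print(svc.score("AB123", 420))  # → 420
```

The payoff is that the legacy system's data formats stay hidden behind one interface, so it can later be tuned or replaced without changing its callers.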
For IT professionals in banks and other financial institutions, this is the time to prove they understand their business and get systems up and running to comply with the wave of legislation that is expected in response to the credit crunch, McDowall says. "They need to rise to the challenge if this is going to succeed. In the end, this is going to save their jobs."
Box: How IT has supported legislation in financial services
In November 2007, the Markets in Financial Instruments Directive was introduced. This EU legislation required technology investment to support new standards in data storage, security and transmission.
One aspect of Mifid is the "best execution" rule. Under this rule, financial institutions must store, and be able to retrieve, information attesting to best execution in share transactions - trades that take place in milliseconds but need to be capable of being reconstructed for many years to come.
The New Basel Capital Accord, dubbed Basel II, aims to make banks' assessments of their loans and investments more sensitive to risk, reflecting technological developments in global markets. The accord, which came into force by the end of 2006, required IT directors to link a maze of banking databases and reporting systems, update older applications and ensure information in systems is accurate. Estimates of global banks' spending on IT to comply with the new legislation vary between £20m and £100m.
In July 2006, the deadline passed for companies based outside the US to comply with the US Sarbanes-Oxley law.
All foreign companies capitalised at more than £75m and dealing with the US came under the rules, which require them to report on internal accounting controls and highlight any potential flaws.
Sarbanes-Oxley was brought into force in the US during 2002 in response to high-profile financial scandals such as Enron and Worldcom.
The legislation was imposed to protect shareholders and the general public from accounting errors and fraudulent practices in the enterprise.
The reach of the US law is global, and it affects all European enterprises with transatlantic operations or partners.
As part of the act, section 404 requires a management assessment of internal controls within a company's annual reporting, providing a statement on the responsibility for internal controls and demonstrating that these controls are adequate for accurate and complete financial reporting.
Case studies: AXA Bank and Dresdner Bank
SAS Credit Risk Management for Banking is intended to assess and report the risk of potential credit losses and calculate the capital reserves required to adequately cover that risk.
Belgium-based AXA Bank uses SAS to improve its risk management. With the software, the firm has optimised its workflow and is complying with legislation such as Basel II.
Dresdner Bank has implemented SAS's Risk Intelligence Architecture to help assess its risks when dealing in shares, bonds, foreign exchange, derivative financial instruments and other products that are traded on the international financial markets. The SAS risk data warehouse offers users an extensive system of modules that calculate statistical data and co-ordinate the necessary processes for seeking out errors, weak points and potential for improvement.
The SAS risk intelligence architecture integrates the various market data delivery systems and permits the statistical analysis team to implement new methods.