When Admiral Horatio Nelson died he was placed in a barrel of brandy to preserve his corpse for the journey home. During the voyage, however, some miscreant drilled a tiny hole in the barrel, inserted a device known as a "monkey straw" and purloined the liquor - leaving the distinguished admiral quite literally high and dry.
The moral of this story is, of course, that workplace crime can be very subtle and its effects not immediately obvious.
In February 1995, the merchant bank Barings discovered this to its cost. The bank's sudden and utter collapse was primarily due to the unauthorised activities of its star futures trader, Nick Leeson.
Leeson managed to deceive Barings for almost three years, reporting fictitious profits while concealing massive losses. Yet the Barings debacle had almost nothing to do with the murky world of derivatives trading. At the heart of the deception lay a failure of systems security.
Soon after Leeson arrived in Singapore in July 1992, he instructed a computer clerk to create an error account, number 88888. Leeson then instructed a systems engineer to amend the software in order to suppress account 88888 from reports to London.
The stage was now set. Leeson's job involved executing orders for colleagues based in Japan, and he appeared remarkably successful in obtaining discounted prices. In fact, he was deliberately mis-pricing trades and hiding the losses in account 88888. His apparent success enabled him to move from execution to trading in his own right.
From September 1992, Leeson began selling options without authority in order to recoup his losses. An option gives another party the right, but not the obligation, to buy or sell a given quantity at a fixed price at some date in the future, in return for payment of a premium - for example, the right to buy 1,000 apples at 10p each in nine months' time.
Selling options is a highly risky activity because if the price of apples rises above the agreed price, the seller's loss may far outstrip the premium received.
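The asymmetry can be shown with a few lines of arithmetic. The numbers below are invented purely to continue the apples illustration - the premium and prices are assumptions, not figures from the Barings case:

```python
# Illustrative numbers only: the seller of a call option on 1,000 apples
# collects a premium up front, but is exposed without limit if the price
# rises above the agreed (strike) price.

def short_call_pnl(strike, premium, spot, quantity):
    """Seller's profit or loss, in pence, on a call option at expiry."""
    payout = max(spot - strike, 0) * quantity  # amount owed to the buyer
    return premium - payout

premium = 20.0   # pence received up front for the whole contract (assumed)
quantity = 1000  # apples

# If apples stay at or below the 10p strike, the seller keeps the premium...
print(short_call_pnl(strike=10, premium=premium, spot=9, quantity=quantity))   # 20.0
# ...but if the price doubles, the loss dwarfs the premium received.
print(short_call_pnl(strike=10, premium=premium, spot=20, quantity=quantity))  # -9980.0
```

Because the premium is pocketed immediately while the potential payout arrives later, a seller can look profitable right up until the market moves against him.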
This is precisely what happened to Leeson. Leeson, however, hid the losses in account 88888 and sold more options in increasingly desperate attempts to retrieve the situation. Since the premium was booked as profit, Leeson appeared increasingly successful, while in reality exposing Barings to huge risk.
Security frequently involves designing systems of control to prevent unauthorised access. Barings' collapse suggests that organisations may have more to fear from authorised users apparently going about their daily business.
Lack of supervision
Protecting systems from insider abuse is difficult because organisations require control and flexibility.
In Barings' case, however, the problem was deeper, and can be traced to a weak control culture. More specifically, the fault line in Barings was that Leeson controlled both the front office, where trading was conducted, and the back office, where the documentation was processed.
The lack of segregation enabled Leeson to conceal his activities by adjusting prices, switching funds and so on. Moreover, since Leeson was virtually unsupervised there was no one close enough to see what he was up to.
Risk assessment typically concentrates on control points with greatest vulnerability and potential loss. In other words, like some imaginary Maginot Line, organisational defences are pointed in the direction from where an attack seems most probable. The defences may be impregnable, but that is irrelevant.
As the Barings' debacle shows, miscreants simply find another route. Brokers use error accounts to process mistakes made during trading. For example, if a contract to buy is wrongly executed as a contract to sell, the customer is made good and the company stands the loss. Such accounts seldom contain more than a dozen transactions and the amounts involved are relatively minuscule.
It took just three keystrokes to create account 88888, and a routine instruction to suppress it from reports to London. The lesson is that the least consequential parts of the system are potentially highly vulnerable to abuse precisely because they are unguarded.
The Barings case demonstrates how systems of control can undermine themselves, a phenomenon known as the "paradox of consequences". Ironically, Barings' processing controls worked perfectly. Despite Leeson's efforts, the transactions booked to account 88888 were transmitted to London.
However, because the data failed to meet edit criteria - the contracts could not be matched to existing account numbers - the system rejected them into a suspense file. The suspense file was noticed only after the bank collapsed.
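The mechanism can be sketched in a few lines. This is a hypothetical reconstruction, not Barings' actual software - the account numbers other than 88888 and the record layout are invented for illustration:

```python
# Hypothetical sketch of an edit check: transactions whose account numbers
# cannot be matched against the known ledger are not silently dropped -
# they are diverted to a suspense file that someone must then review.

known_accounts = {"10001", "10002", "20501"}  # invented account numbers

transactions = [
    {"account": "10001", "amount": 500},
    {"account": "88888", "amount": -2_000_000},  # the suppressed error account
]

ledger, suspense = [], []
for tx in transactions:
    if tx["account"] in known_accounts:
        ledger.append(tx)        # passes the edit criteria
    else:
        suspense.append(tx)      # rejected into the suspense file

# The control "works": the anomaly is captured and preserved. But if no one
# audits the suspense file, the evidence simply sits unread.
print(len(suspense))  # 1
```

The design point is that a rejection file is only a control if it has an owner; a suspense file nobody reads converts evidence of fraud into inert "garbage".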
In part, it is the old story: the suspense file should have been audited regularly. It is more important, however, to understand why Barings' management was so incurious about the information. Fraud invariably generates evidence of its existence. The Barings case suggests that although such evidence may be staring managers in the face, it may be ignored because it makes "no sense". Untimely and inappropriate output, as well as inaccuracies, indicate weaknesses in the system - consider the piles of computer printout marked "garbage" gathering dust in office corners.
An alternative hypothesis is that "garbage" represents outcrops of fraud. To neglect such clues is to behave like the drunk who looks for his car keys not where he dropped them, but under the lamp post because the light is good.
Flawed risk system
Behind one paradox lies another. Barings possessed a "state of the art" risk management system. It was fatally flawed, however, because it depended on information fed by Leeson. Consequently, the system suggested "all clear" when the reverse was true.
In January 1995, market rumour began contradicting Barings' numerical data. During the next seven weeks the rumours became more specific. Reputable investment banks began warning customers to be careful about using Barings as a counter-party.
Had Barings investigated whether there was any substance in these rumours, Leeson's activities might have been exposed before his losses became catastrophic. Instead, since some exchanges published trading positions and others did not, Barings assumed that the market was seeing only half of the equation.
Barings' reaction to market rumour highlights the risk of relying on a single form of information technology. Computer-generated data, even if factually accurate, is not reality. Rather it depicts reality in a particular way, just as an anatomical sketch captures certain features of a human body while missing others. The point is, to see something one way is not to see it another. Rumour can reveal features of a situation that are suppressed by computer-generated data.
When systems are breached the instinctive reaction is to tighten security. Quite apart from destroying essential flexibility, this may only make the problem worse by creating an illusory sense of control.
Ultimately, system security depends not on having the most elaborate and rigorous controls, but on developing a feel for the limits of those controls and a willingness to look beyond them.
This was first published in September 2000