At the same time, horror stories about loss of business resulting from application downtime are becoming legion. There is no brand loyalty on the web; if your site goes down, you're likely to lose customers forever.
As Mercury Interactive European field marketing manager Kevin Francis puts it: 'A lot of people are moving their whole revenue stream to an e-commerce front end, and that makes you very vulnerable to problems. When that happens, the end revenue suffers; when people have a bad experience with a web site, it is very difficult to get them back.'
How does prudent management reconcile these two pressures? You need to get up and running quickly or you lose your competitive edge. But if you get there too quickly, you lose customers in a different way. One answer is the unglamorous route of exhaustive application testing.
Historically, application software testing was done manually by in-house staff. But software has become steadily more complex. First, we moved from batch to online. Then GUIs took over from green screens. Then came two- and three-tier applications with the client-server revolution. Finally, application development started to become object-oriented.
Each of these technology developments increased the complexity of the testing process; collectively, they have provided both a need and a justification for ever more sophisticated automated testing tools.
According to John Watkins, Rational Software's UK product manager for testing: 'With the early tools, you could record something and play it back, recording some typical activities. You could then replay the test against later versions of the software. That was very coordinate-based - you picked up the coordinates where you clicked on the screen. That very quickly became useless, as objects tend to move around the screen.
'Modern testing is GUI-oriented. Controlling the complexity is the biggest deal - 76 per cent of development projects fail to come in on time, and to acceptable quality'.
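The shift Watkins describes, from coordinate-based playback to object-based recognition, can be sketched in a few lines of Python. The `Screen` and `Widget` classes here are hypothetical stand-ins, not any vendor's API; the point is simply that a script keyed to pixel coordinates breaks as soon as a control moves, while one keyed to a stable object identifier survives the change.

```python
# Minimal illustration of coordinate-based vs object-based test playback.
# All names are invented for illustration; no real testing tool is implied.

class Widget:
    def __init__(self, name, x, y):
        self.name, self.x, self.y = name, x, y
        self.clicked = False

class Screen:
    def __init__(self, widgets):
        self.widgets = widgets

    def click_at(self, x, y):
        # Coordinate-based playback: click whatever sits at (x, y).
        for w in self.widgets:
            if (w.x, w.y) == (x, y):
                w.clicked = True
                return True
        return False                      # nothing there any more: script breaks

    def click_object(self, name):
        # Object-based playback: find the control by identity, wherever it is.
        for w in self.widgets:
            if w.name == name:
                w.clicked = True
                return True
        return False

# Version 1 of the application: the OK button sits at (100, 200).
v1 = Screen([Widget("ok_button", 100, 200)])
assert v1.click_at(100, 200)              # recorded coordinate script passes

# Version 2: the same button has moved to (150, 250).
v2 = Screen([Widget("ok_button", 150, 250)])
assert not v2.click_at(100, 200)          # coordinate script now fails
assert v2.click_object("ok_button")       # object-based script still works
```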
Tools help to overcome this problem. As Original Software managing director Colin Armitage puts it: 'In an ideal world you would just press a button and be told "right" or "wrong". We are a lot closer to that than we were.'
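Armitage's push-button ideal is essentially baseline comparison: capture the output of a known-good run once, then have the tool diff every later run against it and report pass or fail. A minimal sketch in Python follows; the function names and the toy tax calculation are invented for illustration, not any vendor's product.

```python
def capture_baseline(app, inputs):
    """Run the known-good version once and keep its outputs as the baseline."""
    return [app(i) for i in inputs]

def regression_check(app, inputs, baseline):
    """Replay the same inputs against a new version: 'right' or 'wrong'."""
    mismatches = [(i, got, want)
                  for i, got, want in zip(inputs, (app(i) for i in inputs), baseline)
                  if got != want]
    return ("right", []) if not mismatches else ("wrong", mismatches)

# A toy 'application': version 1 computes an order total with 10% tax.
def app_v1(net): return round(net * 1.10, 2)   # known-good version
def app_v2(net): return int(net * 1.10)        # new version with a rounding bug

inputs = [10.0, 19.99, 250.0]
baseline = capture_baseline(app_v1, inputs)

verdict, diffs = regression_check(app_v2, inputs, baseline)
# verdict is "wrong": 19.99 now totals 21 instead of 21.99
```

The value of automating this is that the comparison is exhaustive and repeatable: every later build gets the same button-press verdict against the same baseline.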
Testing tools started to emerge around 20 years ago, but remained a small niche market until recently. The need for exhaustive Year 2000 compliance testing provided a major stimulus, because the scope of these projects was so great, covering the organisation's entire application portfolio. If any other work was to be done at all, the Year 2000 testing workload had to be constrained to a minimum, and the only way to do that without cutting corners was by automation.
The web is now providing a further stimulus. Rational's John Watkins observes that 'People are still facing exactly the same traditional issues; the big difference is that the timescales are much tighter. People are now talking about the concept of "e-time".'
Mercury Interactive's Kevin Francis agrees. 'In the past, people have produced new releases of application software every 12 to 15 months; in the e-commerce world people are doing that every six to eight weeks. People have been telling me they haven't got time to test.'
Original Software's Colin Armitage says: 'It's not just about having an attractive web site. Once you have, for example, ordered a book from amazon.com, it has to go through order processing and credit control, and there has to be a picking note, and you're not going to be happy if that process takes a month.' In other words, a web application has to be fully integrated with several existing applications if it is to work well.
There may be other elements involved in a Web application as well. As Mercury Interactive's Kevin Francis says: 'In an e-commerce environment there are a lot of components you are trying to put together. There is a complicated network infrastructure, a web server, possibly also a web application server, a database server, and maybe a legacy ERP system. We are entering into a new IT world. You now have to have an end-to-end business framework.'
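The end-to-end testing Armitage and Francis describe can be sketched as a single check that pushes one order through every stage of the chain. The stage names below echo those mentioned in the article, but the implementations are toy stand-ins, not a real web server, ERP system, or warehouse application.

```python
# Hypothetical end-to-end check: follow one order through stubbed stages.

def web_front_end(item):      return {"item": item, "status": "ordered"}
def order_processing(order):  return {**order, "status": "accepted"}
def credit_control(order):    return {**order, "status": "approved"}
def picking(order):           return {**order, "status": "picked"}

def end_to_end_test(item):
    """Push an order through every stage and return where it ends up."""
    order = web_front_end(item)
    for stage in (order_processing, credit_control, picking):
        order = stage(order)
    return order["status"]

assert end_to_end_test("book") == "picked"   # the whole chain must hold
```

The point of testing the chain rather than each component is that a failure at any hand-off, however sound the individual pieces, still loses the order.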
And commercial pressures mean that development and integration have to be done in a hurry. According to Rational's John Watkins, e-commerce developments have 'the Ready-Fire-Aim philosophy of development. Timescales have been squeezed a lot by iteration and prototyping. E-development is worse because timescales are tighter... The difficulty for testing is the speed at which things have to be delivered'.
Scalability is also an issue. 'A good marketing campaign can break an e-commerce environment', notes Mercury's Kevin Francis. Suddenly you can have tens, hundreds, or even thousands of times more web site hits than you had yesterday. The peaks and troughs are potentially much further apart than with traditional business-to-business transaction processing.
What is more, the scalability problems that occur may not be your responsibility. Francis observes: 'We are finding that the bottleneck does occur at the ISP level'. But your business will suffer all the same.
Rational's John Watkins agrees. 'Scalability is a problem. It is impossible to manually demonstrate scalability; there is no choice but to use tools.'
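The arithmetic behind Watkins' point is simple queueing: when a burst of hits exceeds what the server can drain before visitors give up waiting, the surplus is lost custom, and only a tool can generate such a burst repeatably. A deterministic back-of-envelope sketch in Python follows; the capacity and timeout figures are invented for illustration.

```python
def simulate_burst(burst_size, capacity_per_sec, timeout_sec):
    """All requests arrive at once; the server drains `capacity_per_sec`
    requests per second; anything still queued after `timeout_sec` seconds
    is abandoned by the visitor. Returns (served, lost)."""
    served = min(burst_size, capacity_per_sec * timeout_sec)
    return served, burst_size - served

# A quiet day: 100 simultaneous hits, server drains 50/sec, visitors wait 5s.
assert simulate_burst(100, 50, 5) == (100, 0)      # everyone is served

# The marketing campaign lands: 10,000 simultaneous hits on the same server.
served, lost = simulate_burst(10_000, 50, 5)
assert (served, lost) == (250, 9_750)              # most visitors are lost
```

A load-testing tool does the same thing empirically: it generates the synthetic burst against the real stack and measures where the drain rate actually sits.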
Rational Software was set up 19 years ago to market a range of testing tools called Apex for the Ada programming language. The company diversified into the Windows market five years ago with a product called SQA, which was developed by a company of the same name. Rational merged with SQA around a year later, and now concentrates on Windows and Unix-based testing tools.
End-to-end development

Rational's product line is based on its own methodology, the Rational Development Process. John Watkins describes it as 'a comprehensive, iteration-based software development and testing process. Within that comes the concept of an end-to-end development process. For each of the roles - analyst, developer, tester - there are specialist tools.'
For Rational Software, the AS/400 is classed as a mainframe; the SQA toolset supports both S/390s and AS/400s by terminal emulation on PCs. According to Watkins, this is 'not a perfect solution, but it's much better than the manual approach'.
Mercury Interactive is another company that has been in the testing tools market for a couple of decades, and for much of the earlier part of its life the AS/400, together with mainframes, provided the majority of its business. That was in the days when AS/400 application development was mainly carried out on 5250 monochrome screens, and Mercury still offers a complete set of 5250 environment testing tools.
Mercury now also provides tools for applications where the AS/400 is used as a back-end web server. The company offers four ranges of tools: for test management (under the name Test Director); for automated testing (Winrunner); for load testing (Loadrunner); and for post-deployment monitoring (Topaz).
Mercury is also selling to smaller companies 'which are new to testing'. According to Francis: 'The product range is called Astra. It is downloadable from the web; you can evaluate it free of charge; and you can purchase it over the web too.'
Market leader

Original Software contrasts with Rational and Mercury in two major respects: it is much younger, having entered the market with its TestBench400 product just two years ago, and it specialises exclusively in the AS/400. According to Colin Armitage, Original is now the market leader in this sector, with more than 80 customers worldwide.
'The majority of them are leading edge users, who are looking to get the maximum value out of their AS/400. We have also sold to some software houses.'
The company started 'with a blank sheet of paper', and this absence of historical baggage has allowed the development of a product tailored for today's needs.
According to Armitage: 'Checking what the screens look like is not the only thing you have to do. That is 5 per cent to 10 per cent of the total task of testing. There are 16 different areas of test activity; record and playback is just one.'
The testing tools vendors described in this article are all different. This reflects the fact that there is no simple answer to the testing requirement, and that every supplier is trying to hit a moving target. As John Watkins puts it: 'Technology is the nemesis of the tester. We're always playing catch-up. With object-oriented software you've got much more code to test much more quickly. The tester and the QA person will never get ahead of the game'.