User acceptance testing needs real training, not just a short course
In spite of its importance, user acceptance testing (UAT) is often chaotic, problematic and ineffective
Developing information systems is a high-risk undertaking, and user acceptance testing (UAT) is the backstop that can avoid disasters.
Not convinced? In June 2012, the interruption to NatWest and RBS payments following a routine update to software cost an eye-watering £170m to fix.
Similar disasters, which might have been avoided by thorough UAT, have been experienced by the Securities and Exchange Commission, the UK Passport Agency, the London Ambulance Service, Heathrow Terminal 5 and Microsoft Xbox 360.
Chaotic, problematic and ineffective
In spite of its importance, UAT is usually chaotic, problematic and ineffective. Often treated as an intuitive process – an exercise in which users and subject matter experts "have a go at trying to break the system" – it is actually a process that requires careful preparation.
It is vital that UAT is carried out by users, but subject matter expertise alone is not enough: merely being a user does not make someone an effective tester.
Why you need custom training
To become an effective UAT team, users need custom-built training – not a run-of-the-mill course on UAT, but training designed specifically to address the complex demands placed on users deployed as user acceptance testers.
Users have a unique perspective that is vital in user acceptance testing, and UAT training has to turn a group of users into a functioning team in which all members understand and take ownership of the quality of the system they test.
Users should have a thorough understanding of the business objectives and should feel comfortable carrying out all the tasks they are expected to complete, including writing requirements, test conditions, test cases and test scripts.
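To illustrate the artefacts users are asked to produce, here is a minimal sketch of the relationship between a test condition (what to verify), a test case (concrete data and an expected result) and a test script (the ordered steps a tester follows). The structures, requirement ID and payment scenario are hypothetical examples, not part of any real UAT toolset.

```python
from dataclasses import dataclass, field

@dataclass
class TestCondition:
    requirement_id: str   # links back to the business requirement (hypothetical ID)
    description: str      # what must be verified

@dataclass
class TestCase:
    condition: TestCondition
    inputs: dict          # concrete data that exercises the condition
    expected_result: str  # what the user should observe

@dataclass
class TestScript:
    test_case: TestCase
    steps: list = field(default_factory=list)  # ordered manual steps for the tester

# Example: a business rule that large payments need approval
condition = TestCondition("REQ-042", "Payments over £10,000 require approval")
case = TestCase(condition, {"amount": 15000}, "Payment held for approval")
script = TestScript(case, [
    "Log in as a standard user",
    "Create a payment of £15,000",
    "Submit and check the payment status",
])

print(script.test_case.expected_result)  # Payment held for approval
```

Keeping an explicit link from every script back to a requirement is what lets users demonstrate that the business objectives, not just the software's behaviour, have been tested.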
The key is incremental learning and development: supplementing short classroom courses with continued skills development within the project team, supported by more experienced colleagues.
Developing an effective UAT team
What is required is a mix of training content and a process that supports users through the period of their learning, which will last for the length of the project and beyond.
This support strategy, together with the quality of the training itself, underpins the confidence and skills that make UAT effective. Users who are able to consult on, create content for and carry out UAT relieve some of the common time, budget and confidence pressures felt by any project so close to implementation.
A properly thought out schedule of training will avoid the problems that so often beset UAT:
- Reliance on UAT training alone carries the risk that the training is not embedded by applying newly acquired UAT skills ‘back at the desk’;
- Reliance on ad hoc on-the-job learning alone carries the risk that the skill level of the user is not up to the task, or that users learn bad habits on the job;
- No training or on-the-job learning (relying on the skills of business analysts and testers) carries the risk that UAT is not relevant enough to the needs of the business.
An effective UAT training strategy:
- Provides a thorough understanding of the UAT process;
- Prepares the participants for the tasks they will have to carry out;
- Promotes team formation and sets out roles and responsibilities.
Using a considered approach to the learning and development needs of the UAT team will help to prevent the project falling at the very last hurdle.
Brian Hambling and Pauline van Goethem have nearly 60 years' combined experience in the IT industry in a wide variety of development, testing and project management roles. Brian has been chair of the Software Testing Examination Board at ISEB and an examiner at the ISTQB. Pauline is a member of the ISTQB Glossary review team.