Ofqual blames flawed software testing for A-level grading errors

Jenny Williams

Examinations regulator Ofqual has highlighted three IT problems with the marking system that caused thousands of students to receive incorrect GCSE and A-level marks in 2010.

A report from Ofqual identified shortcomings in the way the examination board, the Assessment and Qualifications Alliance (AQA), dealt with project management, user acceptance testing (UAT) and software training for its onscreen marking system, which is used by GCSE and A-level examiners.

The marking system was extended to support longer written answers to exam questions. However, the extension project did not adhere to project management best practice, according to the regulator.

The failure in the system, which was used to mark 3.3 million GCSE and A-level exam papers in the summer of 2010, resulted in 3,353 students receiving incorrect marks and 622 being issued with incorrect qualification grades.

"AQA could have identified the failure earlier if more effective risk assessment and arrangements for handling and reporting problems concerning the onscreen marking of scripts had been in place," said Ofqual.

Isabel Nisbet, chief executive of Ofqual, said: "Factors that contributed to the marking error included limited piloting of the on-screen marking system, a lack of effective risk assessments and deficiencies in the role and training of examiners on the system."

Ofqual also says AQA did not treat the extension of the system as a new project, which meant the Prince2 (Projects In Controlled Environments, version two) project management method was not used, and project managers and business analysts were not assigned to the work.

The regulator's report outlines a "lack of rigour" around user acceptance testing.

In the report, Ofqual states that the absence of a proper user acceptance testing process meant some likely process errors were not picked up before the system was used in a live marking environment.

"The onscreen marking software for the June 2010 examinations was released later than expected. Testing was undertaken by IT staff rather than the end users. The testing focused on the technical functionality of the marking software rather than looking at the whole process," said the report.

