Setting up a data quality management process can be a difficult task. Here are four best practices you can apply to improve data quality in your organization.
Conduct quality checks of the database: As part of the data quality checks, a second person should verify the database’s validity. Once these findings are addressed, the database is locked; locking signals that the required data quality has been achieved. The database is then passed on to the statistics department. Before this transfer, it is critical to perform outlier testing using simple statistical methods. This step reveals any values that fall outside the prescribed range. If such values exist, determine whether there is a justification for each one. Rough quality checks should then be performed again to confirm validity.
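The range-based outlier test described above can be sketched in a few lines. This is a minimal illustration, not the author's actual tooling; the field names and prescribed ranges are invented for the example.

```python
# Hypothetical sketch: flag values outside a prescribed range before
# the database is handed to statistics. Field names and limits below
# are illustrative assumptions, not from the article.
RANGES = {
    "systolic_bp": (90, 180),   # assumed prescribed range
    "heart_rate": (40, 120),
}

def find_outliers(records):
    """Return (record_index, field, value) for every out-of-range value."""
    outliers = []
    for i, record in enumerate(records):
        for field, (low, high) in RANGES.items():
            value = record.get(field)
            if value is not None and not (low <= value <= high):
                outliers.append((i, field, value))
    return outliers

records = [
    {"systolic_bp": 120, "heart_rate": 72},
    {"systolic_bp": 210, "heart_rate": 65},  # systolic_bp out of range
]
print(find_outliers(records))  # → [(1, 'systolic_bp', 210)]
```

Each flagged value would then be reviewed for a justification before the database is locked again.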
Review the performance metric: The performance metric is linked to the data entry operator’s performance and is critical to data quality management. Each operator is given a target of ‘x’ entries, along with a specified error rate and quality standards to adhere to. If a review finds erroneous data values, you can trace each original value back to the person who entered it. This lets you monitor each individual’s quality quotient at any given point in time.
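Tracking each operator against a specified error rate, as described above, amounts to a simple per-operator tally. A minimal sketch, assuming a 1% tolerance and invented operator names:

```python
# Hypothetical sketch: compute each operator's error rate from review
# findings and compare it with the specified tolerance. The 1% rate
# and operator names are illustrative assumptions.
from collections import Counter

ALLOWED_ERROR_RATE = 0.01  # assumed specified error rate

def operator_error_rates(entries):
    """entries: iterable of (operator, is_erroneous) pairs."""
    totals, errors = Counter(), Counter()
    for operator, is_erroneous in entries:
        totals[operator] += 1
        if is_erroneous:
            errors[operator] += 1
    return {op: errors[op] / totals[op] for op in totals}

entries = ([("anita", False)] * 199 + [("anita", True)] +
           [("ravi", False)] * 95 + [("ravi", True)] * 5)
rates = operator_error_rates(entries)
for op, rate in sorted(rates.items()):
    status = "over target" if rate > ALLOWED_ERROR_RATE else "within target"
    print(f"{op}: {rate:.1%} ({status})")
```

Because every value is attributable to the person who entered it, the same tally can be produced for any review window.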
Set up a data quality incident management system: You can trace the exact points at which incorrect data was entered, at different levels; most available applications provide this information. If an error can be rectified at the basic level, it is corrected there. If not, it is flagged for the next level using tools available in the system. If the issue is still not resolved, you have to go back to the site from which the data originated, investigate the contradictory value, and resolve it as part of the data quality process.
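The escalation flow above can be sketched as a chain of levels that each try to resolve an issue, with anything unresolved at every level going back to the originating site. The level names and resolver rules here are illustrative assumptions, not from the article.

```python
# Hypothetical sketch of the escalation flow: each level attempts to
# resolve an issue; unresolved issues pass to the next level, and
# issues no level can fix go back to the originating site.
def triage(issues, levels):
    """levels: ordered (level_name, resolver) pairs; a resolver
    returns True when it fixes the issue at that level."""
    back_to_site = []
    for issue in issues:
        for name, resolver in levels:
            if resolver(issue):
                break  # resolved at this level
        else:  # no level resolved it
            back_to_site.append(issue)
    return back_to_site

levels = [
    ("data entry", lambda issue: issue == "typo"),
    ("data management", lambda issue: issue == "unit mismatch"),
]
print(triage(["typo", "missing visit date"], levels))
# → ['missing visit date']
```

In practice each level would also log who resolved what, so the incident trail itself feeds the performance metric above.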
Once the data is moved to the statistics department, they should apply their own data quality measures. The department will use previously prepared templates to determine how to perform the analysis. To set up a quality check at this stage, two individuals should work independently on the same table, and their end results should match.
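The two-person check described above is essentially a double-entry comparison: both results are diffed cell by cell and any mismatch is sent back for review. A minimal sketch, with invented field names:

```python
# Hypothetical sketch of the two-entry check: two people produce the
# same table independently, and every mismatched cell is reported.
def compare_entries(first, second):
    """first, second: lists of row dicts; returns (row, field, a, b)
    for every cell where the two entries disagree."""
    mismatches = []
    for i, (row_a, row_b) in enumerate(zip(first, second)):
        for field in row_a.keys() | row_b.keys():
            a, b = row_a.get(field), row_b.get(field)
            if a != b:
                mismatches.append((i, field, a, b))
    return mismatches

first = [{"subject": "001", "weight": 72.5}]
second = [{"subject": "001", "weight": 75.2}]  # keying discrepancy
print(compare_entries(first, second))  # → [(0, 'weight', 72.5, 75.2)]
```

The table passes the quality check only when this comparison returns no mismatches.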
About the author: Manoj Yasodharan is the head of clinical data management and biostatistics at Clinigene International.
(As told to Snigdha Karjatkar)