Posted on June 17, 2022 at 12:52 PM by Daniel Spankie
In 21 CFR Part 11, the Code of Federal Regulations establishes the United States Food and Drug Administration's requirements for electronic records and electronic signatures. Over the last decade, however, the pharmaceutical industry has moved beyond the technical controls and validations required by these regulations, embracing a more holistic approach that recognizes the need for a strong data governance framework as part of any data integrity program.
According to the Pharmaceutical Inspection Co-operation Scheme (PIC/S, 2021), data governance is defined as:
“…the sum total of arrangements which provide assurance of data integrity. These arrangements ensure that data, irrespective of the process, format, or technology in which it is generated, recorded, processed, retained, retrieved, and used will ensure an attributable, legible, contemporaneous, original, accurate, complete, consistent, enduring, and available record throughout the data lifecycle.”
PIC/S (2021) also explains why data governance is important to an organization:
“While there may be no legislative requirement to implement a ‘data governance system,’ its establishment enables the manufacturer to define, prioritize and communicate their data integrity risk management activities in a coherent manner. Absence of a data governance system may indicate uncoordinated data integrity systems, with potential for gaps in control measures.”
This system involves many aspects of a quality management system, including both technical and procedural controls. The requirements for data integrity apply whether the records are electronic or paper, or a hybrid system.
Potential Impact of a Data Integrity Gap
Data integrity gaps can have real consequences. At one pharmaceutical lab, an analyst had been manipulating and falsifying chromatographic results, underreporting impurities and re-injecting samples and standards to ensure passing results. Over 400 lots of product were released to market over a two-year period. The lab self-reported the incident, but had to retain an independent consulting firm to oversee re-evaluation of the records generated by the analyst. The effort took three full-time resources almost six months.
In some cases, the acts that lead to a data integrity issue are unintentional. Either way, these types of issues cost companies time and money and may pose a serious risk to the company’s reputation.
Preventing Data Integrity Events
A well-structured data integrity program must account for people, processes, and technologies. Unfortunately, there are no foolproof technical solutions for preventing a data integrity event. Company management should consider how business goals and metrics may incentivize employees to take shortcuts or implement workarounds that could compromise data integrity controls.
For example, the leadership team for a large Quality Control (QC) laboratory wanted to tie the annual bonus of QC management staff to the average number of lots released per week. Structuring compensation this way could create pressure to get lots out the door and lead to shortcuts. A better approach would be to tie compensation to quality goals, such as no quality-related recalls or no significant quality-related findings in regulatory or third-party audits.
Understanding the Risk Factors
The fraud triangle captures primary risk factors for financial fraud. These drivers are also behind many data integrity events.
Risk Factors - Pressure
Pressure can come from both internal and external forces. The employee may be experiencing financial hardship or other personal issues. Similarly, the bonus structure for QC Lab management provided in the earlier example creates “pressure” that could push personnel to act in a way that could result in a data integrity issue.
Risk Factors - Rationalization
Rationalization can occur when an employee believes that procedures and processes are not being followed or enforced.
During the investigation of a potential data integrity incident, manufacturing personnel were suspected of recording false readings during a critical step in the process. Manufacturing operators stated that they started recording in-specification results regardless of the instrument readings because in the past, management would simply justify any out-of-specification (OOS) results and release the product anyway.
The employees felt there was no reason to accurately document the OOS result if management was still going to release the product, and justified their falsification of data as having no impact on the product or process.
Risk Factors - Opportunity
Opportunity occurs when employees feel there is no oversight or little chance of being caught. Some employees are at work to do a job, get paid, and go home. If there are opportunities for them to take shortcuts and get their job done sooner because no one is paying attention, data integrity events can occur.
In one 24/7, high-volume QC laboratory, employees expressed concern when another third-shift employee had completed a much higher number of testing assays compared to others in the group. This level of productivity seemed unlikely, since the employee was often seen on their phone or walking around talking to others, rather than at their bench working.
We reviewed assay results to determine if there was a potential data integrity issue. One of the tests was a manual titration with a color change for the endpoint, so some variation was expected based on inherent method error and analyst technique. An evaluation of the last 40 lots tested revealed a variance of around 2% above and below the target of 100%.
When the data was sorted by analyst instead of lot number, a different pattern emerged.
For the employee in question, samples 26 through 40 were clustered around the 100% result, with little of the variation expected for this method and seen by other employees performing the testing.
While the data did not conclusively prove that the analyst was falsifying data, it was suspicious when coupled with the observed behavior of the employee. As part of the investigation, a “data integrity sample” was created at 75% of the label claim. The next day, the employee’s results showed the sample at 99.9% of the label claim. When interviewed, the employee admitted to falsifying the results and calculating the amount of titrant needed to get a passing result. The employee felt the work was not being scrutinized.
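The screening approach used in this case, comparing each analyst's result spread against the spread for the group as a whole, can be sketched in a few lines of code. The data and the flagging threshold below are illustrative assumptions, not the lab's actual results or method; the idea is simply that a standard deviation far below the group's pooled standard deviation is a signal worth investigating.

```python
from statistics import pstdev

# Hypothetical assay results (% of target), keyed by analyst.
# Values are illustrative only, not the actual lab data.
results = {
    "analyst_a": [99.1, 101.8, 98.4, 100.9, 101.2, 98.7],
    "analyst_b": [100.2, 99.5, 101.1, 98.9, 100.6, 99.3],
    "analyst_c": [100.0, 100.1, 99.9, 100.0, 100.1, 99.9],  # suspiciously tight
}

def flag_low_variation(results, ratio=0.5):
    """Flag analysts whose result spread is well below the group's spread.

    `ratio` is an assumed screening threshold: an analyst is flagged when
    their standard deviation is less than `ratio` times the group's.
    """
    all_values = [v for vals in results.values() for v in vals]
    group_sd = pstdev(all_values)
    return [
        analyst
        for analyst, vals in results.items()
        if pstdev(vals) < ratio * group_sd
    ]

print(flag_low_variation(results))  # analyst_c's near-zero spread stands out
```

A flag from a screen like this is not proof of falsification; as in the case above, it is a prompt for follow-up, such as a behavioral review or a blind "data integrity sample."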
Developing a Robust Data Integrity and Governance Program
A successful data integrity and governance program must have leadership support. Company leaders should ensure that employees are aware of their role in the organization and how they are expected to contribute to final product quality. It is also critical that leadership be receptive to employee feedback. While some concerns may involve factors outside of immediate management's control, employees' concerns should still be acknowledged.
Ideally, leadership should foster an environment where employees are not afraid to admit mistakes or point out bad practices. They should be comfortable suggesting improvements and providing feedback about potential data integrity issues. In some cases, an anonymous reporting tool may allow for reporting of issues without fear of retaliation.
An important part of a sustainable quality environment is ongoing data integrity training. Too often, companies keep quiet about data integrity issues, or limit discussions about them to the management team. If a data integrity event is uncovered, disclosing the root cause and mitigation plan may help prevent similar issues in the future.
If your company would like to strengthen its data integrity and data governance programs, ESi can help. Please contact Daniel Spankie at email@example.com for more information.