Computer system validation (CSV) is a core item that regularly comes up during regulatory inspections. Those inspections are frequent: the US Food & Drug Administration (FDA) alone conducted 2,442 in 2022, according to a recent report.
To help you achieve inspection readiness, we have created this two-part series on computer system validation (CSV) for regulatory inspections.
Computerized systems are increasingly used in the pharmaceutical industry, including within regulated Good Laboratory Practice (GLP), Good Clinical Practice (GCP), Good Pharmacovigilance Practice (GPvP), and Good Manufacturing Practice (GMP) environments and supporting activities. Regulatory bodies such as the FDA carry out inspections to verify that life sciences firms, clinical research teams, and affiliated organizations comprehend and adhere to regulations designed to safeguard patient safety, maintain product quality, and uphold data integrity.
As a result, computerized systems are a focus area during audits and inspections.
The good news is that no one has to be afraid of regulatory inspections. Preparation is the ticket. To that end, let’s discuss regulatory expectations regarding computer system validation (CSV) to ensure compliance with the quality system regulation.
System Life Cycle (SLC)
Computerized systems are company assets, whether embedded in equipment, products (medical devices), or existing as support systems. As such, they should receive the same level of oversight and control as any other company asset. GxP professionals are not required to know the ins and outs of the computerized system. However, they should be familiar with the characteristics of system evolution, maintenance, and use that meet regulatory requirements.
All systems go through a predictable lifecycle. To understand how a system got where it is and where it’s going, you need to know where it is in its lifecycle.
The lifecycle, with its typical characteristics, acts as a guidebook: GxP professionals involved in computer system validation (CSV) should be prepared for its stages and their anticipated outcomes.
Below is a set of considerations and questions.
Identifying the need. Can a computerized system offer the most effective resolution to the business challenge?
Proposing solutions. Solutions may be technical (the computer system handles everything), procedural (people follow a defined set of steps), or a hybrid that combines both methods.
Determining system development. Should the system be built or enhanced using internal or external resources, purchased as a ready-made product that fulfills the requirement, or adapted from an existing product to suit the need?
Installing the system. Were the necessary infrastructure and supporting systems accurately predicted? What is the plan for replacing the old system? How much disruption can be expected during the transition?
Testing and acceptance. What is the appropriate extent and level of detail for testing? Do the test results provide a definite and thorough confirmation that the system can reliably meet the business requirements?
Training. After confirming the system’s functional suitability, are the intended users equipped with the knowledge to effectively operate it?
Change control. Change is inevitable in all systems, and it’s important to manage these changes. Systems undergo three types of changes: corrective, improvement, and adaptive. Are there formal procedures and documentation in place that guide a system from its current state (as-is) to its desired state (to-be)?
Backup and disaster recovery. Hardware and software failures occur less often than they once did, but they still occur, and all systems carry this risk. Is there a backup and recovery plan in place to minimize potential data loss (a minimal verification sketch follows this list)? Data loss should be deemed unacceptable in general, but the loss of GxP data should be seen as a failure to meet regulatory standards.
Decommissioning. Automated systems eventually become obsolete as hardware and support systems evolve, or a more effective way of meeting growing business requirements is discovered. As systems become outdated, it may be necessary to safeguard and preserve the data for regulatory, legal, or historical purposes. Consideration should be given to how the system will be replaced and whether the old system must remain operational to some extent: not all information and processes can be seamlessly migrated to a new system, and not all systems can accommodate current data formats and processes.
Records. Maintaining documented evidence is a regulatory requirement, but it may not be feasible to retain all records. Legal obligations and guidance from regulatory bodies should determine which records are retained and for what duration.
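To make the backup-and-recovery question concrete, below is a minimal sketch, in Python, of a verified backup: each file is copied and its checksum compared against the original so that corruption is detected at backup time. The paths are hypothetical, and a real GxP backup plan would also address restore testing, retention periods, and off-site copies.

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def backup_and_verify(source_dir: str, backup_dir: str) -> list[str]:
    """Copy every file under source_dir to backup_dir, then confirm by
    checksum comparison that each copy matches its original.
    Returns a list of files that failed verification."""
    src, dst = Path(source_dir), Path(backup_dir)
    failures = []
    for original in src.rglob("*"):
        if not original.is_file():
            continue
        copy = dst / original.relative_to(src)
        copy.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(original, copy)  # copy2 also preserves timestamps
        if sha256_of(original) != sha256_of(copy):
            failures.append(str(original))
    return failures

# Hypothetical paths. For GxP data, any failure here should be treated
# as a failure to meet regulatory standards, not merely an IT incident.
bad = backup_and_verify("/data/lims", "/backup/lims")
assert not bad, f"Backup verification failed for: {bad}"
```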
What is computer system validation (CSV)?
Computer systems used to create, modify, and maintain electronic records and to manage electronic signatures are subject to validation requirements (see 21 CFR §11.10(a)). Such systems must be validated to ensure accuracy, reliability, consistent intended performance, and the ability to discern invalid or altered records.
Validation for computer systems may be defined in several different ways, including:
- The process of establishing documented evidence that provides a high degree of assurance that a computer system consistently performs according to predetermined specifications and quality attributes (FDA).
- Validation, including confirmation by examination and through provision of objective evidence that the requirements for a specific intended use or application have been fulfilled (ISO 9000).
- Proving that the system is fit for purpose throughout the lifecycle.
Whichever definition is used, there is an implication that some form of testing against requirements or specification must be executed and documented to provide evidence that the work has been completed.
Computer system validation (CSV) should demonstrate that accuracy and reliability are built into the computerized system (quality assurance), without relying solely on testing (quality control). To this end, computer system validation (CSV) must be addressed throughout the SLC, from initiation through decommissioning, as outlined above.
Validation deliverables comprise the set of plans, specifications, protocols, results, and reports that guide, and conclude on, the activities and testing conducted to establish that the system is correctly configured and that its functions correctly match the requirements set out in the design specifications.
Associated with the validation deliverables is the set of system specification documents, which are linked, or mapped, to the system testing through the traceability matrix. This matrix is then used to verify that all functions required by the end user have been appropriately tested and operate as expected.
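For illustration, the traceability matrix can be thought of as a mapping from each requirement to its specification and tests. Below is a minimal sketch in Python; the requirement, specification, and test identifiers are hypothetical, and a real matrix is typically maintained in a controlled spreadsheet or tool rather than code.

```python
# Hypothetical identifiers; a real matrix references the numbered
# user requirements, functional specs, and test scripts.
traceability = {
    "URS-001": {"spec": "FS-010", "tests": ["OQ-TC-01", "PQ-TC-03"]},
    "URS-002": {"spec": "FS-011", "tests": ["OQ-TC-02"]},
    "URS-003": {"spec": "FS-012", "tests": []},  # gap: untested requirement
}

test_results = {"OQ-TC-01": "pass", "OQ-TC-02": "pass", "PQ-TC-03": "pass"}

def coverage_gaps(matrix, results):
    """Return requirements with no mapped test, or with a test not passed."""
    gaps = []
    for req, row in matrix.items():
        if not row["tests"]:
            gaps.append((req, "no test mapped"))
        elif any(results.get(t) != "pass" for t in row["tests"]):
            gaps.append((req, "test not passed"))
    return gaps

for req, reason in coverage_gaps(traceability, test_results):
    print(f"{req}: {reason}")  # -> URS-003: no test mapped
```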
Validation plans should be in place
The validation plan should describe the overall strategy and responsible parties for validating a computerized system within its operating environment.
Every validation plan should include success measures, responsibilities, agreed assumptions about the scope of the validation exercise, and justification for any exclusions or limitations. The process or system owner is responsible for planning and organizing the validation activities and for the structure and content of the validation plan.
Approval of the validation plan and the associated summary report should fall to the system owner, and the Quality Assurance unit should authorize both documents.
Five types of validation protocols should be executed
First, the installation qualification (IQ) protocol is executed to show that the system components are installed and configured as specified and to generate documented evidence that the system was installed according to approved design specifications and manufacturer instructions (a minimal sketch of such a check follows Figure 1).
Second, the operational qualification (OQ) protocol is executed to show that the application components operate as expected by testing against the system functional specifications.
Third, the performance qualification (PQ) protocol is executed to show that the complete system functions as required by testing against critical end-user requirements in the production environment.
Fourth, user acceptance testing (UAT) is executed to show that the business requirements can be met by system functions; it may be used in lieu of the OQ and PQ, provided that the developer has completed formal functional testing.
Finally, a traceability matrix, typically in a tabular or spreadsheet format, is used to map requirements to functions and tests, verifying that all end-user requirements have been met through a combination of system specifications and functions and that each has been formally tested.
In Figure 1 below, system and acceptance testing replace OQ and PQ to leverage formal functional testing by the developer.
Figure 1: Testing and Qualification Protocols
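To make the IQ concept concrete, below is a minimal sketch, in Python, of the kind of check an IQ script formalizes: comparing what is actually installed against the approved design specification. The component names and version numbers are hypothetical.

```python
# Approved design specification for the installation
# (hypothetical component names and versions, for illustration only).
approved_spec = {
    "app-server": "2.4.1",
    "database": "14.8",
    "report-engine": "5.0.2",
}

# What the tester actually finds installed on the target server.
installed = {
    "app-server": "2.4.1",
    "database": "14.7",  # deviation: does not match the approved spec
    "report-engine": "5.0.2",
}

# Each FAIL would be recorded on a discrepancy sheet and
# resolved before OQ execution begins.
for component, expected in approved_spec.items():
    actual = installed.get(component, "not installed")
    status = "PASS" if actual == expected else "FAIL"
    print(f"{component}: expected {expected}, found {actual} -> {status}")
```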
Two roles are needed for test execution: system testers and reviewers. Their responsibilities are as follows:
System testers
- Execute tests specified in the qualification protocol and test sheets.
- Verify that actual results are in accordance with expected results.
- Generate and reference all test proof and documented evidence required by the test sheets.
- Record all unexpected results in the discrepancy sheets.
Reviewers
- Verify the actual results, using the completed test sheets and the corresponding documented evidence.
- Verify the validity of any resolutions to unexpected results.
- Conclude on the test execution.
- Test reviewers must be independent of the test executions.
The need for validation reports
Various reports may be produced depending on the strategy for computer system validation (CSV). For instance, the vendor or developer audit report would summarize the development audit findings and the status of corrective action requests, and the system risk assessment report would summarize identified risks with criticality assessments, mitigations, and the resulting validation approach.
The IQ, OQ, PQ, and UAT reports summarize the issues identified during testing and how they were resolved, or accepted with exclusions and deviations. Specifically:
- The IQ report summarizes the results, showing through execution of the IQ test scripts that the system components were installed as specified on the systems/servers defined in the IQ protocol.
- The OQ report summarizes the results, showing through execution of the OQ test scripts that the application components operate as expected on the systems/servers defined in the OQ protocol.
- The PQ report summarizes the results, showing through execution of the PQ test scripts that the complete system functions as required on the systems/servers defined in the PQ protocol.
- The UAT report summarizes the results, showing through execution of the UAT test scripts that the business requirements can be met by system functions on the systems/servers defined in the UAT protocol.
The sequence of activities in the computer system validation (CSV) process must be carefully controlled to ensure that the tests are completed in the correct order. For instance, OQ testing cannot be executed until an IQ has verified that the server components have been correctly installed.
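A validation plan or tracking tool can enforce this sequencing explicitly. Below is a minimal sketch, assuming a simple per-protocol status that only reaches "approved" once the protocol has been executed, reviewed, and signed off; the status values and sequence are illustrative.

```python
from enum import Enum

class Status(Enum):
    NOT_STARTED = "not started"
    IN_PROGRESS = "in progress"
    APPROVED = "approved"  # executed, reviewed, and signed off

# The order in which qualification protocols must be completed.
SEQUENCE = ["IQ", "OQ", "PQ"]

def may_start(protocol: str, status: dict[str, Status]) -> bool:
    """A protocol may start only when every earlier protocol in the
    sequence has been executed, reviewed, and approved."""
    position = SEQUENCE.index(protocol)
    return all(status[p] == Status.APPROVED for p in SEQUENCE[:position])

status = {"IQ": Status.IN_PROGRESS,
          "OQ": Status.NOT_STARTED,
          "PQ": Status.NOT_STARTED}
print(may_start("OQ", status))  # False: IQ is not yet approved
```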
The validation summary report concludes on all the other reports, reiterating the validation status along with any exclusions or limitations on system use, any deviations from the project and/or validation plans, and a summary of what was done and what remains to be done.
Stay tuned for Part 2 of this series for computer system validation (CSV) change management, policies and procedures, system audit, and resources.
For questions or further discussion, please click here to connect with the right resource.
Authored by: Yaritza Menéndez, Sr. Quality and Compliance Specialist, Quality and Compliance.