Master Plan
Document Number: M-173
Computer System Validation
Company Name:
Controls:
Superseded Document N/A, new
Reason for Revision N/A
Effective Date June 1, 2012
Signatures:
Author I indicate that I have authored or updated this Master Plan
according to applicable business requirements and our
company standards.
Name: ________________________________
Signature: ________________________________
Date: ________________________________
Approver I indicate that I have reviewed this Master Plan, and find it
meets all applicable business requirements and that it reflects
the procedure described. I approve it for use.
Name: ________________________________
Signature: ________________________________
Date: ________________________________
Reviewer I indicate that I have reviewed this Master Plan and find that it
meets all applicable quality requirements and company
standards. I approve it for use.
Name: ________________________________
Signature: ________________________________
Date: ________________________________
Table of Contents
1. Introduction, Scope and Objectives of this Document..........................................6
1.1 Introduction............................................................................................................ 6
1.2 Scope.....................................................................................................................6
1.2.1 Enterprise Level........................................................................................................6
1.2.2 System Level.............................................................................................................6
1.3 Objectives.............................................................................................................. 7
2. Policy..........................................................................................................................7
3. Related Documents and Activities...........................................................................7
3.1 Other Master Plans................................................................................................7
3.1.1 Risk Management Master Plan (17.1)......................................................................8
3.1.2 Network Qualification Master Plan (17.2)...............................................................8
3.1.3 21 CFR Part 11 Compliance Master Plan (17.3).....................................................8
3.1.4 Security Master Plan (17.4).....................................................................................8
3.1.5 Training Master Plan (17.5).....................................................................................8
3.2 Procedures.............................................................................................................8
3.3 Checklists, Forms, Templates, Examples..............................................................9
3.4 Validation Project Plans.........................................................................................9
4. Responsibilities.......................................................................................................10
4.1 Validation Steering Committee.............................................................................10
4.2 System Owner..................................................................................................... 11
4.3 Validation Project Team.......................................................................................11
4.4 IT Department......................................................................................................12
4.5 Quality Assurance................................................................................................12
4.6 Regulatory Affairs.................................................................................................12
4.7 Operations (User Representatives)......................................................................13
4.8 Documentation Department.................................................................................13
4.9 Suppliers.............................................................................................................. 13
4.10 Plant Maintenance..........................................................................................14
5. Computer Systems to be Validated.......................................................................14
5.1 General Characteristics of Systems to be Validated............................................14
5.2 Examples............................................................................................................. 14
5.3 List with Computer Systems to be Validated........................................................15
6. Validation Principle and Approach........................................................................15
6.1 Overview.............................................................................................................. 15
6.2 Definitions............................................................................................................ 15
6.2.1 Validation...............................................................................................................15
6.2.2 Computer Systems, Computerized Systems............................................................15
6.3 Software Categories.............................................................................................16
6.4 Life Cycle Models.................................................................................................17
6.5 Approach for Implementation...............................................................................19
7. Validation Steps...................................................................................................... 20
7.1 Define System Owner and Project Team.............................................................20
7.2 Planning............................................................................................................... 20
7.3 Assumptions, Exclusions and Limitations............................................................21
7.4 Setting Specifications...........................................................................................21
7.5 Vendor Selection and Assessment......................................................................21
7.6 Installation............................................................................................................23
7.6.1 Before installation..................................................................................................23
7.6.2 During installation.................................................................................................23
7.7 Testing for Operation........................................................................................... 24
7.7.1 Test plan.................................................................................................................24
7.7.2 Type and extent of testing.......................................................................................24
7.7.3 Test environment....................................................................................................25
7.7.4 Systems with identical configurations....................................................................25
7.7.5 Test traceability......................................................................................................25
7.7.6 Test data sets and procedures for ongoing regression testing...............................25
7.7.7 Ongoing tests (PQ).................................................................................................26
7.7.8 Documentation and review of testing.....................................................................26
7.7.9 Handling deviations................................................................................................26
7.7.10 Qualification of test personnel...........................................................................27
7.8 Revalidation......................................................................................................... 27
7.8.1 Time based..............................................................................................................27
7.8.2 Event driven............................................................................................................27
7.9 Existing Systems..................................................................................................28
7.10 Validation Report............................................................................................28
8. Approach for Networks and Networked Systems................................................29
9. Approach for Spreadsheet Applications...............................................................30
9.1 General Guidelines.............................................................................................. 30
9.2 Design for Integrity...............................................................................................30
9.3 Development and Validation................................................................................31
10. Risk Assessment.....................................................................................................31
11. Configuration Management and Change Control.................................................32
11.1 Initial Set-up....................................................................................................32
11.2 Change Control...............................................................................................33
12. Maintenance and Support.......................................................................................33
12.1 Preventive Maintenance.................................................................................33
12.2 Back-up and Restore......................................................................................33
12.3 Archiving.........................................................................................................34
12.4 Contingency Planning and Disaster Recovery................................................34
12.5 Security and User Administration...................................................................35
12.6 Problem Handling...........................................................................................35
13. System Retirement..................................................................................................36
14. Periodic Reviews and Auditing..............................................................................36
14.1 Reviews.......................................................................................................... 36
14.2 Auditing...........................................................................................................37
15. Communication and Training.................................................................................37
15.1 Reference Papers and Industry Standards.....................................................38
15.2 FDA and Other Regulations and Guidelines...................................................38
16. Reference Documentation and Validation Deliverables......................................38
16.1 Standard Operating Procedures.....................................................................38
16.2 Validation Deliverables...................................................................................39
17. References............................................................................................................... 40
18. Attachments.............................................................................................................41
18.1 Attachment - Computer System Validation Policy..........................................41
18.2 Attachment - Members of Computer Validation Steering Committee.............42
18.3 Attachment - Members of Computer Validation Project Team........................43
18.4 Attachment - List with Computer Systems for Validation................................44
18.5 Attachment - Validation Project Schedule......................................................45
18.6 Attachment - Requirement Specifications Table.............................................46
18.7 Attachment - Vendor Rating...........................................................................47
18.8 Attachment - Extent of Testing for Different Risk Levels................................47
18.9 Attachment - Template for a Test Traceability Matrix.....................................48
18.10 Attachment - Example for a Test Protocol......................................................49
18.11 Attachment - Change Request Form..............................................................50
18.12 Attachment - Change Release Form..............................................................51
18.13 Attachment - Retirement Request Form.........................................................52
18.14 Attachment - Validation Deliverables..............................................................53
Master Plan
Computer System Validation
1. Introduction, Scope and Objectives of this Document
1.1 Introduction
Computers are widely used during development and manufacturing of drugs,
drug substances and medical devices. Correct functioning and performance
of software and computer systems play a major role in obtaining consistency,
reliability and accuracy of data. Therefore, Computer System Validation (CSV)
should be part of any good development and manufacturing practice. It is also
required by the FDA and other regulations and guidelines through the
overall requirement that "equipment must be suitable for its intended use".
Because of the complexity and long duration of validation activities, they
should be thoroughly planned, implemented and documented. This master
plan and the templates in the attachments should be used as a framework for
such planning.
1.2 Scope
The master plan addresses computer validation at the enterprise and system
level.
1.2.1 Enterprise Level
Specific contributions for enterprise level master planning are:
Corporate policy.
The company’s validation approaches.
Inventory of systems and associated validation status.
1.2.2 System Level
Types of systems that can be covered by this plan include:
Commercial software and computer systems.
Configurable software and computer systems.
Small and large systems.
Standalone and networked systems.
New and existing systems.
Spreadsheet applications.
Computers used in regulated and other business critical
environments.
This plan does not cover details of validation activities during development, for
example, details of design specifications and reviews, code development,
review and documentation or structural testing.
The plan also does not cover details of infrastructure management and
qualification, internet compliance or details of security and risk management.
1.3 Objectives
This computer master plan has four objectives:
1. It serves as a resource for development of computer system validation
project plans. This will help make such planning more consistent and
efficient.
2. It answers the inspector’s question about the company’s approach for
computer validation. A validation master plan is officially required by the
European GMP directive through Annex 15.
3. It demonstrates corporate commitment and support for computer system
validation through the corporate policy statement.
4. It helps personnel at all management levels understand how validation is
approached and implemented in the organization.
2. Policy
Because of the importance of computer validation for compliance and business
reasons, a company should lay out a policy either in a separate policy document,
as part of the quality plan, or in the validation master plan. The policy should start
with a management statement on the importance of computer validation for the
company. It should also include expectations, for example, that all computer
systems used in regulated environments should be validated. The policy should
also state activities that will help to meet the expectations. An example of a policy
statement is shown in Attachment 18.1.
3. Related Documents and Activities
Computer system validation and the validation master plan cannot be isolated from
other activities and documents. For example, risk management strategies as
defined in a risk management master plan should also apply to computer system
validation. Trainings on computer validation should be conducted and documented
following the company’s training master plan. This chapter describes documents
that are related to computer validation and to this master plan.
3.1 Other Master Plans
Master plans are documents that lay out a company’s approach for specific
activities. They help to implement individual projects efficiently and in a
consistent manner. Examples are the validation master plan, risk
management master plan, network qualification master plan, the Part 11
compliance master plan and the security master plan. While this computer
validation master plan provides enough information to conduct qualification
tasks, it does not give enough details for supporting tasks. For example, it
does not include information on preparing, conducting and documenting
trainings, or information on password conventions and risk management
strategies. However, these three activities are also important for computer
system validation and strategies are laid out in the training master plan, the
security master plan and the risk management master plan.
3.1.1 Risk Management Master Plan (17.1)
A risk management master plan describes a company’s approach for
risk assessment and risk management, for example, to comply with the
FDA’s Part 11 Guidance: “Scope and Applications” based on a “justified
and documented” risk assessment. It is used as a source for project
specific individual risk management project plans.
3.1.2 Network Qualification Master Plan (17.2)
A network qualification master plan describes a company’s approach
for qualifying IT infrastructure and networks. It is used as a source for
project specific individual qualification project plans.
3.1.3 21 CFR Part 11 Compliance Master Plan (17.3)
A Part 11 compliance plan describes a company’s approach and steps
for implementing electronic records and electronic signatures.
3.1.4 Security Master Plan (17.4)
A security master plan describes a company’s approach to ensure
security and limited and authorized access to buildings, critical areas
within buildings, e.g., data centers and to computers and data.
3.1.5 Training Master Plan (17.5)
A training master plan describes a company’s approach on how to
identify training needs for employees, how to develop and implement a
training plan, how to conduct trainings and finally how to document the
trainings. Trainings for computer validation should follow the
recommendations in this master plan.
3.2 Procedures
Routine activities in regulated environments should follow written procedures.
These are typically defined as standard operating procedures. While master
plans describe the tasks and approaches, procedures give step-by-step
instructions on how to do the tasks. Examples are procedures for training, for
validation of commercial off-the-shelf systems, for validation of custom-built
systems, for risk-based validation, for change control, for developing user
requirement specifications and for risk assessment.
[Figure 1: Linking Documents — diagram showing the computer system validation master plan drawing on the related master plans (quality plan, 21 CFR Part 11 compliance, security, network qualification, training and risk management) and feeding system-specific validation project plans (e.g., ERP, laboratory and ECM systems), all supported by SOPs, checklists, templates, forms and examples.]
3.3 Checklists, Forms, Templates, Examples
Checklists, forms, templates and examples help implement individual
validation projects effectively and consistently. Examples are checklists and
worksheets for commercial off-the-shelf systems, for validation of custom
systems, for development of specifications and for audits. Templates should
be available for system documentation, test protocols, maintenance and
change logs.
3.4 Validation Project Plans
Validation project plans are developed for the validation of individual systems,
for example, an Enterprise Resource Planning (ERP) System, a Laboratory
Information Management System (LIMS) or an Enterprise Content
Management (ECM) System. They are derived from the master plan and
define validation approaches and activities that are specific for the system to
be validated.
Figure 1 illustrates how the different documents are linked together. The master
plan is developed with inputs from other master plans. This master plan is written
such that it can be used to develop individual project plans. It should be generic
enough so that it can handle all systems that need to be validated. Standard
operating procedures are either available and adequate for the target system or
need to be developed.
4. Responsibilities
Computer validation will affect different departments in an organization. Policies,
master plans and procedures should be preferably supported and used by the
entire organization. Individual projects need to be supported by anybody who is
affected by the computer system to be validated. Therefore, it is important that
responsibilities are well defined.
4.1 Validation Steering Committee
The steering committee selects the company’s approaches for computer validation
and develops master plans and procedures with templates.
Members should come from Operations (manufacturing, laboratories), IT, QA,
Documentation and Regulatory Affairs.
For each team member a back-up should be identified to mitigate the risk of
unavailability of core members. A list should be created and maintained with
contact information of core members and back-ups. A template for such a list is
included in Attachment 18.2.
Tasks include:
Developing company policies and approaches for computer
validation.
Developing master plans that can be used to derive individual
project plans.
Developing procedures that are independent from individual
projects, e.g., for validation of commercial off-the-shelf computer systems.
Defining training requirements and developing training material for
software and computer system validation.
Reviewing and approving project plans of computer validation
projects that are critical for the organization.
4.2 System Owner
The system owner owns the validation project. Tasks and responsibilities
include:
Owning the process to define, execute and document the validation
activities and results. If no other person or group is mentioned as owning a
specific task outlined in the plan, the system owner should handle it.
Smaller systems should be handled by the system owner without the need
for an official project team. With the support of the steering committee the
system owner should decide if a project team is needed or not.
Selecting project validation team members together with functional
supervisors of affected departments (if such a project team is required).
Leading the project team and team meetings.
Drafting and updating the validation project plan.
Ensuring ongoing progress of the project according to the project
plan through timely escalations of go/no go conflicts to the validation
steering committee.
Reviewing and approving validation protocols and other validation
deliverables.
Managing the risk assessment process to define the risk category of
the system and validation tasks for the selected risk category.
Managing the development of system specific procedures, back-up
strategies, archiving strategies and security strategies.
Reporting the progress of the plan to the project sponsor and
management.
Ensuring necessary training of project team members.
Ensuring compliance of the project with the validation master plan
and company procedures.
Reviewing and auditing computer systems together with QA.
4.3 Validation Project Team
This team is formed for a specific validation project. Members come from all
departments that are affected by the specific system. As a minimum,
representation should come from user departments, QA and IT. The system owner
should lead the team.
For each team member a back-up should be identified to mitigate the risk of
unavailability of core members. A list should be created and maintained with
contact information of core members and back-ups. A template for such a list is
included in Attachment 18.3.
Tasks of team members include:
Representing their departments.
Attending all team meetings or arranging for a substitute.
Collecting and giving inputs for risk assessment.
Reviewing project plans.
Developing and reviewing procedures that are specific for the
individual project.
Reviewing test and validation protocols and other validation
deliverables.
4.4 IT Department
This department has technical responsibility for the project. Responsibilities
include:
Helping to define specifications for software and computer systems.
Assisting the system owner in identifying and selecting software and
computer system suppliers and models.
Creating and maintaining hardware and software inventory for
computer systems.
Qualifying IT infrastructure.
Providing technical expertise for risk assessment and the extent of
testing and revalidation related to networked systems.
Reviewing and approving validation documentation related to
network infrastructure.
Developing and maintaining security controls.
4.5 Quality Assurance
Reviewing and approving procedures and other documents for
compliance with internal standards and regulations.
Providing quality assurance and regulatory expertise.
Developing training material and delivering training on regulations
and corporate standards.
Auditing computer systems together with the system owner for
compliance with procedures.
4.6 Regulatory Affairs
Communicating with regulatory agencies to get the most accurate
information on regulations, guidelines and their interpretations.
Updating the project team on regulations, guidelines and their
interpretations.
4.7 Operations (User Representatives)
Ensuring that all software and computer systems in the department
are listed in the inventory list.
Ensuring that all systems in the department are validated according
to the project plan.
Ensuring that QA and IT are notified before purchase of new
systems.
Providing user expertise inputs in the creation and review of
validation deliverables.
Providing resources for functional and performance testing.
Ensuring that SOPs are developed covering use of the system and
contingency situations and system recovery in case of system failure.
4.8 Documentation Department
Providing templates and forms to develop procedures and other
documents.
Training authors of validation deliverables on how to use the
templates.
Maintaining and archiving procedures and other documents.
4.9 Suppliers
Suppliers can be vendors of commercial systems, companies that develop
software on a contract basis, internal software development resources or a
combination of the three categories.
Tasks include:
Developing software and computer systems according to
documented procedures.
Providing documented evidence that the software has been
developed in a quality assurance environment and validated during
development.
Allowing users to audit development and validation process, if
necessary.
Developing and providing functional specifications for the software
and computer system.
Offering services to assist users in specifying, installing and
validating the system.
Offering support in case the user has a problem with the system.
Informing users on critical software errors and workaround solutions
and corrective action plans.
Maintaining version control of the code.
Informing users on new versions, e.g., what is new and how the
change can impact the validation state.
4.10 Plant Maintenance
Preparing the site for installation of the computer system according
to information provided by the supplier of the computer system.
5. Computer Systems to be Validated
Validation of software and computer systems is a regulatory requirement
specifically spelled out in the FDA’s regulation for electronic records and signatures
and in Annex 11 of the European GMP directive. The requirement affects any
computerized system that is used to create, modify, maintain, archive, retrieve or
transmit data. This and other regulations do not differentiate between small and big
systems, old and new systems, commercial and custom built systems, or between
self-developed and purchased systems. However, the extent of validation depends
on most of these factors. This chapter lists general criteria and examples for
systems to be validated.
5.1 General Characteristics of Systems to be Validated
Computer systems, software, network modules and networked
systems that are used for regulated activities. These are systems with
records that are either required by a regulation or that are necessary to
demonstrate compliance with a regulation.
Computer systems, software, network modules and networked
systems that are critical to the operation of a company or department.
New computer systems and existing systems.
Purchased systems and systems developed in house.
Small and large systems, for example, spreadsheet applications
and LIMS.
5.2 Examples
Computer systems to be validated include, for example, computerized
analytical instruments, other automated laboratory equipment, computers
used to acquire and evaluate data and Laboratory Information Management
Systems (LIMS). Also included are systems to create, manage and maintain
electronic documents, Calibration Tracking Systems (CTS) and e-mail
systems if used for regulated activities. Other examples are Supervisory
Control And Data Acquisition (SCADA), Electronic Batch Record Systems
(EBRS), Programmable Logic Controllers (PLC), Process Control Systems
(PCS), integrated information/business systems, training records systems,
Enterprise Resource Planning (ERP) Systems, Digital Control Systems (DCS),
Manufacturing Execution Systems (MES), Document Management Systems
(DMS) and Enterprise Content Management (ECM) Systems.
5.3 List with Computer Systems to be Validated
A list should be generated and maintained with all computer systems that are
used or planned to be used in regulated and other business critical
environments. The list should include information on system identification,
description, location, application(s), regulated environment, risk level and
system owners. The list should also include a time frame for validation of each
system. Priorities should be based on compliance and business criticality of
the system. Attachment 18.4 includes a template with an example for such a
list.
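The inventory can be maintained in a validated spreadsheet or database. Purely as an illustration, and not as a prescribed format (the binding template is Attachment 18.4), the following Python sketch shows one possible record layout using the fields named above; the field names, the risk scale and the example values are assumptions.

from dataclasses import dataclass, asdict
from typing import List

@dataclass
class SystemInventoryEntry:
    """One row of the computer system inventory (illustrative field set)."""
    system_id: str              # unique system identification
    description: str            # short system description
    location: str               # site / building / laboratory
    applications: List[str]     # intended application(s)
    regulated_environment: str  # e.g., GMP, GLP or "business critical only"
    risk_level: str             # assumed scale: "high", "medium", "low"
    system_owner: str           # responsible person
    validation_due: str         # planned time frame for validation (ISO date)

inventory = [
    SystemInventoryEntry(
        system_id="CDS-001",
        description="Chromatography data system",
        location="QC laboratory 2",
        applications=["release testing"],
        regulated_environment="GMP",
        risk_level="high",
        system_owner="J. Smith",
        validation_due="2012-09-30",
    ),
]

# Prioritize by compliance and business criticality: high-risk systems first.
priority = {"high": 0, "medium": 1, "low": 2}
for entry in sorted(inventory, key=lambda e: priority[e.risk_level]):
    print(asdict(entry))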
6. Validation Principle and Approach
6.1 Overview
Validation of computer systems is not a one-off event. For new systems it
starts when a user department has a need for a new computer system and
thinks about how the system can solve an existing problem. For an existing
system it starts when the system owner gets the task of bringing the system
into a validated state. Validation ends when the system is retired and all
important quality data are successfully migrated to the new system. Important
steps in between are validation planning, defining user requirements,
validation during development, vendor assessment for purchased systems,
installation, initial and ongoing testing and change control. In other words,
computer systems should be validated during the entire life of the system.
6.2 Definitions
6.2.1 Validation
In the context of this master plan validation is defined as “Establishing
documented evidence, which provides a high degree of assurance that
a specific process will consistently produce a product meeting its
predetermined specification” (Source: “FDA Guidelines on General
Principles of Validation”, March 1986).
6.2.2 Computer Systems, Computerized Systems
Computer systems consist of computer hardware and software and
peripherals like printers and CD or DVD drives.
Computerized systems comprise computer systems, equipment
controlled by the computer system and documentation such as SOPs
and operating manuals.
[Figure 2: Computer System and Computerized System — diagram showing that a computerized system comprises the computer system, the equipment controlled by it, and documentation such as SOPs and manuals.]
6.3 Software Categories
The extent of validation depends on the complexity of the computer system.
At the user's site the extent of validation also depends on how widely the
same software product and version is used. The more standard the software
and the less it is customized, the less testing is required by users. The
GAMP Guide (17.7) for the validation of automated systems defines software
categories based on the level of customization. There are five categories in
total; in the context of this master plan only categories three to five are of
interest. Definitions can be found in the table below. Each computer system
should be assigned to one of these three categories.
GAMP 3 (Non-configurable): Standard software package; no customization.
Examples: MS Word (without VBA scripts), computer-controlled spectrophotometers.
GAMP 4 (Configurable): Standard software package; customization of the configuration.
Examples: LIMS, Excel spreadsheet applications where formulae and/or input data are linked to specific cells, networked data systems.
GAMP 5 (Customized): Custom software package; all or part of the complete package has been developed for a specific user and application.
Examples: Add-ons to GAMP Categories 3 and 4, Excel with VBA scripts.
6.4 Life Cycle Models
Because of the complexity and the long time span of computer validation, the
process is typically broken down into life cycle phases. Several life cycle
models have been described in the literature. One model frequently used is the V-
model as shown in Figure 3.
[Figure 3: V-Model Life Cycle — specification branch (URS, FS, DS) on the left, build/code at the base, and qualification branch (IQ, OQ, PQ) on the right, with DS verified by IQ, FS by OQ and URS by PQ.]
This model comprises User Requirement Specifications (URS), Functional
Specifications (FS), Design Specifications (DS), development and testing of
code, Installation Qualification (IQ), Operational Qualification (OQ) and
Performance Qualification (PQ).
The V-model as described above works well when the validation process also
includes software development. However, it does not address some important
steps, for example vendor assessment, and it is unnecessarily complex for
true commercial off-the-shelf systems with no code development for
customization, where phases such as design specification, code development
and code testing are not needed. For such systems the 4Q model is
recommended with just four phases: Design Qualification (DQ), Installation
Qualification (IQ), Operational Qualification (OQ) and Performance
Qualification (PQ). The process is illustrated in Figure 4.
Neither of these models addresses the retirement phase. The 4Q model is
also not suitable when additional software is required that is not included in
the standard product and is developed by the user's firm or by a third party, for
example macro programs.
Design Qualification: user requirement specifications, functional specifications, operational specifications, vendor qualification.
Installation Qualification: check arrival as purchased, check installation of hardware and software.
Operational Qualification: test of key operational functions, test of security functions.
Performance Qualification: test for the specified application, preventive maintenance, ongoing performance tests.
Figure 4: 4Q Life Cycle Model
In this case a life cycle model that combines system development and system
integration is preferred. An example is shown in Figure 5.
User representatives define User or System Requirement Specifications
(URS, SRS). If there is no vendor that offers a commercial system the
software needs to be developed and validated following the steps on the left
side of the diagram. Programmers develop functional specifications, design
specifications and the code and perform testing in all development phases
under the supervision of quality assurance.
When commercial systems are available either the SRS or a special Request
For Proposal (RFP) is sent to one or more vendors (see right side of the
diagram). Vendors respond either to each requirement individually or with a set of
functional specifications of the system that is most suitable for the user's
requirements. Users compare the vendors' responses with their own
requirements. If none of the vendors meet all user requirements, the
requirements may be adjusted to the best fit or additional software is written to
fulfill the user requirements following the development cycle on the left side of
the diagram. The vendor that best meets the user’s technical and business
requirements is selected and qualified.
Next the system is installed, configured and documented. Before the
system is used routinely it should be tested in a suitable environment to
verify functional specifications (OQ) and in the final operating environment
to verify that it meets user requirement specifications (PQ). Any change to
the system should follow a documented change control procedure, and before
the system is retired all
quality and compliance relevant records generated on the system should be
successfully migrated to the new system.
[Figure 5: Combined System Development/System Integration Life Cycle — flow diagram starting from user/system requirements and branching into a development path for home-made software (functional specifications, design specifications, code development and review, unit and integration testing) and a purchase path for commercial off-the-shelf systems (request for proposal, verification against vendor specifications, finalized requirements, vendor assessment), both converging on Design Qualification (DQ), Installation Qualification (IQ), Operation (OQ/PQ), implementation and use, maintenance, and retirement.]
Activities for a specific validation project should follow a validation project
plan. The plan outlines validation tasks, a time schedule, deliverables and
owners for each deliverable. This validation project plan is derived from a
company or a site validation master plan. Validation summary results are
documented in a validation report.
6.5 Approach for Implementation
Validation of software and computer systems should follow the life cycle
approach. The exact model depends on the system, e.g., whether it is a
commercial or custom built system, or a combination of both. True commercial
systems follow the 4Q model, custom built systems follow the V-model and
combinations of customized commercial systems follow the combined system
development and system integration life cycle as described in 6.4.
7. Validation Steps
Computer system validation can be triggered by two events: 1) A new system
is purchased and 2) An existing system should be brought into compliance.
This could be a system not previously used for regulated or other business
critical applications. This chapter covers the validation of both new and
existing systems. A procedure for the validation of computer systems is
described in Reference 16.1.3. Reference 16.2.1 includes a checklist for
computer validation. Reference 16.3.7 includes a complete validation
example.
7.1 Define System Owner and Project Team
The computer system validation should start when a decision has been made
that there is a need for a new computer system. Steps should include:
Management should identify a system owner.
The system owner, with the help of the steering committee, should
decide whether or not a validation project team should be formed. This is
normally required for networked systems.
If a validation team is needed, the system owner should form a
validation project team. Team members should come from QA and all
departments that will be affected by the system.
7.2 Planning
The system owner should draft a validation project plan. The plan should
include chapters on:
Purpose and scope of the system, what it includes and what it
doesn’t include.
Background.
System description.
References to other documents.
Responsibilities.
Validation approach.
Assumptions, exclusions and limitations.
Risk assessment.
Validation steps.
Configuration management and change control.
Validation deliverables.
Training.
Schedule (Attachment 18.5 includes a template and examples).
For an example of a validation project plan check Reference 17.10.
7.3 Assumptions, Exclusions and Limitations
Any assumptions, exclusions and limitations should be mentioned early in the
project. This is important not only for setting the right expectations for internal
reviews and approvals, but also for internal and external audits. For example,
for purchased systems it should be mentioned that detailed development
documents like design specifications or code reviews are not included in the
validation package, or that functions available on the system but not used are
not tested.
7.4 Setting Specifications
With the support of the validation team the system owner collects inputs from
user departments on the anticipated use of the system and application
requirements, and from QA on up-to-date regulatory requirements. The
system owner drafts the system requirements specifications document. This
document should include sections on:
Background information and description of the process, the
workflow, application problem and the limitations of the current solution.
Description of how the new system can overcome limitations of the
current solution.
Description of the purpose and intended use of the system.
System overview.
Description of the intended environment. This includes location,
operating system, network and type of anticipated users.
User requirements. They include requirements to perform business
tasks, security requirements, regulatory requirements, e.g., 21 CFR Part
11, configuration requirements and requirements for services support, for
example, installation, user training and compliance services.
A procedure for developing specifications for computer systems is described
in Reference 16.1.6. For a template with examples check Reference 16.3.1.
Reference 16.3.3 includes 20 examples for good and bad specifications.
Reference 16.2.3 includes a checklist for user requirement specifications.
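Purely as an illustration of how individual requirements can be captured so that they can later be traced to tests (see section 7.7.5), the Python sketch below uses an assumed record layout with a unique requirement ID, a category and a risk ranking; the binding layout is the Requirement Specifications Table in Attachment 18.6, and the example requirements are hypothetical.

from dataclasses import dataclass

@dataclass
class Requirement:
    """One user/system requirement (illustrative layout, not the binding template)."""
    req_id: str   # unique identifier, e.g., "URS-012"
    text: str     # the requirement statement
    category: str # e.g., business, security, regulatory, configuration, service
    risk: str     # assumed scale: high / medium / low

requirements = [
    Requirement("URS-001", "The system shall record an electronic audit trail for all data changes.",
                "regulatory", "high"),
    Requirement("URS-002", "The system shall restrict access to authorized, authenticated users.",
                "security", "high"),
    Requirement("URS-003", "Reports shall be exportable to PDF.", "business", "low"),
]

# Each requirement ID later anchors one or more test cases in the traceability matrix.
for r in requirements:
    print(f"{r.req_id} [{r.category}/{r.risk}]: {r.text}")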
7.5 Vendor Selection and Assessment
The system owner, with the help of IT, should select one or
more vendors and send the SRS to the selected vendor(s) with a request
to reply within two weeks. Alternatively, the system owner compares the
selected vendors' specifications with the requirement specifications.
The vendor that has the best match with the system requirement
specifications should be selected and assessed.
The system owner together with QA and IT should define the
vendor assessment process.
Alternatives are:
1. Assessment through own experience with the vendor: Experience may come from the product under consideration or from other products. Criteria are the quality of the products (failure rate) and responsiveness in case of errors (phone call, on-site visit, bug fix).
2. Assessment through references outside the company: Useful if there is no experience with the vendor within your company. Criteria are acceptance of the vendor in the marketplace, the image of the vendor as a software supplier and the quality reputation of the product.
3. Checklist / mail audit: Use checklists available within your company, through public organizations, e.g., PDA, and from private authors.
4. Assessment through third-party audits: Gives an independent assessment of the quality system and/or product development.
5. Vendor audit through the user firm: Gives a good picture of the vendor's quality system.
Costs for the assessment increase from 1 to 5. The final assessment
procedure should depend on a risk assessment. Criteria are the product risk
and vendor risk. The system owner should justify and document the selected
procedure.
The system owner, with the help of QA and IT, should perform the
vendor assessment and document the results.
Vendor audits should be documented in detailed reports with a final
rating. Other assessments should be documented in summary reports also
with a final rating. Attachment 18.7 includes a template that can be used to
document the vendor rating.
A procedure for selecting the right software supplier is described in Reference
16.1.25. A procedure for the assessment of software suppliers is described in
Reference 16.1.7. A procedure for auditing software suppliers is described in
Reference 16.1.8. Reference 16.2.2 includes a checklist for vendor
assessment.
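One simple way to document a vendor rating is a weighted score across the assessment criteria listed above; the binding format is the template in Attachment 18.7. The criteria, weights and 1-to-5 scale in the Python sketch below are assumptions chosen for illustration only.

from typing import Dict

# Hypothetical criteria and weights; the authoritative format is Attachment 18.7.
CRITERIA_WEIGHTS: Dict[str, float] = {
    "product quality (failure rate)": 0.30,
    "responsiveness to reported errors": 0.25,
    "quality system and development practices": 0.25,
    "market acceptance and references": 0.20,
}

def vendor_rating(scores: Dict[str, int]) -> float:
    """Weighted rating on an assumed 1 (poor) to 5 (excellent) scale."""
    return sum(weight * scores[criterion] for criterion, weight in CRITERIA_WEIGHTS.items())

scores_vendor_a = {
    "product quality (failure rate)": 4,
    "responsiveness to reported errors": 5,
    "quality system and development practices": 3,
    "market acceptance and references": 4,
}
print(f"Vendor A rating: {vendor_rating(scores_vendor_a):.2f} of 5")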
7.6 Installation
The system owner should coordinate tasks prior to and during the installation
of the computer system. Steps should include:
7.6.1 Before installation
Obtain manufacturer's recommendations for site installation
requirements.
Check the site for the fulfillment of the manufacturer’s
recommendations (utilities such as electricity and environmental
conditions such as humidity, temperature and vibration level).
7.6.2 During installation
Compare computer hardware and software, as received, with
purchase order (including software, accessories and spare parts).
Check documentation for completeness (operating manuals,
maintenance instructions, standard operating procedures for
testing, safety and validation certificates).
Check computer hardware and peripherals for any damage.
Install hardware (computer, peripherals, network devices,
cables).
Install software on the computer’s hard disk following the
manufacturer’s recommendation.
Verify correct software installation, e.g., ensure that all files are
accurately copied onto the computer hard disk. Utilities to do this
should be included in the software itself or should be purchased
separately (a simple checksum comparison is sketched after this list).
Make a back-up copy of software.
Configure network devices and peripherals, e.g. printers and
equipment modules and other parameters.
Identify and make a list with a description of all hardware,
include drawings where appropriate, e.g., for networked data
systems.
Make a list with a description of all software installed on the
computer.
Store configuration settings either electronically or on paper.
List equipment manuals and SOPs.
Prepare an installation report.
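Where no vendor-supplied installation verification utility is available, one possible approach is a file checksum comparison against a reference manifest. The Python sketch below is a minimal illustration assuming a manifest file of "SHA-256-hash  relative-path" lines; it does not replace a vendor utility where one exists, and the paths shown are placeholders.

import hashlib
from pathlib import Path
from typing import List

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hash of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_installation(install_dir: str, manifest_file: str) -> List[str]:
    """Compare installed files against a reference manifest and return any deviations."""
    deviations = []
    root = Path(install_dir)
    for line in Path(manifest_file).read_text().splitlines():
        if not line.strip():
            continue
        expected_hash, relative_path = line.split(maxsplit=1)
        target = root / relative_path
        if not target.is_file():
            deviations.append(f"missing file: {relative_path}")
        elif sha256_of(target) != expected_hash:
            deviations.append(f"hash mismatch: {relative_path}")
    return deviations

# Example use (paths are placeholders):
# for problem in verify_installation("C:/Apps/CDS", "cds_manifest.txt"):
#     print(problem)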
Installation and Installation Qualification (IQ) of larger commercial systems are
normally performed by a supplier's representative. In this case both the
supplier's representative and a representative of the user's firm should sign
off the IQ documents.
A template to document computer systems is included in Reference 16.3.4.
7.7 Testing for Operation
Testing should prove that the system can perform the functions as defined in
the specifications.
7.7.1 Test plan
Testing should follow a test plan. The plan should be developed by the
system owner with the support of the project validation team. It should
include test environment, functions to be tested, extent of testing, test
protocols, test personnel and a timetable. It should also include an
action plan in case test criteria are not met. The test plan should be
reviewed and approved by QA before the tests start.
7.7.2 Type and extent of testing
Functions to be tested and extent of testing depend on:
Criticality of the system based on risk assessment. Criteria are the
impact of the system on (medicinal) product quality and on data
integrity.
Complexity of the system.
Information on test efforts and results from the vendor.
The level of customization as expressed by the GAMP
categories 3 to 5.
For example, for a low-risk system of GAMP category 3 no functional
testing is required. On the other hand, for a custom-built, highly critical
system all functions should be tested. Attachment 18.8 includes a table
with high-level recommendations for the extent of testing for different
risk and GAMP categories; an illustrative lookup is sketched at the end of
this section.
Tests should include:
Functions that are required to perform the application, for
example, to perform a quality control analysis including instrument
control, data acquisition, data processing, reporting, archiving and
retrieval.
Other critical functions, for example, to limit system access or
functions that are required to comply with regulations, such as
electronic audit trail for FDA’s 21 CFR Part 11.
Compatibility of data with previous systems.
Data back-up and restore.
Data archiving and retrieval.
System recovery after a failure.
High load and stress testing, for example, entering an input that
is not accepted by the system.
System tests to make sure that the complete application works
as intended. This kind of application testing is also called PQ
testing.
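The authoritative table is Attachment 18.8. Purely to illustrate the principle of deriving the extent of testing from risk level and GAMP category, the Python sketch below uses an assumed mapping; only the low-risk/category 3 and high-criticality/custom entries reflect the examples given above, and the remaining entries are illustrative assumptions.

# Illustrative mapping only; the authoritative table is Attachment 18.8.
EXTENT_OF_TESTING = {
    # (risk level, GAMP category): recommended extent of functional testing
    ("low", 3):    "no functional testing; rely on vendor testing and IQ",
    ("low", 4):    "test configured functions critical to the application",
    ("low", 5):    "test all custom code paths",
    ("medium", 3): "test key functions used by the application",
    ("medium", 4): "test configured and critical functions",
    ("medium", 5): "test all custom functions plus regression of standard functions",
    ("high", 3):   "test all functions used, including security and audit trail",
    ("high", 4):   "test all configured functions and critical standard functions",
    ("high", 5):   "test all functions",
}

def recommended_testing(risk: str, gamp_category: int) -> str:
    """Look up the illustrative testing recommendation for a system."""
    return EXTENT_OF_TESTING[(risk.lower(), gamp_category)]

print(recommended_testing("low", 3))   # matches the low-risk example in the text
print(recommended_testing("high", 5))  # matches the custom, highly critical example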
7.7.3 Test environment
Testing should be performed under conditions as close as possible to
the live use of the system. If a live environment cannot be used
because interruption of ongoing live applications is not possible, tests
should be performed in a test environment that mirrors the live
environment. The system owner with the support of the project team
should decide which test environment should be used. The decision
should be based on a risk assessment and should be justified and
documented.
7.7.4 Systems with identical configurations
Systems with identical configurations and used in an identical manner
do not require full testing of all software functions for all systems.
However, it is of utmost importance that the systems are identical and
used in the same manner. This includes identical computer hardware
and firmware, the same versions of operating system and application
software and the same configuration settings. Any differences should
be documented and tested for each system. For example, IP addresses
will be different for different clients, so connectivity tests should be
performed for each system. The decision not to test all functions should
be made by the system owner with the help of the validation team. The
decision should be based on a risk analysis and should be justified and
documented.
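When several systems are claimed to be identical, an automated comparison of the recorded configuration items can support the justification and highlight the differences that must be documented and tested per system (such as client IP addresses). The configuration keys and values in the Python sketch below are assumptions.

from typing import Dict, List

def config_differences(reference: Dict[str, str], candidate: Dict[str, str]) -> List[str]:
    """Return human-readable differences between a reference and a candidate configuration."""
    differences = []
    for key in sorted(set(reference) | set(candidate)):
        ref_value = reference.get(key, "<missing>")
        cand_value = candidate.get(key, "<missing>")
        if ref_value != cand_value:
            differences.append(f"{key}: reference={ref_value!r}, candidate={cand_value!r}")
    return differences

# Illustrative configuration records; keys and values are assumptions.
reference_system = {
    "application_version": "3.2.1",
    "operating_system": "Windows 7 SP1",
    "firmware": "1.07",
    "audit_trail_enabled": "yes",
    "ip_address": "10.0.0.11",
}
candidate_system = dict(reference_system, ip_address="10.0.0.12")

for diff in config_differences(reference_system, candidate_system):
    print(diff)  # expected: only the IP address differs -> connectivity test per system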
7.7.5 Test traceability
Tests should be linked to system requirement specifications. Normally
one test is required for each specification. It can also happen that one
test case serves two or more specifications or that several tests are
required for one specification. If functions are not tested, the reason for
such an omission should be documented, for example: “Function has
been tested by vendor and is not impacted by the user’s environment”.
Attachment 18.9 includes a template for a traceability matrix. The
matrix can be documented in paper format but for larger projects it is
recommended to use electronic document management systems. This
can range from simple Word tables to databases and software
specifically developed for managing traceability matrices.
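As a minimal illustration of keeping the requirement-to-test links machine-checkable, the Python sketch below maps requirement IDs to test case IDs and flags any requirement that has neither a test case nor a documented omission; the identifiers and justification text are assumptions, and the binding format remains the matrix template in Attachment 18.9.

from typing import Dict, List

# Illustrative traceability data; requirement and test IDs are assumptions.
traceability: Dict[str, List[str]] = {
    "URS-001": ["TC-010", "TC-011"],  # several tests for one requirement
    "URS-002": ["TC-020"],
    "URS-003": [],                    # not tested -> must carry a documented justification
}
omission_justifications = {
    "URS-003": "Function tested by vendor and not impacted by the user's environment.",
}

def untraced_requirements(matrix: Dict[str, List[str]], justifications: Dict[str, str]) -> List[str]:
    """Return requirements that have neither a test case nor a documented omission."""
    return [req for req, tests in matrix.items() if not tests and req not in justifications]

gaps = untraced_requirements(traceability, omission_justifications)
print("Traceability gaps:", gaps or "none")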
7.7.6 Test data sets and procedures for ongoing regression testing
During initial testing, procedures and test data sets should be developed that
can be executed on an ongoing basis or after system changes. This
can be a set of data that is reprocessed under normal and high-load
conditions, initially and whenever there is a need for retesting. This
type of testing is called regression testing. After successful execution
this test proves that the complete system performs key functions as
intended.
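A regression test set can be as simple as reprocessing a frozen input data set and comparing key results against previously accepted reference values within a defined tolerance. The Python sketch below illustrates the idea; the result names, reference values and the 1% tolerance are assumptions, and the reprocessing step is a placeholder for the real system function.

# Illustrative regression check; the reference results would come from the
# initially validated run and be kept under configuration management.
REFERENCE_RESULTS = {"peak_area": 15234.7, "retention_time_min": 4.82, "assay_percent": 99.6}
TOLERANCE = 0.01  # assumed relative tolerance of 1%

def reprocess(dataset_path: str) -> dict:
    """Placeholder for the real reprocessing step performed by the system under test."""
    return {"peak_area": 15236.1, "retention_time_min": 4.82, "assay_percent": 99.5}

def regression_deviations(results: dict) -> list:
    """Compare reprocessed results against the reference set within the tolerance."""
    deviations = []
    for name, expected in REFERENCE_RESULTS.items():
        actual = results[name]
        if abs(actual - expected) > TOLERANCE * abs(expected):
            deviations.append(f"{name}: expected {expected}, got {actual}")
    return deviations

deviations = regression_deviations(reprocess("regression_dataset_01"))
print("Regression test:", "PASS" if not deviations else f"FAIL {deviations}")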
7.7.7 Ongoing tests (PQ)
After the system is installed and tested for initial operation the system
performance should be verified on an ongoing basis. This is normally
called PQ testing. The type and extent of testing depends on the
criticality and stability of the system.
As a minimum regression tests as developed in section 7.7.6
should be performed every three months.
Additional tests should be developed and executed if there is
any indication that the performance of the system or any subsystem
can deteriorate over time.
7.7.8 Documentation and review of testing
Tests should be documented with a unique test number, the related
specification, test purpose, test environment, expected results,
acceptance criteria, the criticality of the test or function to be tested as
defined by the test personnel and the name and signature of the test
person.
In some cases documentation should include evidence that the tests
have been performed. This could be, for example, screen captures or
print outs of test results. Such evidence should be available for highly
critical functions and the need for the evidence should be defined in the
test protocol.
A summary and conclusions document should be written, reviewed for
technical correctness together with the test protocols and supporting
reference material, and signed by the system owner or one of his/her
delegates. QA should review and sign the test set for compliance with
internal procedures. Based on the summary and conclusions the validation
team should evaluate whether or not the process should proceed.
Attachment 18.10 includes an example for a test protocol. A procedure
for developing test scripts is described in Reference 16.1.9.
Reference 16.3.5 includes templates and examples for functional
testing, including a test traceability matrix, test protocols and test
summary sheets.
7.7.9 Handling deviations
The system owner with the support of the validation team should
decide how deviations are handled. Most important is a decision
whether or not the system can be used without any modification.
Such a decision should be based on risk assessment and justified and
documented. All deviations should be recorded and a corrective action
plan initiated. A procedure for handling deviations is described in
Reference 16.1.10
7.7.10 Qualification of test personnel
Tests should be performed by the supplier or users of the system. In
some cases temporary personnel are hired for testing. Test personal
should be qualified for the assigned task and the qualification should be
documented. As a minimum test personnel should have a good
understanding on:
The system to be tested.
The test philosophy.
The application.
The purpose of the system.
How to use the test protocol.
How to document test results and supporting information such
as plots and screens.
How to handle deviations of actual results from previously
specified acceptance criteria.
The regulated environment, e.g., GMP, GLP.
7.8 Revalidation
Computer systems should be revalidated to maintain the validation status
during the entire life of the system. Revalidation is either time based or event
driven:
7.8.1 Time based
Computer systems should be regularly revalidated. The type and
frequency of revalidation depend on system criticality and stability.
Systems supporting highly critical applications should undergo
full revalidation after one year. Test procedures should be the same
as for initial validation.
Systems supporting medium-critical applications should be
reviewed to verify that the actual configuration complies with the
documentation and that ongoing tests comply with the test plans. If the
evaluation findings meet the acceptance criteria, no revalidation is required.
Systems supporting low-critical applications do not need
revalidation.
Time-based revalidation can be omitted if the system has been
revalidated for other reasons, for example after changes.
7.8.2 Event driven
Event-driven revalidation is mostly triggered by changes to
hardware, software or accessories. Any change to the system should
include an assessment of what type of revalidation is required.
Systems should be revalidated after installation of new versions of
software. Functions that are new or have been changed should be
validated. In addition, a regression test should be performed to verify
correct functioning of the complete system.
The detailed evaluation and final decision on the type and extent of
revalidation should be made by the system owner, supported by IT.
The decision on what and how to revalidate should be based on a risk
assessment and should be justified and documented. Criteria for the
extent of revalidation are the criticality of the system and the type of
change.
A procedure for revalidation of software and computer systems is
described in Reference 16.1.17. Reference 16.2.6 includes a checklist
for revalidation.
7.9 Existing Systems
Validation of existing systems should follow the same principles as new
systems with some exceptions:
User requirements should be written based on current use.
Vendor qualification can be replaced by a well-documented history
of the system. Such information should include test documentation,
change control logs, service logs and experience of users with the system.
If there is no documented evidence that the system delivers reliable
and accurate results, a test plan should be developed for functional and
system testing.
A procedure for validation of existing software and computer
systems is described in Reference 16.1.23. Reference 16.2.6 includes a
checklist for the validation of existing systems.
7.10 Validation Report
When the validation project is completed, a validation summary report should
be generated by the system owner. The report documents the outcome of the
validation project. The validation report should mirror the validation project
plan and should include:
A brief description of the system.
Identification of the system and all software versions that were
tested.
Description of hardware used.
Major project activities.
Listing of test protocols, test results and conclusions.
Statement on system status prior to release.
List of all major or critical issues and deviations with risk
assessment and corrective actions.
Statement that all tasks have been performed as defined in the
project plan.
Statement that validation has been performed according to the
documented procedures.
Listing of all deliverables.
Final approval or rejection statement.
The validation report should be reviewed, approved and signed by QA and the
system owner.
8. Approach for Networks and Networked Systems
Infrastructure supporting regulated or business critical applications should be
formally qualified before the application is installed on the network. Once the
network is qualified the application is installed and validated following the approach
in 7. Network qualification should include:
Specifying network requirements.
Specifications should include: network devices, software, computer hardware,
computer peripherals and cables. Specifications are based on anticipated
current and future use of the network.
Developing a network infrastructure plan.
Designing network infrastructure and drawings.
Selecting equipment and vendors for computers, Network Operating
Systems (NOS), network devices etc.
Ordering equipment: computer hardware, software (OS, NOS), network
devices, peripherals etc.
Installing all hardware devices according to design drawings and vendor
documentation.
Performing self-diagnostics and documenting hardware installation and
settings (this completes the IQ part).
Documenting the above as a network baseline.
Making a back-up of installed software and network configurations.
Whatever happens, it should be possible to return to this point.
Testing communication between networked computers and peripherals,
and access control including remote access control (only for networks
supporting medium and high risk applications).
Developing and implementing rigorous configuration management and
change control procedures for all network hardware and software. This should
also include updates of system drawings and other documentation if there are
any changes.
Before applying any system changes to a production environment, the
correct function should be verified in a test environment to ensure that the
change does not impact the intended functionality of the system.
Monitoring ongoing network traffic using network health monitoring
software (only for networks supporting high risk applications).
Verifying file transfer accuracy before and during ongoing use, for example
by comparing checksums of source and destination files (see the sketch after this list).
More details on network qualification can be found in Reference 17.11.
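A common way to verify file transfer accuracy is to compare checksums of the source and destination files. The following Python sketch illustrates the idea; the file paths are placeholders, and the check is not part of the referenced network qualification procedure.

```python
import hashlib

def sha256_of(path, chunk_size=65536):
    """Compute the SHA-256 checksum of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Placeholder paths for the file before and after transfer across the network.
source = "source_server/data/batch_001.csv"
destination = "target_server/data/batch_001.csv"

if sha256_of(source) == sha256_of(destination):
    print("File transfer verified: checksums match.")
else:
    print("File transfer FAILED: checksums differ; investigate and retransfer.")
```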
9. Approach for Spreadsheet Applications
Excel spreadsheets are software and should be validated. Excel without a VBA
script is an example of configurable software, and Excel with a VBA script is an
example of custom-built code.
Validation of such software applications should follow the V-model life cycle or the
relevant part of the combined system development/system integration life cycle model
described in section 6.4. This chapter gives general guidelines for spreadsheets as well as
specific recommendations for development and validation and for ensuring
spreadsheet integrity.
9.1 General Guidelines
All spreadsheets used in regulated environments should be
validated, regardless of whether they have been developed for single or
multiple users and regardless of whether they are used once or on multiple
occasions.
Development, validation and use should follow a documented
procedure.
QA should create and maintain an inventory list with all
spreadsheets used in the department.
Spreadsheets should be designed for ease of use and to minimize
operator errors.
Spreadsheets should be designed and used to ensure their
integrity.
9.2 Design for Integrity
Access to spreadsheet programs should be limited to authorized
persons. Be aware that passwords included in some spreadsheet
programs to access workbooks, worksheets and cells are not really
secure. Access to the spreadsheet should therefore be restricted at the
operating system level.
Spreadsheets should be stored in write-protected directories.
The file location of the spreadsheet should be documented together
with the output data.
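One practical way to confirm that a released spreadsheet has not been altered is to record a checksum of the released file and compare it before use. The sketch below is illustrative; the file name and the documented reference checksum are assumptions.

```python
import hashlib

def file_checksum(path):
    """Return the SHA-256 checksum of a file, read in binary mode."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical released spreadsheet and the checksum documented at release.
released_file = "validated/impurity_calculation_v1.2.xlsx"
documented_checksum = "<checksum recorded in the release documentation>"

if file_checksum(released_file) == documented_checksum:
    print("Spreadsheet matches the released, validated version.")
else:
    print("Spreadsheet differs from the released version: do not use; notify the system owner.")
```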
9.3 Development and Validation
Development and validation of spreadsheets should follow a standard
operating procedure.
A user drafts a proposal for a new spreadsheet. The proposal
should include a description of the problem that the spreadsheet should
solve, how it is handled now and how the spreadsheet can improve
efficiency.
The system owner writes a project plan.
The system owner collects inputs from anticipated users on
requirement specifications and writes requirement specifications.
The programmer defines and documents required functions.
Functions are reviewed by users.
The programmer develops design specifications, for example, which
formulas are used and the location of input/output cells. For complex
spreadsheets and for spreadsheets with VBA scripts the design
specifications are reviewed by peers of the programmer.
The programmer develops the worksheet and performs functional
tests. For spreadsheets with VBA scripts the code is reviewed by peers of
the programmer (structural testing).
The programmer writes a user manual.
The system owner develops a test protocol for users.
Users load the spreadsheet onto their computer.
Users test the spreadsheet and document the results.
The system owner develops a validation package.
QA reviews and approves the package.
The system owner releases the package.
A detailed procedure for the validation of spreadsheets is described in
Reference 16.1.4.
10. Risk Assessment
Risk assessment should be applied for all computer validation activities, for
example, type of vendor assessment and extent of initial and ongoing testing. Risk-
based validation is supported by regulatory agencies and should help to reduce
overall validation costs and/or increase system uptime by focusing resources and
efforts on high risk systems. The principle is that problems are identified and
mitigated before they occur. The risk level of a system depends on the number and
criticality of records generated and/or processed by the system. Typically risk
categories are defined as high, medium and low. Steps for risk assessment are:
1. Developing a risk management master plan.
2. Developing a procedure for risk assessment, mitigation and control.
3. Developing a risk management project plan using the risk master plan as a
framework.
4. Determining risk levels for the system, e.g., high, medium, low. Criteria for
risk levels are impact on product quality and business continuity.
5. For high and medium risk systems, identifying critical system functions.
These are functions that have a high compliance or business impact. Criteria
are the severity of a potential problem, the likelihood that it occurs and its
detectability (see the sketch after this list).
6. Mitigating risks as identified for those functions, for example: through testing,
through increasing the level of detectability or through availability of
redundant modules or systems.
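As an illustration of how the criteria in steps 4 and 5 can be combined, severity, likelihood and detectability are often rated on simple scales and merged into a single score that determines the risk level. The scales and thresholds in the sketch below are assumptions for illustration, not values prescribed by this master plan.

```python
# Illustrative risk scoring; the 1-3 scales and thresholds are assumptions,
# not values prescribed by this master plan.

def risk_score(severity, likelihood, detectability):
    """Combine severity, likelihood and detectability (each rated 1 = low to 3 = high).
    Poor detectability increases the score, so the detectability rating is inverted."""
    return severity * likelihood * (4 - detectability)

def risk_level(score):
    if score >= 12:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

score = risk_score(severity=3, likelihood=2, detectability=1)
print(f"Score: {score}, risk level: {risk_level(score)}")  # Score: 18, risk level: high
```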
The system owner should trigger the risk management process during the various
validation steps.
The process is described in the risk management master plan in Reference 17.1.
Reference 16.1.2 describes a procedure for risk assessment for systems used in
regulated environments. Reference 16.1.5 includes a procedure for risk-based
validation of software and computer systems.
11. Configuration Management and Change Control
The purpose of configuration management is to know the composition of the
system during its entire life from planning to retirement. The configuration of a
system should be well documented and changes should be authorized,
implemented and documented. Configuration management includes two steps:
initial set-up and change control.
11.1 Initial Set-up
Once a computer system is installed, the initial set-up of all configuration
items should be documented. Configuration items include:
Computer hardware, e.g., supplier, model.
Computer firmware, e.g., revision number.
Operating system: supplier, product identifier and version.
Application software: supplier, product identifier and version.
Hardware peripherals, e.g., printers, CD ROMS.
Network hardware, firmware, software and cables.
Documentation, e.g., validation plan, operating manuals and
specifications.
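The configuration items listed above can be captured as a simple structured baseline record that is then kept under change control. The sketch below shows one possible layout; all suppliers, models and versions are examples, not prescribed values.

```python
import json
from datetime import date

# Illustrative configuration baseline; all suppliers, models and versions are examples.
baseline = {
    "system_id": "RV3212",
    "recorded_on": str(date.today()),
    "computer_hardware": {"supplier": "Example Corp", "model": "Model X"},
    "firmware_revision": "2.1",
    "operating_system": {"supplier": "Example OS Vendor", "product": "Example OS", "version": "10.4"},
    "application_software": {"supplier": "Example Vendor", "product": "Example LIMS", "version": "3.2"},
    "hardware_peripherals": ["printer", "CD-ROM drive"],
    "network": {"hardware": "switch model Y", "firmware": "1.8", "cabling": "Cat 6"},
    "documentation": ["validation plan", "operating manual", "specifications"],
}

# The baseline would be stored as a controlled document, e.g. with the IQ records.
print(json.dumps(baseline, indent=2))
```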
11.2 Change Control
Change control should be carried out during all phases of system design,
development and use. It applies to all configuration items as defined in the
initial set-up. Information on change control should include:
System ID and location.
Persons who initiated, approved and implemented the change.
Description of the change, including the reason for the change and
the business benefit.
Priority.
Expected impact on validation.
Date of implementation.
Other important points are:
Changes are managed by the system owner.
Change control procedures should be able to handle planned and
unplanned changes. An example of an unplanned change is replacing a
defective hard disk with a new one.
Change control should always include a risk assessment on how
the change may impact system performance.
All changes should be recorded in a change control history log
document.
Attachment 18.11 has a template for a change request form and Attachment
18.12 has a template for a change release form.
Reference 16.1.16 describes a procedure for change control of software and
computer systems. The SOP in Reference 16.1.2 describes configuration
management and version control of software.
12. Maintenance and Support
12.1 Preventive Maintenance
Preventive maintenance should ensure smooth and reliable operation on a
day-to-day basis. Activities should include:
Regular removal of temporary files from the hard disk. A hard disk
should not be filled to more than 80% of its capacity (a simple automated
check is sketched after this list).
Regular virus checks of systems that are connected to a network
and/or to the Internet.
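The 80% limit mentioned above can be monitored automatically. The sketch below uses Python's standard library to check disk usage against that threshold; the path and the form of alerting are placeholders.

```python
import shutil

# Check that the data disk is not filled beyond 80% of its capacity.
# The path is a placeholder for the drive or directory holding system data.
usage = shutil.disk_usage("/")
percent_used = usage.used / usage.total * 100

if percent_used > 80:
    print(f"WARNING: disk {percent_used:.1f}% full; schedule removal of temporary files.")
else:
    print(f"Disk usage OK: {percent_used:.1f}% of capacity used.")
```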
12.2 Back-up and Restore
Operating software, application software, configuration settings and data
should be backed up on external media to ensure access if on-line records
are lost either through accidental deletion or equipment problems. Back-up
should be performed:
For software: before installation of any new revision.
For configuration settings: after initial configuration set-up and
whenever configurations are changed.
For data: frequency should be based on risk assessment. The
frequency should be justified and documented by the system owner with
the support of IT. Criticality of data and stability of the system are the
primary parameters for the back-up frequency.
The system owner with the help of IT also determines the back-up strategy,
for example, full back-up vs. incremental back-up.
Back-up and restore procedures should be validated as part of an initial and
ongoing validation program.
Reference 16.1.11 describes a procedure for back-up and restore of
electronic records.
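Whatever strategy is chosen, the restore path should be verified, not just the back-up itself. The sketch below compares checksums of the original files with the files restored from back-up; the directory names are placeholders.

```python
import hashlib
from pathlib import Path

def checksums(directory):
    """Map each file's relative path to its SHA-256 checksum."""
    base = Path(directory)
    return {str(p.relative_to(base)): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in base.rglob("*") if p.is_file()}

# Placeholder directories for the live data and the data restored from back-up.
original = checksums("data/production")
restored = checksums("data/restored_from_backup")

if original == restored:
    print("Restore verified: all files present and identical.")
else:
    missing = sorted(set(original) - set(restored))
    changed = sorted(p for p in original if p in restored and original[p] != restored[p])
    print(f"Restore check FAILED: missing {missing}, changed {changed}")
```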
12.3 Archiving
Data generated by the computer system should be regularly removed from the
system’s hard disk(s) and archived to avoid overloading the hard disk(s) and
loss of data.
The system owner with the support of IT and QA should develop an archiving
strategy for each system. It is important to define the type of data to be
archived, the media for archiving, archiving format and timing for archiving.
Archiving and retrieval procedures should be validated as part of the initial
and ongoing validation program.
References 16.1.13 and 16.1.18 describe procedures for retention,
archiving and retrieval of electronic records.
12.4 Contingency Planning and Disaster Recovery
Contingency planning and disaster recovery are important to continue
business and to ensure access to records in case of internal or external
adverse events.
A contingency and disaster recovery strategy should be defined by
the system owner. This should be based on a risk assessment and should
be justified and documented.
Contingency planning and disaster recovery procedures should be
validated as part of an initial and ongoing validation program. Reference
16.1.12 describes a procedure for disaster recovery of computer
systems. Reference 16.1.23 describes a procedure for handling contingency
situations for computer systems. Reference 16.2.10 includes a checklist for
contingency and disaster recovery planning.
12.5 Security and User Administration
System security is important to ensure confidentiality, authenticity and integrity
of data.
The system owner with the support of IT should develop a security
plan for each system. The corporate security master plan should be used
as a guideline to define security controls for each system. For example,
password conventions and administration should be implemented as
defined in the security master plan.
The security strategy depends on the criticality of the system and
the system functions. The decision should be based on a risk assessment
and should be justified and documented. Alternatives are physical security
and logical security to limit access to the systems.
The system owner with the help of IT should decide on the sign-on
procedure, e.g., single sign-on for the operating system and applications,
or separate user IDs and passwords for each.
The system owner should define access rights for records and data
management activities, e.g., no access, read-only access, read and write
access, or the right to create and delete records (see the sketch at the end
of this section).
IT should implement technical controls to ensure proper operation
as defined by the system owner.
The system owner with the help of IT should also define security
measures during ongoing use of the system, e.g., when the user walks
away from the system.
Correct functionality to control limited access should be validated.
Reference 16.1.14 describes a procedure for access control to computer
systems and data. Reference 16.3.6 includes a test protocol for validation of
access control to computer systems.
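The access rights described above can be expressed as a simple role-to-permission mapping that is then implemented in the system and challenged during testing. The roles, users and permissions in the sketch below are illustrative assumptions, not a prescribed scheme.

```python
# Illustrative role-based access mapping; roles, users and permissions are examples only.
PERMISSIONS = {
    "no_access": set(),
    "read_only": {"read"},
    "read_write": {"read", "write"},
    "records_admin": {"read", "write", "create", "delete"},
}

ROLE_OF_USER = {                 # assigned and approved by the system owner
    "analyst_01": "read_write",
    "reviewer_02": "read_only",
    "archivist_03": "records_admin",
}

def is_allowed(user, action):
    """Check whether a user's role permits the requested data management action."""
    role = ROLE_OF_USER.get(user, "no_access")
    return action in PERMISSIONS[role]

print(is_allowed("reviewer_02", "write"))    # False
print(is_allowed("archivist_03", "delete"))  # True
```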
12.6 Problem Handling
Operators of software and computer systems should have a clear procedure for
handling problems with computer systems during operation. Some
examples are:
examples are:
Operation problems that cannot be resolved by the operator.
Software errors.
Hardware errors.
Network errors.
Problems should be documented, reported and verified, and a corrective
action plan should be initiated to solve the problem quickly or to develop a
workaround solution. Problem handling should follow the SOP: Handling of
Problems with Software and Computer Systems.
13. System Retirement
Retirement of computer systems should be thoroughly planned and implemented.
Most important is to ensure that data created and/or processed on the system are
readily available on the new systems in a form required by regulations and
business standards. Tasks are managed by the system owner with the support of
IT. Tasks include:
Initiating the retirement process. Attachment 18.13 includes a
retirement request form.
Drafting a retirement plan.
Collecting and reviewing all system documentation that may be
necessary to demonstrate compliance of the system.
Developing a plan to migrate critical data to the new system and
verifying that the data can be retrieved on the new system in the same way
as on the existing one.
Documenting the latest configuration settings.
Deleting all data from the hard disk of the existing system.
Taking the system out of service.
Reference 16.1.20 describes a procedure for retirement of computer systems.
14. Periodic Reviews and Auditing
Computer systems should be regularly reviewed and included in the department’s
audit schedule.
14.1 Reviews
The purpose of the review is to verify that the actual system is
identical to the current documentation and that a system, once validated,
remains in a validated state. Differences between the documentation and the
actual system should be identified and documented, the impact on the
validation status evaluated, and corrective actions initiated if
necessary.
Reviews should be regularly scheduled and conducted once a year
by QA and the system owner.
Special focus should be on changes such as checking for software
revisions of the operating system and application software, and on
changes of configuration settings and documents.
The review should also check whether scheduled performance tests have
been performed and, in case of deviations, whether corrective actions have
been planned and implemented.
Reference 16.1.21 describes a procedure for periodic evaluation and review
of computer systems. Reference 16.2.7 includes a checklist for periodic
evaluation and review.
14.2 Auditing
Computer systems should be audited as part of the regular department or site
audit schedule. Focus of the audit should be on:
User training: Is the training material current, are all users properly
trained and is the training documented?
Procedures: Are they available at the work place, are they current
and followed?
Operating manuals: Are they current?
Security policies: Are they followed?
Back-up: Has data back-up been regularly performed according to
the back-up schedule?
Audits should be conducted by an audit team from outside the department.
Deviations found during the audit should be documented and a corrective
action plan developed and implemented.
Reference 16.1.22 describes a procedure for auditing computer systems.
15. Communication and Training
Activities related to computer validation should be communicated within the company,
and users of computer systems and validation personnel should be qualified for
their assigned tasks.
The benefits of validation should be shared as well as any successes
during the validation process.
All employees involved in computer validation projects should be well
trained to ensure high success rates of validation activities.
All users of computer systems should be trained on the systems they use
to ensure that the systems are operated by qualified personnel who have the
knowledge required to operate them efficiently.
Trainees include system developers, end users, IT personnel, QA, internal
auditors, validation personnel and document control personnel. The system owner
should develop a training plan using the guidelines outlined in the company’s training
master plan (17.5). Reference 16.1.1 describes a procedure for GxP and computer
system validation training.
15.1 Reference Papers and Industry Standards
Reference papers and industry standards help to better understand specific
requirements for selected applications. Examples are:
1. GAMP Good Automated Manufacturing Practice, Guide for Validation of
Automated Systems in Pharmaceutical Manufacture, Version 3, March
1998, Version 4, December 2001, Version 5 - A Risk-Based Approach to
Compliant GxP Computerized Systems, March 2008, www.ispe.org.
15.2 FDA and Other Regulations and Guidelines
Regulations are important to help understand and interpret requirements.
The most important US regulations are 21 CFR Part 11, the FDA’s regulation
on electronic records and signatures and predicate rules such as Good
Manufacturing Practices, Good Clinical Practices and Good Laboratory
Practices. Also important are guidance documents such as the Part 11
validation guidance and a guidance titled “Using Electronic Means to
Distribute Certain Product Information”.
21 CFR Part 11 – Electronic Records and Signatures.
Part 11 Industry Guide: Part 11 Scope and Applications.
21 CFR Part 211 – DRUG GMPs.
21 CFR Part 58 – Good Laboratory Practices.
FDA Draft Guidance: 21 CFR Part 11; Electronic Records;
Electronic Signatures, Validation.
The most important regulatory documents in Europe are the EU GMP and GLP
directives for Good Manufacturing Practices and Good Laboratory Practices.
Also important are the annexes to the EU directives and related guidance documents.
Annex 11 of the European GMP directive: Computerized Systems.
A new version was published in January 2011.
Pharmaceutical Inspection Convention, January 2002, Good
Practices for Computerised Systems in Regulated “GxP” Environments.
16. Reference Documentation and Validation Deliverables
This chapter discusses which reference documents should be available to prepare
a validation plan and to perform the validation tasks and which deliverables should
be prepared during and after validation.
16.1 Standard Operating Procedures
Validation activities should be performed according to written procedures.
Generic procedures should be taken from the corporate SOP list. System
specific procedures should be developed for the system to be validated.
Procedures should be available under the same or a similar title as follows:
1. Training for GxP, 21 CFR Part 11 and Computer Validation (S-125).
2. Risk Assessment for Systems Used in GxP Environments (S-134).
3. Validation of Commercial Off-the-Shelf (COTS) Computer Systems (S-
271).
4. Validation of Macro Programs and Other Application Software (S-263).
5. Risk-Based Validation of Computer Systems (S-252).
6. Development of User Requirement Specifications for Computers (S-253).
7. Quality Assessment of Software and Computer System Suppliers (S-274).
8. Auditing Software Suppliers: Preparation, Conduct, Follow-up (S-273).
9. Development and Maintenance of Test Scripts for Equipment Hardware,
Software and Systems (S-237).
10. Handling Deviations during Equipment and Computer System Testing
(S-238).
11. Data Back-Up and Restore (S-317).
12. Disaster Recovery of Computer Systems (S-319).
13. Archiving and Retrieval of GMP Data and Other Documents (S-162).
14. Access Control to Computer Systems and Data (S-320).
15. Configuration Management and Version Control of Software (S-259).
16. Change Control of Software and Computer Systems (S-262).
17. Revalidation of Software and Computer Systems (S-260).
18. Retention and Archiving of Electronic Records (S-315).
19. Qualification of PC Clients (S-289).
20. Retirement of Computer Systems (S-261).
21. Periodic Evaluation and Review of Computerized Systems (S-258).
22. Auditing Computer Systems (S-272).
23. Handling Contingency Situations for Computer Systems (S-318).
24. Responsibilities for Computer System Validation (S-277).
25. Selecting the Right Software and Equipment Supplier for Compliance
(S-251-02).
16.2 Validation Deliverables
Validation deliverables should include the project plan, project schedule,
requirement specifications, validation protocols, risk assessment and
strategies for contingency planning, data back-up and system security. The
deliverables should be summarized in a table. Attachment 18.14 includes a
template for a summary table.
17. References
1. 21 CFR Part 11
2. Security Master Plan.
3. Training Master Plan.
4. GAMP Good Automated Manufacturing Practice, Guide for Validation of
Automated Systems in Pharmaceutical Manufacture, Version 3, March
1998, Version 4, December 2001, Version 5 - A Risk-Based Approach to
Compliant GxP Computerized Systems, March 2008, www.ispe.org
18. Attachments
18.1 Attachment - Computer System Validation Policy
1) Computer validation is not only important for our organization to comply
with regulations but also for other business reasons, such as increased
system uptime during operation.
2) All systems used in regulated environments should be validated. The extent
of validation depends on the risk the system poses to product quality and
business continuity. It also depends on the complexity of the system and the
level of customization. The extent of validation should be assessed for each
system based on the criteria mentioned above.
3) Systems used in non-regulated environments should be validated based on
the business criticality of the systems.
4) To most effectively achieve the requirements laid out in items 2 and 3, the
following supporting activities are required:
Management supports computer system validation as a business-
critical activity.
Management nominates a computer system validation steering
committee.
The steering committee supported by management communicates
advantages of computer system validation for business reasons other than
for compliance, for example, higher system uptime.
The steering committee provides procedures and templates to
ensure effective and consistent implementation across the organization.
For individual validation projects the concept of risk-based
validation will be applied to reduce overall efforts.
The progress of the validation program will be shared with
employees and management and any successes should be celebrated.
18.2 Attachment - Members of Computer Validation Steering Committee
Function | Name | Office Phone | Cell Phone | Home Phone | E-mail
Project Sponsor
Committee
Leader
Back-up
Quality
Assurance
Back-up
Consultant
Back-up
Regulatory
Affairs
Back-up
Information
Technology
Back-up
Manufacturing
Back-up
Laboratory
Back-up
Documentation
Back-up
Vendor
Representative
Back-up
18.3 Attachment - Members of Computer Validation Project Team
Function | Name | Office Phone | Cell Phone | Home Phone | E-mail
Team Leader
Back-up
Quality
Assurance
Back-up
Information
Technology
Back-up
Operation
Back-up
Documentation
Back-up
Plant
Maintenance
Back-up
Vendor
Representative
Back-up
Consultant
Back-up
18.4 Attachment - List with Computer Systems for Validation
ID/Asset Number | Description | Location | Application | GxP | Risk (h/m/l) | Contact | Time Frame for Validation
RV3212 | Document Management System | G4 West1 | Training Tracking TN 432 123 | Yes | m | Bill Hinch | Jan – April 2006
h = high risk
m = medium risk
l = low risk
18.5 Attachment - Validation Project Schedule
Task | Owner | Due Date | Actual Date | Initials
Initiate the project
Approve project by management
Define a system owner
Form a project team
Develop project plan
Develop training material for test personnel
and users
Develop and approve requirement
specifications
Develop risk analysis and assessment
Assess the vendor and document the
outcome
Develop a test plan with traceability matrix
Develop test protocols for functional testing
Develop installation procedure
Train users and test personnel
Install the system and create IQ protocol
Perform functional tests and create OQ
protocol
Develop tests for ongoing evaluation
Develop back-up strategy
Develop archiving strategy
Develop security strategy and tests
Test functions to limit access to the system,
to control functions and data
Develop validation report
Get project approval
18.6 Attachment - Requirement Specifications Table
System Owner/Author:
Department:
Date:
System ID:
System Location:
Specifications Number/Identifier | Requirement | Priority: must / want / nice to have
(The specifications number/identifier should be used later on as the link to the test case.)
18.7 Attachment - Vendor Rating
Rating Meaning Interpretation
3 Excellent Vendor procedures and practices are above
average.
2 Adequate Vendor procedures and practices are about average.
1 Poor Vendor procedures and practices are below average
and need to be improved.
0 Unsatisfactory Vendor procedures and practices are unacceptable.
18.8 Attachment - Extent of Testing for Different Risk Levels
Validation Steps – Functional Testing
High Risk
GAMP 3: Test critical functions. Link tests to requirements.
GAMP 4: Test critical standard functions. Test all non-standard functions. Link tests to requirements.
GAMP 5: Test critical standard functions. Test all non-standard functions. Link tests to requirements.
Medium Risk
GAMP 3: Test critical functions. Link tests to requirements.
GAMP 4: Test all critical standard and non-standard functions.
GAMP 5: Test critical standard functions. Test all non-standard functions. Link tests to requirements.
Low Risk
GAMP 3: No testing.
GAMP 4: Test critical non-standard functions.
GAMP 5: Test critical non-standard functions.
18.9 Attachment - Template for a Test Traceability Matrix
Requirement Number | Requirement | Test ID
1.1 | Example 1 | 4.1, 4.3
1.2 | Example 2 | 1.2
1.3 | Example 3 | Not tested (1)
1.4 | Example 4 | 3.3, 4.1
(1) Function tested by vendor and is not impacted by our environment.
18.10 Attachment - Example for a Test Protocol
Test number:
Specification:
Purpose of test:
Test environment (PC hardware, peripherals, interfaces, operating system,
Excel version, service pack):
Test execution:
Step 1:
Step 2:
Step 3:
Expected result:
Acceptance criterion:
Actual result:
Comment:
Criticality of test:
Low O Medium O High O
Test person printed name: ______________
Signature: ______________
Date: ______________
Reviewer printed name: ______________
Signature: ______________
Date: ______________
18.11 Attachment - Change Request Form
Form ID: Change ID: Item ID:
Item Location:
Change Initiator: Enter name. Date of request.
Description of Change: Enter a summary and a reason for the change and
the business benefit.
Change Priority: High O Medium O Low O
Latest Acceptable Date: Only necessary if the change is time critical.
Risk Assessment: Risk:
Likelihood:
Severity:
Recovery:
Test Plan: Describe test efforts.
(Validation Group)
Regulatory Notification Yes O No O
Required: (Done by QA)
Change Approval: Accepted O Rejected O
Comments or reasons for rejection:
Signatures: Name: Signature: Date:
Functional Mgt.
Change Adv. Board
QA Mgt.
18.12 Attachment - Change Release Form
Form ID: Change ID: Item ID:
Item Location:
Change Initiator: Enter name. Date of request.
Description of Change: Enter a summary and a reason for the change and the
business benefit.
Change Priority: High O Medium O Low O
Latest Acceptable Date: Only necessary if the change is time critical.
Risk Assessment: Risk:
Likelihood:
Severity:
Recovery:
Test Plan: Describe test efforts.
(Validation Group)
Regulatory Notification Yes O No O
Required: (Done by QA)
Change Approval: Accepted O Rejected O
Comments or reasons for rejection:
Signatures: Name: Signature: Date:
Functional Mgt.
Change Adv. Board
QA Mgt.
18.13 Attachment - Retirement Request Form
Requester:
Reason for retirement:
Data created or processed
on the system:
If the system is replaced by a
new one, describe the new
system and how it compares
with the existing one:
How will data and other
records be retained,
maintained and retrieved?:
Will data migration be
validated for typical files?:
Approvals:
Quality Assurance ______ ____________ ____________
Date Printed Name Signature
Operation’s Manager ______ ____________ ____________
Date Printed Name Signature
IT Manager ______ ____________ ____________
Date Printed Name Signature
Documentation Manager ______ ____________ ____________
Date Printed Name Signature
18.14 Attachment - Validation Deliverables
Deliverable | Document ID | Prepared by | Reviewed by | Approved by
Validation (project) plan
Project schedule
System requirement
specifications
Vendor assessment
Design specification
(for configurable and
customized systems)
IQ protocol
OQ protocol
with functional tests
PQ protocol
with system tests
Validation report
Back-up strategy
Risk assessment
Training records of test
personnel
Change control logs