
Best Practice

Validation Master Plan

for Equipment, Computer Systems, Networked Systems and Spreadsheet Applications

Product life cycle approach for validation of computer systems

May 2012

A master plan published for

www.labcompliance.com

Global on-line resource for validation and compliance

Copyright by Labcompliance. This document may only be saved and viewed or printed for personal use. User may not transmit or duplicate this document in whole or in part, in any medium.

Additional copies and other publications on validation and compliance can be ordered from www.labcompliance.com

While every effort has been made to ensure the accuracy of information contained in this document, Labcompliance accepts no responsibility for errors or omissions. No liability can be accepted in any way.

Labcompliance offers books, master plans, complete Quality Packages with validation procedures, scripts and examples, SOPs, publications, training and presentation material, user club membership with more than 500 downloads and audio/web seminars. For more information and ordering, visit www.labcompliance.com/solutions

Contents

1. Purpose and scope of this plan
2. Introduction
3. Responsibilities
4. Related documents
5. Products/processes to be validated and/or qualified
6. Validation approach
7. Steps for Equipment Qualification
8. Computer System Validation
9. Networks and Networked Systems
10. Existing (Legacy) Systems
11. Validation of Macros and Spreadsheet Applications
12. Change Control
13. Instrument Obsolescence and Removal
14. Glossary
15. Documentation maintenance
Appendix A. Checklists and Forms
    Checklists
    Forms

1. Purpose and scope of this plan

This plan provides a framework and practices for the validation and qualification of equipment, computer systems and networked systems in laboratories and manufacturing. It also applies to the validation of Macros and Spreadsheet applications. The Validation Master Plan should ensure that validations and qualifications are performed efficiently and consistently throughout the organization and that they meet regulatory, quality and business requirements. The plan should ensure that the company's validation procedures are followed. The company validation master plan is the basis for individual project validation plans, sometimes also called master validation plans.

2. Introduction

This is the most important part of a validation plan. It summarizes what the project is about and what you are trying to accomplish with it and with validation. It summarizes:

The purpose of the project

Brief description of the system

Validation approach

The name and location of the facility where the equipment or system is located

Timeline

3. Responsibilities

Identifies all validation responsibilities, such as system owner, system manager, validation team, QA, operations, plant maintenance/engineering, IT, lab support services, users, etc.

3.1 Validation team

creates, updates and reviews/approves individual project validation plans and validation deliverables

ensures validation compliance with the company validation master plan and project validation plan

coordinates, implements, verifies elements of the VMP

consults on, evaluates and approves changes

reviews and approves IQ/OQ/PQ procedures and plans

reviews test results and makes recommendations regarding release

assesses risks and develops contingency plans

3.2 User representation

creates and maintains the equipment inventory

makes a proposal for specifications (user requirements, functional specifications)

provides resources for all required qualification activities (IQ, OQ, PQ)

reviews procedures and test methods for usability and practicability

recommends criticality level of failures

monitors on-going system performance and reports problems

3.3 Plant safety/maintenance/engineering

advises on and prepares the facility/laboratory to meet the equipment's environmental and safety specifications

3.4 Information Services (IS)

assures that selected systems fit into the current and future IT infrastructure

performs installation and maintenance of computer networks

provides tools for data collection and archiving

provides training for users on the use of networks

3.5 Validation (engineering) group

provides expertise to develop and implement validation deliverables in an efficient and consistent way

provides detailed validation guidance

trains employees on validation requirements and techniques

3.6 Quality Assurance (QA)

provides quality assurance expertise in the creation of the project validation plan and other validation deliverables

monitors compliance with regulations and established standards

approves and administrates SOPs

audits equipment/computer systems

develops test plans for equipment and computer systems (this can also be done by the development department; in this case QA reviews the plans)

leads vendor assessment program

trains employees on regulations

3.7 Responsible person

manages the validation project

establishes a project validation team

schedules and facilitates validation team meetings

initiates creation and maintenance of documentation

assures that change(s) are managed and controlled

assures the final approval of the system for use

keeps information on all validation activities

informs all parties on changes

3.8 Consultants

Some of these activities can be outsourced to consultants, for example, developing specifications based on inputs from users, developing IQ/OQ/PQ procedures, leading vendor assessment programs and preparing validation summary reports.

3.9 Vendors

Designs, develops and validates equipment, software and systems during development following documented procedures

Provides documented evidence that the software has been validated

Provides documentation on the validation approach taken by the vendor

Provides a complete list of functional and performance specifications that can be used to derive a company's specification for the project

Allows a vendor audit if required by regulatory agencies or the user

Provides documentation and services for installation qualification

Provides documentation and services for operational qualification

Provides training on the technique and operation of the system

Provides on-going support through phone or on-site

3.10 All

Assure that all people who are involved in the specification, design, development, validation and use of equipment and computer/networked systems are trained and understand regulations.

4. Related documents

Corporate and/or local validation policies

Standard operating procedures

Checklists, e.g., for vendor assessment

Templates, e.g., for change control

5. Products/processes to be validated and/or qualified

5.1 Equipment, computers, software, network modules and networked systems that are used in regulated environments. Requirements may come from the US FDA and EPA, EU authorities, the Japanese MHW or the authorities of any other country that may have an impact on the company's business.

5.2 Equipment, computers and networked systems and software that are critical to the operation of a company or department

5.3 New equipment, computers and networked systems

5.4 Existing equipment, computers and networked systems

5.5 Any computerized system that is used to create, modify, maintain, archive, retrieve or transmit data (US FDA 21 CFR Part 11). This includes, for example, analytical instruments, other automated laboratory equipment, computers that are used to acquire and evaluate data, and Laboratory Information Management Systems (LIMS). It also includes systems to create and maintain Standard Operating Procedures (SOPs) in electronic format, Calibration Tracking Systems (CTS) and possibly e-mail systems. Other examples are Supervisory Control And Data Acquisition (SCADA) systems, Electronic Batch Record Systems (EBRS), Programmable Logic Controllers (PLC), Process Control Systems (PCS), integrated information/business systems, Enterprise Resource Planning (ERP) systems, Digital Control Systems (DCS) and Manufacturing Execution Systems (MES).

Ask yourself if the information on the system is absolutely required for GLP/GMP. If yes, the system should be validated.

5.6 Equipment and computers/networked systems as identified in chapter 5.5 after changes.

6. Validation approach

Approach for new systems

6.1 If the equipment/computers are used across more than one department, form a project validation team representing all affected departments. The expertise of each department is needed to do the initial validation 'right', but also for revalidation when the system is changed.

6.2 Develop a validation project plan with time schedule and owners

6.3 Develop user requirement and functional specifications

6.4 Select a supplier (may be internal or external)

6.5 For computer systems, qualify the supplier (for details, see chapter 7)

6.6 Develop validation procedures (SOPs) and templates that can be used to qualify or validate the system during installation (IQ), prior to operation (OQ) and during routine use (PQ).

6.7 Develop criteria and procedures for revalidation/requalification

6.8 Develop acceptance criteria for OQ and PQ tests.

6.9 Get approval for test procedures and acceptance criteria from QA and the validation team.

6.10 Install the system and perform IQ

6.11 Perform and document OQ tests

6.12 Check if criteria under 6.8 are met

6.13 If acceptance criteria are not met, check equipment/computerized/networked systems or procedure. Acceptance criteria should not be changed. Repeat steps 6.11 and 6.12. If the problem still exists, report to the supplier.

6.14 Write an OQ summary report.

6.15 Implement procedures for on-going performance qualification (PQ).

6.16 The validation/qualification documents should be reviewed and approved by the validation team, the system owner (e.g., the lab manager for small systems), QA/QC and, if affected, other departments, for example IT departments for networked computer systems. Changes to the documents should be approved and documented (versioned).

6.17 All validation/qualification documents should be archived in secure locations

Approach for existing systems

When the FDA or other regulatory agencies audit computer systems, they will not ask whether the system is new or legacy (legacy meaning the system was in place before a new regulation was released). They will ask for a validation approach and expect similar or the same documents to be available as for new systems. The following steps demonstrate the approach the company takes to bring legacy systems into a validated state.

6.18 Develop an inventory (master lists) that includes all equipment and computers currently in use in the laboratory/operation and appropriate business area. The inventory list should be designed such that it can easily be upgraded.

6.19 Categorize the equipment and computers based on impact on regulatory submissions and on business impact, using yes or no (yes = requires validation, no = does not require validation).

6.20 Further classify the impact on product quality of all equipment classified with 'yes' in 6.19, using the categories high (immediate impact on product quality), medium (no immediate impact on product quality) and low (no impact on product quality).

6.21 Develop a time plan to validate the selected systems using the classification in 6.20 as the criterion. Immediately start to validate all systems classified as "high" (a minimal categorization sketch follows below).

6.22 For further steps, follow 6.1 to 6.17.
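The inventory and categorization in steps 6.18 to 6.21 can be kept as simple structured records. The following Python sketch shows one possible way to derive a validation priority list from such an inventory; the field names, classification values and example systems are illustrative assumptions, not part of this plan.

```python
# Minimal sketch: categorize an equipment/computer inventory for legacy
# validation planning (steps 6.18-6.21). All names and data are illustrative.
from dataclasses import dataclass

@dataclass
class InventoryItem:
    system_id: str
    description: str
    gxp_impact: bool          # step 6.19: impact on regulatory submissions?
    business_critical: bool   # step 6.19: critical to business operations?
    quality_impact: str       # step 6.20: "high", "medium" or "low"

def requires_validation(item: InventoryItem) -> bool:
    # Step 6.19: "yes" if the system has regulatory or business impact.
    return item.gxp_impact or item.business_critical

def validation_priority(items: list[InventoryItem]) -> list[InventoryItem]:
    # Step 6.21: systems classified "high" are validated first.
    order = {"high": 0, "medium": 1, "low": 2}
    to_validate = [item for item in items if requires_validation(item)]
    return sorted(to_validate, key=lambda item: order[item.quality_impact])

inventory = [
    InventoryItem("HPLC-01", "HPLC with data system", True, True, "high"),
    InventoryItem("PH-07", "pH meter", True, False, "medium"),
    InventoryItem("OFF-02", "office PC, no GxP data", False, False, "low"),
]
for item in validation_priority(inventory):
    print(item.system_id, item.quality_impact)
```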

7. Steps for Equipment Qualification

Equipment qualification should include:

7.1 Design Qualification (DQ)

DQ defines the functional and operational specifications of the instrument and documents the conscious decisions made in the selection of the supplier. DQ should ensure that instruments/computer systems have all the necessary functions and performance criteria that will enable them to be successfully implemented for the intended application and to meet business and regulatory requirements. Inputs for the specifications come from the anticipated users. Design qualification should include these steps:

Background information and description of the application problem

Selection of the technique and/or type of equipment

Description of the purpose and intended use of the equipment. If the instrument will be used for several applications, describe a few typical scenarios.

Description of the intended environment

Description of how the instrument will be used in the selected environment and within a process

Development of user requirement specifications

Preliminary selection of the functional and performance specifications (technical, environmental, safety, security access, compatibility with existing and future systems/platforms)

For larger systems, development of system schematics, network layouts etc

Preliminary selection of the supplier

Instrument tests (if the technique is new)

Final selection of the equipment

Final selection and qualification of the supplier and equipment

Discussion and documentation of warranty, familiarization, training, consulting and other vendor services

Development and documentation of final functional and operational specifications

Review and approval of user requirement and functional specifications by users of the system and by the project validation team.

7.2 Installation Qualification (IQ)

Installation qualification establishes that the instrument is received as designed and specified, that it is properly installed and configured in the selected environment, and that this environment is suitable for the operation and use of the instrument.

Installation Qualification should include these steps:

Check if the environmental and safety conditions, e.g., power condition requirements, meet the criteria as specified for the instrument

Compare equipment, as received, with purchase order (including software, accessories, spare parts)

Check documentation for completeness (operating manuals, maintenance instructions, standard operating procedures for testing, safety and validation certificates)

Check equipment for any damage

Install hardware (computer, equipment, fittings and tubings for fluid connections, power cables, data flow and instrument control cables)

Switch on the instruments and ensure that all modules power up and perform an electronic self-test

Install software on the computer following the manufacturer's recommendations

Verify correct software installation, e.g., verify that all files are loaded. Utilities to do this should be included in the software itself.

Make back-up copy of software.

Configure peripherals, e.g. printers and equipment modules.

Identify and make a list with a description of all hardware, include drawings where appropriate.

Make a list with a description of all software installed on the computer.

List equipment manuals and SOPs.

Develop operation and calibration procedures.

Make entries into the equipment logbook.

Prepare an installation report.

Enter the equipment in the laboratory's equipment database

Items that should be included for each piece of equipment in the database include:

Unique in-house identification number (asset number)

Name of the item of equipment

The manufacturer's name, address and phone number for service calls, and the service contract number, if there is one

Serial number and firmware revision number of equipment

Date received

Date placed in service

Current location

Size, weight

Condition when received, for example, new, used, reconditioned

List of authorized users and the responsible person

The database should also have entry fields for the dates of the last and next operational qualification.
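The database fields listed above map naturally onto a structured record. The following Python sketch shows one possible record layout; field names and the example entry are illustrative assumptions, not a required schema.

```python
# Minimal sketch of an equipment inventory record holding the IQ database
# fields listed above; all names and values are illustrative.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class EquipmentRecord:
    asset_number: str             # unique in-house identification number
    name: str
    manufacturer: str             # name, address, phone for service calls
    service_contract: str | None
    serial_number: str
    firmware_revision: str
    date_received: date
    date_in_service: date
    location: str
    size_weight: str
    condition_received: str       # new, used or reconditioned
    authorized_users: list[str] = field(default_factory=list)
    responsible_person: str = ""
    last_oq: date | None = None   # entry fields for dates of last/next OQ
    next_oq: date | None = None

hplc = EquipmentRecord(
    asset_number="A-4711", name="HPLC pump", manufacturer="Acme Instruments",
    service_contract=None, serial_number="DE14711", firmware_revision="2.03",
    date_received=date(2012, 3, 1), date_in_service=date(2012, 3, 15),
    location="Lab 2, Bldg 7", size_weight="35 x 40 x 20 cm, 11 kg",
    condition_received="new", authorized_users=["A. Analyst"],
    responsible_person="L. Manager",
)
print(hplc.asset_number, hplc.next_oq)
```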

For large systems, e.g., computerized protein sequencers, IQ should be performed and documented by the supplier. The user should sign the IQ report. For small equipment, e.g., pH meters, IQ should be performed and documented by the user.

IQ documents must be updated if there is any change to the system. Changes can include:

move to another room

update with new hardware, firmware, software

exchange of modules which have individual serial and/or asset numbers

7.3 Operational Qualification (OQ)

Operational qualification (OQ) is the process of demonstrating that an instrument will function according to its operational specification in the selected environment. The instrument should be tested against critical performance specifications as specified in the Design Specifications document.

Steps for OQ include:

Obtain functional and performance specifications (preferably use information from DQ).

Identify critical functions that should be tested in the user's environment.

Link the test cases to the user requirement and functional specifications as defined in the DQ phase.

Develop SOPs for testing

Test procedures should include what to test, how testing should be conducted and the expected results with acceptance criteria.

For modular systems, test the system as a whole, not module by module

If there are several systems of the same type and function in a laboratory/operation, use the same procedure and preferably the same acceptance criteria for all systems

Whenever possible use the manufacturer's procedure for testing

Don't use the manufacturer's performance specification limits if the performance is expected to deteriorate over time. Instead, take the performance specifications required by the application.

Decide for each instrument whether OQ tests should be done by your own people, by the vendor or by a third party. Criteria are costs, availability and training of people, and availability of (traceable) qualification tools.

Define the frequency of OQ as recommended by the vendor.

Define requalification criteria and procedures after equipment updates, moves and repairs.

7.4 Performance Qualification (PQ)

Performance Qualification (PQ) is the process of demonstrating that an instrument consistently performs according to a specification appropriate for its routine use. Important here is the word consistently.

Main activities in the PQ phase are

preventive maintenance

requalification after changes and

on-going tests.

The test frequency is much higher than for OQ. Another difference is that PQ should always be performed under method conditions that are similar to routine sample analysis.

In practice, for laboratory operation, PQ testing can mean system suitability testing, where critical key system performance characteristics are measured and compared with documented, preset limits. For example, a well-characterized standard can be injected five or six times and the standard deviation of the amounts is then compared with a predefined value. If the limit of detection and/or quantitation is critical, the lamp's intensity profile or the baseline noise should be tested. Recommended steps for PQ are (a minimal calculation sketch follows these steps):

1. Define test procedures and the performance criteria for the complete system.

2. Select critical parameters. For a chromatography system this can be

precision of the amounts

precision of retention times

resolution between two peaks

peak width at half height or

peak tailing

limit of detection and limit of quantitation

wavelength accuracy of a UV/Visible wavelength detector.

3. Define the test intervals, e.g.,

every day

every time the system is used

before, between and after a series of runs

4. Record any unusual event in the equipment logbook

5. Define corrective actions on what to do if the system does not meet the criteria, in other words if the system operates out of specification.
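As an illustration of the system suitability test described above, the following Python sketch computes the relative standard deviation of replicate injections and compares it against a preset limit; the data values and the 2.0% limit are illustrative assumptions.

```python
# Minimal sketch of a PQ/system-suitability check: relative standard
# deviation (RSD) of replicate injections compared against a preset limit.
from statistics import mean, stdev

def rsd_percent(values: list[float]) -> float:
    """Relative standard deviation in percent."""
    return 100.0 * stdev(values) / mean(values)

amounts = [99.8, 100.2, 100.1, 99.6, 100.4, 99.9]  # six replicate injections
limit = 2.0  # acceptance criterion taken from the PQ test procedure

result = rsd_percent(amounts)
print(f"RSD = {result:.2f} % (limit {limit} %):",
      "PASS" if result <= limit else "OUT OF SPECIFICATION")
```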

7.5 Documentation

On completion of equipment qualification, documentation should be available that consists of:

Validation summary report with final approval (or disapproval) statement, the qualified operating ranges, and a list of all deliverables generated during the validation process. The report should also contain a list of follow-up activities if there are any deviations.

Design qualification documents with purpose of the system, user requirements and functional and performance specifications.

Installation qualification document (includes description of hardware and software).

Procedures for OQ testing and acceptance criteria.

OQ test protocols and results with signatures and dates

PQ test procedures, acceptance criteria, test protocols and representative results

OQ stickers for the equipment with information on the test engineer, the date of OQ and the date of the next OQ

Statement by the user on formal acceptance

Updated equipment database with information on validation status.

8. Computer System Validation

In principle, validation of computer systems should follow the same guidelines as outlined in Chapter 7 for equipment qualification. If the computer is connected to equipment and used for instrument control and data evaluation for that equipment, without networking to other computers, the computer can be treated as a module of that system and should be qualified as part of the system.

For networked computer systems with more complex functionality, additional tasks should be performed for the individual qualification steps (see also the next chapter).

8.1 Design qualification

DQ should include background information and a description of the current computer system platform, future plans and how the new system fits into this environment. It should describe which role the system plays in the workflow of an overall process. This includes interfaces to equipment but also to a corporate IT system. Once the workflow and data flow are known, there should be a definition of:

raw data and metadata,

how integrity of the data is ensured

for how long these data must be archived and

how they can be retrieved.

DQ should also describe whether the system will be configured or otherwise customized in the user's environment. If the system is purchased from a vendor and customized by the user, it should clearly describe who is providing what.

When documenting communication between equipment and the computer, the user requirement and functional specifications should clearly state if it is a one-way or two-way communication and how many instruments can communicate with the computer simultaneously.

8.2 Vendor qualification

For computer systems there should be documented evidence that the software and system have been validated during and at the end of development. The vendor should also provide assistance for DQ, IQ and OQ, and on-going support during routine use. The user must have confidence in the supplier's quality, validation and support processes, which should be documented in a formal vendor assessment document. There are different possibilities to assess a vendor; they are listed in Table 8.1.

Table 8.1 Possibilities to assess a vendor

1. Through references outside your company
Useful if there is no experience with the vendor within your company. Criteria are:

acceptance of the vendor in the market place

image of the vendor as a software supplier

quality reputation of the product

2. Through own experience with the vendor
Experience may come from the product under consideration or from other products. Criteria are:

quality of the products (failure rate)

responsiveness in case of errors (phone call, on-site visit, bug fix)

3. Through own experience with the product
Very much the same as in 2, but more accurate, because most likely the software has been developed and is supported by the same people as the new purchase.

4. Checklist - mail audit
Use checklists available within your company, through public organizations, e.g., PDA, and from private authors, e.g., in this book.

5. Follow-up through video or teleconference
Can be used to prove that documents referred to in the mail audit exist, are readily available and have the required content and quality.

6. Assessment through 3rd party audits
Gives an independent assessment of the quality system and/or product development.

7. Vendor audit through the user to assess the quality system
Gives a good picture of the vendor's quality system.

8. Vendor audits through the user for individual projects
Gives a good picture of the quality of the individual product.

The required assessment level depends on the complexity of the system and should be defined by the project team and documented in the individual project validation plan.

The vendor should be qualified based on business aspects (size, market focus, financial strengths, attitude to people qualification, future perspective) and technical aspects (specification, design, development, testing and support).

Key elements of a vendor assessment are:

The vendor must provide documented evidence that the software has been validated during development. This can include a validation certificate and a description of the vendor's development and validation procedures. The vendor should also agree to provide access to additional information like test plans, test results or source code at the vendor's location.

The vendor should have procedures and/or systems in place for: security, maintenance, user feedback, change control, backup and disaster recovery system to ensure safe ongoing support of the system.

All people in software development should be adequately trained on software engineering practices but should also have a basic understanding of regulatory requirements, e.g., GLP and GMP.

Customer support people should be available on the phone and for on-site visits, if need arises.

Depending on the criticality and complexity of the computer system, the assessment can be based on historical experience with the vendor, checking the documentation provided by the vendor, developing and evaluating additional questions/answers, and finally a vendor audit.

A vendor assessment or qualification report should be developed that includes a statement on the vendor's qualification status.

Depending on the complexity of the system and the information existing on a vendor, a direct audit can be necessary. The validation team should decide on this for each individual project.

Criteria for a vendor audit are:

Commercial off-the-shelf system or bespoke

Complexity of the system

Number of systems purchased from the vendor

Availability of development validation from the vendor

Stand alone or networked system

Influence of the system on other systems

Maturity of system

Experience with the system

Reputation of the system in the market place

Experience with the vendor

A vendor's documented evidence of audits from other companies within the same industry

Own audit experience with the vendor

Business impact of the system

8.3 Installation qualification

IQ should include a test to verify that the software has been copied accurately and completely to the computer's hard disk. Preferably, the vendor should provide such verification programs as part of the application software.
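Where the vendor does not supply a verification utility, a simple checksum comparison against a vendor-supplied manifest can serve the same purpose. The following Python sketch assumes a hypothetical manifest format of one "<sha256>  <relative path>" entry per line; the manifest name and installation path in the usage comment are also assumptions.

```python
# Minimal sketch: verify that software files were copied completely and
# accurately by comparing SHA-256 checksums against a manifest file.
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_installation(manifest: Path, install_dir: Path) -> bool:
    ok = True
    for line in manifest.read_text().splitlines():
        if not line.strip():
            continue
        expected, relpath = line.split(maxsplit=1)
        target = install_dir / relpath
        if not target.exists():
            print(f"MISSING: {relpath}")
            ok = False
        elif sha256(target) != expected:
            print(f"CHECKSUM MISMATCH: {relpath}")
            ok = False
    return ok

# Example (illustrative paths):
# verify_installation(Path("manifest.sha256"), Path("C:/AppSoftware"))
```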

For complex computer systems, a schematic diagram should be developed to illustrate the individual hardware connections and the data flow.

Verify correct use of cables as specified by the vendor, for example type and length.

Computers should be entered in the equipment inventory database. Information should include

Computer hardware with information on the processor, hard disk space, memory and the monitor

Software with product and revision number

8.4 Operational qualification

Most important for OQ testing of computer systems is the right selection of tests. Obviously, not all functions can be tested. The selection of tests should be based on risk assessment and must be approved by the validation team.

OQ should also include specific tests to meet requirements of 21 CFR 11, e.g., limited system access, electronic audit trail, generation of exact copies, prevention of overwriting raw data when reprocessing data.

If multiple analytical instruments are connected to a single computer, proper function and correct data transmission should be verified under the highest anticipated data transfer rate (high-load testing).

If the computer system is expected to perform multiple tasks simultaneously, proper function should be verified under highest anticipated workload.

Test cases and acceptance criteria should be traceable to specifications. Test data sets and procedures should be designed such that they can be easily re-used for later requalification.
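Traceability can be made mechanical by tagging each test case with the requirement it verifies, so untested requirements are flagged automatically. A minimal Python sketch, with illustrative requirement and test IDs:

```python
# Minimal sketch of test-to-specification traceability: each OQ test case
# carries the requirement ID it verifies, so a traceability matrix can be
# generated and coverage gaps detected. All IDs and texts are illustrative.
requirements = {
    "URS-012": "limited system access",
    "URS-015": "electronic audit trail",
    "URS-021": "generation of exact copies of records",
}

test_cases = [
    {"id": "OQ-001", "verifies": "URS-012", "result": "pass"},
    {"id": "OQ-002", "verifies": "URS-015", "result": "pass"},
]

covered = {t["verifies"] for t in test_cases}
for req_id, text in requirements.items():
    status = "covered" if req_id in covered else "NOT TESTED"
    print(f"{req_id} ({text}): {status}")
```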

8.5 Performance qualification

Performance qualification should include

regular virus checks

regular data backup and removal of unnecessary files to avoid hard disk overflow; as a rule of thumb, the hard disk should not be loaded to more than 80% (see the sketch after this list)

procedures and regular checks for authorized user access

procedures and a regular problem-reporting system back to the system developer
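The 80% rule of thumb above can be checked automatically as part of routine PQ. A minimal Python sketch, assuming the data drive is mounted at the path shown:

```python
# Minimal sketch: flag the drive for backup/cleanup when disk usage exceeds
# the 80 % rule of thumb. Path and threshold are illustrative.
import shutil

def disk_load_percent(path: str) -> float:
    usage = shutil.disk_usage(path)
    return 100.0 * usage.used / usage.total

load = disk_load_percent("/")  # assumed mount point of the data drive
if load > 80.0:
    print(f"Disk load {load:.1f} % exceeds 80 %: "
          "run backup and remove unnecessary files")
else:
    print(f"Disk load {load:.1f} % within limit")
```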

9. Networks and Networked Systems

In principle, networked systems and applications running on networks are computer systems and as such should follow the practices for computer system validation. Everything that is important about computer validation can be applied to networks as well. A network component should be treated like a piece of equipment, which is installed and qualified. Next, typical network functions such as limited access and network transactions should be qualified. As the final step, the complete network application should be tested and the test results compared with specifications.

Compared to standalone computer systems, networks and networked systems require some special considerations for all qualification phases:

9.1 Design qualification

When a network is specified, the following questions should be answered:

How much traffic will it carry, especially under highest load or under so-called worst-case conditions?

What is the maximum distance between individual network components?

How many users are using the network concurrently?

What is the risk and impact of a server failure?

What are the backup and archiving cycles?

What availability is expected from the network?

What is the environment in which it will be operating?

What are the protocols: GPIB-IEEE vs. TCP/IP?

For networked systems: a description of the scope of the system: what it is and what it is not (important to define the 'validation box'). This is important to draw the boundaries with other systems.

9.2 Installation Qualification

Network specific IQ activities include:

Configure and document all network settings (e.g., router settings).

Include network cables and other hardware components, e.g., hubs, and software in the equipment database.

Check if physical security specifications are met (e.g., lockable hardware to prevent unauthorized access to the server's RAID system).

Develop high level and detailed system drawings, physical diagrams such as floor plans with component locations and cabling, and logical diagrams like TCP/IP schemes and how components interrelate with each other.

9.3 Operational Qualification

Network specific tests should include

Starting and shutting down networked systems (including documenting messages indicating whether the network was successfully started/shut down)

Switching network components (e.g., hubs, routers, switches) off and on and noting the impact of the action on the network's capability to operate.

Security tests, for example logging on with correct and incorrect password.

Checking correct password administration: for example, when the password policy says that a password should have a minimum length of six characters, try to specify one with four characters (see the sketch after this list).

Checking session unlock and user specific automated time-outs, if there are any.

Checking access to task and file permissions. For example, if an operator has permission only to review data, try to modify them and see if the system behaves as expected.

Checking correct backup and restoration of data.

Checking how the system behaves and recovers from system failures, e.g., by triggering a failure through disconnecting and reconnecting network components.

Checking correct audit trail of network transactions.

Checking correct data transfer under normal AND under high traffic.

Checking individual software modules and systems for virus infection.
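For the password administration check above, the test can be written as a small table of inputs and expected outcomes. In the Python sketch below, set_password is a hypothetical stand-in for the system under test; here it enforces the six-character policy itself so the harness runs end to end.

```python
# Minimal sketch of a password-policy OQ check: attempt passwords that
# violate the stated policy (minimum length of six characters) and record
# whether the system rejects them as expected.
def set_password(candidate: str) -> bool:
    # Hypothetical stand-in for the real system call under test.
    return len(candidate) >= 6

test_cases = [
    ("abcd", False),    # four characters: must be rejected
    ("abcdef", True),   # six characters: must be accepted
    ("", False),        # empty password: must be rejected
]

for candidate, expected in test_cases:
    accepted = set_password(candidate)
    verdict = "PASS" if accepted == expected else "FAIL"
    print(f"password {candidate!r}: accepted={accepted}, "
          f"expected={expected} -> {verdict}")
```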

9.4 Performance Qualification

Develop and implement procedures for back-up and retrieval. The frequency should be based on risk assessment and be defined by the validation team.

Validate back-up and retrieval procedures.

Develop and implement a contingency plan for failures of network components to avoid loss of data.

Review of network diagrams.

Verify individual software modules and systems for virus infection.

9.5 Documentation

Network specific documentation should include

Network drawings.

Updated glossary with network terminology

Updated equipment data base with network hardware, firmware and software

Procedures to control network security in general

Procedures for generation, distribution and use of passwords

Procedures for back-up and contingency planning

Network configuration management

10. Existing (Legacy) Systems

Existing equipment and computer systems should be qualified and/or validated. The same principles apply as for new systems. As explained in its preamble, FDA's 21 CFR Part 11 does not grandfather legacy systems, and FDA expects that firms using legacy systems will begin taking steps to achieve full compliance. Firms should have a reasonable timetable for promptly modifying any system not in compliance (including legacy systems) to make them Part 11 compliant, and should be able to demonstrate progress in implementing their timetable. FDA expects that 21 CFR Part 11 requirements for procedural controls will already be in place. FDA recognizes that technology-based controls may take longer to implement in older systems.

The main difference from new systems is that typically no information on the development process is available. Instead, there is a lot of knowledge from past experience. This experience should be used to prove correct function and performance. Develop an equipment master plan that lists all equipment and computer systems together with a judgment on the need for validation. Develop an implementation plan with a time schedule for the validation of all systems. Validation should follow the same steps as for new systems.

10.1 Design qualification

Describe the use of the systems.

Describe required functions and performance.

No vendor qualification or evidence of development validation is required.

10.2 Installation qualification

Document system hardware and software and enter equipment data into the laboratory's database.

10.3 Operational qualification

Evaluate which tests have been done in the past and check if they are sufficient to prove operational qualification.

If not, develop test procedures and acceptance criteria.

Execute and document tests and compare test results with acceptance criteria.

Declare the system as validated.

10.4 Performance qualification

Follow the steps described in chapters 7 and 8.

11. Validation of Macros and Spreadsheet Applications

Macro programs and spreadsheet calculations are considered software and should be validated to demonstrate suitability for their intended use. The extent of validation depends on the impact of the Macro on product quality and on its complexity. Any step can be skipped as long as there is a sufficient explanation that the skipped step has no relevance for the program. The validation steps should include:

11.1 Develop a project validation plan

Defines the scope of the project, responsibilities, the validation approach and the timeline, and includes checkpoints.

11.2 Define responsible persons

Identify persons responsible for specifications, design, development, tests, and approvals.

11.3 Define user requirements

Describe the task, how the task is done currently and how the new program will do it more efficiently. Describe user requirements (what the user wants to do with the system), system requirements (computer hardware, operating system), regulatory (GLP/GMP/GCP, 21 CFR Part 11) requirements and security requirements. Describe the user's current environment: hardware environment, software environment, IT environment. Specify minimum hardware and software requirements that will be used. Describe the required skill level of the end users. Describe how extensively and how long the software is intended to be used (anticipated number of users, life time, frequency of use) and whether any changes are anticipated during the life of the program.

11.4 Specify Functions

Describe the program in terms of the functions it will perform, written in such a way that it is understood by both the software developer and the user. Review the functional description against the requirement specifications

11.5 Design and implementation

Document formulae or algorithms used within the program. Write the code. Define the file structure, e.g., whether to use one Excel workbook with multiple sheets, several single-sheet workbooks or just a template. Define the formulas to perform calculations. When using Excel, define whether to use Excel functions or VBA procedures. Determine the most appropriate user interface: dialog boxes, list boxes, menus, toolbars, shortcut keys. Define the level of security to restrict access to the whole program or to particular elements of it. Define how to handle errors, e.g., how the application detects and deals with errors. Annotate the code using the documentation standards of your organization, if available. Document the program such that it can be understood by other people whose education and experience are similar to the programmer's.

11.6 Testing

Develop a test plan to test the Macro and Spreadsheet application for all functions the user will use. Develop a matrix that links the functions to be tested to the requirement specifications. Develop test cases and test data sets with known inputs and outputs that can be used for current and future testing and that simulate the real-life environment as much as possible (life testing). Describe the test environment and the execution of the tests. Test protocols must clearly state the Macro title, revision number and file name. Include test cases with normal data across the operating range, boundary testing and unusual cases (wrong inputs). Include procedures to verify calculations. Test procedures and results should be documented, reviewed and approved by the programmers and, for complex Macros, by quality assurance departments.
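The following Python sketch illustrates test cases with known inputs and outputs, including a boundary case and a wrong input. The calculation under test (percent recovery) is a hypothetical example of a formula a Macro or spreadsheet might implement; in practice the same inputs would be entered into the spreadsheet and its outputs compared with the expected values.

```python
# Minimal sketch of test cases with known inputs and outputs for a
# spreadsheet-style calculation. Function and values are illustrative.
import math

def percent_recovery(measured: float, expected: float) -> float:
    return 100.0 * measured / expected

# normal data and a boundary case
cases = [
    ((98.5, 100.0), 98.5),
    ((0.0, 100.0), 0.0),   # boundary: zero measured amount
]
for (measured, expected_amount), expected_result in cases:
    result = percent_recovery(measured, expected_amount)
    assert math.isclose(result, expected_result), (measured, result)

# wrong input: the application must detect this, not return a number
try:
    percent_recovery(98.5, 0.0)
    print("FAIL: division by zero not detected")
except ZeroDivisionError:
    print("PASS: invalid input rejected")
print("all calculation checks passed")
```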

11.7 Ongoing performance checks

Specify type and frequency of checks as well as expected results and acceptance criteria. Develop test data sets and procedures for ongoing performance checks.

11.8 User documentation

Describe the program's functionality and the formulae used for calculations, and how to install, operate and test the program.

11.9 System security and data integrity

Describe which features are implemented to meet security requirements, for example, back-up procedures and limited authorized system access to the program and data. Describe how the integrity of the Macro and Spreadsheet program and the data generated by the program is ensured, for example, how you can ensure that only authorized Macros are used and that Macros are not changed, either intentionally or accidentally, outside the change control environment.

11.10 Change and version control

Develop a procedure to authorize, test, document and approve any changes to the software before implementation.

12. Change Control

Equipment, computers and networked systems are frequently changed during the life of the systems. It is of utmost importance to keep systems in a validated state.

Any change to equipment and computerized/networked systems that may affect the system's validation status should follow documented procedures and should be formally approved. The procedure should include instructions for requesting, approving, documenting and implementing changes. It should also describe the method of evaluation to determine the degree of revalidation or requalification necessary to maintain the validated state of the system. The procedure should also identify the persons responsible for determining the necessity for change control and for authorizing the change.

If changes are made within a complex computerized/networked system, the project validation team, consisting of users of the equipment, support personnel and computer validation experts, should evaluate the need, type and extent of requalification on a case-by-case basis. The user of the system should get advice from the computer validation experts on what impact the change may have on his/her system and which types of tests should be done.

If a vendor changes the software, the new release should be shipped to the user's site with a validation certificate to demonstrate that the update has been developed and validated at the vendor's site to appropriate standards. The vendor should also inform users if the new software may have an impact on Macro programs the user has written.

The vendor should provide a list of functions that have been changed, added or deleted, with information on what impact the changes have on the validation status.

At the user's site, the system with the new software revision should undergo formal installation qualification (IQ) as for a new system.

The validation team should evaluate the need for OQ testing. Where possible, the test cases and test data used for the requalification should be designed so that the results can be compared with those obtained using earlier versions.

Before the system continues with routine analysis, a performance qualification (PQ) should be carried out. All details of the change should be formally documented with information on the items below (a record sketch follows the list):

clear identification of software or hardware to be changed

reference number of change for clear process identification

nature of the change

the reason for the change

date when the change was requested

removed software errors

new functions together with their features and benefits

the impact the change may have on the performance of connected modules and systems (impact analysis)

the impact the change may have on user contributed Macro programs

the type of revalidation or re-qualification tests based on the impact analysis

acceptance criteria and

test results

revision history with old and new version numbers

date when the update was installed

list of people who have been informed
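A change-control record capturing the items listed above could be structured as in the following Python sketch; the field names and example values are illustrative assumptions, not a prescribed format.

```python
# Minimal sketch of a change-control record holding the documentation
# items listed above; all names and values are illustrative.
from dataclasses import dataclass, field

@dataclass
class ChangeRecord:
    reference_number: str           # clear process identification
    system_id: str                  # software or hardware to be changed
    nature: str                     # nature of the change
    reason: str                     # reason for the change
    date_requested: str
    removed_errors: list[str] = field(default_factory=list)
    new_functions: list[str] = field(default_factory=list)
    impact_analysis: str = ""       # impact on connected modules/systems
    macro_impact: str = ""          # impact on user-written Macro programs
    requalification_tests: list[str] = field(default_factory=list)
    acceptance_criteria: str = ""
    test_results: str = ""
    old_version: str = ""
    new_version: str = ""
    date_installed: str = ""
    people_informed: list[str] = field(default_factory=list)

cr = ChangeRecord(reference_number="CC-2012-014", system_id="CDS-03",
                  nature="software update", reason="vendor bug fixes",
                  date_requested="2012-05-07")
print(cr.reference_number, cr.nature)
```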

13. Instrument Obsolescence and Removal

When instruments are retired, a plan should be in place to ensure that the application can run on the new system and that data generated on the old system can be used on the new instrument.

Steps include

Transfer the application, e.g., analytical methods, raw data, analytical results and databases, to the new instrument.

Check if the new system generates the same results from raw data as the old system.
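A minimal Python sketch of this comparison: reprocess the same raw data on both systems and check the reported results against a predefined tolerance. Sample names, values and the tolerance are illustrative assumptions.

```python
# Minimal sketch of a migration check: compare results reported by the old
# and new systems for the same raw data, within a preset tolerance.
import math

old_results = {"sample-01": 12.45, "sample-02": 8.01}  # from old system
new_results = {"sample-01": 12.45, "sample-02": 8.02}  # from new system
tolerance = 0.05  # acceptance criterion defined before the comparison

for sample, old_value in old_results.items():
    new_value = new_results[sample]
    ok = math.isclose(old_value, new_value, abs_tol=tolerance)
    print(f"{sample}: old={old_value}, new={new_value} -> "
          f"{'PASS' if ok else 'FAIL'}")
```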

14. Glossary

For an extensive glossary on qualification and validation see: www.labcompliance.com/glossary

The glossary also includes terminology for network hardware and software.

15. Documentation maintenance

Lists all documents

Describes where documentation and software is archived.

Describes criteria and examples for how long documentation must be archived; examples include validation master plans, project validation plans, validation deliverables and user manuals.

Defines owners of documents

Appendix A. Checklists and Forms

Checklists

Forms

F1001 Change control form

F1002 Change control summary log

F1004 Initiation and authorization of Macros and Spreadsheets

F1006 Validation deliverables and owners

F1009 Validation team members

F1010 Preparation and approval of the validation master plan

Form F1001: Change control

Change Control Form

Form ID

Date:

System ID:

Location:

Description of change (should include reason for change)

Expected impact on validation

Authorization to change

Name: Signature Date

Change implemented on Date:

Comments (implementation, testing): e.g., document any observations, the new version or revision number, and the types of tests that have been performed

Completed by: Name: Signature Date

Approved by: Name: Signature Date

Form F1002: Change control summary log

System ID: Name: CDS ChemStation Plus, Number: C4663B

Date: 04/75/01
Type of change: ChemStation software upgrade from version A.08.01 to A.08.01
Change control form ID: H1674

Form F1004: Initiation and Authorization of Macros & Spreadsheets

Form ID

System Location:

Initiator:

Description of problem, how it is solved now and how the new software can improve efficiency

Expected duration of use and frequency of use: e.g., 6 months, 5 times a day

Approval of supervisor:

Supervisor function:

Name Signature Date

Comments:

Approval/rejection by QA: Approval yes/no

Reason for approval/rejection

Name Signature Date

Comments/Recommendations:

Form F1006: Validation deliverables and owners

Deliverable: Validation plan
Prepared by: Validation team
Reviewed by: Operation manager
Approved by: QA, System owner
Reference document: VP 679

Deliverable: IQ protocol
Prepared by: Vendor
Reviewed by: Validation team
Approved by: QA manager
Reference document: V456

Form F1009: Validation team members

Department / Name / Tel / E-mail

QA

Consultant

IT

Lab

Validation group

Vendor

Form F1010: Preparation and approval of validation project plans (master validation plan)

Preparation

Representing department / Name / Signature / Date

User

QA

IT

Validation group

System owner/Business management

Review and approval

Representing department / Name / Title / Signature / Date

System Owner/Management

QA Compliance
