General Principles of Software Validation
Stewart Crumpler
Office of Compliance, Center for Devices and Radiological Health
Medical Device Software Recalls (1983 - 1998)
[Figure: number of recalls (0-50) per year, 1980-1995]
Medical Device Software Recalls (By Medical Specialty)
[Figure: average number of software recalls per year (0-14) by specialty (Other, General Hospital, Anesthesiology, Cardiovascular, In Vitro Diagnostic, Radiology), comparing 1983-91 with 1992-96]
Medical Device Software Guidance
• General Principles of Software Validation (draft)
• Pre-market Submissions for Software Contained in Medical Devices
• Off-the-Shelf Software Use in Medical Devices
• Premarket Submissions for Blood Establishment Software
• Software Used in Computerized Clinical Trials
Quality System Regulation
• For device software, § 820.30(g): “… design validation shall include software validation and risk analysis where appropriate …”
• For automation of production processes or automation of the quality system, § 820.70(i): “… the [device] manufacturer shall validate computer software for its intended use according to an established protocol. All software changes shall be validated before approval and issuance. These validation activities and results shall be documented.”
Design Controls
• Planning
• Input
• Output
• Review
• Verification
• Validation
• Transfer
• Changes
• History File
Concurrent Engineering
Iterative Processes
Configuration Management
Ongoing Risk Analysis
Multiple Verification and Validation Activities
Software Validation Guidance
• Draft issued June 9, 1997
• Comment period extended - December 1997
• Comments from 35 organizations and individuals
• Global Harmonization Task Force review
• Generally favorable overall
• Many proposed editorial changes
Principles
• Requirements - documented baseline for V & V
• Defect prevention - better than finding and fixing defects; a mix of static analyses and dynamic testing
• Time and effort - validation throughout the life cycle
• Software life cycle - defined activities, tasks, and documentation
• Plans - “what”
• Procedures - “how”
Principles (Cont.)
• Software validation after change - impact analysis & regression testing
• Validation coverage - commensurate with risk and complexity
• Independence of review - preferred where possible
• Flexibility and responsibility - wide variety of environments, devices, and risks
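The "software validation after change" principle can be pictured as a regression suite that is re-run in full after any modification. A minimal sketch in Python; `dose_rate` and its limits are hypothetical stand-ins for device software, not anything from the guidance:

```python
# Regression-test sketch (hypothetical device function and limits).
# After a software change, the full suite is re-run so that unchanged
# behavior is re-verified rather than assumed.

def dose_rate(volume_ml: float, minutes: float) -> float:
    """Return infusion rate in mL/min (hypothetical example function)."""
    if minutes <= 0:
        raise ValueError("duration must be positive")
    return volume_ml / minutes

def test_nominal_rate():
    assert dose_rate(100.0, 50.0) == 2.0

def test_zero_duration_rejected():
    try:
        dose_rate(100.0, 0.0)
        assert False, "expected ValueError"
    except ValueError:
        pass

# A change anywhere in the code base triggers re-running every test:
for test in (test_nominal_rate, test_zero_duration_rejected):
    test()
print("regression suite passed")
```

The point is the process, not the function: each existing test is re-executed after every change, which is what "partial validation & regression testing" relies on.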
General Comments
• Title and scope
  – Validation vs. design control
  – Device vs. pharmaceutical use
  – Concept phase
• Reference / harmonize with other standards
• Consistency with and cross-reference to other FDA guidance
• Examples, graphics, and map to QS regulation
System vs Software
[Diagram: system requirements → high-level system design → software requirements → software design; software verification → integration → system validation]
• Planning
• Risk analysis
  – FTA (fault tree analysis)
  – FMECA (failure modes, effects, and criticality analysis)
• Verification
• Validation
Software Validation
• Part of design validation
• Validation does not include verification and testing, but is highly dependent upon them
• Verification activities occur throughout the software development life cycle (SDLC)
• Validation activities can occur both during and toward the end of the SDLC
Operational Definition
• Software validation is “confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through software can be consistently fulfilled.”
• Includes evidence that all software requirements are:
  – implemented correctly and completely
  – traceable to system requirements
• Developing an acceptable “level of confidence”
Requirements & Design
• “Predetermined” requirements
• Iterative / incremental approaches to requirements definition
• Reuse of code
• Assertions
Requirements
• System requirements for computer-automated features and functions must be documented
• The developer’s software requirements must be documented for in-house developed software, but may not be available for third-party software
• Written requirements by the device manufacturer establish the “intended use” of the automated equipment for manufacturing and the quality system
Other Terminology Issues
• Validation, verification, qualification
• IQ / OQ / PQ
• Risk vs. hazard
• Is software different from hardware?
• Reviews vs. validation
• Plans vs. procedures
Flexibility
• More flexibility / less prescriptive language
• Alternative methodologies
  – Iterative
  – Incremental
  – Component-based
• Level of effort
  – Hazard or level of concern
  – Complexity
Verification
• Typical (verification / quality system) tasks
• Traceability
  – design to requirements
  – code to design
  – software safety requirements to hazard analysis
• More emphasis on static analysis (e.g., inspection and walk-through)
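The traceability tasks above (design to requirements, code to design) lend themselves to a mechanical cross-check of a trace matrix. A sketch with hypothetical requirement and design IDs; the data structures are illustrative, not from the guidance:

```python
# Traceability sketch: check that every software requirement traces up to
# a system requirement and down to at least one design element.
# All IDs here are hypothetical.

system_reqs = {"SYS-1", "SYS-2"}
software_reqs = {
    "SRS-10": "SYS-1",  # software requirement -> parent system requirement
    "SRS-11": "SYS-2",
}
design_to_reqs = {
    "SDD-100": {"SRS-10"},  # design element -> software requirements it implements
    "SDD-101": {"SRS-11"},
}

# Every software requirement must trace up to a system requirement...
orphans_up = [r for r, parent in software_reqs.items() if parent not in system_reqs]
# ...and down to at least one design element.
covered = set().union(*design_to_reqs.values())
orphans_down = [r for r in software_reqs if r not in covered]

print("untraced up:", orphans_up)      # → untraced up: []
print("untraced down:", orphans_down)  # → untraced down: []
```

A non-empty orphan list is exactly the kind of finding static analysis should surface before testing begins.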
Testing
• Testing - less prescriptive, more detailed
• How much coverage?
• Incremental development and testing
• Statistical testing
• Rigorous compiler error checking
• User-site testing
• Early test planning
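The "statistical testing" bullet can be illustrated as a property check over many randomly generated inputs. A minimal sketch; `clamp` is a hypothetical function under test, chosen only to keep the example self-contained:

```python
# Statistical-testing sketch: generate many random inputs and check an
# output property, rather than enumerating individual cases.
# The function under test is hypothetical.
import random

def clamp(x: float, lo: float, hi: float) -> float:
    """Limit x to the closed interval [lo, hi]."""
    return max(lo, min(x, hi))

random.seed(0)  # reproducible run, so a failure can be replayed
failures = 0
for _ in range(1000):
    x = random.uniform(-100.0, 100.0)
    y = clamp(x, 0.0, 10.0)
    if not (0.0 <= y <= 10.0):  # the property every output must satisfy
        failures += 1
print("failures:", failures)  # → failures: 0
```

Seeding the generator matters: a reproducible sample means any failure found statistically can be re-run during regression testing.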
Changes
• Staff turnover - need for documentation
• Configuration management
• Partial validation & regression testing
• Revalidation
• Retrospective validation
• Do repairs establish a new design?
• Root cause analysis
Processes and Quality System
• ODE guidance on off-the-shelf software
• Separate section on validating process and quality system software
• Validating vendor-supplied and configurable software
  – Development tools
  – OTS validation
  – Impact on CAPA
Vendor Supplied Software
• Intended use - documented requirements
• Safety risk analysis - role of OTS software
• Level of validation effort
  – Obtain vendor information
    • Copy of vendor’s software requirements
    • Conduct audits / assessments
    • Access to vendor’s software QA program
    • List of known “bugs”
  – “Black box testing” by device manufacturer
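"Black box testing" of vendor-supplied software means exercising only the documented interface against the device manufacturer's intended use, with no knowledge of internals. A minimal sketch; `ots_sort` stands in for a hypothetical vendor routine and is simulated here with the standard library:

```python
# Black-box test sketch for OTS software: known inputs, expected outputs,
# no reliance on the vendor's implementation details.
# `ots_sort` is a hypothetical stand-in for a vendor routine.

def ots_sort(values):
    return sorted(values)  # simulation of the vendor call

# Intended-use test cases, including boundary conditions:
cases = [
    ([3, 1, 2], [1, 2, 3]),   # nominal input
    ([], []),                 # boundary: empty input
    ([5], [5]),               # boundary: single element
    ([2, 2, 1], [1, 2, 2]),   # duplicates preserved
]
for given, expected in cases:
    assert ots_sort(given) == expected
print("black-box cases passed")
```

Because the internals are opaque, the level of confidence comes from how well these cases cover the documented intended use, which is why the vendor's requirements list and known-bug list are worth obtaining first.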
Next Steps
• Document revision underway
• Internal review and clearance
• Will be available on the CDRH Web page: http://www.fda.gov/cdrh
• CDRH Facts-On-Demand: 1-800-899-0381
Closing Thoughts
• Manage the development process
• Begin with the end in mind
• Plan for maintenance by someone else
• Get the software requirements right
• Apply risk management early and throughout
Closing Thoughts (Cont.)
• Assume nothing - provide evidence
• Use a life cycle model that works for you
• Use available standards, texts, and guidance
• Measure software quality
• Measure software process improvement