This editorial is the fifth in a series of six articles focused on the design controls process from the standpoint of design verification and design validation in the orthopaedic medical device marketplace. First, let’s review some pertinent definitions of terms from 21 CFR Part 820.30 and ISO 13485:2003, Section 7.3; the article proper begins after that review.
Design Verification (expanded): confirmation that the design output meets the design input requirements. The results of design verification, including identification of the design, the method(s), the date, and the individual(s) performing the verification, shall be documented in the Design History File (DHF). Risk analysis is often considered to be part of design verification. Bench-top tests and quality inspections are also common forms of verification and may serve as mitigating actions within the Failure Mode and Effects Analysis. A design matrix (a table in which design inputs are matched with design outputs) is another tool used in design verification. Design verification precedes design validation and is not a substitute for it; the two are distinctly different and should be treated as such. The Quality System Regulation defines design verification as confirmation by examination and provision of objective evidence that specified requirements have been fulfilled.
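The design matrix described above can be thought of as a simple traceability structure: each design input is linked to the design outputs that satisfy it and the verification activities that confirm it. The sketch below is a minimal illustration of that idea; all input, output, and test names are hypothetical examples, not drawn from any actual Design History File.

```python
# A minimal sketch of a design (traceability) matrix: each design input is
# mapped to the design outputs and verification activities that address it.
# All names below are hypothetical illustrations.

design_matrix = {
    "Input: implant must withstand 10 kN axial load": {
        "outputs": ["Drawing 001 rev B material/geometry specification"],
        "verification": ["Bench-top axial compression test VT-07"],
    },
    "Input: device must be supplied sterile": {
        "outputs": ["Packaging specification PKG-12"],
        "verification": [],  # gap: no verification activity recorded yet
    },
}

def unverified_inputs(matrix):
    """Return design inputs that have no verification activity recorded."""
    return [inp for inp, row in matrix.items() if not row["verification"]]

print(unverified_inputs(design_matrix))
# → ["Input: device must be supplied sterile"]
```

Even a simple check like this makes gaps visible: any design input with an empty verification column is a requirement that has not yet been confirmed against its outputs.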
Verification: confirmation by examination and provision of objective evidence that specified requirements have been fulfilled.
Validation: confirmation by examination and provision of objective evidence that the particular requirements for a specific intended use can be consistently fulfilled.
Process Validation: establishing by objective evidence that a process consistently produces a result or product meeting its predetermined specifications.
Design Validation: establishing by objective evidence that device specifications conform with user needs and the intended use(s) of the medical device. Design validation shall ensure that devices conform to defined user needs and intended uses and shall include testing of production units under actual or simulated use conditions. Design validation shall include software validation and risk analysis, where appropriate. The results of the design validation, including identification of the design, methods, the date, and the individuals performing the validation, shall be documented in the Design History File. According to the Quality System Regulation, validation testing must be performed on actual production units or their equivalents. When equivalents are used, the manufacturer must provide documentation that clearly demonstrates equivalence. There can be a tendency to manufacture test units under special conditions that do not mimic actual production. An example would be the manufacturing engineer performing final assembly on a device where this would normally be done by a manufacturing operator. The manufacturing engineer’s skill level and knowledge base are likely to differ from those of the operator. Companies should strive to produce their test units in an environment as close to the final production environment as possible.
A Firm Foundation of Objective Evidence: Manufacturers and Specification Developers shall establish and maintain procedures for verifying and then validating the device design.
Design verification is almost always done according to specifications. Therefore, to control the specifications and increase the probability of achieving the desired safety and performance characteristics, the device, software, labeling, packaging, and any other specifications should be complete and thoroughly reviewed before development commences. As the hardware and software designs evolve, they should also be evaluated against their current specifications.
Design verification should be performed with test equipment that is calibrated and controlled according to quality system requirements; otherwise, there is limited confidence in the data.
Verification and validation should also be performed according to written protocols. The protocol(s) should define the conditions for testing and should be approved before use. Test protocols are rarely perfect for a design, particularly a new design; therefore, the designers and other verification personnel should carefully annotate any ongoing changes to a protocol. Likewise, verification personnel should record technical comments about any deviations or other events that occur during testing. Even the slightest problem should not be ignored. During design reviews, these comments, notes, and deviations may be as important as the test data from the formal protocol(s).
It should be noted that design changes begin to occur as soon as the design inputs are approved by the appropriate individuals. All design changes are accounted for and eventually documented in the Design History File (DHF). Design changes are, of course, possible after commercialization activities begin; these should be handled in a way that documents the change and then triggers re-verification and, where appropriate, re-validation of the design.
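The "trigger" logic described above can be sketched as a simple change record that flags which follow-up activities a change calls for. This is only an illustration under assumed, simplified criteria (a change touching design inputs prompts re-verification; a change touching intended use prompts re-validation); real change-control procedures define their own decision rules, and every name here is hypothetical.

```python
# A minimal sketch of a design-change record with a re-verification /
# re-validation trigger. The decision criteria are simplified assumptions
# for illustration, not a statement of regulatory requirements.

from dataclasses import dataclass

@dataclass
class DesignChange:
    change_id: str
    description: str
    affects_design_inputs: bool   # change alters a design input or specification
    affects_intended_use: bool    # change alters user needs or intended use

    def required_actions(self):
        """List the follow-up activities this change triggers."""
        actions = ["document change in DHF"]  # always required
        if self.affects_design_inputs:
            actions.append("re-verification")
        if self.affects_intended_use:
            actions.append("re-validation")
        return actions

change = DesignChange(
    change_id="DC-042",
    description="Material substitution on locking screw",
    affects_design_inputs=True,
    affects_intended_use=False,
)
print(change.required_actions())
# → ["document change in DHF", "re-verification"]
```

The point of the sketch is that documentation alone is never the end of a change: each change is evaluated against defined criteria, and those criteria, not individual judgment in the moment, determine whether re-verification or re-validation is pulled.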