Method validation


Contains information on how to evaluate and validate a method before putting it into use.

By: Dr. Hinal Shah

INTRODUCTION

Method selection and validation are key steps in the process of implementing new methods in the clinical laboratory.

Practice in the clinical chemistry laboratory is strongly influenced by guidelines.

The Clinical & Laboratory Standards Institute (CLSI, formerly the National Committee for Clinical Laboratory Standards) and the International Organization for Standardization (ISO) have published a series of protocols and documents to be followed by clinical chemistry laboratories and manufacturers for method validation.

Also, laboratory accreditation requirements have to be met.

DEFINITIONS

“The suite of procedures to which an analytical method is subjected to provide objective evidence that the method, if used in the manner specified, will produce results that conform to the statement of the method validation parameters”

“The process used to confirm that the analytical procedure employed for a specific test is suitable for its intended use”

ISO/IEC 17025 (ISO/IEC 2005, section 5.4) states that method validation is “confirmation by examination and provision of objective evidence that the particular requirements for a specified intended use are fulfilled”

When Should Methods Be Validated?

No method should be used without validation

ISO/IEC 17025 encourages the use of methods published in international, regional, or national standards and implies that if these methods are used without deviation, then the validation requirements have been satisfied.

Methods that do need to be validated are:

i. Nonstandard methods, designed and developed by individual laboratories

ii. Standard methods used outside their scope

iii. Amplifications and modifications of standard methods.

[Figure: the method selection/development and validation cycle]

VERIFICATION

Synonymous with Single Laboratory Validation

To be done when a laboratory uses a method for the first time (with standard materials)

Also done when a particular aspect of the method or its implementation is changed, for example:

a) New analyst

b) New equipment or equipment part

c) New batch of reagent

d) Changes in the laboratory premises

Minimum verification is to analyze a material before and after the change and check for consistency of the results, both in terms of mean and standard deviation.
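A minimal sketch of such a consistency check in Python (replicate values and acceptance limits are hypothetical; a laboratory would use its own performance specifications):

```python
# Sketch of a verification check: compare results obtained on the same
# control material before and after a change (e.g. a new reagent lot).
# Hypothetical replicate values in mg/dL.
from statistics import mean, stdev

before = [99.1, 100.4, 98.7, 101.2, 99.8]   # replicates before the change
after = [100.2, 99.5, 101.0, 100.8, 99.9]   # replicates after the change

shift_in_mean = mean(after) - mean(before)
sd_ratio = stdev(after) / stdev(before)

print(f"Shift in mean: {shift_in_mean:.2f} mg/dL")
print(f"SD ratio (after/before): {sd_ratio:.2f}")

# Consistency check against illustrative limits: the mean shift should be
# small relative to the allowable bias, and the SD ratio close to 1.
assert abs(shift_in_mean) < 2.0 and 0.5 < sd_ratio < 2.0, "Investigate the change"
```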

Basic experiments that should be performed to verify the use of a standard method before first use in a laboratory:

Bias/recovery

Precision

Measurement uncertainty

Calibration model (linearity)

Limit of detection

METHOD SELECTION- CRITERIA

1. Medical Need

2. Analytical Performance

3. Practical Criteria – these depend on whether the method is a laboratory-developed test or a commercial kit method.

Method Validation- Requirements

I. Calibration

II. Precision

III. Accuracy and Trueness

IV. Linearity

V. Limit of Detection (LOD)

VI. Analytical/Measuring Range

VII. Analytical Sensitivity

VIII. Analytical Specificity and Interference

IX. Ruggedness and Robustness

X. Documentation

I. CALIBRATION

• Established by measurement of samples with known quantities of the analyte (calibrators). [Only certified reference materials with known traceability should be used.]

• Done with chemical standards or with samples containing known quantities of analyte in the typical matrix.

• Calibration functions may be linear or curved.

• If the assumed calibration function does not correctly reflect the true relationship between instrument response and analyte concentration, a systematic error (bias) is likely to be associated with the analytical method.
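A minimal sketch of establishing and inverting a linear calibration function (assuming Python with NumPy; calibrator values are hypothetical):

```python
# Sketch of a linear calibration: fit the instrument response against
# calibrator concentrations and inspect the residuals, since systematic
# curvature in the residuals would indicate that the assumed linear model
# is biased. All values are hypothetical.
import numpy as np

conc = np.array([0.0, 25.0, 50.0, 100.0, 200.0])     # calibrator concentrations
response = np.array([0.02, 0.26, 0.51, 1.00, 1.98])  # instrument signal

slope, intercept = np.polyfit(conc, response, 1)
residuals = response - (slope * conc + intercept)

print(f"Calibration: response = {slope:.4f} * conc + {intercept:.4f}")
print("Residuals:", np.round(residuals, 4))

# Unknown samples are then read back through the inverse function.
def concentration(signal):
    return (signal - intercept) / slope

print(f"Signal 0.75 -> concentration {concentration(0.75):.1f}")
```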

II. PRECISION

Precision has been defined as the closeness of agreement between independent results of measurements obtained under stipulated conditions.

Precision depends on the stability of the instrument response for a given quantity of analyte.

The degree of precision is usually expressed using statistical measures of imprecision, such as the standard deviation (SD) or the coefficient of variation (CV).

If the calibration function is linear and the imprecision is constant over the analytical measurement range, the SD also tends to be constant over that range. If the imprecision increases proportionally with concentration, the SD also increases proportionally, and the relative imprecision (i.e. the CV) remains constant.

Imprecision of measurements is solely related to the random error.
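A minimal sketch of computing these imprecision measures from replicate results (hypothetical glucose replicates in mg/dL):

```python
# SD and CV from replicate measurements of a single sample.
from statistics import mean, stdev

replicates = [92.1, 93.4, 91.8, 92.9, 93.1, 92.5]  # hypothetical values

m = mean(replicates)
sd = stdev(replicates)  # standard deviation (SD)
cv = 100 * sd / m       # coefficient of variation (CV), in %

print(f"Mean = {m:.1f} mg/dL, SD = {sd:.2f} mg/dL, CV = {cv:.2f}%")
```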

Precision is specified as follows:

1) Repeatability: corresponds to within-run precision. It is the closeness of agreement between results of successive measurements carried out under the same conditions.

2) Reproducibility: closeness of agreement between results of measurements performed under changed measurement conditions.

Two specifications of reproducibility:

Total or between-run precision (intermediate precision): how it is assessed depends on the usage of the method and on variation in analysts and equipment.

Inter-laboratory precision.

An important aspect of a full method validation is estimating bias components attributable to the method itself and to the laboratory carrying out the analysis. It can be evaluated by inter-laboratory studies.

III. TRUENESS & ACCURACY

Trueness:

• Closeness of agreement between the mean value (obtained from a large series of results of measurements) and the true value.

• Bias – the difference between the mean and the true value; inversely related to the trueness.

• Qualitative term.

• True value – in practice, the accepted reference value.

• Evaluated by comparison of measurements by a given routine method and a reference measurement procedure.

Accuracy:

• Closeness of agreement between the result of a measurement and the true value.

• Qualitative term.

• Influenced by both bias and imprecision; reflects the total error.

• Total error = bias + 2SD
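A short worked sketch of this total-error estimate (all numbers hypothetical):

```python
# Total error = |bias| + 2 * SD, for a method reading 102 mg/dL on average
# against a reference value of 100 mg/dL with an SD of 1.5 mg/dL.
reference_value = 100.0
method_mean = 102.0
sd = 1.5

bias = method_mean - reference_value
total_error = abs(bias) + 2 * sd
print(f"Bias = {bias:.1f} mg/dL, total error = {total_error:.1f} mg/dL")  # 2.0 and 5.0
```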

Concepts of recovery, drift and carryover are related to trueness.

RECOVERY

It is the fraction of a test material in a matrix that can be quantified:

% recovery = [(measured conc. − conc. of blank) / target conc.] × 100

A recovery close to 100% is a prerequisite for a high degree of trueness, but does not ensure unbiased results.
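A minimal sketch of the recovery calculation for a spiked sample (values hypothetical):

```python
# Recovery following the formula above.
blank_conc = 2.0      # concentration measured in the unspiked (blank) sample
measured_conc = 51.0  # concentration measured in the spiked sample
target_conc = 50.0    # concentration of analyte added (expected increment)

recovery = 100 * (measured_conc - blank_conc) / target_conc
print(f"Recovery = {recovery:.1f}%")  # 98.0%
```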

DRIFT & CARRYOVER

DRIFT – caused by instrument instability over time, so that the calibration becomes biased.

CARRYOVER – must be close to zero to ensure unbiased results.
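One common way to assess carryover is to run a high-concentration sample in triplicate followed by a low-concentration sample in triplicate; a sketch of that convention (values hypothetical):

```python
# Carryover expressed as the shift in the first low result relative to the
# high/low difference; the convention and values here are illustrative.
high = [540.0, 538.0, 541.0]  # results on the high sample (h1, h2, h3)
low = [12.6, 12.1, 12.0]      # results on the low sample (l1, l2, l3)

carryover_pct = 100 * (low[0] - low[2]) / (high[2] - low[2])
print(f"Carryover = {carryover_pct:.3f}%")  # should be close to zero
```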

IV. LINEARITY

It refers to the relationship between measured and expected values over the analytical measurement range.

Linearity of the analytical procedure is its ability to obtain test results which are directly proportional to the concentration of analyte in sample.

Most often, a dilution series of a sample is examined. For example, the linearity of glucose estimation is assessed by measuring serial dilutions of a 1000 mg/dL stock solution, as sketched below.
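A minimal sketch of such a dilution-series check (assuming Python with NumPy; measured values are hypothetical):

```python
# Linearity check on serial dilutions of a 1000 mg/dL glucose stock:
# compare measured to expected values and inspect the recovery at each level.
import numpy as np

dilution_factor = np.array([1.0, 0.5, 0.25, 0.125, 0.0625])
expected = 1000.0 * dilution_factor
measured = np.array([995.0, 502.0, 249.0, 126.0, 63.5])  # hypothetical results

slope, intercept = np.polyfit(expected, measured, 1)
print(f"measured = {slope:.3f} * expected + {intercept:.2f}")
for e, m in zip(expected, measured):
    print(f"expected {e:7.1f} -> measured {m:7.1f} ({100 * m / e:5.1f}% of expected)")
```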

[Figure: response curve of an electrochemical measurement of copper, where y is the measured current and x is the concentration of copper]

V. LIMIT OF DETECTION

Defined as the lowest value that significantly exceeds the measurements of a blank sample.

It is the “minimum detectable net concentration”.

It is the smallest concentration of analyte that can be determined to be actually present, even if the quantification carries large uncertainty.

The “limit of detection” is thus a clearly defined cut-off above which the analyte is reported as detected and below which it is not.
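One widely used convention estimates the LOD as the blank mean plus three blank SDs; a minimal sketch of that convention (blank values hypothetical):

```python
# LOD estimated from replicate measurements of a blank sample.
from statistics import mean, stdev

blanks = [0.8, 1.1, 0.9, 1.0, 1.2, 0.7, 1.0, 0.9]  # hypothetical blank results

lod = mean(blanks) + 3 * stdev(blanks)
print(f"Estimated LOD = {lod:.2f} concentration units")
```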

VI. ANALYTICAL / MEASURING RANGE

It is the analyte concentration range over which measurements are within the declared tolerances for imprecision and bias of the method.

This range extends from the lowest concentration (the lower limit of quantification, LLoQ) to the highest concentration (the upper limit of quantification, ULoQ) for which the performance specifications are fulfilled.

VII. ANALYTICAL SENSITIVITY

Defined as the ability of an analytical method to assess small differences in the concentration of the analyte.

The smaller the random variation of the instrument response and the steeper the slope of the calibration function at a given point, the better is the ability to distinguish small differences in analyte concentration.

It depends on the PRECISION of the method.
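A rough sketch of this relationship, illustratively taking the smallest distinguishable difference as about two response-SDs converted to concentration units via the calibration slope (this factor is an assumption for illustration; numbers are hypothetical):

```python
# Smaller response SD and a steeper calibration slope both improve the
# ability to resolve small concentration differences.
response_sd = 0.004        # SD of the instrument response at this level
calibration_slope = 0.010  # response units per concentration unit

smallest_difference = 2 * response_sd / calibration_slope  # illustrative factor of 2
print(f"Smallest distinguishable difference ~ {smallest_difference:.1f} concentration units")
```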

VIII. ANALYTICAL SPECIFICITY & INTERFERENCE

Analytical specificity is the ability of an assay procedure to determine the concentration of the target analyte without influence of potentially interfering substances or factors in the sample matrix.

Interfering substances/factors:

i. Hyperlipemia

ii. Hemolysis

iii. Bilirubin

iv. Antibodies

v. Degradation products of the analyte

vi. Exogenous substances

vii. Anticoagulants

Interferences from hyperlipemia, hemolysis, and bilirubin are concentration dependent and can be quantified as a function of the concentration of the interfering compound, as sketched below.
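Not the CLSI protocol itself, but a minimal sketch of the idea (assuming Python with NumPy; values hypothetical): spike increasing amounts of an interferent and regress the observed bias against the interferent level.

```python
# Quantifying a concentration-dependent interference, e.g. bilirubin.
import numpy as np

interferent = np.array([0.0, 5.0, 10.0, 20.0, 40.0])      # interferent conc., mg/dL
measured = np.array([100.0, 100.9, 102.1, 104.2, 108.3])  # analyte result at each level

bias = measured - measured[0]  # shift relative to the unspiked aliquot
slope, intercept = np.polyfit(interferent, bias, 1)
print(f"Interference ~ {slope:.3f} analyte units per mg/dL of interferent")
```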

CLSI has published a detailed protocol for evaluation of interference.

IX. RUGGEDNESS & ROBUSTNESS

A method is classed as rugged if its results remain sufficiently unaffected as designated environmental and operational conditions change.

Practicality of the method ultimately depends on how rugged it is.

A robustness study addresses two areas of concern: the need for a method to be able to yield acceptable results under the normal variation of conditions expected during routine operation, and the portability of the method between laboratories with changes in instrumentation, people, and chemical reagents.

X. DOCUMENTATION

• It is the most important step in method validation.

• “If you have not documented it, you have not done it.”

• Proper documentation is proof of validity.

• Method validation is only as good as its documentation, which describes the method, the parameters investigated during the validation study, the results of the study, and the conclusions concerning the validation.

Reasons for Documentation

1. To allow a user of the method to understand the scope of the validation and how to perform experiments within that scope.

2. The method is not validated until it is documented: no number of experiments is of any use if they are not documented for future users and clients to consult.

As the documentation will become part of the laboratory’s quality assurance system, proper documentary traceability (as opposed to metrological traceability of a result) must be maintained, with revision versions and appropriate authorizations for updates.

When all the above steps are complete, the method is said to be validated and can be now put into use.

References

1. Tietz Textbook of Clinical Chemistry and Molecular Diagnostics, Chapter 2.

2. Pictures and graphs from m.authorstream.com
