Validation of liquid chromatography mass spectrometry (LC-MS) methods
Glossary
Analytical run:
Definition 1: A set of samples that are analysed in one batch over a short period of time. In LC and LC-MS this term typically refers to an automated sequential analysis of a set of samples, calibrants and QC samples that have been loaded into the autosampler.
Definition 2 (as defined by the Clinical Laboratory Improvement Amendments (CLIA)): An interval (i.e., a period of time or series of measurements) within which the accuracy and precision of the measuring system are expected to be stable. In laboratory operations, control samples are analyzed during each analytical run to evaluate method performance; therefore, the analytical run defines the interval (period of time or number of specimens) between evaluations of control results. Between quality control evaluations, events may occur that cause the measurement process to be susceptible to variations that are important to detect.
Bias: a quantitative estimate of trueness; found as the difference between the measured value and the reference value.
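As a small illustration of the definition above, the sketch below computes absolute and relative bias from replicate results against a reference value (all numbers are hypothetical):

```python
# Hypothetical replicate results (mg/L) and reference value (mg/L)
measured = [10.2, 9.8, 10.5, 10.1]
reference = 10.0

mean_measured = sum(measured) / len(measured)   # 10.15
bias = mean_measured - reference                # absolute bias: 0.15 mg/L
relative_bias = 100 * bias / reference          # bias as % of reference: 1.5 %
```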
Decision limit (CCα): the concentration level, as determined by the method, at which there is a probability α (usually defined as 0.05 or 5%) that a blank sample will give a signal at this level or higher.
Detection capability (CCβ): the concentration level of the analyte in the sample at which there is a probability β (again usually defined as 0.05 or 5%) that the method will give a result lower than CCα, meaning that the analyte will be declared undetected.
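Under the simplifying assumptions of normally distributed signals with a constant standard deviation and α = β = 0.05, the two limits above can be sketched as follows (all numeric inputs are hypothetical):

```python
# Simplified sketch: CCalpha = blank mean + z(0.95)*sd, CCbeta = CCalpha + z(0.95)*sd
z95 = 1.645                 # one-sided 95% z-value (alpha = beta = 0.05)
blank_mean_signal = 2.0     # hypothetical mean blank signal
sd_signal = 0.5             # hypothetical signal standard deviation (assumed constant)
slope = 10.0                # hypothetical calibration slope (signal per mg/L)

cc_alpha_signal = blank_mean_signal + z95 * sd_signal   # 2.8225
cc_beta_signal = cc_alpha_signal + z95 * sd_signal      # 3.6450

# Converted to concentration units via the calibration slope
cc_alpha_conc = (cc_alpha_signal - blank_mean_signal) / slope   # 0.08225 mg/L
cc_beta_conc = (cc_beta_signal - blank_mean_signal) / slope     # 0.16450 mg/L
```

A rigorous treatment (e.g. per ISO 11843 or Decision 2002/657/EC) also accounts for calibration uncertainty; this sketch only shows the basic logic of the two probabilities.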
Dynamic range: the range where the response changes when the analyte concentration is changed, but the relationship may be non-linear. If the response is linear, it can be specified as a dynamic linear range.
False positive results: results where the analyte is declared to be present although it is actually below the LoD.
Heteroscedasticity: non-constant variance; describes data where the standard deviation of the signal (y-value) depends on the concentration of the analyte (x-value).
Homoscedasticity: homogeneity of variance; describes data where the standard deviation of the signal (y-value) does not depend on the concentration of the analyte (x-value).
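A quick, informal way to see the difference between the two terms above is to compare the scatter of replicate signals at a low and a high concentration level (all numbers hypothetical; this is a rule-of-thumb check, not a formal statistical test such as an F-test):

```python
import statistics

signals_low = [10.1, 9.9, 10.0, 10.2]      # replicate signals at a low concentration
signals_high = [99.0, 103.0, 97.0, 101.0]  # replicate signals at a high concentration

sd_low = statistics.stdev(signals_low)     # ~0.13
sd_high = statistics.stdev(signals_high)   # ~2.58

# Markedly larger scatter at the high level suggests heteroscedastic data;
# similar scatter at both levels would suggest homoscedastic data.
heteroscedastic = sd_high > 3 * sd_low     # crude threshold, assumed for illustration
```

Heteroscedastic calibration data is a common reason to use weighted least-squares regression instead of ordinary least squares.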
Identification: providing evidence that the analytical signal registered during sample analysis is due to the analyte and not to any other (interfering) compound.
Instrumental LoD: LoD estimated for the analytical instrument by measuring the analyte from pure solvent without any sample pretreatment.
Intermediate precision: the precision obtained within a single laboratory over a longer period of time (generally at least several months).
Linear range: the range of concentrations where the signals are directly proportional to the concentration of the analyte in the sample.
Linearity: the method's ability to obtain signals that are directly proportional to the concentration of the analyte in the sample.
Limit of detection (LoD): the smallest amount or concentration of analyte in the test sample that can be reliably distinguished from zero.
Limit of quantitation (LoQ): the lowest concentration of analyte that can be determined with acceptable repeatability and trueness.
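One common way (among several) to estimate the LoD and LoQ defined above is from the standard deviation of blank signals and the calibration slope; the sketch below assumes hypothetical values for both:

```python
# Assumed, hypothetical inputs
s_blank = 0.6    # standard deviation of blank signals
slope = 12.0     # calibration slope, signal per (mg/L)

# Widely used rule-of-thumb estimates:
lod = 3 * s_blank / slope    # ~0.15 mg/L
loq = 10 * s_blank / slope   # ~0.50 mg/L
```

Other approaches (e.g. based on signal-to-noise ratio or on replicate analyses of low-concentration samples) can give different estimates; the chosen approach should be reported together with the result.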
Measurement uncertainty (or simply uncertainty) defines an interval around the measured value C_measured, where the true value C_true lies with some predefined probability. The measurement uncertainty U itself is the half-width of that interval and is always non-negative. This definition differs from the VIM [ref 6] definition (but does not contradict it) and we use it here as it is generally easier to understand and to relate to practice.
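As a numeric illustration of the interval described above: the expanded uncertainty U is obtained by multiplying the combined standard uncertainty by a coverage factor k, and the interval is then the measured value plus/minus U (all values hypothetical):

```python
# Hypothetical measured concentration and combined standard uncertainty (mg/L)
c_measured = 5.0
u_combined = 0.2
k = 2            # coverage factor; k = 2 gives ~95% coverage for normal data

U = k * u_combined                           # expanded uncertainty: 0.4 mg/L
interval = (c_measured - U, c_measured + U)  # (4.6, 5.4) mg/L
```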
Method LoD: LoD estimated by using matrix-matched samples that are put through the whole analytical method (including the sample preparation).
Maximum residue limit (MRL): the maximum allowed concentration limit set for the compound in certain matrices.
Precision: characterizes the closeness of agreement between measured values obtained by replicate measurements on the same or similar objects under specified conditions.
A prodrug is a medication or compound that, after administration, is metabolized (i.e., converted within the body) into a pharmacologically active drug (e.g., by cleavage of an ester bond within the prodrug). Inactive prodrugs are pharmacologically inactive medications that are metabolized into an active form within the body.
The recovery of an analyte in an assay is the detector response obtained from an amount of the analyte added to and extracted from the biological matrix, compared to the detector response obtained for the true concentration of the analyte in solvent. Recovery pertains to the extraction efficiency of an analytical method within the limits of variability.
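Following the recovery definition above, a minimal sketch of the calculation, assuming hypothetical peak areas for the same nominal concentration in extracted matrix and in solvent:

```python
# Hypothetical peak areas at the same nominal concentration
response_extracted = 8500.0   # analyte spiked into matrix, then extracted
response_solvent = 10000.0    # analyte in pure solvent (no extraction)

recovery_pct = 100 * response_extracted / response_solvent   # 85.0 %
```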
Repeatability: expresses the closeness of the results obtained with the same sample using the same measurement procedure, the same operators, the same measuring system, the same operating conditions and the same location over a short period of time.
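Repeatability is typically reported as the standard deviation (or relative standard deviation) of replicate results obtained within one short run; a sketch with hypothetical replicates:

```python
import statistics

# Hypothetical replicate results (mg/L) from a single short run
replicates = [4.9, 5.1, 5.0, 5.2, 4.8]

mean = statistics.mean(replicates)    # 5.0
sd_r = statistics.stdev(replicates)   # repeatability standard deviation, ~0.158
rsd_pct = 100 * sd_r / mean           # repeatability RSD, ~3.2 %
```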
Reproducibility: expresses the precision between measurement results obtained at different laboratories.
Residual: the difference between the experimental signal and the signal calculated according to the calibration function.
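For a linear calibration function y = a + b·x, the residuals defined above are computed point by point after fitting; a self-contained ordinary-least-squares sketch with hypothetical calibration data:

```python
# Hypothetical calibration data
conc = [1.0, 2.0, 3.0, 4.0]        # concentrations (x)
signal = [10.2, 19.8, 30.1, 40.0]  # measured signals (y)

# Ordinary least-squares fit of y = a + b*x
n = len(conc)
x_mean = sum(conc) / n
y_mean = sum(signal) / n
b = sum((x - x_mean) * (y - y_mean) for x, y in zip(conc, signal)) / \
    sum((x - x_mean) ** 2 for x in conc)      # slope: 9.97
a = y_mean - b * x_mean                       # intercept: 0.10

# Residual = measured signal minus signal predicted by the calibration function
residuals = [y - (a + b * x) for x, y in zip(conc, signal)]
```

Plotting residuals against concentration is a common way to spot non-linearity or heteroscedasticity in calibration data.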
Robustness: a parameter used to evaluate the constancy of the results under variations of the internal factors of the method, such as sample preparation, mobile phase composition, mobile phase flow rate, injection volume, column temperature, etc.
Ruggedness: a parameter used to evaluate the constancy of the results when external factors such as analyst, laboratory, instrument, reagents and days are varied.
Selectivity: the extent to which other substances interfere with the determination of a substance according to a given procedure.
Sensitivity: the change in instrument response that corresponds to a change in the measured quantity; the gradient of the response curve.
Stability (ST%): characterizes the change in the analyte content in a given matrix under specific conditions and over a period of time.
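One common way to express the stability defined above is as the percentage of analyte remaining after storage, relative to a freshly prepared sample; a sketch with hypothetical concentrations:

```python
# Hypothetical concentrations (ng/mL)
c_initial = 50.0   # measured immediately after preparation
c_stored = 46.0    # measured after storage under the test conditions

st_pct = 100 * c_stored / c_initial   # 92.0 % of the analyte remains
```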