Bioanalytical method development and validation
U K Singh, S Pandey, P Pandey, P K Keshri, P Wal
The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development culminating in a marketing approval. The objective of this paper is to review the sample preparation of drug in biological matrix and to provide practical approaches for determining selectivity, specificity, limit of detection, lower limit of quantitation, linearity, range, accuracy, precision, recovery, stability, ruggedness, and robustness of liquid chromatographic methods to support pharmacokinetic, toxicokinetic, bioavailability, and bioequivalence studies.
Bioanalysis, employed for the quantitative determination of drugs and their metabolites in biological fluids, plays a significant role in the evaluation and interpretation of bioequivalence, pharmacokinetic (PK), and toxicokinetic studies. Selective and sensitive analytical methods for quantitative evaluation of drugs and their metabolites are critical for the successful conduct of pre-clinical and/or biopharmaceutics and clinical pharmacology studies.
A bioanalytical method consists of two main components: sample preparation and detection of the compound. Sample preparation is a technique used to clean up a sample before analysis and/or to concentrate it to improve its detection. When the samples are biological fluids such as plasma, serum or urine, the technique is described as bioanalytical sample preparation. The determination of drug concentrations in biological fluids yields the data used to understand the time course of drug action, or pharmacokinetics, in animals and man, and is an essential component of the drug discovery and development process. Sample preparation prior to chromatographic separation has three principal objectives:
The dissolution of the analyte in a suitable solvent
Removal of as many interfering compounds as possible and
Pre-concentration of the analyte.
The bioanalytical chemist can choose from a range of sample preparation methodologies, as listed in the table below.
Protein precipitation, liquid-liquid extraction and solid phase extraction (SPE) are routinely used.
Protein precipitation—In protein precipitation, acids or water-miscible organic solvents are used to remove the protein by denaturation and precipitation. Acids, such as trichloroacetic acid (TCA) and perchloric acid, are very efficient at precipitating proteins. The proteins, which are in their cationic form at low pH, form insoluble salts with the acids. Organic solvents, such as methanol, acetonitrile, acetone and ethanol, although having a relatively low efficiency in removing plasma proteins, have been widely used in bioanalysis because of their compatibility with high-performance liquid chromatography (HPLC) mobile phases. These organic solvents which lower the solubility of proteins and precipitate them from solutions have an effectiveness which is inversely related to their polarity.
A volume of sample matrix (one part) is diluted with a volume of organic solvent or other precipitating agent (three to four parts), followed by vortex mixing and then centrifugation or filtration to isolate or remove the precipitated protein mass. The supernatant or filtrate is then analysed directly. Protein precipitation dilutes the sample; when a concentration step is required, the supernatant can be isolated, evaporated to dryness and then reconstituted before analysis. This procedure is popular because it is simple, universal, inexpensive and can be automated in microplates. However, matrix components are not efficiently removed and will be contained in the isolated supernatant or filtrate. In MS/MS detection systems, matrix contaminants have been shown to reduce the efficiency of the ionisation process; the observed effect is a loss in response, and this phenomenon is referred to as ionisation suppression⁴.
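As an illustration of the dilution that protein precipitation introduces, the short Python sketch below (with hypothetical volumes and concentrations) back-calculates the plasma concentration from the measured supernatant concentration:

```python
# Dilution-factor bookkeeping for protein precipitation (illustrative values).
# One part plasma diluted with three parts precipitating solvent gives a
# four-fold dilution, so the measured supernatant concentration must be
# multiplied back up to obtain the plasma concentration.

def back_calculate(measured_ng_ml: float, parts_sample: float, parts_solvent: float) -> float:
    """Return the plasma concentration from the measured supernatant concentration."""
    dilution_factor = (parts_sample + parts_solvent) / parts_sample
    return measured_ng_ml * dilution_factor

# e.g. 100 uL plasma + 300 uL acetonitrile; supernatant measures 25 ng/mL:
print(back_calculate(25.0, 1, 3))  # four-fold dilution -> 100.0 ng/mL in plasma
```

This simple correction assumes the analyte partitions fully into the supernatant; any losses are instead captured in the recovery experiments discussed later.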
Liquid-liquid extraction (LLE): Liquid-liquid extraction is the direct extraction of the biological material with a water-immiscible solvent. The analyte is isolated by partitioning between the organic phase and the aqueous phase, and should be preferentially distributed into the organic phase under the chosen conditions. For effective LLE, the following considerations apply:
The analyte must be soluble in the extracting solvent.
The solvent should have a low viscosity to facilitate mixing with the sample matrix.
The solvent should have a low boiling point to facilitate its removal at the end of the extraction.
pH control allows fractionation of the sample into acidic, neutral and basic components.
A large surface area is important to ensure rapid equilibration; this is achieved by thorough mixing, using mechanical or manual shaking or vortexing.
Generally, selectivity is improved by choosing the least polar solvent in which the analyte is soluble.
Solid phase extraction (SPE)—In SPE the analyte is retained on the solid phase while the sample passes through, and is then eluted with an appropriate solvent. SPE can be considered a simple on/off type of chromatography. A typical SPE sorbent consists of 40-60 µm silica particles to which a hydrocarbon phase has been bonded. SPE is typically carried out as a five-step process: condition, equilibrate, load, wash and elute. The sorbent is conditioned by passing a solvent, usually methanol, through it to wet the packing material and solvate the functional groups of the sorbent. The sorbent is then equilibrated with water or an aqueous buffer. Care must be taken to prevent the phase from drying out before the sample is loaded, otherwise variable recoveries can be obtained. Samples are diluted 1:1 with an aqueous buffer prior to loading to reduce viscosity and prevent the sorbent bed from becoming blocked. Aqueous and/or organic washes are then used to remove interferences.
Table: Typical choices of sample preparation techniques useful in bioanalysis
Detection of the compound—Detection is usually performed following chromatographic separation of the analyte from other components present in the biological extract; the detector of choice is a mass spectrometer. Before a bioanalytical method can be implemented for routine use, it is widely recognised that it must first be validated to demonstrate that it is suitable for its intended purpose.
Bioanalytical method validation includes all of the procedures that demonstrate that a particular method used for quantitative measurement of analytes in a given biological matrix, such as blood, plasma, serum, or urine, is reliable and reproducible for the intended use. The fundamental parameters for this validation include accuracy, precision, selectivity, sensitivity, reproducibility, and stability. Validation involves documenting, through the use of specific laboratory investigations, that the performance characteristics of the method are suitable and reliable for the intended analytical applications. The acceptability of analytical data corresponds directly to the criteria used to validate the method.
In the early stages of drug development, it is usually not necessary to perform all of the various validation studies. Many researchers focus on specificity, linearity and precision studies for drugs in the preclinical through Phase II (preliminary efficacy) stages. The remaining validation studies are performed when the drug reaches the Phase II (efficacy) stage of development and has a higher probability of becoming a marketed product. For pharmaceutical methods, however, guidelines from the United States Pharmacopoeia (USP), the International Conference on Harmonisation (ICH) and the Food and Drug Administration (FDA) now provide a framework: a regulatory submission must include studies of these fundamental parameters.
There is a general agreement that at least the following parameters should be evaluated for quantitative procedures.
1. Specificity/selectivity—A method is specific if it produces a response for only a single analyte. Since it is almost impossible to develop a chromatographic assay for a drug in a biological matrix that responds only to the compound of interest, the term selectivity is more appropriate. The selectivity of a method is its ability to produce a response for the target analyte that is distinguishable from all other responses (e.g. endogenous compounds).
2. Accuracy—The accuracy of an analytical method describes the closeness of the mean test results obtained by the method to the true value (concentration) of the analyte; this is sometimes termed trueness. The two most commonly used ways to determine the accuracy, or method bias, of an analytical method are (i) analysing control samples spiked with analyte and (ii) comparing the analytical method with a reference method⁷.
3. Precision—Precision is the closeness of individual measures of an analyte when the procedure is applied repeatedly to multiple aliquots of a single homogeneous volume of biological matrix. Precision has several components: repeatability, intermediate precision, and reproducibility (ruggedness). Repeatability describes how the method performs in one laboratory, on one instrument, within a given day. Intermediate precision refers to how the method performs, both qualitatively and quantitatively, within one laboratory but from instrument to instrument and day to day. Finally, reproducibility refers to how the method performs from laboratory to laboratory, day to day, analyst to analyst, and instrument to instrument, again in both qualitative and quantitative terms. The duration of these time intervals is not defined. Terms such as within-day, intra-assay, intra-run and intra-batch are commonly used to express repeatability, while between-day, inter-assay, inter-run and inter-batch express the reproducibility of the analytical method. The expressions intra/within-day and inter/between-day precision are not preferred, because a set of measurements could take longer than 24 hours, or multiple sets could be analysed within the same day.
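Accuracy and precision are computed from replicate quality-control determinations. The Python sketch below (with hypothetical QC values) expresses accuracy as the percent bias of the mean from the nominal concentration and precision as the coefficient of variation, the two quantities against which the acceptance criteria given later are judged:

```python
import statistics

def percent_bias(measured: list, nominal: float) -> float:
    """Accuracy: deviation of the mean result from the nominal (true) concentration, in percent."""
    return (statistics.mean(measured) - nominal) / nominal * 100

def percent_cv(measured: list) -> float:
    """Precision: coefficient of variation of replicate measurements, in percent."""
    return statistics.stdev(measured) / statistics.mean(measured) * 100

# Six hypothetical replicate QC determinations at a nominal 50 ng/mL:
qc = [48.2, 51.0, 49.5, 52.3, 47.8, 50.6]
print(f"bias = {percent_bias(qc, 50.0):+.1f}%")  # acceptance: within +/-15 percent
print(f"CV   = {percent_cv(qc):.1f}%")           # acceptance: not more than 15 percent
```

The same two functions apply at the LLOQ, where the wider 20 percent limits are used instead.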
4. Detection limit—The limit of detection (LOD) is the lowest concentration of analyte in the sample that can be detected, but not quantified, under the stated experimental conditions. The LOD is also defined as the lowest concentration that can be distinguished from the background noise with a certain degree of confidence. There is overall agreement that the LOD should represent the smallest detectable amount or concentration of the analyte of interest.
5. Quantitation limit—The quantitation limit of an individual analytical procedure is the lowest amount of analyte in a sample which can be quantitatively determined with suitable precision and accuracy. The LLOQ value is determined by the presence of background signal (accuracy) and the reproducibility of the analytical method (precision). The LLOQ is the lowest concentration point of the calibration curve.
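One common way to obtain first estimates of both limits during development is the ICH Q2 calibration-based approach: LOD = 3.3σ/S and LOQ = 10σ/S, where S is the slope of the calibration line and σ its residual standard deviation. A minimal Python sketch with hypothetical calibration data:

```python
import statistics

def lod_loq_from_calibration(conc, response):
    """Estimate LOD and LOQ from an unweighted linear calibration using the
    ICH Q2 convention: LOD = 3.3*sigma/S, LOQ = 10*sigma/S, where S is the
    slope and sigma the residual standard deviation of the fit."""
    n = len(conc)
    mx, my = statistics.mean(conc), statistics.mean(response)
    slope = (sum((x - mx) * (y - my) for x, y in zip(conc, response))
             / sum((x - mx) ** 2 for x in conc))
    intercept = my - slope * mx
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(conc, response))
    sigma = (ss_res / (n - 2)) ** 0.5  # residual standard deviation
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical calibration standards (ng/mL) and peak-area responses:
lod, loq = lod_loq_from_calibration([1, 2, 5, 10, 20], [2.1, 4.0, 10.2, 19.8, 40.1])
print(f"LOD ~ {lod:.2f} ng/mL, LOQ ~ {loq:.2f} ng/mL")
```

In bioanalytical practice the LLOQ is ultimately fixed as the lowest calibration point that meets the precision and accuracy criteria, rather than taken directly from this formula; the calculation simply indicates where that point can plausibly be placed.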
6. Linearity—According to the ICH definition, 'the linearity of an analytical procedure is its ability (within a given range) to obtain test results which are directly proportional to the concentration (amount) of analyte in the sample'. The concentration range of the calibration curve should at least span the concentrations expected to be measured in the study samples. If the total range cannot be described by a single calibration curve, two calibration ranges can be validated. It should be kept in mind that the accuracy and precision of the method will be negatively affected at the extremes of the range if the range is extended beyond necessity. The correlation coefficient is the most widely used test of linearity. Although the correlation coefficient is of benefit for demonstrating a high degree of relationship between concentration-response data, it is of little value in establishing linearity. An acceptably high correlation coefficient alone therefore does not guarantee linearity, and further tests, for example a lack-of-fit test, are necessary.
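A lack-of-fit test of the kind mentioned above requires replicate standards at each concentration level: it partitions the residual error of the fitted line into pure error (replicate scatter about each level mean) and lack-of-fit, and a large F ratio indicates curvature even when the correlation coefficient is high. A Python sketch with hypothetical replicate calibration data:

```python
from collections import defaultdict
import statistics

def lack_of_fit_F(conc, resp):
    """Return (F, df_lof, df_pe) for an unweighted straight-line fit with
    replicate standards. Requires at least three distinct concentration
    levels and replicates at one or more of them."""
    n = len(conc)
    mx, my = statistics.mean(conc), statistics.mean(resp)
    slope = (sum((x - mx) * (y - my) for x, y in zip(conc, resp))
             / sum((x - mx) ** 2 for x in conc))
    intercept = my - slope * mx
    groups = defaultdict(list)
    for x, y in zip(conc, resp):
        groups[x].append(y)
    # Pure-error SS: replicate scatter around each concentration-level mean
    ss_pe = sum(sum((y - statistics.mean(ys)) ** 2 for y in ys) for ys in groups.values())
    # Total residual SS around the fitted line; the excess is lack-of-fit
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(conc, resp))
    ss_lof = ss_res - ss_pe
    df_lof, df_pe = len(groups) - 2, n - len(groups)
    F = (ss_lof / df_lof) / (ss_pe / df_pe)
    # Compare F against the critical value F(df_lof, df_pe) at the chosen
    # significance level; a significant F means the straight line is inadequate.
    return F, df_lof, df_pe
```

For duplicate standards whose level means lie exactly on a straight line, the lack-of-fit sum of squares vanishes and F is essentially zero, as expected.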
7. Range—The range of an analytical procedure is the interval between the upper and lower concentration (amounts) of analyte in the sample (including these concentrations) for which it has been demonstrated that the analytical procedure has a suitable level of precision, accuracy and linearity.
8. Robustness—It is the measure of its capacity to remain unaffected by small, but deliberate variations in method parameters and provides an indication of its reliability during normal usage.
9. Extraction recovery—Recovery can be calculated by comparing the analyte response after sample workup with the response of a solution containing the analyte at the theoretical maximum concentration. Absolute recoveries therefore usually cannot be determined if the sample workup includes a derivatisation step, as the derivatives are usually not available as reference substances. Nevertheless, the guidelines of the Journal of Chromatography require the determination of recovery for the analyte and internal standard at high and low concentrations. Recovery does not seem to be a major issue for forensic and clinical toxicologists as long as precision, accuracy (bias), LLOQ and especially LOD are satisfactory. However, during method development one should of course try to optimise recovery.
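Numerically, recovery is simply the ratio of the extracted-sample response to the response of an unextracted standard representing 100 percent recovery, as a minimal Python sketch with hypothetical peak areas shows:

```python
def percent_recovery(extracted_response: float, unextracted_response: float) -> float:
    """Extraction recovery: response after sample workup relative to an
    unextracted standard at the same nominal concentration (100 percent)."""
    return extracted_response / unextracted_response * 100

# Hypothetical mean peak areas at a low QC concentration:
print(percent_recovery(8.4e5, 9.6e5))  # 87.5
```

In practice the ratio is taken from mean responses at each of the low, medium and high QC levels, for both the analyte and the internal standard.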
10. Stability—It is the chemical stability of an analyte in a given matrix under specific conditions for given time intervals. The aim of a stability test is to detect any degradation of the analyte(s) of interest, during the entire period of sample collection, processing, storing, preparing, and analysis. All but long term stability studies can be performed during the validation of the analytical method. Long term stability studies might not be complete for several years after clinical trials begin. The condition under which the stability is determined is largely dependent on the nature of the analyte, the biological matrix, and the anticipated time period of storage (before analysis).
|Parameter|Recommendation|
|---|---|
|Selectivity (specificity)|Blank samples of the appropriate biological matrix (plasma, urine or other matrix) should be obtained from at least six sources. Each blank should be tested for interference, and selectivity should be ensured at the LLOQ.|
|Accuracy|Should be measured using a minimum of six determinations per concentration. A minimum of three concentrations in the range of expected concentrations is recommended. The mean value should be within ±15 percent of the actual value, except at the LLOQ, where it should not deviate by more than ±20 percent. This deviation of the mean from the true value serves as the measure of accuracy.|
|Precision|Should be measured using a minimum of five determinations per concentration. A minimum of three concentrations in the range of expected concentrations is recommended. The precision determined at each concentration level should not exceed a coefficient of variation (CV) of 15 percent, except for the LLOQ, where it should not exceed a CV of 20 percent.|
|Recovery|Recovery experiments should be performed at three concentrations (low, medium and high), with unextracted standards representing 100 percent recovery.|
|Calibration curve|Should consist of a blank sample (matrix sample processed without internal standard), a zero sample (matrix sample processed with internal standard) and six to eight non-zero samples covering the expected range, including the LLOQ.|
|LLOQ|The analyte response should be at least five times the blank response. The analyte peak should be identifiable, discrete and reproducible, with a precision of 20 percent and an accuracy of 80-120 percent.|
|Freeze-thaw stability|Analyte stability should be determined after three freeze-thaw cycles. At least three aliquots at each of the low and high concentrations should be stored at the intended storage temperature for 24 hours and thawed at room temperature. When completely thawed, the samples should be refrozen for 12-24 hours under the same conditions. This cycle should be repeated two more times, and the samples analysed after the third cycle.|