Enzymes are often exquisitely selective, acting on one and only one chemical structure (the substrate). Enzymatic reactions are used to determine either a substrate's concentration or an enzyme's activity. For many analytes in biologic fluids, such as glucose, cholesterol, bilirubin, and creatinine, nature has provided enzymes that react selectively with these important molecules. Such enzymes are used to catalyze the conversion of these molecules in reactions that generate products that can be observed photometrically. For example, the enzyme hexokinase converts glucose to glucose-6-phosphate, which in turn is used to produce a molecule of NADPH; for each molecule of glucose, one molecule of NADPH is produced. NADPH absorbs in the ultraviolet region and can be measured at 340 nm. Detection of a biologic substrate like glucose can be carried out as an endpoint reaction, by measuring the maximum amount of NADPH formed, or as a rate reaction, by measuring the rate at which NADPH is formed. Many analytes of interest are themselves enzymes. To measure the amount of enzyme present, a substrate recognized only by that enzyme is used. For example, the enzyme lipase releases fatty acids from triglycerides and diglycerides. Lipase activity is measured by using the products of lipase action on diglycerides to generate glycerol molecules, which in turn produce a colored product that absorbs light at 548 nm. The rate of increase in absorbance at 548 nm is a function of lipase activity.
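The endpoint calculation described above can be sketched with the Beer-Lambert law (A = εcl). The molar absorptivity of NADPH at 340 nm (about 6220 L/(mol·cm)) is a widely cited literature value, but the absorbance reading and dilution factor below are hypothetical illustrations, not values from any particular assay.

```python
# Sketch: estimating glucose from the NADPH formed in a hexokinase
# endpoint assay, via the Beer-Lambert law A = epsilon * c * l.
# One mole of NADPH is formed per mole of glucose, so the NADPH
# concentration equals the original glucose concentration.

EPSILON_NADPH_340 = 6220.0  # L/(mol*cm), literature value for NADPH at 340 nm
PATH_LENGTH_CM = 1.0        # standard 1 cm cuvette path length

def glucose_mmol_per_l(delta_absorbance: float,
                       sample_dilution: float = 1.0) -> float:
    """Endpoint calculation: convert the absorbance change at 340 nm
    into a glucose concentration in mmol/L."""
    conc_mol_per_l = delta_absorbance / (EPSILON_NADPH_340 * PATH_LENGTH_CM)
    return conc_mol_per_l * 1000.0 * sample_dilution  # mol/L -> mmol/L

# Hypothetical reading: delta-A of 0.311 on a 1:100 diluted sample.
print(round(glucose_mmol_per_l(0.311, sample_dilution=100), 2))
```

A rate (kinetic) version of the same assay would instead use the slope of absorbance versus time over the initial linear portion of the reaction.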
Chemical reactions of analytes (the substances being measured) produce products that can be detected by using optical methods. Changes in the light absorbed by these products are used to determine the concentration of the analyte in question. Solutions of known concentration (calibrators) are used to establish a relationship between the size of an optical signal and the corresponding concentration of the analyte.
Some tests are based on the formation of insoluble particles that interfere with the passage of light through the solution. Here, the analyte reacts with an added reagent to produce insoluble particles that remain suspended in the solution. When light strikes these particles, some of it is scattered in different directions. As the amount of analyte increases, the number of particles formed increases; consequently, the amount of light scattered by the particles increases and the amount of light that passes through the solution decreases. The loss of light passing straight through the solution can be measured; this method is called turbidimetry. In turbidimetry, the detector is placed in a direct line with the incident light, and the light sensed by the detector decreases as the number of analyte particles increases. Antibodies are often used with this method, which then represents a type of immunometric assay, specifically immunoturbidimetry. The antibodies in the reagents cause analyte molecules to form complexes or lattices, and these large particle aggregates enhance the scattering of light, increasing the analytical signal that is measured. Turbidimetric methods are often chosen to measure proteins such as transferrin and prealbumin, two important transport proteins in the blood. Proteins are relatively large molecules that can be easily cross-linked by selective reagents to produce aggregate particles of the right size to scatter light in the visible or ultraviolet range. The protein transferrin, for example, can be mixed with antitransferrin antibodies and the resultant immunoprecipitate quantified in a turbidimetric rate or endpoint reaction.
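The turbidimetric principle above can be illustrated with a minimal sketch: the fraction of light reaching the in-line detector falls off exponentially as particle formation (turbidity) rises, in a Beer-Lambert-like relationship. The constant K linking analyte concentration to turbidity is purely hypothetical here.

```python
# Sketch of turbidimetry: more analyte -> more particles -> more
# scattering -> less light transmitted straight through to the detector.

def transmitted_fraction(turbidity: float) -> float:
    """Fraction of incident light reaching the in-line detector,
    analogous to Beer-Lambert: I/I0 = 10**(-turbidity)."""
    return 10 ** (-turbidity)

K = 0.002  # hypothetical turbidity produced per unit of analyte concentration
for conc in (0, 100, 200, 400):
    print(conc, round(transmitted_fraction(K * conc), 3))
```

The instrument reports the logarithm of this light loss as a turbidity signal, which rises with analyte concentration and can be read against a calibration curve.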
Antibodies to protein antigens can bind to multiple sites (or epitopes) on the protein molecule and can cross-link many different molecules of the same protein to form an insoluble precipitate composed solely of antibody and antigen molecules. This immunoprecipitate can be detected using a turbidimetric method. When an analyte is a small molecule that cannot be cross-linked to produce an immunoprecipitate, it is still possible to use a turbidimetric method.
This important process links the analytical signal (i.e., the amount of light measured in photometric analyses) with the concentration of analyte. A series of solutions containing the analyte at known concentrations is measured, and the signal produced at each concentration is recorded. These results can be expressed as a calibration curve. The purpose of a calibration curve is to establish the relationship between the concentration of analyte and the magnitude of the optical signal given by the measuring device. This relationship can be linear or nonlinear; for example, the signal may rise linearly with increasing analyte concentration, or fall in a nonlinear fashion as the analyte concentration rises. Interpolation then establishes the expected signal for any concentration of analyte between the lowest and the highest calibrator. The signal from a patient sample is compared with the calibration curve, and the concentration of analyte that produces this signal is determined. As measurement technologies continue to develop, it becomes increasingly important to rely on well-defined standards.
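As a rough sketch of reading a sample against a calibration curve, the function below maps a measured signal back to a concentration by linear interpolation between the two bracketing calibrators. The calibrator concentrations and signals are hypothetical.

```python
# Sketch: five calibrators define the curve; a patient signal is
# interpolated between the two calibrators that bracket it.

calib_conc = [0.0, 2.5, 5.0, 10.0, 20.0]       # hypothetical, e.g. mmol/L
calib_signal = [0.02, 0.26, 0.50, 0.98, 1.94]  # hypothetical absorbance units

def interpolate_concentration(signal: float) -> float:
    """Linear interpolation between bracketing calibrators.
    Signals outside the calibrated range raise an error rather
    than being extrapolated."""
    if not (calib_signal[0] <= signal <= calib_signal[-1]):
        raise ValueError("signal outside the calibrated range")
    for i in range(len(calib_signal) - 1):
        lo, hi = calib_signal[i], calib_signal[i + 1]
        if lo <= signal <= hi:
            frac = (signal - lo) / (hi - lo)
            return calib_conc[i] + frac * (calib_conc[i + 1] - calib_conc[i])

# Hypothetical patient signal of 0.74 falls between the 5.0 and
# 10.0 mmol/L calibrators.
print(round(interpolate_concentration(0.74), 2))
```

Refusing to extrapolate beyond the highest calibrator mirrors laboratory practice: signals above the range trigger dilution and re-assay rather than a reported guess.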
For many analytes, the preparation of a pure reference material that can serve as a primary standard is impossible. Analytes such as proteins often exist in many different forms that may be present in differing amounts in different patients, so it is difficult to identify one form of a protein as an ideal reference material. Other analytes, such as bilirubin, are inherently unstable and break down when exposed to light or air, or when separated from stabilizing molecules found in solution. Enzymes often lose their activity if isolated in pure form. For these types of analytes, no suitable primary reference material can be prepared. Instead, values for these analytes are traceable to a consensus value established by agreement among laboratory professionals; the calibrator values are assigned for specific methods and instruments. Many clinical laboratory methods are affected by the sample matrix, a term for the physiologic or artificial milieu that contains the analyte. If a calibrator is used with a different method, matrix effects may result in inaccurate calibration.
Different methods that give the same result for a patient sample may give different results when analyzing synthetic samples, such as calibrator fluids, proficiency testing (PT) samples, and quality control (QC) samples. Ideally, these synthetic samples should mimic a patient sample, but often they do not, because the matrix has undergone a manufacturing process and no longer resembles a fresh human specimen. Calibrator, PT, and QC solutions differ from patient samples in that they are supplemented with many different substances to generate wide ranges of analyte concentrations. In addition, these samples are often frozen or lyophilized (freeze-dried to remove all liquid) to minimize decomposition of analytes during storage. Freezing or lyophilization, followed by thawing or reconstitution with liquid, may also change some properties of the solution. When added substances, freezing, or lyophilization alter the solution's properties in a way that biases the measured result, the bias is said to result from a "matrix effect". A separate challenge in the calibration process is determining the highest and lowest signals that can be reliably measured and related to an analyte concentration. These limits are dictated partly by properties of the method and partly by properties of the instrument being used for the test. The company that develops a test method typically determines the analytic measurement range, which defines the lowest to highest measurable quantities.
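A matrix effect can be quantified as the percent bias between a measured result and the material's assigned target value, comparing fresh patient material against a processed (e.g., lyophilized and reconstituted) material. All numbers in the sketch below are hypothetical.

```python
# Sketch: percent bias as a simple measure of a matrix effect.
# A fresh patient pool and a reconstituted lyophilized QC material
# share the same hypothetical assigned target value.

def percent_bias(measured: float, target: float) -> float:
    """Signed percent difference of a measured result from its target."""
    return 100.0 * (measured - target) / target

target = 8.0        # hypothetical assigned analyte value
fresh_result = 8.2  # hypothetical result on a fresh patient pool
qc_result = 8.8     # hypothetical result on reconstituted lyophilized QC

print(round(percent_bias(fresh_result, target), 1))  # small bias
print(round(percent_bias(qc_result, target), 1))     # larger bias
```

A method that recovers the fresh pool accurately but shows a markedly larger bias on the processed material is exhibiting a matrix effect of the kind described above.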
In general, most prescribed drugs do not require special monitoring of drug levels in blood, although occasionally tests are needed to ensure that a drug is not adversely affecting liver or kidney function. Medications can be toxic if consumed in quantities that exceed the body's capacity to metabolize them.