Printed from Preskorn.com: http://www.preskorn.com/columns/9603.html
Have You Phenotyped Your Patient Lately?
SHELDON H. PRESKORN, MD
Journal of Practical Psychiatry and Behavioral Health, March 1996, 115-117
When I lecture on the subject of drug metabolism, I frequently ask clinical audiences whether they have ever phenotyped their patients in terms of whether they are deficient in the cytochrome P450 enzyme, CYP 2D6. They invariably answer: "No."
I then ask if they have ever used therapeutic drug monitoring (TDM) to determine what plasma level of a tricyclic antidepressant (TCA) a patient has developed on a given dose of a TCA. Most answer: "Yes."
I then point out that using TDM with such drugs is tantamount to assessing (i.e., phenotyping) the functional status of CYP 2D6 activity in that patient since that activity is the principal factor that determines what plasma TCA level the patient will develop on a given dose of a TCA. Note that dose here refers to the dose the patient is actually taking, not what the physician is prescribing. (For a detailed discussion of cytochrome P450 enzymes, see my November 1995 column.)
In this column, I continue a series devoted to understanding why different patients respond differently to the same dose of the same drug. Here I focus on the role of TDM, which is a refinement of the standard dose-response strategy and involves measuring the concentration of the drug achieved in a specific patient on a specific dose. The concentration in plasma is typically used as a surrogate for the concentration at the site of action because it is relatively easy to obtain and is correlated with concentration in other body compartments (e.g., a specific receptor in the brain). The goal of TDM is typically to ensure that the patient is on a dose that will produce a plasma drug concentration within a range that usually provides a therapeutically desired response in the majority of patients without undue adverse effects. Advances such as TDM have been made possible by research in clinical pharmacology and pharmacogenetics. They can help clinicians detect the reasons behind the interindividual differences that determine much of the variance in response among patients receiving the same dose of the same drug.
Dose Titration Based on Clinical Assessment of Response
Before such techniques as TDM were developed, physicians had to rely exclusively on the time-honored and widely used but inefficient approach of dose titration based on clinical assessment of response (Figure 1). In the dose titration approach, the physician starts the patient on the "usual" starting dose and waits to see the response. If the response is inadequate, the physician titrates the dose upward until the patient either responds or experiences an adverse effect. Depending on the drug (e.g., TCA), such adverse effects can be a nuisance problem (e.g., excessively dry mouth) or a serious problem (e.g., cardiac conduction disturbance). In the latter case, the inefficiency of dose titration based on clinical assessment of response can have tragic consequences for the patient.
FIGURE 1 - Dose titration based on clinical assessment of response
The dose titration approach may be even more error prone in psychiatry than in other areas of medicine because the adverse effects produced by psychiatric medications (e.g., TCAs, selective serotonin reuptake inhibitors, and neuroleptics) can present as a worsening of the psychiatric syndrome for which the drug was initially prescribed. In such a scenario, the physician may increase rather than reduce the dose and thus increase the magnitude and/or change the nature of the adverse effects.
At this point, some readers are undoubtedly shaking their heads and saying there is no "usual" dose and that they always adjust the dose based on the patient's unique characteristics. This contention is both right and wrong. There is no dose that works for everyone. That is one point of this series of columns. Nonetheless, most drugs have a dose that works for most people. If that were not the case, the drug would undoubtedly never have been marketed because the response in the clinical trials would have been too variable to show the group differences needed for approval.
Obviously, physicians try to adjust the dose to fit the patient. The old adage of "start low and go slow" with the elderly is just such an adjustment strategy. Physicians often use the same adage with children and adolescents. The dose may also be adjusted based on the body weight of the patient or on their health status (e.g., the presence of significant impairment of cardiac, hepatic, or renal function). However, these approaches are crude and do not account for the bulk of differences that are seen in physically healthy, young to middle-aged adults on the same dose of the same drug.
Factors That Cause Variance in Drug Response
Pharmacogenetics is the study of how genetic factors affect drug response. Such genetic factors include the functional activity of CYP enzymes. These enzymes determine what concentration a patient will achieve on a given dose of a given drug by determining the elimination rate of that drug in that patient. This relationship is shown in equation 1:
steady-state drug concentration = dosing rate / elimination rate
A drug must affect a site of action in the body (e.g., a receptor, an uptake pump) that is physiologically capable of mediating the effect we clinically observe (see my September 1995 column). However, being capable of affecting such a site of action is not enough. The drug must reach a critical concentration at the site of action, and that concentration must be sustained for a critical period of time to activate or inhibit that site of action to a sufficient extent and for a sufficient duration to produce a clinically meaningful effect. As seen in equation 1, reaching and maintaining such a concentration is dependent on the dose the patient takes per unit of time (i.e., the dosing rate) relative to the elimination rate of that drug in that patient.
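The relationship in equation 1 can be sketched in a few lines of code. This is a minimal illustration, not a clinical tool; the dose, the elimination-rate values, and the fourfold spread between "slow" and "rapid" metabolizers are all illustrative assumptions.

```python
# A minimal sketch of equation 1. All names and numbers here are
# illustrative assumptions, not clinical values.

def steady_state_concentration(dosing_rate_mg_per_day, elimination_rate):
    """Equation 1: steady-state concentration = dosing rate / elimination rate.

    elimination_rate is expressed here in clearance-like units
    (mg per day per ng/mL), so the result is in ng/mL.
    """
    return dosing_rate_mg_per_day / elimination_rate

# The same 150 mg/day dose in three hypothetical patients whose
# elimination rates differ fourfold:
for label, elim in [("slow metabolizer", 0.5),
                    ("usual patient", 1.0),
                    ("rapid metabolizer", 2.0)]:
    css = steady_state_concentration(150, elim)
    print(f"{label}: {css:.0f} ng/mL")
```

The point of the sketch is simply that the same nominal dose produces a fourfold range of steady-state concentrations when the elimination rate varies fourfold, which is why dose alone is a poor predictor of drug exposure.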
As is readily apparent from equation 1, a patient may appear to be "sensitive" to a drug if the true dosing rate is functionally much higher than the apparent dosing rate because the patient's elimination rate is substantially slower than that of the "usual" patient. In this scenario, the patient appears to be "sensitive" but in fact actually develops a much higher concentration than the "usual" patient on the same dose. The converse can also occur, when the patient appears to be "insensitive" or "resistant" to the effect of the drug due to a faster than usual elimination rate.
So why do some patients have a substantially slower or faster elimination rate for a drug than other patients? And can dose adjustment based on TDM be a more efficient way of fitting the dose to the patient than the traditional dose titration approach?
The elimination rate of a given drug in a given patient (equation 1) is a function of multiple variables. As I discussed in my November 1995 column, most drugs are converted by oxidative metabolism (i.e., phase I) into polar metabolites which are then conjugated with substances like glucuronic acid (i.e., phase II metabolism). The conjugated products are then excreted in the urine. The oxidative reaction (i.e., "biotransformation" or "metabolism") is typically performed by a specific CYP enzyme which is termed the principal CYP enzyme for that drug in the "usual" patient. The activity of that enzyme can vary among different individuals on a genetically determined basis (i.e., a trait phenomenon), or can be induced or inhibited by environmental agents including co-administered drugs (i.e., a state phenomenon).
The bulk of such biotransformation occurs in the liver. A significant decrease in liver mass can thus physically reduce the elimination rate of the drug. Since the drug must be delivered to the liver to be metabolized, the elimination rate is also dependent on hepatic arterial blood flow, which in turn is dependent on cardiac output. Since the polar metabolites are generally eliminated in the urine, significant loss of renal mass or function and/or significant decrease in renal arterial blood flow can also reduce the elimination rate of the metabolites of the drug, which may or may not be pharmacologically active. Parenthetically, "active" metabolites can have a pharmacology quite similar to the parent drug, or quite different.
Given this background, the fact that there can be considerable variance in the response of different patients is not surprising. The factors that can affect a patient's elimination rate for a drug can range from genetics to environmental factors to diseases affecting the function of specific organs. As seen in equation 1, a change in the elimination rate is analogous to changing the dosing rate. In the past, physicians were not able to tell that the dose for one patient was functionally different from the dose for another. Instead, they primarily detected and dealt with this variance by adjusting the dose based on the patient's response -- which meant that the patient had to fail to respond optimally to the drug (either failing to improve and/or experiencing adverse effects) for the physician to be able to decide that a dose adjustment was needed.
Therapeutic Drug Monitoring
TDM is one tool that physicians can use to more precisely individualize the dose of a drug to fit a specific patient by assessing the patient's individual elimination rate for that drug. By obtaining a steady-state plasma level of a drug and knowing the dosing rate, the physician has determined two of the three variables in equation 1 and can solve for the third, the elimination rate of that drug in that patient, by dividing the dosing rate by the plasma drug level. The dosing rate can then be adjusted to achieve the desired concentration, which effectively removes that source of response variance between different patients. This is done by dividing the ratio or estimate of clearance shown in the footnote of Table 1 (ng/mL/mg) into the target or desired plasma drug level to determine the dose the patient will need to receive to achieve the desired plasma level. I recommend making only modest dose adjustments in patients, particularly outpatients, who have an unusually low ratio, since their actual dosing rate (i.e., what they are taking) is likely to be less than what the physician has recommended (i.e., the presumed dosing rate). In other words, the patient is not complying with the recommended dosing rate.
TABLE 1 - Analysis and use of tricyclic antidepressant (TCA) plasma level results*†

  Ratio (ng/mL/mg)        Interpretation
  < 0.5                   Either rapid metabolizer or noncompliance
  Unusually high          Most likely slow metabolizer†

* TDM is used to estimate the patient's clearance. Divide the steady-state TCA plasma level (ng/mL) by the total daily dose (mg) to yield an estimate of clearance in units of ng/mL/mg.
† Other possible explanations for variance include the sample being drawn too early in the dosing interval, co-chromatography, and infection.
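The arithmetic behind Table 1 can be sketched as follows. The numeric values (the observed level, the starting dose, and the target level) are illustrative assumptions, not clinical guidance.

```python
# Sketch of the Table 1 arithmetic: estimate the clearance ratio from one
# steady-state level, then divide that ratio into a target level to pick
# a dose. All numbers are illustrative assumptions, not clinical values.

def clearance_ratio(steady_state_level_ng_ml, total_daily_dose_mg):
    """Divide the steady-state TCA plasma level (ng/mL) by the total
    daily dose (mg) to estimate the ratio in ng/mL per mg."""
    return steady_state_level_ng_ml / total_daily_dose_mg

def dose_for_target(target_level_ng_ml, ratio_ng_ml_per_mg):
    """Divide the ratio into the target plasma level to get the total
    daily dose (mg) expected to produce that level."""
    return target_level_ng_ml / ratio_ng_ml_per_mg

# Example: a level of 90 ng/mL on 150 mg/day gives a ratio of 0.6
# ng/mL/mg; reaching a hypothetical target of 180 ng/mL would then be
# expected to require 300 mg/day.
ratio = clearance_ratio(90, 150)
print(ratio)                        # 0.6
print(dose_for_target(180, ratio))  # 300.0
```

Because the ratio is a property of the individual patient's clearance, one steady-state measurement is, in principle, enough to project the dose needed for any target level, absent a change in the factors discussed above.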
The preceding discussion provides the rationale for the statement at the beginning of this column -- that many physicians have unwittingly used TDM of TCAs to phenotype their patients in terms of the functional integrity of their CYP 2D6, the principal P450 enzyme for determining the elimination rate of TCAs (Table 1). Many physicians do not realize that repeated TDM is not typically necessary with drugs like TCAs. Instead, such repeat TDM should be done for a specific reason (e.g., the addition of a drug that could change the patient's elimination rate by inducing or inhibiting the principal P450 enzyme for that drug or the suspicion of significant noncompliance).
Repeat TDM of oxidatively metabolized drugs (in contrast to lithium which, being a salt, is not metabolized) is generally not necessary because TDM measures a specific patient's ability to clear a particular drug. This ability does not change without a specific reason (e.g., intercurrent disease affecting the heart, liver, or kidneys or concomitant drug therapy affecting the principal P450 enzyme). This fact also explains why repeat TDM can be used to detect noncompliance. If the patient's plasma drug level does not change appropriately with a dose change as expressed in equation 1, the most likely explanation is that the true dosing rate (i.e., the amount the patient is taking per unit of time) is not the prescribed dosing rate. Noncompliance can also be detected by significant fluctuations in the plasma drug levels (i.e., beyond the limits of assay error) even though the patient is on a presumably stable dose. Before drawing this conclusion, however, the physician needs to consider whether other potential factors could be causing such variance. That may require a call to the laboratory to discuss what other factors need to be considered, since they can vary from one drug to another and from one assay to another.
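The proportionality check described above can be sketched in code: per equation 1, at a fixed elimination rate the steady-state level should change in proportion to the dose change. The tolerance value below is an illustrative assumption; in practice the acceptable band depends on assay error and the other factors the column mentions.

```python
# Per equation 1, with an unchanged elimination rate the steady-state
# level should scale in proportion to a dose change. A minimal sketch of
# that check; the 25% tolerance is an illustrative assumption.

def level_change_is_proportional(old_dose, old_level, new_dose, new_level,
                                 tolerance=0.25):
    """Return True if the new level falls within a fractional `tolerance`
    of the level predicted from the dose change alone."""
    predicted = old_level * (new_dose / old_dose)
    return abs(new_level - predicted) <= tolerance * predicted

# Dose doubled (150 -> 300 mg/day) but the level barely moved: a flag
# for possible noncompliance (or another factor, per the caveats above).
print(level_change_is_proportional(150, 90, 300, 100))  # False
# Level roughly doubled as predicted: consistent with compliance.
print(level_change_is_proportional(150, 90, 300, 175))  # True
```

A failed check does not prove noncompliance; as noted above, the physician should first rule out other sources of variance, potentially in consultation with the laboratory.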
In my next column, I will discuss some of the features of a drug that predict the usefulness of TDM in the clinical setting. I will also examine some of the pitfalls in TDM research in psychiatry and how the tendency to "look at the glass of water as half empty" impeded the adoption of this simple and straightforward approach to fine-tuning the dosing of patients. In later columns, I will discuss other pharmacogenetic factors that can shift a patient's responsivity to a drug from one time to another.