Approved in the US in 1954, warfarin is one of the most widely used oral anti-coagulants today. By inhibiting vitamin K-dependent activation of clotting factors, it prevents and treats thromboembolism associated with atrial fibrillation, artificial heart valves, and myocardial infarction.
Despite being used for more than 60 years, warfarin therapy is beset with challenges.
The narrow therapeutic range often makes warfarin use a tightrope walk between bleeding due to excessive anti-coagulation and thromboembolism due to insufficient anti-coagulation. Therefore, patients must be closely monitored to ensure that the international normalized ratio (INR; a standardized measure of coagulation) remains within the therapeutic range. This can be challenging, given that warfarin dose requirements may vary up to 20-fold among patients.
Clinical and demographic factors (e.g. age, weight, race, disease state, and diet) account for significant variability in dose requirement. When these are combined with genotype data, more than 50 percent of dose variability is explained. Rapid advances in genomic technologies have paved the way for pharmacogenetics (PGx) to become a useful tool in warfarin dosing.
The following gene variants alter the pharmacokinetics and pharmacodynamics of warfarin:
- Cytochrome P450 2C9 (CYP2C9): CYP2C9 variants alter warfarin metabolism. The majority of CYP2C9 variants identified are single nucleotide polymorphisms (SNPs) that reduce enzyme activity, prolonging the drug's half-life and increasing the risk of adverse bleeding events.
- Vitamin K epoxide reductase complex 1 (VKORC1): This gene encodes the target protein that warfarin inhibits in order to mediate its anti-coagulant action. SNPs in the regulatory region controlling VKORC1 gene expression therefore dictate the warfarin dose required, with lower VKORC1 expression requiring a lower dose.
- Cytochrome P450 4F2 (CYP4F2): An SNP in CYP4F2 reduces expression of the enzyme and alters vitamin K metabolism. Consequently, patients who carry the SNP have higher vitamin K concentrations and require higher warfarin doses than wild-type carriers to achieve similar levels of anti-coagulation.
Six clinically relevant CYP2C9 gene variants, two functional VKORC1 SNPs, and one CYP4F2 SNP have been identified. The frequency of these variants differs by race: CYP2C9*2 and *3 are more common in Caucasians than in African Americans, whereas VKORC1 -1639G>A is most common in Asians.
Policy and regulations
Based on early evidence, in 2007 the U.S. Food and Drug Administration (FDA) added pharmacogenomic information on bleeding risk to the warfarin label. Then in 2010, a dosing table was included on the label to provide a range of therapeutic warfarin doses for CYP2C9 and/or VKORC1 variant carriers. However, no regulations mandated pharmacogenetic testing prior to warfarin initiation, leaving the need for routine testing open to debate.
The holy trinity of PGx testing
When making recommendations for or against routine pharmacogenetic testing, three parameters need to be considered: 1) analytic validity, 2) clinical validity, and 3) clinical utility.
Analytic validity is the accuracy with which a test can identify SNPs and gene variants. Genetic tests that detect reduced-function CYP2C9 and VKORC1 variants have been cleared by the FDA for use as diagnostic tools in warfarin therapy. In addition, several laboratory-developed assays that detect CYP2C9, VKORC1, and CYP4F2 variants also exist. Studies show that the accuracy of laboratory-developed genotype assays is comparable to FDA-approved tests. However, the number of gene variants tested and the turnaround time may vary across assays.
Clinical validity refers to the reliability in predicting the appropriate warfarin dose. The International Warfarin Pharmacogenetics Consortium (IWPC) developed a dosing algorithm that combined clinical and pharmacogenetic data to predict the appropriate warfarin dose. This algorithm was significantly better than one relying on clinical factors alone at identifying patients who required a low (<21 mg/week) or high (>49 mg/week) dose rather than the standard 35 mg/week. However, because the algorithm omitted many of the variants important in African Americans, it does not perform as well in more racially diverse populations.
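The structure of such an algorithm can be sketched as a linear regression on the square root of the weekly dose. The sketch below mirrors that structure only; the coefficients are illustrative placeholders, not the published IWPC values:

```python
# Illustrative coefficients only -- NOT the published IWPC values.
# The real algorithm regresses sqrt(weekly dose in mg) on clinical
# and genetic predictors; this sketch mirrors that structure.
COEFFS = {
    "intercept": 5.6,
    "age_per_decade": -0.26,      # required dose falls with age
    "weight_per_kg": 0.013,       # dose rises with body weight
    "vkorc1_per_A_allele": -0.9,  # -1639G>A lowers the required dose
    "cyp2c9_per_variant": -0.7,   # *2/*3 alleles slow metabolism
}

def predicted_weekly_dose(age_years, weight_kg,
                          vkorc1_a_alleles, cyp2c9_variant_alleles):
    """Return a predicted weekly warfarin dose in mg (illustrative)."""
    sqrt_dose = (COEFFS["intercept"]
                 + COEFFS["age_per_decade"] * (age_years / 10)
                 + COEFFS["weight_per_kg"] * weight_kg
                 + COEFFS["vkorc1_per_A_allele"] * vkorc1_a_alleles
                 + COEFFS["cyp2c9_per_variant"] * cyp2c9_variant_alleles)
    return sqrt_dose ** 2
```

With any coefficients of this shape, a patient carrying two VKORC1 A alleles and a CYP2C9 variant receives a markedly lower predicted dose than a wild-type patient of the same age and weight, which is how the algorithm flags low-dose and high-dose outliers.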
Clinical utility refers to the net benefits and risks of using the test. This includes criteria such as time to stable INR, time spent in the therapeutic INR range, and incidence of adverse events. Although several randomized clinical trials have been performed, proving the clinical utility of pharmacogenetic testing remains elusive. Results are clouded by small sample sizes, the racial demographics of the study populations, differing primary outcomes, and the dosing algorithm used.
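Of these criteria, time in therapeutic range (TTR) is the most commonly reported, and it is typically computed by Rosendaal linear interpolation between INR measurements. A minimal sketch (the function name and data layout are assumptions for illustration):

```python
def time_in_therapeutic_range(measurements, low=2.0, high=3.0):
    """Rosendaal-style TTR: linearly interpolate INR between visits
    and return the fraction of time spent inside [low, high].

    measurements: list of (day, inr) tuples, sorted by day.
    """
    in_range = 0.0
    for (t0, i0), (t1, i1) in zip(measurements, measurements[1:]):
        if i0 == i1:
            if low <= i0 <= high:
                in_range += t1 - t0
            continue
        slope = (i1 - i0) / (t1 - t0)
        # Times at which the interpolated INR crosses the range limits.
        t_low = t0 + (low - i0) / slope
        t_high = t0 + (high - i0) / slope
        start, end = sorted((t_low, t_high))
        in_range += max(0.0, min(t1, end) - max(t0, start))
    total = measurements[-1][0] - measurements[0][0]
    return in_range / total
```

For example, a patient whose INR climbs linearly from 1.0 to 3.0 over ten days spends half that period in a 2.0-3.0 range, so the function returns 0.5.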
Is PGx testing cost-effective?
The economics of warfarin pharmacogenetic testing is entwined with its clinical utility. It would be remiss to say a solution is cost-effective if it is not effective to begin with.
Two randomized controlled trials performed in the US and Europe generated conflicting results. In the EU-PACT trial, genotype-guided warfarin dosing resulted in significantly more time in the therapeutic INR range, a shorter time to therapeutic INR, and fewer hemorrhagic events than standard warfarin dosing. In contrast, the US COAG trial found no significant differences in any of these criteria between the two dosing algorithms. In fact, subgroup analysis showed that African American patients in the COAG trial fared better with the clinical algorithm than with the genotype-guided algorithm.
Nevertheless, several studies have assessed the economic benefit of PGx over alternative treatment strategies. A recent review reported that of the 12 studies published on warfarin pharmacoeconomics, three found PGx-guided dosing cost-effective, four studies were inconclusive, and five did not find any cost-effectiveness.
Cost-effectiveness studies are subject to certain inherent limitations, most importantly the accuracy of the parameters entered into the economic evaluation model. In the case of warfarin, data on the assumed effectiveness of genotyping (often derived from randomized controlled studies) are highly variable. In addition, the perception of cost-effectiveness varies with the cost of genetic testing and the healthcare system of each country. For instance, PGx-guided therapy was favored by a US cost-effectiveness study, but not by studies from the UK, Canada, and Thailand. In fact, some of the studies in the latter category support the use of newer oral anti-coagulants such as dabigatran and apixaban as more cost-effective alternatives, given their more predictable pharmacokinetics, fewer drug interactions, and lower incidence of major bleeding events than warfarin. However, other studies from the UK, Sweden, and Croatia support the US study's conclusion that PGx-guided dosing is a cost-effective strategy over standard treatment.
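The core calculation behind these evaluations is the incremental cost-effectiveness ratio (ICER), judged against a payer-specific willingness-to-pay threshold. The figures below are purely illustrative, made up to show why the same ICER can pass in one healthcare system and fail in another:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost per quality-adjusted life year (QALY) gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

def is_cost_effective(cost_new, cost_old, qaly_new, qaly_old, wtp_threshold):
    """A strategy is deemed cost-effective when its ICER falls at or
    below the payer's willingness-to-pay threshold."""
    return icer(cost_new, cost_old, qaly_new, qaly_old) <= wtp_threshold

# Made-up figures: suppose PGx-guided dosing costs $400 more per patient
# and yields 0.01 extra QALYs, giving an ICER of roughly $40,000/QALY.
# That clears a $50,000/QALY threshold but fails a $30,000/QALY one --
# one reason verdicts differ across healthcare systems.
```

Because both the incremental cost (driven by the price of the genotyping assay) and the incremental benefit (drawn from conflicting trial data) vary by country, the same intervention lands on different sides of the threshold in different studies.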
For a test to be routinely prescribed in the clinical setting, it should demonstrate consistent and significant benefits over existing methodologies. The gene variants most commonly tested can reliably identify patients who require a higher or lower warfarin dose than the standard in a largely Caucasian population. However, the clinical utility and cost-effectiveness of routine PGx testing in more diverse populations remain controversial.
The pursuit of routine PGx testing is further called into question by the availability of newer oral anti-coagulants that lack warfarin’s capriciousness. However, their use is riddled with concerns over increased gastrointestinal bleeding, absence of antidotes to reverse overdosing, limited use in the elderly, and poor adherence. Consequently, warfarin use continues unabated.
Ultimately, lack of a clear consensus on clinical utility and cost-effectiveness dampens the case for routine PGx testing for warfarin. More research is clearly needed before any sweeping health policy changes can be implemented.