July 8, 2019 – The study results suggest that clinical factors may be more useful than international normalized ratio metrics in predicting future risk of bleeding or thrombotic events.
The international normalized ratio (INR), a metric typically used to determine warfarin doses in patients with atrial fibrillation, was once thought to be a potential indicator of future risk of bleeding or thrombotic events. However, research published recently in JAMA Cardiology by the DCRI’s Sean Pokorney, MD, MBA, suggests that historical INR values may not be the most accurate predictor of future events.
The study examined patients from the ORBIT-AF registry—5,545 patients for the bleeding analysis and 5,635 patients for the thrombotic event analysis. Data analysis was performed from August 2016 to February 2019.
Traditionally, patients’ risk of bleeding and thrombotic events has been estimated based on clinical factors. This study sought to determine whether past measures of INR control contributed to this risk independently of clinical factors. Previous research had described that, in patients with atrial fibrillation taking vitamin K antagonists, low INR values signal higher risk of thrombotic events, while high INR values may signal higher risk of bleeding events.
“These results suggest that using metrics for warfarin control may not be helpful in predicting patients’ potential for future risk beyond the predictive capacity of risk scores using clinical factors,” Pokorney said. “It is just challenging to predict future bleeding or stroke events for patients on warfarin, since historical INR values are not predictive of future INR values.”
Other DCRI contributors to this study included Laine Thomas, PhD; Jonathan Piccini, MD, MHS; and Eric Peterson, MD, MPH.