Cosmetic Corner: Dermatologists Weigh in on Nail Care Products
To improve patient care and outcomes, leading dermatologists offered their recommendations on nail care products. Consideration must be given to:
- Aquaphor Healing Ointment
Beiersdorf, Inc
Recommended by Gary Goldenberg, MD, New York, New York
- Biotin Oral Supplements
Manufacturers vary
“Biotin is a helpful supplement for brittle nails. It may take 6 months to see improvement in the nails.”—Shari Lipner, MD, PhD, New York, New York
Recommended by Gary Goldenberg, MD, New York, New York
- Deep Comfort Hand and Cuticle Cream
Clinique
“It has good hydration for cuticles with sodium hyaluronate and squalene. It also is fragrance free.”—Anthony M. Rossi, MD, New York, New York
- Genadur
Medimetriks Pharmaceuticals, Inc
Recommended by Gary Goldenberg, MD, New York, New York
- Lanolin-Rich Nail Conditioner
Elon
“It’s great for moisturizing and nail hardening.”—Marta Rendon, MD, Boca Raton, Florida
- Nail Renewal System
Dr. Dana
“Developed by dermatologist Dr. Dana Stern, the system combines glycolic acid to improve discoloration and ridging, along with hydrating and strengthening botanicals to improve the look, feel, and overall health of the nails.”—Joshua Zeichner, MD, New York, New York
Cutis invites readers to send us their recommendations. Acne scar treatments, self-tanners, and cleansing devices will be featured in upcoming editions of Cosmetic Corner. Please e-mail your recommendation(s) to the Editorial Office.
Disclaimer: Opinions expressed herein do not necessarily reflect those of Cutis or Frontline Medical Communications Inc. and shall not be used for product endorsement purposes. Any reference made to a specific commercial product does not indicate or imply that Cutis or Frontline Medical Communications Inc. endorses, recommends, or favors the product mentioned. No guarantee is given to the effects of recommended products.
First EDition: News for and about the practice of emergency medicine
Black Patients Are Less Likely to Receive Opioids for Back Pain, Abdominal Pain, But Not for “Definitive” Pain
BY JEFF BAUER
FROM PLOS ONE
Black patients who present to the ED with back pain or abdominal pain are significantly less likely to be treated with or prescribed an opioid than are white patients who report similar pain, according to an analysis of data on national ED visits. However, there were no differences in opioid use between black patients and white patients for more objective pain conditions, such as kidney stones or long bone fractures.
Researchers evaluated National Hospital Ambulatory Medical Care Survey data that included descriptions of ED visits made by adults ages 18 to 65 years from 2007 to 2011. They looked specifically at pain-related visits, and defined the reason for each visit as being a “nondefinitive condition” (toothache, back pain, or abdominal pain) or a “definitive condition” (long bone fractures or kidney stones). These nondefinitive conditions have been associated with drug-seeking behavior.
The subjects were categorized as non-Hispanic white, non-Hispanic black, Hispanic, or non-Hispanic other. Pain was rated on a scale from 0 (no pain) to 10 (severe). Researchers also measured whether the patients received an opioid while they were in the ED, were discharged with a prescription for an opioid, or both.
During the study period, there were approximately 36.5 million ED visits for abdominal pain, 14.3 million visits for back pain, 6.9 million visits for toothache, 3.4 million visits for kidney stones, and 2.1 million visits for long bone fractures. For each of these conditions, most visits were associated with severe pain.
After adjusting for pain severity, non-Hispanic black patients with abdominal pain or back pain were significantly less likely than their white counterparts to be administered an opioid while in the ED or to be discharged with a prescription for an opioid. However, there was no significant difference between these groups in opioid administration or prescription for patients with long bone fractures, kidney stones, or toothache. Researchers suggested that although toothache was considered a nondefinitive condition in this study, physicians might have been able to verify dental disease during examination of the mouth, thus limiting the subjectivity in their decision to prescribe an opioid.
1. Singhal A, Tien YY, Hsia RY. Racial-ethnic disparities in opioid prescriptions at emergency department visits for conditions commonly associated with prescription drug abuse. PLoS One. 2016;11(8):e0159224. doi: 10.1371/journal.pone.0159224.
FDA Updates Warning Label for Systemic Fluoroquinolones
BY DEEPAK CHITNIS
FRONTLINE MEDICAL NEWS
The Food and Drug Administration (FDA) has amended the boxed warning on labels for fluoroquinolone antibiotics, taken either orally or by injection, to reflect recent findings of the drugs’ potential adverse events.
“These medicines are associated with disabling and potentially permanent side effects of the tendons, muscles, joints, nerves, and central nervous system that can occur together in the same patient,” the FDA stated in its Safety Announcement.
As a result, health care providers should reserve systemic fluoroquinolones for patients who have no other treatment options for any of the following conditions: acute bacterial sinusitis (ABS), acute bacterial exacerbation of chronic bronchitis (ABECB), and uncomplicated urinary tract infections (UTIs). The FDA also said that for some serious bacterial infections, the benefits of fluoroquinolones outweigh the risks, and it is appropriate for them to remain available as a therapeutic option.
Patients taking fluoroquinolones must also be vigilant and let their provider know immediately if they begin suffering from any new pain in their joints, tendons, or muscles. Additionally, if patients begin feeling any numbness in their arms and legs, a prickling or “pins and needles” sensation, or confusion and hallucinations, they should contact their health care provider right away so that they may be switched to a nonfluoroquinolone antibacterial drug for the remainder of their treatment regimen.
Avelox (moxifloxacin); Cipro, both standard and extended release (ciprofloxacin); Factive (gemifloxacin); Levaquin (levofloxacin); and ofloxacin are the fluoroquinolones currently approved by the FDA for systemic use.
Additional adverse effects for patients taking fluoroquinolones could include tendinitis, tendon rupture, and joint swelling. Central nervous system afflictions could include depression and thoughts of suicide. Fluoroquinolones could also bring about skin rashes, sunburn, arrhythmia, and diarrhea, and could aggravate myasthenia gravis in patients who suffer from it. Warnings regarding these conditions are already included on the drugs’ existing boxed warning.
“In addition to updating information in the Boxed Warning, we are also including information about these safety issues in the Warnings and Precautions section of the label,” the FDA stated. “The Indications and Usage section contains new limitation-of-use statements to reserve fluoroquinolones for patients who do not have other available treatment options for ABS, ABECB, and uncomplicated UTIs.”
The FDA added that it will continue to monitor and assess safety issues associated with fluoroquinolones and will issue any further updates if necessary.
1. US Food and Drug Administration. FDA Drug Safety Communication: FDA updates warnings for oral and injectable fluoroquinolone antibiotics due to disabling side effects. http://www.fda.gov/Drugs/DrugSafety/ucm511530.htm. Published July 26, 2016. Accessed August 26, 2016.
Skin Rash in a Recent Traveler? Think Dengue Fever
BY SHARON WORCESTER
FRONTLINE MEDICAL NEWS
Maintain clinical suspicion for dengue fever among individuals with recent travel to endemic areas who present with a rash and other signs and symptoms of infection, an expert advised at the American Academy of Dermatology summer meeting.
Dengue fever accounts for nearly 10% of skin rashes among individuals returning from endemic areas, and related illness can range from mild to fatal, said Jose Dario Martinez, MD, chief of the Internal Medicine Clinic at University Hospital J.E. Gonzalez, UANL Monterrey, Mexico.
“This is the most prevalent arthropod-borne virus in the world at this time, and it is a resurgent disease in some countries, like Mexico, Brazil, and Colombia,” he noted.
Worldwide, more than 2.5 billion people are at risk of dengue infection. Between 50 million and 100 million cases occur each year, including about 250,000 to 500,000 cases of dengue hemorrhagic fever (DHF) and about 25,000 related deaths.
In 2005, a dengue outbreak of 25 cases occurred in Texas; in southern Florida, an outbreak of 90 cases was reported in 2009 and 2010. More recently, in 2015, there was an outbreak of 107 cases of locally acquired dengue on the Big Island, Hawaii. In Mexico, by contrast, 18,000 new cases occurred in 2015, Dr Martinez said.
Of the four RNA virus serotypes (DEN-1 through DEN-4), DEN-1 is the most common, and DEN-2 and DEN-3 are the most severe; up to 40% of cases are asymptomatic, he noted, adding that the virus has an incubation period of 2 to 8 days. When symptoms occur, they are those of an acute febrile illness and may include headache, high fever, myalgia, arthralgia, retro-orbital pain, and fatigue. A faint, itchy, macular rash commonly appears 2 to 6 days into the illness. According to the World Health Organization (WHO), a probable dengue fever case includes acute febrile illness plus at least two of the following: headache, retro-orbital pain, myalgia, arthralgia, rash, hemorrhagic manifestations, leukopenia, or supportive serology.
“Sometimes the nose bleeds, the gums bleed, and there is bruising in the patient,” Dr Martinez said. “Most important are retro-orbital pain and hemorrhagic manifestations, but also supportive serology.”
About 1% of patients progress to DHF or dengue shock syndrome (DSS) during the critical phase (days 4-7) of illness. This is most likely in those with serotypes 2 and 3, but can occur with all serotypes. Warning signs of such severe disease include abdominal pain or tenderness, persistent vomiting, pleural effusion or ascites, and of particular importance—mucosal bleeding, Dr Martinez said.
By the WHO definition, a diagnosis of DHF requires the presence of fever for at least 2 to 7 days, hemorrhagic tendencies, thrombocytopenia, and evidence and signs of plasma leakage; DSS requires these, as well as evidence of circulatory failure, such as rapid and weak pulse, narrow pulse pressure, hypotension, and shock.
It is important to maintain clinical suspicion for dengue fever, particularly in anyone who has traveled to an endemic area in the 2 weeks before presentation. Serologic tests are important to detect anti-dengue antibodies. Immunoglobulin G is important because its presence could suggest recurrent infection and thus the potential for severe disease, Dr Martinez said. Polymerase chain reaction can be used for detection in the first 4 to 5 days of infection, and the nonstructural glycoprotein 1 rapid test can be positive on the first day, he noted.
The differential diagnosis for dengue fever is broad, and can include chikungunya fever, malaria, leptospirosis, meningococcemia, drug eruption, and Zika fever.
Management of dengue fever includes bed rest, liquids, and mosquito net isolation to prevent reinfection, as more severe disease can occur after reinfection. Acetaminophen can be used for pain relief; aspirin should be avoided due to risk of bleeding, Dr Martinez said. Hospitalization and supportive care are required for those with DHF or DSS. Intensive care unit admission may be required.
Of note, a vaccine against dengue fever has shown promise in phase III trials. The vaccine has been approved in Mexico and Brazil, but not yet in the United States.
For more on dengue fever, see the case report “Dengue Fever: Two Unexpected Findings” on page 408.
Pertussis Often Goes Undiagnosed, Especially in Adults
BY ABIGAIL CRUZ
FRONTLINE MEDICAL NEWS
A majority of pertussis cases in the United States may go undetected in people younger than age 50 years, particularly in adults, results of a retrospective database cohort study suggest.
“The incidence of pertussis in adolescents and adults is very difficult to quantify,” wrote Chi-Chang Chen, MD, of IMS Health, Plymouth Meeting, Pennsylvania, and associates. Symptoms may be misdiagnosed as other respiratory illnesses; infected individuals may not seek treatment; and pertussis may not be considered as a possible diagnosis in adults, they noted.
To project the possible range of pertussis incidence in this population, investigators used three different models to analyze information from private insurance and laboratory databases as well as data from the Centers for Disease Control and Prevention for a 6-year period. The first method, which used medical claims with an International Classification of Diseases, Ninth Revision (ICD-9) diagnosis of pertussis, found an annual incidence rate of 9 per 100,000 population. The second used a proxy pertussis model based on symptoms that could indicate undiagnosed pertussis, showing an incidence rate of 21 per 100,000. The third method used pathogen data to estimate the fraction of cough illness statistically attributable to pertussis, resulting in an incidence rate of 649 per 100,000 population, which is 58 to 93 times higher than the ICD-9–based estimate.
These estimates “highlight the need for improved preventive measures—such as increased vaccination—against pertussis,” the investigators said, noting that immunization recommendations for additional age groups and research involving strategies to reduce waning immunity after vaccination should be considered.
1. Chen CC, Balderston McGuiness C, Krishnarajah G, et al. Estimated incidence of pertussis in people aged <50 years in the United States. Hum Vaccin Immunother. 2016;31:1-10. [Epub ahead of print]
The ED Is a Safer Place…and Can Be Safer Still
Strategies for improving medication accuracy, transitions of care, health information technology, and other aspects of ED patient safety are offered in this month’s Emergency Medicine cover article, “Patient Safety in the Emergency Department,” by emergency physician (EP)/toxicologist Brenna M. Farmer, MD, a colleague for many years.
As Dr Farmer notes in her introduction, patient safety—in the ED and elsewhere—has received a great deal of attention since the publication of the two landmark Institute of Medicine (IOM) studies in 1999 and 2001 that documented an enormous number of medical errors and recommended improvements in medical care. More than a decade and a half after their publication, is there any evidence that these reports have led to a reduction in the number of serious adverse effects and deaths due to medical errors?
Although most EPs believe that ED safety measures have reduced the overall number of errors, there is a scarcity of published data demonstrating a direct cause-and-effect relationship in reducing the number of adverse events and deaths. A recent analysis of National Hospital Ambulatory Medical Care Survey data by EPs Kanzaria, Probst, and Hsia (Health Aff [Millwood]. 2016;35[7]:1303-1308) found that ED death rates dropped by nearly 50% between 1997 and 2011. Most of this reporting period covers the years after the IOM reports and before implementation of the Affordable Care Act measures. One might reasonably assume that the decrease in ED death rates since 1997 is at least partly due to the safety measures described by Dr Farmer. However, Kanzaria et al hypothesize that the reduction is probably due to palliative and prehospital care efforts that “shift the locus of deaths,” to recent advances in emergency critical care, and to public health successes in smoking cessation, motor vehicle safety, and the like. Conspicuously absent from their list of possible explanations for the reduction in ED death rates are ED safety measures.
If Kanzaria et al are correct in attributing the reduction in ED deaths to measures taken by others to decrease the number of dying patients brought to EDs, then it may be reasonable to look for the benefit of eliminating serious ED errors in a decrease in death rates after patients leave the ED for inpatient services. Though inpatient death-rate data are available only since 2005, Kanzaria et al report no significant change in the inpatient death rate between 2005 and 2011. It is possible, however, that the improvements in ED critical care the authors hypothesize to be partly responsible for reducing ED death rates enable sicker patients to survive longer and ultimately succumb to their serious illnesses as inpatients. If so, this could offset any evident reduction in inpatient mortality from the avoidance of serious errors in the ED.
In any case, Dr Farmer does present direct evidence that safety measures are effective in reducing morbidity, and probably mortality. For example, in one study cited, medication errors were 13.5 times less likely to occur when an ED pharmacist was present. Clearly, avoiding doubled doses of potent cardiac medications or sedative-hypnotics, avoiding dangerous drug interactions, and choosing the correct type, dose, and timing of antibiotics and all other medications must reduce morbidity and, ultimately, mortality. It is also worth recalling that, with respect to patient safety, emergency medicine is undoubtedly the safest medical specialty ever created, pioneering from its inception 24/7 bedside attending presence and mandatory recertification, years to decades before other specialties adopted these practices. Thanks to these efforts, EDs are much safer than they once were, and by implementing the measures described by Dr Farmer they will be safer still.
Data are mixed on cancerous transformation of cardiac mucosa in Barrett’s esophagus
CHICAGO – If scouring data is what makes a gastroenterologist feel good about risk assessment, there may be a lot of unhappy gastroenterologists out there, at least when it comes to the risk of cancer arising from cardiac mucosa in Barrett’s esophagus, according to Nicholas J. Shaheen, MD.
The risk arising from this nonintestinal metaplastic tissue is probably quite low in real life, but the extant literature gives doctors a lot of contradictions, he said at the meeting sponsored by the American Gastroenterological Association.
“The risk of cancer with cardiac mucosa is unclear,” said Dr. Shaheen of the University of North Carolina at Chapel Hill. “Some data do suggest that, at least when present in the tubular esophagus in patients with gastroesophageal reflux symptoms, there may be a risk of adenocarcinoma close to what’s seen in patients with intestinal metaplasia. Other data suggest the risk is quite low, perhaps even approximating that of the general population.”
The reasons for what Dr. Shaheen called “remarkable variability” in these data probably arise more from sampling error than real life. The studies are retrospective, and many lack long-term follow-up data, are plagued with insufficient numbers, and – perhaps most importantly – are not grounded in any standard clinical methodology.
“People who do endoscopy for a living understand that the stuff you read about systematic biopsy protocols is hardly ever honored in the breach. None of these studies ever reports the biopsy protocol from which the samples were taken.”
This lack of protocol means that studies on the cancer risk of columnar lined esophagus (CLE) that is negative for intestinal metaplasia are probably flawed from the beginning.
“The truth is that most gastroenterologists do a lousy job of biopsying Barrett’s, so there is probably a lot of sampling error in these studies, and they are contaminated with a high rate of intestinal metaplasia [IM],” said Dr. Shaheen.
And these studies do not report on the length of the CLE segment from which the biopsy was taken. “The likelihood of finding goblet cells [a characteristic of intestinal metaplasia] increases with the length of Barrett’s. None of the studies is normalized for Barrett’s length. When we see studies saying the cancer risk is higher in the presence of goblet cells, length could be a partially confounding association.”
A 2009 study with a small sample size of 68 CLE patients found that abnormal DNA was just as likely in IM-negative samples as in IM-positive ones. All of the samples were significantly different from the control samples, suggesting that any metaplasia in the CLE may already be on the path to cancer, Dr. Shaheen said (Am J Gastroenterol. 2009;104:816-24).
In fact, a 2007 Scandinavian study supported the idea that IM isn’t even necessary for neoplastic progression of CLE (Scand J Gastroenterol. 2007;42:1271-4). The investigators followed 712 patients for 12 years, and found that the adenocarcinoma rate was about 0.4 per patient per year whether the sample was IM positive or not.
“This study was enough to put a little shudder in the endoscopy community. If IM doesn’t matter, you’re talking about increasing the work in the endoscopy lab by 100%, because there are twice as many non-IM patients as those with IM.”
A 2008 study seemingly found something similar – but with a caveat, Dr. Shaheen said. The CLE patients in this study were followed for 3.5 years, and the cancer rate was virtually identical whether or not the initial biopsy showed IM. But as follow-up progressed, more and more biopsies turned up IM positive. “A first negative biopsy looked like it was associated with disease-free survival, but almost all IM-negative samples eventually became IM positive, so this didn’t really answer our question.”
Other studies have found that non-IM CLE has a very low neoplastic risk, and that IM is almost always a prerequisite for cancer to develop. The largest of these was conducted in the Northern Ireland Barrett’s Esophagus Registry in 2011. It followed more than 8,000 patients for 7 years. Patients with IM were 3.5 times more likely to develop a related adenocarcinoma than were those without IM (J Natl Cancer Inst. 2011;103:1049-57).
The contradictory evidence leads Dr. Shaheen to suggest a specific biopsy protocol for patients with Barrett’s esophagus.
“In my opinion, if you see a long segment of Barrett’s – more than 2 cm – and the biopsy is negative for IM, there is a good chance that you have a sampling error there, and a second endoscopy and biopsy are not unreasonable. If you see a short segment of Barrett’s and the biopsy is negative for IM, the cancer risk is unclear, but in general it’s probably pretty low, whether there are goblet cells there or not. I would say retaining these patients under endoscopic surveillance is of dubious value. [With] the likely low absolute risk of cancer in this patient population, no blanket recommendation for surveillance is advisable.”
Dr. Shaheen had no relevant financial disclosures.
On Twitter @Alz_Gal
EXPERT ANALYSIS FROM THE 2016 JAMES W. FRESTON CONFERENCE
The new NOACs are generally the best bet
NOACs have largely replaced the need for vitamin K antagonists
The discovery of oral anticoagulants began in 1924, when Schofield linked the death of grazing cattle from internal hemorrhage to the consumption of spoiled sweet clover hay.1 It was not until 1941, however, that Campbell and Link, while trying to explain this observation, identified dicoumarol, the anticoagulant formed during the spoiling process.2 Once vitamin K was noted to reverse the effect of dicoumarol, synthesis of the first class of oral anticoagulants, the vitamin K antagonists (VKAs), began. Despite the numerous challenges associated with managing patients on these drugs, VKAs have been the mainstay of oral anticoagulation therapy for the past 70 years. Over the past 5 years, however, new oral anticoagulants (NOACs) have emerged and are changing clinical practice. Mechanistically, these targeted therapies act as either direct thrombin inhibitors (dabigatran etexilate) or direct factor Xa inhibitors (rivaroxaban, apixaban, and edoxaban). Given their favorable pharmacologic design, NOACs have the potential to replace VKAs: they not only have an encouraging safety profile but also are therapeutically equivalent, or even superior, to VKAs in certain patient populations.
Pharmacologic design
The targeted drug design of NOACs provides many pharmacologic advantages. Compared with VKAs, NOACs have a notably more predictable pharmacologic profile and a relatively wide therapeutic window, which allow for fixed dosing, rapid onset and offset, and fewer drug interactions.3 These characteristics eliminate the need for the routine monitoring and serial dose adjustments frequently associated with VKAs. Additionally, NOACs less commonly require bridging therapy with parenteral unfractionated heparin or low-molecular-weight heparin (LMWH) while awaiting therapeutic drug levels, as these levels are reached sooner and more predictably than with VKAs.4 As with any medication, however, appropriate consideration should be given to specific patient populations, such as those who are older or have significant comorbidities that may influence drug effect and clearance.
Lastly, the pharmacologic benefits of NOACs accrue not only to patients but also to health care systems, as their use may provide an opportunity to deliver more cost-effective care. Specifically, economic models using available clinical trial data for stroke prevention in nonvalvular atrial fibrillation have shown that NOACs (apixaban, dabigatran, and rivaroxaban) are cost-effective alternatives to warfarin.5 Although such economic analyses are limited by the modeling assumptions they rely on, these findings suggest that, at least initially, cost should not be a barrier to adopting these new therapeutics.
Patient selection
The decision to institute oral anticoagulation therapy depends on each patient’s individualized ratio of bleeding risk to the benefit of ischemia prevention. A major determinant of this ratio is the clinical indication for which anticoagulation is begun. Numerous phase III clinical trials have compared NOACs with VKAs or placebo for the management of nonvalvular atrial fibrillation (AF) and venous thromboembolism (VTE), and as adjunctive therapy for patients with acute coronary syndrome.6 Meta-analyses of randomized trials have shown the most significant benefit to be in patients with nonvalvular AF, in whom NOACs significantly reduce stroke, intracranial hemorrhage, and all-cause mortality, compared with warfarin, while showing variable effects on gastrointestinal bleeding.6,7
In patients with VTE, NOACs have shown efficacy similar to that of VKAs for the prevention of VTE or VTE-related death, along with a better safety profile.6 Lastly, when studied as adjuncts to dual antiplatelet therapy in patients with acute coronary syndrome, NOACs were associated with an increased bleeding risk without a significant decrease in thrombosis risk.6 Taken together, these data suggest that the primary indication for anticoagulation should weigh heavily in the choice of anticoagulant class.
Overcoming challenges
Since the introduction of NOACs, there has been concern over the lack of specific antidotes, especially in patients with impaired clearance, those likely to need an urgent or emergent procedure, and those presenting with life-threatening bleeding complications. Most recently, however, an interim analysis of clinical trial data showed complete reversal of the direct thrombin inhibitor dabigatran by the humanized monoclonal antibody idarucizumab within minutes of administration in more than 88% of patients studied.8 Similarly, agents such as PER977, which forms noncovalent hydrogen bonds and charge-charge interactions with both oral factor Xa inhibitors and oral thrombin inhibitors to reverse their effect, are currently in phase II clinical trials.9 Given these promising findings, it likely will not be long before reversal agents for NOACs become clinically available. Until then, it is encouraging that the bleeding profile of these drugs is favorable, compared with VKAs, and that their short half-lives allow relatively expeditious natural reversal of the anticoagulant effect as the drug is eliminated.
Conclusions
Unlike the serendipitous path leading to the discovery of the first class of oral anticoagulants (VKAs), NOACs have been specifically designed to provide targeted anticoagulation and to address the shortcomings of VKAs. To this end, NOACs are becoming increasingly important in the management of patients with specific clinical conditions such as nonvalvular atrial fibrillation and venous thromboembolism where they have been shown to provide a larger net clinical benefit relative to the available alternatives. Furthermore, with economic analyses providing evidence that NOACs are cost-effective for the health care system and clinical trial results suggesting progress in the development of antidotes for reversal, it is likely that with growing experience, these agents will replace VKAs as the mainstay for prophylactic and therapeutic oral anticoagulation in targeted patient populations.
Madhukar S. Patel, MD, and Elliot L. Chaikof, MD, are from the department of surgery, Beth Israel Deaconess Medical Center, Boston. They reported having no conflicts of interest.
References
1. J Am Vet Med Assoc 1924;64:553-575
3. Hematology Am Soc Hematol Educ Program 2013;2013:464-470
4. Eur Heart J 2013;34:2094-2106
6. Nat Rev Cardiol 2014;11:693-703
8. N Engl J Med 2015;373:511-520
9. N Engl J Med 2014;371:2141-2142
What the doctor didn’t order: unintended consequences and pitfalls of NOACs
Recently, several new oral anticoagulants (NOACs) have gained FDA approval to replace warfarin, capturing the attention of popular media. These include dabigatran, rivaroxaban, apixaban, and edoxaban. Dabigatran targets activated factor II (factor IIa), while rivaroxaban, apixaban, and edoxaban target activated factor X (factor Xa). Easy to take with a once or twice daily pill, with no cumbersome monitoring, they represent a seemingly ideal treatment for the chronically anticoagulated patient. All agents are currently FDA approved in the United States for treatment of acute VTE and AF.
Dabigatran and edoxaban
Similar to warfarin, dabigatran and edoxaban require an LMWH or UFH “bridge” when therapy is begun, while rivaroxaban and apixaban are instituted as monotherapy without such a bridge. Dabigatran etexilate (Pradaxa®, Boehringer Ingelheim) has the longest half-life of all the NOACs at 12-17 hours, and this half-life is prolonged with increasing age and decreasing renal function.1 It is the only new agent that can be at least partially reversed with dialysis.2 Edoxaban (Savaysa®, Daiichi Sankyo) carries a boxed warning stating that it is less effective in AF patients with a creatinine clearance greater than 95 mL/min and that kidney function should be assessed before starting treatment: Such patients have a greater risk of stroke, compared with similar patients treated with warfarin. Edoxaban is the only agent specifically tested at a lower dose in patients at significantly increased risk of bleeding complications (low body weight and/or decreased creatinine clearance).3
Rivaroxaban and apixaban
Rivaroxaban (Xarelto®, Bayer and Janssen) and apixaban (Eliquis®, Bristol-Myers Squibb), uniquely among the NOACs, have been tested for extended therapy of acute deep vein thrombosis after an initial 6-12 months of treatment. They were found to significantly decrease recurrent VTE without increasing major bleeding, compared with placebo.4,5 Rivaroxaban is dosed once daily and apixaban twice daily; both are given as immediate monotherapy, making them quite convenient for patients. Apixaban is the only NOAC associated with a slight decrease in gastrointestinal bleeding, compared with warfarin.6
Consequences and pitfalls with NOACs
Problems with these new drugs, which may temper enthusiasm for them to totally replace warfarin, include the inability to reliably measure their levels or reverse their anticoagulant effects, the lack of data on bridging when other procedures need to be performed, their short half-lives, and the lack of data on their anti-inflammatory effects. With regard to monitoring of anticoagulation, the International Society on Thrombosis and Haemostasis (ISTH) has published the circumstances in which it might be useful to obtain drug levels. These include:
• When a patient is bleeding.
• Before surgery or an invasive procedure when the patient has taken the drug in the previous 24 hours, or longer if creatinine clearance (CrCl) is less than 50 mL/min.
• Identification of subtherapeutic or supratherapeutic levels in patients taking other drugs that are known to affect pharmacokinetics.
• Identification of subtherapeutic or supratherapeutic levels in patients at body weight extremes.
• Patients with deteriorating renal function.
• During perioperative management.
• During reversal of anticoagulation.
• When there is suspicion of overdose.
• Assessment of compliance in patients suffering thrombotic events while on treatment.7
Currently, no commercially available reversal agent exists for any of the NOACs, and existing reversal agents for traditional anticoagulants are of limited, if any, use. Drugs under development include agents directed against the factor Xa inhibitors and against the thrombin inhibitor. Until specific reversal agents exist, supportive care is the mainstay of therapy. In cases of trauma or severe or life-threatening bleeding, concentrated clotting factors (prothrombin complex concentrate) or dialysis (dabigatran only) may be used; however, data from large clinical trials are lacking. A recent study of 90 patients receiving an antibody directed against dabigatran found that the anticoagulant effects were reversed safely within minutes of administration; however, drug levels were not consistently suppressed at 24 hours in 20% of the cohort.8
Currently there are no national guidelines or large scale studies to guide bridging NOACs for procedures.
The relatively short half-life of these agents makes it likely that traditional bridging, as practiced with warfarin, is not necessary.9 However, this represents a double-edged sword: withholding anticoagulation for even two doses (for instance, if a patient becomes ill or a clinician is overly cautious around the time of a procedure) may leave the patient unprotected.
The final question with the new agents concerns their anti-inflammatory effects. Heparin and LMWH have significant pleiotropic effects that are not necessarily related to their anticoagulant activity; these effects help decrease the inflammatory nature of the thrombus and its impact on the vein wall. We do not know whether the new oral agents have similar effects, because this has never been fully tested. Given that two of the agents are used as monotherapy without any heparin/LMWH bridge, the anti-inflammatory properties of these new agents should be defined to ensure that such a bridge is truly unnecessary.
In summary, although these agents have much to offer, many questions remain to be answered before they totally replace traditional approaches to anticoagulation in the realm of VTE. It must not be overlooked that, despite all their benefits, each carries a risk of bleeding, because all target portions of the coagulation mechanism. We caution that, as with any “gift horse,” physicians should examine the data closely and proceed with caution.
Thomas Wakefield, MD, is the Stanley Professor of Vascular Surgery; head, section of vascular surgery; and director, Samuel and Jean Frankel Cardiovascular Center. Andrea Obi, MD, is a vascular surgery fellow, and Dawn Coleman, MD, is the program director, section of vascular surgery, all at the University of Michigan, Ann Arbor. They reported having no conflicts of interest.
References
1. N Engl J Med 2009;361:2342-2352
2. J Vasc Surg Venous Lymphat Disord 2013;1:418-426
3. N Engl J Med 2013;369:1406-1415
4. N Engl J Med 2010;363:2499-2510
5. N Engl J Med 2013;368:699-708
6. Arterioscler Thromb Vasc Biol 2015;35:1056-1065
7. J Thromb Haemost 2013;11:756-760
Whole brain radiotherapy not beneficial for NSCLC metastasis
Whole brain radiotherapy, a standard treatment for patients with metastatic non–small-cell lung cancer, provided no clinical benefit in a noninferiority trial specifically designed to assess both patient survival and quality of life.
The findings were published online Sept. 4 in the Lancet.
Whole brain radiotherapy, with or without concomitant steroid treatment, has been widely used for decades in that patient population, even though no sufficiently powered, definitive studies support the approach. It is likely that patients and clinicians alike continue to embrace it because of the absence of alternative treatment options.
The Quality of Life After Treatment for Brain Metastases (QUARTZ) trial was intended to assess whether any improvement in survival offered by whole brain radiotherapy is balanced by deterioration in quality of life, said Paula Mulvenna, MBBS, of the Northern Center for Cancer Care, Newcastle (England) Hospitals, and her associates (Lancet 2016 Sep 4. doi: 10.1016/S0140-6736(16)30825-X).
QUARTZ involved 538 adults seen during a 7-year period who had NSCLC with brain metastases and who were not suited for either brain surgery or stereotactic radiotherapy. The median age was 66 years (range, 35-85 years), and 38% had a Karnofsky Performance Status score of less than 70.
The participants were randomly assigned to receive either optimal supportive care plus whole brain radiotherapy (269 patients) or optimal supportive care alone (269 patients) at 69 U.K. and 3 Australian medical centers. They reported on 20 symptoms and adverse effects, as well as health-related quality of life, approximately once per week.
The primary outcome measure – quality-adjusted life-years (QALY), which combines overall survival and quality of life – was 46.4 days with radiotherapy and 41.7 days without it.
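For readers less familiar with the metric, quality-adjusted survival is, in its simplest form, survival time weighted by a quality-of-life utility between 0 (death) and 1 (full health). The sketch below is a generic illustration with hypothetical utility values, not figures taken from the QUARTZ data:

\[ \text{quality-adjusted days} = \sum_i u_i \, \Delta t_i, \qquad 0 \le u_i \le 1 \]

where \(\Delta t_i\) is the number of days spent in health state \(i\) and \(u_i\) is the utility assigned to that state. For example, 40 days at a utility of 0.7 followed by 20 days at a utility of 0.5 would count as \(0.7 \times 40 + 0.5 \times 20 = 38\) quality-adjusted days.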
Symptoms, adverse effects, and quality of life (QOL) were similar between the two study groups at 4 weeks, except that the radiotherapy group reported more moderate or severe episodes of drowsiness, hair loss, nausea, and dry or itchy scalp. The number and severity of serious adverse events were similar through 12 weeks of follow-up.
The percentage of patients whose QOL was either maintained or improved over time was similar between the two groups at 4 weeks (54% vs. 57%), 8 weeks (44% vs. 51%), and 12 weeks (44% vs. 49%). Changes in Karnofsky scores also were similar.
The study refuted the widely held belief that whole brain radiotherapy allows patients to reduce or discontinue steroid treatment, averting the associated adverse effects. Steroid doses were not significantly different between the two study groups through the first 8 weeks of treatment, which “challenges the dogma that whole brain radiotherapy can be seen as a steroid-sparing modality,” the investigators said.
Taken together, the findings “suggest that whole brain radiotherapy can be omitted and patients treated with optimal supportive care alone, without an important reduction in either overall survival or quality of life,” Dr. Mulvenna and her associates said.
The approximately 5-day difference between the two study groups in median overall survival highlights both the limited benefit offered by radiotherapy and the poor prognosis of this patient population, the researchers added.
Whole brain radiotherapy did offer a small survival benefit to the youngest patients who had good performance status and a “controlled” primary NSCLC. “For all other groups, [it] does not significantly affect QALY or overall survival,” they said.
Cancer Research U.K., the Medical Research Council in the U.K., the Trans Tasman Radiation Oncology Group, and the National Health and Medical Research Council Australia supported the study. Dr. Mulvenna and her associates reported having no relevant financial disclosures.
Managing brain metastases from NSCLC is a challenge, because the lesions may well produce life-threatening symptoms and serious impairment, which could be ameliorated with whole brain radiotherapy.
This is a large and well-designed trial, but it is limited in that the maximal benefit of radiotherapy is believed to occur 6 weeks after the end of treatment. Given that median overall survival was only 8 weeks, and considering the time it took to deliver the treatment, approximately half of the patients in this study died before an optimal assessment of symptoms could be done.
This might also explain why radiotherapy didn’t have an effect on steroid use in this study. Many patients didn’t live long enough for radiotherapy’s steroid-sparing effect to be observed.
Cécile Le Pechoux, MD, is in the department of radiation oncology at Gustave Roussy Cancer Campus in Villejuif, France. She and her associates reported having no relevant financial disclosures. They made these remarks in a comment accompanying the report on the QUARTZ trial (Lancet 2016 Sep 4. doi: 10.1016/S0140-6736[16]31391-5).
FROM THE LANCET
Key clinical point: Whole brain radiotherapy provided no clinically significant benefit for most patients with metastatic NSCLC.
Major finding: The primary outcome measure, quality-adjusted life-years, was 46.4 days with radiotherapy and 41.7 days without it.
Data source: An international, randomized, phase III noninferiority trial involving 538 patients treated during a 7-year period.
Disclosures: Cancer Research U.K., the Medical Research Council in the U.K., the Trans Tasman Radiation Oncology Group, and the National Health and Medical Research Council Australia supported the study. Dr. Mulvenna and her associates reported having no relevant financial disclosures.
Clot retrieval devices approved for initial ischemic stroke treatment
Two Trevo clot retrieval devices can now be marketed as an initial therapy to reduce paralysis from strokes that are caused by blood clots, according to a press release from the Food and Drug Administration.
Previously, the only first-line treatment approved for acute ischemic stroke was tissue plasminogen activator (TPA) delivered intravenously. The FDA approved the Trevo devices based on a clinical trial in which 29% of patients treated with the Trevo device plus TPA and medical management of blood pressure and disability symptoms were functionally independent 3 months after their stroke, compared with only 19% of patients treated with TPA plus medical management alone.
The Trevo devices are approved for use within 6 hours of symptom onset and only following treatment with TPA, which should be administered within 3 hours of stroke onset. Risks associated with Trevo device use include failure to retrieve the blood clot; device malfunctions, including breakage and navigation difficulties; damage to blood vessels; and perforation or hemorrhage.
The Trevo device was first approved by the FDA in 2012 to remove blood clots in order to restore blood flow in stroke patients who could not receive TPA or for those patients who did not respond to TPA therapy. The current approval expands the devices’ indication to a broader group of patients, according to the release.
“This is the first time FDA has allowed the use of these devices alongside TPA, which has the potential to help further reduce the devastating disabilities associated with strokes compared to the use of TPA alone. Now health care providers and their patients have another tool for treating stroke and potentially preventing long-term disability,” Carlos Peña, PhD, director of the division of neurological and physical medicine devices at the FDA’s Center for Devices and Radiological Health, said in the press release.
Find the full press release on the FDA website.
Commentary: INR instability in the NOAC era
Progress in the development of new oral anticoagulants (NOACs), as well as agents for their reversal, has lowered the threshold for using these agents as first-line therapy for nonvalvular atrial fibrillation and venous thromboembolism.1,2 Despite this increase in adoption, however, debate persists as to whether patients chronically maintained on vitamin K antagonists (VKAs), such as warfarin, should be switched to NOACs. The recently published research letter by Pokorney et al. assessed the stability of international normalized ratios (INRs) in patients on long-term warfarin therapy in order to address this question.3
Specifically, prospective registry data from 3,749 patients with at least three INR values in the first 6 months of therapy and six or more in the following year were included. Patients were deemed stable if 80% or more of their INR values fell within the therapeutic range (INR, 2-3).3 During the initiation period, only one in four patients taking warfarin had a stable INR.3 Furthermore, stability in the first 6 months had limited ability to predict stability in the subsequent year (concordance index, 0.61). With regard to time in therapeutic range (TTR), only 32% of patients had a TTR greater than 80% during the first 6 months, and fewer than half of those patients (42%) maintained that level in the following year.
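The two metrics in play here are easy to conflate. "Stability" in the research letter is defined by the share of individual INR measurements falling between 2 and 3, whereas TTR is conventionally estimated by interpolating between measurements so that days spent in range are counted, most often with the Rosendaal method. The sketch below illustrates both calculations on made-up data; it is not the authors' code, and the exact interpolation method used in the letter is not described in this summary.

```python
# Illustrative only: two common ways of summarizing INR control.

def fraction_in_range(inrs, low=2.0, high=3.0):
    """Share of measured INR values falling within the 2-3 therapeutic range."""
    return sum(low <= x <= high for x in inrs) / len(inrs)

def rosendaal_ttr(days, inrs, low=2.0, high=3.0):
    """Time in therapeutic range, assuming INR changes linearly between visits."""
    in_range_days = 0.0
    total_days = days[-1] - days[0]
    for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        if i1 == i0:
            in_range_days += span if low <= i0 <= high else 0.0
            continue
        # Fraction of the interval spent between low and high on the straight
        # line from i0 to i1, clipped to the interval itself.
        t_low = (low - i0) / (i1 - i0)
        t_high = (high - i0) / (i1 - i0)
        t0, t1 = sorted((t_low, t_high))
        overlap = max(0.0, min(1.0, t1) - max(0.0, t0))
        in_range_days += overlap * span
    return in_range_days / total_days

inrs = [1.8, 2.4, 3.4, 2.6, 2.9]   # hypothetical INR values
days = [0, 14, 28, 42, 56]         # days on which they were drawn
print(fraction_in_range(inrs), round(rosendaal_ttr(days, inrs), 2))
```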
Findings from Pokorney et al. add to the growing body of literature demonstrating the difficulty of achieving and maintaining a therapeutic INR while on warfarin therapy.4-7 Clinically, these findings are important, as deviations from TTR have been shown to be associated with increased risk of bleeding and thrombosis as well as increased health care costs.8-10 Mechanistically, patient factors such as differences in vitamin K consumption, comorbid conditions, drug-drug interactions, and medication compliance, as well as genetic differences that impact drug metabolism undoubtedly contribute to the variation of INR noted in patients on warfarin therapy.
Attempts to improve stability have included the administration of low-dose oral vitamin K. However, recent data from a multicenter randomized controlled trial suggest that while such therapy may help to decrease extreme variations in INR, it does not lead to an increased TTR.11 Furthermore, while significant work has been conducted in identifying specific gene variants, such as CYP2C9 and VKORC1, which encode cytochrome P450 and vitamin K epoxide reductase enzymes, respectively, economic analyses suggest that testing for these variants would not be cost-effective.12 Additionally, clinical prediction tools that incorporate important patient factors to help guide anticoagulation explain less than 10% of TTR variability.4
Nonetheless, some caution is warranted in interpreting the results reported by Pokorney and his colleagues. The time in therapeutic range among registry patients treated with warfarin was much lower than that previously reported in the pivotal U.S. trials of NOACs (55%-68%) and significantly lower than in a recent nationwide Swedish registry involving 40,449 patients.13
In the Swedish registry, the mean individual TTR was 70% with more than half the patients having a TTR of 70% or more, emphasizing the importance of health care system effects. Moreover, regardless of whether a patient is on warfarin or a NOAC, patients with a lower TTR have higher rates of diabetes, chronic obstructive pulmonary disease, heart failure, and renal failure, which may contribute to the need for additional therapies that may influence TTR.
For example, INR may be increased by ciprofloxacin or omeprazole when taken with warfarin, and CYP3A4 and P-glycoprotein (P-gp) inducers and inhibitors can result in an increased or decreased anticoagulation effect when used with NOACs. Recent reports have also highlighted variability in the safety of NOACs, particularly among patients with renal or liver insufficiency, African Americans, or patients with a prior history of GI bleeding.14-16 For these subgroups, determining NOAC activity to improve clinical safety of these agents is difficult.
For these agents, PT or INR testing is largely insensitive or otherwise highly variable, and the timing of the blood draw relative to the most recent dose significantly influences the measured level of anti-Xa activity. Importantly, socioeconomic factors and family support systems also influence TTR, as important determinants of access to needed drugs and of the ability to sustain related costs over time.
Taken together, prior INR stability on warfarin therapy does not ensure continued stability and, as a consequence, long-term warfarin therapy requires close monitoring in order to remain effective. To this end, further development of point-of-care coagulometers for self-testing and self-management, which have been found to be acceptable and preferred by patients, should be pursued.17 Similarly, attempts to decrease INR variability through research on optimizing computer-assisted dosing programs remain warranted.18 NOACs offer an advantage over warfarin therapy in that they have a more predictable pharmacokinetic profile, which precludes the need for routine monitoring of anticoagulation parameters. However, many of the same factors that influence TTR for warfarin also apply to NOACs; NOACs carry a higher bleeding risk than warfarin in a number of demographic groups; and the high cost of NOACs may affect patient compliance.
Accordingly, until further data are available, the decision to convert a patient on warfarin with a low TTR to a NOAC should be individualized.
Madhukar S. Patel, MD, is a general surgeon at the Department of Surgery, Massachusetts General Hospital, Boston, and Elliot L. Chaikof, MD, is Surgeon-in-Chief, Beth Israel Deaconess Medical Center, and Chairman, Roberta and Stephen R. Weiner Department of Surgery, Johnson and Johnson Professor of Surgery, Harvard Medical School. Dr. Chaikof is also an associate editor for Vascular Specialist. They have no relevant conflicts.
References
2. Nat Rev Cardiol. 2014;11:693-703.
5. J Thromb Haemost. 2010;8:2182-91.
6. Thromb Haemost. 2009;101:552-6.
7. Am J Cardiovasc Drugs. 2015;15:205-11.
8. Circ Cardiovasc Qual Outcomes. 2008;1:84-91.
10. J Med Econ. 2015;18:333-40.
11. Thromb Haemost. 2016;116:480-5.
12. Ann Intern Med. 2009;150:73-83.
13. JAMA Cardiol. 2016;1:172-80.
14. N Engl J Med. 2013;369:2093-104.
15. JAMA Intern Med. 2015;175:18-24.
16. J Am Coll Cardiol. 2014;63:891-900.
Antibiotic susceptibility differs in transplant recipients
Antibiotic susceptibility in bacteria cultured from transplant recipients at a single hospital differed markedly from that in hospital-wide antibiograms, according to a report published in Diagnostic Microbiology and Infectious Disease.
Understanding the differences in antibiotic susceptibility among these highly immunocompromised patients can help guide treatment when they develop infection, and reduce the delay before they begin receiving appropriate antibiotics, said Rossana Rosa, MD, of Jackson Memorial Hospital, Miami, and her associates.
The investigators examined the antibiotic susceptibility of 1,889 isolates from blood and urine specimens taken from patients who had received solid-organ transplants at a single tertiary-care teaching hospital and then developed bacterial infections during a 2-year period. These patients included both children and adults who had received kidney, pancreas, liver, heart, lung, or intestinal transplants and were treated in numerous, “geographically distributed” units throughout the hospital. Their culture results were compared with those from 10,439 other patients with bacterial infections, which comprised the hospital-wide antibiograms developed every 6 months during the study period.
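An antibiogram is, at bottom, a table of percent-susceptible values tabulated per organism and antibiotic for a defined patient population. A minimal sketch of that tabulation, using hypothetical field names rather than anything from the study's dataset, is shown below; running it separately on transplant-recipient isolates and on all other isolates is what makes the comparison described here possible.

```python
# Hypothetical sketch: tabulating an antibiogram (percent susceptible per
# organism-antibiotic pair) from individual culture results.
from collections import defaultdict

def antibiogram(isolates):
    """isolates: iterable of dicts such as
    {"organism": "E. coli", "antibiotic": "ceftriaxone", "susceptible": True}.
    Returns {(organism, antibiotic): percent susceptible}."""
    counts = defaultdict(lambda: [0, 0])  # [susceptible, total]
    for iso in isolates:
        key = (iso["organism"], iso["antibiotic"])
        counts[key][1] += 1
        if iso["susceptible"]:
            counts[key][0] += 1
    return {k: 100.0 * s / n for k, (s, n) in counts.items()}

# Build one table for transplant recipients and one for the rest of the
# hospital, then compare the percentages side by side.
transplant_isolates = [
    {"organism": "E. coli", "antibiotic": "ceftriaxone", "susceptible": False},
    {"organism": "E. coli", "antibiotic": "ceftriaxone", "susceptible": True},
]
print(antibiogram(transplant_isolates))
```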
The Escherichia coli, Klebsiella pneumoniae, and Pseudomonas aeruginosa isolates from the transplant recipients showed markedly less susceptibility to first-line antibiotics than would have been predicted by the hospital-wide antibiograms. In particular, in the transplant recipients, E. coli infections were resistant to trimethoprim-sulfamethoxazole, levofloxacin, and ceftriaxone; K. pneumoniae infections were resistant to every antibiotic except amikacin; and P. aeruginosa infections were resistant to levofloxacin, cefepime, and amikacin (Diagn Microbiol Infect Dis. 2016 Aug 25. doi: 10.1016/j.diagmicrobio.2016.08.018).
“We advocate for the development of antibiograms specific to solid-organ transplant recipients. This may allow intrahospital comparisons and intertransplant-center monitoring of trends in antimicrobial resistance over time,” Dr. Rosa and her associates said.
FROM DIAGNOSTIC MICROBIOLOGY AND INFECTIOUS DISEASE
Key clinical point: Antibiotic susceptibility in bacteria cultured from transplant recipients differs markedly from that in hospital-wide antibiograms.
Major finding: In the transplant recipients, E. coli infections were resistant to trimethoprim-sulfamethoxazole, levofloxacin, and ceftriaxone; K. pneumoniae infections were resistant to every antibiotic except amikacin; and P. aeruginosa infections were resistant to levofloxacin, cefepime, and amikacin.
Data source: A single-center study comparing the antibiotic susceptibility of 1,889 bacterial isolates from transplant recipients with 10,439 isolates from other patients.
Disclosures: This study was not supported by funding from any public, commercial, or not-for-profit entities. Dr. Rosa and her associates reported having no relevant financial disclosures.
Size does matter
I grew up in a small town. I went to a small college. And I live and practiced in a small town in a sparsely populated state. Clearly, I’m partial to smallness. I have practiced in a two-man partnership, a solo practice, a small group, and finally in a large multicenter organization. The 10 years I practiced by myself were the most productive. They were also the most rewarding period of my life, both professionally and financially.
The small group environment was a close second in terms of job satisfaction. What little I lost in autonomy was almost balanced by the professional stimulation and camaraderie of working shoulder to shoulder with peers. However, all that was lost as our small group was engulfed by a larger entity. Our overhead inflated to a point that it was almost unsustainable. Meetings gobbled up productive office time. Even making little changes that might have allowed us to adapt to the changing clinical landscape seemed to take forever. That is, if they ever happened at all.
In my personal experience, health care delivery doesn’t benefit from the economies of scale claimed by other industries. Bigger is not better for health care delivery.
When the promoters of the Affordable Care Act promised that it would encourage cost-saving and quality-enhancing mergers of health delivery organizations, many other physicians and I had serious doubts about this claim. It turns out that our concerns were well founded. The consolidations that were predicted and so eagerly anticipated by the architects of the ACA have occurred, but they have not resulted in the promised cost savings or quality improvement.
The results have been so disappointing that Dr. Bob Kocher, special assistant to President Obama for health care and economic policy from 2009 to 2010, has felt the need to issue a mea culpa in the form of an op-ed piece in the Wall Street Journal (“How I Was Wrong About ObamaCare,” July 31, 2016). Although Dr. Kocher still believes organizing medicine into networks “that can share information, coordinate care for patients, and manage risk is critical for delivering higher-quality care, generating cost savings and improving the experience for patients,” he acknowledges, “having every provider in health care ‘owned’ by a single organization is more likely to be a barrier to better care.”
He cites recent evidence that “savings and quality improvement are generated much more often by independent primary care doctors than large hospital-centric health systems.” Small independent practices know their patients better. Unencumbered by the weight of multiple organizational layers, they can more nimbly adjust to change. And there will always be change.
Dr. Kocher also admits that he and his co-crafters of the ACA were mistaken in their belief that “it would take three to five years for physicians to use electronic health records effectively.” Unfortunately, he places the blame on what he views as delay tactics by organized medicine. Sadly, he shares this blind spot with too many other former and current government health care officials, most of whom have never suffered under the burden of a user-unfriendly, free time–wasting electronic medical record system.
While it is nice of Dr. Kocher to acknowledge his revelation about size, it comes too late. The bridges have already been burned. Most of the smaller independent practices he now realizes could have provided a solution to the accelerating cost of medical care are gone. In some cases, one of the forces driving small practices to merge was the cost and complexity of converting to electronic medical records.
The mea culpa that we really need to hear now is the one admitting that the rollout of the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009 and its meaningful use requirements was poorly conceived and implemented. Electronic health record systems that could communicate seamlessly with one another and that were inexpensive, intuitive, and user friendly might have allowed more small independent practices to survive.
Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics including “How to Say No to Your Toddler.” Email him at pdnews@frontlinemedcom.com.