In lupus, optimize non-immunosuppressives to dial back prednisone
LAS VEGAS – For patients with mild to moderate systemic lupus erythematosus, don’t go above 6 mg of prednisone daily, because the treatment is likely to be worse than the disease in the long run, said Michelle Petri, MD.
Speaking at the annual Perspectives in Rheumatic Diseases presented by Global Academy for Medical Education, Dr. Petri shared evidence-backed clinical pearls to help rheumatologists dial in good disease control for their systemic lupus erythematosus (SLE) patients without turning to too much prednisone.
Prednisone is also an independent risk factor for cardiovascular events, with dose-dependent effects, she said. She referred to a 2012 study she coauthored (Am J Epidemiol. 2012 Oct;176[8]:708-19) that found an age-adjusted cardiovascular event rate ratio of 2.4 when patients took 10-19 mg of prednisone daily (95% confidence interval [CI], 1.5-3.8; P = .0002). When the daily prednisone dose was over 20 mg, the rate ratio was 5.1 (95% CI, 3.1-8.4; P less than .0001).
Since it’s so important to minimize prednisone exposure, rheumatologists should be familiar with the full toolkit of non-immunosuppressive immunomodulators and understand how to help patients assess the risks and benefits of various treatments.
“Non-immunosuppressive immunomodulators can control mild to moderate systemic lupus, helping to avoid steroids,” Dr. Petri said.
Hydroxychloroquine, when used as background therapy, has proved to have multiple benefits. Not only are flares reduced, but organ damage is reduced, lipids improve, fewer clots occur, and seizures are prevented, she said. There’s also an overall improvement in survival with hydroxychloroquine. In lupus nephritis, “continuing hydroxychloroquine improves complete response rates with mycophenolate mofetil,” Dr. Petri said.
Concerns about hydroxychloroquine-related retinopathy sometimes stand in the way of its use as background therapy, so Dr. Petri encouraged rheumatologists to make a realistic risk-benefit assessment. Higher risk for retinopathy is associated with higher dosing (greater than 6.5 mg/kg of hydroxychloroquine or 3 mg/kg of chloroquine); higher body fat, unless dosing is adjusted accordingly; the presence of renal disease or concomitant retinal disease; and age over 60 years.
Newer imaging techniques may sacrifice specificity for very high sensitivity in detecting hydroxychloroquine-related retinopathy, Dr. Petri said. High-speed ultra-high resolution optical coherence tomography (hsUHR-OCT) and multifocal electroretinography (mfERG) can detect hydroxychloroquine-related retinopathy, but they should be reserved for patients with SLE who actually have visual symptoms, she said. Dr. Petri cited a study of 15 patients who were taking hydroxychloroquine and 6 age-matched controls with normal visual function. All underwent visual field testing, mfERG, and hsUHR-OCT. The study “was unable to find an asymptomatic patient with evidence of definite damage on hsUHR-OCT” or on mfERG, she said (Arch Ophthalmol. 2007 Jun;125[6]:775-80).
Vitamin D supplementation gives a modest boost to overall disease control and also affords some renal protection, Dr. Petri said. She was the lead author on a 2013 study showing that a 20-ng/mL increase in 25-hydroxyvitamin D was associated with reduced global disease severity, as well as with a 21% reduction in the odds of having a SLEDAI (Systemic Lupus Erythematosus Disease Activity Index) score of 5 or more and a 15% decrease in the odds of having a urine protein to creatinine ratio greater than 0.5 (Arthritis Rheum. 2013 Jul;65[7]:1865-71).
When it comes to vitamin D, more matters, Dr. Petri said. “Go above 40 [ng/mL] on vitamin D,” she said, noting that there may be pleiotropic cardiovascular and hematologic benefits as well.
Though dehydroepiandrosterone (DHEA) is not approved by the Food and Drug Administration to treat SLE, 200 mg of DHEA daily helped women with active SLE improve or stabilize disease activity, and also helped 51% of women in one study reduce prednisone to less than 7.5 mg daily, compared with 29% of women taking placebo (P = .03) (Arthritis Rheum. 2002 Jul;46[7]:1820-9). There’s also “mild protection against bone loss,” Dr. Petri said.
Dr. Petri reported receiving research grants and serving as a consultant to several pharmaceutical companies. Global Academy for Medical Education and this news organization are owned by the same parent company.
koakes@frontlinemedcom.com
On Twitter @karioakes
EXPERT ANALYSIS FROM THE ANNUAL PERSPECTIVES IN RHEUMATIC DISEASES
Teen birth rates continue to decline in the United States
The teen birth rate of 22.3/1,000 females aged 15-19 years for 2015 was down almost 8% from the year before and marked the seventh consecutive year of historic lows. Since 1991, when 61.8/1,000 teens aged 15-19 gave birth, the rate has fallen 64%, the National Center for Health Statistics (NCHS) reported.
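As a quick arithmetic check (the calculation below is ours, not a figure from the NCHS report), the 1991 and 2015 rates are consistent with the stated 64% decline:

\[
\frac{61.8 - 22.3}{61.8} = \frac{39.5}{61.8} \approx 0.639 \approx 64\%
\]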
For teens aged 15-19 years, the birth rate declined for each race/ethnicity: dropping 8% for non-Hispanic whites and Hispanics, 9% for non-Hispanic blacks, 10% for Asians or Pacific Islanders, and 6% for American Indians or Alaska Natives. All rates for 2015 were historically low.
Vitamin D deficiency linked to hepatitis B infection
More than half of hepatitis B patients were vitamin D deficient, and deficiency was associated with worse outcomes, in a Vietnamese study published in BMC Infectious Diseases.
A total of 48% of 165 chronic hepatitis B patients, 54% of 127 hepatitis B virus (HBV)-associated cirrhosis patients, and 55% of 108 virus-associated hepatocellular carcinoma (HCC) patients had vitamin D levels below 20 ng/mL, compared with 33% of 122 healthy controls.
Vitamin D levels inversely correlated with viral DNA loads, and lower levels were associated with more severe cirrhosis. On multivariate analyses adjusted for age and sex, HBV infection was an independent risk factor for vitamin D inadequacy (BMC Infect Dis. 2016 Sep 23;16[1]:507).
“These findings suggest that serum vitamin D levels contribute significantly to the clinical courses of HBV infection, including the severe consequences of” cirrhosis and HCC. “Our findings allow speculating that vitamin D and its analogs might provide a potential therapeutic addendum in the treatment of chronic liver diseases,” concluded investigators led by Nghiem Xuan Hoan, MD, of the Vietnamese-German Center for Medical Research in Hanoi.
Total vitamin D (25[OH]D2 and D3) levels were measured only once in the study; the cross-sectional design meant that “we could not determine the fluctuation of vitamin D levels over the course of HBV infection as well as the causative association of vitamin D levels and HBV-related liver diseases,” the researchers explained.
Low levels could be caused by impaired hepatic function, because the liver is key to vitamin D metabolism. HBV patients also tended to be older than the healthy controls, which might have contributed to the greater likelihood of deficiency.
Even so, a case seems to be building that vitamin D has an active role in HBV liver disease. Other investigations also have found an inverse relationship between viral loads and vitamin D status, and low levels previously have been associated with HCC.
Vitamin D is involved in adaptive and innate immune responses that clear the virus and other infections from the body, or at least keep them in check. Deficiency has been associated with susceptibility to various infections and inflammatory diseases, as well as carcinogenesis.
There was no industry funding for the work, and the authors had no disclosures.
FROM BMC INFECTIOUS DISEASES
Asleep deep brain stimulation placement offers advantages in Parkinson’s
PORTLAND, ORE. – Performing deep brain stimulation surgery for Parkinson’s disease using intraoperative CT (ICT) imaging while the patient is under general anesthesia had clinical advantages and no disadvantages over surgery using microelectrode recording (MER) for lead placement with the patient awake, in a prospective, open-label study of 64 patients.
Regarding motor outcomes after asleep surgery, “We found that it was noninferior, so in other words, the change in scores following surgery were the same for asleep and awake patients,” physician assistant and study coauthor Shannon Anderson said during a poster session at the World Parkinson Congress. “What was surprising to us was that verbal fluency ... the ability to come up with the right word, actually improved in our asleep DBS [deep brain stimulation] group, which is a huge complication for patients [and] has a really negative impact on their life.”
Patients with Parkinson’s disease and motor complications (n = 64) were enrolled prospectively at the Oregon Health & Science University in Portland. Thirty received asleep procedures under general anesthesia with ICT guidance for lead targeting to the globus pallidus pars interna (GPi; n = 21) or to the subthalamic nucleus (STN; n = 9). Thirty-four patients received DBS devices with MER guidance (15 STN; 19 GPi). At baseline, the two groups were similar in age (mean 61.1-62.7 years) and off-medication motor subscale scores of the Unified Parkinson’s Disease Rating Scale (mUPDRS; mean 43.0-43.5). The university investigators optimized the DBS parameters at 1, 2, 3, and 6 months after implantation. The same surgeon performed all the procedures at the same medical center.
Motor improvements were similar between the asleep and awake cohorts. At 6 months, the ICT (asleep) group had a mean improvement in motor scores of 14.3 (± 10.88) on the mUPDRS off medication and on DBS, compared with an improvement of 17.6 (± 12.26) for the MER (awake) group (P = .25).
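As a rough consistency check (this calculation is ours, not part of the study), the reported P value can be approximately reproduced from the published group summaries, assuming the ± values are standard deviations and the groups are independent:

```python
# Back-of-the-envelope check of the reported P = .25, assuming the
# published +/- values are standard deviations and using a two-sided
# Welch t-test for independent groups (an assumption, not necessarily
# the study's stated analysis).
from scipy import stats

t, p = stats.ttest_ind_from_stats(
    mean1=14.3, std1=10.88, nobs1=30,  # ICT (asleep) group
    mean2=17.6, std2=12.26, nobs2=34,  # MER (awake) group
    equal_var=False,                   # Welch's unequal-variance correction
)
print(f"t = {t:.2f}, two-sided P = {p:.2f}")  # t = -1.14, P = 0.26
```

The result lands near the reported P = .25, consistent with a between-group difference well within chance variation for cohorts of this size.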
Better language measures with asleep DBS
Asleep DBS with ICT resulted in improvements in aspects of language, whereas awake patients lost language abilities. The asleep group showed a 0.8-point increase in phonemic fluency and a 1.0-point increase in semantic fluency at 6 months versus a worsening on both language measures (–3.5 points and –4.7 points, respectively; both P less than .001) if DBS was performed via MER on awake patients.
Although both cohorts showed significant improvements on the 39-item Parkinson’s Disease Questionnaire at 6 months, the cohorts did not differ in their degrees of improvement. Similarly, both had improvements on scores of activities of daily living, and both cohorts had a 4-4.5 hours/day increase in “on” time without dyskinesia and a 2.6-3.5 hours/day decrease in “on” time with dyskinesia.
Patients tolerated asleep DBS well, and there were no serious complications.
The asleep surgery is much shorter, “so it’s about 2 hours long as opposed to 4, 5, sometimes 8, 10 hours with the awake. There [are fewer] complications, so less risk of hemorrhage or seizures or things like that,” Ms. Anderson said. “In a separate study, we found that it’s a much more accurate placement of the electrodes so the target is much more accurate. So, all of those things considered, we feel the asleep version is definitely the superior choice between the two.”
Being asleep is much more comfortable for the patient, added study leader Matthew Brodsky, MD. “But the biggest advantage is that it’s a single pass into the brain as opposed to multiple passes.” The average number of passes using MER is two to three per side of the brain, and in some centers, four or more. “Problems such as speech prosody are related to pokes in the brain, if you will, rather than stimulation,” he said.
Ms. Anderson said MER “is a fantastic research tool, and it gives us a lot of information on the electrophysiology, but really, there’s no need for it in the clinical application of DBS.”
Based on the asleep procedure’s accuracy, lower rate of complications, shorter operating room time, and noninferiority in terms of motor outcomes, she said, “Our recommendation is that more centers, more neurosurgeons be trained in this technique ... We’d like to see the clinical field move toward that area and really reserve microelectrode recording for the research side of things.”
“If you talk to folks who are considering brain surgery for their Parkinson’s, for some of them, the idea of being awake in the operating room and undergoing this is a barrier that they can’t quite overcome,” Dr. Brodsky said. “So, having this as an option makes it easier for them to sign up for the process.”
Richard Smeyne, PhD, director of the Jefferson Comprehensive Parkinson’s Center at Thomas Jefferson University in Philadelphia, said that the asleep procedure is the newer one and can target either the GPi or the STN. “The asleep DBS seems to have a little bit better improvement on speech afterwards than the awake DBS, and there could be several causes of this,” he said. “Some might be operative in that you can make smaller holes, you can get really nice guidance, you don’t have to sort of move around as in the awake DBS.”
In addition, CT scanning with the patients asleep in the operating room allows more time in the scanner and greater precision in anatomical placement of the DBS leads.
“If I had to choose, looking at this particular study, it would suggest that the asleep DBS is actually a better overall way to go,” Dr. Smeyne said. However, he had no objection to awake procedures “if the neurosurgeon has a record of good results with it ... But if you have the option ... that becomes an individual choice that you should discuss with the neurosurgeon.”
Some of the work presented in the study was supported by a research grant from Medtronic. Ms. Anderson and Dr. Brodsky reported having no other financial disclosures. Dr. Smeyne reported having no financial disclosures.
AT WPC 2016
Key clinical point: Asleep deep brain stimulation placement with intraoperative CT guidance was noninferior to awake placement with microelectrode recording on motor outcomes and was associated with better verbal fluency.
Major finding: The asleep group showed a 0.8-point increase in phonemic fluency and a 1.0-point increase in semantic fluency at 6 months versus a worsening on both language measures (–3.5 points and –4.7 points, respectively; both P less than .001) if DBS was performed via MER on awake patients.
Data source: Prospective, open-label study of 64 patients receiving either awake or asleep deep brain stimulation placement.
Disclosures: Some of the work presented in the study was supported by a research grant from Medtronic. Ms. Anderson and Dr. Brodsky reported having no other financial disclosures. Dr. Smeyne reported having no financial disclosures.
Acne and diet: Still no solid basis for dietary recommendations
BOSTON – For years, much of the information that has circulated about the relationship of diet to acne has been inconsistent. And while recent studies have pointed to two aggravating factors – foods with a high glycemic index and skim milk – whether dietary changes can help control acne remains up in the air, according to Andrea Zaenglein, MD.
At the American Academy of Dermatology summer meeting, Dr. Zaenglein, professor of dermatology and pediatrics, Penn State University, Hershey, reviewed information about diet and acne dating back to the early 1930s, when reports suggested an increased risk of acne in people who were glucose intolerant. Subsequent studies looked at the role of dairy, carbohydrate, chloride, saturated and total fats, sugar, and chocolate in acne. Researchers found no acne in a study of natives of Kitava, an island in Papua New Guinea, whose diet consisted mostly of roots, coconut, fruit, and fish (Arch Dermatol. 2002;138[12]:1584-90). But genetics certainly appears to play a dominant role in acne, Dr. Zaenglein said, referring to a large twin study that attributed 81% of acne to genetic factors and 19% to environmental factors (J Invest Dermatol. 2002 Dec;119[6]:1317-22).
For now, Dr. Zaenglein’s advice is to recommend a low glycemic index diet mainly of fruits and vegetables and low in saturated fats and sugars, which may ameliorate acne by decreasing weight and insulin resistance – especially in patients who already have abnormal metabolic parameters. Patients should also be advised to limit processed foods, meat, and dairy, she added.
As for dairy and acne, it is too early to say what the impact of intervention would be “because we don’t have any great interventional studies that have been done,” she said. Moreover, it is unclear how early dietary interventions would need to be started to reduce an individual’s risk of acne, she added. “It could go all the way back to things that happened when you were born that influence your outcome,” Dr. Zaenglein said. For example, babies born prematurely are at greater risk for endocrine stressors, which lead to premature adrenarche, increasing their risk for acne, she noted.
“We really need new, better intervention studies to be able to give advice to our patients,” she said. However, conducting such studies is challenging because it is difficult for people to eliminate dairy or other types of food from their diets for a prolonged period. To date, the majority of dietary studies have been observational and have relied on subjects’ recall of what they consumed, so these studies have their own inherent problems, she added.
Dr. Zaenglein’s disclosures include serving as an adviser to Anacor Pharmaceuticals and Valeant Pharmaceuticals International, and serving as an investigator and receiving grants/research funding from Anacor, Astellas Pharma US, Ranbaxy Laboratories Limited, and Stiefel, a GSK company.
Perinatal problems raise adult OCD risk
Adverse events during the perinatal period were independently associated with an increased risk of obsessive-compulsive disorder at age 40 years, based on data published Oct. 5 from a population-based cohort study of more than 2 million Swedish children.
Perinatal complications, including C-section delivery and preterm birth, have been linked to psychiatric disorders including schizophrenia, autism, and attention-deficit/hyperactivity disorder, but their impact on obsessive-compulsive disorder (OCD) has not been well studied, reported Gustaf Brander of the Karolinska Institutet, Stockholm, and his colleagues.
The researchers reviewed data from 2,421,284 live singleton births in Sweden between Jan. 1, 1973, and Dec. 31, 1996. The overall prevalence of OCD was 1.3% at 40 years of age. Several perinatal factors were independently associated with an increased OCD risk: breech presentation, cesarean section delivery, gestational age of less than 32 weeks, maternal smoking during pregnancy, Apgar scores near the distress level, and both low and high birth weight (defined as 1,500-2,500 g and greater than 4,500 g, respectively).
“These findings contradict the widely held notion that, although mental disorders share genetic risk factors, the contribution of environmental risk factors is largely disorder specific,” with the exception of maternal smoking during pregnancy, the researchers wrote. In addition, the researchers found a dose-response relationship in which the risk for OCD increased with the greater number of perinatal adverse events.
“The findings are important for the understanding of the cause of OCD and will inform future studies of gene by environment interaction and epigenetics,” the researchers said. “If the finding is replicated, the association between maternal smoking during pregnancy and OCD may emerge as an interesting disorder-specific risk factor for evaluation in future research,” they added.
Mr. Brander had no financial conflicts to disclose; several coauthors disclosed relationships with companies including Eli Lilly, Shire, and Medice.
Find the full study here: (JAMA Psychiatry. 2016 Oct 5. doi: 10.1001/jamapsychiatry.2016.2095).
For HPV-negative women, longer screening intervals are effective
It may be safe to extend cervical cancer screening intervals beyond 5 years, at least for women who are not infected with human papillomavirus.
The rate of cervical cancers was the same among HPV-negative women, whether they had gone through two or three rounds of 5-year exams using both HPV testing and cervical cytology, a large Dutch study has found. This suggests that a longer period between screenings wouldn’t significantly increase the risk of letting new cancers go unnoticed, Maaike G. Dijkstra, MD, and associates reported (BMJ. 2016;355:i4924. doi: 10.1136/bmj.i4924).
The picture, however, is very different for women with an HPV infection, the researchers noted. These women were 12 times more likely to develop cervical cancer and up to 29 times more likely to have a cervical intraepithelial neoplasia of at least grade 3 (CIN3+), compared with women who were HPV negative.
The findings bring a measure of reassurance to the Dutch population, the researchers wrote. The Netherlands intends to lengthen its cervical cancer screening interval to 10 years for HPV-negative women aged 40 years or older.
“HPV-negative women have a very low risk of CIN3+ in the long term, indicating that extension of the current screening interval in the Netherlands to 10 years seems justifiable,” wrote Dr. Dijkstra of the VU University Medical Centre, Amsterdam, and her colleagues. “For HPV positive, triage negative women, the long term risk of CIN3+ was too high to support an extension of the screening interval beyond 5 years.”
The study assessed the 14-year risks of cervical cancer and CIN3+ in 43,339 women aged 29 years and older. Per national guidelines, all women underwent cervical cancer screening every 5 years, for a total of three rounds, and were randomized to receive HPV and cytology testing or cytology only.
Among HPV-negative women in the double-screening group, the cumulative cervical cancer incidence was 0.03% after round two and 0.09% after round three – not a statistically significant difference.
After round three, cervical cancer incidence among HPV-negative women with negative cytology (double negative) was similar to that among cytology-negative women from the control group after round two.
In the cytology-only group, the rates among negative women were 0.09% after round two of screening and 0.19% after round three.
“This indicated that a negative HPV test provides longer reassurance against cervical cancer than negative cytology,” the researchers noted.
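One way to see the arithmetic behind that statement (a rough reading of the reported figures, not a calculation from the paper; the 5-year round spacing is treated here as approximate elapsed time):

\[
\underbrace{0.09\%}_{\text{HPV negative, after round three }(\approx 10\text{ yr})} \;\approx\; \underbrace{0.09\%}_{\text{cytology negative, after round two }(\approx 5\text{ yr})}
\]

On this reading, a negative HPV test confers over roughly 10 years the same reassurance that a negative cytology result confers over roughly 5, which is the basis for the proposed 10-year interval.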
HPV-positive women, however, faced much higher risks, regardless of the screening protocol. Even with negative cytology, they were 12 times more likely to develop cancer by round three than HPV-negative women, and their risk of CIN3+ was up to 29 times higher. Even HPV-positive women with negative cytology, negative HPV 16/18 genotyping, and negative repeat cytology faced a 10-fold increased risk of CIN3+, the researchers reported.
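To put that relative risk in absolute terms (an illustrative pairing of two reported figures, not a number from the paper; it assumes the 12-fold ratio applies to the 0.09% cumulative incidence among HPV-negative women):

\[
12 \times 0.09\% \approx 1.1\%
\]

Even a small baseline risk, multiplied 12-fold, puts the cumulative cancer risk for HPV-positive women on the order of 1%, which is why the authors regard 5 years as the maximum interval for this group.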
Dr. Dijkstra reported having no financial disclosures. Several of the coauthors disclosed financial relationships with various pharmaceutical companies.
msullivan@frontlinemedcom.com
On Twitter @Alz_Gal
The ultimate goal of cervical cancer screening is to minimize unnecessary procedures while maximizing identification of preinvasive disease. The recent article by Dr. Dijkstra and her associates provides 14-year follow-up for a national population-based cervical screening randomized trial.
The authors, who focused their analysis on patients older than age 40, describe how HPV results allow for a more targeted intervention triage than cytology alone. The study is impressive in its large patient cohort and extremely low detection rate of cervical cancer among HPV-negative patients in the intervention group. Their main argument stems from identifying an equivalent risk for CIN3+ between patients who were negative for HPV at 10 years and those with negative cytology at 5 years.
In the United States, current guidelines endorse cotesting for patients in this age group, with a negative cotest requiring repeat testing at 5 years. Changes to the screening protocol in the United States are often met with skepticism and slow uptake. This study certainly shows strong data to support HPV testing as the preferred option, but when does the time gap between screenings become too large, particularly for women whose only health care evaluation is through the gynecologist? Patients often think that not needing a yearly Pap test equates to no need for routine examinations. We may be missing the opportunity not only to strengthen the patient-physician relationship, but also to address other aspects of patients’ general health.
This article may not change practice, but it does add considerable weight to the growing literature on primary HPV screening, especially in an older population.
Ritu Salani, MD, is associate professor of gynecologic oncology at Ohio State University in Columbus. Robert T. Neff, MD, is a second-year fellow in gynecologic oncology at the university. They reported having no relevant financial disclosures.
FROM BMJ
Key clinical point: HPV-negative women have a very low long-term risk of cervical cancer and CIN3+, suggesting that screening intervals for these women can safely be extended beyond 5 years.
Major finding: Among HPV-negative women in the cotesting group, the cumulative cervical cancer incidence was 0.03% after round two and 0.09% after round three – not a statistically significant difference.
Data source: A 14-year Dutch study that randomized more than 43,000 women to cotesting with HPV testing and cervical cytology or to cytology alone.
Disclosures: Dr. Dijkstra reported having no financial disclosures. Several of the coauthors disclosed financial relationships with various pharmaceutical companies.
Rivaroxaban linked to more bleeding compared with dabigatran in elderly patients with nonvalvular AF
Rivaroxaban is associated with significantly more intra- and extracranial bleeding than is dabigatran in older patients who have nonvalvular atrial fibrillation, according to a report published online Oct. 3 in JAMA Internal Medicine.
This is the principal finding of a retrospective cohort study – the only study to directly compare the two non–vitamin K oral anticoagulants – that involved more than 118,000 patients who initiated anticoagulation treatment during a 2.5-year period. The Centers for Medicare & Medicaid Services and the Food and Drug Administration jointly conducted the study.
During the study period, rivaroxaban was used 2-3 times more often than was dabigatran in AF patients in the United States, “perhaps partly because of prescriber misperceptions about bleeding risks with dabigatran, arising from FDA receipt of a large number of postmarketing case reports following its approval. Ironically, we [now find] substantially higher bleeding risks with use of rivaroxaban than dabigatran,” said David J. Graham, MD, of the Office of Surveillance and Epidemiology, Center for Drug Evaluation and Research, FDA, Silver Spring, Md., and his associates.
The researchers assessed Medicare beneficiaries who initiated standard oral doses of rivaroxaban (66,651 patients) or dabigatran (52,240 patients) and were followed for a mean of 110 days.
The primary outcome measure – a composite of thromboembolic stroke, intracranial hemorrhage, major extracranial bleeding events including GI bleeding, and mortality – occurred in significantly more patients taking rivaroxaban than in those taking dabigatran. When the individual components of this composite outcome were considered, rivaroxaban was associated with significant increases in intracranial hemorrhage (HR, 1.65), major extracranial bleeding (HR, 1.48), and major GI bleeding (HR, 1.40); a nonsignificant decrease in thromboembolic stroke (HR, 0.81); and a nonsignificant increase in mortality (HR, 1.15).
In a further analysis of the data, rivaroxaban was linked to 2.3 excess cases of intracranial hemorrhage, 13 excess cases of major extracranial bleeding, 9.4 excess cases of major GI bleeding, and 3.1 excess deaths per 1,000 person-years of treatment. In addition, rivaroxaban was associated with a significantly increased risk of death in two subgroups of patients: those aged 75 and older and those whose CHADS2 scores indicated higher stroke risk, Dr. Graham and his associates said (JAMA Intern. Med. 2016 Oct 3. doi: 10.1001/jamainternmed.2016.5954).
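As a sanity check on how the hazard ratios relate to these excess-case figures (a back-of-the-envelope sketch; the implied dabigatran event rate is back-calculated from the two reported numbers, not taken from the paper):

\[
\text{excess rate} \approx r_{\text{dabigatran}} \times (\mathrm{HR} - 1), \qquad 2.3 \approx r \times (1.65 - 1) \;\Rightarrow\; r \approx 3.5 \text{ per 1,000 person-years}
\]

The same arithmetic applied to major extracranial bleeding (13 excess cases at an HR of 1.48) implies a comparator rate of roughly 27 per 1,000 person-years, a reminder that the absolute stakes differ considerably across the component outcomes.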
Of note, “the net increase in intracranial hemorrhage, the outcome with the highest case fatality rate, exceeded the net reduction in thromboembolic stroke” with rivaroxaban treatment, they added.
This “milestone” study offers real-world data for a large number of older patients with multiple comorbidities who constitute the rising tide of the AF population.
The findings should lead physicians to prescribe dabigatran over rivaroxaban in most patients with AF. Even though this was a retrospective cohort study, there are no prospective randomized trials directly comparing the two non–vitamin-K oral anticoagulants, and the few indirect comparisons derived from clinical trial data are very limited.
Anna L. Parks, MD, is at the University of California, San Francisco. Rita F. Redberg, MD, is the editor of JAMA Internal Medicine and professor of cardiology at UCSF. Dr. Parks and Dr. Redberg made these remarks in an Editor’s Note accompanying Dr. Graham’s report (JAMA Intern. Med. 2016 Oct 3. doi: 10.1001/jamainternmed.2016.6429).
Key clinical point: Rivaroxaban is associated with significantly more intra- and extracranial bleeding than dabigatran in patients aged 65 and older with nonvalvular atrial fibrillation.
Major finding: Rivaroxaban was linked to 2.3 excess cases of intracranial hemorrhage, 13 excess cases of major extracranial bleeding, 9.4 excess cases of major GI bleeding, and 3.1 excess deaths per 1,000 person-years of treatment.
Data source: A retrospective cohort study of 118,891 patients aged 65 and older who initiated anticoagulation therapy for AF during a 2.5-year period.
Disclosures: This study was conducted by employees or contractors of the Centers for Medicare & Medicaid Services and the Food and Drug Administration. Dr. Graham and his associates reported having no relevant financial disclosures.
COPD patient characteristics predict response to maintenance drug
Azithromycin maintenance therapy may be best reserved for patients with mild to moderate chronic obstructive pulmonary disease (COPD) and few symptoms, according to an analysis from the COLUMBUS randomized controlled trial. The study, reported on in Family Practice News, also revealed that patients with a high serum eosinophil level… http://www.familypracticenews.com/specialty-focus/pulmonary-sleep-medicine/single-article-page/copd-patient-characteristics-predict-response-to-maintenance-drug/f29efaba9a4874ed9b754fb87b77b663.html.
VIDEO: New oral anticoagulants cut intracranial bleeds in real-world atrial fib patients
During the first year on anticoagulant treatment, patients who received a new oral anticoagulant (NOAC) had an ischemic stroke rate similar to that of patients who received the traditional oral anticoagulant, warfarin, but a significantly reduced rate of intracranial hemorrhage, according to Laila Stærk, MD, who reported the findings at the annual congress of the European Society of Cardiology. The study included 43,299 Danish patients, of whom 42% received warfarin, 29% dabigatran, 16% apixaban, and 13% rivaroxaban. More on the results of this study is available in this article and video from Cardiology News: http://www.ecardiologynews.com/specialty-focus/arrhythmias-electrophysiology/single-article-page/video-noacs-cut-intracranial-bleeds-in-real-world-atrial-fib-patients/2c213686c34e2f2e9fb58000ff2cad80.html.