Low Vitamin B12 Level Is Associated With Worsening in Parkinson’s Disease

The underlying mechanism of vitamin B12’s role in Parkinson’s disease remains unclear.

People with early, untreated Parkinson’s disease who have low vitamin B12 levels appear to have greater worsening of mobility and greater cognitive decline over time, according to research published online ahead of print March 6 in Movement Disorders. The results suggest that correcting low levels may slow disease progression.

Chadwick W. Christine, MD

Previous research revealed that low serum vitamin B12 levels are common in patients with moderately advanced Parkinson’s disease and are associated with neuropathy and cognitive impairment. Investigators led by Chadwick W. Christine, MD, a neurologist at the University of California, San Francisco, sought to understand what contributes to variation in the progression of Parkinson’s disease.

An Analysis of the DATATOP Study

Because little is known about B12’s role in early disease, the investigators analyzed data from patients with early, untreated Parkinson’s disease who participated in the DATATOP study, a double-blind, randomized trial designed to test whether treatment with selegiline, the antioxidant alpha-tocopherol, or both slowed disease progression.

They measured serum methylmalonic acid, homocysteine, and holotranscobalamin in addition to B12 because serum B12 testing alone has limited sensitivity for detecting B12 deficiency. At baseline, 13% of 680 patients had borderline-low B12 levels (ie, less than 184 pmol/L), and 5% had deficient B12 levels (ie, less than 157 pmol/L). Homocysteine was moderately elevated (ie, greater than 15 µmol/L) in 7% of subjects, and 14% of patients with borderline-low B12 also had elevated homocysteine.
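The cutoffs above amount to a simple classification rule. A minimal sketch, with illustrative function names (the thresholds are those reported in the analysis; the functions themselves are not from the study):

```python
# Classify B12 status using the thresholds reported above:
# deficient < 157 pmol/L, borderline-low < 184 pmol/L;
# homocysteine moderately elevated > 15 umol/L.

def b12_status(b12_pmol_per_l: float) -> str:
    """Return the B12 category for a serum level in pmol/L."""
    if b12_pmol_per_l < 157:
        return "deficient"
    if b12_pmol_per_l < 184:
        return "borderline-low"
    return "normal"

def homocysteine_elevated(hcy_umol_per_l: float) -> bool:
    """True if homocysteine exceeds the 15 umol/L cutoff."""
    return hcy_umol_per_l > 15

print(b12_status(150))          # deficient
print(b12_status(170))          # borderline-low
print(b12_status(250))          # normal
print(homocysteine_elevated(16))  # True
```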

Homocysteine Was Associated With Cognitive Outcomes

Low B12 at baseline predicted greater worsening of mobility, reflected in a higher ambulatory capacity score. Investigators calculated this score by adding the falling, freezing when walking, walking, gait, and postural stability scores of the Unified Parkinson’s Disease Rating Scale (UPDRS). Participants in the low-B12 tertile (ie, less than 234 pmol/L) had greater annualized worsening of the ambulatory capacity score: an annualized change of 1.53, compared with 0.77 in the upper tertile. The worsening mostly resulted from poorer gait and postural stability scores.
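As described above, the ambulatory capacity score is simply the sum of five mobility-related UPDRS item scores. A minimal sketch (the item keys are illustrative labels, not official UPDRS item codes):

```python
# Ambulatory capacity score: sum of the five UPDRS items named in the text.
# Each UPDRS item is typically rated 0 (normal) to 4 (severe).

UPDRS_AMBULATORY_ITEMS = [
    "falling",
    "freezing_when_walking",
    "walking",
    "gait",
    "postural_stability",
]

def ambulatory_capacity_score(updrs_items: dict) -> int:
    """Sum the five mobility-related UPDRS item scores."""
    return sum(updrs_items[item] for item in UPDRS_AMBULATORY_ITEMS)

example = {
    "falling": 1,
    "freezing_when_walking": 0,
    "walking": 2,
    "gait": 2,
    "postural_stability": 1,
}
print(ambulatory_capacity_score(example))  # 6
```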

“We consider the magnitude of difference to be clinically relevant, particularly given that components of gait dysfunction that develop in Parkinson’s disease may not respond to dopaminergic treatments or [deep brain stimulation],” said Dr. Christine and colleagues.

Elevated homocysteine predicted greater cognitive decline. Baseline elevated homocysteine was associated with lower baseline Mini-Mental State Examination (MMSE) score, as well as greater annualized decline in MMSE (1.96 vs 0.06).

Of the 456 subjects who continued in the study for nine to 24 months and had a second blood sample available, 226 had an increase of more than 20% in B12 levels, 210 stayed within 20% of the original B12 measurement, and 19 had a decrease greater than 20%.
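The grouping in the paragraph above is a percent-change classification on the two B12 measurements. A minimal sketch of that rule (function name illustrative):

```python
# Group subjects by relative change in B12 between baseline and the
# second sample: increase > 20%, decrease > 20%, or within +/-20%.

def b12_change_group(baseline: float, follow_up: float) -> str:
    """Classify the percent change in B12 between two measurements."""
    pct_change = (follow_up - baseline) / baseline * 100
    if pct_change > 20:
        return "increase > 20%"
    if pct_change < -20:
        return "decrease > 20%"
    return "within 20%"

print(b12_change_group(200, 260))  # increase > 20% (+30%)
print(b12_change_group(200, 210))  # within 20% (+5%)
print(b12_change_group(200, 150))  # decrease > 20% (-25%)
```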

Overall, the mean annualized increase in B12 was 52.6 pmol/L, the mean annualized decrease in homocysteine was 0.83 µmol/L, and the mean annualized increase in holotranscobalamin was 14.7 pmol/L.

“These findings are consistent with improved nutritional status during the course of the study, likely attributed to subjects starting the optional [multivitamin] after the baseline visit and/or subjects changing their diets,” said the investigators.

While the improvement in B12 status did not lead to statistically significant improvements in UPDRS scores, there was a trend toward improvement, which provides “support for a disease-modifying effect of B12,” they added.

The researchers speculated that a link between low B12 levels and worse outcomes could be attributed to an independent comorbid effect on the CNS and peripheral nervous system or a direct effect on Parkinson’s disease pathogenesis. Alternatively, low B12 may be a marker of an unknown associated factor.

“Given that low B12 status is associated with neurologic and other medical morbidities and is readily treated, great care would be needed to design an ethically acceptable randomized, prospective study to evaluate the effect of B12 supplementation on Parkinson’s disease progression, given that serum measurements collected as part of a prospective study in unsupplemented patients would likely reveal some subjects with B12 deficiency,” said Dr. Christine and colleagues.

Study Raises Questions

“The course of Parkinson’s disease can be quite variable, and it is difficult for clinicians to predict what will happen to an individual person with Parkinson’s disease, but identifying prognostic factors ought to help practitioners answer their patients’ questions and potentially improve the understanding of mechanisms underlying the disease pathogenesis,” said Francisco Cardoso, MD, PhD, Professor of Neurology at the Federal University of Minas Gerais in Belo Horizonte, Brazil, in an accompanying editorial.

If vitamin B12 is related to the progression of Parkinson’s disease, then replacing it may slow patients’ decline. But the findings raise questions that need to be addressed, said Dr. Cardoso.

“First, what constitutes a low vitamin B12 level is not a simple issue. If the evaluation is limited to measurement of vitamin B12 concentration, the diagnosis of genuine deficiency is unreliable. Most experts agree that combined measurement of vitamin B12 with determination of homocysteine levels is necessary, and while Christine et al measured both levels, their statistical analysis and subsequent conclusions are exclusively based on the levels of the vitamin.

“Moreover, 34 patients in the study were classified as having ‘borderline-low’ vitamin B12 levels, and 14 had both borderline-low B12 level and high homocysteine concentration. This brings into question whether the researchers identified Parkinson’s disease patients who actually had low B12 levels.

“The design of the DATATOP trial could also introduce some bias into the findings. At the time of publication, the study was criticized for disregarding the symptomatic effect of selegiline and for a lack of objective definition of criteria for the trial’s primary end point—introduction of levodopa.

“This could have led to the termination of individuals at different stages of the disease, introducing a potential bias in the sample of patients who remained in the study for enough time to undergo subsequent determination of B12 levels.”

Furthermore, the findings of Christine et al contrast with those of previous research. The underlying mechanism of vitamin B12 in Parkinson’s disease is unclear. “Nevertheless, the results of the study by Dr. Christine and his colleagues are intriguing, and further investigations to address this hypothesis are warranted,” Dr. Cardoso concluded.

—Nicola Garrett

Suggested Reading

Cardoso F. Vitamin B12 and Parkinson’s disease: What is the relationship? Mov Disord. 2018 Mar 6 [Epub ahead of print].

Christine CW, Auinger P, Joslin A, et al. Vitamin B12 and homocysteine levels predict different outcomes in early Parkinson’s disease. Mov Disord. 2018 Mar 6 [Epub ahead of print].

Issue
Neurology Reviews - 26(5)
Page Number
10

Young Women With Stroke Have Higher Rates of Pregnancy Complications

Miscarriages appear to be the most frequent pregnancy complication among women with stroke.

When compared with the general population, young women with stroke have more pregnancy loss throughout their lives, according to research published in the April issue of Stroke. After stroke, nulliparous women more frequently experience serious pregnancy complications, compared with the general population. “We found that one out of three women experiences a serious pregnancy complication after stroke,” said Mayte E. van Alebeek, MD, of the Department of Neurology at the Donders Institute for Brain, Cognition, and Behavior, Center for Neuroscience in Nijmegen, the Netherlands, and colleagues.

“Our cohort shows high rates of miscarriages, multiple miscarriages, and extremely high rates of fetal death,” said Dr. van Alebeek. “Our study provides insight about the frequency of pregnancy complications in women who experience a stroke at young age.”

A Prospective Stroke Study

Although 16 to 59 per 100,000 women of childbearing age have stroke every year, there is limited research about the risks of future pregnancy complications after stroke. Dr. van Alebeek and colleagues hypothesized that women with stroke have an increased risk of future pregnancy complications. To test this hypothesis, they conducted a prospective study to investigate the prevalence of pregnancy complications and pregnancy loss in young women before, during, and after ischemic stroke or transient ischemic attack (TIA).

The study was a part of the Dutch Follow-Up of TIA and Stroke Patients and Unelucidated Risk Factor Evaluation (FUTURE) study. Eligible participants were women with first-ever TIA or ischemic stroke who reported that they had been pregnant at least once. Exclusion criteria were cerebral venous sinus thrombosis and retinal infarction. The investigators defined TIA as rapidly evolving focal neurologic deficit without positive phenomena such as twitches, jerks, or myoclonus, with vascular cause only, and persisting for fewer than 24 hours. They defined stroke as focal neurologic deficit persisting for more than 24 hours.
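The FUTURE study definitions above distinguish TIA from stroke by a single duration cutoff: a focal neurologic deficit persisting fewer than 24 hours is a TIA, and one persisting more than 24 hours is a stroke. A minimal sketch of that rule (function name illustrative):

```python
# Apply the 24-hour symptom-duration cutoff from the FUTURE study
# definitions to label an event as TIA or ischemic stroke.

def classify_event(deficit_duration_hours: float) -> str:
    """Label a focal neurologic deficit by its duration in hours."""
    if deficit_duration_hours < 24:
        return "TIA"
    return "ischemic stroke"

print(classify_event(2))   # TIA
print(classify_event(72))  # ischemic stroke
```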

The primary outcome was the occurrence of pregnancy complications (ie, gestational hypertension; preeclampsia; hemolysis, elevated liver enzymes, low platelet count [HELLP] syndrome; preterm delivery; gestational diabetes mellitus; and miscarriage). The secondary outcome was the risk of any vascular event after stroke, stratified by the occurrence of pregnancy complications. Researchers identified the occurrence of recurrent vascular events during a telephone assessment.

Miscarriages Occurred in 35.2% of Women With Stroke

Two hundred thirteen participants completed follow-up assessment of vascular events and pregnancy complications. The mean age at the index event was 39.6, and the mean follow-up was 12.7 years. The number of pregnancies was unknown for three women. Among the remaining 210 women, 569 pregnancies resulted in 425 live births. Pregnancy complications were reported equally in the nulliparous group (patients who experienced stroke or TIA before their first pregnancy resulting in a live-born child), the primiparous/multiparous group (patients who had had one or more such pregnancies), and the gravida group (patients whose event occurred during pregnancy or postpartum, defined as within six weeks after delivery).

Miscarriage occurred in 35.2% of women with stroke vs 13.5% of the Dutch population. Fetal death occurred in 6.1% of women with stroke vs 0.9% of the Dutch population.

Compared with the Dutch population, nulliparous women who had had a stroke had a higher prevalence of hypertensive disorders of pregnancy (33.3% vs 12.2%), HELLP syndrome (9.5% vs 0.5%), and early preterm delivery before 32 weeks (9.0% vs 1.4%).

In primiparous and multiparous women after stroke, 29 recurrent vascular events occurred; none occurred during a subsequent pregnancy. A history of hypertensive disorder in pregnancy did not modify this risk.

This is the first study to address the risk of pregnancy complications in a large group of women after their stroke, said the researchers.

These findings “may imply that women with a history of stroke should be put under intensive control of a gynecologist during pregnancy to prevent serious and possibly life-threatening pregnancy complications,” Dr. van Alebeek and colleagues concluded.

—Erica Tricarico

Suggested Reading

van Alebeek ME, de Vrijer M, Arntz RM, et al. Increased risk of pregnancy complications after stroke: the FUTURE study (Follow-Up of Transient Ischemic Attack and Stroke Patients and Unelucidated Risk Factor Evaluation). Stroke. 2018;49(4):877-883.

Issue
Neurology Reviews - 26(5)
Page Number
32
Miscarriages appear to be the most frequent pregnancy complication among women with stroke.
Miscarriages appear to be the most frequent pregnancy complication among women with stroke.

 

When compared with the general population, young women with stroke have more pregnancy loss throughout their lives, according to research published in the April issue of Stroke. After stroke, nulliparous women more frequently experience serious pregnancy complications, compared with the general population. “We found that one out of three women experiences a serious pregnancy complication after stroke,” said Mayte E. van Alebeek, MD, of the Department of Neurology at the Donders Institute for Brain, Cognition, and Behavior, Center for Neuroscience in Nijmegen, the Netherlands, and colleagues.

“Our cohort shows high rates of miscarriages, multiple miscarriages, and extremely high rates of fetal death,” said Dr. van Alebeek. “Our study provides insight about the frequency of pregnancy complications in women who experience a stroke at young age.”

A Prospective Stroke Study

Although 16 to 59 per 100,000 women of childbearing age have stroke every year, there is limited research about the risks of future pregnancy complications after stroke. Dr. van Alebeek and colleagues hypothesized that women with stroke have an increased risk of future pregnancy complications. To test this hypothesis, they conducted a prospective study to investigate the prevalence of pregnancy complications and pregnancy loss in young women before, during, and after ischemic stroke or transient ischemic attack (TIA).

The study was a part of the Dutch Follow-Up of TIA and Stroke Patients and Unelucidated Risk Factor Evaluation (FUTURE) study. Eligible participants were women with first-ever TIA or ischemic stroke who reported that they had been pregnant at least once. Exclusion criteria were cerebral venous sinus thrombosis and retinal infarction. The investigators defined TIA as rapidly evolving focal neurologic deficit without positive phenomena such as twitches, jerks, or myoclonus, with vascular cause only, and persisting for fewer than 24 hours. They defined stroke as focal neurologic deficit persisting for more than 24 hours.

When compared with the general population, young women with stroke have more pregnancy loss throughout their lives, according to research published in the April issue of Stroke. After stroke, nulliparous women more frequently experience serious pregnancy complications, compared with the general population. “We found that one out of three women experiences a serious pregnancy complication after stroke,” said Mayte E. van Alebeek, MD, of the Department of Neurology at the Donders Institute for Brain, Cognition, and Behavior, Center for Neuroscience in Nijmegen, the Netherlands, and colleagues.

“Our cohort shows high rates of miscarriages, multiple miscarriages, and extremely high rates of fetal death,” said Dr. van Alebeek. “Our study provides insight about the frequency of pregnancy complications in women who experience a stroke at young age.”

A Prospective Stroke Study

Although 16 to 59 per 100,000 women of childbearing age have a stroke each year, there is limited research about the risk of future pregnancy complications after stroke. Dr. van Alebeek and colleagues hypothesized that women with stroke have an increased risk of future pregnancy complications. To test this hypothesis, they conducted a prospective study to investigate the prevalence of pregnancy complications and pregnancy loss in young women before, during, and after ischemic stroke or transient ischemic attack (TIA).

The study was a part of the Dutch Follow-Up of TIA and Stroke Patients and Unelucidated Risk Factor Evaluation (FUTURE) study. Eligible participants were women with first-ever TIA or ischemic stroke who reported that they had been pregnant at least once. Exclusion criteria were cerebral venous sinus thrombosis and retinal infarction. The investigators defined TIA as rapidly evolving focal neurologic deficit without positive phenomena such as twitches, jerks, or myoclonus, with vascular cause only, and persisting for fewer than 24 hours. They defined stroke as focal neurologic deficit persisting for more than 24 hours.

The primary outcome was the occurrence of pregnancy complications (ie, gestational hypertension; preeclampsia; hemolysis, elevated liver enzymes, low platelet count [HELLP] syndrome; preterm delivery; gestational diabetes mellitus; and miscarriage). The secondary outcome was the risk of any vascular event after stroke, stratified by the occurrence of pregnancy complications. Researchers identified the occurrence of recurrent vascular events during a telephone assessment.

Miscarriages Occurred in 35.2% of Women With Stroke

Two hundred thirteen participants completed follow-up assessment of vascular events and pregnancy complications. The mean age at the index event was 39.6 years, and mean follow-up was 12.7 years. The number of pregnancies was unknown for three women. Among the remaining 210 women, 569 pregnancies resulted in 425 live births. Pregnancy complications were reported equally in the nulliparous group (women whose stroke or TIA preceded their first pregnancy resulting in a live birth), the primiparous/multiparous group (women with one or more pregnancies before the event), and the gravida group (women whose event occurred during pregnancy or postpartum, defined as within six weeks after delivery).

Miscarriage occurred in 35.2% of women with stroke vs 13.5% of the Dutch population. Fetal death occurred in 6.1% of women with stroke vs 0.9% of the Dutch population.

Compared with the Dutch population, nulliparous women after stroke had a higher prevalence of hypertensive disorders of pregnancy (33.3% vs 12.2%), HELLP syndrome (9.5% vs 0.5%), and early preterm delivery before 32 weeks (9.0% vs 1.4%).

Among primiparous and multiparous women, 29 recurrent vascular events occurred after stroke. None of these events occurred during a subsequent pregnancy. A history of hypertensive disorder in pregnancy did not modify this risk.

This is the first study to address the risk of pregnancy complications in a large group of women after stroke, said the researchers.

These findings “may imply that women with a history of stroke should be put under intensive control of a gynecologist during pregnancy to prevent serious and possibly life-threatening pregnancy complications,” Dr. van Alebeek and colleagues concluded.

—Erica Tricarico

Suggested Reading

van Alebeek ME, de Vrijer M, Arntz RM, et al. Increased risk of pregnancy complications after stroke: the FUTURE study (Follow-Up of Transient Ischemic Attack and Stroke Patients and Unelucidated Risk Factor Evaluation). Stroke. 2018;49(4):877-883.

Issue
Neurology Reviews - 26(5)
Page Number
32

Siponimod May Benefit Patients With Secondary Progressive MS

Article Type
Changed
Thu, 12/15/2022 - 15:51
The therapy appears to be safe and reduces the risk of disability progression, compared with placebo.

 

Among patients with secondary progressive multiple sclerosis (MS), siponimod appears to decrease the risk of disability progression, according to data published online ahead of print March 22 in Lancet. The drug’s safety profile is similar to those of other sphingosine-1-phosphate receptor modulators, and siponimod “might be a useful treatment for patients with secondary progressive MS,” according to the authors.

To date, no molecule has slowed disability progression consistently in clinical trials of patients with secondary progressive MS. Preclinical data indicate that siponimod might prevent synaptic neurodegeneration and promote remyelination. The drug reduced the number of active brain lesions and the annualized relapse rate in a phase II study.

Patients Received 2 mg/day of Oral Siponimod

Ludwig Kappos, MD, Professor of Neurology at the University of Basel in Switzerland, and colleagues conducted a double-blind, phase III trial to assess siponimod’s efficacy and safety in patients with secondary progressive MS. They enrolled 1,651 patients at 292 hospitals and MS centers in 31 countries. Eligible participants were between ages 18 and 60, had an Expanded Disability Status Scale (EDSS) score of 3.0 to 6.5, a history of relapsing-remitting MS, EDSS progression in the previous two years, and no evidence of relapse in the three months before randomization.

Ludwig Kappos, MD

The investigators randomized patients 2:1 to once-daily oral siponimod (2 mg) or matching placebo. A trained assessor performed a full neurologic examination every three months. Participants underwent MRI scans at baseline, 12 months, 24 months, 36 months, and the end of the controlled treatment phase. Patients with six-month confirmed disability progression during the double-blind phase were given the opportunity to continue double-blind treatment, switch to open-label siponimod, or stop study treatment and remain untreated or receive another therapy.

The study’s primary end point was time to three-month confirmed disability progression, which was defined as a one-point increase in EDSS for patients with a baseline score of 3.0 to 5.0, or a 0.5-point increase for patients with a baseline score of 5.5 to 6.5. The key secondary end points were time to three-month confirmed worsening of at least 20% from baseline on the Timed 25-Foot Walk (T25FW) and change from baseline in T2 lesion volume.
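The baseline-dependent progression threshold described above can be expressed as a simple rule. The sketch below is illustrative only; the function name and structure are ours, not taken from the study protocol, and the rule covers only the trial's eligible baseline range of 3.0 to 6.5.

```python
def meets_progression_threshold(baseline_edss: float, current_edss: float) -> bool:
    """Return True if the EDSS change meets the trial's progression threshold:
    a 1.0-point increase for baseline scores of 3.0-5.0, or a 0.5-point
    increase for baseline scores of 5.5-6.5."""
    if 3.0 <= baseline_edss <= 5.0:
        required_increase = 1.0
    elif 5.5 <= baseline_edss <= 6.5:
        required_increase = 0.5
    else:
        raise ValueError("Baseline EDSS outside the trial's 3.0-6.5 range")
    return current_edss - baseline_edss >= required_increase

# A patient with baseline EDSS 4.0 must reach at least 5.0 to qualify,
# whereas a patient with baseline EDSS 6.0 need only reach 6.5.
```

In the trial itself, the increase also had to be confirmed at a visit three months later, a step this sketch does not model.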

Treatment Did Not Affect the T25FW

Dr. Kappos and colleagues randomized 1,105 patients to siponimod and 546 patients to placebo. Baseline characteristics were similar between the two study arms. Mean age was 48, and about 60% of patients were women. Participants’ median time in the study was 21 months, and median exposure to study drug was 18 months. Approximately 82% of the siponimod group and 78% of controls completed the study.

The rate of three-month confirmed disability progression was 26% in the siponimod group and 32% in the placebo group. Siponimod thus reduced the risk of this outcome by 21%. The researchers did not observe a significant difference between groups in time to three-month confirmed worsening of at least 20% on the T25FW. The mean increase in T2 lesion volume from baseline was 183.9 mm³ in the siponimod group and 879.2 mm³ among controls.

Siponimod reduced the risk of six-month confirmed disability progression by 26%, compared with placebo. It also was associated with a lower annualized relapse rate and a longer time to confirmed first relapse, compared with placebo.

The rate of adverse events was 89% in the siponimod group and 82% among controls. The rate of serious adverse events was 18% in the siponimod group and 15% among controls. The most frequent adverse events were headache, nasopharyngitis, urinary tract infection, and falls. Serious adverse events included increased liver transaminase concentrations, basal cell carcinoma, concussion, depression, urinary tract infection, suicide attempt, gait disturbance, MS relapse, and paraparesis. The rate of discontinuation because of adverse events was 8% for siponimod and 5% for placebo.

Subgroup analyses suggested that the treatment effect of siponimod decreased with increasing age, disability, baseline disease duration, and diminishing signs of disease activity. One interpretation of this finding “is that siponimod exerts its effect on both aspects of the pathogenesis of secondary progressive disease, albeit not equally,” according to the authors.

The study was funded by Novartis Pharma, which helped design and conduct the study; collect, manage, analyze, and interpret the data; and write the study report.

Drug Might Affect Inflammatory Activity

“The reduction in the proportion of participants reaching the primary end point of only 6% and the absence of a significant difference for the key secondary clinical outcome are disappointing results and do not suggest that siponimod is an effective treatment for secondary progressive MS,” said Luanne M. Metz, MD, Professor of Medicine, and Wei-Qiao Liu, MD, a doctoral student, both at the University of Calgary, Alberta, in an accompanying editorial.

 

 

Although siponimod had a “small benefit” on the primary end point, it did not reduce the time to three-month confirmed worsening of the T25FW, said the authors. “Worsening of the T25FW by 20%, as required in this trial, is a reliable measure of change and suggests clinical significance.”

The significant differences between the siponimod group and controls on the other secondary outcomes “might all reflect an effect on the inflammatory disease activity that characterizes relapsing-remitting MS, and this trial does not, in our opinion, provide convincing evidence that we have found a treatment that exerts its clinical effect through other mechanisms,” said Drs. Metz and Liu.

“Confidence in the treatment benefit of siponimod in progressive MS will … require confirmation in a second trial,” they continued. “Trials of other novel treatments that target noninflammatory mechanisms are still needed.”

—Erik Greb

Suggested Reading

Kappos L, Bar-Or A, Cree BAC, et al. Siponimod versus placebo in secondary progressive multiple sclerosis (EXPAND): a double-blind, randomised, phase 3 study. Lancet. 2018 Mar 22 [Epub ahead of print].

Metz LM, Liu WQ. Effective treatment of progressive MS remains elusive. Lancet. 2018 Mar 22 [Epub ahead of print].

Issue
Neurology Reviews - 26(5)
Page Number
63

MRI Techniques Could Help Distinguish Between MS and Migraine

Article Type
Changed
Wed, 01/16/2019 - 15:39
Using a combination of imaging methods could prevent misdiagnoses and aid the administration of effective treatment.

STOWE, VT—Some patients with migraine receive an inappropriate diagnosis of multiple sclerosis (MS). The two disorders share certain clinical and radiologic features, and misdiagnosis is a significant problem. Using MRI scanners widely available to clinicians, researchers are developing several imaging techniques that can provide an objective basis for distinguishing between MS and migraine, according to an overview provided at the Headache Cooperative of New England’s 28th Annual Stowe Headache Symposium.

Andrew J. Solomon, MD

The imaging techniques evaluate different aspects of MS pathology, said Andrew J. Solomon, MD, Associate Professor of Neurological Sciences at the University of Vermont College of Medicine in Burlington. The techniques have been automated to a large extent, which reduces the need for human interpretation of data. The incorporation of machine learning could further aid differential diagnosis.

Grounds for Confusion

Various similarities between migraine and MS increase the likelihood of misdiagnosis. The two disorders are chronic and entail attacks and remissions. Both are associated with changes in brain structure and white matter abnormalities that may be subclinical.

In a study of patients with migraine by Liu et al, between 25% and 35% of participants met MRI criteria for dissemination in space for MS, depending on how lesions were defined. The first report of natalizumab-associated progressive multifocal leukoencephalopathy occurred in a patient who, on autopsy, was found not to have had MS. In a 1988 study, Engell found that of 518 consecutive patients who had died with a diagnosis of clinically definite MS, the diagnosis was incorrect for 6%.

In 2005, Carmosino and colleagues evaluated 281 patients who had been referred to an MS center and found that 67% of them did not have MS. The investigators identified 37 alternative diagnoses, of which migraine was the second most common. About 10% of participants had a final diagnosis of migraine.

In a recent survey, Dr. Solomon and colleagues asked more than 100 MS specialists whether they had seen patients who had had a diagnosis of MS for more than one year, but, on evaluation, determined that they did not have MS. Approximately 95% of respondents answered affirmatively. About 40% of respondents reported having seen three to five such patients in the previous year.

The current diagnostic criteria for MS rely on clinicians to interpret clinical and radiologic data and contain many caveats regarding their application, said Dr. Solomon. The criteria “were not developed to differentiate MS from other disorders,” but to predict which patients with an initial neurologic syndrome typical for MS will subsequently develop MS, he added. Physicians who are unfamiliar with the diagnostic criteria may misapply them and make an incorrect diagnosis.

The Central Vein Sign

Autopsy studies have indicated that MS lesions generally center around veins. Researchers have recently been able to visualize these veins within MS lesions using 7-T MRI. This finding, which investigators have called the central vein sign, could be a way to distinguish MS from other disorders, but 7-T MRI generally is not available to clinical neurologists. In 2012, scientists at the NIH developed a method that combines T2* imaging, which helps visualize veins, with fluid-attenuated inversion recovery (FLAIR) imaging, which visualizes MS lesions. This method depicts veins within lesions (ie, the central vein sign) on 3-T MRI, which is more commonly available to clinical neurologists. The researchers called this sequence FLAIR*, and numerous studies have suggested that it may differentiate MS from other diagnoses.

Dr. Solomon and collaborators tested this technique on a group of 10 patients with MS who had no other comorbidities for white matter disease and 10 patients with migraine and white matter abnormalities who also had no other comorbidities for white matter disease. The mean percentage of lesions with central vessels per participant was 80% in patients with MS and 34% in migraineurs. The patients with migraine had fewer juxtacortical, periventricular, and infratentorial lesions, compared with patients with MS.

Because researchers have used various definitions of the central vein sign, Dr. Solomon and colleagues published a consensus statement to improve the interpretation of the imaging findings. They recommended that neurologists disregard periventricular lesions and concentrate on subcortical and white matter lesions that are visible from two perspectives.

Another limitation of this diagnostic imaging technique is that it “requires evaluation of every single lesion to determine if a central vein was present,” said Dr. Solomon. He and his colleagues developed a simplified algorithm that required the examination of three lesions. To test this algorithm, they examined their original cohort plus 10 patients with MS and comorbidities for white matter disease (eg, migraine or hypertension) and 10 patients who had been misdiagnosed with MS (most of whom had migraine). Three blinded raters examined three lesions chosen at random from each MRI. This method had a 0.98 specificity for MS and a sensitivity of 0.52. The study demonstrated problems with inter-rater reliability, however.
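Sensitivity and specificity, the metrics used to summarize the three-lesion algorithm, follow directly from a test's confusion matrix. The sketch below illustrates the definitions; the counts are hypothetical, chosen only so that the outputs match the reported specificity of 0.98 and sensitivity of 0.52, and do not come from the study.

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    """Proportion of patients with MS who are correctly flagged as having MS."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Proportion of patients without MS who are correctly cleared."""
    return true_neg / (true_neg + false_pos)

# Hypothetical counts: 13 of 25 MS scans flagged; 49 of 50 non-MS scans cleared.
print(sensitivity(13, 12))   # 0.52
print(specificity(49, 1))    # 0.98
```

The trade-off reported here is typical of a highly specific rule: very few false-positive MS diagnoses, at the cost of missing roughly half of true cases.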

Dr. Solomon later collaborated with researchers at the University of Pennsylvania to develop a machine learning technique that could identify the central vein sign. When they applied the technique to the expanded cohort of 40 patients, it identified the sign accurately with an area under the curve of about 0.86. The central vein sign may be a good biomarker for MS, and using this automated technique to assess 3-T MRI images appears to be clinically applicable, said Dr. Solomon.

 

 

Thalamic Volume

Thalamic atrophy is common in the early stages of relapsing-remitting MS. The thalamus also is implicated in migraine. Although studies have examined volumetric brain changes in migraine, none has examined thalamic volume specifically, said Dr. Solomon.

He and his colleagues used an automatic segmentation method to analyze thalamic volume in their cohort of 40 patients. Analysis of variance indicated that thalamic volume was significantly smaller in patients with MS, compared with patients without MS. When the researchers used a thalamic volume less than 0.0077 as a cutoff, the technique’s sensitivity and specificity for the diagnosis of MS were 0.75.

Recent data suggest that thalamic atrophy in MS does not result from thalamic lesions, but from diffuse white matter abnormalities. Like the central vein sign, thalamic atrophy may reflect MS pathophysiology and could be incorporated into MS diagnostic criteria, said Dr. Solomon.

Cortical Lesions

Autopsy and MRI studies have shown that cortical lesions are characteristic of MS, but MRI studies have suggested that migraineurs generally do not have cortical lesions. Although neurologists can see these lesions in vivo on 7-T MRI, 3-T MRI is not as sensitive and makes cortical lesion detection challenging.

In 2017, Nakamura and colleagues found that ratio maps of T1- and T2-weighted 3-T MRI, images that are acquired in routine clinical care for MS, could identify areas of cortical demyelination. Dr. Solomon and colleagues tested whether this method could distinguish MS from migraine. They defined a z score of less than 3 as an indication of low myelin density. When they examined the cohort of 40 patients, they were able to correlate areas with z scores below the cutoff with cortical lesions that were visible on conventional imaging. The technique accurately distinguished patients with MS from patients with migraine.

None of these emerging imaging techniques is 100% accurate. In the future, however, combining several of these techniques in conjunction with tests of blood biomarkers such as microRNA could accurately distinguish between MS and other disorders with high specificity and sensitivity, Dr. Solomon concluded.

—Erik Greb

Suggested Reading

Carmosino MJ, Brousseau KM, Arciniegas DB, Corboy JR. Initial evaluations for multiple sclerosis in a university multiple sclerosis center: outcomes and role of magnetic resonance imaging in referral. Arch Neurol. 2005;62(4):585-590.

Engell T. A clinico-pathoanatomical study of multiple sclerosis diagnosis. Acta Neurol Scand. 1988;78(1):39-44.

Liu S, Kullnat J, Bourdette D, et al. Prevalence of brain magnetic resonance imaging meeting Barkhof and McDonald criteria for dissemination in space among headache patients. Mult Scler. 2013;19(8):1101-1105.

Nakamura K, Chen JT, Ontaneda D, et al. T1-/T2-weighted ratio differs in demyelinated cortex in multiple sclerosis. Ann Neurol. 2017;82(4):635-639.

Sati P, Oh J, Constable RT, et al. The central vein sign and its clinical evaluation for the diagnosis of multiple sclerosis: a consensus statement from the North American Imaging in Multiple Sclerosis Cooperative. Nat Rev Neurol. 2016;12(12):714-722.

Solomon AJ, Klein EP, Bourdette D. “Undiagnosing” multiple sclerosis: the challenge of misdiagnosis in MS. Neurology. 2012;78(24):1986-1991.

Solomon AJ, Schindler MK, Howard DB, et al. “Central vessel sign” on 3T FLAIR* MRI for the differentiation of multiple sclerosis from migraine. Ann Clin Transl Neurol. 2015;3(2):82-87.

Solomon AJ, Watts R, Dewey BE, Reich DS. MRI evaluation of thalamic volume differentiates MS from common mimics. Neurol Neuroimmunol Neuroinflamm. 2017;4(5):e387.

Issue
Neurology Reviews - 26(5)
Page Number
3-4

Using a combination of imaging methods could prevent misdiagnoses and aid the administration of effective treatment.

STOWE, VT—Some patients with migraine receive an inappropriate diagnosis of multiple sclerosis (MS). The two disorders share certain clinical and radiologic features, and misdiagnosis is a significant problem. Using MRI scanners widely available to clinicians, researchers are developing several imaging techniques that can provide an objective basis for distinguishing between MS and migraine, according to an overview provided at the Headache Cooperative of New England’s 28th Annual Stowe Headache Symposium.

Andrew J. Solomon, MD

The imaging techniques evaluate different aspects of MS pathology, said Andrew J. Solomon, MD, Associate Professor of Neurological Sciences at the University of Vermont College of Medicine in Burlington. The techniques have been automated to a large extent, which reduces the need for human interpretation of data. The incorporation of machine learning could further aid differential diagnosis.

Grounds for Confusion

Various similarities between migraine and MS increase the likelihood of misdiagnosis. The two disorders are chronic and entail attacks and remissions. Both are associated with changes in brain structure and white matter abnormalities that may be subclinical.

In a study of patients with migraine by Liu et al, between 25% and 35% of participants met MRI criteria for dissemination in space for MS, depending on how lesions were defined. The first report of natalizumab-associated progressive multifocal leukoencephalopathy occurred in a patient who, on autopsy, was found not to have had MS. In a 1988 study, Engell and colleagues found that of 518 consecutive patients who had died with a diagnosis of clinically definite MS, the diagnosis was incorrect for 6%.

In 2005, Carmosino and colleagues evaluated 281 patients who had been referred to an MS center and found that 67% of them did not have MS. The investigators identified 37 alternative diagnoses, of which migraine was the second most common. About 10% of participants had a final diagnosis of migraine.

In a recent survey, Dr. Solomon and colleagues asked more than 100 MS specialists whether they had seen patients who had had a diagnosis of MS for more than one year, but, on evaluation, determined that they did not have MS. Approximately 95% of respondents answered affirmatively. About 40% of respondents reported having seen three to five such patients in the previous year.

The current diagnostic criteria for MS rely on clinicians to interpret clinical and radiologic data and contain many caveats regarding their application, said Dr. Solomon. The criteria “were not developed to differentiate MS from other disorders,” but to predict which patients with an initial neurologic syndrome typical for MS will subsequently develop MS, he added. Physicians who are unfamiliar with the diagnostic criteria may misapply them and make an incorrect diagnosis.

The Central Vein Sign

Autopsy studies have indicated that MS lesions generally center on veins. Researchers have recently been able to visualize these veins within MS lesions using 7-T MRI. This finding, which investigators have called the central vein sign, could be a way to distinguish MS from other disorders, but 7-T MRI generally is not available to clinical neurologists. In 2012, scientists at the NIH developed a method that combines T2* imaging, which helps visualize veins, with fluid-attenuated inversion recovery (FLAIR) imaging, which visualizes MS lesions. The combined sequence, which the researchers called FLAIR*, reveals veins within lesions (ie, the central vein sign) on 3-T MRI, which is more commonly available to clinical neurologists, and numerous studies have suggested that it may differentiate MS from other diagnoses.

Dr. Solomon and collaborators tested this technique on a group of 10 patients with MS who had no other comorbidities for white matter disease and 10 patients with migraine and white matter abnormalities who also had no other comorbidities for white matter disease. The mean percentage of lesions with central vessels per participant was 80% in patients with MS and 34% in migraineurs. The patients with migraine had fewer juxtacortical, periventricular, and infratentorial lesions, compared with patients with MS.
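The group-level difference reported here (a mean of 80% of lesions with central vessels in MS vs 34% in migraine) lends itself to a simple per-patient rule: compute the fraction of lesions showing a central vein and compare it with a cutoff. The sketch below is purely illustrative; the 40% threshold and the lesion profiles are assumptions, not values from the study.

```python
# Hypothetical per-patient central vein sign rule. The 40% cutoff and
# the example lesion data are illustrative assumptions, not values
# reported by Dr. Solomon's group.

def central_vein_fraction(lesions):
    """lesions: list of booleans, True if a central vein is visible."""
    if not lesions:
        raise ValueError("at least one lesion is required")
    return sum(lesions) / len(lesions)

def classify_ms(lesions, threshold=0.40):
    """Return True (suggests MS) if the fraction of lesions showing a
    central vein exceeds the threshold."""
    return central_vein_fraction(lesions) > threshold

# A profile resembling the reported MS mean (~80% of lesions with veins)
ms_like = [True] * 8 + [False] * 2
# ...and one resembling the migraine mean (~34%)
migraine_like = [True] * 3 + [False] * 7

print(classify_ms(ms_like))        # True
print(classify_ms(migraine_like))  # False
```

In practice the hard part is the imaging and lesion-by-lesion rating, not the arithmetic, which motivates the simplified and automated approaches described below.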

Because researchers have used various definitions of the central vein sign, Dr. Solomon and colleagues published a consensus statement to improve the interpretation of the imaging findings. They recommended that neurologists disregard periventricular lesions and concentrate on subcortical and white matter lesions in which the vein can be seen in two perpendicular imaging planes.

Another limitation of this diagnostic imaging technique is that it “requires evaluation of every single lesion to determine if a central vein was present,” said Dr. Solomon. He and his colleagues developed a simplified algorithm that required the examination of three lesions. To test this algorithm, they examined their original cohort plus 10 patients with MS and comorbidities for white matter disease (eg, migraine or hypertension) and 10 patients who had been misdiagnosed with MS (most of whom had migraine). Three blinded raters examined three lesions chosen at random from each MRI. This method had a 0.98 specificity for MS and a sensitivity of 0.52. The study demonstrated problems with inter-rater reliability, however.
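The three-lesion sampling step can be sketched in a few lines. The decision rule assumed below (call the scan MS-like only if all three randomly sampled lesions show a central vein) is an illustration rather than the published algorithm's exact rule, though a strict rule of this kind is consistent with the high specificity (0.98) and modest sensitivity (0.52) reported.

```python
import random

# Sketch of a simplified three-lesion sampling rule. The all-three
# requirement is an assumption for illustration; the published rule
# may differ.

def three_lesion_rule(lesions, rng=None):
    """lesions: list of booleans (True if a central vein is visible).
    Sample three lesions at random and require a central vein in all."""
    if len(lesions) < 3:
        raise ValueError("need at least three lesions")
    rng = rng or random.Random()
    return all(rng.sample(lesions, 3))

print(three_lesion_rule([True] * 6))   # True: every lesion has a vein
print(three_lesion_rule([False] * 6))  # False: no lesion has a vein
```

Because only three lesions are rated, the result depends on which lesions are drawn, which is one way rater disagreement and sampling variability can enter.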

Dr. Solomon later collaborated with researchers at the University of Pennsylvania to develop a machine learning technique that could identify the central vein sign. When they applied the technique to the expanded cohort of 40 patients, it identified the sign accurately with an area under the curve of about 0.86. The central vein sign may be a good biomarker for MS, and using this automated technique to assess 3-T MRI images appears to be clinically applicable, said Dr. Solomon.
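The performance figure quoted here, area under the curve (AUC), is the probability that the classifier assigns a higher score to a randomly chosen MS scan than to a randomly chosen non-MS scan. The sketch below computes AUC by that rank definition; the scores and labels are invented and have nothing to do with the Penn model.

```python
# Rank-based AUC: the probability that a random positive case scores
# higher than a random negative case (ties count half). Scores and
# labels below are invented for illustration.

def auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Invented scores: higher = more MS-like
scores = [0.9, 0.8, 0.4, 0.7, 0.3, 0.2]
labels = [True, True, True, False, False, False]
print(auc(scores, labels))  # 8/9, i.e. about 0.89
```

An AUC of about 0.86, as reported for the automated central vein detector, means the model ranks MS above non-MS roughly six times out of seven.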


Thalamic Volume

Thalamic atrophy is common in the early stages of relapsing-remitting MS. The thalamus also is implicated in migraine. Although studies have examined volumetric brain changes in migraine, none has examined thalamic volume specifically, said Dr. Solomon.

He and his colleagues used an automatic segmentation method to analyze thalamic volume in their cohort of 40 patients. Analysis of variance indicated that thalamic volume was significantly smaller in patients with MS than in patients without MS. When the researchers used a thalamic volume less than 0.0077 as a cutoff, the technique's sensitivity and specificity for the diagnosis of MS were both 0.75.
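Evaluating a volume cutoff of this kind against known diagnoses reduces to counting true and false positives. In the sketch below, only the 0.0077 cutoff comes from the study; the example volumes (in the same unspecified normalized units as the cutoff) and labels are invented, constructed so that the result mirrors the reported 0.75 figures.

```python
# Sketch: sensitivity and specificity of a thalamic volume cutoff.
# Only the 0.0077 cutoff is from the study; the volumes and diagnoses
# below are invented for illustration.

def confusion_rates(volumes, has_ms, cutoff=0.0077):
    """Classify MS when thalamic volume < cutoff, then return
    (sensitivity, specificity) against the true labels."""
    tp = fn = tn = fp = 0
    for vol, ms in zip(volumes, has_ms):
        predicted_ms = vol < cutoff
        if ms and predicted_ms:
            tp += 1
        elif ms:
            fn += 1
        elif predicted_ms:
            fp += 1
        else:
            tn += 1
    return tp / (tp + fn), tn / (tn + fp)

# Invented example: 4 MS patients, 4 migraine patients
vols = [0.0070, 0.0072, 0.0075, 0.0080, 0.0078, 0.0081, 0.0083, 0.0074]
ms = [True, True, True, True, False, False, False, False]
sens, spec = confusion_rates(vols, ms)
print(sens, spec)  # 0.75 0.75 (by construction of the example data)
```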

Recent data suggest that thalamic atrophy in MS does not result from thalamic lesions, but from diffuse white matter abnormalities. Like the central vein sign, thalamic atrophy may reflect MS pathophysiology and could be incorporated into MS diagnostic criteria, said Dr. Solomon.

Cortical Lesions

Autopsy and MRI studies have shown that cortical lesions are characteristic of MS, whereas MRI studies have suggested that migraineurs generally do not have them. Although neurologists can see these lesions in vivo on 7-T MRI, 3-T MRI is less sensitive, which makes cortical lesion detection challenging in routine practice.

In 2017, Nakamura and colleagues found that ratio maps of T1- and T2-weighted 3-T MRI, images that are acquired in routine clinical care for MS, could identify areas of cortical demyelination. Dr. Solomon and colleagues tested whether this method could distinguish MS from migraine. They defined a z score of less than 3 as an indication of low myelin density. When they examined the cohort of 40 patients, they were able to correlate areas with z scores below the cutoff with cortical lesions that were visible on conventional imaging. The technique accurately distinguished patients with MS from patients with migraine.
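The thresholding step amounts to converting each voxel's T1/T2 ratio to a z score against a normative distribution and flagging the low tail. In the sketch below, the normative mean and SD and the voxel values are invented, and the lower-tail convention (z < −3) is an assumption about the article's stated "z score of less than 3" cutoff for low myelin density.

```python
# Sketch: flagging low-myelin cortex from a T1/T2 ratio map. The
# normative statistics and voxel ratios are invented; the z < -3
# lower-tail convention is an assumption about the reported cutoff.

def zscore(value, norm_mean, norm_sd):
    return (value - norm_mean) / norm_sd

def low_myelin_mask(ratio_values, norm_mean, norm_sd, cutoff=-3.0):
    """Return a boolean mask of voxels whose T1/T2 ratio z score
    falls below the cutoff (suggesting cortical demyelination)."""
    return [zscore(v, norm_mean, norm_sd) < cutoff for v in ratio_values]

# Invented normative stats and three voxel ratios: only the middle
# voxel (z = -3.5) falls below the cutoff
mask = low_myelin_mask([1.40, 1.05, 1.38], norm_mean=1.40, norm_sd=0.10)
print(mask)  # [False, True, False]
```

Flagged regions can then be compared against lesions visible on conventional imaging, as the investigators did.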

None of these emerging imaging techniques is 100% accurate. In the future, however, combining several of these techniques in conjunction with tests of blood biomarkers such as microRNA could accurately distinguish between MS and other disorders with high specificity and sensitivity, Dr. Solomon concluded.

—Erik Greb

Suggested Reading

Carmosino MJ, Brousseau KM, Arciniegas DB, Corboy JR. Initial evaluations for multiple sclerosis in a university multiple sclerosis center: outcomes and role of magnetic resonance imaging in referral. Arch Neurol. 2005;62(4):585-590.

Engell T. A clinico-pathoanatomical study of multiple sclerosis diagnosis. Acta Neurol Scand. 1988;78(1):39-44.

Liu S, Kullnat J, Bourdette D, et al. Prevalence of brain magnetic resonance imaging meeting Barkhof and McDonald criteria for dissemination in space among headache patients. Mult Scler. 2013;19(8):1101-1105.

Nakamura K, Chen JT, Ontaneda D, et al. T1-/T2-weighted ratio differs in demyelinated cortex in multiple sclerosis. Ann Neurol. 2017;82(4):635-639.

Sati P, Oh J, Constable RT, et al. The central vein sign and its clinical evaluation for the diagnosis of multiple sclerosis: a consensus statement from the North American Imaging in Multiple Sclerosis Cooperative. Nat Rev Neurol. 2016;12(12):714-722.

Solomon AJ, Klein EP, Bourdette D. “Undiagnosing” multiple sclerosis: the challenge of misdiagnosis in MS. Neurology. 2012;78(24):1986-1991.

Solomon AJ, Schindler MK, Howard DB, et al. “Central vessel sign” on 3T FLAIR* MRI for the differentiation of multiple sclerosis from migraine. Ann Clin Transl Neurol. 2015;3(2):82-87.

Solomon AJ, Watts R, Dewey BE, Reich DS. MRI evaluation of thalamic volume differentiates MS from common mimics. Neurol Neuroimmunol Neuroinflamm. 2017;4(5):e387.



Conference News Roundup—European Society of Cardiology


Stroke Prevention Drugs May Reduce Dementia Risk

Patients with atrial fibrillation could reduce the risk of dementia by taking stroke prevention medications, according to recommendations published online ahead of print March 18 in EP Europace and presented at the conference. The international consensus document was also published in Heart Rhythm, the official journal of the Heart Rhythm Society (HRS), and Journal of Arrhythmia, the official journal of the Japanese Heart Rhythm Society (JHRS) and the Asia Pacific Heart Rhythm Society (APHRS).

The expert consensus statement on arrhythmias and cognitive function was developed by the European Heart Rhythm Association (EHRA), a branch of the European Society of Cardiology (ESC); HRS; APHRS; and the Latin American Heart Rhythm Society (LAHRS).

Arrhythmias, as well as some procedures undertaken to treat them, can increase the risk of cognitive decline and dementia. The international consensus document was written for doctors specializing in arrhythmias and aims to raise awareness of the risks of cognitive impairment and dementia and of methods to reduce them.

Atrial fibrillation is associated with a higher risk for cognitive impairment and dementia, even in the absence of apparent stroke, according to the document. This increased risk may arise because atrial fibrillation is linked with a more than twofold risk of silent strokes. The accumulation of silent strokes and the associated brain injuries over time may contribute to cognitive impairment.

Stroke prevention with oral anticoagulant drugs is the main priority in the management of patients with atrial fibrillation. Oral anticoagulation may reduce the risk of dementia, according to the consensus document.

Adopting a healthy lifestyle also may reduce the risk of cognitive decline in patients with atrial fibrillation. This lifestyle includes not smoking and preventing or controlling hypertension, obesity, diabetes, and sleep apnea.

The document also reviews the association between other arrhythmias and cognitive dysfunction, including postcardiac arrest, in patients with cardiac implantable devices such as implantable cardioverter defibrillators and pacemakers, and ablation procedures.

Treatment of atrial fibrillation with catheter ablation can itself lead to silent strokes and cognitive impairment. To reduce this risk, physicians should follow recommendations for performing ablation and for the management of patients before and after the procedure, according to the document.

Physicians may suspect cognitive impairment if a patient’s appearance or behavior changes (eg, if appointments are missed). Family members should be asked for collateral information. If suspicions are confirmed, the consensus document recommends tools to conduct an objective assessment of cognitive function.

The paper highlights gaps in knowledge and areas for further research. These gaps include, for instance, how to identify patients with atrial fibrillation at increased risk of cognitive impairment and dementia, the effect of rhythm control on cognitive function, and the impact of cardiac resynchronization therapy on cognitive function.

EHRA Updates Guide on NOACs

A new version of the European Heart Rhythm Association (EHRA) Practical Guide on the use of non-vitamin K antagonist oral anticoagulants (NOACs) in patients with atrial fibrillation was published online ahead of print March 19 in European Heart Journal and presented at the meeting.

“European Society of Cardiology guidelines state that NOACs should be preferred over vitamin K antagonists, such as warfarin, for stroke prevention in patients with atrial fibrillation, except those with a mechanical heart valve or rheumatic mitral valve stenosis, and their use in clinical practice is increasing,” said Jan Steffel, MD, Head of the Department of Cardiology at University Heart Center Zurich.

The guide gives advice about how to use NOACs in specific clinical situations. While companies provide a Summary of Product Characteristics for a drug, there are legal restrictions on the content, and the information is often not detailed enough for doctors.

The 2018 edition of the guide has several new chapters. One outlines how to use NOACs in particular groups of patients, including those with very low body weight, the very obese, athletes, frail patients for whom there is concern about bleeding, and patients with cognitive impairment who may forget to take their pills.

Another new chapter briefly summarizes the correct dosing of NOACs in conditions other than atrial fibrillation, such as prevention of deep venous thrombosis, treatment of venous thromboembolism, and treatment of ischemic heart disease. The dosing for each condition is different, which underscores the need for clarity.

Updated advice is given on the combined use of antiplatelets and NOACs in patients with coronary artery disease, particularly those with an acute coronary syndrome or patients scheduled for percutaneous coronary intervention with stenting.

The guide also offers scientific evidence about the use of anticoagulants around cardioversion. The authors give detailed advice about what to do in patients on long-term NOAC treatment who need cardioversion versus patients newly diagnosed with atrial fibrillation and started on a NOAC before cardioversion.

Since the previous edition of the guide was published, the first NOAC reversal agent has received market approval. The authors provide advice about using idarucizumab, which reverses the anticoagulant effect of dabigatran, when there is acute bleeding, when urgent surgery is required, or when the patient has a stroke. Guidance is also included on andexanet alfa, another reversal agent expected to receive market approval, with the caveat that the instructions on the label should be followed.

Unlike warfarin, NOACs do not require monitoring of plasma levels followed by dose adjustments. The guide describes rare scenarios in which physicians might want to know the NOAC plasma level. One scenario concerns patients undergoing major surgery in whom it is unclear, for example because of other drugs or renal dysfunction, whether the usual practice of stopping the NOAC 48 hours in advance is sufficient. The plasma level of the NOAC could be measured just before surgery to confirm that the anticoagulant effect has waned.

The chapter on drug–drug interactions has been expanded with information about anticancer and antiepileptic drugs. “While this is mostly based on potential pharmacokinetic interactions and case reports, it is the first of its kind. This is likely to be adapted and become more complete over the years as our experience increases at this new frontier,” said Dr. Steffel.


Apixaban Is Safe During Catheter Ablation

Apixaban and warfarin are equally safe during catheter ablation of atrial fibrillation, according to results of the AXAFA-AFNET 5 trial. The drugs have similar rates of stroke and bleeding, and an improvement in cognitive function was shown for the first time.

Nearly one-third of all strokes are caused by atrial fibrillation. Oral anticoagulation is the cornerstone of stroke prevention in patients with atrial fibrillation. European Society of Cardiology (ESC) guidelines recommend non-vitamin K antagonist oral anticoagulants (NOACs) in preference over vitamin K antagonists (VKAs) such as warfarin, except in patients with a mechanical heart valve or rheumatic mitral valve stenosis. Unlike VKAs, NOACs do not require frequent monitoring and dose adjustment, and NOACs reduce long-term rates of stroke and death, compared with VKAs.

Issue: Neurology Reviews - 26(5), Pages 58-59

Stroke Prevention Drugs May Reduce Dementia Risk

Patients with atrial fibrillation could reduce the risk of dementia by taking stroke prevention medications, according to recommendations published online ahead of print March 18 in EP Europace and presented at the conference. The international consensus document was also published in Heart Rhythm, the official journal of the Heart Rhythm Society (HRS), and Journal of Arrhythmia, the official journal of the Japanese Heart Rhythm Society (JHRS) and the Asia Pacific Heart Rhythm Society (APHRS).

The expert consensus statement on arrhythmias and cognitive function was developed by the European Heart Rhythm Association (EHRA), a branch of the European Society of Cardiology (ESC); HRS; APHRS; and the Latin American Heart Rhythm Society (LAHRS).

Arrhythmias, as well as some procedures undertaken to treat them, can increase the risk of cognitive decline and dementia. The international consensus document was written for doctors specializing in arrhythmias and aims to raise awareness of the risks of cognitive impairment and dementia and of methods to reduce them.

Atrial fibrillation is associated with a higher risk of cognitive impairment and dementia, even in the absence of apparent stroke, according to the document. This increased risk may arise because atrial fibrillation more than doubles the risk of silent stroke. The accumulation of silent strokes and the associated brain injury over time may contribute to cognitive impairment.

Stroke prevention with oral anticoagulant drugs is the main priority in the management of patients with atrial fibrillation. Oral anticoagulation may reduce the risk of dementia, according to the consensus document.

Adopting a healthy lifestyle also may reduce the risk of cognitive decline in patients with atrial fibrillation. This lifestyle includes not smoking and preventing or controlling hypertension, obesity, diabetes, and sleep apnea.

The document also reviews the associations between cognitive dysfunction and other arrhythmia-related settings, including the period after cardiac arrest, patients with cardiac implantable devices such as implantable cardioverter defibrillators and pacemakers, and ablation procedures.

Treatment of atrial fibrillation with catheter ablation can itself lead to silent strokes and cognitive impairment. To reduce this risk, physicians should follow recommendations for performing ablation and for the management of patients before and after the procedure, according to the document.

Physicians may suspect cognitive impairment if a patient’s appearance or behavior changes (eg, if appointments are missed). Family members should be asked for collateral information. If suspicions are confirmed, the consensus document recommends tools to conduct an objective assessment of cognitive function.

The paper highlights gaps in knowledge and areas for further research. These gaps include, for instance, how to identify patients with atrial fibrillation at increased risk of cognitive impairment and dementia, the effect of rhythm control on cognitive function, and the impact of cardiac resynchronization therapy on cognitive function.

EHRA Updates Guide on NOACs

A new version of the European Heart Rhythm Association (EHRA) Practical Guide on the use of non-vitamin K antagonist oral anticoagulants (NOACs) in patients with atrial fibrillation was published online ahead of print March 19 in European Heart Journal and presented at the meeting.

“European Society of Cardiology guidelines state that NOACs should be preferred over vitamin K antagonists, such as warfarin, for stroke prevention in patients with atrial fibrillation, except those with a mechanical heart valve or rheumatic mitral valve stenosis, and their use in clinical practice is increasing,” said Jan Steffel, MD, Head of the Department of Cardiology at University Heart Center Zurich.

The guide gives advice about how to use NOACs in specific clinical situations. While companies provide a Summary of Product Characteristics for a drug, there are legal restrictions on the content, and the information is often not detailed enough for doctors.

The 2018 edition of the guide has several new chapters. One outlines how to use NOACs in particular groups of patients, including patients with very low body weight, very obese patients, athletes, frail patients for whom bleeding is a concern, and patients with cognitive impairment who may forget to take their pills.

Another new chapter briefly summarizes the correct dosing of NOACs in conditions other than atrial fibrillation, such as prevention of deep venous thrombosis, treatment of venous thromboembolism, and treatment of ischemic heart disease. The dosing for each condition is different, which underscores the need for clarity.

Updated advice is given on the combined use of antiplatelets and NOACs in patients with coronary artery disease, particularly those with an acute coronary syndrome or patients scheduled for percutaneous coronary intervention with stenting.

The guide also offers scientific evidence about the use of anticoagulants around cardioversion. The authors give detailed advice about what to do in patients on long-term NOAC treatment who need cardioversion versus patients newly diagnosed with atrial fibrillation and started on a NOAC before cardioversion.

Since the previous edition of the guide was published, the first NOAC reversal agent has received market approval. The authors provide advice about using idarucizumab, which reverses the anticoagulant effect of dabigatran, when there is acute bleeding, when urgent surgery is required, or when the patient has a stroke. Guidance is also included on andexanet alfa, another reversal agent expected to receive market approval, with the caveat that the instructions on the label should be followed.

Unlike warfarin, NOACs do not require monitoring of plasma levels followed by dose adjustments. The guide describes rare scenarios in which physicians might want to know the NOAC plasma level. One scenario concerns patients undergoing major surgery in whom it is unclear, for example because of other drugs or renal dysfunction, whether the usual practice of stopping the NOAC 48 hours in advance is sufficient. The plasma level of the NOAC could be measured just before surgery to confirm that the anticoagulant effect has waned.

The chapter on drug–drug interactions has been expanded with information about anticancer and antiepileptic drugs. “While this is mostly based on potential pharmacokinetic interactions and case reports, it is the first of its kind. This is likely to be adapted and become more complete over the years as our experience increases at this new frontier,” said Dr. Steffel.


Apixaban Is Safe During Catheter Ablation

Apixaban and warfarin are equally safe during catheter ablation of atrial fibrillation, according to results of the AXAFA-AFNET 5 trial. The drugs have similar rates of stroke and bleeding, and an improvement in cognitive function was shown for the first time.

Nearly one-third of all strokes are caused by atrial fibrillation. Oral anticoagulation is the cornerstone of stroke prevention in patients with atrial fibrillation. European Society of Cardiology (ESC) guidelines recommend non-vitamin K antagonist oral anticoagulants (NOACs) in preference over vitamin K antagonists (VKAs) such as warfarin, except in patients with a mechanical heart valve or rheumatic mitral valve stenosis. Unlike VKAs, NOACs do not require frequent monitoring and dose adjustment, and NOACs reduce long-term rates of stroke and death, compared with VKAs.

Catheter ablation is used in patients with atrial fibrillation to restore and maintain the heart’s normal rhythm, but the procedure entails risks of stroke, bleeding, acute brain lesions, and cognitive impairment. ESC guidelines recommend that patients continue taking their prescribed NOAC or VKA during the procedure. The results of this study confirm that the NOAC apixaban is as safe as a VKA in this situation.

The AXAFA-AFNET 5 trial was the first randomized trial to examine whether continuous apixaban was a safe alternative to a VKA during catheter ablation of atrial fibrillation. In all, 633 patients with atrial fibrillation and additional stroke risk factors scheduled to undergo atrial fibrillation ablation in Europe and the United States were randomized to receive either continuous apixaban or the locally used VKA (ie, warfarin, phenprocoumon, acenocoumarol, or fluindione).

The primary outcome was a composite of all-cause death, stroke, and major bleeding up to three months after ablation. It occurred in 22 patients randomized to apixaban and 23 randomized to VKA. “The results show that apixaban is a safe alternative to warfarin during catheter ablation of atrial fibrillation in patients at risk of stroke,” said Professor Paulus Kirchhof, MD, Chair in Cardiovascular Medicine at the University of Birmingham in the United Kingdom.
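For a rough sense of scale (these figures are not stated in the report), the event rates implied by those counts can be computed under the assumption of 1:1 randomization of the 633 patients, which the report does not specify:

```python
# Hedged sketch: approximate per-arm primary-outcome rates in AXAFA-AFNET 5,
# assuming ~1:1 randomization (arm sizes are not given in this report).
apixaban_events, vka_events = 22, 23
total_patients = 633
per_arm = total_patients / 2  # ~316.5 patients per arm under the assumption

apixaban_rate = apixaban_events / per_arm
vka_rate = vka_events / per_arm
print(f"apixaban: {apixaban_rate:.1%}, VKA: {vka_rate:.1%}")
```

Under that assumption, both arms land near a 7% composite event rate, consistent with the trial's conclusion that the two anticoagulation strategies were similarly safe.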

The researchers assessed cognitive function at the beginning and end of the trial and found that it improved equally in both treatment groups. “This is the first randomized trial to show that cognitive function is improving after atrial fibrillation ablation,” said Professor Kirchhof. “It is possible that this is due to continuous anticoagulation, although we did not test this specifically.” An MRI substudy in 335 patients showed a similar rate of silent strokes in the apixaban (27%) and VKA (25%) groups.

Patients in the trial were four years older than participants in previous studies of the NOACs rivaroxaban and dabigatran, said Professor Kirchhof. Local investigators chose the VKA and the catheter ablation technique, which led to the use of various drugs and procedures. "These characteristics of the trial mean that the results apply to older patients and in different clinical settings," said Professor Kirchhof.

European Society of Cardiology Publishes Guidelines on Syncope

European Society of Cardiology guidelines on syncope were presented at the conference and published online ahead of print March 19 in the European Heart Journal.

Syncope is a transient loss of consciousness caused by reduced blood flow to the brain. Approximately 50% of people have at least one syncopal event during their lifetime. The most common type is vasovagal syncope, commonly known as fainting, which can be triggered by fear, the sight of blood, or prolonged standing, for example.

The challenge for doctors is to identify the minority of patients whose syncope is caused by a potentially deadly heart problem. The guidelines recommend a new algorithm for emergency departments to stratify patients and discharge those at low risk. Patients at intermediate or high risk should receive diagnostic tests in the emergency department or an outpatient syncope clinic.

“The new pathway avoids costly hospitalizations while ensuring the patient is properly diagnosed and treated,” said Professor Michele Brignole, MD, a cardiologist at Ospedali del Tigullio in Lavagna, Italy.

Most syncope does not increase the risk of death, but it can cause injury due to falls or be dangerous in certain occupations, such as airline pilots. The guidelines provide recommendations on how to prevent syncope, which include keeping hydrated; avoiding hot, crowded environments; tensing the muscles; and lying down. The document gives advice on driving for patients with syncope, although the risk of accidents is low.

The document emphasizes the value of video recording in the hospital or at home to improve diagnosis. It recommends that friends and relatives use their smartphones to film the attack and recovery. Clinical clues, such as the duration of the loss of consciousness, whether the patient’s eyes are open or closed, and jerky movements, can distinguish between syncope, epilepsy, and other conditions.

Another diagnostic tool is the implantable loop recorder, a small device inserted underneath the skin of the chest that records the heart’s electrical signals. The guidelines recommend extending its use for diagnosis in patients with unexplained falls, suspected epilepsy, or recurrent episodes of unexplained syncope and a low risk of sudden cardiac death.

The guidelines include an addendum with practical instructions for doctors about how to perform and interpret diagnostic tests.

“The Task Force that prepared the guidelines was truly multidisciplinary,” said Professor Brignole. “A minority of cardiologists was joined by experts in emergency medicine, internal medicine and physiology, neurology and autonomic diseases, geriatric medicine, and nursing.”


Drinking Alcohol Makes the Heart Race

The more alcohol one drinks, the higher one’s heart rate gets, according to research. Binge drinking has been linked with atrial fibrillation, a phenomenon called “the holiday heart syndrome.” The connection was initially based on small studies and anecdotal evidence from the late 1970s.

The Munich Beer Related Electrocardiogram Workup (Munich BREW) study was conducted by researchers from the Department of Cardiology at LMU University Hospital Munich and supported by the German Cardiovascular Research Centre and the European Commission. It was the first assessment of the acute effects of alcohol on ECG readings. The study included more than 3,000 people attending the 2015 Munich Oktoberfest. ECG readings were taken, and breath alcohol concentrations were measured. Age, sex, heart disease, heart medications, and smoking status were recorded. Participants were, on average, 35 years old, and 30% were women.

The average breath alcohol concentration was 0.85 g/kg. Sinus tachycardia (a heart rate of more than 100 bpm) occurred in 25.9% of the cohort and was significantly associated with increasing breath alcohol concentration.

The current analysis of the Munich BREW study looked in more detail at the quantitative ECG measurements in 3,012 participants. The researchers investigated the association between breath alcohol concentration and the ECG parameters of excitation (ie, heart rate), conduction (ie, PR interval and QRS duration), and repolarization (ie, QT interval).

Increased heart rate was associated with higher breath alcohol concentration, confirming the initial results of the Munich BREW study. The association was linear, with no apparent threshold. Alcohol consumption had no effect on the other three parameters.
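A "linear, with no threshold" association is the kind of pattern an ordinary least-squares line captures: heart rate rises steadily with breath alcohol from zero onward, rather than jumping only above some cutoff. The sketch below fits such a line to made-up data points (not Munich BREW data) purely to illustrate the model:

```python
# Illustrative only: a simple least-squares fit of heart rate against
# breath alcohol concentration. The data points are synthetic and chosen
# to lie exactly on a line; they are not from the Munich BREW study.
def linear_fit(xs, ys):
    """Return (slope, intercept) of the least-squares line through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

breath_alcohol = [0.0, 0.4, 0.8, 1.2, 1.6]   # g/kg (synthetic)
heart_rate = [72, 78, 84, 90, 96]            # bpm (synthetic)

slope, intercept = linear_fit(breath_alcohol, heart_rate)
print(slope, intercept)  # slope in bpm per g/kg, intercept in bpm at zero alcohol
```

In a no-threshold relationship the same slope applies across the whole range, so even small alcohol exposures shift the expected heart rate; a threshold model would instead fit a flat segment below the cutoff.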

“The more alcohol you drink, the higher your heart rate gets,” said Stefan Brunner, MD, a cardiologist at the University Hospital Munich, one of the lead authors.

The researchers are currently investigating whether the increase in heart rate with alcohol consumption could lead to heart rhythm disorders in the longer term.

Stroke Prevention Drugs May Reduce Dementia Risk

Patients with atrial fibrillation could reduce the risk of dementia by taking stroke prevention medications, according to recommendations published online ahead of print March 18 in EP Europace and presented at the conference. The international consensus document was also published in Heart Rhythm, the official journal of the Heart Rhythm Society (HRS), and Journal of Arrhythmia, the official journal of the Japanese Heart Rhythm Society (JHRS) and the Asia Pacific Heart Rhythm Society (APHRS).

The expert consensus statement on arrhythmias and cognitive function was developed by the European Heart Rhythm Association (EHRA), a branch of the European Society of Cardiology (ESC); HRS; APHRS; and the Latin American Heart Rhythm Society (LAHRS).

Arrhythmias, as well as some procedures undertaken to treat them, can increase the risk of cognitive decline and dementia. The international consensus document was written for doctors specializing in arrhythmias and aims to raise awareness of the risks of cognitive impairment and dementia and of methods to reduce them.

Atrial fibrillation is associated with a higher risk for cognitive impairment and dementia, even in the absence of apparent stroke, according to the document. This increased risk may arise because atrial fibrillation is linked with a more than twofold risk of silent strokes. The accumulation of silent strokes and the associated brain injuries over time may contribute to cognitive impairment.

Stroke prevention with oral anticoagulant drugs is the main priority in the management of patients with atrial fibrillation. Oral anticoagulation may reduce the risk of dementia, according to the consensus document.

Adopting a healthy lifestyle also may reduce the risk of cognitive decline in patients with atrial fibrillation. This lifestyle includes not smoking and preventing or controlling hypertension, obesity, diabetes, and sleep apnea.

The document also reviews the association between other arrhythmias and cognitive dysfunction, including postcardiac arrest, in patients with cardiac implantable devices such as implantable cardioverter defibrillators and pacemakers, and ablation procedures.

Treatment of atrial fibrillation with catheter ablation can itself lead to silent strokes and cognitive impairment. To reduce this risk, physicians should follow recommendations for performing ablation and for the management of patients before and after the procedure, according to the document.

Physicians may suspect cognitive impairment if a patient’s appearance or behavior changes (eg, if appointments are missed). Family members should be asked for collateral information. If suspicions are confirmed, the consensus document recommends tools to conduct an objective assessment of cognitive function.

The paper highlights gaps in knowledge and areas for further research. These gaps include, for instance, how to identify patients with atrial fibrillation at increased risk of cognitive impairment and dementia, the effect of rhythm control on cognitive function, and the impact of cardiac resynchronization therapy on cognitive function.

EHRA Updates Guide on NOACs

A new version of the European Heart Rhythm Association (EHRA) Practical Guide on the use of non-vitamin K antagonist oral anticoagulants (NOACs) in patients with atrial fibrillation was published online ahead of print March 19 in European Heart Journal and presented at the meeting.

“European Society of Cardiology guidelines state that NOACs should be preferred over vitamin K antagonists, such as warfarin, for stroke prevention in patients with atrial fibrillation, except those with a mechanical heart valve or rheumatic mitral valve stenosis, and their use in clinical practice is increasing,” said Jan Steffel, MD, Head of the Department of Cardiology at University Heart Center Zurich.

The guide gives advice about how to use NOACs in specific clinical situations. While companies provide a Summary of Product Characteristics for a drug, there are legal restrictions on the content, and the information is often not detailed enough for doctors.

The 2018 edition of the guide has several new chapters. One outlines how to use NOACs in particular groups of patients, including those with very low body weight, the very obese, athletes, frail patients for whom there is concern about bleeding, and patients with cognitive impairment who may forget to take their pills.

Another new chapter briefly summarizes the correct dosing of NOACs in conditions other than atrial fibrillation, such as prevention of deep venous thrombosis, treatment of venous thromboembolism, and treatment of ischemic heart disease. The dosing for each condition is different, which underscores the need for clarity.

Updated advice is given on the combined use of antiplatelets and NOACs in patients with coronary artery disease, particularly those with an acute coronary syndrome or patients scheduled for percutaneous coronary intervention with stenting.

The guide also offers scientific evidence about the use of anticoagulants around cardioversion. The authors give detailed advice about what to do in patients on long-term NOAC treatment who need cardioversion versus patients newly diagnosed with atrial fibrillation and started on a NOAC before cardioversion.

Since the previous edition of the guide was published, the first NOAC reversal agent has received market approval. The authors provide advice about using idarucizumab, which reverses the anticoagulant effect of dabigatran, when there is acute bleeding, when urgent surgery is required, or when the patient has a stroke. Guidance is also included on andexanet alfa, another reversal agent expected to receive market approval, with the caveat that the instructions on the label should be followed.

Unlike warfarin, NOACs do not require monitoring of plasma levels followed by dose adjustments. The guide describes rare scenarios in which physicians might want to know the NOAC plasma level. One scenario concerns patients undergoing major surgery in whom it is unclear, for example because of other drugs or renal dysfunction, whether the usual practice of stopping the NOAC 48 hours in advance is sufficient. The plasma level of the NOAC could be measured just before surgery to confirm that the anticoagulant effect has waned.

The chapter on drug–drug interactions has been expanded with information about anticancer and antiepileptic drugs. “While this is mostly based on potential pharmacokinetic interactions and case reports, it is the first of its kind. This is likely to be adapted and become more complete over the years as our experience increases at this new frontier,” said Dr. Steffel.

 

 

Apixaban Is Safe During Catheter Ablation

Apixaban and warfarin are equally safe during catheter ablation of atrial fibrillation, according to results of the AXAFA-AFNET 5 trial. The drugs have similar rates of stroke and bleeding, and an improvement in cognitive function was shown for the first time.

Nearly one-third of all strokes are caused by atrial fibrillation. Oral anticoagulation is the cornerstone of stroke prevention in patients with atrial fibrillation. European Society of Cardiology (ESC) guidelines recommend non-vitamin K antagonist oral anticoagulants (NOACs) in preference over vitamin K antagonists (VKAs) such as warfarin, except in patients with a mechanical heart valve or rheumatic mitral valve stenosis. Unlike VKAs, NOACs do not require frequent monitoring and dose adjustment, and NOACs reduce long-term rates of stroke and death, compared with VKAs.

Catheter ablation is used in patients with atrial fibrillation to restore and maintain the heart’s normal rhythm, but the procedure entails risks of stroke, bleeding, acute brain lesions, and cognitive impairment. ESC guidelines recommend that patients continue taking their prescribed NOAC or VKA during the procedure. The results of this study confirm that the NOAC apixaban is as safe as a VKA in this situation.

The AXAFA-AFNET 5 trial was the first randomized trial to examine whether continuous apixaban was a safe alternative to a VKA during catheter ablation of atrial fibrillation. In all, 633 patients with atrial fibrillation and additional stroke risk factors scheduled to undergo atrial fibrillation ablation in Europe and the United States were randomized to receive either continuous apixaban or the locally used VKA (ie, warfarin, phenprocoumon, acenocoumarol, or fluindione).

The primary outcome was a composite of all-cause death, stroke, and major bleeding up to three months after ablation. It occurred in 22 patients randomized to apixaban and 23 randomized to VKA. “The results show that apixaban is a safe alternative to warfarin during catheter ablation of atrial fibrillation in patients at risk of stroke,” said Professor Paulus Kirchhof, MD, Chair in Cardiovascular Medicine at the University of Birmingham in the United Kingdom.

The researchers assessed cognitive function at the beginning and end of the trial and found that it improved equally in both treatment groups. “This is the first randomized trial to show that cognitive function is improving after atrial fibrillation ablation,” said Professor Kirchhof. “It is possible that this is due to continuous anticoagulation, although we did not test this specifically.” An MRI substudy in 335 patients showed a similar rate of silent strokes in the apixaban (27%) and VKA (25%) groups.

Patients in the trial were four years older than participants of previous studies with the NOACs rivaroxaban and dabigatran, said Professor Kirchhof. Local investigators chose the VKA and catheter ablation procedure, which led to the use of various drugs and techniques. “These characteristics of the trial mean that the results apply to older patients and in different clinical settings,” said Professor Kirchhof.

European Society of Cardiology Publishes Guidelines on Syncope

European Society of Cardiology guidelines on syncope were presented at the conference and published online ahead of print March 19 in the European Heart Journal.

Syncope is a transient loss of consciousness caused by reduced blood flow to the brain. Approximately 50% of people have one syncopal event during their lifetime. The most common type is vasovagal syncope, commonly known as fainting, triggered by fear, seeing blood, or prolonged standing, for example.

The challenge for doctors is to identify the minority of patients whose syncope is caused by a potentially deadly heart problem. The guidelines recommend a new algorithm for emergency departments to stratify patients and discharge those at low risk. Patients at intermediate or high risk should receive diagnostic tests in the emergency department or an outpatient syncope clinic.

“The new pathway avoids costly hospitalizations while ensuring the patient is properly diagnosed and treated,” said Professor Michele Brignole, MD, a cardiologist at Ospedali del Tigullio in Lavagna, Italy.

Most syncope does not increase the risk of death, but it can cause injury due to falls or be dangerous in certain occupations, such as airline pilots. The guidelines provide recommendations on how to prevent syncope, which include keeping hydrated; avoiding hot, crowded environments; tensing the muscles; and lying down. The document gives advice on driving for patients with syncope, although the risk of accidents is low.

The document emphasizes the value of video recording in the hospital or at home to improve diagnosis. It recommends that friends and relatives use their smartphones to film the attack and recovery. Clinical clues, such as the duration of the loss of consciousness, whether the patient’s eyes are open or closed, and jerky movements, can distinguish between syncope, epilepsy, and other conditions.

Another diagnostic tool is the implantable loop recorder, a small device inserted underneath the skin of the chest that records the heart’s electrical signals. The guidelines recommend extending its use for diagnosis in patients with unexplained falls, suspected epilepsy, or recurrent episodes of unexplained syncope and a low risk of sudden cardiac death.

The guidelines include an addendum with practical instructions for doctors about how to perform and interpret diagnostic tests.

“The Task Force that prepared the guidelines was truly multidisciplinary,” said Professor Brignole. “A minority of cardiologists was joined by experts in emergency medicine, internal medicine and physiology, neurology and autonomic diseases, geriatric medicine, and nursing.”

Drinking Alcohol Makes the Heart Race

The more alcohol one drinks, the higher one’s heart rate gets, according to research. Binge drinking has been linked with atrial fibrillation, a phenomenon called “the holiday heart syndrome.” The connection was initially based on small studies and anecdotal evidence from the late 1970s.

The Munich Beer Related Electrocardiogram Workup (MunichBREW) study was conducted by researchers from the LMU University Hospital Munich Department of Cardiology and supported by the German Cardiovascular Research Centre and the European Commission. It was the first assessment of the acute effects of alcohol on ECG readings. The study included more than 3,000 people attending the 2015 Munich Oktoberfest. ECG readings were taken, and breath alcohol concentrations were measured. Age, sex, heart disease, heart medications, and smoking status were recorded. Participants were, on average, 35 years old, and 30% were women.

The average breath alcohol concentration was 0.85 g/kg. Increasing breath alcohol concentration was significantly associated with sinus tachycardia (ie, a heart rate of more than 100 bpm), which occurred in 25.9% of the cohort.

The current analysis of the MunichBREW study looked in more detail at the quantitative ECG measurements in 3,012 participants. The researchers investigated the association between blood alcohol concentration and the ECG parameters of excitation (ie, heart rate), conduction (ie, PR interval and QRS complex), and repolarization (ie, QT interval).

Increased heart rate was associated with higher breath alcohol concentration, confirming the initial results of the MunichBREW study. The association was linear, with no threshold. Alcohol consumption had no effect on the other three parameters.

“The more alcohol you drink, the higher your heart rate gets,” said Stefan Brunner, MD, a cardiologist at the University Hospital Munich, one of the lead authors.

The researchers are currently investigating whether the increase in heart rate with alcohol consumption could lead to heart rhythm disorders in the longer term.

Issue
Neurology Reviews - 26(5)
Page Number
58-59

Headache Remains a Significant Public Health Problem

The prevalence of migraine and severe headache was stable between 2005 and 2015.

Severe headache and migraine remain significant public health problems, and their prevalence has been stable for years, according to a review published online ahead of print March 12 in Headache. Results confirm that “migraine disproportionately affects women and several other historically disadvantaged segments of the population,” according to the authors. “These inequities could be exacerbated if new high-cost treatments are inaccessible to those who need them most.”

Rebecca Burch, MD, Instructor in Neurology at Harvard Medical School in Boston, and colleagues reviewed population-based US government surveys to obtain updated estimates of the prevalence of migraine and severe headache in adults. The authors examined the most recent data from the National Health Interview Survey, the National Hospital Ambulatory Medical Care Survey, and the National Ambulatory Medical Care Survey.

Rebecca Burch, MD


The most recent National Health Interview Survey data were from 2015. They indicated that the overall prevalence of migraine or severe headache was 15.3%. The prevalence was 20.7% in women and 9.7% in men. The age group with the highest prevalence of migraine (17.9%) included patients between ages 18 and 44. Prevalence was 15.9% in people between ages 45 and 64.

The prevalence of migraine or severe headache also varied by race. The highest prevalence (20.3%) was among native Hawaiians and other Pacific islanders. Prevalence was 18.4% among American Indians or Alaska natives, 16.2% among blacks or African Americans, 15.4% among whites, and 11.3% among Asians.

Data indicated that prevalence varied with income and insurance status. People living below the poverty line had a prevalence of 21.7%, and those with an annual family income of less than $35,000 had a prevalence of 19.9%. For people younger than 65, prevalence was higher in people insured by Medicaid (26.0%), compared with people with private insurance (15.1%) or no insurance (17.1%).

The most recent data for the National Hospital Ambulatory Medical Care Survey were from 2014. In that year, headache or pain in the head prompted approximately four million emergency department visits. Women of childbearing age made more than half of emergency department visits for headache.

Headache or pain in the head accounted for 3.0% of all emergency department visits and was the fifth leading cause of visits to the emergency department, as reported by patients. Headache was the 12th most common diagnosis made by emergency department physicians (1.8% of all visits). It was the sixth most common diagnosis for women aged 15 to 64 (1.7%), and migraine was the 15th most common for this population (0.8%). Headache was the 19th most common diagnosis among men aged 15 to 64 (0.5%).

No new data about headache or head pain from the National Ambulatory Medical Care Survey were available. Headache has not been among the top 20 reasons for outpatient visits since the 2009–2010 survey.

“It is important to understand the distribution of headache in specific segments of the population,” said Dr. Burch and colleagues. “This can guide efforts to ensure that treatments are accessible to those with the highest level of need.”

—Erik Greb

Suggested Reading

Burch R, Rizzoli P, Loder E. The prevalence and impact of migraine and severe headache in the United States: figures and trends from government health studies. Headache. 2018 Mar 12 [Epub ahead of print].

Issue
Neurology Reviews - 26(5)
Page Number
45


Is Chronification the Natural History of Migraine?

Headache chronification may be reversible in some patients.

OJAI, CA—Diagnosis, while critically important in the management of migraine, “is just half the picture,” said Robert Cowan, MD, Higgins Professor of Neurology and Neurosciences and Director of the Division of Headache and Facial Pain at Stanford University in California. Changes over time, attack frequency, chronification, comorbidity, and disability complicate the management of this disorder. At the 11th Annual Headache Cooperative of the Pacific’s Winter Conference, Dr. Cowan reviewed the clinical evidence suggesting that episodic and chronic migraine are two distinct entities, stressed the importance of classification and staging in the diagnosis and treatment of migraine, and elucidated the signs and symptoms of migraine chronification.

Robert Cowan, MD

Challenging Assumptions

For years, the prevailing perception among clinicians has been that patients with migraine progress from episodic migraine into a state of chronic headache at an annual conversion rate of about 3%. A wealth of data supports this concept. But recent studies have challenged the idea that episodic migraine and chronic migraine are the same entity differentiated only by attack frequency. Research by Schwedt and colleagues, for example, seemed to differentiate between episodic and chronic migraine purely on the basis of anatomy. Additionally, evidence suggests that there are differences between episodic and chronic migraine in regard to connectivity based on functional MRI mapping of pain processing networks. Chronic and episodic migraine seem to affect the brain in different ways, Dr. Cowan said. Structural, functional, and pharmacologic changes in the brain differentiate chronic migraine from episodic migraine.

Classification and Staging

There is evidence that years of headache may make a difference. “Over time, headache after headache can remodel certain areas of the brain, in some areas thickening cortex, and in others, thinning it. This is likely the result of recurrent migraine attacks over time,” Dr. Cowan said. “The patient who has had chronic headache for 30 years is likely to process sensory input differently from the patient who has gone from eight to 14 headaches per month in the last three months.” Currently, studies are under way at Stanford and elsewhere to determine if these changes are reversible with proper treatment. If chronic migraine and episodic migraine are distinct disorders, then clinicians should consider staging the disease. “In the same way the American Cancer Society not only gives patients a diagnosis, but also stages the disease and gives patients an idea as to what they can expect as far as prognosis, I am suggesting that we should be doing the same thing for our migraine patients,” Dr. Cowan said. To properly stage migraine, the following factors must be taken into account: years of headache; type(s) of headache; changes in frequency, severity, duration, disability; comorbidities; and medication use patterns.

The Physician’s Role in Chronification

Physicians themselves can play a role in migraine chronification. Misdiagnosis and underdiagnosis increase the risk of migraine chronification. “When a patient is not doing well, I think it is important to go back and revisit the diagnosis. Make sure that you have the right diagnosis and that the diagnosis has not changed. Are there additional diagnoses?” Other pitfalls that may lead to chronification include failure to recognize treatable comorbidities, inadequate or inappropriate medication use, and signs of chronification such as central sensitization. Weak or recall-biased intervisit data collection also may be a problem.

Risk Factors for Chronification

“It is useful to differentiate between modifiable and nonmodifiable factors,” Dr. Cowan said. Modifiable risk factors include medication overuse, ineffective acute treatment, obesity, depression, stressful life events, and low educational level. Nonmodifiable risk factors include age and female sex. “Another approach is to look at state-specific factors versus process-related factors,” Dr. Cowan said. State-specific factors include obesity, history of abuse, comorbid pain, and head injury. Process-related factors include years of headache, increasing frequency, catastrophizing, and medication overuse.

Chronification Patterns

Clinical clues that chronification may be occurring include increasing headache frequency, severity, or duration; emergence of a second headache type; and a change in the pattern of symptoms unrelated to pain. Additionally, allodynia may be a marker of chronification, and central sensitization plays a large role in the process. “These are things we can assess clinically,” Dr. Cowan said. “We should be thinking about all these things and asking our patients about them as we follow them from visit to visit.” As headache specialists, “our job is not done once we have a diagnosis and go through the Rolodex of treatments for that diagnosis.” NR

—Glenn S. Williams

Suggested Reading

Bigal ME, Lipton RB. Clinical course in migraine: conceptualizing migraine transformation. Neurology. 2008;71(11):848-855.

Mainero C, Boshyan J, Hadjikhani N. Altered functional magnetic resonance imaging resting-state connectivity in periaqueductal gray networks in migraine. Ann Neurol. 2011;70(5):838-845.

May A, Schulte LH. Chronic migraine: risk factors, mechanisms and treatment. Nat Rev Neurol. 2016;12(8):455-464.

Schwedt TJ, Chong CD, Wu T, et al. Accurate classification of chronic migraine via brain magnetic resonance imaging. Headache. 2015;55(6):762-777.

Issue
Neurology Reviews - 26(5)
Page Number
21


Early Planning Helps Ease Transition From Pediatric to Adult Epilepsy Care

Finding a neurologist who feels confident in caring for a youth with complex epilepsy can be a challenge; therefore, it may be helpful to identify an adult provider a year before making the transfer.

WASHINGTON, DC—Transitioning pediatric epilepsy patients to receive care in the adult system can be a daunting task, said Rebecca J. Schultz, PhD, RN, CPNP, at the 71st Annual Meeting of the American Epilepsy Society. A gradual shift in responsibility from the healthcare provider to the parent or caregiver to the adolescent is recommended. “The transition requires careful planning that must be patient-centered, comprehensive, and uninterrupted,” Dr. Schultz said. “The process should begin early, at around age 12.” Dr. Schultz is an Assistant Professor of Pediatric Neurology and Developmental Medicine at Baylor College of Medicine and Principal Nurse Practitioner in the Comprehensive Epilepsy Program and the Blue Bird Circle Rett Center at Texas Children’s Hospital in Houston.

Rebecca J. Schultz, PhD, RN, CPNP

Considerations for an Individualized Plan

Youth with epilepsy persisting into adulthood form a complex and diverse group. A transition plan should be based on the patient’s epilepsy type and severity, Dr. Schultz said. Childhood epilepsy persisting into adulthood can have one of three broad outcomes. The first group of patients has well-controlled seizures and is of normal intellect. The second group is patients who have seizures that remit in childhood, but have social comorbidities that persist into adulthood. “The third group is those with refractory epilepsy who continue to have seizures and comorbidities as adults,” she said. Intellectual disabilities are common in epilepsy, affecting at least 20% of children, she added. In addition, many psychosocial issues may have a greater impact than seizures on social outcome. Each patient requires an individualized transition care plan.

“Patients who have well-controlled epilepsy and are of normal intellect need to learn self-management skills—how to take their medicines by themselves without prompting, how to call in for a refill, and how to recognize when they should seek help for a medical emergency.” In addition, they need to know about the teratogenicity of the antiepileptic drugs (AEDs) that they are taking, as well as the impact of alcohol and substance abuse on their epilepsy and their medications. Other factors that should be addressed include driving eligibility and contraception.

Patients with refractory epilepsy and intellectual disabilities often have complex health needs and require multidisciplinary care, including occupational therapy, physical therapy, and other specialty services. Patients with intellectual disabilities may be able to attain some, but perhaps not all, self-management skills, Dr. Schultz said.

Treatment Concerns

With regard to treatment, some AEDs used in pediatrics are rarely used by adults. “The questions for transition are: which drugs are effective in adults, what is the optimal dosage, and what are the drug–drug interactions?” Dr. Schultz said. “Another treatment challenge is the overall lack of availability of ketogenic diet programs for adults. We need to think about developing more such programs, since research has shown that transfer to an adult ketogenic diet program results in continued adherence to the diet.”

Finding a neurologist who feels confident in caring for a youth with complex epilepsy also can be a challenge; therefore, it may be helpful to identify an adult provider a year before making the transfer, Dr. Schultz said. In a survey of adult and pediatric neurologists, only 11.2% of adult neurologists felt confident in caring for youth with epileptic encephalopathies, and 9.8% felt confident caring for those with epilepsies with genetic syndromes. This finding highlights the need for good communication between pediatric and adult providers. “The more information those of us on the pediatric side can provide to the adult team, [the better] the chance of a satisfactory transition,” she noted.

Steps for Transition

“It is important for healthcare providers to let parents know that the current relationship has an end date,” Dr. Schultz said. “Starting transition at around 12 years and no later than approximately 16 years gives parents time to prepare and gives adolescents time to increase their competence, if they have the capability.”

In 2016, the Child Neurology Foundation developed principles for transition of care in epilepsy for neurologists (www.childneurologyfoundation.org/transitions). Among the foundation’s resources is a checklist that can help youth and their families determine where they stand, what they have accomplished, and what they still need to learn. “It includes such items as ‘I know what my medications are and what they are for’ and ‘I know what I’m allergic to,’ with columns for ‘Yes, I know this,’ ‘I still need to learn that,’ and ‘I still need help,’” said Dr. Schultz. “It is recommended that a readiness assessment be made on an annual basis.” The Ontario Epilepsy Implementation Task Force published steps for the transition process in 2017.

“The expectation is that transition be a shared responsibility among team members and not the sole responsibility of the neurology provider,” Dr. Schultz noted. Other tools include the Transition Readiness Assessment Questionnaire (www.etsu.edu/com/pediatrics/traq/) and the MyHealth Passport (www.sickkids.ca/Good2Go/for-health-care-providers/Transition-Tools-and-Supports/).

It has been suggested that epilepsy diagnosis be reevaluated between ages 16 and 19 by repeating EEG and neuroimaging. Also, clinicians should think about updating genetic testing. “Many of these adolescents have not had these tests performed since infancy or early childhood, and many new tests have been developed over the past few years,” Dr. Schultz said. “Finding a genetic cause for epilepsy can lead to customized treatment and a dramatic improvement in seizure control. It might end the diagnostic odyssey in some of these patients, thereby improving their quality of life.”

—Adriene Marshall

Suggested Reading

Andrade DM, Bassett AS, Bercovici E, et al. Epilepsy: transition from pediatric to adult care. Recommendations of the Ontario Epilepsy Implementation Task Force. Epilepsia. 2017;58(9):1502-1517.

Borlot F, Tellez-Zenteno JF, Allen A, et al. Epilepsy transition: challenges of caring for adults with childhood-onset seizures. Epilepsia. 2014;55(10):1659-1666.

Brown LW, Camfield P, Capers M, et al. The neurologist’s role in supporting transition to adult health care: a consensus statement. Neurology. 2016;87(8):835-840.

Nabbout R, Camfield CS, Andrade DM, et al. Treatment issues for children with epilepsy transitioning to adult care. Epilepsy Behav. 2017;69:153-160.

Issue
Neurology Reviews - 26(5)
Page Number
33

What is an old doctor to do?

Article Type
Changed
Thu, 03/28/2019 - 14:38

I was in Miami recently to give a talk on diabetes when a physician, Pablo Michel, MD, asked me whether we could address an issue that’s important to him and many of his colleagues. His question was, Do we have any suggestions about how to help “older doctors” such as himself deal with electronic health records?

One of the problems with his question was that he didn’t really look “old”; he looked like he was about 50 years of age and in good shape. This physician had come on a Saturday morning to spend 4 hours learning about diabetes, which made it clear that he cared about his patients, his craft, and staying current with the medical literature.

Dr. Chris Notte and Dr. Neil Skolnik
Further discussion revealed that he also was bothered by what he saw happening in many of the consult notes he received, as well as by the undermining of history and physical notes by copy and paste; the inclusion of a lot of meaningless information made it hard to find the information that was relevant. He said that he had become used to doing his old SOAP notes efficiently and now found himself slogging through mud, having to reproduce large parts of the chart in every note he wrote.

I was struck by his questions, as well as his concern for both the quality of care for his patients and the issues he and his colleagues were facing. And it is not just him. Increased computerization of practices has been listed among the top five causes of physician burnout.1

A recent article in Annals of Internal Medicine showed that physicians spent only a quarter of their total time directly talking with patients and 50% of their time on EHR and other administrative tasks.2 It is likely that, among older physicians, the EHR takes proportionally more time and is an even larger cause of burnout. Given the importance of the EHR, it seems time to revisit the dilemma and propose some solutions for this common problem.

One of the core issues for many older physicians is an inability to type. If you don’t type well, then entering a patient’s history or documenting the assessment and plan is unduly burdensome. Ten years ago, we might have suggested learning to type, which was an unrealistic recommendation then and, fortunately, is unnecessary now.

Now, solutions ranging from medical scribes to voice recognition have become commonplace. Voice recognition technology has advanced incredibly over the past 10 years, so much so that it is used now in our everyday life. The most well-known voice technology in everyday life might be Siri, Apple’s voice technology. It is easy now to dictate texts and to look up information. Similar voice technologies are available with the Amazon Echo and Google Assistant.

We now also have the advantage of well-developed medical voice recognition technology that can be used with most EHRs. Some doctors object that the software is expensive: it can cost about $1,500 for the software and another $200-$300 for a good microphone, plus the time needed to train on it. But that expense needs to be weighed against the lost productivity of not using such software. A common complaint we hear from older doctors is that they are spending 1 to 2 hours a night completing charts. If voice recognition software could shave off half that time, decrease stress, and increase satisfaction, then it would pay for itself in 2 weeks.

Another issue is that, because the EHR enables so many things to be done from the EHR platform, many doctors find themselves doing all the work. It is important to work as a team and let each member of the team contribute to making the process more efficient. It turns out that this usually ends up being satisfying for everyone who contributes to patient care. It requires standing back from the process periodically and thinking about areas of inefficiency and how things can be done better.

One clear example is medication reconciliation: A nurse or clinical pharmacist can go over medicines with patients, and while the physician still needs to review the medications, it takes much less time to review medications than it does to enter each medication with the correct dose. Nurses also can help with preventive health initiatives. Performing recommended preventive health activities ranging from hepatitis C screening to colonoscopy can be greatly facilitated by the participation of nursing staff, and their participation will free up doctors so they can have more time to focus on diagnosis and treatment. Teamwork is critical.

Finally, if you don’t know something that is important to your practice – learn it! We are accustomed to going to CME conferences and spending our time learning about diseases like diabetes, asthma, and COPD. Each of these diseases accounts for 5%-10% of the patients we see in our practice, and it is critically important to stay current and learn about them. We use our EHR for 100% of the patients we see; therefore, we should allocate time to learning how to navigate the EHR and work more efficiently with it.

These issues are real, and the processes continue to change, but by standing back and acknowledging the challenges, we can thoughtfully construct an approach to maximize our ability to continue to have productive, gratifying careers while helping our patients.

Dr. Skolnik is a professor of family and community medicine at Jefferson Medical College, Philadelphia, and an associate director of the family medicine residency program at Abington (Pa.) Jefferson Health. Dr. Notte is a family physician and associate chief medical information officer for Abington Jefferson Health. Follow him on Twitter @doctornotte.

References

1. Medscape Physician Lifestyle Report 2015. Accessed April 27, 2018. https://www.medscape.com/slideshow/lifestyle-2015-overview-6006535#1.

2. Sinsky C et al. Ann Intern Med. 2016;165(11):753-60.

Publications
Topics
Sections

 

I was in Miami recently to give a talk on diabetes when a physician, Pablo Michel, MD, asked me whether we could address an issue that’s important to him and many of his colleagues. His question was, Do we have any suggestions about how to help “older doctors” such as himself deal with electronic health records?

One of the problems with his question was that he didn’t really look “old”; he looked like he was about 50 years of age and in good shape. This physician had come on a Saturday morning to spend 4 hours learning about diabetes, which made it clear that he cared about his patients, his craft, and staying current with the medical literature.

Dr. Chris Notte and Dr. Neil Skolnik
Further discussion revealed that he also was bothered about what he saw happening on many consult notes that he received, as well as the undermining of history and physical notes by copy and paste; the inclusion of a lot of meaningless information made it hard to find information that was relevant. He said that he had become used to doing his old SOAP notes in a really efficient manner and found he was now slogging through mud having to reproduce large parts of the chart in every note that he did.

I was struck by his questions, as well as his concern for both the quality of care for his patients and the issues he and his colleagues were facing. And it is not just him. Increased computerization of practices has been listed among the top five causes of physician burnout.1

A recent article in Annals of Internal Medicine showed that physicians spent only a quarter of their total time directly talking with patients and 50% of their time on EHR and other administrative tasks.2 It is likely that, among older physicians, the EHR takes proportionally more time and is an even larger cause of burnout. Given the importance of EHR, it seems time to revisit both the dilemma of, and propose some solutions for, this common problem.

One of the core issues for many older physicians is an inability to type. If you don’t type well, then entering a patient’s history or documenting the assessment and plan is unduly burdensome. Ten years ago, we might have suggested learning to type, which was an unrealistic recommendation then and, fortunately, is unnecessary now.

Now, solutions ranging from medical scribes to voice recognition have become commonplace. Voice recognition technology has advanced incredibly over the past 10 years, so much so that it is used now in our everyday life. The most well-known voice technology in everyday life might be Siri, Apple’s voice technology. It is easy now to dictate texts and to look up information. Similar voice technologies are available with the Amazon Echo and Google Assistant.

 

 


We now also have the advantage of well-developed medical voice recognition technology that can be used with most EHRs. Although some doctors say that the software is expensive, it can cost about $1,500 for the software and another $200-$300 for a good microphone, as well as the time to train on the software. But that expense needs to be weighed against the lost productivity of not using such software. A common complaint we hear from older doctors is that they are spending 1 to 2 hours a night completing charts. If voice recognition software could shave off half that time, decrease stress, and increase satisfaction, then it would pay for itself in 2 weeks.

Another issue is that, because the EHR enables so many things to be done from the EHR platform, many doctors find themselves doing all the work. It is important to work as a team and let each member of the team contribute to making the process more efficient. It turns out that this usually ends up being satisfying for everyone who contributes to patient care. It requires standing back from the process periodically and thinking about areas of inefficiency and how things can be done better.

One clear example is medication reconciliation: A nurse or clinical pharmacist can go over medicines with patients, and while the physician still needs to review the medications, it takes much less time to review medications than it does to enter each medication with the correct dose. Nurses also can help with preventive health initiatives. Performing recommended preventive health activities ranging from hepatitis C screening to colonoscopy can be greatly facilitated by the participation of nursing staff, and their participation will free up doctors so they can have more time to focus on diagnosis and treatment. Teamwork is critical.

Finally, if you don’t know something that is important to your practice – learn it! We are accustomed to going to CME conferences and spending our time learning about diseases like diabetes, asthma, and COPD. Each of these disease accounts for 5%-10% of the patients we see in our practice, and it is critically important to stay current and learn about them. We use our EHR for 100% of the patients we see; therefore, we should allocate time to learning about how to navigate the EHR and work more efficiently with it.

 

 


These issues are real, and the processes continue to change, but by standing back and acknowledging the challenges, we can thoughtfully construct an approach to maximize our ability to continue to have productive, gratifying careers while helping our patients.

Dr. Skolnik is a professor of family and community medicine at Jefferson Medical College, Philadelphia, and an associate director of the family medicine residency program at Abington (Pa.) Jefferson Health. Dr. Notte is a family physician and associate chief medical information officer for Abington Jefferson Health. Follow him on twitter @doctornotte.

References

1. Medscape Physician Lifestyle Report 2015. Accessed April 27, 2018. https://www.medscape.com/slideshow/lifestyle-2015-overview-6006535#1.

2. Sinsky C et al. Ann Intern Med. 2016;165(11):753-60.

 

I was in Miami recently to give a talk on diabetes when a physician, Pablo Michel, MD, asked me whether we could address an issue that’s important to him and many of his colleagues. His question was, Do we have any suggestions about how to help “older doctors” such as himself deal with electronic health records?

One of the problems with his question was that he didn’t really look “old”; he looked like he was about 50 years of age and in good shape. This physician had come on a Saturday morning to spend 4 hours learning about diabetes, which made it clear that he cared about his patients, his craft, and staying current with the medical literature.

Dr. Chris Notte and Dr. Neil Skolnik
Further discussion revealed that he also was bothered about what he saw happening on many consult notes that he received, as well as the undermining of history and physical notes by copy and paste; the inclusion of a lot of meaningless information made it hard to find information that was relevant. He said that he had become used to doing his old SOAP notes in a really efficient manner and found he was now slogging through mud having to reproduce large parts of the chart in every note that he did.

I was struck by his questions, as well as his concern for both the quality of care for his patients and the issues he and his colleagues were facing. And it is not just him. Increased computerization of practices has been listed among the top five causes of physician burnout.1

A recent article in Annals of Internal Medicine showed that physicians spent only about a quarter of their total time directly talking with patients and half of their time on the EHR and other administrative tasks.2 It is likely that, among older physicians, the EHR takes proportionally more time and is an even larger contributor to burnout. Given the importance of the EHR, it seems time to revisit the dilemma and propose some solutions for this common problem.

One of the core issues for many older physicians is an inability to type. If you don’t type well, then entering a patient’s history or documenting the assessment and plan is unduly burdensome. Ten years ago, we might have suggested learning to type, which was an unrealistic recommendation then and, fortunately, is unnecessary now.

Now, solutions ranging from medical scribes to voice recognition have become commonplace. Voice recognition technology has advanced remarkably over the past 10 years, so much so that it is now part of everyday life. The best-known example may be Siri, Apple’s voice assistant, which makes it easy to dictate texts and look up information. Similar voice technologies are available with the Amazon Echo and Google Assistant.
We now also have well-developed medical voice recognition software that works with most EHRs. Some doctors object that it is expensive: roughly $1,500 for the software, another $200-$300 for a good microphone, and the time needed to train on it. But that expense needs to be weighed against the lost productivity of not using it. A common complaint we hear from older doctors is that they spend 1-2 hours a night completing charts. If voice recognition software shaved off even half that time, decreased stress, and increased satisfaction, it would pay for itself in 2 weeks.

Another issue is that, because the EHR enables so many tasks to be done from a single platform, many doctors find themselves doing all the work. It is important to work as a team and let each member contribute to making the process more efficient; this usually ends up being satisfying for everyone who contributes to patient care. It requires standing back periodically to think about areas of inefficiency and how things can be done better.

One clear example is medication reconciliation: A nurse or clinical pharmacist can go over medicines with patients, and while the physician still needs to review the medications, it takes much less time to review medications than it does to enter each medication with the correct dose. Nurses also can help with preventive health initiatives. Performing recommended preventive health activities ranging from hepatitis C screening to colonoscopy can be greatly facilitated by the participation of nursing staff, and their participation will free up doctors so they can have more time to focus on diagnosis and treatment. Teamwork is critical.

Finally, if you don’t know something that is important to your practice, learn it! We are accustomed to going to CME conferences and spending our time learning about diseases such as diabetes, asthma, and COPD. Each of these diseases accounts for 5%-10% of the patients we see in our practice, and it is critically important to stay current on them. We use our EHR for 100% of the patients we see; therefore, we should allocate time to learning how to navigate the EHR and work with it more efficiently.
These issues are real, and the processes continue to change, but by standing back and acknowledging the challenges, we can thoughtfully construct an approach to maximize our ability to continue to have productive, gratifying careers while helping our patients.

Dr. Skolnik is a professor of family and community medicine at Jefferson Medical College, Philadelphia, and an associate director of the family medicine residency program at Abington (Pa.) Jefferson Health. Dr. Notte is a family physician and associate chief medical information officer for Abington Jefferson Health. Follow him on Twitter @doctornotte.

References

1. Medscape Physician Lifestyle Report 2015. Accessed April 27, 2018. https://www.medscape.com/slideshow/lifestyle-2015-overview-6006535#1.

2. Sinsky C et al. Ann Intern Med. 2016;165(11):753-60.


Allergy, eczema common after pediatric solid organ transplantation

Track atopy, allergy in pediatric transplantation

A total of 34% of children who underwent solid organ transplantation subsequently developed eczema, food allergy, rhinitis, eosinophilic gastrointestinal disease, or asthma, according to the results of a single-center retrospective cohort study.

Another 6.6% of patients developed autoimmunity, usually autoimmune cytopenia, inflammatory bowel disease, or vasculitis, wrote Nufar Marcus, MD, of the University of Toronto, and her associates.

Posttransplant allergy, autoimmunity, and immune-mediated disorders (PTAA) likely share a common pathogenesis “and may represent a unique state of post-transplant immune-dysregulation,” they wrote. The report was published in the Journal of Pediatrics.

The study included 273 children who underwent solid organ transplantation and were followed for a median 3.6 years (range, 1.7-6.3 years). None had immune-mediated conditions or allergies diagnosed at baseline. Posttransplantation allergies most commonly included eczema (51%), asthma (32%), food allergy (25%, including 5% with associated anaphylaxis), rhinitis (17%), and eosinophilic esophagitis, gastritis, or enteritis (13%).

Median age at transplantation was 2.9 years (range, 0.7-10.3 years), and 59% of patients were male. Procedures most often involved liver (111) or heart (103) transplantation, while 52 patients underwent kidney transplantation and 7 underwent multivisceral transplantation. Heart transplantation patients were significantly more likely to develop asthma and autoimmunity, while liver transplantation patients had a significantly greater incidence of food allergies and eosinophilic gastrointestinal disease. “Recipients of multivisceral transplantation [also] had a high prevalence of autoimmunity [43%],” the researchers wrote.

Although only 31% of patients had information available on family history of allergy, those with a positive family history had fivefold greater odds of PTAA, compared with other patients. Other risk factors for PTAA included female sex, young age at transplantation, eosinophilia, and a positive test for Epstein-Barr virus after transplantation, Dr. Marcus and associates said.

“The association of blood eosinophilia and PTAA reached statistical significance only when the transplant recipient was at least 6 months of age, demonstrating the nonspecific nature of abnormally high eosinophil counts during the first months of life,” they noted. The longer patients had eosinophilia after transplantation, the more likely they were to develop PTAA, “suggest[ing] a potential detrimental effect of prolonged activation of the eosinophilic-associated immune arms.”
Factors that appeared unlinked with PTAA included acute organ rejection, duration of posttransplantation steroidal treatment, organ type (living versus cadaveric), donor/recipient blood type and compatibility, infections besides Epstein-Barr virus, and posttransplant lymphoproliferative disease. “The specific type of post-transplantation immunosuppression regimen was neither associated nor protective of PTAA,” the investigators wrote. “However, a significant limitation was our inability to assess the effect of tacrolimus, as nearly all the cohort (97.8%) was treated with this medication.”

Ashley’s Angels fund provided support. The researchers reported having no conflicts of interest.

SOURCE: Marcus N et al. J Pediatr. 2018;196:154-60.


The study is one of several to highlight the occurrence of atopy and allergy following solid organ transplantation in children, Helen M. Evans, MBChB, wrote in an editorial accompanying the report by Marcus et al.

This report differed in that it examined differences in rates of atopy and allergy among the transplanted organ groups. These conditions occurred in 41% and 40% of liver and heart recipients, respectively, but in only 4% of kidney recipients. Atopy or allergy developed in 57% of multivisceral transplant patients, but the number of patients was very small (n = 7). The majority of the conditions developed within 1 year of transplantation.

The recent spike in these reports could signify better recognition of the problem or “the widespread switch of primary immunosuppression from cyclosporine to tacrolimus over the last few decades,” wrote Dr. Evans.

Most of these reports have been single-center retrospective studies, which are subject to inconsistent case definitions and recall bias, she noted. “The time is right for well-conducted multicenter prospective studies to better inform the true extent of these conditions after solid organ transplantation.”

In the meantime, transplantation centers should routinely track de novo eczema, allergy, and eosinophilic gastrointestinal disease in children being assessed for solid organ transplantation, and should take “rigorous” personal and family histories, said Dr. Evans. Ultimately, this work will help “minimize the risk of children developing these conditions” and “effectively treat them in the setting of immunosuppression after transplantation.”
 

Dr. Evans is a pediatric gastroenterologist at Starship Child Health in Auckland, New Zealand. She reported having no conflicts of interest. These comments summarize her editorial (J Pediatr. 2018;196:10-11).


Vitals

 

Key clinical point: Children undergoing solid organ transplantation often developed allergy or autoimmunity.

Major finding: A total of 34% of children developed posttransplantation allergy or autoimmunity such as eczema, asthma, food allergy, and eosinophilic gastrointestinal disease.

Study details: A single-center retrospective cohort study of 273 patients aged 18 years and younger who underwent solid organ transplantation and were followed for a median of 3.6 years.

Disclosures: Ashley’s Angels fund provided support. The researchers reported having no conflicts of interest.

Source: Marcus N et al. J Pediatr. 2018;196:154-60.
