Neurology Reviews covers innovative and emerging news in neurology and neuroscience every month, with a focus on practical approaches to treating Parkinson's disease, epilepsy, headache, stroke, multiple sclerosis, Alzheimer's disease, and other neurologic disorders.

Is migraine really a female disorder?

Updated: December 19, 2023

BARCELONA, SPAIN — Migraine is widely considered a predominantly female disorder. Its frequency, duration, and severity tend to be higher in women, and women are also more likely than men to receive a migraine diagnosis. However, gender expectations, differences in the likelihood of self-reporting, and problems with how migraine is classified make it difficult to estimate its true prevalence in men and women. 

Epidemiologists and migraine specialists discussed these apparent sex differences and the difficulties in obtaining accurate estimates of migraine prevalence in a debate session at the 17th European Headache Congress in Barcelona. 

Different Symptoms

Headache disorders are estimated to affect 50% of the general population; tension-type headache and migraine are the two most common. According to epidemiologic studies, migraine is more prevalent in women, with a female-to-male ratio of 3:1. Numerous studies have examined why this might be, most focusing largely on female-related factors, such as hormones and the menstrual cycle.

“Despite many years of research, there isn’t one clear factor explaining this substantial difference between women and men,” said Tobias Kurth of Charité – Universitätsmedizin Berlin, Germany. “So the question is: Are we missing something else?”

One factor in these perceived sex differences in migraine is that women seem to report their migraines differently from men, and they also have different symptoms. For example, women are more likely than men to report severe pain, and their migraine attacks are more often accompanied by photophobia, phonophobia, and nausea, whereas men’s migraines are more often accompanied by aura. 

“By favoring female symptoms, the classification system may not be picking up male symptoms because they’re not being classified in the right way,” Dr. Kurth said, with one consequence being that migraine is underdiagnosed in men. “Before trying to understand the biological and behavioral reasons for these sex differences, we first need to consider these methodological challenges that we all apply knowingly or unknowingly.” 

Christian Lampl, professor of neurology at Konventhospital der Barmherzigen Brüder Linz, Austria, and president of the European Headache Federation, said in an interview, “I’m convinced that this 3:1 ratio which has been stated for decades is wrong, but we still don’t have the data. The criteria we have [for classifying migraine] are useful for clinical trials, but they are useless for determining the male-to-female ratio. 

“We need a new definition of migraine,” he added. “Migraine is an episode, not an attack. Attacks have a sudden onset, and migraine onset is not sudden — it is an episode with a headache attack.” 

Inadequate Menopause Services

Professor Anne MacGregor of St. Bartholomew’s Hospital in London, United Kingdom, specializes in migraine and women’s health. She presented data showing that migraine is underdiagnosed in women; one reason is that the disorder receives inadequate attention from healthcare professionals at specialist menopause services. 

Menopause is associated with an increased prevalence of migraine, yet women often do not report headache symptoms at specialist menopause services, Dr. MacGregor said. 

She then described unpublished results from a survey of 117 women attending the specialist menopause service at St. Bartholomew’s Hospital. Among the respondents, 34% reported experiencing episodic migraine and an additional 8% reported having chronic migraine. 

“Within this population of women who were not reporting headache as a symptom [to the menopause service until asked in the survey], 42% of them were positive for a diagnosis of migraine,” said Dr. MacGregor. “They were mostly relying on prescribed paracetamol and codeine, or buying it over the counter, and only 22% of them were receiving triptans. 

“They are clearly being undertreated,” she added. “Part of this issue is that they didn’t spontaneously report headache as a menopause symptom, so they weren’t consulting for headache to their primary care physicians.” 

Correct diagnosis by a consultant is a prerequisite for receiving appropriate migraine treatment. Yet, according to a US study published in 2012, only 45.5% of women with episodic migraine consulted a prescribing healthcare professional. Of those who consulted, 89% were diagnosed correctly, and only 68% of those received the appropriate treatment.

A larger, more recent study confirmed that there is a massive unmet need for improving care in this patient population. The Chronic Migraine Epidemiology and Outcomes (CaMEO) Study, which analyzed data from nearly 90,000 participants, showed that just 4.8% of people with chronic migraine received consultation, correct diagnosis, and treatment, with 89% of women with chronic migraine left undiagnosed. 

The OVERCOME Study further revealed that although many people with migraine were repeat consulters, they were consulting their physicians for other health problems. 

“This makes it very clear that people in other specialties need to be more aware about picking up and diagnosing headache,” said MacGregor. “That’s where the real need is in managing headache. We have the treatments, but if the patients can’t access them, they’re not much good to them.”

A version of this article appeared on Medscape.com.

FROM EHC 2023

Specific personality traits may influence dementia risk

Updated: December 12, 2023

TOPLINE:

People who are extroverted and conscientious and have a positive outlook may be at lower dementia risk, whereas those who score highly for neuroticism and have a negative outlook may be at increased risk, new research suggests. 

METHODOLOGY: 

  • Researchers examined the link between the “big five” personality traits (conscientiousness, extraversion, openness to experience, neuroticism, and agreeableness) and subjective well-being (positive and negative affect and life satisfaction) and clinical symptoms of dementia (cognitive test performance) and neuropathology at autopsy. 
  • Data for the meta-analysis came from eight longitudinal studies with 44,531 adults (aged 49-81 years at baseline; 26%-61% women) followed for up to 21 years, during which 1703 incident cases of dementia occurred. 
  • Bayesian multilevel models tested whether personality traits and subjective well-being differentially predicted neuropsychological and neuropathologic characteristics of dementia. 

TAKEAWAY:

  • High neuroticism, negative affect, and low conscientiousness were risk factors for dementia, whereas conscientiousness, extraversion, and positive affect were protective.
  • Across all analyses, there was directional consistency in estimates across samples, which is noteworthy given between-study differences in sociodemographic and design characteristics. 
  • No consistent associations were found between psychological factors and neuropathology. 
  • However, individuals higher in conscientiousness who did not receive a clinical diagnosis tended to have a lower Braak stage at autopsy, suggesting the possibility that conscientiousness is related to cognitive resilience. 

IN PRACTICE:

“These results replicate and extend evidence that personality traits may assist in early identification and dementia-care planning strategies, as well as risk stratification for dementia diagnosis. Moreover, our findings provide further support for recommendations to incorporate psychological trait measures into clinical screening or diagnosis criteria,” the authors write.

SOURCE:

The study, with first author Emorie Beck, PhD, Department of Psychology, University of California, Davis, was published online on November 29, 2023, in Alzheimer’s & Dementia.

LIMITATIONS:

Access to autopsy data was limited. The findings may not generalize across racial groups. The analysis did not examine dynamic associations between changing personality and cognition and neuropathology over time.

DISCLOSURES:

The study was supported by grants from the National Institute on Aging. The authors have declared no conflicts of interest.

A version of this article first appeared on Medscape.com.


Younger heart disease onset tied to higher dementia risk

Updated: December 11, 2023

TOPLINE:

Adults diagnosed with coronary heart disease (CHD) are at an increased risk for dementia, including all-cause dementia, Alzheimer’s disease (AD), and vascular dementia (VD), with the risk highest — at 36% — if onset is before age 45, results of a large observational study show.

METHODOLOGY:

  • The study included 432,667 of the more than 500,000 participants in the UK Biobank, with a mean age of 56.9 years, 50,685 (11.7%) of whom had CHD and 50,445 had data on age at CHD onset.
  • Researchers divided participants into three groups according to age at CHD onset (below 45 years, 45-59 years, and 60 years and older), and carried out a propensity score matching analysis.
  • Outcomes included all-cause dementia, AD, and VD.
  • Covariates included age, sex, race, educational level, body mass index, low-density lipoprotein cholesterol, smoking status, alcohol intake, exercise, depressed mood, hypertension, diabetes, statin use, and apolipoprotein E4 status.

TAKEAWAY:

  • During a median follow-up of 12.8 years, researchers identified 5876 cases of all-cause dementia, 2540 cases of AD, and 1220 cases of VD.
  • Fully adjusted models showed participants with CHD had significantly higher risks than those without CHD of developing all-cause dementia (hazard ratio [HR], 1.36; 95% CI, 1.28-1.45; P < .001), AD (HR, 1.13; 95% CI, 1.02-1.24; P = .019), and VD (HR, 1.78; 95% CI, 1.56-2.02; P < .001). The higher risk for VD suggests CHD has a more profound influence on neuropathologic changes involved in this dementia type, said the authors.
  • Those with CHD diagnosed at a younger age had higher risks of developing dementia (HR per 10-year decrease in age, 1.25; 95% CI, 1.20-1.30 for all-cause dementia, 1.29; 95% CI, 1.20-1.38 for AD, and 1.22; 95% CI, 1.13-1.31 for VD; P for all < .001).
  • Propensity score matching analysis showed patients with CHD had significantly higher risks for dementia compared with matched controls, with the highest risk seen in patients diagnosed before age 45 (HR, 2.40; 95% CI, 1.79-3.20; P < .001), followed by those diagnosed between 45 and 59 years (HR, 1.46; 95% CI, 1.32-1.62; P < .001) and at or above 60 years (HR, 1.11; 95% CI, 1.03-1.19; P = .005), with similar results for AD and VD.

IN PRACTICE:

The findings suggest “additional attention should be paid to the cognitive status of patients with CHD, especially the ones diagnosed with CHD at a young age,” the authors conclude, noting that “timely intervention, such as cognitive training, could be implemented once signs of cognitive deteriorations are detected.”

SOURCE:

The study was conducted by Jie Liang, BS, School of Nursing, Chinese Academy of Medical Sciences & Peking Union Medical College, Beijing, and colleagues. It was published online on November 29, 2023, in the Journal of the American Heart Association.

LIMITATIONS:

As an observational study, it cannot establish a causal relationship. Although the authors adjusted for many potential confounders, unknown risk factors that also contribute to CHD can’t be ruled out. As the study excluded 69,744 participants, selection bias is possible. The study included a mostly White population.

DISCLOSURES:

The study was supported by the National Natural Science Foundation of China, the Non-Profit Central Research Institute Fund of the Chinese Academy of Medical Sciences, and the China Medical Board. The authors have no relevant conflicts of interest.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

 

TOPLINE:

Adults diagnosed with coronary heart disease (CHD) are at an increased risk for dementia, including all-cause dementia, Alzheimer›s disease (AD), and vascular dementia (VD), with the risk highest — at 36% — if onset is before age 45, results of a large observational study show.

METHODOLOGY:

  • The study included 432,667 of the more than 500,000 participants in the UK Biobank, with a mean age of 56.9 years, 50,685 (11.7%) of whom had CHD and 50,445 had data on age at CHD onset.
  • Researchers divided participants into three groups according to age at CHD onset (below 45 years, 45-59 years, and 60 years and older), and carried out a propensity score matching analysis.
  • Outcomes included all-cause dementia, AD, and VD.
  • Covariates included age, sex, race, educational level, body mass index, low-density lipoprotein cholesterol, smoking status, alcohol intake, exercise, depressed mood, hypertension, diabetes, statin use, and apolipoprotein E4 status.

TAKEAWAY:

  • During a median follow-up of 12.8 years, researchers identified 5876 cases of all-cause dementia, 2540 cases of AD, and 1220 cases of VD.
  • Fully adjusted models showed participants with CHD had significantly higher risks than those without CHD of developing all-cause dementia (hazard ratio [HR], 1.36; 95% CI, 1.28-1.45; P < .001), AD (HR, 1.13; 95% CI, 1.02-1.24; P = .019), and VD (HR, 1.78; 95% CI, 1.56-2.02; P < .001). The higher risk for VD suggests CHD has a more profound influence on neuropathologic changes involved in this dementia type, said the authors.
  • Those with CHD diagnosed at a younger age had higher risks of developing dementia (HR per 10-year decrease in age, 1.25; 95% CI, 1.20-1.30 for all-cause dementia, 1.29; 95% CI, 1.20-1.38 for AD, and 1.22; 95% CI, 1.13-1.31 for VD; P for all < .001).
  • Propensity score matching analysis showed patients with CHD had significantly higher risks for dementia compared with matched controls, with the highest risk seen in patients diagnosed before age 45 (HR, 2.40; 95% CI, 1.79-3.20; P < .001), followed by those diagnosed between 45 and 59 years (HR, 1.46; 95% CI, 1.32-1.62; P < .001) and at or above 60 years (HR, 1.11; 95% CI, 1.03-1.19; P = .005), with similar results for AD and VD.
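
As a reader's aid, a hazard ratio maps to a percentage change in risk via (HR - 1) x 100, which is how an HR of 1.36 reads as "36% higher risk." A minimal sketch using the hazard ratios reported above (the helper name is ours, not the study's):

```python
def hr_to_percent_increase(hr: float) -> float:
    """Express a hazard ratio as the percent increase in risk vs. the comparison group."""
    return (hr - 1) * 100

# Fully adjusted hazard ratios reported above (CHD vs. no CHD)
reported = {"all-cause dementia": 1.36, "AD": 1.13, "VD": 1.78}
for outcome, hr in reported.items():
    print(f"{outcome}: {hr_to_percent_increase(hr):.0f}% higher risk")
```

By the same arithmetic, the HR of 2.40 for CHD diagnosed before age 45 corresponds to a 140% higher risk than in matched controls.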

IN PRACTICE:

The findings suggest “additional attention should be paid to the cognitive status of patients with CHD, especially the ones diagnosed with CHD at a young age,” the authors conclude, noting that “timely intervention, such as cognitive training, could be implemented once signs of cognitive deteriorations are detected.”

SOURCE:

The study was conducted by Jie Liang, BS, School of Nursing, Chinese Academy of Medical Sciences & Peking Union Medical College, Beijing, and colleagues. It was published online on November 29, 2023, in the Journal of the American Heart Association.

LIMITATIONS:

As this is an observational study, it cannot establish a causal relationship. Although the authors adjusted for many potential confounders, unknown risk factors that also contribute to CHD cannot be ruled out. Because the study excluded 69,744 participants, selection bias is possible. The study population was mostly White.

DISCLOSURES:

The study was supported by the National Natural Science Foundation of China, the Non-Profit Central Research Institute Fund of the Chinese Academy of Medical Sciences, and the China Medical Board. The authors have no relevant conflicts of interest.

A version of this article first appeared on Medscape.com.

 


Which migraine medications are most effective?

Article Type
Changed
Tue, 12/12/2023 - 08:07

 

For relief of acute migraine, triptans, ergots, and antiemetics are two to five times more effective than ibuprofen, and acetaminophen is the least effective medication, new results from a large, real-world analysis of self-reported patient data show. 

METHODOLOGY: 

  • Researchers analyzed nearly 11 million migraine attack records extracted from Migraine Buddy, an e-diary smartphone app, over a 6-year period. 
  • They evaluated self-reported treatment effectiveness for 25 acute migraine medications among seven classes: acetaminophen, NSAIDs, triptans, combination analgesics, ergots, antiemetics, and opioids. 
  • A two-level nested multivariate logistic regression model adjusted for within-subject dependency and for concomitant medications taken within each analyzed migraine attack. 
  • The final analysis included nearly 5 million medication-outcome pairs from 3.1 million migraine attacks in 278,000 medication users. 

TAKEAWAY:

  • Using ibuprofen as the reference, triptans, ergots, and antiemetics were the three most effective medication classes (mean odds ratios [ORs], 4.80, 3.02, and 2.67, respectively). 
  • The next most effective medication classes were opioids (OR, 2.49), NSAIDs other than ibuprofen (OR, 1.94), combination analgesics acetaminophen/acetylsalicylic acid/caffeine (OR, 1.69), and others (OR, 1.49).
  • Acetaminophen (OR, 0.83) was the least effective.
  • The most effective individual medications were eletriptan (Relpax) (OR, 6.1); zolmitriptan (Zomig) (OR, 5.7); and sumatriptan (Imitrex) (OR, 5.2).
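
For context on what these odds ratios measure, an unadjusted OR compares the odds of an attack being rated helpful on one medication versus the reference. This is illustrative only: the study derived its ORs from a nested logistic regression, and the counts below are invented, not study data.

```python
def odds_ratio(helped: int, not_helped: int, ref_helped: int, ref_not_helped: int) -> float:
    """Unadjusted odds ratio of 'medication helped' vs. a reference medication."""
    return (helped / not_helped) / (ref_helped / ref_not_helped)

# Invented counts for illustration (not data from the study):
# a triptan rated helpful in 800 of 1000 attacks vs. ibuprofen in 500 of 1000
print(odds_ratio(800, 200, 500, 500))  # 4.0
```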

IN PRACTICE:

“Our findings that triptans, ergots, and antiemetics are the most effective classes of medications align with the guideline recommendations and offer generalizable insights to complement clinical practice,” the authors wrote. 

SOURCE:

The study, with first author Chia-Chun Chiang, MD, Department of Neurology, Mayo Clinic, Rochester, Minnesota, was published online November 29 in Neurology.

LIMITATIONS:

The findings are based on subjective user-reported ratings of effectiveness, and information on side effects, dosages, and formulations was not available. The newer migraine medication classes, gepants and ditans, were not included because of the relatively low number of treated attacks. The regression model did not include age, gender, pain intensity, and other migraine-associated symptoms, which could potentially affect treatment effectiveness. 

DISCLOSURES: 

Funding for the study was provided by the Kanagawa University of Human Service research fund. A full list of author disclosures can be found with the original article.

A version of this article first appeared on Medscape.com.


Adverse events in childhood alter brain function

Article Type
Changed
Fri, 12/08/2023 - 13:36

Early childhood trauma alters brain function in adults, according to new research.

In a meta-analysis of 83 functional magnetic resonance imaging (fMRI) studies that included more than 5000 patients, exposure to adversity was associated with higher amygdala reactivity and lower prefrontal cortical reactivity across a range of task domains. 

The altered responses were only observed in studies including adult participants and were clearest in participants who had been exposed to severe threat and trauma. Children and adolescents did not show significant adversity-related differences in brain function.

“By integrating the results from 83 previous brain imaging studies, we were able to provide what is arguably the clearest evidence to date that adults who have been exposed to early life trauma have different brain responses to psychological challenges,” senior author Marco Leyton, PhD, professor of psychiatry and director of the Temperament Adversity Biology Lab at McGill University in Montreal, Quebec, Canada, said in a press release. “This includes exaggerated responses in a region that processes emotionally intense information (the amygdala) and reduced responses in a region that helps people regulate emotions and associated behaviors (the frontal cortex).”

The findings were published in JAMA Network Open.
 

Changes in Reactivity 

“One big issue we have in psychology, and especially in neuroscience, is that single-study results are often not reproducible,” lead author Niki Hosseini-Kamkar, PhD, neuroimaging research associate at Atlas Institute for Veterans and Families at Royal Ottawa Hospital, said in an interview.

“It was very important to me to use a meta-analysis to get an overall picture of what brain regions are consistently reported across all these different studies. That is what we did here,” she added. Dr. Hosseini-Kamkar conducted this analysis while she was a postdoctoral research fellow at McGill University in Montreal.

She and her group examined adversity exposure and brain function in the following four domains of task-based fMRI: emotion processing, memory processing, inhibitory control, and reward processing. Their study included 5242 participants. The researchers used multilevel kernel density analyses (MKDA) to analyze the data more accurately. 

Adversity exposure was associated with higher amygdala reactivity (P < .001) and lower prefrontal cortical reactivity (P < .001), compared with controls with no adversity exposure.

Threat types of adversity were associated with greater blood-oxygen-level-dependent (BOLD) responses in the superior temporal gyrus and lower prefrontal cortex activity in participants exposed to threat, compared with controls. 

Analysis of studies of inhibitory control tasks found greater activity in the claustrum, anterior cingulate cortex, and insula in the adversity-exposed participants, compared with controls.

In addition, studies that administered emotion processing tasks showed greater amygdala reactivity and lower prefrontal cortex (superior frontal gyrus) reactivity in the adversity exposure group, compared with controls.

“The main takeaway is that there’s an exaggerated activity in the amygdala, and diminished prefrontal cortex activity, and together, this might point to a mechanism for how a history of adversity diminishes the ability to cope with later stressors and can therefore heighten susceptibility to mental illness,” said Dr. Hosseini-Kamkar.
 

‘Important Next Step’ 

“Overall, the meta-analysis by Dr. Hosseini-Kamkar and colleagues represents an important next step in understanding associations of adversity exposure with brain function while highlighting the importance of considering the role of development,” wrote Dylan G. Gee, PhD, associate professor of psychology at Yale University in New Haven, Connecticut, and Alexis Brieant, PhD, assistant professor of research or creative works at the University of Vermont in Burlington, in an accompanying commentary.

They also applauded the authors for their use of MKDA. They noted that the technique “allows inferences about the consistency and specificity of brain activation across studies and is thought to be more robust to small sample sizes than activation likelihood estimation (ALE) meta-analysis.” 

Dr. Gee and Dr. Brieant also observed that a recent ALE meta-analysis failed to find a link between adversity and brain function. “Although it is important to note that the file drawer problem — by which researchers are less likely to publish null results — presents challenges to the inferences that can be drawn in the current work, the current study may provide complementary information to prior ALE meta-analyses.” 

Epigenetic Changes? 

Commenting on the findings for this article, Victor Fornari, MD, director of child and adolescent psychiatry at Northwell Health in Glen Oaks, New York, said, “Historically, when someone went through a traumatic event, they were told to just get over it, because somehow trauma doesn’t have a lasting impact on the brain.” Dr. Fornari was not involved in the research.

“We have certainly learned so much more over the past decade about early adversity and that it does have a profound impact on the brain and probably even epigenetic changes in our genes,” Dr. Fornari said.

“This is a very important avenue of investigation. People are really trying to understand if there are biological markers that we can actually measure in the brain that will offer us a window to better understand the consequence of adversity, as well as possible avenues of treatment.” 

No funding source for this study was reported. Dr. Leyton, Dr. Hosseini-Kamkar, and Dr. Fornari report no relevant financial relationships. Dr. Gee reports receiving grants from the National Science Foundation and the National Institutes of Health outside the submitted work. Dr. Brieant reports receiving grants from the National Institute of Mental Health outside the submitted work. 

A version of this article appeared on Medscape.com.

Publications
Topics
Sections

Early childhood trauma alters brain function in adults, according to new research.

In a meta-analysis of 83 functional magnetic resonance imaging (fMRI) studies that included more than 5000 patients, exposure to adversity was associated with higher amygdala reactivity and lower prefrontal cortical reactivity across a range of task domains. 

The altered responses were only observed in studies including adult participants and were clearest in participants who had been exposed to severe threat and trauma. Children and adolescents did not show significant adversity-related differences in brain function.

“By integrating the results from 83 previous brain imaging studies, we were able to provide what is arguably the clearest evidence to date that adults who have been exposed to early life trauma have different brain responses to psychological challenges,” senior author Marco Leyton, PhD, professor of psychiatry and director of the Temperament Adversity Biology Lab at McGill University in Montreal, Quebec, Canada, said in a press release. “This includes exaggerated responses in a region that processes emotionally intense information (the amygdala) and reduced responses in a region that helps people regulate emotions and associated behaviors (the frontal cortex).”

The findings were published in JAMA Network Open.
 

Changes in Reactivity 

“One big issue we have in psychology, and especially in neuroscience, is that single-study results are often not reproducible,” lead author Niki Hosseini-Kamkar, PhD, neuroimaging research associate at Atlas Institute for Veterans and Families at Royal Ottawa Hospital, said in an interview.

“It was very important to me to use a meta-analysis to get an overall picture of what brain regions are consistently reported across all these different studies. That is what we did here,” she added. Dr. Hosseini-Kamkar conducted this analysis while she was a postdoctoral research fellow at McGill University in Montreal.

She and her group examined adversity exposure and brain function in the following four domains of task-based fMRI: emotion processing, memory processing, inhibitory control, and reward processing. Their study included 5242 participants. The researchers used multilevel kernel density analyses (MKDA) to analyze the data more accurately. 

Adversity exposure was associated with higher amygdala reactivity (P < .001) and lower prefrontal cortical reactivity (P < .001), compared with controls with no adversity exposure.

Threat types of adversity were associated with greater blood-oxygen-level-dependent (BOLD) responses in the superior temporal gyrus and lower prefrontal cortex activity in participants exposed to threat, compared with controls. 

Analysis of studies of inhibitory control tasks found greater activity in the claustrum, anterior cingulate cortex, and insula in the adversity-exposed participants, compared with controls.

In addition, studies that administered emotion processing tasks showed greater amygdala reactivity and lower prefrontal cortex (superior frontal gyrus) reactivity in the adversity exposure group, compared with controls.

“The main takeaway is that there’s an exaggerated activity in the amygdala, and diminished prefrontal cortex activity, and together, this might point to a mechanism for how a history of adversity diminishes the ability to cope with later stressors and can therefore heighten susceptibility to mental illness,” said Dr. Hosseini-Kamkar.
 

‘Important Next Step’ 

“Overall, the meta-analysis by Dr. Hosseini-Kamkar and colleagues represents an important next step in understanding associations of adversity exposure with brain function while highlighting the importance of considering the role of development,” wrote Dylan G. Gee, PhD, associate professor of psychology at Yale University in New Haven, Connecticut, and Alexis Brieant, PhD, assistant professor of research or creative works at the University of Vermont in Burlington, in an accompanying commentary

They also applauded the authors for their use of MKDA. They noted that the technique “allows inferences about the consistency and specificity of brain activation across studies and is thought to be more robust to small sample sizes than activation likelihood estimation (ALE) meta-analysis.” 

Dr. Gee and Dr. Brieant also observed that a recent ALE meta-analysis failed to find a link between adversity and brain function. “Although it is important to note that the file drawer problem — by which researchers are less likely to publish null results — presents challenges to the inferences that can be drawn in the current work, the current study may provide complementary information to prior ALE meta-analyses.” 
 

 

 

Epigenetic Changes? 

Commenting on the findings for this article, Victor Fornari, MD, director of child and adolescent psychiatry at Northwell Health in Glen Oaks, New York, said, “Historically, when someone went through a traumatic event, they were told to just get over it, because somehow trauma doesn’t have a lasting impact on the brain.” Dr. Fornari was not involved in the research.

“We have certainly learned so much more over the past decade about early adversity and that it does have a profound impact on the brain and probably even epigenetic changes in our genes,” Dr. Fornari said.

“This is a very important avenue of investigation. People are really trying to understand if there are biological markers that we can actually measure in the brain that will offer us a window to better understand the consequence of adversity, as well as possible avenues of treatment.” 

No funding source for this study was reported. Dr. Leyton, Dr. Hosseini-Kamkar, and Dr. Fornari report no relevant financial relationships. Gee reports receiving grants from the National Science Foundation and National Institutes of Health outside the submitted work. Dr. Brieant reports receiving grants from the National Institute of Mental Health outside the submitted work. 

A version of this article appeared on Medscape.com.

Early childhood trauma alters brain function in adults, according to new research.

In a meta-analysis of 83 functional magnetic resonance imaging (fMRI) studies that included more than 5000 patients, exposure to adversity was associated with higher amygdala reactivity and lower prefrontal cortical reactivity across a range of task domains. 

The altered responses were only observed in studies including adult participants and were clearest in participants who had been exposed to severe threat and trauma. Children and adolescents did not show significant adversity-related differences in brain function.

“By integrating the results from 83 previous brain imaging studies, we were able to provide what is arguably the clearest evidence to date that adults who have been exposed to early life trauma have different brain responses to psychological challenges,” senior author Marco Leyton, PhD, professor of psychiatry and director of the Temperament Adversity Biology Lab at McGill University in Montreal, Quebec, Canada, said in a press release. “This includes exaggerated responses in a region that processes emotionally intense information (the amygdala) and reduced responses in a region that helps people regulate emotions and associated behaviors (the frontal cortex).”

The findings were published in JAMA Network Open.
 

Changes in Reactivity 

“One big issue we have in psychology, and especially in neuroscience, is that single-study results are often not reproducible,” lead author Niki Hosseini-Kamkar, PhD, neuroimaging research associate at Atlas Institute for Veterans and Families at Royal Ottawa Hospital, said in an interview.

“It was very important to me to use a meta-analysis to get an overall picture of what brain regions are consistently reported across all these different studies. That is what we did here,” she added. Dr. Hosseini-Kamkar conducted this analysis while she was a postdoctoral research fellow at McGill University in Montreal.

She and her group examined adversity exposure and brain function in the following four domains of task-based fMRI: emotion processing, memory processing, inhibitory control, and reward processing. Their study included 5242 participants. The researchers used multilevel kernel density analyses (MKDA) to analyze the data more accurately. 

Adversity exposure was associated with higher amygdala reactivity (P < .001) and lower prefrontal cortical reactivity (P < .001), compared with controls with no adversity exposure.

Threat types of adversity were associated with greater blood-oxygen-level-dependent (BOLD) responses in the superior temporal gyrus and lower prefrontal cortex activity in participants exposed to threat, compared with controls. 

Analysis of studies of inhibitory control tasks found greater activity in the claustrum, anterior cingulate cortex, and insula in the adversity-exposed participants, compared with controls.

In addition, studies that administered emotion processing tasks showed greater amygdala reactivity and lower prefrontal cortex (superior frontal gyrus) reactivity in the adversity exposure group, compared with controls.

“The main takeaway is that there’s an exaggerated activity in the amygdala, and diminished prefrontal cortex activity, and together, this might point to a mechanism for how a history of adversity diminishes the ability to cope with later stressors and can therefore heighten susceptibility to mental illness,” said Dr. Hosseini-Kamkar.
 

‘Important Next Step’ 

“Overall, the meta-analysis by Dr. Hosseini-Kamkar and colleagues represents an important next step in understanding associations of adversity exposure with brain function while highlighting the importance of considering the role of development,” wrote Dylan G. Gee, PhD, associate professor of psychology at Yale University in New Haven, Connecticut, and Alexis Brieant, PhD, assistant professor of research or creative works at the University of Vermont in Burlington, in an accompanying commentary

They also applauded the authors for their use of MKDA. They noted that the technique “allows inferences about the consistency and specificity of brain activation across studies and is thought to be more robust to small sample sizes than activation likelihood estimation (ALE) meta-analysis.” 

Dr. Gee and Dr. Brieant also observed that a recent ALE meta-analysis failed to find a link between adversity and brain function. “Although it is important to note that the file drawer problem — by which researchers are less likely to publish null results — presents challenges to the inferences that can be drawn in the current work, the current study may provide complementary information to prior ALE meta-analyses.” 
 

 

 

Epigenetic Changes? 

Commenting on the findings for this article, Victor Fornari, MD, director of child and adolescent psychiatry at Northwell Health in Glen Oaks, New York, said, “Historically, when someone went through a traumatic event, they were told to just get over it, because somehow trauma doesn’t have a lasting impact on the brain.” Dr. Fornari was not involved in the research.

“We have certainly learned so much more over the past decade about early adversity and that it does have a profound impact on the brain and probably even epigenetic changes in our genes,” Dr. Fornari said.

“This is a very important avenue of investigation. People are really trying to understand if there are biological markers that we can actually measure in the brain that will offer us a window to better understand the consequence of adversity, as well as possible avenues of treatment.” 

No funding source for this study was reported. Dr. Leyton, Dr. Hosseini-Kamkar, and Dr. Fornari report no relevant financial relationships. Dr. Gee reports receiving grants from the National Science Foundation and the National Institutes of Health outside the submitted work. Dr. Brieant reports receiving grants from the National Institute of Mental Health outside the submitted work. 

A version of this article appeared on Medscape.com.


FROM JAMA NETWORK OPEN


Some reasons to get off the fence about COVID booster

Article Type
Changed
Mon, 12/11/2023 - 10:54

Though many people remain on the fence about getting the latest COVID vaccine booster, new research suggests a strong argument for getting the shot this winter: It sharply reduces the risk for long COVID. 

Researchers found that vaccination led to a 69% reduction in long-COVID risk among adults who received three vaccine doses before being infected; the risk reduction was 37% for those who received two doses. Experts say the research provides a strong argument for getting the vaccine, noting that about 10% of people infected with COVID go on to develop long COVID, which can be debilitating for one quarter of those with long-lasting symptoms.

The data come from a systematic literature review and meta-analysis published in October in Antimicrobial Stewardship & Epidemiology. Researchers examined 32 studies published between December 2019 and June 2023, involving 775,931 adults. Twenty-four studies, encompassing 620,221 individuals, were included in the meta-analysis. 

“The body of evidence from all these different studies converge on one single reality — that vaccines reduce the risk of long COVID, and people who keep up to date on their vaccinations also fared better than people who got it once or twice and didn’t follow up,” said Ziyad Al-Aly, MD, a clinical epidemiologist at Washington University in St Louis. 

Researchers have reported similar results for children. The National Institutes of Health RECOVER Initiative team found that vaccines are up to 42% effective in preventing long COVID in children, said Carlos Oliveira, MD, a pediatric infectious diseases specialist and Yale researcher who contributed to the study, which is available as a preprint. 

Vaccines also protect children from multisystem inflammatory syndrome, a condition that can happen after COVID, as well as protect against other COVID-related problems, such as missed school days, Oliveira said. “Even if the vaccine doesn’t completely stop long COVID, it’s still good for kids to get vaccinated for all these other reasons.” 

However, uptake of the latest boosters has been slow: The Centers for Disease Control and Prevention reported that by mid-November, less than 16% of people aged 18 years or older had received a shot. For children, the number was closer to 6%. A recent Kaiser Family Foundation survey found that booster rates for adults are similar to what they were 1 year ago. 

The survey results suggest that people are no longer as worried about COVID, which may explain the waning concern about keeping up with boosters. Though the currently circulating variant is not as debilitating as its predecessors, long COVID continues to be a problem: As of January 2023, 28% of people who had contracted the virus had experienced long-COVID symptoms. And though the mechanisms are still not fully understood, and researchers have yet to agree on a definition of long COVID, they are certain about this much: The best way to avoid it is to avoid getting infected to begin with. 

The lack of a diagnostic test for long COVID and the fact that the symptoms mimic those of other diseases lead to inconsistency that can make studies hard to replicate. In the papers reviewed for the Antimicrobial Stewardship & Epidemiology study, long COVID was defined as having symptoms lasting from more than 4 weeks to more than 6 months. Alexandre Marra, MD, the lead author and a researcher at the Hospital Israelita Albert Einstein, in São Paulo, Brazil, and at the University of Iowa, said that a clear standard definition is needed to better understand the actual prevalence and evaluate vaccine effectiveness. 

Al-Aly noted that there is a logical explanation for one finding in the paper: The percentage of individuals who had COVID and reported long-COVID symptoms declined from 19% in June 2022 to 11% in January 2023. 

Because a pandemic is a dynamic event, constantly producing different variants with different phenotypes, the prevalence of disease is naturally going to be affected. “People who got infected early in the pandemic may have a different long COVID profile and long COVID risk than people who got infected in the second or third year of the pandemic,” Al-Aly said. 

Most of the studies reported data from before the Omicron-variant era. Only eight reported data during that era. Omicron was not as lethal as previous variants, and consequently, fewer patients developed long COVID during that time. 

One of those who did is Yeng Chang, age 40 years, a family doctor who lives in Sherwood Park, Alberta, Canada. Chang developed long COVID during fall 2022 after getting the virus in June. By then, she’d been vaccinated three times, but she isn’t surprised that she got sick because each vaccine she had was developed before Omicron.

“When I had COVID I was really sick, but I was well enough to stay home,” she said. “I think if I didn’t have my immunizations, I might have been hospitalized, and I don’t know what would have happened.” 

Long COVID has left Chang with brain fog, fatigue, and a lack of physical stamina that forced her to pause her medical practice. For the past year and a half, she’s spent more time as a patient than a physician. 

Chang had her fifth COVID vaccination in the fall and recommends that others do the same. “The booster you got however many years ago was effective for the COVID of that time but there is a new COVID now. You can’t just say, ‘I had one and I’m fine forever.’” 

A version of this article appeared on Medscape.com.


Poverty tied to poor cognition in patients with epilepsy

Article Type
Changed
Thu, 12/07/2023 - 13:42

ORLANDO — Older people with epilepsy who live in deprived neighborhoods with lower socioeconomic status, fewer educational opportunities, and less access to health care have poorer memory, executive function, and processing speed than those living in more affluent areas, early research suggests.

Seniors with epilepsy present with multiple comorbidities, such as hypertension and diabetes, and they are at increased risk of developing dementia, said study investigator Anny Reyes, PhD, a postdoctoral scholar at the University of California, San Diego.

Past research has shown neighborhood disadvantage is associated with numerous adverse health outcomes, including an increased risk for developing Alzheimer’s disease and related dementias (ADRD).

“We already know epilepsy on its own increases risks for dementia, and when you add disadvantage to that, it’s going to increase the risk even more,” said Dr. Reyes.

Neurologists should ask their older patients with epilepsy, many of whom live alone, about food insecurity and access to resources “not just within the hospital system but also within their community,” she said.

The findings were presented at the annual meeting of the American Epilepsy Society.

Proxy Measure of Disadvantage

The incidence and prevalence of epilepsy increase with age, and older adults represent the fastest growing segment of individuals with epilepsy, said Dr. Reyes.

The new study included 40 patients with focal epilepsy, average age 67 years, from three areas: San Diego, California; Madison, Wisconsin; and Cleveland, Ohio.

Researchers collected clinical and sociodemographic information as well as vascular biomarkers. They also gathered individual-level data, including income, parental education levels, and details on childhood upbringing.

Using residential addresses, investigators determined the area deprivation index (ADI) value for each study participant. The ADI is a proxy measure of neighborhood-level socioeconomic disadvantage that captures factors such as poverty, employment, housing, and educational opportunities.

ADI values range from 1 to 10, with a higher number indicating greater neighborhood disadvantage. About 30% of the cohort had an ADI decile greater than 6.

Researchers divided subjects into Most Disadvantaged (ADI greater than 7) and Least Disadvantaged (ADI 7 or less) groups. The two groups were similar with regard to age, education level, and race/ethnicity.

But those from the most disadvantaged areas were younger, were taking more antiseizure medications, had fewer years of education, had fathers with lower education levels, reported less personal and family income, and were less likely to be diagnosed with hypertension.

Study subjects completed neuropsychological testing, including:

  • Measures of learning (Rey Auditory Verbal Learning Test [RAVLT] Learning Over Trials; Wechsler Memory Scale 4th Edition [WMS-4] Logical Memory [LM] Story B immediate; and WMS-4 Visual Reproduction [VR] immediate)
  • Memory (RAVLT delayed recall, WMS-4 LM delayed recall, and WMS-4 VR delayed recall)
  • Language (Multilingual Naming Test, Auditory Naming Test, and animal fluency)
  • Executive function/processing speed (Letter fluency and Trail-Making Test Parts A and B)

The study found a correlation between higher ADI (greater disadvantage) and poorer performance on learning (Spearman rho = -0.433; 95% CI, -0.664 to -0.126; P = .006), memory (rho = -0.496; 95% CI, -0.707 to -0.205; P = .001), and executive function/processing speed (rho = -0.315; 95% CI, -0.577 to 0.006; P = .048), but no significant association with language.

Looking at individual-level data, the study found memory and processing speed “were driving the relationship, and again, patients had worse performance when they were coming from the most disadvantaged neighborhoods,” said Dr. Reyes.

The investigators also examined mood, including depression and anxiety, and subjective complaints of cognitive problems. “We found those patients residing in the most disadvantaged neighborhoods complained more about memory problems,” she said.

The results underscore the need for community-level interventions “that could provide resources in support of these older adults and their families and connect them to services we know are good for brain health,” said Dr. Reyes.

Alzheimer’s disease experts “have done a really good job of this, but this is new for epilepsy,” she added. “This gives us a great opportunity to kind of bridge the worlds of dementia and epilepsy.”

Novel Research

Commenting on the research, Rani Sarkis, MD, assistant professor of neurology, Brigham and Women’s Hospital, Boston, said the study is “very useful” as it ties social determinants of health to cognition.

“We have not been doing that” in people with epilepsy, he said.

The study, one of the first to look at the link between disadvantaged neighborhoods and cognitive impairment, “has very important” public health implications, including the need to consider access to activities that promote cognitive resilience and other brain health initiatives, said Dr. Sarkis.

Another larger study that looked at neighborhood deprivation and cognition in epilepsy was also presented at the AES meeting and published earlier this year in the journal Neurology.

That study included 800 patients with pharmacoresistant temporal lobe epilepsy being evaluated for surgery at the Cleveland Clinic, mean age about 38 years. It examined numerous cognitive domains, as well as depression and anxiety, in relation to ADI values generated from patient addresses and split into quintiles from least to most disadvantaged.

After controlling for covariates, the study found scores for all cognitive domains were significantly worse in the most disadvantaged quintile except for executive function, which was close to reaching significance (P = .052), said lead author Robyn M. Busch, PhD, a clinical neuropsychologist in the Epilepsy Center, Department of Neurology, Cleveland Clinic.

The study also found people in the most disadvantaged areas had more symptoms of depression and anxiety compared with people in the least disadvantaged areas, said Busch.

A Complex Issue

Although the exact mechanism tying disadvantaged areas to cognition in epilepsy isn’t fully understood, having less access to health care and educational opportunities, poor nutrition, and being under chronic stress “are all things that affect the brain,” said Dr. Busch.

“This is super complex and it’s going to be really difficult to tease apart, but we’d like to look at imaging data to see if it’s something structural, if there are functional changes in the brain or something that might help us understand this better.”

But it’s also possible that having epilepsy “might be pushing people into environments” that offer fewer employment and educational opportunities and less access to resources, she said.

The study authors and Dr. Sarkis report no relevant financial relationships.

A version of this article first appeared on Medscape.com.


ORLANDO — Older people with epilepsy who live in deprived neighborhoods with lower socioeconomic status, fewer educational opportunities, and less access to health care have poorer memory, executive function, and processing speed than those living in more affluent areas, early research suggests.

Seniors with epilepsy present with multiple comorbidities, including, for example, hypertension and diabetes, and they are at increased risk of developing dementia, said study investigator Anny Reyes, PhD, a postdoctoral scholar at the University of California at San Diego.

Past research has shown neighborhood disadvantage is associated with numerous adverse health outcomes, including an increased risk for developing Alzheimer’s disease and related dementias (ADRD).

“We already know epilepsy on its own increases risks for dementia, and when you add disadvantaged to that, it’s going to increase the risk even more,” said Dr. Reyes.

Neurologists should ask their older patients with epilepsy, many of whom live alone, about food insecurity and access to resources “not just within the hospital system but also within their community,” she said.

The findings were presented at the annual meeting of the American Epilepsy Society.
 

Proxy Measure of Disadvantage

The incidence and prevalence of epilepsy increases with age. Older adults represent the fastest growing segment of individuals with epilepsy, said Dr. Reyes.

The new study included 40 patients with focal epilepsy, average age 67 years, from three areas: San Diego, California; Madison, Wisconsin; and Cleveland, Ohio.

Researchers collected clinical and sociodemographic information as well as vascular biomarkers. They also gathered individual-level data, including income, parental education levels, details on childhood upbringing, etc.

Using residential addresses, investigators determined the area deprivation index (ADI) value for study participants. The ADI is a proxy measure for neighborhood-level socioeconomic disadvantage that captures factors such a poverty, employment, housing, and education opportunities.

ADI values range from 1 to 10, with a higher number indicating greater neighborhood disadvantage. About 30% of the cohort had an ADI decile greater than 6.

Researchers divided subjects into Most Disadvantaged (ADI greater than 7) and Least Disadvantaged (AD 7 or less). The two groups were similar with regard to age, education level, and race/ethnicity.

But those from the most disadvantaged areas were younger, taking more antiseizure medications, had fewer years of education, lower levels of father’s education, less personal and family income, and were less likely to be diagnosed with hypertension.

Study subjects completed neuropsychological testing, including:

  • Measures of learning (Rey Auditory Verbal Learning Test [RAVLT] Learning Over Trials; Wechsler Memory Scale 4th Edition [WMS-4] Logical Memory [LM] Story B immediate; and WMS-4 Visual Reproduction [VR] immediate)
  • Memory (RAVLT delayed recall, WMS-4 LM delayed recall, and WMS-4 VR delayed recall)
  • Language (Multilingual Naming Test, Auditory Naming Test, and animal fluency)
  • Executive function/processing speed (Letter fluency and Trail-Making Test Parts A and B)

The study found a correlation between higher ADI (most disadvantaged) and poorer performance on learning (Spearman rho: -0.433; 95% CI -0.664 to -0.126; P = .006), memory (r = -0.496; 95% CI -0.707 to -0.205; P = .001), and executive function/processes speed (r = -0.315; 95% CI -0.577 to 0.006; P = .048), but no significant association with language.

Looking at individual-level data, the study found memory and processing speed “were driving the relationship, and again, patients had worse performance when they were coming from the most disadvantaged neighborhoods,” said Dr. Reyes.

The investigators also examined mood, including depression and anxiety, and subjective complaints of cognitive problems. “We found those patients residing in the most disadvantaged neighborhoods complained more about memory problems,” she said.

The results underscore the need for community-level interventions “that could provide resources in support of these older adults and their families and connect them to services we know are good for brain health,” said Dr. Reyes.

Alzheimer’s disease experts “have done a really good job of this, but this is new for epilepsy,” she added. “This gives us a great opportunity to kind of bridge the worlds of dementia and epilepsy.”
 

 

 

Novel Research

Commenting on the research, Rani Sarkis, MD, assistant professor of neurology, Brigham and Women’s Hospital, Boston, said the study is “very useful” as it ties social determinants of health to cognition.

“We have not been doing that” in people with epilepsy, he said.

ORLANDO — Older people with epilepsy who live in deprived neighborhoods with lower socioeconomic status, fewer educational opportunities, and less access to health care have poorer memory, executive function, and processing speed than those living in more affluent areas, early research suggests.

Seniors with epilepsy present with multiple comorbidities, including hypertension and diabetes, and they are at increased risk of developing dementia, said study investigator Anny Reyes, PhD, a postdoctoral scholar at the University of California at San Diego.

Past research has shown neighborhood disadvantage is associated with numerous adverse health outcomes, including an increased risk for developing Alzheimer’s disease and related dementias (ADRD).

“We already know epilepsy on its own increases risks for dementia, and when you add disadvantaged to that, it’s going to increase the risk even more,” said Dr. Reyes.

Neurologists should ask their older patients with epilepsy, many of whom live alone, about food insecurity and access to resources “not just within the hospital system but also within their community,” she said.

The findings were presented at the annual meeting of the American Epilepsy Society.
 

Proxy Measure of Disadvantage

The incidence and prevalence of epilepsy increase with age. Older adults represent the fastest-growing segment of individuals with epilepsy, said Dr. Reyes.

The new study included 40 patients with focal epilepsy, average age 67 years, from three areas: San Diego, California; Madison, Wisconsin; and Cleveland, Ohio.

Researchers collected clinical and sociodemographic information as well as vascular biomarkers. They also gathered individual-level data, including income, parental education levels, and details on childhood upbringing.

Using residential addresses, investigators determined the area deprivation index (ADI) value for each study participant. The ADI is a proxy measure for neighborhood-level socioeconomic disadvantage that captures factors such as poverty, employment, housing, and educational opportunities.

ADI values range from 1 to 10, with a higher number indicating greater neighborhood disadvantage. About 30% of the cohort had an ADI decile greater than 6.

Researchers divided subjects into Most Disadvantaged (ADI greater than 7) and Least Disadvantaged (ADI 7 or less) groups. The two groups were similar with regard to age, education level, and race/ethnicity.

But those from the most disadvantaged areas were younger, were taking more antiseizure medications, had fewer years of education and lower levels of paternal education, had lower personal and family income, and were less likely to have been diagnosed with hypertension.
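To make the grouping concrete, here is a minimal sketch of the ADI-based split described above, using hypothetical patient records (the patient IDs and ADI values are illustrative, not study data):

```python
# Hypothetical patient records with ADI deciles (1 = least, 10 = most disadvantaged)
patients = [
    {"id": "P01", "adi": 3},
    {"id": "P02", "adi": 8},
    {"id": "P03", "adi": 7},
    {"id": "P04", "adi": 10},
]

def adi_group(adi):
    # Cutoff reported in the study: ADI greater than 7 = Most Disadvantaged
    return "Most Disadvantaged" if adi > 7 else "Least Disadvantaged"

groups = {p["id"]: adi_group(p["adi"]) for p in patients}
```

Note the boundary behavior: under the "greater than 7" rule, an ADI of exactly 7 falls into the Least Disadvantaged group.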

Study subjects completed neuropsychological testing, including:

  • Measures of learning (Rey Auditory Verbal Learning Test [RAVLT] Learning Over Trials; Wechsler Memory Scale 4th Edition [WMS-4] Logical Memory [LM] Story B immediate; and WMS-4 Visual Reproduction [VR] immediate)
  • Memory (RAVLT delayed recall, WMS-4 LM delayed recall, and WMS-4 VR delayed recall)
  • Language (Multilingual Naming Test, Auditory Naming Test, and animal fluency)
  • Executive function/processing speed (Letter fluency and Trail-Making Test Parts A and B)

The study found a correlation between higher ADI (more disadvantaged) and poorer performance on learning (Spearman rho = -0.433; 95% CI, -0.664 to -0.126; P = .006), memory (rho = -0.496; 95% CI, -0.707 to -0.205; P = .001), and executive function/processing speed (rho = -0.315; 95% CI, -0.577 to 0.006; P = .048), but no significant association with language.
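For readers unfamiliar with the statistic: a Spearman rho like those reported above is simply a Pearson correlation computed on the ranks of the data, which makes it sensitive to monotonic trends and robust to outliers. A self-contained sketch on made-up values (not the study's data):

```python
def average_ranks(values):
    # Assign 1-based ranks, averaging the ranks of tied values.
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    # Pearson correlation of the ranks.
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# A perfectly monotonic decreasing relationship gives rho = -1.0:
# spearman_rho([1, 2, 3, 4, 5], [10, 8, 6, 4, 2]) -> -1.0
```

A negative rho, as in the study, means higher disadvantage ranks go with lower cognitive-score ranks.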

Looking at individual-level data, the study found memory and processing speed “were driving the relationship, and again, patients had worse performance when they were coming from the most disadvantaged neighborhoods,” said Dr. Reyes.

The investigators also examined mood, including depression and anxiety, and subjective complaints of cognitive problems. “We found those patients residing in the most disadvantaged neighborhoods complained more about memory problems,” she said.

The results underscore the need for community-level interventions “that could provide resources in support of these older adults and their families and connect them to services we know are good for brain health,” said Dr. Reyes.

Alzheimer’s disease experts “have done a really good job of this, but this is new for epilepsy,” she added. “This gives us a great opportunity to kind of bridge the worlds of dementia and epilepsy.”

Novel Research

Commenting on the research, Rani Sarkis, MD, assistant professor of neurology, Brigham and Women’s Hospital, Boston, said the study is “very useful” as it ties social determinants of health to cognition.

“We have not been doing that” in people with epilepsy, he said.

The study, one of the first to look at the link between disadvantaged neighborhoods and cognitive impairment, “has very important” public health implications, including the need to consider access to activities that promote cognitive resilience and other brain health initiatives, said Dr. Sarkis.

Another larger study that looked at neighborhood deprivation and cognition in epilepsy was also presented at the AES meeting and published earlier this year in the journal Neurology.

That study included 800 patients with pharmacoresistant temporal lobe epilepsy (mean age, about 38 years) who were being evaluated for surgery at the Cleveland Clinic. It examined numerous cognitive domains, as well as depression and anxiety, in relation to ADI values generated from patient addresses and split into quintiles from least to most disadvantaged.

After controlling for covariates, the study found scores for all cognitive domains were significantly worse in the most disadvantaged quintile except for executive function, which was close to reaching significance (P = .052), said lead author Robyn M. Busch, PhD, a clinical neuropsychologist in the Epilepsy Center, Department of Neurology, Cleveland Clinic.

The study also found that people in the most disadvantaged areas had more symptoms of depression and anxiety compared with people in the least disadvantaged areas, said Dr. Busch.
 

A Complex Issue

Although the exact mechanism tying disadvantaged areas to cognition in epilepsy isn’t fully understood, having less access to health care and educational opportunities, poor nutrition, and being under chronic stress “are all things that affect the brain,” said Dr. Busch.

“This is super complex and it’s going to be really difficult to tease apart, but we’d like to look at imaging data to see if it’s something structural, if there are functional changes in the brain or something that might help us understand this better.”

But it’s also possible that having epilepsy “might be pushing people into environments” that offer fewer employment and educational opportunities and less access to resources, she said.

The study authors and Dr. Sarkis report no relevant financial relationships.

A version of this article first appeared on Medscape.com.


FROM AES 2023


Genetic testing warranted in epilepsy of unknown origin

Article Type
Changed
Thu, 12/07/2023 - 13:34

ORLANDO — Genetic testing is warranted in patients with epilepsy of unknown origin, new research suggests. Investigators found that pathogenic genetic variants were identified in over 40% of patients with epilepsy of unknown cause who underwent genetic testing.

Such testing is particularly beneficial for those with early-onset epilepsy and those with comorbid developmental delay, said study investigator Yi Li, MD, PhD, clinical assistant professor, Department of Neurology & Neurological Sciences, Stanford University School of Medicine, Stanford, California. 

But genetic testing should be considered part of the standard workup for every patient with epilepsy of unknown etiology.

Dr. Li noted research showing that a diagnosis of a genetic epilepsy leads to alteration of treatment in about 20% of cases — for example, starting a specific antiseizure medication or avoiding a treatment such as a sodium channel blocker in patients diagnosed with Dravet syndrome. A genetic diagnosis also may make patients eligible for clinical trials investigating gene therapies. 

Genetic testing results may end a long and exhausting “diagnostic odyssey” that families have been on, she said. Patients often wait more than a decade to get genetic testing, the study found.

The findings were presented at the annual meeting of the American Epilepsy Society.
 

Major Delays

About 20%-30% of epilepsy is caused by acquired conditions such as stroke, tumor, or head injury. The remaining 70%-80% is believed to be due to one or more genetic factors.

Genetic testing has become standard for children with early-onset epilepsy, but it’s not common practice among adults with the condition — at least not yet.

The retrospective study involved a chart review of patient electronic health records from 2018-2023. Researchers used the Stanford electronic health record Cohort Discovery tool (STARR) database to identify 286 patients over age 16 years with epilepsy who had records of genetic testing.

Of the 286 patients, 148 were male and 138 were female; mean age was approximately 30 years. Among those with known epilepsy types, 53.6% had focal epilepsy and 28.8% had generalized epilepsy.

The mean age of seizure onset was 11.9 years, but the mean age at genetic testing was 25.1 years. “There’s a gap of about 13 or 14 years for genetic workup after a patient has a first seizure,” said Dr. Li.

Such a “huge delay” means patients may miss out on “potential precision treatment choices,” she said.

And having a diagnosis can connect patients to others with the same condition as well as to related organizations and communities that offer support, she added.

Types of genetic testing identified in the study included panel testing, which examines a set of genes associated with epilepsy; whole exome sequencing (WES), which covers all of the roughly 20,000 protein-coding genes in one test; and microarray testing, which detects missing sections of chromosomes. WES had the highest diagnostic yield (48%), followed by genetic panel testing (32.7%) and microarray testing (20.9%).

These tests collectively identified pathogenic variants in 40.9% of patients. In addition, 53.1% of patients had variants of uncertain significance.

In the full cohort, the most commonly identified variants were mutations in TSC1 (which causes tuberous sclerosis), SCN1A (which causes Dravet syndrome), and MECP2. Among patients with seizure onset after age 1 year, MECP2 and DEPDC5 were the two most commonly identified pathogenic variants.

Researchers examined factors possibly associated with a higher risk for genetic epilepsy, including family history, comorbid developmental delay, febrile seizures, status epilepticus, perinatal injury, and seizure onset age. In an adjusted analysis, comorbid developmental delay (estimate, 2.338; 95% confidence interval [CI], 1.402-3.900; P = .001) and seizure onset before age 1 year (estimate, 2.365; 95% CI, 1.282-4.366; P = .006) predicted a higher yield of pathogenic variants related to epilepsy.

Dr. Li noted that study participants with a family history of epilepsy were not more likely to test positive for a genetic link, so doctors shouldn’t rule out testing in patients if there’s no family history.

Both the International League Against Epilepsy (ILAE) and the National Society of Genetic Counselors (NSGC) recommend genetic testing in adult epilepsy patients, with the AES endorsing the NSGC guideline.

Although testing is becoming increasingly accessible, insurance companies don’t always cover the cost.

Dr. Li said she hopes her research raises awareness among clinicians that there’s more they can do to improve care for epilepsy patients. “We should offer patients genetic testing if we don’t have a clear etiology.”

Valuable Evidence

Commenting on the research findings, Annapurna Poduri, MD, MPH, director, Epilepsy Genetics Program, Boston Children’s Hospital, Boston, Massachusetts, said this research “is incredibly important.”

“What’s really telling about this study and others that have come up over the last few years is they’re real-world retrospective studies, so they’re looking back at patients who have been seen over many, many years.”

The research provides clinicians, insurance companies, and others with evidence that genetic testing is “valuable and can actually improve outcomes,” said Dr. Poduri.

She noted that 20 years ago, there were only a handful of genes identified as being involved with epilepsy, most related to sodium or potassium channels. But since then, “the technology has just raced ahead” to the point where now “dozens of genes” have been identified.

Not only does knowing the genetic basis of epilepsy improve management, but it offers families some peace of mind. “They blame themselves” for their loved one’s condition, said Dr. Poduri. “They may worry it was something they did in pregnancy; for example, maybe it was because [they] didn’t take that vitamin one day.”

Diagnostic certainty also means that patients “don’t have to do more tests which might be invasive” and unnecessarily costly.

Drs. Li and Poduri report no relevant conflicts of interest.

A version of this article first appeared on Medscape.com.



FROM AES 2023


Dual treatment may boost efficacy in chronic migraine

Article Type
Changed
Thu, 12/07/2023 - 13:09

For patients with chronic migraine, combination therapy with anti-CGRP monoclonal antibodies and onabotulinumtoxinA may be more effective than monotherapy, possibly owing to the synergistic mechanism of action of the two agents, a new study suggests.

“People with chronic migraine may be the toughest to treat. They have the greatest disability, and often insurance companies would prefer monotherapy, but in these patients, sometimes using a multifaceted approach and using different drugs that target different pathophysiologies of migraine can probably provide greater benefit in terms of reducing the frequency and severity of the headaches,” study investigator MaryAnn Mays, MD, staff neurologist at the Headache and Facial Pain Clinic in the Neurologic Institute, Cleveland Clinic, Cleveland, Ohio, said in an interview.

The findings were presented at the annual meeting of the American Headache Society.
 

Fewer Migraine Days

OnabotulinumtoxinA (onabot) has been shown to selectively inhibit unmyelinated C-fibers but not A-delta meningeal nociceptors. Anti-CGRP mAb therapies have been shown to prevent the activation of A-delta fibers but not C-fibers, said Dr. Mays.

For the study, the investigators reviewed the electronic medical records of 194 patients who had been concurrently treated with anti-CGRP mAbs and onabot. Most (86.6%) were women; ages ranged from 36 to 65 years; and at baseline, patients were having an average of 28 (±4.6) monthly migraine days (MMDs).

The number of MMDs was assessed at two time points: 3 months after monotherapy with an anti-CGRP mAb or onabot injections, and 3 months after combined therapy.

Monotherapy reduced the average number of MMDs from 28 to 18.6, for a reduction of 9.4 days (P < .0001).

After initiation of combined therapy, the average number of MMDs decreased further, from 18.6 to 12.1 (P < .0001).

In all, the combination of onabot and anti-CGRP mAbs resulted in a total MMD reduction of 15.8 days (P < .0001).
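As a quick arithmetic check of the figures above (note that 28 − 12.1 = 15.9; the reported total of 15.8 days presumably reflects rounding of the unrounded means):

```python
baseline_mmd = 28.0   # mean monthly migraine days at baseline
mono_mmd = 18.6       # mean MMDs after 3 months of monotherapy
combo_mmd = 12.1      # mean MMDs after 3 months of combined therapy

mono_reduction = round(baseline_mmd - mono_mmd, 1)    # 9.4 days, as reported
extra_reduction = round(mono_mmd - combo_mmd, 1)      # a further 6.5 days from adding the second agent
total_reduction = round(baseline_mmd - combo_mmd, 1)  # 15.9 days (reported as 15.8)
```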

In addition, most patients (68%) reported a 50% or greater reduction in MMDs, and 46.4% reported a 75% or greater reduction.
 

Great News for Patients

Commenting for this article, Rashmi B. Halker Singh, MD, associate professor of neurology at Mayo Clinic, Scottsdale, Arizona, said the study findings “support what we see in clinical practice and what we suspected from preclinical data.”

Single-agent treatment is not sufficient for many patients. Data confirming the benefit of dual therapy will provide more evidence to insurance companies of the need for coverage.

“We have lots of individuals for whom single treatment is not sufficient and who need this combination of treatment, and it is often denied by insurance. There are preclinical data suggesting synergy, but insurance says it is experimental, so the claims get denied. This leaves patients having to choose which drug they want to continue with, and that’s really heartbreaking,” Dr. Halker Singh said.

The importance of this study is that it adds more data to support evidence-based therapies for migraine and to help patients get the treatment they need, she added.

Dr. Mays reports financial relationships with AbbVie, Amgen, and Teva. Dr. Halker Singh reports no relevant financial relationships.
 

A version of this article appeared on Medscape.com.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

For patients with chronic migraine, combination therapy with anti-CGRP monoclonal antibodies and onabotulinumtoxinA may be more effective than monotherapy, possibly owing to the synergistic mechanism of action of the two agents, a new study suggests.

“People with chronic migraine may be the toughest to treat. They have the greatest disability, and often insurance companies would prefer monotherapy, but in these patients, sometimes using a multifaceted approach and using different drugs that target different pathophysiologies of migraine can probably provide greater benefit in terms of reducing the frequency and severity of the headaches,” study investigator MaryAnn Mays, MD, staff neurologist at the Headache and Facial Pain Clinic in the Neurologic Institute, Cleveland Clinic, Cleveland, Ohio, said in an interview.

The findings were presented at the annual meeting of the American Headache Society.
 

Fewer Migraine Days

OnabotulinumtodxinA (onabot) has been shown to selectively inhibit unmyelinated C-fibers but not A-delta-meningeal nociceptors. Anti-CGRP mAb therapies have been shown to prevent the activation of A-delta-fibers but not C-fibers, said Dr. Mays.

For the study, the investigators reviewed the electronic medical records of 194 patients who had been concurrently treated with anti-CGRP mAbs and onabot. Most (86.6%) were women; ages ranged from 36 to 65 years, and at baseline, they had been having an average of 28 (+4.6) monthly migraine days (MMDs).

The number of MMDs were assessed at two periods: 3 months after monotherapy with an anti-CGRP mAb or onabot injections, and 3 months after combined therapy.

Monotherapy reduced the average number of MMDs from 28 to 18.6, for a reduction of 9.4 days (P < .0001).

After initiation of combined therapy, the average number of MMDs decreased further, from 18.6 to 12.1 (P < .0001).

In all, the combination of onabot and anti-CGRP mAbs resulted in a total MMD reduction of 15.8 days (P < .0001).

In addition, most patients (68%) reported a 50% or greater reduction in MMDs, and 46.4% reported a 75% or greater reduction.
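As a quick arithmetic check, the stepwise reductions can be reproduced from the rounded means quoted above. This is only a sketch using the published summary figures, not patient-level data; note that the rounded means give an overall reduction of 15.9 days versus the 15.8 reported, presumably a rounding artifact.

```python
# Sketch: stepwise monthly-migraine-day (MMD) arithmetic from the figures
# quoted above. Values are the published rounded means, not patient-level data.

baseline_mmd = 28.0   # mean MMDs at baseline (SD ~4.6)
after_mono = 18.6     # mean MMDs after 3 months of monotherapy
after_combo = 12.1    # mean MMDs after 3 months of combined therapy

mono_reduction = baseline_mmd - after_mono    # 9.4 fewer days, as reported
combo_reduction = after_mono - after_combo    # 6.5 additional days
total_reduction = baseline_mmd - after_combo  # 15.9 from rounded means
                                              # (study reports 15.8; rounding)

print(f"Monotherapy: -{mono_reduction:.1f} MMDs")
print(f"Combination step: -{combo_reduction:.1f} MMDs")
print(f"Overall: -{total_reduction:.1f} MMDs")
```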
 

Great News for Patients

Commenting for this article, Rashmi B. Halker Singh, MD, associate professor of neurology at Mayo Clinic, Scottsdale, Arizona, said the study findings “support what we see in clinical practice and what we suspected from preclinical data.”

Single-agent treatment is not sufficient for many patients. Data confirming the benefit of dual therapy will provide more evidence to insurance companies of the need for coverage.

“We have lots of individuals for whom single treatment is not sufficient and who need this combination of treatment, and it is often denied by insurance. There are preclinical data suggesting synergy, but insurance says it is experimental, so the claims get denied. This leaves patients having to choose which drug they want to continue with, and that’s really heartbreaking,” Dr. Halker Singh said.

The importance of this study is that it adds more data to support evidence-based therapies for migraine and to help patients get the treatment they need, she added.

Dr. Mays reports financial relationships with AbbVie, Amgen, and Teva. Dr. Halker Singh reports no relevant financial relationships.
 

A version of this article appeared on Medscape.com.


Publications
Topics
Article Type
Sections
Article Source

FROM AHS 2023

Disallow All Ads
Content Gating
No Gating (article Unlocked/Free)
Alternative CME
Disqus Comments
Default
Use ProPublica
Hide sidebar & use full width
render the right sidebar.
Conference Recap Checkbox
Not Conference Recap
Clinical Edge
Display the Slideshow in this Article
Medscape Article
Display survey writer
Reuters content
Disable Inline Native ads
WebMD Article

Excessive TV-watching tied to elevated risk for dementia, Parkinson’s disease, and depression

Article Type
Changed
Thu, 12/07/2023 - 13:05

 

TOPLINE:

Excessive television-watching is tied to an increased risk for dementia, Parkinson’s disease (PD), and depression, whereas a limited amount of daily computer use that is not work-related is linked to a lower risk for dementia.

METHODOLOGY:

  • Investigators analyzed data on 473,184 people aged 39-72 years from the UK Biobank who were enrolled from 2006 to 2010 and followed until a diagnosis of dementia, PD, depression, death, or study end (2018 for Wales residents; 2021 for residents of England and Scotland).
  • Participants reported on the number of hours they spent outside of work exercising, watching television, and using the computer.
  • MRI was conducted to determine participants’ brain volume.

TAKEAWAY: 

  • During the study, 6096 people developed dementia, 3000 developed PD, 23,600 developed depression, 1200 developed dementia and depression, and 486 developed PD and depression.
  • Compared with those who watched TV for under 1 hour per day, those who reported watching 4 or more hours per day had a 28% higher risk for dementia (adjusted hazard ratio [aHR], 1.28; 95% CI, 1.17-1.39), a 35% higher risk for depression (aHR, 1.35; 95% CI, 1.29-1.40), and a 16% greater risk for PD (aHR, 1.16; 95% CI, 1.03-1.29).
  • However, moderate computer use outside of work seemed somewhat protective. Participants who used the computer for 30-60 minutes per day had lower risks for dementia (aHR, 0.68; 95% CI, 0.64-0.72), PD (aHR, 0.86; 95% CI, 0.79-0.93), and depression (aHR, 0.85; 95% CI, 0.83-0.88) compared with those who reported the lowest levels of computer usage.
  • Replacing 30 minutes per day of computer time with an equal amount of structured exercise was associated with decreased risk for dementia (aHR, 0.74; 95% CI, 0.85-0.95) and PD (aHR, 0.84; 95% CI, 0.78-0.90).
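The "% higher/lower risk" phrasing in the bullets above is simply the adjusted hazard ratio re-expressed as a percent change; a minimal sketch of that conversion, using the hazard ratios copied from the study summary:

```python
# Sketch: convert an adjusted hazard ratio (aHR) into the percent-change
# phrasing used in the text. An aHR of 1.28 implies a 28% higher hazard vs
# the reference group; 0.68 implies 32% lower. Ratios copied from above.

def pct_change(hr: float) -> int:
    """Percent change in hazard implied by a hazard ratio (rounded)."""
    return round((hr - 1.0) * 100.0)

tv_4h_plus = {"dementia": 1.28, "depression": 1.35, "PD": 1.16}
computer_30_60min = {"dementia": 0.68, "PD": 0.86, "depression": 0.85}

for outcome, hr in tv_4h_plus.items():
    print(f"TV >=4 h/day: {pct_change(hr):+d}% risk for {outcome}")
for outcome, hr in computer_30_60min.items():
    print(f"Computer 30-60 min/day: {pct_change(hr):+d}% risk for {outcome}")
```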

IN PRACTICE:

The association between extended periods of TV use and higher risk for PD and dementia could be explained by a lack of activity, the authors note. They add that sedentary behavior is "associated with biomarkers of low-grade inflammation and changes in inflammation markers that could initiate and/or worsen neuroinflammation and contribute to neurodegeneration."

SOURCE:

Hanzhang Wu, PhD, of Tianjin University of Traditional Medicine in Tianjin, China, led the study, which was published online in the International Journal of Behavioral Nutrition and Physical Activity.

LIMITATIONS: 

Screen behaviors were assessed with self-report measures, which are subject to recall bias. In addition, there may have been confounding variables for which the investigators did not account.

DISCLOSURES:

The study was funded by the National Natural Science Foundation of China, the Tianjin Major Public Health Science and Technology Project, the National Health Commission of China, the Food Science and Technology Foundation of Chinese Institute of Food Science and Technology, the China Cohort Consortium, and the Chinese Nutrition Society Nutrition Research Foundation–DSM Research Fund, China. There were no disclosures reported.

Eve Bender has no relevant financial relationships.

A version of this article appeared on Medscape.com.

Publications
Topics
Article Type
Sections