A New and Early Predictor of Dementia?

Signs of frailty may signal future dementia more than a decade before cognitive symptoms emerge, new findings suggest, offering a potential opportunity to identify high-risk populations for targeted enrollment in clinical trials of dementia prevention and treatment.

Results of an international study assessing frailty trajectories showed frailty levels notably increased in the 4-9 years before dementia diagnosis. Even among study participants whose baseline frailty measurement was taken prior to that acceleration period, frailty was still positively associated with dementia risk, the investigators noted.

“We found that with every four to five additional health problems, there is on average a 40% higher risk of developing dementia, while the risk is lower for people who are more physically fit,” said study investigator David Ward, PhD, of the Centre for Health Services Research, The University of Queensland, Brisbane, Australia.

The findings were published online in JAMA Neurology.

 

A Promising Biomarker

An accessible biomarker for both biologic age and dementia risk is essential for advancing dementia prevention and treatment strategies, the investigators noted, adding that growing evidence suggests frailty may be a promising candidate for this role.

To learn more about the association between frailty and dementia, Ward and his team analyzed data on 29,849 participants aged 60 years or above (mean age, 71.6 years; 62% women) who participated in four cohort studies: the English Longitudinal Study of Ageing (ELSA; n = 6771), the Health and Retirement Study (HRS; n = 9045), the Rush Memory and Aging Project (MAP; n = 1451), and the National Alzheimer’s Coordinating Center (NACC; n = 12,582).

The primary outcome was all-cause dementia. Depending on the cohort, dementia diagnoses were determined through cognitive testing, self- or family report of physician diagnosis, or a diagnosis by the study physician. Participants were excluded if they had cognitive impairment at baseline.

Investigators retrospectively determined frailty index scores by gathering information on health and functional outcomes for participants from each cohort. Only participants with frailty data on at least 30 deficits were included.

Commonly included deficits were high blood pressure, cancer, and chronic pain, as well as functional problems such as hearing impairment, difficulty with mobility, and challenges managing finances.

Investigators conducted follow-up visits with participants until they developed dementia or until the study ended, with follow-up periods varying across cohorts.

After adjustment for potential confounders, frailty index scores were modeled on a backward time scale, that is, by years before dementia diagnosis.

Among participants who developed incident dementia (n = 3154), covariate-adjusted expected frailty index scores were, on average, higher in women than in men by 18.5% in ELSA, 20.9% in HRS, and 16.2% in MAP. There were no differences in frailty scores between sexes in the NACC cohort.

When measured on this timeline, frailty scores were significantly and consistently higher in the dementia groups than in those who didn’t develop dementia from 8 to 20 years before dementia onset (20 years in HRS; 13 in MAP; 12 in ELSA; 8 in NACC).

The rate of increase in frailty index scores began accelerating 4-9 years before dementia onset, depending on the cohort, the investigators noted.

In all four cohorts, each 0.1-point increase in the frailty index was associated with increased dementia risk.

Adjusted hazard ratios (aHRs) ranged from 1.18 in the HRS cohort to 1.73 in the NACC cohort, which showed the strongest association.

In participants whose baseline frailty measurement was taken before the predementia acceleration period began, frailty scores remained positively associated with dementia risk, with aHRs ranging from 1.18 in the HRS cohort to 1.43 in the NACC cohort.
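
To connect Ward’s “four to five additional health problems” to the per-0.1 hazard ratios above: a deficit-accumulation frailty index is the proportion of assessed deficits that are present, so with roughly 40-50 items assessed, 4-5 added deficits raise the index by about 0.1. The minimal Python sketch below illustrates that arithmetic; it is not from the study — the item count and deficit counts are hypothetical, and the hazard ratio is an assumed mid-range value from the reported 1.18-1.73.

```python
# Minimal sketch of deficit-accumulation frailty index arithmetic.
# Assumptions (not study data): 45 assessed items (the cohorts required
# at least 30), hypothetical deficit counts, and an assumed mid-range
# hazard ratio of 1.40 per 0.1-point increase (reported range: 1.18-1.73).

def frailty_index(deficits_present: int, deficits_assessed: int) -> float:
    """A frailty index is the proportion of assessed deficits present."""
    return deficits_present / deficits_assessed

items = 45
fi_low = frailty_index(9, items)    # 9 of 45 deficits  -> 0.20
fi_high = frailty_index(13, items)  # 4 more deficits   -> ~0.29

rise = fi_high - fi_low             # ~0.09, about a 0.1-point increase
ahr_per_0_1 = 1.40                  # assumed mid-range of 1.18-1.73
implied_hr = ahr_per_0_1 ** (rise / 0.1)

print(f"4 extra deficits raise the index by {rise:.2f}; "
      f"implied hazard ratio ~{implied_hr:.2f}")
# With ~40-50 items assessed, 4-5 added deficits move the index by ~0.1,
# in line with Ward's estimate of roughly 40% higher dementia risk.
```

For a full 0.1-point rise the exponent is 1 and the implied ratio is simply the reported per-0.1 aHR; the rescaling matters only when the actual rise differs from 0.1.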

 

The ‘Four Pillars’ of Prevention

The good news, investigators said, is that the long trajectory of frailty symptoms preceding dementia onset provides plenty of opportunity for intervention.

To slow the development of frailty, Ward suggested adhering to the “four pillars of frailty prevention and management”: good nutrition with adequate protein, regular exercise, optimizing medications for chronic conditions, and maintaining a strong social network.

Ward suggested neurologists track frailty in their patients and pointed to a recent article focused on helping neurologists use frailty measures to influence care planning.

Study limitations include the possibility of reverse causality and the fact that investigators could not adjust for genetic risk for dementia.

 

Unclear Pathway

Commenting on the findings, Lycia Neumann, PhD, senior director of Health Services Research at the Alzheimer’s Association, noted that many studies over the years have shown a link between frailty and dementia. However, she cautioned that a link does not imply causation.

The pathway from frailty to dementia is not 100% clear, and both are complex conditions, said Neumann, who was not part of the study.

“Adopting healthy lifestyle behaviors early and consistently can help decrease the risk of — or postpone the onset of — both frailty and cognitive decline,” she said. Neumann added that physical activity, a healthy diet, social engagement, and controlling diabetes and blood pressure can also reduce the risk for dementia as well as cardiovascular disease.

The study was funded in part by the Deep Dementia Phenotyping Network through the Frailty and Dementia Special Interest Group. Ward and Neumann reported no relevant financial relationships.

 

A version of this article appeared on Medscape.com.


Vitamin K Supplementation Reduces Nocturnal Leg Cramps in Older Adults

 

TOPLINE:

Vitamin K supplementation significantly reduced the frequency, intensity, and duration of nocturnal leg cramps in older adults. No adverse events related to vitamin K were identified.

METHODOLOGY:

  • Researchers conducted a multicenter, double-blind, placebo-controlled randomized clinical trial in China from September 2022 to December 2023.
  • A total of 199 participants aged ≥ 65 years with at least two documented episodes of nocturnal leg cramps during a 2-week screening period were included.
  • Participants were randomized in a 1:1 ratio to receive either 180 μg of vitamin K (menaquinone 7) or a placebo daily for 8 weeks.
  • The primary outcome was the mean number of nocturnal leg cramps per week, while secondary outcomes were the duration and severity of muscle cramps.
  • The ethics committees of Third People’s Hospital of Chengdu and Affiliated Hospital of North Sichuan Medical College approved the study, and all participants provided written informed consent.

TAKEAWAY:

  • The vitamin K group experienced a significant reduction in the mean weekly frequency of cramps, from 2.60 (SD, 0.81) to 0.96 (SD, 1.41), whereas the placebo group maintained a mean weekly frequency of 3.63 (SD, 2.20) (P < .001).
  • The severity of nocturnal leg cramps decreased more in the vitamin K group (mean change, −2.55 [SD, 2.12] points) than in the placebo group (mean change, −1.24 [SD, 1.16] points).
  • The duration of nocturnal leg cramps also decreased more in the vitamin K group (mean change, −0.90 [SD, 0.88] minutes) than in the placebo group (mean change, −0.32 [SD, 0.78] minutes); the between-group contrasts are worked out in the sketch after this list.
  • No adverse events related to vitamin K use were identified, indicating a good safety profile for the supplementation.
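
To make the size of the effect easier to read, here is a minimal Python sketch, using only the point estimates quoted in the bullets above, that works out the between-group contrasts implied by the within-group changes. The variable names are illustrative, and confidence intervals are not reported here, so these are point estimates only.

```python
# Worked arithmetic (no study data beyond the point estimates quoted above):
# the bullets report within-group changes; the between-group contrast is
# how much more the vitamin K group improved than the placebo group.

vit_k_change = {"severity_points": -2.55, "duration_minutes": -0.90}
placebo_change = {"severity_points": -1.24, "duration_minutes": -0.32}

for outcome, vk in vit_k_change.items():
    between_group = vk - placebo_change[outcome]
    print(f"{outcome}: between-group difference = {between_group:+.2f}")

# severity_points: -1.31  -> cramps ~1.3 points less severe with vitamin K
# duration_minutes: -0.58 -> cramps ~0.6 minutes shorter with vitamin K
```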

IN PRACTICE:

“Given the generally benign characteristics of NLCs, treatment modality must be both effective and safe, thus minimizing the risk of iatrogenic harm,” the study authors wrote.

SOURCE:

This study was led by Jing Tan, MD, of the Third People’s Hospital of Chengdu in Chengdu, China. It was published online on October 28 in JAMA Internal Medicine.

LIMITATIONS: 

This study did not investigate quality of life or sleep, which could have provided additional insights into the impact of vitamin K on nocturnal leg cramps. The relatively mild nature of the nocturnal leg cramps experienced by participants may limit the generalizability of the findings to populations with more severe symptoms.

DISCLOSURES:

This study was supported by grants from the China Health Promotion Foundation and the Third People’s Hospital of Chengdu Scientific Research Project. Tan disclosed receiving personal fees from BeiGene, AbbVie, Pfizer, Xian Janssen Pharmaceutical, and Takeda Pharmaceutical outside the submitted work.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.


Scurvy: A Diagnosis Still Relevant Today

“Petechial rash often prompts further investigation into hematological, dermatological, or vasculitis causes. However, if the above investigations are negative and skin biopsy has not revealed a cause, there is a Renaissance-era diagnosis that is often overlooked but is easily investigated and treated,” wrote Andrew Dermawan, MD, and colleagues from Sir Charles Gairdner Hospital in Nedlands, Australia, in BMJ Case Reports. The diagnosis they highlight is scurvy, a disease that has faded from common medical concern but is reemerging, partly because of the rise in bariatric surgery.

Diagnosing Scurvy in the 2020s

In their article, Dermawan and colleagues present the case of a 50-year-old man with a bilateral petechial rash on his lower limbs, without any history of trauma. The patient, who exhibited no infectious symptoms, also had gross hematuria, microcytic anemia, mild neutropenia, and lymphopenia. Tests for autoimmune and hematological diseases were negative, as were abdominal and leg CT scans, ruling out abdominal hemorrhage and vasculitis. Additionally, a skin biopsy showed no causative findings.

The doctors noted that the patient had undergone sleeve gastrectomy, prompting them to inquire about his diet. They discovered that, because of financial difficulties, his diet primarily consisted of processed foods with little to no fruits or vegetables, and he had stopped taking supplements recommended by his gastroenterologist. Further tests revealed a vitamin D deficiency and a severe deficiency in vitamin C. With the diagnosis of scurvy confirmed, the doctors treated the patient with 1000 mg of ascorbic acid daily, along with cholecalciferol, folic acid, and a multivitamin complex, leading to a complete resolution of his symptoms.
 

Risk Factors Then and Now

Scurvy can present with a range of symptoms, including petechiae, perifollicular hemorrhage, ecchymosis, gingivitis, edema, anemia, delayed wound healing, malaise, weakness, joint swelling, arthralgia, anorexia, neuropathy, and vasomotor instability. It can cause mucosal and gastric hemorrhages, and if left untreated, it can lead to fatal bleeding.

Historically known as “sailors’ disease,” scurvy plagued men on long voyages who lacked access to fresh fruits or vegetables and thus did not get enough vitamin C. In 1747, James Lind, a British physician in the Royal Navy, demonstrated that the consumption of oranges and lemons could combat scurvy.

Today’s risk factors for scurvy include malnutrition, gastrointestinal disorders (eg, chronic inflammatory bowel diseases), alcohol and tobacco use, eating disorders, psychiatric illnesses, dialysis, and the use of medications that reduce the absorption of ascorbic acid (such as corticosteroids and proton pump inhibitors).

Scurvy remains more common among individuals with unfavorable socioeconomic conditions. The authors of the study emphasize how the rising cost of living — specifically in Australia but applicable elsewhere — is changing eating habits, leading to a high consumption of low-cost, nutritionally poor foods.

Poverty has always been a risk factor for scurvy, but today there may be an additional cause: bariatric surgery. Patients undergoing these procedures are at risk for deficiencies in the fat-soluble vitamins A, D, E, and K, and if their diet is inadequate, they may also develop a vitamin C deficiency. Awareness of this can facilitate the timely diagnosis of scurvy in these patients.

This story was translated from Univadis Italy using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.


Experts Challenge New Diagnostic Criteria for Alzheimer’s Disease

A group of international experts is challenging revised diagnostic criteria for Alzheimer’s disease as laid out by the Alzheimer’s Association earlier in 2024.

In a paper published online in JAMA Neurology, the International Working Group (IWG), which includes 46 experts from 17 countries, is recommending that the diagnosis of Alzheimer’s disease be limited to individuals with mild cognitive impairment or dementia and not be applied to cognitively normal individuals with Alzheimer’s disease biomarkers such as amyloid-beta 42/40 or p-tau.

Clinicians should be “very careful” about using the “A” word (Alzheimer’s) for cognitively unimpaired people with Alzheimer’s disease biomarkers, said the paper’s first author Bruno Dubois, MD, professor of neurology, Sorbonne University and Department of Neurology, Pitié-Salpêtrière Hospital, Paris, France.

Providing an Alzheimer’s disease diagnosis to those who have a high chance of never developing cognitive impairment can be psychologically harmful, said Dubois.

“It’s not something small like telling someone they have a fever. Just imagine you’re 65 years old and are amyloid positive, and you’re told you have Alzheimer’s disease. It affects the decisions you make for the rest of your life and changes your vision of your future, even though you may never develop the disease,” he added.
 

Divergent View

The IWG’s perspective on Alzheimer’s disease contrasts with a recent proposal from the Alzheimer’s Association. The Alzheimer’s Association criteria suggest that Alzheimer’s disease should be regarded solely as a biological entity, which could include cognitively normal individuals with one core Alzheimer’s disease biomarker.

The IWG noted that its concerns regarding the application of a purely biological definition of Alzheimer’s disease in clinical practice prompted the group to consider updating its guidelines, potentially offering “an alternative definitional view of Alzheimer’s disease as a clinical-biological construct for clinical use.”

The group conducted a PubMed search for relevant Alzheimer’s disease articles, and their included references, published between July 2020 and March 2024. The research showed that the majority of biomarker-positive, cognitively normal individuals will not become symptomatic during their lifetime.

The risk of a 55-year-old who is amyloid positive developing Alzheimer’s disease is not that much higher than that for an individual of a similar age who is amyloid negative, Dubois noted. “There’s an 83% chance that person will never develop Alzheimer’s disease.”

Disclosing a diagnosis of Alzheimer’s disease to cognitively normal people with only one core Alzheimer’s disease biomarker represents “the most problematic implication of a purely biological definition of the disease,” the authors noted.

“A biomarker is a marker of pathology, not a biomarker of disease,” said Dubois, adding that a person may have markers for several different brain diseases.

The IWG recommends the following nomenclature: “at risk for Alzheimer’s disease” for those with Alzheimer’s disease biomarkers but a low lifetime risk of progression, and “presymptomatic Alzheimer’s disease” for those with Alzheimer’s disease biomarkers and a very high lifetime risk of progression, such as individuals with autosomal dominant genetic mutations or other distinct biomarker profiles that put them at extremely high lifetime risk of developing the disease.

Dubois emphasized the difference between those showing typical Alzheimer’s disease symptoms with positive biomarkers who should be considered to have the disease and those with positive biomarkers but no typical Alzheimer’s disease symptoms who should be considered at risk.

This is an important distinction as it affects research approaches and assessment of risks, he said.

For low-risk asymptomatic individuals, the IWG does not recommend routine diagnostic testing outside of the research setting. “There’s no reason to send a 65-year-old cognitively normal subject off to collect biomarker information,” said Dubois.

He reiterated the importance of clinicians using appropriate and sensitive language surrounding Alzheimer’s disease when face to face with patients. This issue “is not purely semantic; this is real life.”

For these patients in the clinical setting, “we have to be very careful about proposing treatments that may have side effects,” he said.

However, this does not mean asymptomatic at-risk people should not be studied to determine what pharmacological interventions might prevent or delay the onset of clinical disease, he noted.

Presymptomatic individuals who are at a high risk of developing Alzheimer’s disease “should be the target for clinical trials in the future” to determine best ways to delay the conversion to Alzheimer’s disease, he said.

The main focus of such research should be to better understand the “biomarker pattern profile” that is associated with a high risk of developing Alzheimer’s disease, said Dubois.

 

Plea for Unity

In an accompanying editorial, Ronald C. Petersen, PhD, MD, director, Mayo Clinic Alzheimer’s Disease Research Center and Mayo Clinic Study of Aging, Rochester, Minnesota, and colleagues outline the difference between the IWG and Alzheimer’s Association positions.

As the IWG uses Alzheimer’s disease to define those with cognitive impairment and the Alzheimer’s Association group uses Alzheimer’s disease to define those with the pathology of the disease, the field is now at a crossroads. “Do we name the disease before clinical symptoms?” they asked.

They note that Alzheimer’s Association criteria distinguish between a disease and an illness, whereas the IWG does not. “As such, although the primary disagreement between the groups is semantic, the ramifications of the labeling can be significant.”

It is “incumbent” that the field “come together” on an Alzheimer’s disease definition, the editorial concluded. “Neither the Alzheimer’s Association or IWG documents are appropriate to serve as a guide for how to apply biomarkers in a clinical setting. Appropriate-use criteria are needed to form a bridge between biological frameworks and real-world clinical practice so we can all maximally help all of our patients with this disorder.”

In a comment, Reisa Sperling, MD, professor of neurology at Harvard Medical School and director of the Center for Alzheimer Research and Treatment at Brigham and Women’s Hospital and Massachusetts General Hospital, all in Boston, who is part of the Alzheimer’s Association work group that published the revised criteria for diagnosis and staging of Alzheimer’s disease, likened Alzheimer’s disease, which begins in the brain many years before dementia onset, to cardiovascular disease in that it involves multiple processes. She noted that the World Health Organization classifies cardiovascular disease as a “disease” prior to clinical manifestations such as stroke and myocardial infarction.

“If someone has Alzheimer’s disease pathology in their brain, they are at risk for dementia or clinical manifestations of the disease — just like vascular disease quantifies the risk of stroke or heart attack, not risk of developing ‘vascular disease’ if the underlying vascular disease is already present,” said Sperling.

A large part of the controversy is related to terminology and the “stigma” of the “A” word in the same way there used to be fear around using the “C” word — cancer, said Sperling.

“Once people began talking about cancer publicly as a potentially treatable disease and began getting screened and diagnosed before symptoms of cancer were manifest, this has had a tremendous impact on public health.”

She clarified that her work group does not recommend screening asymptomatic people with Alzheimer’s disease biomarkers. “We actually need to prove that treating at the preclinical stage of the disease is able to prevent clinical impairment and dementia,” she said, adding “hopefully, we are getting closer to this.”

Dubois reported no relevant disclosures. Petersen reported receiving personal fees from Roche, Genentech, Eli Lilly and Company, Eisai, and Novo Nordisk outside the submitted work and royalties from Oxford University Press, UpToDate, and Medscape educational activities.

A version of this article appeared on Medscape.com.

Publications
Topics
Sections

A group of international experts is challenging revised diagnostic criteria for Alzheimer’s disease as laid out by the Alzheimer’s Association earlier in 2024.

In a paper published online in JAMA Neurology, the International Working Group (IWG), which includes 46 experts from 17 countries, is recommending that the diagnosis of Alzheimer’s disease be limited to individuals with mild cognitive impairment or dementia and not be applied to cognitively normal individuals with Alzheimer’s disease biomarkers such as amyloid-beta 42/40 or p-tau.

Clinicians should be “very careful” about using the “A” word (Alzheimer’s) for cognitively unimpaired people with Alzheimer’s disease biomarkers, said the paper’s first author Bruno Dubois, MD, professor of neurology, Sorbonne University and Department of Neurology, Pitié-Salpêtrière Hospital, Paris, France.

Providing an Alzheimer’s disease diagnosis to those who have a high chance of never developing cognitive impairment can be psychologically harmful, said Dubois.

“It’s not something small like telling someone they have a fever. Just imagine you’re 65 years old and are amyloid positive, and you’re told you have Alzheimer’s disease. It affects the decisions you make for the rest of your life and changes your vision of your future, even though you may never develop the disease,” he added.
 

Divergent View

The IWG’s perspective on Alzheimer’s disease contrasts with a recent proposal from the Alzheimer’s Association. The Alzheimer’s Association criteria suggest that Alzheimer’s disease should be regarded solely as a biological entity, which could include cognitively normal individuals with one core Alzheimer’s disease biomarker.

The IWG noted that its concerns regarding the application of a purely biological definition of Alzheimer’s disease in clinical practice prompted the group to consider updating its guidelines, potentially offering “an alternative definitional view of Alzheimer’s disease as a clinical-biological construct for clinical use.”

The group conducted a PubMed search for relevant Alzheimer’s disease articles, and included references, published between July 2020 and March 2024. The research showed the majority of biomarker-positive, cognitively normal individuals will not become symptomatic during their lifetime.

The risk of a 55-year-old who is amyloid positive developing Alzheimer’s disease is not that much higher than that for an individual of a similar age who is amyloid negative, Dubois noted. “There’s an 83% chance that person will never develop Alzheimer’s disease.”

Disclosing a diagnosis of Alzheimer’s disease to cognitively normal people with only one core Alzheimer’s disease biomarker represents “the most problematic implication of a purely biological definition of the disease,” the authors noted.

“A biomarker is a marker of pathology, not a biomarker of disease,” said Dubois, adding that a person may have markers for several different brain diseases.

The IWG recommends the following nomenclature: At risk for Alzheimer’s disease for those with Alzheimer’s disease biomarkers but low lifetime risk and presymptomatic Alzheimer’s disease for those with Alzheimer’s disease biomarkers with a very high lifetime risk for progression such as individuals with autosomal dominant genetic mutations and other distinct biomarker profiles that put them at extremely high lifetime risk of developing the disease.

Dubois emphasized the difference between those showing typical Alzheimer’s disease symptoms with positive biomarkers who should be considered to have the disease and those with positive biomarkers but no typical Alzheimer’s disease symptoms who should be considered at risk.

This is an important distinction as it affects research approaches and assessment of risks, he said.

For low-risk asymptomatic individuals, the IWG does not recommend routine diagnostic testing outside of the research setting. “There’s no reason to send a 65-year-old cognitively normal subject off to collect biomarker information,” said Dubois.

He reiterated the importance of clinicians using appropriate and sensitive language surrounding Alzheimer’s disease when face to face with patients. This issue “is not purely semantic; this is real life.”

For these patients in the clinical setting, “we have to be very careful about proposing treatments that may have side effects,” he said.

However, this does not mean asymptomatic at-risk people should not be studied to determine what pharmacological interventions might prevent or delay the onset of clinical disease, he noted.

Presymptomatic individuals who are at a high risk of developing Alzheimer’s disease “should be the target for clinical trials in the future” to determine best ways to delay the conversion to Alzheimer’s disease, he said.

The main focus of such research should be to better understand the “biomarker pattern profile” that is associated with a high risk of developing Alzheimer’s disease, said Dubois.
 

 

 

Plea for Unity

In an accompanying editorial, Ronald C. Petersen, PhD, MD, director, Mayo Clinic Alzheimer’s Disease Research Center and Mayo Clinic Study of Aging, Rochester, Minnesota, and colleagues outline the difference between the IWG and Alzheimer’s Association positions.

As the IWG uses Alzheimer’s disease to define those with cognitive impairment and the Alzheimer’s Association group uses Alzheimer’s disease to define those with the pathology of the disease, the field is now at a crossroads. “Do we name the disease before clinical symptoms?” they asked.

They note that Alzheimer’s Association criteria distinguish between a disease and an illness, whereas the IWG does not. “As such, although the primary disagreement between the groups is semantic, the ramifications of the labeling can be significant.”

It is “incumbent” that the field “come together” on an Alzheimer’s disease definition, the editorial concluded. “Neither the Alzheimer’s Association or IWG documents are appropriate to serve as a guide for how to apply biomarkers in a clinical setting. Appropriate-use criteria are needed to form a bridge between biological frameworks and real-world clinical practice so we can all maximally help all of our patients with this disorder.”

In a comment, Reisa Sperling, MD, professor of neurology, Harvard Medical School, and director, Center for Alzheimer Research and Treatment, Brigham and Women’s Hospital and Massachusetts General Hospital, all in Boston, who is part of the Alzheimer’s Association work group that published the revised criteria for diagnosis and staging of Alzheimer’s disease, likened Alzheimer’s disease, which begins in the brain many years before dementia onset, to cardiovascular disease in that it involves multiple processes. She noted the World Health Organization classifies cardiovascular disease as a “disease” prior to clinical manifestations such as stroke and myocardial infarction.

“If someone has Alzheimer’s disease pathology in their brain, they are at risk for dementia or clinical manifestations of the disease — just like vascular disease quantifies the risk of stroke or heart attack, not risk of developing ‘vascular disease’ if the underlying vascular disease is already present,” said Sperling.

A large part of the controversy is related to terminology and the “stigma” of the “A” word in the same way there used to be fear around using the “C” word — cancer, said Sperling.

“Once people began talking about cancer publicly as a potentially treatable disease and began getting screened and diagnosed before symptoms of cancer were manifest, this has had a tremendous impact on public health.”

She clarified that her work group does not recommend screening asymptomatic people with Alzheimer’s disease biomarkers. “We actually need to prove that treating at the preclinical stage of the disease is able to prevent clinical impairment and dementia,” she said, adding “hopefully, we are getting closer to this.”

Dubois reported no relevant disclosures. Petersen reported receiving personal fees from Roche, Genentech, Eli Lilly and Company, Eisai, and Novo Nordisk outside the submitted work and royalties from Oxford University Press, UpToDate, and Medscape educational activities.

A version of this article appeared on Medscape.com.


New Drug Options Abound for Duchenne Muscular Dystrophy

Article Type
Changed
Wed, 11/06/2024 - 11:15

When Nancy L. Kuntz, MD, a pediatric neurologist at Ann & Robert H. Lurie Children’s Hospital of Chicago, was a fellow about 45 years ago, there were few more devastating diagnoses than Duchenne muscular dystrophy (DMD).

“The rule of thumb was that they would stop walking by age 10 and probably die around age 20, and there was not much we could do,” Kuntz told colleagues at the American Association of Neuromuscular & Electrodiagnostic Medicine (AANEM) 2024.

Now, the landscape of DMD therapy is transforming at a rapid pace. “In the last 8 years, we’ve seen eight different therapies that are FDA-approved specifically for Duchenne, and many more are in the pipeline,” said session moderator Kathryn Mosher, MD, a pediatric physical medicine and rehabilitation physician at Akron Children’s Hospital, Akron, Ohio.

This is both good news and a new challenge for clinicians: Which of these treatments are best for which patients? Kuntz said the traditional therapy of corticosteroids is still crucial. However, “there are still families begging to not use steroids, or refusing to use steroids, just not filling the prescriptions,” she said.
 

Beware of Parents Who Reject Steroids

The failure to use steroids “breaks your heart” because data show their impact on “really important functions like walking and being able to get up from the ground,” she said. “You can add months and years to life with this treatment.”

However, “while we have shown that using corticosteroids makes a difference, I don’t think that we’ve really worked out the best age at which to start the steroids, or the dosing schedule, or even the type of steroids,” she cautioned.

In an accompanying presentation about therapy for DMD, pediatric neurologist Craig M. Zaidman, MD, of Washington University in St. Louis, Missouri, cautioned that “daily steroids make a big impact on your growth and particularly on your height.”

In particular, the corticosteroid deflazacort has been linked to more cataracts than prednisone, as well as less weight gain and slower height growth. “They really don’t grow, they don’t get taller, and they also don’t gain weight. They look like little boys when they’re 13 years old.”
 

Deflazacort or Vamorolone?

Vamorolone (Agamree) is a cheaper corticosteroid alternative to deflazacort (Emflaza), and a 2024 study showed no difference in functional outcomes between the two drugs over 48 weeks, he said. Daily vamorolone also preserves height growth better than daily prednisone, he noted, and he has seen less risk for vertebral fractures.

Where do newer drugs fit in? One crucial thing to know about the new generation of targeted therapies is that they’re often mutation-dependent, Kuntz said. They may work only in patients with certain mutations, or certain mutations may lead to more side effects.

“You should have the exact mutation of your patient, and then you can look and see what they’re eligible for,” she said.

$700,000 a Year for Givinostat

Zaidman highlighted the newly approved givinostat (Duvyzat), a histone deacetylase inhibitor approved for boys aged 6 years or older. The drug costs $700,000 a year, he said, and was linked to less decline in four-stair climb performance in a double-blind, placebo-controlled, phase 3 trial.

The drug can cause side effects such as reduced platelet counts, elevated triglycerides, and gastrointestinal problems. “When you drop the dose, these problems go away,” he said.

Does givinostat work? While trial data are challenging to interpret, they do suggest that patients “will lose skill, but they might not lose two or three skills they otherwise would have,” Zaidman said. “To me, that’s quite compelling.”

As for exon-skipping therapies, another new-generation option for DMD, he noted that “these drugs are on the market based on their accelerated approval. We will never have the perfect phase 3, randomized, controlled, long-term trial for these. It’s just not going to come. This is what we get.”

Mosher disclosed advisory board service (Sarepta Therapeutics, Pfizer, Reata Pharmaceuticals, and PTC). Kuntz disclosed advisory board service (Astellas Pharma, Inc., argenx, Catalyst, Entrada Therapeutics, Genentech, and Novartis), an exchange expert on-demand program (Sarepta Therapeutics), speaking (Genentech, Sarepta Therapeutics, and Solid), and research funding (Astellas Pharma, Inc., argenx, Biogen, Catalyst, Genentech, Novartis, and Sarepta Therapeutics). Zaidman disclosed speaking/advising/consulting (Sarepta Therapeutics and Optum) and research funding (Novartis and Biogen).
 

A version of this article appeared on Medscape.com.


Brews, Bubbles, & Booze: Stroke Risk and Patients’ Favorite Drinks

Article Type
Changed
Tue, 11/05/2024 - 13:25

A growing body of research explores the link between stroke risk and regular consumption of coffee, tea, soda, and alcohol. This research roundup reviews the latest findings, highlighting both promising insights and remaining uncertainties to help guide discussions with your patients.

Coffee and Tea: Good or Bad? 

In the INTERSTROKE study, high coffee consumption (> 4 cups daily) was associated with a significantly increased risk for all stroke (odds ratio [OR], 1.37) and ischemic stroke (OR, 1.31), while low to moderate coffee consumption was not linked to increased stroke risk. In contrast, tea consumption was associated with lower odds of all stroke (OR, 0.81 for the highest intake) and ischemic stroke (OR, 0.81).

In a recent UK Biobank study, consumption of coffee or tea was associated with reduced risk for stroke and dementia, with the biggest benefit associated with consuming both beverages. 

Specifically, the investigators found that individuals who drank two to three cups of coffee and two to three cups of tea per day had a 30% lower incidence of stroke and a 28% lower risk for dementia versus those who drank neither beverage.

A recent systematic review and dose-response meta-analysis showed that each daily cup increase in tea was associated with an average 4% reduced risk for stroke and a 2% reduced risk for cardiovascular disease (CVD) events. 

The protective effect of coffee and tea on stroke risk may be driven, in part, by flavonoids, which have antioxidant and anti-inflammatory properties, as well as positive effects on vascular function.

“The advice to patients should be that coffee and tea may protect against stroke, but that sweetening either beverage with sugar probably should be minimized,” said Cheryl Bushnell, MD, MHS, of Wake Forest University School of Medicine in Winston-Salem, North Carolina, and chair of the American Stroke Association (ASA) 2024 Guideline for the Primary Prevention of Stroke.

Taylor Wallace, PhD, a certified food scientist, said, “most people should consume a cup or two of unsweetened tea per day in moderation for cardiometabolic health. It is an easy step in the right direction for good health but not a cure-all.”

When it comes to coffee, adults who like it should drink it “in moderation — just lay off the cream and sugar,” said Wallace, adjunct associate professor at George Washington University, Washington, DC, and Tufts University, Boston, Massachusetts.

“A cup or two of black coffee with low-fat or nonfat milk with breakfast is a healthy way to start the day, especially when you’re like me and have an 8-year-old that is full of energy!” Wallace said. 
 

The Skinny on Soda

When it comes to sugar-sweetened and diet beverages, data from the Nurses’ Health Study and the Health Professionals Follow-Up Study showed a 16% increased risk for stroke with one or more servings of sugar-sweetened or low-calorie soda per day (vs none), independent of established dietary and nondietary cardiovascular risk factors.

In the Women’s Health Initiative Observational Study of postmenopausal women, a higher intake of artificially sweetened beverages was associated with increased risk for all stroke (adjusted hazard ratio [aHR], 1.23), ischemic stroke (aHR, 1.31), coronary heart disease (aHR, 1.29), and all-cause mortality (aHR, 1.16).

In the Framingham Heart Study Offspring cohort, consumption of one can of diet soda or more each day (vs none) was associated with a nearly threefold increased risk for stroke and dementia over a 10-year follow-up period. 

A separate French study showed that total artificial sweetener intake from all sources was associated with increased overall risk for cardiovascular and cerebrovascular disease.

However, given the limitations of these studies, it’s hard to draw any firm conclusions, Wallace cautioned. 

“We know that sugar-sweetened beverages are correlated with weight gain and cardiometabolic dysfunction promotion in children and adults,” he said. 

Yet, “there really isn’t any convincing evidence that diet soda has much impact on human health at all. Most observational studies are mixed and likely very confounded by other diet and lifestyle factors. That doesn’t mean go overboard; a daily diet soda is probably fine, but that doesn’t mean go drink 10 of them every day,” he added. 
 

Alcohol: Moderation or Abstinence?

Evidence on alcohol use and stroke risk has been mixed over the years. For decades, the evidence suggested that a moderate daily amount of alcohol (one to two drinks for men and one drink for women) might be beneficial in reducing major vascular events.

Yet, over the past few years, some research has found no evidence of benefit with moderate alcohol intake. And the detrimental effects of excessive alcohol use are clear. 

A large meta-analysis showed that light to moderate alcohol consumption (up to one drink per day) was associated with a reduced risk for ischemic stroke. However, heavy drinking (more than two drinks per day) significantly increased the risk for both ischemic and hemorrhagic stroke.

A separate study showed young adults who are moderate to heavy drinkers are at increased risk for stroke — and the risk increases with more years of imbibing.

In the INTERSTROKE study, moderate to high alcohol consumption was associated with increased stroke risk, whereas low alcohol consumption conferred no increased risk.

However, Bushnell pointed out that the study data were based on self-report and that other healthy behaviors may counteract the risks associated with alcohol consumption.

“For alcohol, regardless of stroke risk, the most important data shows that any alcohol consumption is associated with worse cognitive function, so generally, the lower the alcohol consumption the better,” Bushnell said. 

She noted that, currently, the American Heart Association (AHA)/ASA recommend a maximum of two drinks per day for men and one drink per day for women to reduce stroke risk.

“However, the data for the risk for cognitive impairment with any alcohol is convincing and should be kept in mind in addition to the maximum alcohol recommended by the AHA/ASA,” Bushnell advised. 

“We know excessive intake puts you at major risk for CVD, cancer, cognitive decline, and a whole host of other health ailments — no question there,” said Wallace.

The impact of moderate intake, on the other hand, is less clear. “Alcohol is a highly biased and political issue and the evidence (or lack thereof) on both sides is shoddy at best,” Wallace added.

A key challenge is that accurate self-reporting of alcohol intake is difficult, even for scientists, and most studies rely on self-reported data from observational cohorts. These often include limited dietary assessments, which provide only a partial picture of long-term consumption patterns, Wallace noted. 

“The short answer is we don’t know if moderation is beneficial, detrimental, or null with respect to health,” he said.

Bushnell reports no relevant disclosures. Wallace (www.drtaylorwallace.com) is CEO of Think Healthy Group; editor of The Journal of Dietary Supplements; deputy editor of The Journal of the American Nutrition Association (www.nutrition.org); nutrition section editor of Annals of Medicine; and an advisory board member with Forbes Health.

A version of this article appeared on Medscape.com.


Silent Epidemic: Loneliness a Serious Threat to Both Brain and Body

Article Type
Changed
Fri, 11/08/2024 - 02:18

In a world that is more connected than ever, a silent epidemic is taking its toll. Overall, one in three US adults report chronic loneliness — a condition so detrimental that it rivals smoking and obesity with respect to its negative effect on health and well-being. From anxiety and depression to life-threatening conditions like cardiovascular disease, stroke, and Alzheimer’s and Parkinson’s diseases, loneliness is more than an emotion — it’s a serious threat to both the brain and body.

In 2023, a US Surgeon General advisory raised the alarm about the national problem of loneliness and isolation, describing it as an epidemic.

“Given the significant health consequences of loneliness and isolation, we must prioritize building social connection in the same way we have prioritized other critical public health issues such as tobacco, obesity, and substance use disorders. Together, we can build a country that’s healthier, more resilient, less lonely, and more connected,” the report concluded.

But how, exactly, does chronic loneliness affect the physiology and function of the brain? What does the latest research reveal about the link between loneliness and neurologic and psychiatric illness, and what can clinicians do to address the issue?

This news organization spoke to multiple experts in the field to explore these issues.
 

A Major Risk Factor

Anna Finley, PhD, assistant professor of psychology at North Dakota State University, Fargo, explained that loneliness and social isolation are different entities. Social isolation is an objective measure of the number of people someone interacts with on a regular basis, whereas loneliness is a subjective feeling that occurs when close connections are lacking.

“These two things are not actually as related as you think they would be. People can feel lonely in a crowd or feel well connected with only a few friendships. It’s more about the quality of the connection and the quality of your perception of it. So someone could be in some very supportive relationships but still feel that there’s something missing,” she said in an interview.

So what do we know about how loneliness affects health? Evidence supporting the hypothesis that loneliness is an emerging risk factor for many diseases is steadily building.

Recently, the American Heart Association published a statement summarizing the evidence for a direct association of social isolation and loneliness with coronary heart disease and stroke mortality.

In addition, many studies have shown that individuals experiencing social isolation or loneliness have an increased risk for anxiety and depression, dementia, infectious disease, hospitalization, and all-cause death, even after adjusting for age and many other traditional risk factors.

One study revealed that eliminating loneliness has the potential to prevent nearly 20% of cases of depression in adults aged 50 years or older.

Indu Subramanian, MD, professor of neurology at the University of California, Los Angeles, and colleagues conducted a study involving patients with Parkinson’s disease, which showed that the negative impact of loneliness on disease severity was as significant as the positive effects of 30 minutes of daily exercise.

“The importance of loneliness is under-recognized and undervalued, and it poses a major risk for health outcomes and quality of life,” said Subramanian.

Subramanian noted that loneliness is stigmatizing, causing people to feel unlikable and blame themselves, which prevents them from opening up to doctors or loved ones about their struggle. At the same time, healthcare providers may not think to ask about loneliness or know about potential interventions. She emphasized that much more work is needed to address this issue.
 

Early Mortality Risk

Julianne Holt-Lunstad, PhD, professor of psychology and neuroscience at Brigham Young University in Provo, Utah, is the author of two large meta-analyses suggesting that loneliness, social isolation, and living alone are independent risk factors for early mortality, increasing this risk by about a third — equivalent to the risk of smoking 15 cigarettes per day.

“We have quite robust evidence across a number of health outcomes implicating the harmful effects of loneliness and social isolation. While these are observational studies and show mainly associations, we do have evidence from longitudinal studies that show lacking social connection, whether that be loneliness or social isolation, predicts subsequent worse outcomes, and most of these studies have adjusted for alternative kinds of explanations, like age, initial health status, lifestyle factors,” Holt-Lunstad said.

There is some evidence to suggest that isolation is more predictive of physical health outcomes, whereas loneliness is more predictive of mental health outcomes. That said, both isolation and loneliness have significant effects on mental and physical health outcomes, she noted.

There is also the question of reverse causality: whether loneliness causes poor health or whether people in poor health feel lonely because poor health can lead to social isolation.

Finley said there’s probably a bit of both going on, but longitudinal studies, in which loneliness is measured at a fixed timepoint and health outcomes are assessed a few years later, suggest that loneliness is contributing to these adverse outcomes.

She added that there is also some evidence in animal models to suggest that loneliness is a causal risk factor for adverse health outcomes. “But you can’t ask a mouse or rat how lonely they’re feeling. All you can do is house them individually — removing them from social connection. This isn’t necessarily the same thing as loneliness in humans.”

Finley is studying mechanisms in the brain that may be involved in mediating the adverse health consequences of loneliness.

“What I’ve been seeing in the data so far is that it tends to be the self-report of how lonely folks are feeling that has the associations with differences in the brain, as opposed to the number of social connections people have. It does seem to be the more subjective, emotional perception of loneliness that is important.”

In a review of the potential mechanisms involved, she concluded that dysregulated emotions and altered perceptions of social interactions have profound impacts on the brain, suggesting that people who are lonely may tend to interpret social cues negatively, which prevents them from forming productive, positive relationships.
 

Lack of Trust

One researcher who has studied this phenomenon is Dirk Scheele, PhD, professor of social neuroscience at Ruhr University Bochum in Germany.

“We were interested to find out why people remained lonely,” he said in an interview. “Loneliness is an unpleasant experience, and there are so many opportunities for social contacts nowadays, it’s not really clear at first sight why people are chronically lonely.”

To examine this question, Scheele and his team conducted a study in which functional MRI was used to examine the brain in otherwise healthy individuals with high or low loneliness scores while they played a trust game.

They also simulated a positive social interaction between participants and researchers, in which they talked about plans for a fictitious lottery win and about their hobbies and interests. During this interaction, mood was measured with questionnaires, and saliva samples were collected to measure hormone levels.

Results showed that the high-lonely individuals had reduced activation in the insula cortex during the trust decisions. “This area of the brain is involved in the processing of bodily signals, such as ‘gut feelings.’ So reduced activity here could be interpreted as fewer gut feelings on who can be trusted,” Scheele explained.

The high-lonely individuals also had reduced responsiveness to the positive social interaction with a lower release of oxytocin and a smaller elevation in mood compared with the control individuals.

Scheele pointed out that there is some evidence that oxytocin might increase trust, and there is reduced release of endogenous oxytocin in high loneliness.

“Our results are consistent with the idea that loneliness is associated with negative biases about other people. So if we expect negative things from other people — for instance, that they cannot be trusted — then that would hamper further social interactions and could lead to loneliness,” he added.
 

A Role for Oxytocin?

In another study, the same researchers tested a short-term group psychotherapy intervention (five weekly sessions) to reduce loneliness, using established techniques to target these negative biases. They also investigated whether the effects of the psychotherapy could be augmented by administering intranasal oxytocin (vs placebo) before the sessions.

Results showed that the group psychotherapy intervention reduced trait loneliness (loneliness experienced over a prolonged period). The oxytocin did not show a significant effect on trait loneliness, but there was a suggestion that it may enhance the reduction in state loneliness (how someone is feeling at a specific time) brought about by the psychotherapy sessions.

“We found that bonding within the groups was experienced as more positive in the oxytocin treated groups. It is possible that a longer intervention would be helpful for longer-term results,” Scheele concluded. “It’s not going to be a quick fix for loneliness, but there may be a role for oxytocin as an adjunct to psychotherapy.”
 

A Basic Human Need

Another loneliness researcher, Livia Tomova, PhD, assistant professor of psychology at Cardiff University in Wales, has used social isolation to induce loneliness in young people and found that this intervention was linked to brain patterns similar to those associated with hunger.

“We know that the drive to eat food is a very basic human need. We know quite well how it is represented in the brain,” she explained.

The researchers tested how the brains of the participants responded to pictures of social interactions after a prolonged period of social isolation. In a subsequent session, the same people were asked to fast and then underwent brain scans while looking at pictures of food. Results showed that the neural patterns were similar in the two situations, with increased activity in the substantia nigra area within the midbrain.

“This area of the brain processes rewards and motivation. It consists primarily of dopamine neurons and increased activity corresponds to a feeling of craving something. So this area of the brain that controls essential homeostatic needs is activated when people feel lonely, suggesting that our need for social contact with others is potentially a very basic need similar to eating,” Tomova said.
 

Lower Gray Matter Volumes in Key Brain Areas

Another group from Germany has found that higher loneliness scores are negatively associated with gray matter volumes in specific brain regions responsible for memory, emotion regulation, and social processing.

Sandra Düzel, PhD, and colleagues from the Max Planck Institute for Human Development and the Charité – Universitätsmedizin Berlin, both in Berlin, Germany, reported a study in which individuals who reported higher loneliness had smaller gray matter volumes in brain regions such as the left amygdala, anterior hippocampus, and cerebellum, regions that are crucial for both emotional regulation and higher-order cognitive processes, such as self-reflection and executive function.

Düzel believes that possible mechanisms behind the link between loneliness and brain volume differences include stress-related damage and reduced cognitive and social stimulation. Prolonged loneliness is associated with elevated levels of stress hormones, which can damage the hippocampus over time, while a lack of stimulation may contribute to volume reductions in regions critical for memory and emotional processing.

“Loneliness is often characterized by reduced social and environmental diversity, leading to less engagement with novel experiences and potentially lower hippocampal-striatal connectivity. Since novelty-seeking and environmental diversity are associated with positive emotional states, individuals experiencing loneliness might benefit from increased exposure to new environments, which could stimulate the brain’s reward circuits, fostering positive affect and potentially mitigating the emotional burden of loneliness,” she said.
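
For readers curious how such an association is typically quantified, the following is a minimal sketch of a covariate-adjusted regression of regional gray matter volume on loneliness scores, using simulated data. The variables and effect sizes are hypothetical, not results from the Düzel study.

```python
# Toy sketch: covariate-adjusted association between loneliness and
# regional gray matter volume. All data are simulated; this is not the
# Düzel group's analysis.
import numpy as np

rng = np.random.default_rng(0)
n = 200
age = rng.uniform(60, 85, n)        # covariate
loneliness = rng.uniform(0, 10, n)  # predictor of interest
# Simulated volumes that decline with age and with loneliness
gm_volume = 5.0 - 0.02 * age - 0.05 * loneliness + rng.normal(0, 0.2, n)

# Ordinary least squares with an intercept, age, and loneliness
X = np.column_stack([np.ones(n), age, loneliness])
beta, *_ = np.linalg.lstsq(X, gm_volume, rcond=None)
print(f"age-adjusted loneliness coefficient: {beta[2]:.3f} (negative = smaller volume)")
```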
Is Social Prescribing the Answer?

So are there enough data now to act and attempt to develop interventions to reduce loneliness? Most of these researchers believe so.

“I think we have enough information to act on this now. There are a number of national academies consensus reports, which suggest that, while certainly there are still gaps in our evidence and more to be learned, there is sufficient evidence that a concerning portion of the population seems to lack connection, and that the consequences are serious enough that we need to do something about it,” said Holt-Lunstad.

Some countries have introduced social prescribing, in which doctors can prescribe a group activity or a regular visit or telephone conversation with a supportive person.

Subramanian pointed out that social prescribing is easier to implement in countries with national health services and may be more difficult to embrace in the US healthcare system.

“We are not so encouraged from a financial perspective to think about preventive care in the US. We don’t have an easy way to recognize in any tangible way the downstream of such activities in terms of preventing future problems. That is something we need to work on,” she said.

Finley cautioned that to work well, social prescribing will require an understanding of each person’s individual situation.

“Some people may only receive benefit of interacting with others if they are also getting some sort of support to address the social and emotional concerns that are tagging along with loneliness. I’m not sure that just telling people to go join their local gardening club or whatever will be the correct answer for everyone.”

She pointed out that many people will have issues in their life that are making it hard for them to be social. These could be mobility or financial challenges, care responsibilities, or concerns about illnesses or life events. “We need to figure out what would have the most bang for the person’s buck, so to speak, as an intervention. That could mean connecting them to a group relevant to their individual situation.”
 

Opportunity to Connect Not Enough?

Tomova believes that training people in social skills may be a better option. “It appears that some people who are chronically lonely seem to struggle to make relationships with others. So just encouraging them to interact with others more will not necessarily help. We need to better understand the pathways involved and who are the people who become ill. We can then develop and target better interventions and teach people coping strategies for that situation.”

Scheele agreed. “While just giving people the opportunity to connect may work for some, others who are experiencing really chronic loneliness may not benefit very much from this unless their negative belief systems are addressed.” He suggested some sort of psychotherapy may be helpful in this situation.

But all seem to agree that healthcare providers need to be more aware of loneliness as a health risk factor, to try to identify people at risk, and to think about how best to support them.

Holt-Lunstad noted that one of the recommendations in the US Surgeon General’s advisory was to increase the education, training, and resources on loneliness for healthcare providers.

“If we want this to be addressed, we need to give healthcare providers the time, resources, and training in order to do that, otherwise, we are adding one more thing to an already overburdened system. They need to understand how important it is, and how it might help them take care of the patient.”

“Our hope is that we can start to reverse some of the trends that we are seeing, both in terms of the prevalence rates of loneliness, but also that we could start seeing improvements in health and other kinds of outcomes,” she concluded.

Progress is being made in raising awareness of the dangers of chronic loneliness, said Scheele. It is now recognized as a serious health risk, but there are actionable steps that can help, and loneliness doesn’t have to be a permanent condition for anyone.

Holt-Lunstad served as an adviser for the Foundation for Social Connection, the Global Initiative on Loneliness and Connection, and the Nextdoor Neighborhood Vitality Board and received research grants/income from the Templeton Foundation, Eventbrite, the Foundation for Social Connection, and the Triple-S Foundation. Subramanian served on a speakers bureau for Acorda Pharma. The other researchers reported no disclosures.

A version of this article first appeared on Medscape.com.


Being a Weekend Warrior Linked to Lower Dementia Risk

TOPLINE:

Weekend exercise, involving one or two sessions per week, is associated with a similar reduction in risk for mild dementia as that reported with more frequent exercise, a new study shows. Investigators say the findings suggest even limited physical activity may offer protective cognitive benefits.

METHODOLOGY:

  • Researchers analyzed the data of 10,033 participants in the Mexico City Prospective Study who were aged 35 years or older.
  • Physical activity patterns were categorized into four groups: No exercise, weekend warriors (one or two sessions per week), regularly active (three or more sessions per week), and a combined group.
  • Cognitive function was assessed using the Mini-Mental State Examination (MMSE).
  • The analysis adjusted for confounders such as age, sex, education, income, blood pressure, smoking status, body mass index, civil status, sleep duration, diet, and alcohol intake.
  • The mean follow-up duration was 16 years.

TAKEAWAY:

  • When mild dementia was defined as an MMSE score ≤ 22, dementia prevalence was 26% in those who did not exercise, 14% in weekend warriors, and 18.5% in the regularly active group.
  • When mild dementia was defined as an MMSE score ≤ 23, dementia prevalence was 30% in those who did not exercise, 20% in weekend warriors, and 22% in the regularly active group.
  • Compared with people who did not exercise and after adjusting for confounding factors, risk for mild dementia was 13%-25% lower in weekend warriors, 11%-12% lower in the regularly active group, and 12%-16% lower in the two groups combined.
  • The findings were consistent in men and women; a brief illustrative sketch of the grouping and cutoff logic follows this list.
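
As a concrete illustration of how these categories and cutoffs combine, the short sketch below classifies hypothetical participants and tallies prevalence per group. Only the group definitions and the MMSE cutoff of 22 are taken from the summary above; the participant data and helper names are invented.

```python
# Toy sketch: combining the activity categories and an MMSE cutoff to
# estimate mild dementia prevalence per group. Participant data and
# helper names are invented; only the category definitions and the
# cutoff of 22 come from the study summary.

def activity_group(sessions_per_week: int) -> str:
    """Categorize a participant's physical activity pattern."""
    if sessions_per_week == 0:
        return "no exercise"
    if sessions_per_week <= 2:
        return "weekend warrior"
    return "regularly active"

def has_mild_dementia(mmse: int, cutoff: int = 22) -> bool:
    """Mild dementia defined as an MMSE score at or below the cutoff."""
    return mmse <= cutoff

# Hypothetical participants: (exercise sessions per week, MMSE score)
participants = [(0, 21), (2, 27), (4, 24), (1, 22), (0, 25), (5, 28)]

groups: dict[str, list[bool]] = {}
for sessions, mmse in participants:
    groups.setdefault(activity_group(sessions), []).append(has_mild_dementia(mmse))

for group, flags in groups.items():
    print(f"{group}: mild dementia prevalence {100 * sum(flags) / len(flags):.0f}%")
```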

IN PRACTICE:

“To the best of our knowledge, this is the first prospective cohort study to show that the weekend warrior physical activity pattern and the regularly active physical activity pattern are associated with similar reductions in the risk of mild dementia. This study has important implications for policy and practice because the weekend warrior physical activity pattern may be a more convenient option for busy people around the world,” the authors wrote.

SOURCE:

The study was led by Gary O’Donovan, Faculty of Medicine, University of the Andes, Bogotá, Colombia. It was published online in the British Journal of Sports Medicine.

LIMITATIONS:

The survey respondents may not have been truly representative of middle-aged adults. Further, there were no objective measures of physical activity. The observational design precludes conclusions about causality.

DISCLOSURES:

The study was funded by the Mexican Health Ministry, the National Council of Science and Technology for Mexico, Wellcome, and the UK Medical Research Council. No conflicts of interest were disclosed.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.
 

A version of this article appeared on Medscape.com.


Cannabis Use Linked to Brain Thinning in Adolescents


Cannabis use may lead to thinning of the cerebral cortex in adolescents, research in mice and humans suggests.

The multilevel study demonstrated that tetrahydrocannabinol (THC), an active substance in cannabis, causes shrinkage of dendritic arborization — the neurons’ network of antennae that play a critical role in communication between brain cells.

The connection between dendritic arborization and cortical thickness was hinted at in an earlier study by Tomáš Paus, MD, PhD, professor of psychiatry and addictology at the University of Montreal, Quebec, Canada, and colleagues, who found that cannabis use in early adolescence was associated with lower cortical thickness in boys with a high genetic risk for schizophrenia.

“We speculated at that time that the differences in cortical thickness might be related to differences in dendritic arborization, and our current study confirmed it,” Paus said.

That confirmation came in the mouse part of the study, when coauthor Graciela Piñeyro, MD, PhD, also of the University of Montreal, counted the dendritic branches of mice exposed to THC and compared the total with the number of dendritic branches in unexposed mice. “What surprised me was finding that THC in the mice was targeting the same type of cells and structures that Dr. Paus had predicted would be affected from the human studies,” she said. “Structurally, they were mostly the neurons that contribute to synapses in the cortex, and their branching was reduced.”

Paus explained that in humans, a decrease in input from the affected dendrites “makes it harder for the brain to learn new things, interact with people, cope with new situations, et cetera. In other words, it makes the brain more vulnerable to everything that can happen in a young person’s life.”

The study was published online on October 9 in the Journal of Neuroscience.
 

Of Mice, Men, and Cannabis

Although associations between cannabis use by teenagers and variations in brain maturation have been well studied, the cellular and molecular underpinnings of these associations were unclear, according to the authors.

To investigate further, they conducted this three-step study. First, they exposed adolescent male mice to THC or a synthetic cannabinoid (WIN 55,212-2) and assessed differentially expressed genes, spine numbers, and the extent of dendritic complexity in the frontal cortex of each mouse.

Next, using MRI, they examined differences in cortical thickness in 34 brain regions in 140 male adolescents who experimented with cannabis before age 16 years and 327 who did not.

Then, they again conducted experiments in mice and found that 13 THC-related genes correlated with variations in cortical thickness. Virtual histology revealed that these 13 genes were coexpressed with cell markers of astrocytes, microglia, and a type of pyramidal cell enriched in genes that regulate dendritic expression.

Similarly, the WIN-related genes correlated with differences in cortical thickness and showed coexpression patterns with the same three cell types.

Furthermore, the affected genes were also found in humans, particularly in the thinner cortical regions of the adolescents who experimented with cannabis.
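
The core of this analysis is a correlation computed across brain regions, and the minimal sketch below illustrates that logic with simulated data for one hypothetical gene. It is a sketch of the general approach, not the authors’ actual code or data.

```python
# Toy sketch: relating regional gene expression to group differences in
# cortical thickness, echoing the correlational logic described above.
# All numbers are simulated for a single hypothetical gene; this is not
# the authors' pipeline.
import numpy as np

rng = np.random.default_rng(1)
n_regions = 34  # the article reports 34 brain regions

# Simulated regional expression of one THC-related gene, and the
# cannabis-vs-control difference in cortical thickness per region
expression = rng.normal(0.0, 1.0, n_regions)
thickness_diff = -0.3 * expression + rng.normal(0.0, 0.5, n_regions)

r = np.corrcoef(expression, thickness_diff)[0, 1]
print(f"expression-thickness correlation across regions: r = {r:.2f}")
```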

By acting on microglia, THC seems to promote the removal of synapses and, eventually, the reduction of the dendritic tree in mice, Piñeyro explained. That’s important not only because a similar mechanism may be at work in humans but also because “we now might have a model to test different types of cannabis products to see which ones are producing the greatest effect on neurons and therefore greater removal of synapses through the microglia. This could be a way of testing drugs that are out in the street to see which would be the most or least dangerous to the synapses in the brain.”
‘Significant Implications’

Commenting on the study, Yasmin Hurd, PhD, Ward-Coleman chair of translational neuroscience at the Icahn School of Medicine at Mount Sinai and director of the Addiction Institute of Mount Sinai in New York City, said, “These findings are in line with previous results, so they are feasible. This study adds more depth by showing that cortical genes that were differentially altered by adolescent THC correlated with cannabis-related changes in cortical thickness based on human neuroimaging data.” Hurd did not participate in the research.

“The results emphasize that consumption of potent cannabis products during adolescence can impact cortical function, which has significant implications for decision-making and risky behavior as well. It also can increase vulnerability to psychiatric disorders such as schizophrenia.”

Although a mouse model is “not truly the same as the human condition, the fact that the animal model also showed evidence of the morphological changes indicative of reduced cortical thickness, [like] the humans, is strong,” she said.

Additional research could include women and assess potential sex differences, she added.

Ronald Ellis, MD, PhD, an investigator in the Center for Medicinal Cannabis Research at the University of California, San Diego School of Medicine, said, “The findings are plausible and extend prior work showing evidence of increased risk for psychotic disorders later in life in adolescents who use cannabis.” Ellis did not participate in the research.

“Future studies should explore how these findings might vary across different demographic groups, which could provide a more inclusive understanding of how cannabis impacts the brain,” he said. “Additionally, longitudinal studies to track changes in the brain over time could help to establish causal relationships more robustly.

“The take-home message to clinicians at this point is to discuss cannabis use history carefully and confidentially with adolescent patients to better provide advice on its potential risks,” he concluded.

Paus added that he would tell patients, “If you’re going to use cannabis, don’t start early. If you have to, then do so in moderation. And if you have family history of mental illness, be very careful.”

No funding for the study was reported. Paus, Piñeyro, Hurd, and Ellis declared having no relevant financial relationships. 
 

A version of this article appeared on Medscape.com.


Novel Intervention Slows Cognitive Decline in At-Risk Adults

Article Type
Changed
Wed, 11/27/2024 - 02:29

Combining cognitive remediation with transcranial direct current stimulation (tDCS) was associated with slower cognitive decline for up to 6 years in older adults with major depressive disorder that is in remission (rMDD), mild cognitive impairment (MCI), or both, new research suggests.

The cognitive remediation intervention included a series of progressively difficult computer-based and facilitator-monitored mental exercises designed to sharpen cognitive function. 

Researchers found that using cognitive remediation with tDCS slowed decline in executive function and verbal memory more than in other cognitive functions. The effect was stronger among people with rMDD than among those with MCI, and among those at low genetic risk for Alzheimer’s disease.

“We have developed a novel intervention, combining two interventions that if used separately have a weak effect but together have [a] substantial and clinically meaningful effect of slowing the progression of cognitive decline,” said study author Benoit H. Mulsant, MD, chair of the Department of Psychiatry, University of Toronto, Ontario, Canada, and senior scientist at the Centre for Addiction and Mental Health, also in Toronto.

The findings were published online in JAMA Psychiatry.
 

High-Risk Group

Research shows that older adults with MDD or MCI are at high risk for cognitive decline and dementia. Evidence also suggests that depression in early or mid-life significantly increases the risk for dementia in late life, even if the depression has been in remission for decades.

A potential mechanism underlying this increased risk for dementia is impaired cortical plasticity, that is, a reduced ability of the brain to compensate for damage.

The PACt-MD trial included 375 older adults with rMDD, MCI, or both (mean age, 72 years; 62% women) at five academic hospitals in Toronto.

Participants received either cognitive remediation plus tDCS or sham intervention 5 days per week for 8 weeks (acute phase), followed by 5-day “boosters” every 6 months.

tDCS was administered by trained personnel and involved active stimulation for 30 minutes at the beginning of each cognitive remediation group session. The intervention targets the prefrontal cortex, a critical region for cognitive compensation in normal cognitive aging.

The sham group received a weakened version of cognitive remediation, with exercises that did not get progressively more difficult. For the sham stimulation, the current flowed at full intensity for only 54 seconds, preceded by a 30-second ramp-up and followed by a 30-second ramp-down, so that participants felt the characteristic skin sensations while blinding was preserved, the authors noted.
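
To make the two stimulation profiles concrete, here is a minimal sketch that generates the current-versus-time curve for an active session (30 minutes at full intensity) and a sham session (54 seconds at full intensity), each bracketed by 30-second ramps. This illustrates the blinding logic only and is not the trial's stimulation code; the 2-mA amplitude is an assumed, typical tDCS setting that the article does not report.

```python
import numpy as np

def stimulation_profile(plateau_s, ramp_s=30.0, amplitude_ma=2.0, dt=1.0):
    """Build a ramp-up / plateau / ramp-down current profile, sampled at dt seconds.

    amplitude_ma=2.0 is an assumption (a common tDCS intensity); the article
    specifies the durations but not the current.
    """
    ramp_up = np.linspace(0.0, amplitude_ma, int(ramp_s / dt), endpoint=False)
    plateau = np.full(int(plateau_s / dt), amplitude_ma)
    ramp_down = np.linspace(amplitude_ma, 0.0, int(ramp_s / dt))
    return np.concatenate([ramp_up, plateau, ramp_down])

active = stimulation_profile(plateau_s=30 * 60)  # 30-minute active plateau
sham = stimulation_profile(plateau_s=54)         # 54-second sham plateau

# Charge delivered (mA * s = millicoulombs, with dt = 1 s): the sham ramps feel
# like the real thing at the scalp, but the dose is a small fraction of the
# active arm's.
print(f"active: {active.sum():.0f} mC, sham: {sham.sum():.0f} mC")
```

Under these assumptions the sham arm delivers only a few percent of the active arm's total charge while reproducing the tingling of the ramps, which is what makes the blinding credible.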

A geriatric psychiatrist followed all participants throughout the study, conducting assessments at baseline, month 2, and yearly for 3-7 years (mean follow-up, 48.3 months). 

Participants’ depressive symptoms were evaluated at baseline and at all follow-ups, and participants underwent neuropsychological testing to assess six cognitive domains: processing speed, working memory, executive functioning, verbal memory, visual memory, and language.

To establish norms for the cognitive tests, researchers recruited a comparator group of 75 participants similar in age, gender, and years of education, with no neuropsychiatric disorder or cognitive impairment. They completed the same assessments but not the intervention.
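
As a rough illustration of how such a normative group can be used, the sketch below standardizes a participant's score on one domain against the comparator group's mean and standard deviation. The numbers are invented for the example and are not study data.

```python
import numpy as np

# Invented raw scores on one cognitive test (e.g., verbal memory) for the
# normative comparator group; not study data.
norm_scores = np.array([52.0, 48.0, 55.0, 50.0, 47.0, 53.0, 49.0, 51.0])

def to_z(raw_score, norm_scores):
    """Standardize a raw score against the normative sample:
    0 = typical for a cognitively healthy peer, negative = worse."""
    return (raw_score - norm_scores.mean()) / norm_scores.std(ddof=1)

domain_z = to_z(44.0, norm_scores)
print(f"verbal memory z = {domain_z:.2f}")
# A global composite can then average such z scores across the six domains.
```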

Study participants and assessors were blinded to treatment assignment.
 

Slower Cognitive Decline

Participants in the intervention group had a significantly slower decline in cognitive function, compared with those in the sham group (adjusted z score difference [active – sham] at month 60, 0.21; P = .006). This is equivalent to slowing cognitive decline by about 4 years, researchers reported. The intervention also showed a positive effect on executive function and verbal memory. 
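
One way to see where a figure like “about 4 years” can come from: divide the between-group difference at month 60 by an annual rate of decline. In the sketch below, the 0.21 z-unit difference is from the article, but the assumed decline rate of 0.05 z units per year in the sham arm is a made-up value chosen only to show the arithmetic, not a number reported in the paper.

```python
# Back-of-the-envelope conversion of a z-score gap into "years of decline".
group_difference_z = 0.21        # adjusted active - sham gap at month 60 (from the article)
assumed_decline_per_year = 0.05  # z units lost per year in the sham arm (assumption)

years_slowed = group_difference_z / assumed_decline_per_year
print(f"equivalent slowing: ~{years_slowed:.1f} years")  # ~4.2 years
```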

“If I can push dementia from 85 to 89 years and you die at 86, in practice, I have prevented you from ever developing dementia,” Mulsant said.

The efficacy of cognitive remediation plus tDCS in rMDD could be tied to enhanced neuroplasticity, said Mulsant. 

The treatment worked well in people with a history of depression, regardless of MCI status, but was not as effective for people with just MCI, researchers noted. The intervention also did not work as well among people at genetic risk for Alzheimer’s disease.

“We don’t believe we have discovered an intervention to prevent dementia in people who are at high risk for Alzheimer disease, but we have discovered an intervention that could prevent dementia in people who have a history of depression,” said Mulsant.

These results suggest the pathways to dementia among people with MCI and rMDD are different, he added. 

Because previous research showed that either treatment alone had little efficacy, researchers said the new results indicate that there may be a synergistic effect of combining the two.

The ideal amount of treatment and optimal age for initiation still need to be determined, said Mulsant. The study did not include a comparator group without rMDD or MCI, so the observed cognitive benefits might be specific to people with these high-risk conditions. Another study limitation is lack of diversity in terms of ethnicity, race, and education. 
 

Promising, Important Findings

Commenting on the research, Badr Ratnakaran, MD, assistant professor and division director of geriatric psychiatry at Carilion Clinic–Virginia Tech Carilion School of Medicine, Roanoke, said the results are promising and important because there are so few treatment options for the increasing number of older patients with depression and dementia.

The side-effect profile of the combined treatment is better than that of many pharmacologic treatments, Ratnakaran noted. As more research like this comes out, Ratnakaran predicts that cognitive remediation and tDCS will become more readily available.

“This is telling us that the field of psychiatry, and also dementia, is progressing beyond your usual pharmacotherapy treatments,” said Ratnakaran, who also is chair of the American Psychiatric Association’s Council on Geriatric Psychiatry. 

The study received support from the Canada Brain Research Fund of Brain Canada, Health Canada, the Chagnon Family, and the Centre for Addiction and Mental Health Discovery Fund. Mulsant reported holding and receiving support from the Labatt Family Chair in Biology of Depression in Late-Life Adults at the University of Toronto; being a member of the Centre for Addiction and Mental Health Board of Trustees; receiving research support from Brain Canada, the Canadian Institutes of Health Research, the Centre for Addiction and Mental Health Foundation, the Patient-Centered Outcomes Research Institute, and the National Institutes of Health; and receiving nonfinancial support from Capital Solution Design and HappyNeuron. Ratnakaran reported no relevant conflicts.

A version of this article appeared on Medscape.com.
