
Blood biomarker may predict Alzheimer’s disease progression

Article Type
Changed
Thu, 12/15/2022 - 15:42

Plasma levels of phosphorylated tau at threonine 181 (p-tau181) may provide a means of monitoring disease progression for patients with Alzheimer’s disease, new research suggests.

In a study of more than 1,000 participants, changes over time in levels of p-tau181 were associated with prospective neurodegeneration and cognitive decline characteristic of Alzheimer’s disease. These results have implications for investigative trials as well as clinical practice, the investigators noted.

Like p-tau181, neurofilament light chain (NfL) is associated with imaging markers of degeneration and cognitive decline; in contrast to the findings related to p-tau181, however, the associations between NfL and these outcomes are not specific to Alzheimer’s disease. Using both biomarkers could improve prediction of outcomes and patient monitoring, according to the researchers.

“These findings demonstrate that p-tau181 and NfL in blood have individual and complementary potential roles in the diagnosis and the monitoring of neurodegenerative disease,” said coinvestigator Michael Schöll, PhD, senior lecturer in psychiatry and neurochemistry at the University of Gothenburg (Sweden).

“With the reservation that we did not assess domain-specific cognitive impairment, p-tau181 was also more strongly associated with cognitive decline than was NfL,” Dr. Schöll added.

The findings were published online Jan. 11, 2021, in JAMA Neurology.

Biomarker-tracked neurodegeneration

Monitoring a patient’s neurodegenerative changes is important for tracking Alzheimer’s disease progression. Although clinicians can detect amyloid-beta and tau pathology using PET and cerebrospinal fluid (CSF) biomarkers, widespread use of these methods has been hampered by cost and the limited availability of necessary equipment. Blood-based biomarkers are not limited in these ways, so they could aid in diagnosis and patient monitoring.

Previous studies have suggested that p-tau181 is a marker of Alzheimer’s disease status.

In the current study, investigators examined whether baseline and longitudinal levels of p-tau181 in plasma were associated with progressive neurodegeneration related to the disease. They analyzed data from the Alzheimer’s Disease Neuroimaging Initiative, a multicenter study designed to identify biomarkers for the detection and tracking of Alzheimer’s disease.

The researchers selected data for cognitively unimpaired and cognitively impaired participants who participated in the initiative between Feb. 1, 2007, and June 6, 2016. Participants were eligible for inclusion if plasma p-tau181 and NfL data were available for them and if they had undergone at least one 18F-fluorodeoxyglucose (FDG) PET scan or structural T1 MRI at the same study visit. Most had also undergone imaging with 18F-florbetapir, which detects amyloid-beta.

A single-molecule array was used to analyze concentrations of p-tau181 and NfL in participants’ blood samples. Outliers for p-tau181 and NfL concentrations were excluded from further analysis. Using participants’ FDG-PET scans, the investigators measured glucose hypometabolism characteristic of Alzheimer’s disease. They used T1-weighted MRI scans to measure gray-matter volume.

Cognitively unimpaired participants responded to the Preclinical Alzheimer Cognitive Composite, a measure designed to detect early cognitive changes in cognitively normal patients with Alzheimer’s disease pathology. Cognitively impaired participants underwent the 13-task version of the Alzheimer Disease Assessment Scale–Cognitive Subscale to assess the severity of cognitive impairment.

The researchers included 1,113 participants (54% men; 89% non-Hispanic Whites; mean age, 74 years) in their analysis. In all, 378 participants were cognitively unimpaired, and 735 were cognitively impaired. Of the latter group, 73% had mild cognitive impairment, and 27% had Alzheimer’s disease dementia.

Atrophy predictor

Results showed that higher plasma p-tau181 levels at baseline were associated with more rapid progression of hypometabolism and atrophy in areas vulnerable to Alzheimer’s disease among cognitively impaired participants (FDG-PET standardized uptake value ratio change, r = –0.28; P < .001; gray-matter volume change, r = –0.28; P < .001).

The association with atrophy progression in cognitively impaired participants was stronger for p-tau181 than for NfL.

Plasma p-tau181 levels at baseline also predicted atrophy in temporoparietal regions vulnerable to Alzheimer’s disease among cognitively unimpaired participants (r = –0.11; P = .03). NfL, however, was associated with progressive atrophy in frontal regions among cognitively unimpaired participants.

At baseline, plasma p-tau181 levels were associated with prospective cognitive decline in both the cognitively unimpaired group (r = −0.12; P = .04) and the cognitively impaired group (r = 0.35; P < .001). However, plasma NfL was linked to cognitive decline only among those who were cognitively impaired (r = 0.26; P < .001).

Additional analyses showed that p-tau181, unlike NfL, was associated with hypometabolism and atrophy only in participants with amyloid-beta pathology, regardless of cognitive status.

Between 25% and 45% of the association between baseline p-tau181 level and cognitive decline was mediated by baseline imaging markers of neurodegeneration. This finding suggests that another factor, such as regional tau pathology, might have an independent and direct effect on cognition, Dr. Schöll noted.

Furthermore, changes over time in p-tau181 levels were associated with cognitive decline in the cognitively unimpaired (r = –0.24; P < .001) and cognitively impaired (r = 0.34; P < .001) participants. Longitudinal changes in this biomarker were also associated with a prospective decrease in glucose metabolism in cognitively unimpaired (r = –0.05; P = .48) and cognitively impaired (r = –0.27; P < .001) participants, but the association was significant only in the latter group.

Changes over time in p-tau181 levels were linked to prospective decreases in gray-matter volume in brain regions highly characteristic of Alzheimer’s disease in those who were cognitively unimpaired (r = –0.19; P < .001) and those who were cognitively impaired (r = –0.31; P < .001). However, these associations were observed only in patients with amyloid-beta pathology.

Dr. Schöll noted that blood-based biomarkers that are sensitive to Alzheimer’s disease could greatly expand patients’ access to a diagnostic workup and could improve screening for clinical trials.

“While the final validation of the existence and the monitoring of potential changes of neuropathology in vivo is likely to be conducted using neuroimaging modalities such as PET, our results suggest that at least a part of these examinations could be replaced by regular blood tests,” Dr. Schöll said.

Lead author Alexis Moscoso, PhD, a postdoctoral researcher in psychiatry and neurochemistry at the University of Gothenburg, reported that the researchers will continue validating blood-based biomarkers, especially against established and well-validated neuroimaging methods. “We are also hoping to be able to compare existing and novel blood-based Alzheimer’s disease biomarkers head to head to establish the individual roles each of these play in the research and diagnosis of Alzheimer’s disease,” Dr. Moscoso said.

‘Outstanding study’

Commenting on the findings, David S. Knopman, MD, professor of neurology at Mayo Clinic, Rochester, Minn., said that this is “an outstanding study” because of its large number of participants and because the investigators are “world leaders in the technology of measuring plasma p-tau and NfL.”

Dr. Knopman, who was not involved with the research, noted that the study had no substantive weaknesses.

“The biggest advantages of a blood-based biomarker over CSF- and PET-based biomarkers of Alzheimer disease are the obvious ones of accessibility, cost, portability, and ease of repeatability,” he said.

“As CSF and PET exams are largely limited to major medical centers, valid blood-based biomarkers of Alzheimer disease that are reasonably specific make large-scale epidemiological studies that investigate dementia etiologies in rural or urban and diverse communities feasible,” he added.

Whereas p-tau181 appears to be specific for plaque and tangle disease, NfL is a nonspecific marker of neurodegeneration.

“Each has a role that could be valuable, depending on the circumstance,” said Dr. Knopman. “Plasma NfL has already proved itself useful in frontotemporal degeneration and chronic traumatic encephalopathy, for example.”

He noted that future studies should examine how closely p-tau181 and NfL align with more granular and direct measures of Alzheimer’s disease–related brain pathologies.

“There has got to be some loss of fidelity in detecting abnormality in going from brain tissue to blood, which might siphon off some time-related and severity-related information,” said Dr. Knopman.

“The exact role that plasma p-tau and NfL will play remains to be seen, because the diagnostic information that these biomarkers provide is contingent on the existence of interventions that require specific or nonspecific information about progressive neurodegeneration due to Alzheimer disease,” he added.

The study was funded by grants from the Spanish Instituto de Salud Carlos III, the BrightFocus Foundation, the Swedish Alzheimer Foundation, and the Swedish Brain Foundation. Dr. Schöll reported serving on a scientific advisory board for Servier on matters unrelated to this study. Dr. Moscoso and Dr. Knopman have reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue: Neurology Reviews 29(3)


Publish date: January 22, 2021

Type of Alzheimer’s disease with intact memory offers new research paths


Patients with a rare type of Alzheimer’s disease do not show the memory loss characteristic of the condition even over the long term, new research suggests. They also show some differences in neuropathology to typical patients with Alzheimer’s disease, raising hopes of discovering novel mechanisms that might protect against memory loss in typical forms of the disease.

“We are discovering that Alzheimer’s disease has more than one form. While the typical patient with Alzheimer’s disease will have impaired memory, patients with primary progressive aphasia linked to Alzheimer’s disease are quite different. They have problems with language – they know what they want to say but can’t find the words – but their memory is intact,” said lead author Marsel Mesulam, MD.

“We have found that these patients still show the same levels of neurofibrillary tangles which destroy neurons in the memory part of the brain as typical patients with Alzheimer’s disease, but in patients with primary progressive aphasia Alzheimer’s, the nondominant side of this part of the brain showed less atrophy,” added Dr. Mesulam, who is director of the Mesulam Center for Cognitive Neurology and Alzheimer’s Disease at Northwestern University, Chicago. “It appears that these patients are more resilient to the effects of the neurofibrillary tangles.”

The researchers also found that two biomarkers that are established risk factors in typical Alzheimer’s disease do not appear to be risk factors for the primary progressive aphasia (PPA) form of the condition.

“These observations suggest that there are mechanisms that may protect the brain from Alzheimer’s-type damage. Studying these patients with this primary progressive aphasia form of Alzheimer’s disease may give us clues as to where to look for these mechanisms that may lead to new treatments for the memory loss associated with typical Alzheimer’s disease,” Dr. Mesulam commented.

The study was published online Jan. 13 in Neurology.

PPA is diagnosed when language impairment emerges on a background of preserved memory and behavior, with about 40% of cases representing atypical manifestations of Alzheimer’s disease, the researchers explained.

“While we knew that the memories of people with primary progressive aphasia were not affected at first, we did not know if they maintained their memory functioning over years,” Dr. Mesulam noted.

The current study aimed to investigate whether the memory preservation in PPA linked to Alzheimer’s disease is a consistent core feature or a transient finding confined to initial presentation, and to explore the underlying pathology of the condition.

The researchers searched their database to identify patients with PPA who had autopsy or biomarker evidence of Alzheimer’s disease and who also had at least two consecutive visits at which language and memory had been assessed with the same tests. The study included 17 patients with PPA-type Alzheimer’s disease, who were compared with 14 patients who had typical Alzheimer’s disease with memory loss.

The authors pointed out that characterization of memory in patients with PPA is challenging because most tests use word lists, and thus patients may fail the test because of their language impairments. To address this issue, they included patients with PPA who had undergone memory tests involving recalling pictures of common objects.

Patients with typical Alzheimer’s disease underwent similar tests but used a list of common words.

A second round of tests was conducted in the primary progressive aphasia group an average of 2.4 years later and in the typical Alzheimer’s disease group an average of 1.7 years later.

Brain scans were also available for the patients with PPA, as well as postmortem evaluations for eight of the PPA cases and all the typical Alzheimer’s disease cases.

Results showed that patients with PPA had no decline in their memory skills when they took the tests a second time. At that point, they had been showing symptoms of the disorder for an average of 6 years. In contrast, their language skills declined significantly during the same period. For typical patients with Alzheimer’s disease, verbal memory and language skills declined with equal severity during the study.

Postmortem results showed that the two groups had comparable degrees of Alzheimer’s disease pathology in the medial temporal lobe – the main area of the brain affected in dementia.

However, MRI scans showed that patients with PPA had asymmetrical atrophy of the dominant (left) hemisphere with sparing of the right-sided medial temporal lobe, indicating a lack of neurodegeneration in the nondominant hemisphere despite the presence of Alzheimer’s disease pathology.

It was also found that the patients with PPA had a significantly lower prevalence of two factors strongly linked to Alzheimer’s disease – TDP-43 pathology and APOE ε4 positivity – than the typical patients with Alzheimer’s disease.

The authors concluded that “primary progressive aphasia Alzheimer’s syndrome offers unique opportunities for exploring the biological foundations of these phenomena that interactively modulate the impact of Alzheimer’s disease neuropathology on cognitive function.”
‘Preservation of cognition is the holy grail’

In an accompanying editorial, Seyed Ahmad Sajjadi, MD, University of California, Irvine; Sharon Ash, PhD, University of Pennsylvania, Philadelphia; and Stefano Cappa, MD, University School for Advanced Studies, Pavia, Italy, said these findings have important implications, “as ultimately, preservation of cognition is the holy grail of research in this area.”

They pointed out that the current observations imply “an uncoupling of neurodegeneration and pathology” in patients with PPA-type Alzheimer’s disease, adding that “it seems reasonable to conclude that neurodegeneration, and not mere presence of pathology, is what correlates with clinical presentation in these patients.”

The editorialists noted that the study has some limitations: the sample size is relatively small, not all patients with PPA-type Alzheimer’s disease underwent autopsy, MRI was only available for the aphasia group, and the two groups had different memory tests for comparison of their recognition memory.

But they concluded that this study “provides important insights about the potential reasons for differential vulnerability of the neural substrate of memory in those with different clinical presentations of Alzheimer’s disease pathology.”

The study was supported by the National Institute on Deafness and Communication Disorders, the National Institute of Neurological Disorders and Stroke, the National Institute on Aging, the Davee Foundation, and the Jeanine Jones Fund.

A version of this article first appeared on Medscape.com.

Issue: Neurology Reviews - 29(3)


Publish date: January 22, 2021

Bedside EEG test aids prognosis in patients with brain injury


A simple, noninvasive EEG may help detect residual cognition in unresponsive patients who have experienced a traumatic brain injury (TBI), results of a new study suggest. The study showed that the use of a paradigm that measures the strength of responses to speech improved the accuracy of prognosis for these patients, compared with prognoses made solely on the basis of standard clinical characteristics.

“What we found is really compelling evidence” of the usefulness of the test, lead study author Rodika Sokoliuk, PhD, a postdoctoral researcher at the Center for Human Brain Health, University of Birmingham (England), said in an interview.

The passive measure of comprehension, which doesn’t require any other response from the patient, can reduce uncertainty at a critical phase of decision-making in the ICU, said Dr. Sokoliuk.

The study was published online Dec. 23, 2020, in Annals of Neurology.
 

Useful information at a time of ‘considerable prognostic uncertainty’

Accurate, early prognostication is vital for efficient stratification of patients after a TBI, the authors wrote. This can often be achieved from patient behavior and CT at admission, but some patients continue to fail to obey commands after washout of sedation.

These patients pose a significant challenge for neurologic prognostication, they noted. In these cases, clinicians and families must decide whether to “wait and see” or consider treatment withdrawal.

The authors noted that a lack of command following early in the postsedation period is associated with poor outcome, including vegetative state/unresponsive wakefulness syndrome (VS/UWS). This, they said, represents a “window of opportunity” for cessation of life-sustaining therapy at a time of considerable prognostic uncertainty.

Recent research shows that a significant proportion of unresponsive patients retain a level of cognition, and even consciousness, that isn’t evident from their external behavior – the so-called cognitive-motor dissociation.

The new study included 28 adult patients who had experienced a TBI and were admitted to the ICU of the Queen Elizabeth Hospital in Birmingham, England. The patients had a Glasgow Coma Scale motor score less than 6 (i.e., they were incapable of obeying commands). They had been sedation free for 2-7 days.

For the paradigm, the researchers constructed 288 English words using the male voice of the Apple speech synthesizer. Each word was monosyllabic and lasted the same amount of time (320 ms), so the rhythm of the sound stream was constant.

The words were presented in a specific order: an adjective, then a noun, then a verb, then a noun. Two words – for example, an adjective and noun – “would build a meaningful phrase,” and four words would build a sentence, said Dr. Sokoliuk.

The researchers built 72 of these four-word sentences. Each trial comprised 12 of these sentences, for a total of 864 sentence presentations across the experiment.

Dr. Sokoliuk likened the paradigm to a rap song with a specific beat that is continually repeated. “Basically, we play 12 of these four-word sentences in a row, without any gaps,” she said.
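The timing structure described above implies fixed rhythms at the word, phrase, and sentence levels. A minimal sketch of that arithmetic (based only on the 320-ms word duration reported here; the variable names are illustrative, not the authors’ code):

```python
# Rhythms implied by the stimulus design: 320-ms monosyllabic words,
# two-word phrases, and four-word sentences, played as a continuous stream.
WORD_DURATION_S = 0.320

word_rate_hz = 1 / WORD_DURATION_S            # one word every 320 ms -> 3.125 Hz
phrase_rate_hz = 1 / (2 * WORD_DURATION_S)    # two-word phrases -> 1.5625 Hz
sentence_rate_hz = 1 / (4 * WORD_DURATION_S)  # four-word sentences -> 0.78125 Hz

print(word_rate_hz, phrase_rate_hz, sentence_rate_hz)
```

Because the stream contains no acoustic gaps, synchronization of brain activity at the phrase and sentence rates cannot be driven by sound alone, which is why it is taken as a marker of comprehension.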

Each sentence was played to patients, in random order, a minimum of eight and a maximum of nine times per patient throughout the experiment. The patients’ brain activity was recorded on EEG.

Dr. Sokoliuk noted that brain activity in healthy people synchronizes only with the rhythm of phrases and sentences when listeners consciously comprehend the speech. The researchers assessed the level of comprehension in the unresponsive patients by measuring the strength of this synchronicity or brain pattern.

After exclusions, 17 patients were available for outcome assessment 3 months post EEG, and 16 patients were available 6 months post EEG.

The analysis showed that outcome significantly correlated with the strength of patients’ acute cortical tracking of phrases and sentences (r > 0.6; P < .007), quantified by intertrial phase coherence.

Linear regressions revealed that the strength of this comprehension response (beta, 0.603; P = .006) significantly improved the accuracy of prognoses relative to clinical characteristics alone, such as the Glasgow Coma Scale or CT grade.

Previous studies showed that, if there is no understanding of the language used or if the subject is asleep, the brain doesn’t have the “signature” of tracking phrases and sentences, so it doesn’t have the synchronicity or the pattern of individuals with normal cognition, said Dr. Sokoliuk.

“You need a certain level of consciousness, and you need to understand the language, so your brain can actually track sentences or phrases,” she said.

Dr. Sokoliuk explained that the paradigm shows that patients are understanding the sentences and are not just hearing them.

“It’s not showing us that they only hear it, because there are no obvious gaps between the sentences; if there were gaps between sentences, it would probably only show that they hear it. It could be both, that they hear and understand it, but we wouldn’t know.”

A receiver operating characteristic (ROC) analysis indicated 100% sensitivity and 80% specificity for distinguishing bad outcomes (death, VS/UWS) from good outcomes at 6 months.

“We could actually define a threshold of the tracking,” said Dr. Sokoliuk. “Patients who had phrases and sentences tracking below this threshold had worse outcome than those whose tracking value was above this threshold.”
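As a worked illustration of how a single tracking threshold yields sensitivity and specificity figures like those reported, consider the sketch below (the tracking values and threshold are hypothetical, not the study’s data):

```python
def sens_spec(scores_bad, scores_good, threshold):
    """Bad outcome is predicted when tracking strength falls below threshold."""
    tp = sum(s < threshold for s in scores_bad)    # bad outcomes correctly flagged
    fn = len(scores_bad) - tp                      # bad outcomes missed
    tn = sum(s >= threshold for s in scores_good)  # good outcomes correctly cleared
    fp = len(scores_good) - tn                     # good outcomes wrongly flagged
    return tp / (tp + fn), tn / (tn + fp)

bad = [0.05, 0.08, 0.10, 0.12]          # hypothetical tracking values, bad outcome
good = [0.11, 0.18, 0.22, 0.25, 0.30]   # hypothetical tracking values, good outcome
sens, spec = sens_spec(bad, good, threshold=0.13)
print(sens, spec)  # every bad outcome falls below the cut; one good case is misflagged
```

With these made-up numbers, all patients with bad outcomes sit below the cutoff (sensitivity 1.0) while one good-outcome patient is misclassified (specificity 0.8), mirroring the shape of the reported result.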

The study illustrates that some posttraumatic patients who remain in an unresponsive state despite being sedation free may nevertheless comprehend speech.

The EEG paradigm approach, the authors said, may significantly reduce prognostic uncertainty in a critical phase of medical decision-making. It could also help clinicians make more appropriate decisions about whether or not to continue life-sustaining therapy and ensure more appropriate distribution of limited rehabilitation resources to patients most likely to benefit.

Dr. Sokoliuk stressed that the paradigm could be used at the bedside soon after a brain injury. “The critical thing is, we can actually use it during the acute phase, which is very important for clinical decisions about life-sustaining methods, therapy, and long-term care.”
 

 

 

A prognostic tool

The simple approach promises to be more accessible than fMRI, said Dr. Sokoliuk. “Putting an unresponsive coma patient in a scanner is very difficult and also much more expensive,” she said.

The next step, said Dr. Sokoliuk, is to repeat the study with a larger sample. “The number in the current study was quite small, and we can’t say if the sensitivity of the paradigm is strong enough to use it as a standard prognostic tool.”

To use it in clinical setting, “we really have to have robust measures,” she added.

She aims to conduct a collaborative study involving several institutions and more patients.

The research team plans to eventually build “an open-access toolbox” that would include the auditory streams to be played during EEG recordings and a program to analyze the data, said Dr. Sokoliuk. “Then, in the end, you would get a threshold or a value of tracking for phrases and sentences, and this could then classify a patient to be in a good-outcome or in bad-outcome group.”

She stressed this is a prognostic tool, not a diagnostic tool, and it should not be used in isolation. “It’s important to know that no clinician should only use this paradigm to prognosticate a patient; our paradigm should be part of a bigger battery of tests.”

But it could go a long way toward helping families as well as physicians. “If they know that the patient would be better in 3 months’ time, it’s easier for them to decide what should come next,” she said.

And it’s heartening to know that when families talk to their unresponsive loved one, the patient understands them, she added.
 

Promising basic research

Commenting on the study in an interview, Christine Blume, PhD, of the Center for Chronobiology, University of Basel (Switzerland), whose research interests include cognitive processing of patients with disorders of consciousness, described it as “very elegant and appealing” and the paradigm it used as “really promising.”

“However, we do, of course, not yet know about the prognostic value on a single-subject level, as the authors performed only group analyses,” said Dr. Blume. “This will require more extensive and perhaps even multicenter studies.”

It would also require developing a “solution” that “allows clinicians with limited time resources and perhaps lacking expert knowledge on the paradigm and the necessary analyses to apply the paradigm at bedside,” said Dr. Blume.

She agreed that a passive paradigm that helps determine whether a patient consciously understands speech, without the need for further processing, “has the potential to really improve the diagnostic process and uncover covert consciousness.”

One should bear in mind, though, that the paradigm “makes one essential assumption: that patients can understand speech,” said Dr. Blume. “For example, an aphasic patient might not understand but still be conscious.”

In this context, she added, “it’s essential to note that while the presence of a response suggests consciousness, the absence of a response does not suggest the absence of consciousness.”

Dr. Blume cautioned that the approach used in the study “is still at the stage of basic research.” Although the paradigm is promising, “I do not think it is ‘around the corner,’ ” she said.

The study was funded by the Medical Research Council. It was further supported by the National Institute for Health Research Surgical Reconstruction and Microbiology Research Center. Dr. Sokoliuk and Dr. Blume have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue: Neurology Reviews - 29(2)


“We could actually define a threshold of the tracking,” said Dr. Sokoliuk. “Patients who had phrases and sentences tracking below this threshold had worse outcome than those whose tracking value was above this threshold.”

The study illustrates that some posttraumatic patients who remain in an unresponsive state despite being sedation free may nevertheless comprehend speech.

The EEG paradigm approach, the authors said, may significantly reduce prognostic uncertainty in a critical phase of medical decision-making. It could also help clinicians make more appropriate decisions about whether or not to continue life-sustaining therapy and ensure more appropriate distribution of limited rehabilitation resources to patients most likely to benefit.

Dr. Sokoliuk stressed that the paradigm could be used at the bedside soon after a brain injury. “The critical thing is, we can actually use it during the acute phase, which is very important for clinical decisions about life-sustaining methods, therapy, and long-term care.”
 

 

 

A prognostic tool

The simple approach promises to be more accessible than fMRI, said Dr. Sokoliuk. “Putting an unresponsive coma patient in a scanner is very difficult and also much more expensive,” she said.

The next step, said Dr. Sokoliuk, is to repeat the study with a larger sample. “The number in the current study was quite small, and we can’t say if the sensitivity of the paradigm is strong enough to use it as a standard prognostic tool.”

To use it in clinical setting, “we really have to have robust measures,” she added.

She aims to conduct a collaborative study involving several institutions and more patients.

The research team plans to eventually build “an open-access toolbox” that would include the auditory streams to be played during EEG recordings and a program to analyze the data, said Dr. Sokoliuk. “Then, in the end, you would get a threshold or a value of tracking for phrases and sentences, and this could then classify a patient to be in a good-outcome or in bad-outcome group.”

She stressed this is a prognostic tool, not a diagnostic tool, and it should not be used in isolation. “It’s important to know that no clinician should only use this paradigm to prognosticate a patient; our paradigm should be part of a bigger battery of tests.”

But it could go a long way toward helping families as well as physicians. “If they know that the patient would be better in 3 months’ time, it’s easier for them to decide what should come next,” she said.

And it’s heartening to know that when families talk to their unresponsive loved one, the patient understands them, she added.
 

Promising basic research

Commenting on the study in an interview, Christine Blume, PhD, of the Center for Chronobiology, University of Basel (Switzerland), whose research interests include cognitive processing of patients with disorders of consciousness, described it as “very elegant and appealing” and the paradigm it used as “really promising.”

“However, we do, of course, not yet know about the prognostic value on a single-subject level, as the authors performed only group analyses,” said Dr. Blume. “This will require more extensive and perhaps even multicenter studies.”

It would also require developing a “solution” that “allows clinicians with limited time resources and perhaps lacking expert knowledge on the paradigm and the necessary analyses to apply the paradigm at bedside,” said Dr. Blume.

She agreed that a passive paradigm that helps determine whether a patient consciously understands speech, without the need for further processing, “has the potential to really improve the diagnostic process and uncover covert consciousness.”

One should bear in mind, though, that the paradigm “makes one essential assumption: that patients can understand speech,” said Dr. Blume. “For example, an aphasic patient might not understand but still be conscious.”

In this context, she added, “it’s essential to note that while the presence of a response suggests consciousness, the absence of a response does not suggest the absence of consciousness.”

Dr. Blume cautioned that the approach used in the study “is still at the stage of basic research.” Although the paradigm is promising, “I do not think it is ‘around the corner,’ ” she said.

The study was funded by the Medical Research Council. It was further supported by the National Institute for Health Research Surgical Reconstruction and Microbiology Research Center. Dr. Sokoliuk and Dr. Blume have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

A simple, noninvasive EEG may help detect residual cognition in unresponsive patients who have experienced a traumatic brain injury (TBI), results of a new study suggest. The study showed that the use of a paradigm that measures the strength of responses to speech improved the accuracy of prognosis for these patients, compared with prognoses made solely on the basis of standard clinical characteristics.

“What we found is really compelling evidence” of the usefulness of the test, lead study author Rodika Sokoliuk, PhD, a postdoctoral researcher at the Center for Human Brain Health, University of Birmingham (England), said in an interview.

The passive measure of comprehension, which doesn’t require any other response from the patient, can reduce uncertainty at a critical phase of decision-making in the ICU, said Dr. Sokoliuk.

The study was published online Dec. 23, 2020, in Annals of Neurology.
 

Useful information at a time of ‘considerable prognostic uncertainty’

Accurate, early prognostication is vital for efficient stratification of patients after a TBI, the authors wrote. This can often be achieved from patient behavior and CT at admission, but some patients continue to fail to obey commands after washout of sedation.

These patients pose a significant challenge for neurologic prognostication, they noted. In these cases, clinicians and families must decide whether to “wait and see” or consider treatment withdrawal.

The authors noted that a lack of command following early in the postsedation period is associated with poor outcome, including vegetative state/unresponsive wakefulness syndrome (VS/UWS). This, they said, represents a “window of opportunity” for cessation of life-sustaining therapy at a time of considerable prognostic uncertainty.

Recent research shows that a significant proportion of unresponsive patients retain a level of cognition, and even consciousness, that isn’t evident from their external behavior – the so-called cognitive-motor dissociation.

The new study included 28 adult patients who had experienced a TBI and were admitted to the ICU of the Queen Elizabeth Hospital in Birmingham, England. The patients had a Glasgow Coma Scale motor score less than 6 (i.e., they were incapable of obeying commands). They had been sedation free for 2-7 days.

For the paradigm, the researchers synthesized 288 English words using a male voice from the Apple speech synthesizer. Each word was monosyllabic and took the same amount of time to play (320 ms), so the rhythm of the speech stream was constant.

The words were presented in a specific order: an adjective, then a noun, then a verb, then a noun. Two words – for example, an adjective and noun – “would build a meaningful phrase,” and four words would build a sentence, said Dr. Sokoliuk.

The researchers built 72 of these four-word sentences. Each trial comprised 12 of them, yielding a total of 864 four-word sentence presentations.

Dr. Sokoliuk likened the paradigm to a rap song with a specific beat that is continually repeated. “Basically, we play 12 of these four-word sentences in a row, without any gaps,” she said.
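The timing above implies fixed presentation rates for words, phrases, and sentences, which is what the EEG analysis keys on. As a minimal sketch of that arithmetic (the 320-ms word duration is reported above; the derived rates are simple consequences, not figures from the paper):

```python
# Each synthesized word lasts 320 ms and words play back to back,
# so units of one, two, and four words recur at fixed rates.
word_duration_s = 0.320

word_rate_hz = 1 / word_duration_s            # one word every 320 ms, ~3.125 Hz
phrase_rate_hz = 1 / (2 * word_duration_s)    # two-word phrases, ~1.5625 Hz
sentence_rate_hz = 1 / (4 * word_duration_s)  # four-word sentences, ~0.78125 Hz

print(f"{word_rate_hz:.5g} {phrase_rate_hz:.5g} {sentence_rate_hz:.5g} Hz")
```

A brain that merely follows the acoustics should synchronize at the word rate; synchronization at the phrase and sentence rates requires grouping the words linguistically.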

Each sentence was played to patients in random order, eight or nine times per patient, over the course of the experiment. The patients’ brain activity was recorded on EEG.

Dr. Sokoliuk noted that brain activity in healthy people synchronizes only with the rhythm of phrases and sentences when listeners consciously comprehend the speech. The researchers assessed the level of comprehension in the unresponsive patients by measuring the strength of this synchronicity or brain pattern.

After exclusions, 17 patients were available for outcome assessment 3 months post EEG, and 16 patients were available 6 months post EEG.

The analysis showed that outcome significantly correlated with the strength of patients’ acute cortical tracking of phrases and sentences (r > 0.6; P < .007), quantified by intertrial phase coherence.

Linear regressions revealed that the strength of this comprehension response (beta, 0.603; P = .006) significantly improved the accuracy of prognoses relative to clinical characteristics alone, such as the Glasgow Coma Scale or CT grade.

Previous studies showed that, if there is no understanding of the language used or if the subject is asleep, the brain doesn’t have the “signature” of tracking phrases and sentences, so it doesn’t have the synchronicity or the pattern of individuals with normal cognition, said Dr. Sokoliuk.

“You need a certain level of consciousness, and you need to understand the language, so your brain can actually track sentences or phrases,” she said.

Dr. Sokoliuk explained that the paradigm shows that patients are understanding the sentences and are not just hearing them.

“It’s not showing us that they only hear it, because there are no obvious gaps between the sentences; if there were gaps between sentences, it would probably only show that they hear it. It could be both, that they hear and understand it, but we wouldn’t know.”

A receiver operating characteristic (ROC) analysis indicated 100% sensitivity and 80% specificity for distinguishing bad outcome (death, VS/UWS) from good outcome at 6 months.

“We could actually define a threshold of the tracking,” said Dr. Sokoliuk. “Patients who had phrases and sentences tracking below this threshold had worse outcome than those whose tracking value was above this threshold.”

The study illustrates that some posttraumatic patients who remain in an unresponsive state despite being sedation free may nevertheless comprehend speech.

The EEG paradigm approach, the authors said, may significantly reduce prognostic uncertainty in a critical phase of medical decision-making. It could also help clinicians make more appropriate decisions about whether or not to continue life-sustaining therapy and ensure more appropriate distribution of limited rehabilitation resources to patients most likely to benefit.

Dr. Sokoliuk stressed that the paradigm could be used at the bedside soon after a brain injury. “The critical thing is, we can actually use it during the acute phase, which is very important for clinical decisions about life-sustaining methods, therapy, and long-term care.”

A prognostic tool

The simple approach promises to be more accessible than fMRI, said Dr. Sokoliuk. “Putting an unresponsive coma patient in a scanner is very difficult and also much more expensive,” she said.

The next step, said Dr. Sokoliuk, is to repeat the study with a larger sample. “The number in the current study was quite small, and we can’t say if the sensitivity of the paradigm is strong enough to use it as a standard prognostic tool.”

To use it in clinical setting, “we really have to have robust measures,” she added.

She aims to conduct a collaborative study involving several institutions and more patients.

The research team plans to eventually build “an open-access toolbox” that would include the auditory streams to be played during EEG recordings and a program to analyze the data, said Dr. Sokoliuk. “Then, in the end, you would get a threshold or a value of tracking for phrases and sentences, and this could then classify a patient to be in a good-outcome or in bad-outcome group.”

She stressed this is a prognostic tool, not a diagnostic tool, and it should not be used in isolation. “It’s important to know that no clinician should only use this paradigm to prognosticate a patient; our paradigm should be part of a bigger battery of tests.”

But it could go a long way toward helping families as well as physicians. “If they know that the patient would be better in 3 months’ time, it’s easier for them to decide what should come next,” she said.

And it’s heartening to know that when families talk to their unresponsive loved one, the patient understands them, she added.
 

Promising basic research

Commenting on the study in an interview, Christine Blume, PhD, of the Center for Chronobiology, University of Basel (Switzerland), whose research interests include cognitive processing of patients with disorders of consciousness, described it as “very elegant and appealing” and the paradigm it used as “really promising.”

“However, we do, of course, not yet know about the prognostic value on a single-subject level, as the authors performed only group analyses,” said Dr. Blume. “This will require more extensive and perhaps even multicenter studies.”

It would also require developing a “solution” that “allows clinicians with limited time resources and perhaps lacking expert knowledge on the paradigm and the necessary analyses to apply the paradigm at bedside,” said Dr. Blume.

She agreed that a passive paradigm that helps determine whether a patient consciously understands speech, without the need for further processing, “has the potential to really improve the diagnostic process and uncover covert consciousness.”

One should bear in mind, though, that the paradigm “makes one essential assumption: that patients can understand speech,” said Dr. Blume. “For example, an aphasic patient might not understand but still be conscious.”

In this context, she added, “it’s essential to note that while the presence of a response suggests consciousness, the absence of a response does not suggest the absence of consciousness.”

Dr. Blume cautioned that the approach used in the study “is still at the stage of basic research.” Although the paradigm is promising, “I do not think it is ‘around the corner,’ ” she said.

The study was funded by the Medical Research Council. It was further supported by the National Institute for Health Research Surgical Reconstruction and Microbiology Research Center. Dr. Sokoliuk and Dr. Blume have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews- 29(2)

FROM ANNALS OF NEUROLOGY

Publish date: January 12, 2021

Ultrasound ablation for Parkinson’s disease: Benefit limited by adverse effects

Focused ultrasound for ablation of the subthalamic nucleus in one hemisphere improved motor features in a selected group of patients with markedly asymmetric Parkinson’s disease, but was associated with a high rate of adverse events, including dyskinesias and other neurologic complications, in a new randomized, sham-controlled trial.

“Longer-term and larger trials are needed to determine the role of focused ultrasound subthalamotomy in the management of Parkinson’s disease and its effect as compared with other available treatments, including deep-brain stimulation,” the authors concluded.

The trial was published online Dec. 24, 2020, in the New England Journal of Medicine.

An accompanying editorial concluded that the high rate of adverse events and the lack of ability to modulate treatment over time to treat prominent tremor “raise questions about the appropriate implementation of focused ultrasound–produced lesions for the treatment of Parkinson’s disease.”
 

A scalpel-free alternative to brain surgery

The study authors, led by Raul Martinez-Fernandez, MD, PhD, University Hospital HM Puerta del Sur, Mostoles, Spain, explained that, in severe cases of refractory motor manifestations such as tremor and motor complications, a neurosurgical approach using deep-brain stimulation of the subthalamic nucleus can be used. But to avoid craniotomy and electrode penetration, MRI-guided focused ultrasound for the ablation of deep-brain structures, including the subthalamic nucleus, is being investigated as a treatment for Parkinson’s disease.

Patients are potential candidates for ultrasound ablation if they have prominently asymmetric parkinsonism, if they are not considered to be clinically suitable candidates for surgery because of contraindications, or if they are reluctant to undergo a brain operation or to have an implanted device.

The current trial involved 40 patients with markedly asymmetric Parkinson’s disease who had motor signs not fully controlled by medication or who were ineligible for deep-brain stimulation surgery. They were randomly assigned in a 2:1 ratio to undergo focused ultrasound subthalamotomy on the side opposite their main motor signs or a sham procedure.

Results showed that the mean Movement Disorder Society–Unified Parkinson’s Disease Rating Scale part III (MDS-UPDRS III) motor score for the more affected side – which was the primary endpoint – decreased from 19.9 at baseline to 9.9 at 4 months in the active-treatment group (least-squares mean difference, 9.8 points); and from 18.7 to 17.1 in the control group (least-squares mean difference, 1.7 points). The between-group difference was 8.1 points (P < .001).

The change from baseline in the MDS-UPDRS III score for the more affected side in patients who underwent active treatment varied, ranging from 5% to 95%; the changes were qualitatively more evident for reduction of tremor and rigidity than for bradykinesia.

Adverse events in the active-treatment group were the following:

  • Dyskinesia in the off-medication state in six patients and in the on-medication state in six, which persisted in three and one, respectively, at 4 months.
  • Weakness on the treated side in five patients, which persisted in two at 4 months.
  • Speech disturbance in 15 patients, which persisted in three at 4 months.
  • Facial weakness in three patients, which persisted in one at 4 months.
  • in 13 patients, which persisted in two at 4 months.

In six patients in the active-treatment group, some of these deficits were present at 12 months.

The researchers noted that one suggested approach to reducing the risk of dyskinesias is to extend ablations dorsal to the subthalamic nucleus in order to interrupt the pallidothalamic-projecting neurons.

The study also showed a greater reduction in the use of dopaminergic medication in the active-treatment group versus the control group, but the researchers noted that the 95% confidence intervals for this and other secondary outcomes were not adjusted for multiple comparisons, so no definite conclusions can be drawn from these data.

They also pointed out that subthalamotomy was performed in one hemisphere, and the natural evolution of Parkinson’s disease eventually leads to motor impairment on both sides of the body in most patients.

“The likely need for an increase in the daily dose of levodopa equivalent to maintain function on the untreated side of the body could lead to the development of dyskinesias on the treated side. However, the few open-label studies of long-term (≥36 months) follow-up of radiofrequency subthalamotomy performed in one hemisphere do not provide support for this concern,” they said.
 

An important step, but improvements are needed

In an accompanying editorial, Joel S. Perlmutter, MD, and Mwiza Ushe, MD, Washington University, St. Louis, noted that surgical deep brain stimulation of the left and right subthalamic nuclei has shown a reduction in the severity of motor signs of 40%-60% and a reduction in medication use of up to 50%. But this technique involves a small craniotomy with implantation of stimulating electrodes, which has a 1%-5% risk of major adverse events such as hemorrhage, stroke, or infection.

Less severe complications include dystonia, dysarthria, gait impairment, dyskinesia, swallowing dysfunction, or change in verbal fluency; however, modification of the device programming may alleviate these effects. Nevertheless, some patients are wary of the implantation surgery and hardware and therefore decline to undergo deep-brain stimulation, the editorialists explained.

“The development of alternative procedures to deep-brain stimulation is important to the field of Parkinson’s disease treatment. The current trial begins the path to that goal, and improvements in targeting may improve the risk-benefit ratio and permit the use of lesions in both hemispheres, which would widen the population of eligible patients,” Dr. Perlmutter and Dr. Ushe wrote.

They pointed out that limiting the treatment to one side of the brain by ultrasound-produced lesioning constrains the application, since most patients with Parkinson’s disease have progression of symptoms on both sides of the body.

“The potential advantages and limitations of focused ultrasound–produced lesioning should be discussed with patients. We hope that improved technique will reduce the associated risks and increase the applicability of this provocative procedure,” the editorialists concluded.

This study was supported by Insightec, the Focused Ultrasound Foundation, Fundacion MAPFRE, Fundacion Hospitales de Madrid, and the University of Virginia Center of Excellence. Dr. Martinez-Fernandez reported receiving consultancy fees from Insightec. Dr. Ushe reported nonfinancial support from Abbott outside the submitted work. Dr. Perlmutter disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews- 29(2)


  • Dyskinesia in the off-medication state in six patients and in the on-medication state in six, which persisted in three and one, respectively, at 4 months.
  • Weakness on the treated side in five patients, which persisted in two at 4 months.
  • Speech disturbance in 15 patients, which persisted in three at 4 months.
  • Facial weakness in three patients, which persisted in one at 4 months.
  • in 13 patients, which persisted in two at 4 months.

In six patients in the active-treatment group, some of these deficits were present at 12 months.

The researchers noted that an approach that has been suggested to reduce the risk of dyskinesias has been to extend ablations dorsal to the subthalamic nucleus in order to interrupt the pallidothalamic-projecting neurons.

The study also showed a greater reduction in the use of dopaminergic medication in the active-treatment group versus the control group, but the researchers noted that the 95% confidence intervals for this and other secondary outcomes were not adjusted for multiple comparisons, so no definite conclusions can be drawn from these data.

They also pointed out that subthalamotomy was performed in one hemisphere, and the natural evolution of Parkinson’s disease eventually leads to motor impairment on both sides of the body in most patients.

“The likely need for an increase in the daily dose of levodopa equivalent to maintain function on the untreated side of the body could lead to the development of dyskinesias on the treated side. However, the few open-label studies of long-term (≥36 months) follow-up of radiofrequency subthalamotomy performed in one hemisphere do not provide support for this concern,” they said.
 

An important step, but improvements are needed

In an accompanying editorial, Joel S. Perlmutter, MD, and Mwiza Ushe, MD, Washington University, St. Louis, noted that surgical deep brain stimulation of the left and right subthalamic nuclei has shown a reduction in the severity of motor signs of 40%-60% and a reduction in medication use of up to 50%. But this technique involves a small craniotomy with implantation of stimulating electrodes, which has a 1%-5% risk of major adverse events such as hemorrhage, stroke, or infection.

Less severe complications include dystonia, dysarthria, gait impairment, dyskinesia, swallowing dysfunction, or change in verbal fluency; however, modification of the device programming may alleviate these effects. Nevertheless, some patients are wary of the implantation surgery and hardware and therefore decline to undergo deep-brain stimulation, the editorialists explained.

“The development of alternative procedures to deep-brain stimulation is important to the field of Parkinson’s disease treatment. The current trial begins the path to that goal, and improvements in targeting may improve the risk-benefit ratio and permit the use of lesions in both hemispheres, which would widen the population of eligible patients,” Dr. Perlmutter and Dr. Ushe wrote.

They pointed out that limiting the treatment to one side of the brain by ultrasound-produced lesioning constrains the application, since most patients with Parkinson’s disease have progression of symptoms on both sides of the body.

“The potential advantages and limitations of focused ultrasound–produced lesioning should be discussed with patients. We hope that improved technique will reduce the associated risks and increase the applicability of this provocative procedure,” the editorialists concluded.

This study was supported by Insightec, the Focused Ultrasound Foundation, Fundacion MAPFRE, Fundacion Hospitales de Madrid, and the University of Virginia Center of Excellence. Dr. Martinez-Fernandez reported receiving consultancy fees from Insightec. Dr. Ushe reported nonfinancial support from Abbott outside the submitted work. Dr. Perlmutter disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews- 29(2)
Publications
Topics
Article Source

FROM THE NEW ENGLAND JOURNAL OF MEDICINE

Publish date: January 7, 2021

Which imaging criteria identify progressive forms of MS?


The role of imaging in diagnosing progressive multiple sclerosis (MS) and in assessing prognosis is the subject of a new review.

MRI is central in the diagnostic work-up of patients suspected of having MS, given its high sensitivity in detecting disease dissemination in space and over time and its notable ability to exclude mimics of MS, the authors noted. However, diagnosis of primary progressive MS remains challenging and is only possible retrospectively on the basis of clinical assessment.

Identification of imaging features associated with primary progressive MS and features that predict evolution from relapsing-remitting MS to secondary progressive MS is an important, unmet need, they wrote.

Diagnosis of progressive MS is limited by difficulties in distinguishing accumulating disability caused by inflammatory disease activity from that attributable to degenerative processes associated with secondary progressive MS. Moreover, there are no accepted clinical criteria for diagnosing secondary progressive MS, the authors explained.

This need has promoted extensive research in the field of imaging, facilitated by definition of novel MRI sequences, to identify imaging features reflecting pathophysiological mechanisms relevant to the pathobiology of progressive MS, the authors said.

The current review reports the conclusions of a workshop held in Milan in November 2019, at which an expert panel of neurologists and neuroradiologists addressed the role of MRI in progressive MS.

Massimo Filippi, MD, IRCCS San Raffaele Scientific Institute, Milan, was the lead author of the review, which was published online Dec. 14, 2020, in JAMA Neurology.

The authors concluded that no definitive, qualitative clinical, immunologic, histopathologic, or neuroimaging features differentiate primary progressive and secondary progressive forms of MS; both are characterized by neurodegenerative phenomena and a gradual and irreversible accumulation of clinical disability, which is also affected by aging and comorbidities.

A definitive diagnosis of primary progressive MS is more difficult than a diagnosis of relapsing-remitting MS; in part, primary progressive MS is a diagnosis of exclusion because it can be mimicked by other conditions clinically and radiologically, the authors noted.

The authors did note that, although nonspecific, some spinal cord imaging features are typical of primary progressive MS. These include diffuse abnormalities and lesions involving gray matter and two or more white-matter columns, although this finding requires confirmation.

In patients with primary progressive MS and those with relapse-onset MS, MRI features at disease onset predict long-term disability and a progressive disease course. These features include lesions in critical central nervous system regions (i.e., spinal cord, infratentorial regions, and gray matter) and high inflammatory activity in the first years after disease onset. These measures are evaluable in clinical practice, the authors said.

In patients with established MS, gray-matter involvement and neurodegeneration are associated with accelerated clinical worsening; however, detection, validation, and standardization need to be implemented at the individual patient level, they commented.

Novel candidate imaging biomarkers, such as subpial demyelination, and the presence of slowly expanding lesions or paramagnetic rim lesions may identify progressive MS but should be further investigated, they added.

Discovery of MRI markers capable of detecting evolution from relapsing-remitting to secondary progressive MS remains an unmet need that will probably require multiparametric MRI studies, because it is unlikely that a single MRI method will be able to allow clinicians to optimally distinguish among these stages, the authors said.

The contribution of these promising MRI measures combined with other biomarkers, such as quantification of serum neurofilament light chain levels or optical coherence tomography assessment, should be explored to improve the identification of patients with progressive MS, they concluded.

‘A comprehensive review’

In a comment, Jeffrey A. Cohen, MD, director of the Cleveland Clinic’s Mellen Center for MS Treatment and Research, said the article is a comprehensive review of the pathologic mechanisms that underlie progression in MS and the proxy measures of those processes (brain and spinal cord MRI, PET, optical coherence tomography, and biomarkers).

“The paper reports there is no qualitative difference between relapsing remitting and progressive MS; rather, the difference is quantitative,” Dr. Cohen noted. “In other words, the processes that underlie progression are present from the earliest stages of MS, becoming more prominent over time.”

The apparent transition to progressive MS, he added, “rather than representing a ‘transition,’ instead results from the accumulation of pathology over time, a shift from focal lesions to diffuse inflammation and damage, and unmasking of the damage due to decreased resiliency due to aging and failure of compensatory mechanisms (neuroplasticity and remyelination).”

Also commenting, Edward Fox, MD, director, MS Clinic of Central Texas and clinical associate professor, University of Texas, Austin, explained that loss of tissue is the main driver of progressive MS.

“We all look at imaging to confirm that the progressive symptoms expressed by the patient are related to demyelinating disease,” he said. “When I see MRI of the spinal cord showing multifocal lesions, especially if localized atrophy is seen in a region of the cord, I expect to hear a history of progressive deficits in gait and other signs of disability.”

Dr. Fox noted that, on MRI of the brain, gray matter atrophy both cortically and in the deep gray structures usually manifests as cognitive slowing and poorer performance in work and social situations.

“We hope that other biomarkers, such as neurofilament light chain, will add to this body of knowledge and give us a better grasp of the definition of neurodegeneration to confirm the clinical and radiographic findings,” he added.

Dr. Filippi has received compensation for consulting services and/or speaking activities from Bayer, Biogen Idec, Merck Serono, Novartis, Roche, Sanofi, Genzyme, Takeda, and Teva Pharmaceutical Industries; and research support from ARiSLA, Biogen Idec, Fondazione Italiana Sclerosi Multipla, Italian Ministry of Health, Merck Serono, Novartis, Roche, and Teva.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews- 29(2)
Publications
Topics
Sections


Publish date: January 5, 2021

New evidence shows that COVID-19 invades the brain


SARS-CoV-2 can invade the brain and directly act on brain cells, causing neuroinflammation, new animal research suggests. Investigators injected spike 1 (S1), which is found on the tufts of the “red spikes” of the virus, into mice and found that it crossed the blood-brain barrier (BBB) and was taken up not only by brain regions and the brain space but also by other organs – specifically, the lungs, spleen, liver, and kidneys.

“We found that the S1 protein, which is the protein COVID-19 uses to ‘grab onto’ cells, crosses the BBB and is a good model of what the virus does when it enters the brain,” lead author William A. Banks, MD, professor of medicine, University of Washington, Seattle, said in an interview.

“When proteins such as the S1 protein become detached from the virus, they can enter the brain and cause mayhem, causing the brain to release cytokines, which, in turn, cause inflammation and subsequent neurotoxicity,” said Dr. Banks, associate chief of staff and a researcher at the Puget Sound Veterans Affairs Healthcare System.

The study was published online in Nature Neuroscience.
 

Neurologic symptoms

COVID-19 is associated with a variety of central nervous system symptoms, including the loss of taste and smell, headaches, confusion, stroke, and cerebral hemorrhage, the investigators noted.

Dr. Banks explained that SARS-CoV-2 may enter the brain by crossing the BBB, acting directly on the brain centers responsible for other body functions. The respiratory symptoms of COVID-19 may therefore result partly from the invasion of the areas of the brain responsible for respiratory functions, not only from the virus’ action at the site of the lungs.

The researchers set out to assess whether a particular viral protein – S1, a subunit of the viral spike protein – could cross the BBB or enter other organs when injected into mice. They found that, as intravenously injected S1 (I-S1) was cleared from the blood, it was taken up by tissues in multiple organs, including the lung, spleen, kidney, and liver.

Notably, uptake of I-S1 was higher in the liver, “suggesting that this protein is cleared from the blood predominantly by the liver,” Dr. Banks said. In addition, uptake by the lungs is “important, because that’s where many of the effects of the virus are,” he added.

The researchers found that I-S1 in the brains of the mice was “mostly degraded” 30 minutes following injection. “This indicates that I-S1 enters the BBB intact but is eventually degraded in the brain,” they wrote.

Moreover, by 30 minutes, more than half of the I-S1 proteins had crossed the capillary wall and fully entered the brain parenchymal and interstitial fluid spaces, as well as other regions.
 

More severe outcomes in men

The researchers then induced an inflammatory state in the mice through injection of lipopolysaccharide (LPS) and found that inflammation increased I-S1 uptake in both the brain and the lung (where uptake was increased by 101%). “These results show that inflammation could increase S1 toxicity for lung tissue by increasing its uptake,” the authors suggested. Moreover, inflammation also increased the entry of I-S1 into the brain, “likely due to BBB disruption.”

In human beings, male sex and APOE4 genotype are risk factors for both contracting COVID-19 and having a poor outcome, the authors noted. As a result, they examined I-S1 uptake in male and female mice that expressed human APOE3 or APOE4 under the control of a mouse ApoE promoter.

Multiple-comparison tests showed that, among male mice that expressed human APOE3, I-S1 uptake was fastest in the olfactory bulb, liver, and kidney. Female mice that expressed APOE3 displayed increased I-S1 uptake in the spleen.

“This observation might relate to the increased susceptibility of men to more severe COVID-19 outcomes,” coauthor Jacob Raber, PhD, professor, departments of behavioral neuroscience, neurology, and radiation medicine, Oregon Health & Science University, Portland, said in a press release.

In addition to intravenous I-S1 injection, the researchers also investigated the effects of intranasal administration. They found that, although I-S1 also entered the brain by this route, it did so at levels roughly 10 times lower than those achieved with intravenous administration.
 

“Frightening tricks”

Dr. Banks said his laboratory has studied the BBB in conditions such as Alzheimer’s disease, obesity, diabetes, and HIV. “Our experience with viruses is that they do an incredible number of things and have a frightening number of tricks,” he said. In this case, “the virus is probably causing inflammation by releasing cytokines elsewhere in the body that get into the brain through the BBB.” Conversely, “the virus itself may enter the brain by crossing the BBB and directly cause brain cells to release their own cytokines,” he added.

An additional finding of the study, he added, is that whatever the S1 protein does in the brain serves as a model for what the entire virus does, because such proteins often bring the virus along with them.

Dr. Banks said one clinical implication of the findings is that antibodies from people who have already had COVID-19 could potentially be directed against S1. The same is true, he added, of antibodies induced by COVID-19 vaccines, which trigger production of S1.

“When an antibody locks onto something, it prevents it from crossing the BBB,” Dr. Banks noted.
 

Confirmatory findings

Commenting on the study, Howard E. Gendelman, MD, Margaret R. Larson Professor of Internal Medicine and Infectious Diseases and professor and chair of the department of pharmacology and experimental neuroscience, University of Nebraska, Omaha, said the study is confirmatory.

“What this paper highlights, and we have known for a long time, is that COVID-19 is a systemic, not only a respiratory, disease involving many organs and tissues and can yield not only pulmonary problems but also a whole host of cardiac, brain, and kidney problems,” he said.

“So the fact that these proteins are getting in [the brain] and are able to induce a reaction in the brain itself, and this is part of the complex progressive nature of COVID-19, is an important finding,” added Dr. Gendelman, director of the center for neurodegenerative disorders at the university. He was not involved with the study.

The study was supported by the Veterans Affairs Puget Sound Healthcare System and by grants from the National Institutes of Health. The authors and Dr. Gendelman have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews- 29(2)



FROM NATURE NEUROSCIENCE
Publish date: January 4, 2021

High blood pressure at any age speeds cognitive decline


Individuals who develop hypertension at any age experience more rapid cognitive decline than their counterparts with normal blood pressure, new research shows. In a retrospective study of more than 15,000 participants, hypertension during middle age was associated with memory decline, and onset at later ages was linked to worsening memory and global cognition.

The investigators found that prehypertension, defined as systolic pressure of 120-139 mm Hg or diastolic pressure of 80-89 mm Hg, was also linked to accelerated cognitive decline.
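The categories above can be expressed as a simple classifier. The following is a minimal sketch: only the prehypertension range (systolic 120-139 mm Hg or diastolic 80-89 mm Hg) comes from the study; the function name and the ≥140/90 mm Hg hypertension cut point are illustrative assumptions.

```python
def bp_category(systolic: int, diastolic: int) -> str:
    """Classify a blood pressure reading (mm Hg).

    The prehypertension range (systolic 120-139 or diastolic 80-89) follows
    the article's definition; the >=140/90 hypertension threshold is a
    conventional cut point assumed here for illustration.
    """
    if systolic >= 140 or diastolic >= 90:
        return "hypertension"
    if systolic >= 120 or diastolic >= 80:
        return "prehypertension"
    return "normal"


print(bp_category(118, 76))  # below both cut points: normal
print(bp_category(130, 70))  # systolic alone puts this in prehypertension
print(bp_category(135, 92))  # diastolic >= 90 puts this in hypertension
```

Note that either pressure alone is enough to move a reading up a category, which is why the study's definition joins the two ranges with "or."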

Although duration of hypertension was not associated with any marker of cognitive decline, blood pressure control “can substantially reduce hypertension’s deleterious effect on the pace of cognitive decline,” said study investigator Sandhi M. Barreto, MD, PhD, professor of medicine at Universidade Federal de Minas Gerais, Belo Horizonte, Brazil.

The findings were published online Dec. 14 in Hypertension.
 

Unanswered questions

Hypertension is an established and highly prevalent risk factor for cognitive decline, but the age at which it begins to affect cognition is unclear. Previous research suggests that onset during middle age is associated with more harmful cognitive effects than onset in later life. One reason for this apparent difference may be that the duration of hypertension influences the magnitude of cognitive decline, the researchers noted.

Other studies have shown that prehypertension is associated with damage to certain organs, but its effects on cognition are uncertain. In addition, the effect on cognition of good blood pressure control with antihypertensive medications is also unclear.

To investigate, the researchers examined data from the ongoing, multicenter ELSA-Brasil study. ELSA-Brasil follows 15,105 civil servants between the ages of 35 and 74 years. Dr. Barreto and team assessed data from visit 1, which was conducted between 2008 and 2010, and visit 2, which was conducted between 2012 and 2014.

At each visit, participants underwent a memory test, a verbal fluency test, and the Trail Making Test Part B. The investigators calculated Z scores for these tests to derive a global cognitive score.
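The composite described above can be sketched as standardizing each test across participants and averaging the results. The following is a minimal illustration with hypothetical data; the sign flip for Trail Making time (where a longer time means worse performance) is an assumption about the scoring, not a detail reported in the article.

```python
from statistics import mean, stdev


def z_scores(values):
    """Standardize raw test scores across participants."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]


def global_cognitive_score(memory, fluency, trails_time):
    """Average per-test Z scores into one composite per participant.

    Trail Making Test Part B is timed, so a longer time means worse
    performance; the sign flip below reflects that assumption.
    """
    zm = z_scores(memory)
    zf = z_scores(fluency)
    zt = [-z for z in z_scores(trails_time)]
    return [(a + b + c) / 3 for a, b, c in zip(zm, zf, zt)]


# Three hypothetical participants; the third performs best on every test.
composite = global_cognitive_score(
    memory=[10, 20, 30],       # words recalled
    fluency=[5, 15, 25],       # words generated
    trails_time=[60, 45, 30],  # seconds to complete (lower is better)
)
print(composite)  # [-1.0, 0.0, 1.0]
```

Because each test is standardized before averaging, the composite is in standard-deviation units, which is why the study's declines are reported as Z-score changes.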

Blood pressure was measured on the right arm, and hypertension status, age at the time of hypertension diagnosis, duration of hypertension diagnosis, hypertension treatment, and control status were recorded. Other covariables included sex, education, race, smoking status, physical activity, body mass index, and total cholesterol level.

The researchers excluded patients who did not undergo cognitive testing at visit 2, those who had a history of stroke at baseline, and those who initiated antihypertensive medications despite having normotension. After exclusions, the analysis included 7,063 participants (approximately 55% were women, 15% were Black).

At visit 1, the mean age of the group was 58.9 years, and 53.4% of participants had 14 or more years of education. In addition, 22% had prehypertension, and 46.8% had hypertension. The median duration of hypertension was 7 years; 29.8% of participants with hypertension were diagnosed with the condition during middle age.

Of those who reported having hypertension at visit 1, 7.3% were not taking any antihypertensive medication. Among participants with hypertension who were taking antihypertensives, 31.2% had uncontrolled blood pressure.
 

Independent predictor

Results showed that prehypertension independently predicted a significantly greater decline in verbal fluency (Z score, –0.0095; P < .01) and global cognitive score (Z score, –0.0049; P < .05) compared with normal blood pressure.

At middle age, hypertension was associated with a steeper decline in memory (Z score, –0.0072; P < .05) compared with normal blood pressure. At older ages, hypertension was linked to a steeper decline in both memory (Z score, –0.0151; P < .001) and global cognitive score (Z score, –0.0080; P < .01). Duration of hypertension, however, did not significantly predict changes in cognition (P = .109).

Among those with hypertension who were taking antihypertensive medications, those with uncontrolled blood pressure experienced greater declines in memory (Z score, –0.0126; P < .01) and global cognitive score (Z score, –0.0074; P < .01) than did those with controlled blood pressure.

The investigators noted that the study participants had a comparatively high level of education, which has been shown to “boost cognitive reserve and lessen the speed of age-related cognitive decline,” Dr. Barreto said. However, “our results indicate that the effect of hypertension on cognitive decline affects individuals of all educational levels similarly,” she said.

Dr. Barreto noted that the findings have two major clinical implications. First, “maintaining blood pressure below prehypertension levels is important to preserve cognitive function or delay cognitive decline,” she said. Secondly, “in hypertensive individuals, keeping blood pressure under control is essential to reduce the speed of cognitive decline.”

The researchers plan to conduct further analyses of the data to clarify the observed relationship between memory and verbal fluency. They also plan to examine how hypertension affects long-term executive function.
 

‘Continuum of risk’

Commenting on the study, Philip B. Gorelick, MD, MPH, adjunct professor of neurology (stroke and neurocritical care) at Northwestern University, Chicago, noted that research to date suggests the risk for stroke rises along a continuum of blood pressure levels rather than at discrete cut points.

“The same may hold true for cognitive decline and dementia. There may be a continuum of risk whereby persons even at so-called elevated but relatively lower levels of blood pressure based on a continuous scale are at risk,” said Dr. Gorelick, who was not involved with the current study.

The investigators relied on a large and well-studied population of civil servants. However, the population’s relative youth and high level of education may limit the generalizability of the findings, he noted. In addition, the follow-up time was relatively short.

“The hard endpoint of dementia was not studied but would be of interest to enhance our understanding of the influence of blood pressure elevation on cognitive decline or dementia during a longer follow-up of the cohort,” Dr. Gorelick said.

The findings also suggest the need to better understand mechanisms that link blood pressure elevation with cognitive decline, he added.

They indicate “the need for additional clinical trials to better elucidate blood pressure lowering targets for cognitive preservation in different groups of persons at risk,” such as those with normal cognition, those with mild cognitive impairment, and those with dementia, said Dr. Gorelick. “For example, is it safe and efficacious to lower blood pressure in persons with more advanced cognitive impairment or dementia?” he asked.

The study was funded by the Brazilian Coordination for the Improvement of Higher Education Personnel. Dr. Barreto has received support from the Research Agency of the State of Minas Gerais. Although Dr. Gorelick was not involved in the ELSA-Brasil cohort study, he serves on a data monitoring committee for a trial of a blood pressure–lowering agent in the preservation of cognition.

A version of this article first appeared on Medscape.com.



 

Individuals who have hypertension at any age are more likely to experience more rapid cognitive decline compared with their counterparts with normal blood pressure, new research shows. In a retrospective study of more than 15,000 participants, hypertension during middle age was associated with memory decline, and onset at later ages was linked to worsening memory and global cognition.

The investigators found that prehypertension, defined as systolic pressure of 120-139 mm Hg or diastolic pressure of 80-89 mm Hg, was also linked to accelerated cognitive decline.

Although duration of hypertension was not associated with any marker of cognitive decline, blood pressure control “can substantially reduce hypertension’s deleterious effect on the pace of cognitive decline,” said study investigator Sandhi M. Barreto, MD, PhD, professor of medicine at Universidade Federal de Minas Gerais, Belo Horizonte, Brazil.

The findings were published online Dec. 14 in Hypertension.
 

Unanswered questions

Hypertension is an established and highly prevalent risk factor for cognitive decline, but the age at which it begins to affect cognition is unclear. Previous research suggests that onset during middle age is associated with more harmful cognitive effects than onset in later life. One reason for this apparent difference may be that the duration of hypertension influences the magnitude of cognitive decline, the researchers noted.

Other studies have shown that prehypertension is associated with damage to certain organs, but its effects on cognition are uncertain. In addition, the effect of good blood pressure control with antihypertensive medications and the impact on cognition are also unclear.

To investigate, the researchers examined data from the ongoing, multicenter ELSA-Brasil study, which follows 15,105 civil servants between the ages of 35 and 74 years. Dr. Barreto and her team assessed data from visit 1, conducted between 2008 and 2010, and visit 2, conducted between 2012 and 2014.

At each visit, participants underwent a memory test, a verbal fluency test, and the Trail Making Test Part B. The investigators calculated Z scores for these tests to derive a global cognitive score.

Blood pressure was measured on the right arm, and hypertension status, age at the time of hypertension diagnosis, duration of hypertension diagnosis, hypertension treatment, and control status were recorded. Other covariables included sex, education, race, smoking status, physical activity, body mass index, and total cholesterol level.

The researchers excluded patients who did not undergo cognitive testing at visit 2, those who had a history of stroke at baseline, and those who initiated antihypertensive medications despite having normotension. After exclusions, the analysis included 7,063 participants (approximately 55% were women, 15% were Black).

At visit 1, the mean age of the group was 58.9 years, and 53.4% of participants had 14 or more years of education. In addition, 22% had prehypertension, and 46.8% had hypertension. The median duration of hypertension was 7 years; 29.8% of participants with hypertension were diagnosed with the condition during middle age.

Of those who reported having hypertension at visit 1, 7.3% were not taking any antihypertensive medication. Among participants with hypertension who were taking antihypertensives, 31.2% had uncontrolled blood pressure.
 

Independent predictor

Results showed that prehypertension independently predicted a significantly greater decline in verbal fluency (Z score, –0.0095; P < .01) and global cognitive score (Z score, –0.0049; P < .05) compared with normal blood pressure.

At middle age, hypertension was associated with a steeper decline in memory (Z score, –0.0072; P < .05) compared with normal blood pressure. At older ages, hypertension was linked to a steeper decline in both memory (Z score, –0.0151; P < .001) and global cognitive score (Z score, –0.0080; P < .01). Duration of hypertension, however, did not significantly predict changes in cognition (P = .109).

Among those with hypertension who were taking antihypertensive medications, those with uncontrolled blood pressure experienced steeper declines in memory (Z score, –0.0126; P < .01) and global cognitive score (Z score, –0.0074; P < .01) than did those with controlled blood pressure.

The investigators noted that the study participants had a comparatively high level of education, which has been shown to “boost cognitive reserve and lessen the speed of age-related cognitive decline,” Dr. Barreto said. However, “our results indicate that the effect of hypertension on cognitive decline affects individuals of all educational levels similarly,” she said.

Dr. Barreto noted that the findings have two major clinical implications. First, “maintaining blood pressure below prehypertension levels is important to preserve cognitive function or delay cognitive decline,” she said. Second, “in hypertensive individuals, keeping blood pressure under control is essential to reduce the speed of cognitive decline.”

The researchers plan to conduct further analyses of the data to clarify the observed relationship between memory and verbal fluency. They also plan to examine how hypertension affects long-term executive function.
 

‘Continuum of risk’

Commenting on the study, Philip B. Gorelick, MD, MPH, adjunct professor of neurology (stroke and neurocritical care) at Northwestern University, Chicago, noted that, so far, research suggests that the risk for stroke associated with blood pressure levels should be understood as representing a continuum rather than as being associated with several discrete points.

“The same may hold true for cognitive decline and dementia. There may be a continuum of risk whereby persons even at so-called elevated but relatively lower levels of blood pressure based on a continuous scale are at risk,” said Dr. Gorelick, who was not involved with the current study.

The investigators relied on a large and well-studied population of civil servants. However, the population’s relative youth and high level of education may limit the generalizability of the findings, he noted. In addition, the follow-up time was relatively short.

“The hard endpoint of dementia was not studied but would be of interest to enhance our understanding of the influence of blood pressure elevation on cognitive decline or dementia during a longer follow-up of the cohort,” Dr. Gorelick said.

The findings also suggest the need to better understand mechanisms that link blood pressure elevation with cognitive decline, he added.

They indicate “the need for additional clinical trials to better elucidate blood pressure lowering targets for cognitive preservation in different groups of persons at risk,” such as those with normal cognition, those with mild cognitive impairment, and those with dementia, said Dr. Gorelick. “For example, is it safe and efficacious to lower blood pressure in persons with more advanced cognitive impairment or dementia?” he asked.

The study was funded by the Brazilian Coordination for the Improvement of Higher Education Personnel. Dr. Barreto has received support from the Research Agency of the State of Minas Gerais. Although Dr. Gorelick was not involved in the ELSA-Brasil cohort study, he serves on a data monitoring committee for a trial of a blood pressure–lowering agent in the preservation of cognition.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews- 29(2)
Article Source

FROM HYPERTENSION

Publish date: December 17, 2020

Air pollution linked to brain amyloid pathology

Article Type
Changed
Thu, 12/15/2022 - 15:42

Higher levels of air pollution were associated with an increased risk for amyloid-beta pathology in a new study of older adults with cognitive impairment. “Many studies have now found a link between air pollution and clinical outcomes of dementia or cognitive decline,” said lead author Leonardo Iaccarino, PhD, Weill Institute for Neurosciences, University of California, San Francisco. “But this study is now showing a clear link between air pollution and a biomarker of Alzheimer’s disease: It shows a relationship between bad air quality and pathology in the brain.

“We believe that exposure to air pollution should be considered as one factor in the lifetime risk of developing Alzheimer’s disease,” he added. “We believe it is a significant determinant. Our results suggest that, if we can reduce occupational and residential exposure to air pollution, then this could help reduce the risk of Alzheimer’s disease.”

The study was published online Nov. 30 in JAMA Neurology.
 

A modifiable risk factor

Dr. Iaccarino explained that it is well known that air pollution is linked to poor health outcomes. “As well as cardiovascular and respiratory disease, there is also growing interest in the relationship between air pollution and brain health,” he said. “The link is becoming more and more convincing, with evidence from laboratory, animal, and human studies suggesting that individuals exposed to poor air quality have an increased risk of cognitive decline and dementia.”

In addition, this year, the Lancet Commission included air pollution in its updated list of modifiable risk factors for dementia.

For the current study, the researchers analyzed data from the Imaging Dementia–Evidence for Amyloid Scanning (IDEAS) Study, which included more than 18,000 U.S. participants with cognitive impairment who received an amyloid positron-emission tomography (PET) scan between 2016 and 2018.

The investigators used data from the IDEAS study to assess the relationship between the air quality at the place of residence of each patient and the likelihood of a positive amyloid PET result. Public records from the U.S. Environmental Protection Agency were used to estimate air quality in individual ZIP-code areas during two periods – 2002-2003 (approximately 14 years before the amyloid PET scan) and 2015-2016 (approximately 1 year before the amyloid PET scan).

Results showed that those living in an area with increased air pollution, as determined using concentrations of predicted fine particulate matter (PM2.5), had a higher probability of a positive amyloid PET scan. This association was dose dependent and statistically significant after adjusting for demographic, lifestyle, and socioeconomic factors as well as medical comorbidities. The association was seen in both periods; the adjusted odds ratio was 1.10 in 2002-2003 and 1.15 in 2015-2016.

“This shows about a 10% increased probability of a positive amyloid test for individuals living in the worst polluted areas, compared with those in the least polluted areas,” Dr. Iaccarino explained.

Every unit increase in PM2.5 in 2002-2003 was associated with a 0.5% increase in the probability of positive amyloid findings on PET; every unit increase in PM2.5 in 2015-2016 was associated with a 0.8% increase.

“This was a very large cohort study, and we adjusted for multiple other factors, so these are pretty robust findings,” Dr. Iaccarino said.

Exposure to higher ozone concentrations was not associated with amyloid positivity on PET scans in either time window.

“These findings suggest that brain amyloid-beta accumulation could be one of the biological pathways in the increased incidence of dementia and cognitive decline associated with exposure to air pollution,” the researchers stated.
 

A public health concern

“Adverse effects of airborne toxic pollutants associated with amyloid-beta pathology should be considered in public health policy decisions and should inform individual lifetime risk of developing Alzheimer’s disease and dementia,” they concluded.

Dr. Iaccarino noted that, although governments need to take primary action in reducing air pollution, individuals can make some changes to reduce their exposure to poor-quality air.

“Such changes could include not going out or using masks when pollution levels are very high (as happened recently in California with the wildfires) and avoiding areas where the air quality is known to be bad. In addition, there are activities which increase indoor air pollution which can be changed, such as certain types of cooking, cigarette smoking, use of coal fires,” he commented.

“Based on our findings, it would be reasonable to take action on these things, especially for individuals at higher risk of cardiovascular and respiratory disease or Alzheimer’s,” he added.

On a more optimistic note, Dr. Iaccarino pointed out that air quality in the United States has improved significantly in recent years. Meaningful improvements were found between the two periods in this analysis (2002-2016), “so we are going in the right direction.”

The IDEAS Study was funded by the Alzheimer’s Association, the American College of Radiology, Avid Radiopharmaceuticals, GE Healthcare, and Life Molecular Imaging. Dr. Iaccarino has disclosed no relevant financial relationships.

A version of this article originally appeared on Medscape.com.

Issue
Neurology Reviews- 29(1)

Article Source

FROM JAMA NEUROLOGY

Publish date: December 8, 2020

Oral steroids benefit patients with cluster headache

Article Type
Changed
Thu, 12/15/2022 - 15:43

Adjunctive oral prednisone appears to significantly reduce cluster headache attacks, new research shows. Results of the multicenter, randomized, double-blind trial show that patients who received the steroid had 25% fewer attacks in the first week of therapy, compared with their counterparts who received placebo.

In addition, more than a third of patients in the prednisone group were pain free, and for almost half, headache frequency was reduced by at least 50% at day 7 of treatment.

These findings provide clear evidence that prednisone, in conjunction with the use of verapamil, is effective in cluster headache, said lead author Mark Obermann, MD, director, Center for Neurology, Asklepios Hospitals Seesen (Germany), and associate professor, University of Duisburg-Essen (Germany).

The key message, he added, is that all patients with cluster headache should receive prednisone at the start of an episode.

The study was published online Nov. 24 in the Lancet Neurology.
 

‘Suicide headaches’

Cluster headaches are intense unilateral attacks of facial and head pain that last 15-180 minutes, are accompanied by trigeminal autonomic symptoms, and predominantly affect men. “They’re referred to as ‘suicide headaches’ because the pain is so severe that patients often report they think about killing themselves to get rid of the pain,” said Dr. Obermann.

The cause is unclear, although there is some evidence that the hypothalamus is involved. The headaches sometimes follow a “strict circadian pattern,” said Dr. Obermann. He noted that the attacks might occur over a few weeks or months and then not return for months or even years.

An estimated 1 in 1,000 people experience cluster headache, but the condition is underrecognized, and research is scarce and poorly funded. Previous research does show that the calcium channel blocker verapamil, which is used to treat high blood pressure, is effective in cluster headache. However, it takes about 14 days to work and has to be slowly titrated because of cardiac side effects, said Dr. Obermann. For these reasons, international guidelines recommend initiating short-term preventive treatment with corticosteroids to suppress, or at least lessen, cluster headache attacks until long-term prevention is effective.

Although some clinicians treat cluster headaches with corticosteroids, others don’t because of a lack of evidence that shows they are effective. “There’s no evidence whatsoever on what the correct dose is or whether it helps at all. This is the gap we wanted to close,” said Dr. Obermann.

The study included 116 adult patients with cluster headache from 10 centers who were experiencing a cluster headache episode and were not taking prophylactic medication.

The trial included only patients who had an attack within 30 days of their current episode. The investigators included this restriction to reduce the possibility of spontaneous remission, which is “a big problem” in cluster headache trials, he said. To confirm that episodes were cluster headache attacks, patients were also required to have moderate to severe pain, indicated by a score of at least 5 on a numerical rating scale in which 0 indicates no pain and 10 indicates the worst imaginable pain.

Participants were allowed to use treatments for acute attacks, but these therapies were limited to triptans, high-flow oxygen, intranasal lidocaine, ergotamine, and oral analgesics.
 

Debilitating pain

Patients were randomly assigned to receive oral prednisone (n = 53) or placebo (n = 56). The study groups were matched with respect to demographic and clinical characteristics. Prednisone was initiated at 100 mg/d for 5 days and was then tapered by 20 mg every 3 days in the active-treatment group. All patients also received oral verapamil at a starting dose of 40 mg three times per day. The dose was increased every 3 days by 40 mg to a maximum of 360 mg/d.

All participants received pantoprazole 20 mg to prevent the gastric side effects of prednisone. An attack was defined as a unilateral headache of moderate to severe intensity. The study lasted 28 days.

The study’s primary outcome was the mean number of cluster headache attacks during the first week of treatment with prednisone versus placebo.

The mean number of attacks during the first week of treatment was 7.1 in the prednisone group and 9.5 in the placebo group, for a difference of –2.4 attacks (95% confidence interval, –4.8 to –0.03; P = .002). “This might not sound like much,” but reducing the number of daily attacks from, say, eight to six “really makes a difference because the attacks are so painful,” said Dr. Obermann.

The prednisone group also came out on top for a number of secondary outcomes. After the first 7 days, attacks had ceased in 35% of the prednisone group versus 7% of the placebo group.
 

‘Clear evidence’ of efficacy

About 49% of patients who took prednisone reported a reduction of at least 50% in attack frequency at day 7, compared with 15% of patients who received placebo. The number of cluster attacks at day 28 was also lower in the prednisone group than in the placebo group.

With respect to treatment effect, the difference between prednisone and placebo gradually lessened over time “in parallel to the verapamil dose reaching its therapeutic effect,” the investigators noted. “Therefore, attack frequency reduction slowly converged between groups,” they added.

The study results provide “clear evidence” and should reassure clinicians that short-term prednisone early in a cluster headache attack is effective, said Dr. Obermann.

Adverse events, which included headache, palpitations, dizziness, and nausea, were as expected and were similar in the two groups. There were only two severe adverse events, both of which occurred in participants in the placebo group.

Dr. Obermann said the investigators were surprised that so many patients in the study were taking analgesics. “Analgesics don’t work in cluster headache; they just don’t work in this kind of pain.”

He noted that prednisone exposure of study patients spanned only 19 days and totaled just 1,100 mg, which he believes is safe.

The prednisone dose used in the study is “what most clinicians use in clinical practice,” although there have been reports of success using 500 mg of IV prednisone over 5 days, said Dr. Obermann. He added that it would be “interesting to see if 50 mg would be just as good” as a starting dose.

Potential limitations of the study include the fact that the majority of participants were White, so the findings may not be generalizable to other populations.
 

Long-awaited results

In an accompanying editorial, Anne Ducros, MD, PhD, professor of neurology and director of the Headache Center, Montpellier (France) University Hospital, said the study provides “strong and long-awaited evidence supporting the use of oral steroids as a transitional treatment option.”

The trial “raises many topics for future research,” one of which is the long-term safety of prednisone for patients with cluster headache, said Dr. Ducros. She noted that use of high-dose steroids once or twice a year for 15 years or more “has the potential for severe systemic toxic effects,” such as corticosteroid-induced osteonecrosis of the femoral head.

Other questions about corticosteroid use for patients with cluster headache remain. These include understanding whether these agents provide better efficacy than occipital nerve injections and determining the optimal verapamil regimen, she noted.

In addition, the risk for oral steroid misuse needs to be studied, she said. She noted that drug misuse is common among patients with cluster headache.

Despite these questions, the results of this new study “provide an important step forward for patients with cluster headache, for whom safe and effective transitional therapies are much needed,” Dr. Ducros wrote.


Dr. Obermann has received fees from Sanofi, Biogen, Novartis, Teva Pharmaceuticals, and Eli Lilly and grants from Allergan and Heel Pharmaceuticals outside of this work. Dr. Ducros has received fees from Amgen, Novartis, Teva, and Eli Lilly; grants from the Programme Hospitalier de Recherche Clinique and from the Appel d’Offre Interne of Montpellier University Hospital; and nonfinancial support from SOS Oxygene.

A version of this article originally appeared on Medscape.com.

Issue
Neurology Reviews- 29(1)
Publications
Topics
Sections

Adjunctive oral prednisone appears to significantly reduce cluster headache attacks, new research shows. Results of the multicenter, randomized, double-blind trial show that patients who received the steroid had 25% fewer attacks in the first week of therapy, compared with their counterparts who received placebo.

In addition, more than a third of patients in the prednisone group were pain free, and for almost half, headache frequency was reduced by at least 50% at day 7 of treatment.

These findings provide clear evidence that prednisone, in conjunction with the use of verapamil, is effective in cluster headache, said lead author Mark Obermann, MD, director, Center for Neurology, Asklepios Hospitals Seesen (Germany), and associate professor, University of Duisburg-Essen (Germany).

The key message, he added, is that all patients with cluster headache should receive prednisone at the start of an episode.

The study was published online Nov. 24 in the Lancet Neurology.
 

‘Suicide headaches’

Cluster headaches are intense unilateral attacks of facial and head pain that last 15-180 minutes, are accompanied by trigeminal autonomic symptoms, and predominantly affect men. “They’re referred to as ‘suicide headaches’ because the pain is so severe that patients often report they think about killing themselves to get rid of the pain,” said Dr. Obermann.

The cause is unclear, although there is some evidence that the hypothalamus is involved. The headaches sometimes follow a “strict circadian pattern,” said Dr. Obermann. He noted that the attacks might occur over a few weeks or months and then not return for months or even years.

An estimated 1 in 1,000 people experience cluster headache, but the condition is underrecognized, and research is scarce and poorly funded. Previous research shows that the calcium channel blocker verapamil, which is used to treat high blood pressure, is effective in cluster headache. However, it takes about 14 days to work and must be titrated slowly because of cardiac side effects, said Dr. Obermann. For these reasons, international guidelines recommend initiating short-term preventive treatment with corticosteroids to suppress, or at least lessen, cluster headache attacks until long-term prevention takes effect.

Although some clinicians treat cluster headaches with corticosteroids, others don’t because evidence of their efficacy has been lacking. “There’s no evidence whatsoever on what the correct dose is or whether it helps at all. This is the gap we wanted to close,” said Dr. Obermann.

The study included 116 adults from 10 centers who were experiencing a cluster headache episode and were not taking prophylactic medication.

The trial included only patients who had an attack within 30 days of the start of their current episode; the investigators imposed this restriction to reduce the possibility of spontaneous remission, which is “a big problem” in cluster headache trials, he said. To confirm that episodes were cluster headache attacks, patients were also required to have moderate to severe pain, indicated by a score of at least 5 on a numerical rating scale in which 0 indicates no pain and 10 indicates the worst imaginable pain.

Participants were allowed to use treatments for acute attack, but these therapies were limited to triptans, high-flow oxygen, intranasal lidocaine, ergotamine, and oral analgesics.

Debilitating pain

Patients were randomly assigned to receive oral prednisone (n = 53) or placebo (n = 56). The study groups were matched with respect to demographic and clinical characteristics. Prednisone was initiated at 100 mg/d for 5 days and was then tapered by 20 mg every 3 days in the active-treatment group. All patients also received oral verapamil at a starting dose of 40 mg three times per day. The dose was increased every 3 days by 40 mg to a maximum of 360 mg/d.

All participants received pantoprazole 20 mg to prevent the gastric side effects of prednisone. An attack was defined as a unilateral headache of moderate to severe intensity. The study lasted 28 days.

The study’s primary outcome was the mean number of cluster headache attacks during the first week of treatment with prednisone versus placebo.

The mean number of attacks during the first week of treatment was 7.1 in the prednisone group and 9.5 in the placebo group, for a difference of –2.4 attacks (95% confidence interval, –4.8 to –0.03; P = .002). “This might not sound like much,” but reducing the number of daily attacks from, say, eight to six “really makes a difference because the attacks are so painful,” said Dr. Obermann.

The prednisone group also came out on top for a number of secondary outcomes. After the first 7 days, attacks ceased in 35% of the prednisone group versus 7% of the placebo group.

‘Clear evidence’ of efficacy

About 49% of patients who took prednisone reported a reduction of at least 50% in attack frequency at day 7. By comparison, 15% of patients who received placebo reported such a reduction. The number of cluster attacks at day 28 was lower in the prednisone group than in the placebo group.

With respect to treatment effect, the difference between prednisone and placebo gradually lessened over time “in parallel to the verapamil dose reaching its therapeutic effect,” the investigators noted. “Therefore, attack frequency reduction slowly converged between groups,” they added.

The study results provide “clear evidence” and should reassure clinicians that short-term prednisone early in a cluster headache attack is effective, said Dr. Obermann.

Adverse events, which included headache, palpitations, dizziness, and nausea, were as expected and were similar in the two groups. There were only two serious adverse events, both of which occurred in the placebo group.

Dr. Obermann said the investigators were surprised that so many patients in the study were taking analgesics. “Analgesics don’t work in cluster headache; they just don’t work in this kind of pain.”

He noted that prednisone exposure of study patients spanned only 19 days and amounted to only 1,100 mg, which he believes is safe.
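For readers checking the arithmetic: under the regimen described above (100 mg/d for 5 days, then a taper of 20 mg every 3 days), and assuming each taper step is held for 3 days and the taper runs down to zero, the cumulative dose works out to the stated total:

```latex
\underbrace{5 \times 100\,\text{mg}}_{\text{initial phase}}
+ \underbrace{3 \times (80 + 60 + 40 + 20)\,\text{mg}}_{\text{taper}}
= 500\,\text{mg} + 600\,\text{mg}
= 1{,}100\,\text{mg}
```

This matches the 1,100-mg total exposure Dr. Obermann cited.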

The prednisone dose used in the study is “what most clinicians use in clinical practice,” although there have been reports of success using 500 mg of IV prednisone over 5 days, said Dr. Obermann. He added that it would be “interesting to see if 50 mg would be just as good” as a starting dose.

Potential limitations of the study include the fact that the majority of participants were White, so the findings may not be generalizable to other populations.

Long-awaited results

In an accompanying editorial, Anne Ducros, MD, PhD, professor of neurology and director of the Headache Center, Montpellier (France) University Hospital, said the study provides “strong and long-awaited evidence supporting the use of oral steroids as a transitional treatment option.”

The trial “raises many topics for future research,” one of which is the long-term safety of prednisone for patients with cluster headache, said Dr. Ducros. She noted that use of high-dose steroids once or twice a year for 15 years or more “has the potential for severe systemic toxic effects,” such as corticosteroid-induced osteonecrosis of the femoral head.

Other questions about corticosteroid use for patients with cluster headache remain. These include understanding whether these agents provide better efficacy than occipital nerve injections and determining the optimal verapamil regimen, she noted.

In addition, the risk for oral steroid misuse needs to be studied, she said. She noted that drug misuse is common among patients with cluster headache.

Despite these questions, the results of this new study “provide an important step forward for patients with cluster headache, for whom safe and effective transitional therapies are much needed,” Dr. Ducros wrote.


Dr. Obermann has received fees from Sanofi, Biogen, Novartis, Teva Pharmaceuticals, and Eli Lilly and grants from Allergan and Heel Pharmaceuticals outside of this work. Dr. Ducros has received fees from Amgen, Novartis, Teva, and Eli Lilly; grants from the Programme Hospitalier de Recherche Clinique and from the Appel d’Offre Interne of Montpellier University Hospital; and nonfinancial support from SOS Oxygene.

A version of this article originally appeared on Medscape.com.


Publish date: December 8, 2020

First guidelines for keto diets in adults with epilepsy released

Article Type
Changed
Thu, 12/15/2022 - 15:43

 

An international panel of experts has published the first set of recommendations based on current clinical practices and scientific evidence for using ketogenic diet therapies in adults with drug-resistant epilepsy.

Just as in children with epilepsy, ketogenic diet therapies can be safe and effective in adults with epilepsy but should only be undertaken with the support of medical professionals trained in their use, the group said.

“Motivation is the key to successful ketogenic diet therapy adherence,” first author Mackenzie Cervenka, MD, director of the Adult Epilepsy Diet Center and associate professor of neurology at Johns Hopkins University, Baltimore, said in an interview.

“Patients who are autonomous require self-motivation and having a strong support structure is important as well. For those patients who are dependents, their caregivers need to be motivated to manage their diet,” said Dr. Cervenka.

The guidelines were published online Oct. 30 in Neurology Clinical Practice.

Novel in adult neurology

Ketogenic diet therapies are high-fat, low-carbohydrate, adequate-protein diets that induce fat metabolism and ketone production. Despite their use as an effective antiseizure therapy since the 1920s, ketogenic diet therapies remain novel in adult neurology.

Furthermore, while there are established guidelines for ketogenic diet therapies to reduce seizures in children, there were no formal recommendations for adults, until now.

Drawing on the experience of experts at 20 centers using ketogenic diet therapies in more than 2,100 adults with epilepsy in 10 countries, Dr. Cervenka and an international team developed recommendations on use of ketogenic diet therapies in adults.

The panel noted, “with a relatively mild side effect profile and the potential to reduce seizures in nearly 60% of adults with drug-resistant epilepsy, ketogenic diet therapies should be part of the repertoire of available options.”

Ketogenic diet therapies are appropriate to offer to adults with seizure types and epilepsy syndromes for which these treatments are known to be effective in children, they said. These include tuberous sclerosis complex, Rett syndrome, Lennox-Gastaut syndrome, glucose transporter type 1 deficiency syndrome, genetic generalized epilepsies, and focal epilepsies caused by underlying migrational disorders and resistant to antiseizure medication.

However, adults with drug-resistant focal epilepsy should be offered surgical evaluation first, given the higher anticipated rate of seizure freedom via this route, the panel said.
 

A focus on compliance

Experts at nearly all of the centers report using two or more ketogenic diet therapies. Ninety percent use the modified Atkins diet, 84% use the classic ketogenic diet, and 63% use the modified ketogenic diet and/or low glycemic index treatment. More than half of the centers (58%) use medium-chain triglyceride oil in combination with another ketogenic diet therapy to boost ketone body production.

The most important factors influencing the choice of ketogenic diet therapy are ease of diet application for the patient (100%) and patient and/or caregiver preference, home setting, and mode of feeding (90% each).

The panel recommended that ketogenic diet therapies be tailored to fit the needs of the individual, taking into account his or her physical and mental characteristics, underlying medical conditions, food preferences, type and amount of support from family and others, level of self-sufficiency, feeding habits, and ease of following the diet.

“Most of the differences between the child and adult recommendations have to do with compliance. Often, it’s more of a challenge for adults than for children,” said Dr. Cervenka.

The panel recommended providing adult patients with recipe ideas, individualized training on the ketogenic diet lifestyle from a dietitian or nutritionist, and guidance for meal planning and preparation before starting the diet. This will provide the greatest likelihood of success, as patients often report difficulties coping with carbohydrate restriction.

“In pediatric practice, positive responders typically remain on a ketogenic diet therapy for 2 years before considering weaning. Ketogenic diet therapy in adults is not time-limited. However, a minimum of 3 months of ketogenic diet therapy is recommended before any judgment of response is made,” the panel advised.

The panel pointed out that the absolute metabolic contraindications and the cautions related to feeding difficulties, gastrointestinal dysfunction, and digestion remain the same for both children and adults. However, they added that a range of common adult conditions such as hyperlipidemia, heart disease, diabetes, low bone density, and pregnancy “bring additional consideration, caution, and monitoring to ketogenic diet therapy use.”

Beyond epilepsy

The guidelines also call for pre–ketogenic diet therapy biochemical studies to screen adults for preexisting abnormalities and establish a reference for comparing follow-up results after 3, 6, and 12 months, and then annually or as needed.

They also noted that metabolic studies such as urine organic acid and serum amino acid levels are generally not needed in adults unless there is a strong clinical suspicion for an underlying metabolic disorder.

Updated genetic evaluation may also be considered in adults with intellectual disability and epilepsy of unknown etiology. Serial bone mineral density scans may be obtained every 5 years.

The guidelines also call for ketone monitoring (blood beta-hydroxybutyrate or urine ketones) during the early months of ketogenic diet therapy as an objective indication of compliance and biochemical response.

Dietary adjustments should focus on optimizing the treatment response, minimizing side effects, and maximizing sustainability.

Adults on a ketogenic diet therapy should also be advised to take multivitamin and mineral supplements and drink plenty of fluids.

The panel said emerging evidence also supports the use of ketogenic diet therapies in other adult neurologic disorders such as migraine, Parkinson’s disease, dementia, and multiple sclerosis.

However, the panel said further evidence is needed to guide recommendations on use of ketogenic diet therapies in other neurologic conditions.

The research had no targeted funding. Dr. Cervenka has reported receiving grants from Nutricia, Vitaflo, BrightFocus Foundation, and Army Research Laboratory; honoraria from the American Epilepsy Society, the Neurology Center, Epigenix, LivaNova, and Nutricia; royalties from Demos; and consulting for Nutricia, Glut1 Deficiency Foundation, and Sage Therapeutics. Disclosures for the other authors are listed in the article.

A version of this article originally appeared on Medscape.com.

Issue
Neurology Reviews- 29(1)
Publications
Topics
Sections

 

An international panel of experts has published the first set of recommendations based on current clinical practices and scientific evidence for using ketogenic diet therapies in adults with drug-resistant epilepsy.

Just as in children with epilepsy, ketogenic diet therapies can be safe and effective in adults with epilepsy but should only be undertaken with the support of medical professionals trained in their use, the group said.

Dr. Mackenzie Cervenka


“Motivation is the key to successful ketogenic diet therapy adherence,” first author Mackenzie Cervenka, MD, director of the Adult Epilepsy Diet Center and associate professor of neurology at Johns Hopkins University, Baltimore, said in an interview.

“Patients who are autonomous require self-motivation and having a strong support structure is important as well. For those patients who are dependents, their caregivers need to be motivated to manage their diet,” said Dr. Cervenka.

The guidelines were published online Oct. 30 in Neurology Clinical Practice.

Novel in adult neurology

Ketogenic diet therapies are high-fat, low-carbohydrate, and adequate-protein diets that induce fat metabolism and ketone production. Despite its use as an effective antiseizure therapy since the 1920s, ketogenic diet therapies remain novel in adult neurology.

Furthermore, while there are established guidelines for ketogenic diet therapies to reduce seizures in children, there were no formal recommendations for adults, until now.

Drawing on the experience of experts at 20 centers using ketogenic diet therapies in more than 2,100 adults with epilepsy in 10 countries, Dr. Cervenka and an international team developed recommendations on use of ketogenic diet therapies in adults.

The panel noted, “with a relatively mild side effect profile and the potential to reduce seizures in nearly 60% of adults with drug-resistant epilepsy, ketogenic diet therapies should be part of the repertoire of available options.”

Ketogenic diet therapies are appropriate to offer to adults with seizure types and epilepsy syndromes for which these treatments are known to be effective in children, they said. These include tuberous sclerosis complexRett syndromeLennox-Gastaut syndrome, glucose transporter type 1 deficiency syndrome, genetic generalized epilepsies, and focal epilepsies caused by underlying migrational disorders and resistant to antiseizure medication.

However, adults with drug-resistant focal epilepsy should be offered surgical evaluation first, given the higher anticipated rate of seizure freedom via this route, the panel said.
 

A focus on compliance

Experts at nearly all of the centers report using two or more ketogenic diet therapies. Ninety percent use the modified Atkins diet, 84% use the classic ketogenic diet, and 63% use the modified ketogenic diet and/or low glycemic index treatment. More than half of the centers (58%) use medium-chain triglyceride oil in combination with another ketogenic diet therapy to boost ketone body production.

The most important factors influencing the choice of ketogenic diet therapy are ease of diet application for the patient (100%) and patient and/or caregiver preference, home setting, and mode of feeding (90% each).

The panel recommended that ketogenic diet therapies be tailored to fit the needs of the individual, taking into account his or her physical and mental characteristics, underlying medical conditions, food preferences, type and amount of support from family and others, level of self-sufficiency, feeding habits, and ease of following the diet.

“Most of the differences between the child and adult recommendations have to do with compliance. Often, it’s more of a challenge for adults than for children,” said Dr. Cervenka.

The panel recommended providing adult patients with recipe ideas, individualized training on the ketogenic diet lifestyle from a dietitian or nutritionist, and guidance for meal planning and preparation before starting the diet. This will provide the greatest likelihood of success, as patients often report difficulties coping with carbohydrate restriction.

“In pediatric practice, positive responders typically remain on a ketogenic diet therapy for 2 years before considering weaning. Ketogenic diet therapy in adults is not time-limited. However, a minimum of 3 months of ketogenic diet therapy is recommended before any judgment of response is made,” the panel advised.

The panel pointed out the absolute metabolic contraindications and cautions related to feeding difficulties, gastrointestinal dysfunction, and digestion remain the same for both children and adults. However, they added that a range of common adult conditions such as hyperlipidemia, heart disease, diabetes, low bone density, and pregnancy “bring additional consideration, caution, and monitoring to ketogenic diet therapy use.”
 

 

 

Beyond epilepsy

The guidelines also call for pre–ketogenic diet therapy biochemical studies to screen adults for preexisting abnormalities and establish a reference for comparing follow-up results after 3, 6, and 12 months, and then annually or as needed.

They also noted that metabolic studies such as urine organic acid and serum amino acid levels are generally not needed in adults unless there is a strong clinical suspicion for an underlying metabolic disorder.

Updated genetic evaluation may also be considered in adults with intellectual disability and epilepsy of unknown etiology. Serial bone mineral density scans may be obtained every 5 years.

The guidelines also call for ketone monitoring (blood beta-hydroxybutyrate or urine amino acids) during the early months of ketogenic diet therapy as an objective indication of compliance and biochemical response.

Dietary adjustments should focus on optimizing the treatment response, minimizing side effects, and maximizing sustainability.

Adults on a ketogenic diet therapy should also be advised to take multivitamin and mineral supplements and drink plenty of fluids.

The panel said emerging evidence also supports the use of ketogenic diet therapies in other adult neurologic disorders such as migraineParkinson’s disease, dementia, and multiple sclerosis.

However, the panel said further evidence is needed to guide recommendations on use of ketogenic diet therapies in other neurologic conditions.

The research had no targeted funding. Dr. Cervenka has reported receiving grants from Nutricia, Vitaflo, BrightFocus Foundation, and Army Research Laboratory; honoraria from the American Epilepsy Society, the Neurology Center, Epigenix, LivaNova, and Nutricia; royalties from Demos; and consulting for Nutricia, Glut1 Deficiency Foundation, and Sage Therapeutics. Disclosures for the other authors are listed in the article.

A version of this article originally appeared on Medscape.com.

 

An international panel of experts has published the first set of recommendations based on current clinical practices and scientific evidence for using ketogenic diet therapies in adults with drug-resistant epilepsy.

Just as in children with epilepsy, ketogenic diet therapies can be safe and effective in adults with epilepsy but should only be undertaken with the support of medical professionals trained in their use, the group said.

Dr. Mackenzie Cervenka


“Motivation is the key to successful ketogenic diet therapy adherence,” first author Mackenzie Cervenka, MD, director of the Adult Epilepsy Diet Center and associate professor of neurology at Johns Hopkins University, Baltimore, said in an interview.

“Patients who are autonomous require self-motivation and having a strong support structure is important as well. For those patients who are dependents, their caregivers need to be motivated to manage their diet,” said Dr. Cervenka.

The guidelines were published online Oct. 30 in Neurology Clinical Practice.

Novel in adult neurology

Ketogenic diet therapies are high-fat, low-carbohydrate, and adequate-protein diets that induce fat metabolism and ketone production. Despite its use as an effective antiseizure therapy since the 1920s, ketogenic diet therapies remain novel in adult neurology.

Furthermore, while there are established guidelines for ketogenic diet therapies to reduce seizures in children, there were no formal recommendations for adults, until now.

Drawing on the experience of experts at 20 centers using ketogenic diet therapies in more than 2,100 adults with epilepsy in 10 countries, Dr. Cervenka and an international team developed recommendations on use of ketogenic diet therapies in adults.

The panel noted, “with a relatively mild side effect profile and the potential to reduce seizures in nearly 60% of adults with drug-resistant epilepsy, ketogenic diet therapies should be part of the repertoire of available options.”

Ketogenic diet therapies are appropriate to offer to adults with seizure types and epilepsy syndromes for which these treatments are known to be effective in children, they said. These include tuberous sclerosis complex, Rett syndrome, Lennox-Gastaut syndrome, glucose transporter type 1 deficiency syndrome, genetic generalized epilepsies, and focal epilepsies caused by underlying migrational disorders and resistant to antiseizure medication.

However, adults with drug-resistant focal epilepsy should be offered surgical evaluation first, given the higher anticipated rate of seizure freedom via this route, the panel said.
A focus on compliance

Experts at nearly all of the centers report using two or more ketogenic diet therapies. Ninety percent use the modified Atkins diet, 84% use the classic ketogenic diet, and 63% use the modified ketogenic diet and/or low glycemic index treatment. More than half of the centers (58%) use medium-chain triglyceride oil in combination with another ketogenic diet therapy to boost ketone body production.

The most important factors influencing the choice of ketogenic diet therapy are ease of diet application for the patient (100%) and patient and/or caregiver preference, home setting, and mode of feeding (90% each).

The panel recommended that ketogenic diet therapies be tailored to fit the needs of the individual, taking into account his or her physical and mental characteristics, underlying medical conditions, food preferences, type and amount of support from family and others, level of self-sufficiency, feeding habits, and ease of following the diet.

“Most of the differences between the child and adult recommendations have to do with compliance. Often, it’s more of a challenge for adults than for children,” said Dr. Cervenka.

The panel recommended providing adult patients with recipe ideas, individualized training on the ketogenic diet lifestyle from a dietitian or nutritionist, and guidance for meal planning and preparation before starting the diet. This will provide the greatest likelihood of success, as patients often report difficulties coping with carbohydrate restriction.

“In pediatric practice, positive responders typically remain on a ketogenic diet therapy for 2 years before considering weaning. Ketogenic diet therapy in adults is not time-limited. However, a minimum of 3 months of ketogenic diet therapy is recommended before any judgment of response is made,” the panel advised.

The panel pointed out that the absolute metabolic contraindications, as well as the cautions related to feeding difficulties, gastrointestinal dysfunction, and digestion, remain the same for both children and adults. However, they added that a range of common adult conditions such as hyperlipidemia, heart disease, diabetes, low bone density, and pregnancy “bring additional consideration, caution, and monitoring to ketogenic diet therapy use.”
Beyond epilepsy

The guidelines also call for pre–ketogenic diet therapy biochemical studies to screen adults for preexisting abnormalities and establish a reference for comparing follow-up results after 3, 6, and 12 months, and then annually or as needed.

They also noted that metabolic studies such as urine organic acid and serum amino acid levels are generally not needed in adults unless there is a strong clinical suspicion for an underlying metabolic disorder.

Updated genetic evaluation may also be considered in adults with intellectual disability and epilepsy of unknown etiology. Serial bone mineral density scans may be obtained every 5 years.

The guidelines also call for ketone monitoring (blood beta-hydroxybutyrate or urine acetoacetate) during the early months of ketogenic diet therapy as an objective indication of compliance and biochemical response.

Dietary adjustments should focus on optimizing the treatment response, minimizing side effects, and maximizing sustainability.

Adults on a ketogenic diet therapy should also be advised to take multivitamin and mineral supplements and drink plenty of fluids.

The panel said emerging evidence also supports the use of ketogenic diet therapies in other adult neurologic disorders such as migraine, Parkinson’s disease, dementia, and multiple sclerosis.

However, the panel said further evidence is needed to guide recommendations on use of ketogenic diet therapies in other neurologic conditions.

The research had no targeted funding. Dr. Cervenka has reported receiving grants from Nutricia, Vitaflo, BrightFocus Foundation, and Army Research Laboratory; honoraria from the American Epilepsy Society, the Neurology Center, Epigenix, LivaNova, and Nutricia; royalties from Demos; and consulting for Nutricia, Glut1 Deficiency Foundation, and Sage Therapeutics. Disclosures for the other authors are listed in the article.

A version of this article originally appeared on Medscape.com.

Issue
Neurology Reviews- 29(1)
Publish date: December 2, 2020