
Encephalopathy common, often lethal in hospitalized patients with COVID-19


Toxic metabolic encephalopathy (TME) is common and often lethal in hospitalized patients with COVID-19, new research shows. Results of a retrospective study show that of almost 4,500 patients with COVID-19, 12% were diagnosed with TME. Of these, 78% developed encephalopathy immediately prior to hospital admission. Septic encephalopathy, hypoxic-ischemic encephalopathy (HIE), and uremia were the most common causes, although multiple causes were present in close to 80% of patients. TME was also associated with a 24% higher risk of in-hospital death.

“We found that close to one in eight patients who were hospitalized with COVID-19 had TME that was not attributed to the effects of sedatives, and that this is incredibly common among these patients who are critically ill,” said lead author Jennifer A. Frontera, MD, of New York University.

“The general principle of our findings is to be more aggressive in TME; and from a neurologist perspective, the way to do this is to eliminate the effects of sedation, which is a confounder,” she said.

The study was published online March 16 in Neurocritical Care.
 

Drilling down

“Many neurological complications of COVID-19 are sequelae of severe illness or secondary effects of multisystem organ failure, but our previous work identified TME as the most common neurological complication,” Dr. Frontera said.

Previous research investigating encephalopathy among patients with COVID-19 included patients who may have been sedated or have had a positive Confusion Assessment Method (CAM) result.

“A lot of the delirium literature is effectively heterogeneous because there are a number of patients who are on sedative medication that, if you could turn it off, these patients would return to normal. Some may have underlying neurological issues that can be addressed, but you can't get to the bottom of this unless you turn off the sedation,” Dr. Frontera noted.

“We wanted to be specific and try to drill down to see what the underlying cause of the encephalopathy was,” she said.

The researchers retrospectively analyzed data on 4,491 patients (≥ 18 years old) with COVID-19 who were admitted to four New York City hospitals between March 1, 2020, and May 20, 2020. Of these, 559 (12%) with TME were compared with 3,932 patients without TME.

The researchers looked at index admissions and included patients who had:

  • New changes in mental status or significant worsening of mental status (in patients with baseline abnormal mental status).
  • Hyperglycemia or hypoglycemia with transient focal neurologic deficits that resolved with glucose correction.
  • An adequate washout of sedating medications (when relevant) prior to mental status assessment.

Potential etiologies included electrolyte abnormalities, organ failure, hypertensive encephalopathy, sepsis or active infection, fever, nutritional deficiency, and environmental injury.
 

Foreign environment

Most (78%) of the 559 patients diagnosed with TME had already developed encephalopathy immediately prior to hospital admission, the authors report. The most common etiologies of TME among hospitalized patients with COVID-19 were septic encephalopathy, hypoxic-ischemic encephalopathy, and uremia.

Compared with patients without TME, those with TME (all P values < .001):

  • Were older (76 vs. 62 years).
  • Had higher rates of dementia (27% vs. 3%).
  • Had higher rates of psychiatric history (20% vs. 10%).
  • Were more often intubated (37% vs. 20%).
  • Had a longer length of hospital stay (7.9 vs. 6.0 days).
  • Were less often discharged home (25% vs. 66%).

“It’s no surprise that older patients and people with dementia or psychiatric illness are predisposed to becoming encephalopathic,” said Dr. Frontera. “Being in a foreign environment, such as a hospital, or being sleep-deprived in the ICU is likely to make them more confused during their hospital stay.”
 

Delirium as a symptom

In-hospital mortality or discharge to hospice was considerably higher in the TME versus non-TME patients (44% vs. 18%, respectively).

When the researchers adjusted for confounders (age, sex, race, worse Sequential Organ Failure Assessment score during hospitalization, ventilator status, study week, hospital location, and ICU care level) and excluded patients receiving only comfort care, they found that TME was associated with a 24% increased risk of in-hospital death (30% in patients with TME vs. 16% in those without TME).

The highest mortality risk was associated with hypoxemia, with 42% of patients with HIE dying during hospitalization, compared with 16% of patients without HIE (adjusted hazard ratio 1.56; 95% confidence interval, 1.21-2.00; P = .001).
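A reader-side plausibility check on figures like these (not part of the study's analysis): a hazard ratio's confidence interval is constructed symmetrically on the log scale, so the point estimate should sit near the geometric mean of the reported bounds.

```python
import math

# A hazard ratio's 95% CI is symmetric on the log scale, so the
# point estimate should be close to the geometric mean of its bounds.
lo, hi = 1.21, 2.00          # reported 95% CI for in-hospital death with HIE
point = math.sqrt(lo * hi)   # geometric mean of the bounds
print(round(point, 2))       # close to the reported adjusted HR of 1.56
```

Here sqrt(1.21 × 2.00) ≈ 1.56, consistent with the adjusted hazard ratio reported above.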

“Not all patients who are intubated require sedation, but there’s generally a lot of hesitation in reducing or stopping sedation in some patients,” Dr. Frontera observed.

She acknowledged there are “many extremely sick patients whom you can’t ventilate without sedation.”

Nevertheless, “delirium in and of itself does not cause death. It’s a symptom, not a disease, and we have to figure out what causes it. Delirium might not need to be sedated, and it’s more important to see what the causal problem is.”
 

Independent predictor of death

Commenting on the study, Panayiotis N. Varelas, MD, PhD, vice president of the Neurocritical Care Society, said the study “approached the TME issue better than previously, namely allowing time for sedatives to wear off to have a better sample of patients with this syndrome.”

Dr. Varelas, who is chairman of the department of neurology and professor of neurology at Albany (N.Y.) Medical College, emphasized that TME “is not benign and, in patients with COVID-19, it is an independent predictor of in-hospital mortality.”

“One should take all possible measures … to avoid desaturation and hypotensive episodes and also aggressively treat SAE [sepsis-associated encephalopathy] and uremic encephalopathy in hopes of improving the outcomes,” added Dr. Varelas, who was not involved with the study.

Also commenting on the study, Mitchell Elkind, MD, professor of neurology and epidemiology at Columbia University in New York, who was not associated with the research, said it “nicely distinguishes among the different causes of encephalopathy, including sepsis, hypoxia, and kidney failure … emphasizing just how sick these patients are.”

The study received no direct funding. Individual investigators were supported by grants from the National Institute on Aging and the National Institute of Neurological Disorders and Stroke. The investigators, Dr. Varelas, and Dr. Elkind have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue: Neurology Reviews 29(5)

 


 


FROM NEUROCRITICAL CARE

Publish date: March 29, 2021

Neurologic drug prices jump 50% in five years


Medicare payments for branded neurologic drugs jumped 50% over a 5-year period, while claims for these medications increased by just 8%, new research shows. Results of the retrospective study also showed that most of the increased costs for these agents were due to rising costs for neuroimmunology drugs, mainly for those used to treat multiple sclerosis (MS).


“The same brand name medication in 2017 cost approximately 50% more than in 2013,” said Adam de Havenon, MD, assistant professor of neurology, University of Utah, Salt Lake City.

“An analogy would be if you bought an iPhone 5 in 2013 for $500, and then in 2017, you were asked to pay $750 for the exact same iPhone 5,” Dr. de Havenon added.

The study findings were published online March 10 in the journal Neurology.
 

$26 billion in payments

Both neurologists and patients are concerned about the high cost of prescription drugs for neurologic diseases, and Medicare Part D data indicate that these drugs are the most expensive component of neurologic care, the researchers noted. In addition, out-of-pocket costs have increased significantly for patients with neurologic diseases such as Parkinson’s disease, epilepsy, and MS.

To understand trends in payments for neurologic drugs, Dr. de Havenon and colleagues analyzed Medicare Part D claims filed from 2013 to 2017. The payments include costs paid by Medicare, the patient, government subsidies, and other third-party payers.

In addition to examining more current Medicare Part D data than previous studies, the current analysis examined all medications prescribed by neurologists that consistently remained branded or generic during the 5-year study period, said Dr. de Havenon. This approach resulted in a large number of claims and a large total cost.

To calculate the percentage change in annual payment claims, the researchers used 2013 prices as a reference point. They identified drugs named in 2013 claims and classified them as generic, brand-name only, or brand-name with generic equivalent. Researchers also divided the drugs by neurologic subspecialty.

The analysis included 520 drugs, all of which were available in each year of the study period. Of these drugs, 322 were generic, 61 were brand-name only, and 137 were brand-name with a generic equivalent. There were 90.7 million total claims.

Results showed total payments amounted to $26.65 billion. Yearly total payments increased from $4.05 billion in 2013 to $6.09 billion in 2017, representing a 50.4% increase, even after adjusting for inflation. Total claims increased by 7.6% – from 17.1 million in 2013 to 18.4 million in 2017.
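The percentage changes quoted here follow directly from the reported totals; a minimal sketch of the arithmetic, using only the figures in the paragraph above:

```python
def pct_change(baseline, value):
    """Percentage change from a 2013 baseline to a 2017 value."""
    return (value - baseline) / baseline * 100.0

# Totals reported in the article (2013 -> 2017)
payments_2013, payments_2017 = 4.05e9, 6.09e9   # total payments, USD
claims_2013, claims_2017 = 17.1e6, 18.4e6       # total claims

print(round(pct_change(payments_2013, payments_2017), 1))  # ~50.4% rise in payments
print(round(pct_change(claims_2013, claims_2017), 1))      # ~7.6% rise in claims
```

The 50.4% rise in payments against a 7.6% rise in claims is what isolates price, rather than volume, as the driver of the increase.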

From 2013 to 2017, claim payments increased by 0.6% for generic drugs, 42.4% for brand-name only drugs, and 45% for brand-name drugs with generic equivalents. The proportion of claims increased from 81.9% to 88% for generic drugs and from 4.9% to 6.2% for brand-name only drugs.

However, the proportion of claims for brand-name drugs with generic equivalents decreased from 13.3% to 5.8%.
 

Treatment barrier

Neuroimmunologic drugs, most of which were prescribed for MS, had exceptional cost, the researchers noted. These drugs accounted for more than 50% of payments but only 4.3% of claims. Claim payment for these drugs increased by 46.9% during the study period, from $3,337 to $4,902.

When neuroimmunologic drugs were removed from the analysis, there was still a significant increase in claim payments for brand-name only drugs (50.4%) and brand-name drugs with generic equivalents (45.6%).

Although neuroimmunologic medicines, including monoclonal antibodies, are more expensive to produce, this factor alone does not explain their exceptional cost, said Dr. de Havenon. “The high cost of brand-name drugs in this specialty is likely because the market bears it,” he added. “In other words, MS is a disabling disease and the medications work, so historically the Centers for Medicare & Medicaid Services have been willing to tolerate the high cost of these primarily brand-name medications.”

Several countries have controlled drug costs by negotiating with pharmaceutical companies and through legislation, Dr. de Havenon noted.

“My intent with this article was to raise awareness on the topic, which I struggle with frequently as a clinician. I know I want my patients to have a medication, but the cost prevents it,” he said.
 

‘Unfettered’ price-setting

Commenting on the findings, Robert J. Fox, MD, vice chair for research at the Neurological Institute of the Cleveland Clinic, said the study “brings into clear light” what neurologists, particularly those who treat MS, have long suspected but did not really know. These neurologists “are typically distanced from the payment aspects of the medications they prescribe,” said Dr. Fox, who was not involved with the research.

Although a particular strength of the study was its comprehensiveness, the researchers excluded infusion claims – which account for a large portion of total patient care costs for many disorders, he noted.

Drugs for MS historically have been expensive, ostensibly because of their high cost of development. In addition, the large and continued price increase that occurs long after these drugs have been approved remains unexplained, said Dr. Fox.

He noted that the study findings might not directly affect clinical practice because neurologists will continue prescribing medications they think are best for their patients. “Instead, I think this is a lesson to lawmakers about the massive error in the Medicare Modernization Act of 2003, where the federal government was prohibited from negotiating drug prices. If the seller is unfettered in setting a price, then no one should be surprised when the price rises,” Dr. Fox said.

Because many new drugs and new generic formulations for treating MS have become available during the past year, “repeating these types of economic studies for the period 2020-2025 will help us understand if generic competition – as well as new laws if they are passed – alter price,” he concluded.

The study was funded by the American Academy of Neurology, which publishes Neurology. Dr. de Havenon has received clinical research funding from AMAG Pharmaceuticals and Regeneron Pharmaceuticals. Dr. Fox receives consulting fees from many pharmaceutical companies involved in the development of therapies for MS.

A version of this article first appeared on Medscape.com.

Issue: Neurology Reviews 29(4)


Medicare payments for branded neurologic drugs jumped 50% over a 5-year period, while claims for these medications increased by just 8%, new research shows. Results of the retrospective study also showed that most of the increased costs for these agents were due to rising costs for neuroimmunology drugs, mainly for those used to treat multiple sclerosis (MS).


“The same brand name medication in 2017 cost approximately 50% more than in 2013,” said Adam de Havenon, MD, assistant professor of neurology, University of Utah, Salt Lake City.

“An analogy would be if you bought an iPhone 5 in 2013 for $500, and then in 2017, you were asked to pay $750 for the exact same iPhone 5,” Dr. de Havenon added.

The study findings were published online March 10 in the journal Neurology.
 

$26 billion in payments

Both neurologists and patients are concerned about the high cost of prescription drugs for neurologic diseases, and Medicare Part D data indicate that these drugs are the most expensive component of neurologic care, the researchers noted. In addition, out-of-pocket costs have increased significantly for patients with neurologic diseases such as Parkinson’s disease, epilepsy, and MS.

To understand trends in payments for neurologic drugs, Dr. de Havenon and colleagues analyzed Medicare Part D claims filed from 2013 to 2017. The payments include costs paid by Medicare, the patient, government subsidies, and other third-party payers.

In addition to examining more current Medicare Part D data than previous studies, the current analysis examined all medications prescribed by neurologists that consistently remained branded or generic during the 5-year study period, said Dr. de Havenon. This approach resulted in a large number of claims and a large total cost.

To calculate the percentage change in annual claim payments, the researchers used 2013 prices as a reference point. They identified drugs named in 2013 claims and classified them as generic, brand-name only, or brand-name with generic equivalent. Researchers also divided the drugs by neurologic subspecialty.

The analysis included 520 drugs, all of which were available in each year of the study period. Of these drugs, 322 were generic, 61 were brand-name only, and 137 were brand-name with a generic equivalent. There were 90.7 million total claims.

Results showed total payments amounted to $26.65 billion. Yearly total payments increased from $4.05 billion in 2013 to $6.09 billion in 2017, representing a 50.4% increase, even after adjusting for inflation. Total claims increased by 7.6% – from 17.1 million in 2013 to 18.4 million in 2017.
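As a sanity check, the year-over-year percentage changes reported above follow directly from the totals. A minimal sketch of the calculation (the helper function and variable names are illustrative, not the authors’ code):

```python
def pct_change(baseline, current):
    """Percentage change relative to the 2013 baseline value."""
    return (current - baseline) / baseline * 100

# Totals reported in the study (2013 vs. 2017)
payments_2013, payments_2017 = 4.05e9, 6.09e9  # dollars
claims_2013, claims_2017 = 17.1e6, 18.4e6      # number of claims

print(f"Payments: +{pct_change(payments_2013, payments_2017):.1f}%")  # +50.4%
print(f"Claims:   +{pct_change(claims_2013, claims_2017):.1f}%")      # +7.6%
```

The far steeper rise in payments (50.4%) than in claims (7.6%) is what points to per-claim prices, rather than utilization, as the driver of the increase.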

From 2013 to 2017, claim payments increased by 0.6% for generic drugs, 42.4% for brand-name only drugs, and 45% for brand-name drugs with generic equivalents. The proportion of claims increased from 81.9% to 88% for generic drugs and from 4.9% to 6.2% for brand-name only drugs.

However, the proportion of claims for brand-name drugs with generic equivalents decreased from 13.3% to 5.8%.
 

Treatment barrier

Neuroimmunologic drugs, most of which were prescribed for MS, were exceptionally costly, the researchers noted. These drugs accounted for more than 50% of payments but only 4.3% of claims. The payment per claim for these drugs increased by 46.9% during the study period, from $3,337 to $4,902.

When neuroimmunologic drugs were removed from the analysis, there was still a significant increase in claim payments for brand-name only drugs (50.4%) and brand-name drugs with generic equivalents (45.6%).

Although neuroimmunologic medicines, including monoclonal antibodies, are more expensive to produce, this factor alone does not explain their exceptional cost, said Dr. de Havenon. “The high cost of brand-name drugs in this specialty is likely because the market bears it,” he added. “In other words, MS is a disabling disease and the medications work, so historically the Centers for Medicare & Medicaid Services have been willing to tolerate the high cost of these primarily brand-name medications.”

Several countries have controlled drug costs by negotiating with pharmaceutical companies and through legislation, Dr. de Havenon noted.

“My intent with this article was to raise awareness on the topic, which I struggle with frequently as a clinician. I know I want my patients to have a medication, but the cost prevents it,” he said.
 

‘Unfettered’ price-setting

Commenting on the findings, Robert J. Fox, MD, vice chair for research at the Neurological Institute of the Cleveland Clinic, said the study “brings into clear light” what neurologists, particularly those who treat MS, have long suspected but did not really know. These neurologists “are typically distanced from the payment aspects of the medications they prescribe,” said Dr. Fox, who was not involved with the research.

Although a particular strength of the study was its comprehensiveness, the researchers excluded infusion claims – which account for a large portion of total patient care costs for many disorders, he noted.

Drugs for MS historically have been expensive, ostensibly because of their high cost of development. In addition, the large and continued price increase that occurs long after these drugs have been approved remains unexplained, said Dr. Fox.

He noted that the study findings might not directly affect clinical practice because neurologists will continue prescribing medications they think are best for their patients. “Instead, I think this is a lesson to lawmakers about the massive error in the Medicare Modernization Act of 2003, where the federal government was prohibited from negotiating drug prices. If the seller is unfettered in setting a price, then no one should be surprised when the price rises,” Dr. Fox said.

Because many new drugs and new generic formulations for treating MS have become available during the past year, “repeating these types of economic studies for the period 2020-2025 will help us understand if generic competition – as well as new laws if they are passed – alter price,” he concluded.

The study was funded by the American Academy of Neurology, which publishes Neurology. Dr. de Havenon has received clinical research funding from AMAG Pharmaceuticals and Regeneron Pharmaceuticals. Dr. Fox receives consulting fees from many pharmaceutical companies involved in the development of therapies for MS.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews- 29(4)

Article Source

FROM NEUROLOGY

Publish date: March 16, 2021

Despite risks and warnings, CNS polypharmacy is prevalent among patients with dementia


 

A significant proportion of community-dwelling older adults with dementia take three or more central nervous system medications despite guidelines that say to avoid this dangerous practice, new research suggests.

Investigators found that 14% of these individuals were receiving CNS-active polypharmacy, defined as combinations of multiple psychotropic and opioid medications taken for more than 30 days.

“For most patients, the risks of these medications, particularly in combination, are almost certainly greater than the potential benefits,” said Donovan Maust, MD, associate director of the geriatric psychiatry program, University of Michigan, Ann Arbor.

The study was published online March 9 in JAMA.
 

Serious risks

Memory impairment is the cardinal feature of dementia, but behavioral and psychological symptoms, which can include apathy, delusions, and agitation, are common during all stages of illness and cause significant caregiver distress, the researchers noted.

They noted that there is a dearth of high-quality evidence to support prescribing these medications in this patient population, yet “clinicians regularly prescribe psychotropic medications to community-dwelling persons with dementia in rates that far exceed use in the general older adult population.”

The Beers Criteria, from the American Geriatrics Society, advise against the practice of CNS polypharmacy because of the significant increase in risk for falls as well as impaired cognition, cardiac conduction abnormalities, respiratory suppression, and death when polypharmacy involves opioids.

The researchers also noted that previous European studies of polypharmacy in patients with dementia have not included antiepileptic medications or opioids, so the true extent of CNS-active polypharmacy may be “significantly” underestimated.

To determine the prevalence of polypharmacy with CNS-active medications among community-dwelling older adults with dementia, the researchers analyzed data on prescription fills for nearly 1.2 million community-dwelling Medicare patients with dementia.

The primary outcome was the prevalence of CNS-active polypharmacy in 2018. They defined CNS-active polypharmacy as exposure to three or more medications for more than 30 consecutive days from the following drug classes: antidepressants, antipsychotics, antiepileptics, benzodiazepines, nonbenzodiazepine benzodiazepine receptor agonist hypnotics (“Z-drugs”), and opioids.
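The exposure criterion here – three or more CNS-active medications overlapping for more than 30 consecutive days – is essentially a consecutive-window check over daily exposure data. A loose sketch of that logic with invented daily data, not the authors’ algorithm or the study’s claims records:

```python
def is_cns_polypharmacy(daily_active_meds, min_meds=3, min_days=31):
    """daily_active_meds: list of sets, one per consecutive calendar day,
    each holding the CNS-active medications the patient was exposed to
    that day. Returns True if >= min_meds overlapped on > 30 consecutive days."""
    streak = 0
    for meds in daily_active_meds:
        if len(meds) >= min_meds:
            streak += 1
            if streak >= min_days:
                return True
        else:
            streak = 0
    return False

# 40 days on an antidepressant + antiepileptic + benzodiazepine -> flagged
print(is_cns_polypharmacy([{"sertraline", "gabapentin", "lorazepam"}] * 40))  # True

# Only two overlapping classes, however long -> not flagged
print(is_cns_polypharmacy([{"sertraline", "gabapentin"}] * 90))  # False
```

In practice the investigators worked from prescription-fill records rather than observed daily intake, so a flag like this can overestimate true exposure – a limitation the authors acknowledge below.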

They found that roughly one in seven (13.9%) patients met criteria for CNS-active polypharmacy. Of those receiving a CNS-active polypharmacy regimen, 57.8% had been doing so for longer than 180 days, and 6.8% had been doing so for a year. Nearly 30% of patients were exposed to five or more medications, and 5.2% were exposed to five or more medication classes.
 

Conservative approach warranted

Nearly all (92%) patients taking three or more CNS-active medications were taking an antidepressant, “consistent with their place as the psychotropic class most commonly prescribed both to older adults overall and those with dementia,” the investigators noted.

There is minimal high-quality evidence to support the efficacy of antidepressants for the treatment of depression for patients with dementia, they pointed out.

Nearly half (47%) of patients who were taking three or more CNS-active medications received at least one antipsychotic, most often quetiapine. Antipsychotics are not approved for people with dementia but are often prescribed off label for agitation, anxiety, and sleep problems, the researchers noted.

Nearly two thirds (62%) of patients with dementia who were taking three or more CNS drugs were taking an antiepileptic (most commonly, gabapentin); 41%, benzodiazepines; 32%, opioids; and 6%, Z-drugs.

The most common polypharmacy class combination included at least one antidepressant, one antiepileptic, and one antipsychotic; this combination accounted for 12.9% of polypharmacy days.

Despite limited high-quality evidence of efficacy, the prescribing of psychotropic medications and opioids is “pervasive” for adults with dementia in the United States, the investigators noted.

“Especially given that older adults with dementia might not be able to convey side effects they are experiencing, I think clinicians should be more conservative in how they are prescribing these medications and skeptical about the potential for benefit,” said Dr. Maust.

Regarding study limitations, the researchers noted that prescription medication claims may have led to an overestimation of the exposure to polypharmacy, insofar as the prescriptions may have been filled but not taken or were taken only on an as-needed basis.

In addition, the investigators were unable to determine the appropriateness of the particular combinations used or to examine the specific harms associated with CNS-active polypharmacy.
 

 

 

A major clinical challenge

Weighing in on the results, Howard Fillit, MD, founding executive director and chief science officer of the Alzheimer’s Drug Discovery Foundation, said the study is important because polypharmacy is one of the “geriatric giants, and the question is, what do you do about it?”

Dr. Fillit said it is important to conduct a careful medication review for all older patients, “making sure that the use of each drug is appropriate. The most important thing is to define what is the appropriate utilization of these kinds of drugs. That goes for both overutilization or misuse of these drugs and underutilization, where people are undertreated for symptoms that can’t be managed by behavioral management, for example,” Dr. Fillit said.

Dr. Fillit also said the finding that about 14% of dementia patients were receiving three or more of these drugs “may not be an outrageous number, because these patients, especially as they get into moderate and severe stages of disease, can be incredibly difficult to manage.

“Very often, dementia patients have depression, and up to 90% will have agitation and even psychosis during the course of dementia. And many of these patients need these types of drugs,” said Dr. Fillit.

Echoing the authors, Dr. Fillit said a key limitation of the study is not knowing whether the prescribing was appropriate or not.

The study was supported by a grant from the National Institute on Aging. Dr. Maust and Dr. Fillit have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.


 


 


Issue
Neurology Reviews- 29(4)

Article Source

FROM JAMA

Publish date: March 16, 2021

Palliative care for patients with dementia: When to refer?


Palliative care for people with dementia is increasingly recognized as a way to improve quality of life and provide relief from the myriad physical and psychological symptoms of advancing neurodegenerative disease. But unlike in cancer, relatively few patients with terminal dementia receive referrals to palliative care.

A new literature review has found that these referrals are all over the map among patients with dementia – with many occurring very late in the disease process – and do not reflect any consistent criteria based on patient needs.

For their research, published March 2 in the Journal of the American Geriatrics Society, Li Mo, MD, of the University of Texas MD Anderson Cancer Center in Houston, and colleagues looked at nearly 60 studies dating back to the early 1990s that contained information on referrals to palliative care for patients with dementia. While a palliative care approach can be provided by nonspecialists, all the included studies dealt at least in part with specialist care.
 

Standardized criteria are lacking

The investigators found advanced or late-stage dementia to be the most common reason cited for referral, with three quarters of the studies recommending palliative care for late-stage or advanced dementia, generally without qualifying what symptoms or needs were present. Patients received palliative care across a range of settings, including nursing homes, hospitals, and their own homes, though many articles did not include information on where patients received care.

A fifth of the articles suggested that medical complications of dementia, including falls, pneumonia, and ulcers, should trigger referrals to palliative care, while another fifth cited poor prognosis, variously defined as a life expectancy of between 6 months and 2 years. Poor nutrition status was identified in 10% of studies as meriting referral.

Only 20% of the studies identified patient needs – evidence of psychological distress or functional decline, for example – as criteria for referral, despite these being ubiquitous in dementia. The authors said they were surprised by this finding, which could possibly be explained, they wrote, by “the interest among geriatrician, neurologist, and primary care teams to provide good symptom management,” reflecting a de facto palliative care approach. “There is also significant stigma associated with a specialist palliative care referral,” the authors noted.

Curiously, the researchers noted, a new diagnosis of dementia triggered referral in more than a quarter of the studies, a finding that possibly reflected delayed diagnoses.

The findings revealed “heterogeneity in the literature in reasons for involving specialist palliative care, which may partly explain the variation in patterns of palliative care referral,” Dr. Mo and colleagues wrote, stressing that more standardized criteria are urgently needed to bring dementia in line with cancer in terms of providing timely palliative care.

Patients with advancing dementia have little chance to self-report symptoms, meaning that more attention to patient complaints earlier in the disease course, and greater sensitivity to patient distress, are required. By routinely screening symptoms, clinicians could use specific cutoffs “as triggers to initiate automatic timely palliative care referral,” the authors concluded, noting that more research was needed before these cutoffs, whether based on symptom intensity or other measures, could be calculated.

Dr. Mo and colleagues acknowledged weaknesses of their review: a third of the included articles were based on expert consensus, and others did not distinguish clearly between primary and specialist palliative care.

A starting point for further discussion

Asked to comment on the findings, Elizabeth Sampson, MD, a palliative care researcher at University College London, praised Dr. Mo and colleagues’ study as “starting to pull together the strands” of a systematic approach to referrals and access to palliative care in dementia.

“Sometimes you need a paper like this to kick off the discussion to say look, this is where we are,” Dr. Sampson said, noting that the focus on need-based criteria dovetailed with a “general feeling in the field that we need to really think about needs, and what palliative care needs might be. What the threshold for referral should be we don’t know yet. Should it be three unmet needs? Or five? We’re still a long way from knowing.”

Dr. Sampson’s group is leading a UK government-funded research effort that aims to develop cost-effective palliative care interventions in dementia, in part through a tool that uses caregiver reports to assess symptom burden and patient needs. The research program “is founded on a needs-based approach, which aims to look at people’s individual needs and responding to them in a proactive way,” she said.

One of the obstacles to timely palliative care in dementia, Dr. Sampson said, is weighing resource allocation against what can be wildly varying prognoses. “Hospices understand when someone has terminal cancer and [is] likely to die within a few weeks, but it’s not unheard of for someone in very advanced stages of dementia to live another year,” she said. “There are concerns that a rapid increase in people with dementia being moved to palliative care could overwhelm already limited hospice capacity. We would argue that the best approach is to get palliative care out to where people with dementia live, which is usually the care home.”

Dr. Mo and colleagues’ study received funding from the National Institutes of Health, and its authors disclosed no financial conflicts of interest. Dr. Sampson’s work is supported by the UK’s Economic and Social Research Council and National Institute for Health Research. She disclosed no conflicts of interest.

Issue
Neurology Reviews- 29(4)

Article Source

FROM THE JOURNAL OF THE AMERICAN GERIATRICS SOCIETY

Publish date: March 10, 2021

Risdiplam study shows promise for spinal muscular atrophy

Article Type
Changed
Thu, 12/15/2022 - 15:41

Infants with type 1 spinal muscular atrophy (SMA) showed promising signs, including an increased expression of functional survival motor neuron (SMN) protein in the blood, after 1 year of treatment with oral risdiplam (Evrysdi, Genentech), according to results of part 1 of the FIREFISH study.

A boost in SMN expression has been linked to improvements in survival and motor function; such improvements were also seen among the exploratory efficacy outcomes of the two-part, phase 2-3, open-label study.

“No surviving infant was receiving permanent ventilation at month 12, and 7 of the 21 infants were able to sit without support, which is not expected in patients with type 1 spinal muscular atrophy, according to historical experience,” reported the FIREFISH Working Group led by Giovanni Baranello, MD, PhD, from the Dubowitz Neuromuscular Centre, National Institute for Health Research Great Ormond Street Hospital Biomedical Research Centre, Great Ormond Street Institute of Child Health University College London, and Great Ormond Street Hospital Trust, London.

However, “it cannot be stated with confidence that there was clinical benefit of the agent because the exploratory clinical endpoints were analyzed post hoc and can only be qualitatively compared with historical cohorts,” they added.

The findings were published online Feb. 24 in the New England Journal of Medicine.
 

A phase 2-3 open-label study

The study enrolled 21 infants with type 1 SMA, between the ages of 1 and 7 months. The majority (n = 17) were treated for 1 year with high-dose risdiplam, reaching 0.2 mg/kg of body weight per day by the twelfth month. Four infants in a low-dose cohort were receiving 0.08 mg/kg by the twelfth month. The medication was administered orally once daily to infants who were able to swallow, or by feeding tube to those who could not.
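Because both cohorts were dosed by body weight, the daily dose scales linearly with the infant's weight. The sketch below illustrates that arithmetic; the function name and rounding are my own, not part of the study protocol.

```python
# Weight-based daily dosing as described in the study: 0.2 mg/kg/day in the
# high-dose cohort and 0.08 mg/kg/day in the low-dose cohort by month 12.
# Function name and rounding are illustrative, not from the protocol.

HIGH_DOSE_MG_PER_KG = 0.2
LOW_DOSE_MG_PER_KG = 0.08

def daily_dose_mg(weight_kg: float, mg_per_kg: float) -> float:
    """Return the daily risdiplam dose in mg for a given body weight."""
    return round(weight_kg * mg_per_kg, 2)

# A hypothetical 6 kg infant in the high-dose cohort:
print(daily_dose_mg(6.0, HIGH_DOSE_MG_PER_KG))  # 1.2
```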

The primary outcomes of this first part of the study were safety, pharmacokinetics, pharmacodynamics (including the blood SMN protein concentration), and selection of the risdiplam dose for part 2 of the study. Exploratory outcomes included event-free survival, defined as being alive without tracheostomy or the use of permanent ventilation for 16 or more hours per day, and the ability to sit without support for at least 5 seconds.

In terms of safety, the study recorded 24 serious adverse events. “The most common serious adverse events were infections of the respiratory tract, and four infants died of respiratory complications; these findings are consistent with the neuromuscular respiratory failure that characterizes spinal muscular atrophy,” the authors reported. “The risdiplam-associated retinal toxic effects that had been previously observed in monkeys were not observed in the current study,” they added.

Regarding SMN protein levels, a median level of 2.1 times the baseline level was observed within 4 weeks after the initiation of treatment in the high-dose cohort, they reported. By 12 months, these median values had increased to 3.0 times and 1.9 times the baseline values in the low-dose and high-dose cohorts, respectively.
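The SMN results above are reported as median fold-changes from baseline. A minimal sketch of that calculation, using Python's standard library `statistics.median`, is below; the sample concentrations are made up for illustration and are not study data.

```python
# Fold-change from baseline, as used for the SMN protein results
# (e.g., a median level 2.1 times baseline at 4 weeks).
# The concentration values below are invented for illustration only.
from statistics import median

def median_fold_change(baseline: float, levels: list[float]) -> float:
    """Median of follow-up levels divided by the baseline level."""
    return round(median(levels) / baseline, 1)

# Hypothetical SMN concentrations (arbitrary units) at follow-up:
print(median_fold_change(10.0, [18.0, 21.0, 25.0]))  # 2.1
```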

Looking at exploratory efficacy outcomes, 90% of infants survived without ventilatory support, and seven infants in the high-dose cohort were able to sit without support for at least 5 seconds. The higher dose of risdiplam (0.2 mg/kg per day) was selected for part 2 of the study.

The first oral treatment option

Risdiplam is the third SMA treatment approved by the Food and Drug Administration, “and has the potential to expand access to treatment for people with SMA,” commented Mary Schroth, MD, chief medical officer of Cure SMA, who was not involved in the research. She added that the exploratory outcomes of the FIREFISH study represent “a significant milestone for symptomatic infants with SMA type 1.”

While the other two approved SMA therapies – nusinersen and onasemnogene abeparvovec – have led to improvements in survival and motor function, they are administered intrathecally or intravenously, respectively, whereas risdiplam is an oral therapy.

Dr. Schroth said there are currently no studies comparing the different SMA treatments. “Cure SMA is actively collecting real-world experience with risdiplam and other SMA treatments through multiple pathways,” she said. “Every individual and family, in collaboration with their health care provider, should discuss SMA treatments and make the decision that is best for them.”

Writing in Neuroscience Insights a few months after risdiplam’s FDA approval last summer, Ravindra N. Singh, MD, from the department of biomedical sciences, Iowa State University, Ames, wrote that, as an orally deliverable small molecule, risdiplam “is a major advancement for the treatment of SMA.”

Now, the FIREFISH study is “welcome news,” he said in an interview. “The results look promising so far,” he added. “I am cautiously optimistic that risdiplam would prove to be a viable alternative to the currently available invasive approaches. However, long-term studies (with appropriate age and sex-matched cohorts) would be needed to fully rule out the potential side effects of the repeated administrations.”

The therapy “is particularly great news for a group of SMA patients that might have tolerability and/or immune response concerns when it comes to nusinersen and gene therapy,” he noted in his article, adding that the ability to store and ship the drug at ambient temperatures, as well as its comparatively low cost are added benefits.

The study was supported by F. Hoffmann–La Roche. Dr. Baranello disclosed that he serves as a consultant for AveXis, F. Hoffmann-La Roche, and Sarepta Therapeutics, as well as PTC Therapeutics, from whom he also receives speaker honoraria. Dr. Schroth disclosed no personal conflicts and is an employee of Cure SMA. Cure SMA works to develop strategic relationships with corporate partners with the goal of working together to lead the way to a world without SMA. In advancement of that mission, Cure SMA has received funding from multiple corporate sources including Aetna, Biogen, Blue Cross Blue Shield, Genentech, Kaiser Permanente, Novartis Gene Therapies, Scholar Rock, and United HealthCare. Cure SMA has no financial stake in any treatment and does not advocate for one treatment over another. Dr. Singh disclosed that Spinraza (Nusinersen), the first FDA-approved SMA drug, is based on the target (US patent # 7,838,657) that was discovered in his former laboratory at UMASS Medical School, Worcester, Mass.

Issue
Neurology Reviews- 29(4)

Infants with type 1 spinal muscular atrophy (SMA) showed promising signs, including an increased expression of functional survival motor neuron (SMN) protein in the blood, after 1 year of treatment with oral risdiplam (Evrysdi, Genentech), according to results of part 1 of the FIREFISH study.

A boost in SMN expression has been linked to improvements in survival and motor function, which was also observed in exploratory efficacy outcomes in the 2-part, phase 2-3, open-label study.

“No surviving infant was receiving permanent ventilation at month 12, and 7 of the 21 infants were able to sit without support, which is not expected in patients with type 1 spinal muscular atrophy, according to historical experience,” reported the FIREFISH Working Group led by Giovanni Baranello, MD, PhD, from the Dubowitz Neuromuscular Centre, National Institute for Health Research Great Ormond Street Hospital Biomedical Research Centre, Great Ormond Street Institute of Child Health University College London, and Great Ormond Street Hospital Trust, London.

However, “it cannot be stated with confidence that there was clinical benefit of the agent because the exploratory clinical endpoints were analyzed post hoc and can only be qualitatively compared with historical cohorts,” they added.

The findings were published online Feb. 24 in the New England Journal of Medicine.
 

A phase 2-3 open-label study

The study enrolled 21 infants with type 1 SMA, between the ages of 1 and 7 months. The majority (n = 17) were treated for 1 year with high-dose risdiplam, reaching 0.2 mg/kg of body weight per day by the twelfth month. Four infants in a low-dose cohort were treated with 0.08 mg/kg by the twelfth month. The medication was administered once daily orally in infants who were able to swallow, or by feeding tube for those who could not.

The primary outcomes of this first part of the study were safety, pharmacokinetics, pharmacodynamics (including the blood SMN protein concentration), and selection of the risdiplam dose for part 2 of the study. Exploratory outcomes included event-free survival, defined as being alive without tracheostomy or the use of permanent ventilation for 16 or more hours per day, and the ability to sit without support for at least 5 seconds.

In terms of safety, the study recorded 24 serious adverse events. “The most common serious adverse events were infections of the respiratory tract, and four infants died of respiratory complications; these findings are consistent with the neuromuscular respiratory failure that characterizes spinal muscular atrophy,” the authors reported. “The risdiplam-associated retinal toxic effects that had been previously observed in monkeys were not observed in the current study,” they added.

Regarding SMN protein levels, a median level of 2.1 times the baseline level was observed within 4 weeks after the initiation of treatment in the high-dose cohort, they reported. By 12 months, these median values had increased to 3.0 times and 1.9 times the baseline values in the low-dose and high-dose cohorts, respectively.

Looking at exploratory efficacy outcomes, 90% of infants survived without ventilatory support, and seven infants in the high-dose cohort were able to sit without support for at least 5 seconds. The higher dose of risdiplam (0.2 mg/kg per day) was selected for part 2 of the study.
 

 

 

The first oral treatment option

Risdiplam is the third SMA treatment approved by the Food and Drug Administration, “and has the potential to expand access to treatment for people with SMA,” commented Mary Schroth, MD, chief medical officer of Cure SMA, who was not involved in the research. She added that the exploratory outcomes of the FIREFISH study represent “a significant milestone for symptomatic infants with SMA type 1.”

While the other two approved SMA therapies – nusinersen and onasemnogene abeparvovec – have led to improvements in survival and motor function, they are administered either intrathecally or intravenously respectively, while risdiplam is an oral therapy.

Dr. Schroth says there are currently no studies comparing the different SMA treatments. “Cure SMA is actively collecting real-world experience with risdiplam and other SMA treatments through multiple pathways,” she said. “Every individual and family, in collaboration with their health care provider, should discuss SMA treatments and make the decision that is best for them.”

Writing in Neuroscience Insights, a few months after risdiplam’s FDA approval last summer, Ravindra N. Singh MD, from the department of biomedical sciences, Iowa State University, Ames, wrote that, as an orally deliverable small molecule, risdiplam “is a major advancement for the treatment of SMA.”

Now, the FIREFISH study is “welcome news,” he said in an interview. “The results look promising so far,” he added. “I am cautiously optimistic that risdiplam would prove to be a viable alternative to the currently available invasive approaches. However, long-term studies (with appropriate age and sex-matched cohorts) would be needed to fully rule out the potential side effects of the repeated administrations.”

The therapy “is particularly great news for a group of SMA patients that might have tolerability and/or immune response concerns when it comes to nusinersen and gene therapy,” he noted in his article, adding that the ability to store and ship the drug at ambient temperatures, as well as its comparatively low cost are added benefits.

The study was supported by F. Hoffmann–La Roche. Dr. Baranello disclosed that he serves as a consultant for AveXis, F. Hoffmann-La Roche, and Sarepta Therapeutics, as well as PTC Therapeutics, from whom he also receives speaker honoraria. Dr. Schroth disclosed no personal conflicts and is an employee of Cure SMA. Cure SMA works to develop strategic relationships with corporate partners with the goal of working together to lead the way to a world without SMA. In advancement of that mission, Cure SMA has received funding from multiple corporate sources including Aetna, Biogen, Blue Cross Blue Shield, Genentech, Kaiser Permanente, Novartis Gene Therapies, Scholar Rock, and United HealthCare. Cure SMA has no financial stake in any treatment and does not advocate for one treatment over another. Dr. Singh disclosed that Spinraza (Nusinersen), the first FDA-approved SMA drug, is based on the target (US patent # 7,838,657) that was discovered in his former laboratory at UMASS Medical School, Worcester, Mass.

Infants with type 1 spinal muscular atrophy (SMA) showed promising signs, including an increased expression of functional survival motor neuron (SMN) protein in the blood, after 1 year of treatment with oral risdiplam (Evrysdi, Genentech), according to results of part 1 of the FIREFISH study.

A boost in SMN expression has been linked to improvements in survival and motor function, which was also observed in exploratory efficacy outcomes in the 2-part, phase 2-3, open-label study.

“No surviving infant was receiving permanent ventilation at month 12, and 7 of the 21 infants were able to sit without support, which is not expected in patients with type 1 spinal muscular atrophy, according to historical experience,” reported the FIREFISH Working Group led by Giovanni Baranello, MD, PhD, from the Dubowitz Neuromuscular Centre, National Institute for Health Research Great Ormond Street Hospital Biomedical Research Centre, Great Ormond Street Institute of Child Health University College London, and Great Ormond Street Hospital Trust, London.

However, “it cannot be stated with confidence that there was clinical benefit of the agent because the exploratory clinical endpoints were analyzed post hoc and can only be qualitatively compared with historical cohorts,” they added.

The findings were published online Feb. 24 in the New England Journal of Medicine.
 

A phase 2-3 open-label study

The study enrolled 21 infants with type 1 SMA, between the ages of 1 and 7 months. The majority (n = 17) were treated for 1 year with high-dose risdiplam, reaching 0.2 mg/kg of body weight per day by the twelfth month. Four infants in a low-dose cohort were treated with 0.08 mg/kg by the twelfth month. The medication was administered once daily orally in infants who were able to swallow, or by feeding tube for those who could not.

The primary outcomes of this first part of the study were safety, pharmacokinetics, pharmacodynamics (including the blood SMN protein concentration), and selection of the risdiplam dose for part 2 of the study. Exploratory outcomes included event-free survival, defined as being alive without tracheostomy or the use of permanent ventilation for 16 or more hours per day, and the ability to sit without support for at least 5 seconds.

In terms of safety, the study recorded 24 serious adverse events. “The most common serious adverse events were infections of the respiratory tract, and four infants died of respiratory complications; these findings are consistent with the neuromuscular respiratory failure that characterizes spinal muscular atrophy,” the authors reported. “The risdiplam-associated retinal toxic effects that had been previously observed in monkeys were not observed in the current study,” they added.

Regarding SMN protein levels, a median level of 2.1 times the baseline level was observed within 4 weeks after the initiation of treatment in the high-dose cohort, they reported. By 12 months, these median values had increased to 3.0 times and 1.9 times the baseline values in the low-dose and high-dose cohorts, respectively.

Looking at exploratory efficacy outcomes, 90% of infants survived without ventilatory support, and seven infants in the high-dose cohort were able to sit without support for at least 5 seconds. The higher dose of risdiplam (0.2 mg/kg per day) was selected for part 2 of the study.
 

 

 

The first oral treatment option

Risdiplam is the third SMA treatment approved by the Food and Drug Administration, “and has the potential to expand access to treatment for people with SMA,” commented Mary Schroth, MD, chief medical officer of Cure SMA, who was not involved in the research. She added that the exploratory outcomes of the FIREFISH study represent “a significant milestone for symptomatic infants with SMA type 1.”

While the other two approved SMA therapies – nusinersen and onasemnogene abeparvovec – have led to improvements in survival and motor function, they are administered either intrathecally or intravenously respectively, while risdiplam is an oral therapy.

Dr. Schroth said there are currently no studies comparing the different SMA treatments. “Cure SMA is actively collecting real-world experience with risdiplam and other SMA treatments through multiple pathways,” she said. “Every individual and family, in collaboration with their health care provider, should discuss SMA treatments and make the decision that is best for them.”

Writing in Neuroscience Insights a few months after risdiplam’s FDA approval last summer, Ravindra N. Singh, MD, of the department of biomedical sciences, Iowa State University, Ames, noted that, as an orally deliverable small molecule, risdiplam “is a major advancement for the treatment of SMA.”

Now, the FIREFISH study is “welcome news,” he said in an interview. “The results look promising so far,” he added. “I am cautiously optimistic that risdiplam would prove to be a viable alternative to the currently available invasive approaches. However, long-term studies (with appropriate age and sex-matched cohorts) would be needed to fully rule out the potential side effects of the repeated administrations.”

The therapy “is particularly great news for a group of SMA patients that might have tolerability and/or immune response concerns when it comes to nusinersen and gene therapy,” he noted in his article, adding that the ability to store and ship the drug at ambient temperatures, as well as its comparatively low cost, are added benefits.

The study was supported by F. Hoffmann–La Roche. Dr. Baranello disclosed that he serves as a consultant for AveXis, F. Hoffmann-La Roche, and Sarepta Therapeutics, as well as PTC Therapeutics, from whom he also receives speaker honoraria. Dr. Schroth disclosed no personal conflicts and is an employee of Cure SMA. Cure SMA works to develop strategic relationships with corporate partners with the goal of working together to lead the way to a world without SMA. In advancement of that mission, Cure SMA has received funding from multiple corporate sources including Aetna, Biogen, Blue Cross Blue Shield, Genentech, Kaiser Permanente, Novartis Gene Therapies, Scholar Rock, and United HealthCare. Cure SMA has no financial stake in any treatment and does not advocate for one treatment over another. Dr. Singh disclosed that Spinraza (Nusinersen), the first FDA-approved SMA drug, is based on the target (US patent # 7,838,657) that was discovered in his former laboratory at UMASS Medical School, Worcester, Mass.

Issue
Neurology Reviews- 29(4)

Article Source

FROM THE NEW ENGLAND JOURNAL OF MEDICINE

Publish date: March 9, 2021

Adherence and discontinuation limit triptan outcomes


Poor adherence and high discontinuance rates frequently compromise achieving optimal triptan therapy in managing acute migraine headaches, a new Danish study shows.

“Few people continue on triptans either due to lack of efficacy or too many adverse events,” said Alan Rapoport, MD, clinical professor of neurology at the University of California, Los Angeles. “Some people overuse triptans when they are available and work well, but the patients are not properly informed, and do not listen.”

Migraine is among the most common neurologic disorders and ranks second among diseases contributing to years lived with disability. An estimated 11.7% of people experience migraine episodes annually, and the disorder remains prevalent throughout patients’ lives.

Triptans were recognized as a highly effective option for acute migraine management when first introduced in the early 1990s, and they remain the first-line treatment for acute migraine not adequately controlled by ordinary analgesics and NSAIDs. Side-effect profiles vary within the class, but frequent users risk medication overuse headache, a condition marked by migraines of increased frequency and intensity.
 

25 years of triptan use

Study investigators conducted a nationwide, register-based cohort study using data collected from 7,435,758 Danish residents who accessed the public health care system between Jan. 1, 1994, and Oct. 31, 2019. The time frame accounts for a period of 139.0 million person-years when the residents were both alive and living in Denmark. Their findings were published online Feb. 14, 2021, in Cephalalgia.

Researchers evaluated and summarized purchases of all triptans in all dosage forms sold in Denmark during that time frame: sumatriptan, naratriptan, zolmitriptan, rizatriptan, almotriptan, eletriptan, and frovatriptan. Based on these data, 381,695 patients purchased a triptan at least once. Triptan users were more likely to be female (75.7%) than male (24.3%).

Dr. Rapoport, who was not involved in the study, feels the differences in use between genders extrapolate to the U.S. migraine population as well. “Three times more women have migraines than men and buy triptans in that ratio,” he said.

Any patient who purchased at least one triptan at any point during the study was classified as a triptan user. Triptan overuse was defined, per the International Classification of Headache Disorders, as using a triptan on 10 or more days per month for 3 consecutive months. It is worth noting that triptans are prescribed for only two indications, migraine and cluster headache, and cluster headache is extremely rare.

The study’s investigators summarized data collected throughout Denmark over more than a quarter of a century. The findings show an increase in triptan use from 345 to 945 defined daily doses per 1,000 residents per year, along with an increase in the prevalence of triptan use from 5.17 to 14.57 per 1,000 inhabitants. In addition, 12.3% of Danish residents with migraine bought a triptan between 2014 and 2019, a figure Dr. Rapoport noted falls in line with trends in other Western countries, which range between 12% and 13%.

Nearly half of first-time triptan buyers (43%) did not purchase another triptan within 5 years. In conflict with established guidelines, 90% of patients who discontinued triptan treatment had tried only one type of triptan.

One factor that eased data collection is that Danish residents have free health care, coupled with sizable reimbursements for medication spending. The country’s accessible health care system removes barriers of price and availability, yielding data that more accurately reflect patients’ experience of treatment need and satisfaction.

“In a cohort with access to free clinical consultations and low medication costs, we observed low rates of triptan adherence, likely due to disappointing efficacy and/or unpleasant side effects rather than economic considerations. Triptan success continues to be hindered by poor implementation of clinical guidelines and high rates of treatment discontinuance,” the researchers concluded.

“The most surprising thing about this study is it is exactly what I would have expected if triptans in the U.S. were free,” Dr. Rapoport said.

Dr. Rapoport is the editor in chief of Neurology Reviews and serves as a consultant to several pharmaceutical companies.

Issue
Neurology Reviews- 29(4)

Article Source

FROM CEPHALALGIA

Publish date: March 8, 2021

Core feature of frontotemporal dementia may aid diagnosis


Increased white matter hyperintensities (WMH) are strongly associated with Alzheimer’s disease, but new research reveals they are also a “core feature” of frontotemporal dementia (FTD), a finding that may help physicians make this difficult diagnosis, which affects adults in their prime.

“The assessment of WMH can aid differential diagnosis of bvFTD [behavioral-variant FTD] against other neurodegenerative conditions in the absence of vascular risk factors, especially when considering their spatial distribution,” said senior author Ramón Landin-Romero, PhD, Appenzeller Neuroscience Fellow, Frontotemporal Dementia Research Group, University of Sydney.

“Clinicians can ask for specific sequences in routine MRI scans to visually detect WMH,” said Dr. Landin-Romero, who is also a senior lecturer in the School of Psychology and Brain and Mind Center.

The study was published online Feb. 17 in Neurology.
 

Difficult diagnosis

“FTD is a collection of unrecognized young-onset (before age 65) dementia syndromes that affect people in their prime,” said Dr. Landin-Romero. He added that heterogeneity in progression trajectories and symptoms, which can include changes in behavior and personality, language impairments, and psychosis, make it a difficult disease to diagnose.

“As such, our research was motivated by the need of sensitive and specific biomarkers of FTD, which are urgently needed to aid diagnosis, prognosis, and treatment development,” he said.

Previous research has been limited; there have only been a “handful” of cohort and case studies and studies involving individuals with mutations in one FTD-causative gene.

FTD is genetically and pathologically complex, and there has been no clear correlation between genetic mutations/underlying pathology and clinical presentation, Dr. Landin-Romero said.

WMH are common in older individuals and are linked to increased risk for cognitive impairment and dementia. Traditionally, they have been associated with vascular risk factors, such as smoking and diabetes. “But the presentation of WMH in FTD and its associations with the severity of symptoms and brain atrophy across FTD symptoms remains to be established,” said Dr. Landin-Romero.
 

Higher disease severity

To explore the possible association, the researchers studied 129 patients with either bvFTD (n = 64; mean age, 64 years) or Alzheimer’s disease (n = 65; mean age, 64.66 years).

Neuropsychological assessments, medical and neurologic examinations, clinical interview, and structural brain MRI were conducted for all patients, who were compared with 66 age-, sex-, and education-matched healthy control persons (mean age, 64.69 years).

Some participants in the FTD, Alzheimer’s disease, and healthy control groups (n = 54, 44, and 26, respectively) also underwent genetic screening. Postmortem pathology findings were available for a small number of FTD and Alzheimer’s disease participants (n = 13 and 5, respectively).

The medical history included lifestyle and cardiovascular risk factors, as well as other health and neurologic conditions and medication history. Hypertension, hypercholesterolemia, diabetes, and smoking were used to assess vascular risk.

The FTD and Alzheimer’s disease groups did not differ with regard to disease duration (3.55 years [SD, 1.75] vs. 3.24 years [SD, 1.59], respectively). However, disease severity was significantly higher among those with FTD than among those with Alzheimer’s disease, as measured by the FTD Rating Scale Rasch score (–0.52 [SD, 1.28] vs. 0.78 [SD, 1.55]; P < .001).

Compared with healthy controls, patients in the FTD and Alzheimer’s disease groups scored significantly lower on the Addenbrooke’s Cognitive Examination–Revised (ACE-R) or ACE-III scale. Patients with Alzheimer’s disease showed “disproportionately larger deficits” in memory and visuospatial processing, compared with those with FTD, whereas those with FTD performed significantly worse than those with Alzheimer’s disease in the fluency subdomain.

A larger number of patients in the FTD group screened positive for genetic abnormalities than in the Alzheimer’s disease group; no participants in the healthy control group had genetic mutations.

Unexpected findings

Mean WMH volume was significantly higher in participants with FTD than in participants with Alzheimer’s disease and in healthy controls (mean, 0.76 mL, 0.40 mL, and 0.12 mL, respectively). These larger volumes contributed to greater disease severity and cortical atrophy. Moreover, disease severity was “found to be a strong predictor of WMH volume in FTD,” the authors stated. Among patients with FTD, WMH volumes did not differ significantly with regard to genetic mutation status or presence of a strong family history.

After controlling for age, vascular risk did not significantly predict WMH volume in the FTD group (P = .16); however, that did not hold true in the Alzheimer’s disease group.

Increased WMH were associated with anterior brain regions in FTD and with posterior brain regions in Alzheimer’s disease. In both disorders, higher WMH volume in the corpus callosum was associated with poorer cognitive performance in the domain of attention.

“The spatial distribution of WMH mirrored patterns of brain atrophy in FTD and Alzheimer’s disease, was partially independent of cortical degeneration, and was correlated with cognitive deficits,” said Dr. Landin-Romero.

The findings were not what he and his research colleagues expected. “We were expecting that the amounts of WMH would be similar in FTD and Alzheimer’s disease, but we actually found higher levels in participants with FTD,” he said. Additionally, he anticipated that patients with either FTD or Alzheimer’s disease who had more severe disease would have more WMH, but that finding only held true for people with FTD.

“In sum, our findings show that WMH are a core feature of FTD and Alzheimer’s disease that can contribute to cognitive problems, and not simply as a marker of vascular disease,” said Dr. Landin-Romero.
 

Major research contribution

Commenting on the study, Jordi Matias-Guiu, PhD, MD, of the department of neurology, Hospital Clinico, San Carlos, Spain, considers the study to be a “great contribution to the field.” Dr. Matias-Guiu, who was not involved with the study, said that WMH “do not necessarily mean vascular pathology, and atrophy may partially explain these abnormalities and should be taken into account in the interpretation of brain MRI.

“WMH are present in both Alzheimer’s disease and FTD and are relevant to cognitive deficits found in these disorders,” he added.

The study was funded by grants from the National Health and Medical Research Council of Australia, the Dementia Research Team, and the ARC Center of Excellence in Cognition and Its Disorders. Dr. Landin-Romero is supported by the Appenzeller Neuroscience Fellowship in Alzheimer’s Disease and the ARC Center of Excellence in Cognition and Its Disorders Memory Program. The other authors’ disclosures are listed on the original article. Dr. Matias-Guiu reports no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews- 29(4)
Publications
Topics
Sections

Increased white matter hyperintensities (WMH) are strongly associated with Alzheimer’s disease, but new research reveals they are also a “core feature” of frontotemporal dementia (FTD) in findings that may help physicians make this difficult diagnosis that affects adults in their prime.

“The assessment of WMH can aid differential diagnosis of bvFTD [behavioral-variant FTD] against other neurodegenerative conditions in the absence of vascular risk factors, especially when considering their spatial distribution,” said senior author Ramón Landin-Romero, PhD, Appenzeller Neuroscience Fellow, Frontotemporal Dementia Research Group, University of Sydney.

“Clinicians can ask for specific sequences in routine MRI scans to visually detect WMH,” said Dr. Landin-Romero, who is also a senior lecturer in the School of Psychology and Brain and Mind Center.

The study was published online Feb. 17 in Neurology.
 

Difficult diagnosis

“FTD is a collection of unrecognized young-onset (before age 65) dementia syndromes that affect people in their prime,” said Dr. Landin-Romero. He added that heterogeneity in progression trajectories and symptoms, which can include changes in behavior and personality, language impairments, and psychosis, make it a difficult disease to diagnose.

“As such, our research was motivated by the need of sensitive and specific biomarkers of FTD, which are urgently needed to aid diagnosis, prognosis, and treatment development,” he said.

Previous research has been limited; there have only been a “handful” of cohort and case studies and studies involving individuals with mutations in one FTD-causative gene.

FTD is genetically and pathologically complex, and there has been no clear correlation between genetic mutations/underlying pathology and clinical presentation, Dr. Landin-Romero said.

WMH are common in older individuals and are linked to increased risk for cognitive impairment and dementia. Traditionally, they have been associated with vascular risk factors, such as smoking and diabetes. “But the presentation of WMH in FTD and its associations with the severity of symptoms and brain atrophy across FTD symptoms remains to be established,” said Dr. Landin-Romero.
 

Higher disease severity

To explore the possible association, the researchers studied 129 patients with either bvFTD (n = 64; mean age, 64 years) or Alzheimer’s disease (n = 65; mean age, 64.66 years).

Neuropsychological assessments, medical and neurologic examinations, clinical interview, and structural brain MRI were conducted for all patients, who were compared with 66 age-, sex-, and education-matched healthy control persons (mean age, 64.69 years).

Some participants in the FTD, Alzheimer’s disease, and healthy control groups (n = 54, 44, and 26, respectively) also underwent genetic screening. Postmortem pathology findings were available for a small number of FTD and Alzheimer’s disease participants (n = 13 and 5, respectively).

The medical history included lifestyle and cardiovascular risk factors, as well as other health and neurologic conditions and medication history. Hypertension, hypercholesterolemia, diabetes, and smoking were used to assess vascular risk.

The FTD and Alzheimer’s disease groups did not differ with regard to disease duration (3.55 years; standard deviation, 1.75, and 3.24 years; SD, 1.59, respectively). However, disease severity was significantly higher among those with FTD than among those with Alzheimer’s disease, as measured by the FTD Rating Scale Rasch score (–0.52; SD, 1.28, vs. 0.78; SD, 1.55; P < .001).

Compared with healthy controls, patients in the FTD and Alzheimer’s disease groups scored significantly lower on the Addenbrooke’s Cognitive Examination–Revised (ACE-R) or ACE-III scale. Patients with Alzheimer’s disease showed “disproportionately larger deficits” in memory and visuospatial processing, compared with those with FTD, whereas those with FTD performed significantly worse than those with Alzheimer’s disease in the fluency subdomain.

A larger number of patients in the FTD group screened positive for genetic abnormalities than in the Alzheimer’s disease group; no participants in the healthy control group had genetic mutations.
 

 

 

Unexpected findings

Mean WMH volume was significantly higher in participants with FTD than in participants with Alzheimer’s disease and in healthy controls (mean, 0.76 mL, 0.40 mL, and 0.12 mL respectively). These larger volumes contributed to greater disease severity and cortical atrophy. Moreover, disease severity was “found to be a strong predictor of WMH volume in FTD,” the authors stated. Among patients with FTD, WMH volumes did not differ significantly with regard to genetic mutation status or presence of strong family history.

After controlling for age, vascular risk did not significantly predict WMH volume in the FTD group (P = .16); however, that did not hold true in the Alzheimer’s disease group.

Increased WMH were associated with anterior brain regions in FTD and with posterior brain regions in Alzheimer’s disease. In both disorders, higher WMH volume in the corpus callosum was associated with poorer cognitive performance in the domain of attention.

“The spatial distribution of WMH mirrored patterns of brain atrophy in FTD and Alzheimer’s disease, was partially independent of cortical degeneration, and was correlated with cognitive deficits,” said Dr. Landin-Romero.

The findings were not what he and his research colleagues expected. “We were expecting that the amounts of WMH would be similar in FTD and Alzheimer’s disease, but we actually found higher levels in participants with FTD,” he said. Additionally, he anticipated that patients with either FTD or Alzheimer’s disease who had more severe disease would have more WMH, but that finding only held true for people with FTD.

“In sum, our findings show that WMH are a core feature of FTD and Alzheimer’s disease that can contribute to cognitive problems, and not simply as a marker of vascular disease,” said Dr. Landin-Romero.
 

Major research contribution

Commenting on the study, Jordi Matias-Guiu, PhD, MD, of the department of neurology, Hospital Clinico, San Carlos, Spain, considers the study to be a “great contribution to the field.” Dr. Matias-Guiu, who was not involved with the study, said that WMH “do not necessarily mean vascular pathology, and atrophy may partially explain these abnormalities and should be taken into account in the interpretation of brain MRI.

“WMH are present in both Alzheimer’s disease and FTD and are relevant to cognitive deficits found in these disorders,” he added.

The study was funded by grants from the National Health and Medical Research Council of Australia, the Dementia Research Team, and the ARC Center of Excellence in Cognition and Its Disorders. Dr. Landin-Romero is supported by the Appenzeller Neuroscience Fellowship in Alzheimer’s Disease and the ARC Center of Excellence in Cognition and Its Disorders Memory Program. The other authors’ disclosures are listed on the original article. Dr. Matias-Guiu reports no relevant financial relationships.

A version of this article first appeared on Medscape.com.


Issue
Neurology Reviews- 29(4)
Publish date: February 25, 2021

EEG data may aid diagnosis, treatment of focal epilepsy


Continuous intracranial electroencephalography (cEEG) may help pinpoint optimal timing of diagnostic studies and treatment for patients with focal epilepsy, new research suggests. Findings from a large longitudinal study show that seizure onset in patients with focal epilepsy follows circadian, multiday, and annual cycles.

“Although daily and multiday rhythms have previously been identified, the extent to which these nonrandom rhythms exist in a larger cohort has been unclear,” said study investigator Joline Marie Fan, MD, a clinical fellow at the University of California, San Francisco. “This means that a patient with epilepsy may have a unique combination of seizure rhythms that can inform the days and timing of his or her highest seizure risk,” she added.

The study was published online Feb. 8 in JAMA Neurology.
 

Distinct chronotypes

Clinicians and patients alike have long observed cyclical patterns in the onset of epileptic seizures. However, such patterns have rarely been measured in a quantitative way.

Previous studies have examined seizure cycles using inpatient seizure monitoring and patients’ seizure diaries, but the duration of these recordings and their accuracy have been limited. Within the past decade, the advent of cEEG has allowed researchers to observe the cyclical pattern of interictal epileptiform activity, but the numbers of patients involved in such studies have been limited.

To investigate seizure chronotypes in greater detail, the researchers examined retrospective data for 222 adults with medically refractory focal epilepsy who took part in clinical trials of the NeuroPace responsive neurostimulation (RNS) system.

After implantation in the brain, this system monitors the seizure focus or foci continuously and delivers stimulation to stop seizures. Participants also kept seizure diaries and classified their seizures as simple motor, simple other, complex partial, and generalized tonic-clonic.

Dr. Fan’s group examined three subpopulations of patients to investigate three durations of seizure cycles. They examined self-reported disabling seizures, electrographic seizures, and interictal epileptiform activity. Because patients did not record the time of their disabling seizures, the investigators examined them only in multidien and circannual cycles.

To examine circannual seizure cycles, the investigators included 194 patients who kept continuous seizure diaries for 2 or more years and who reported 24 or more days in which disabling seizures occurred.

To examine multidien seizure cycles, they included 186 participants who reported 24 or more days with disabling seizures over a period of 6 or more months during which the RNS system collected cEEG data. To examine circadian seizure cycles, they included 85 patients who had 48 or more hours with nonzero electrographic seizure counts during 6 or more months of cEEG data collection.

Phase-locking value (PLV) was used to determine the strength of a cycle (i.e., the degree of consistency with which seizures occur during certain phases of a cycle). A PLV of 0 represents a uniform distribution of events during various phases of a cycle; a PLV of 1 indicates that all events occur exactly at the same phase of a cycle.
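As a rough illustration (not code from the study), the PLV described above is simply the magnitude of the average unit phasor of the event phases: events spread evenly around a cycle cancel out, while events at a single phase add up. A minimal Python sketch with made-up event phases:

```python
import cmath
import math

def phase_locking_value(phases):
    """Return the phase-locking value (PLV) for event phases in radians.

    PLV is the magnitude of the mean unit phasor: 0 means events are
    uniformly spread over the cycle, 1 means every event falls at
    exactly the same phase.
    """
    if not phases:
        raise ValueError("need at least one event phase")
    mean_phasor = sum(cmath.exp(1j * p) for p in phases) / len(phases)
    return abs(mean_phasor)

# Hypothetical examples: ten events always at the same phase of a cycle,
# and ten events spread evenly around the cycle.
same = [math.pi / 4] * 10
uniform = [2 * math.pi * k / 10 for k in range(10)]

print(round(phase_locking_value(same), 3))     # 1.0
print(round(phase_locking_value(uniform), 3))  # 0.0
```

The event phases themselves would come from mapping each seizure's timestamp onto the cycle of interest (e.g., time of day onto a 24-hour cycle).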

The population’s median age was 35 years, and the sample included approximately equal numbers of men and women. Patients’ focal epilepsies included mesiotemporal (57.2%), frontal (14.0%), neocortical-temporal (9.9%), parietal (4.1%), occipital (1.4%), and multifocal (13.5%). The data included 1,118 patient-years of cEEG, 754,108 electrographic seizures, and 313,995 self-reported seizures.

The prevalence of statistically significant circannual seizure cycles in this population was 12%; the prevalence of multidien seizure cycles was 60%, and that of circadian seizure cycles was 89%. Multidien cycles (mean PLV, 0.34) and circadian cycles (mean PLV, 0.34) were stronger than circannual cycles (mean PLV, 0.17).

Among patients with circannual seizure cycles, seizures showed a weak to moderate tendency to cluster in one of the four seasons for a given patient, but across the group there was no single season in which seizure onset peaked.

Among patients with multidien seizure cycles, investigators identified five patterns of interictal epileptiform activity fluctuation. One pattern had irregular periodicity; the others reached peak periodicity at 7, 15, 20, and 30 days. Some patients showed more than one periodicity. For most patients, electrographic or self-reported seizures tended to occur on the rising phase of the interictal epileptiform activity cycle, and interictal epileptiform activity increased on the days around seizures.
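Multiday periodicities like those described above can be recovered from a daily count series by spectral analysis. A simplified Python sketch using a naive discrete Fourier transform on a synthetic 15-day rhythm (all numbers are invented for illustration; the study's actual cycle-detection methodology may differ):

```python
import cmath
import math

# Synthetic daily counts of interictal epileptiform activity (IEA):
# a baseline plus a 15-day rhythm. All values are hypothetical.
n_days = 180
period = 15.0
counts = [100 + 40 * math.sin(2 * math.pi * d / period) for d in range(n_days)]

# Remove the mean so the zero-frequency bin does not dominate.
mean = sum(counts) / n_days
centered = [c - mean for c in counts]

def dft_power(signal, k):
    """Power of the k-th discrete Fourier transform bin of `signal`."""
    n = len(signal)
    coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
    return abs(coeff) ** 2

# The strongest nonzero frequency bin gives the dominant cycle length.
powers = [dft_power(centered, k) for k in range(1, n_days // 2)]
best_k = powers.index(max(powers)) + 1
print(n_days / best_k)  # dominant period in days -> 15.0
```

Real IEA series are noisier and can carry several periodicities at once, which is why several multidien patterns could coexist in a single patient.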

Results showed there were five main seizure peak times among patients with circadian seizure cycles: midnight, 3:00 a.m., 9:00 a.m., 2:00 p.m., and 6:00 p.m. These findings corroborate the observations of previous investigations, the researchers noted. Hourly interictal epileptiform activity peaked during the night, regardless of peak seizure time.

“Although the neurostimulation device offers us a unique opportunity to investigate electrographic seizure activity quantitatively, the generalizability of our study is limited to the patient cohort that we studied,” said Dr. Fan. “The study findings are limited to patients with neurostimulation devices used for intractable focal epilepsies.”

The results support patients’ impressions that their seizures occur in a cyclical pattern.

“Ultimately, these findings will be helpful for developing models to aid with seizure forecasting and prediction in order to help reduce the uncertainty of seizure timing for patients with epilepsy,” said Dr. Fan.

“Other implications include optimizing the timing for patients to be admitted into the hospital for seizure characterization based on their seizure chronotype, or possibly tailoring a medication regimen in accordance with a patient’s seizure cycles,” she added.
 

 

 

Need for more research

Commenting on the findings, Tobias Loddenkemper, MD, professor of neurology at Harvard Medical School, Boston, noted that the study is “one of the largest longitudinal seizure pattern analyses, based on the gold standard of intracranially recorded epileptic seizures.”

The research, he added, extends neurologists’ understanding of seizure patterns over time, expands knowledge about seizure chronotypes, and emphasizes a relationship between interictal epileptiform activity and seizures.

The strengths of the study include the recording of seizures with intracranial EEG, its large number of participants, and the long duration of recordings, Dr. Loddenkemper said.

However, he said, it is important to note that self-reports are not always reliable. The results may also reflect the influence of potential confounders of seizure patterns, such as seizure triggers, treatment, stimulation, or sleep-wake, circadian, or hormonal cycles, he added.

“In the short term, validation studies, as well as confirmatory studies with less invasive sensors, may be needed,” said Dr. Loddenkemper.

“This could potentially include a trial that confirms findings prospectively, utilizing results from video EEG monitoring admissions. In the long term, seizure detection and prediction, as well as interventional chronotherapeutic trials, may be enabled, predicting seizures in individual patients and treating at times of greatest seizure susceptibility.”

The study was supported by grants to some of the authors from the Wyss Center for Bio and Neuroengineering, the Ernest Gallo Foundation, the Swiss National Science Foundation, and the Velux Stiftung. Dr. Fan has disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.



Issue
Neurology Reviews- 29(4)
Article Source

FROM JAMA NEUROLOGY

Publish date: February 25, 2021

New data may help intercept head injuries in college football


Novel research from the Concussion Assessment, Research and Education (CARE) Consortium sheds new light on how to effectively reduce the incidence of concussion and head injury exposure in college football.

Dr. Michael McCrea

The study, led by neurotrauma experts Michael McCrea, PhD, and Brian Stemper, PhD, professors of neurosurgery at the Medical College of Wisconsin in Milwaukee, reports data from hundreds of college football players across five seasons and shows concussion incidence and head injury exposure are disproportionately higher in the preseason versus the regular season.

Dr. Brian Stemper

The research also reveals that such injuries occur more often during practices than games.

“We think that with the findings from this paper, there’s a role for everybody to play in reducing injury,” Dr. McCrea said. “We hope these data help inform broad-based policy about practice and preseason training policies in collegiate football. We also think there’s a role for athletic administrators, coaches, and even athletes themselves.”

The study was published online Feb. 1 in JAMA Neurology.
 

More injuries in preseason

Concussion is one of the most common injuries in football. Beyond these harms are growing concerns that repetitive head injury exposure (HIE) may increase the risk of long-term neurologic health problems, including chronic traumatic encephalopathy (CTE).

The CARE Consortium, which has been conducting research with college athletes across 26 sports and military cadets since 2014, has been interested in multiple facets of concussion and brain trauma.

“We’ve enrolled more than 50,000 athletes and service academy cadets into the consortium over the last 6 years to research all involved aspects including the clinical core, the imaging core, the blood biomarker core, and the genetic core, and we have a head impact measurement core,” said Dr. McCrea.

To investigate the pattern of concussion incidence across the football season in college players, the investigators used impact measurement technology across six Division I NCAA football programs participating in the CARE Consortium from 2015 to 2019.

A total of 658 players – all male, mean age 19 years – were fitted with the Head Impact Telemetry System (HITS) sensor arrays in their helmets to measure head impact frequency, location, and magnitude during play.

“This particular study had built-in algorithms that weeded out impacts that were below 10G of linear magnitude, because those have been determined not likely to be real impacts,” Dr. McCrea said.

Across the five seasons studied, 528,684 recorded head impacts met the quality standards for analysis. Players sustained a median of 415 impacts per season (interquartile range [IQR], 190-727).

In all, 68 players sustained a diagnosed concussion. Nearly half (48.5%) of concussions occurred during preseason training, despite the preseason representing only 20.8% of the football season. Total head injury exposure in the preseason occurred at twice the rate of the regular season (324.9 vs. 162.4 impacts per team per day; mean difference, 162.6 impacts; 95% confidence interval, 110.9-214.3; P < .001).
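As a back-of-the-envelope check of the figures above (illustrative arithmetic only, not the study's statistical analysis; the variable names are ours), the reported rates can be compared directly:

```python
# Reported head injury exposure rates (impacts per team per day),
# taken from the study figures quoted above.
preseason_rate = 324.9
regular_season_rate = 162.4

# Preseason exposure runs at roughly twice the regular-season rate.
rate_ratio = preseason_rate / regular_season_rate

# Absolute difference; the paper reports 162.6, presumably computed
# from unrounded underlying values.
mean_difference = preseason_rate - regular_season_rate

# Concussion burden is similarly disproportionate: 48.5% of concussions
# occurred in a window covering only 20.8% of the season.
concussion_disproportion = 0.485 / 0.208
```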

“Preseason training often has a much higher intensity to it, in terms of the total hours, the actual training, and the heavy emphasis on full-contact drills like tackling and blocking,” said Dr. McCrea. “Even the volume of players that are participating is greater.”

Results also showed that in each of the five seasons, head injury exposure per athlete was highest in August (preseason) (median, 146.0 impacts; IQR, 63.0-247.8) and lowest in November (median, 80.0 impacts; IQR, 35.0-148.0). In the studied period, 72% of concussions and 66.9% of head injury exposure occurred in practice. Even within the regular season, total head injury exposure in practices was 84.2% higher than in games.

“This incredible dataset we have on head impact measurement also gives us the opportunity to compare it with our other research looking at the correlation between a single head impact and changes in brain structure and function on MRI, on blood biomarkers, giving us the ability to look at the connection between mechanism of effect of injury and recovery from injury,” said Dr. McCrea.

These findings also provide an opportunity to modify approaches to preseason training and football practices to keep players safer, said Dr. McCrea, noting that about half of the variance in head injury exposure is at the level of the individual athlete.

“With this large body of athletes we’ve instrumented, we can look at, for instance, all of the running backs and understand the athlete and what his head injury exposure looks like compared to all other running backs. If we find out that an athlete has a rate of head injury exposure that’s 300% higher than most other players that play the same position, we can take that data directly to the athlete to work on their technique and approach to the game.

“Every researcher wishes that their basic science or their clinical research findings will have some impact on the health and well-being of the population they’re studying. By modifying practices and preseason training, football teams could greatly reduce the risk of injury and exposure for their players, while still maintaining the competitive nature of game play,” he added.  

Through a combination of policy and education, similar strategies could be implemented to help prevent concussion and HIE in high school and youth football too, said Dr. McCrea.

 

 

‘Shocking’ findings

In an accompanying editorial, Christopher J. Nowinski, PhD, of the Concussion Legacy Foundation, Boston, and Robert C. Cantu, MD, department of neurosurgery, Emerson Hospital, Concord, Massachusetts, said the findings could have significant policy implications and offer a valuable expansion of prior research.

“From 2005 to 2010, studies on college football revealed that about two-thirds of head impacts occurred in practice,” they noted. “We cited this data in 2010 when we proposed to the NFL Players Association that the most effective way to reduce the risks of negative neurological outcomes was to reduce hitting in practice. They agreed, and in 2011 collectively bargained for severe contact limits in practice, with 14 full-contact practices allowed during the 17-week season. Since that rule was implemented, only 18% of NFL concussions have occurred in practice.”

“Against this backdrop, the results of the study by McCrea et al. are shocking,” they added. “It reveals that college football players still experience 72% of their concussions and 67% of their total head injury exposure in practice.”

Even more shocking, noted Dr. Nowinski and Dr. Cantu, is that these numbers are almost certainly an underestimate of the dangers of practice.

“As a former college football player and a former team physician, respectively, we find this situation inexcusable. Concussions in games are inevitable, but concussions in practice are preventable,” they wrote.  

“Laudably,” they added, “the investigators call on the NCAA and football conferences to explore policy and rule changes to reduce concussion incidence and HIE and to create robust educational offerings to encourage change from coaches and college administrators.”

A version of this article first appeared on Medscape.com.

FROM JAMA NEUROLOGY

New steroid dosing regimen for myasthenia gravis

Article Type
Changed
Thu, 12/15/2022 - 15:42

 

The findings of a new randomized trial support the rapid tapering of prednisone in patients with generalized myasthenia gravis requiring combined corticosteroid and azathioprine therapy. The trial showed that the conventional slow-tapering regimen enabled discontinuation of prednisone earlier than previously reported, but the new rapid-tapering regimen enabled an even faster discontinuation.

Although both regimens led to a comparable myasthenia gravis status and prednisone dose at 15 months, the authors stated: “We think that the reduction of the cumulative dose over a year (equivalent to 5 mg/day) is a clinically relevant reduction, since the risk of complications is proportional to the daily or cumulative doses of prednisone.

“Our results warrant testing of a more rapid-tapering regimen in a future trial. In the meantime, our trial provides useful information on how prednisone tapering could be managed in patients with generalized myasthenia gravis treated with azathioprine,” they concluded.

The trial was published online Feb. 8 in JAMA Neurology.

Myasthenia gravis is a disorder of neuromuscular transmission, resulting from autoantibodies to components of the neuromuscular junction, most commonly the acetylcholine receptor. The incidence ranges from 0.3 to 2.8 per 100,000, and it is estimated to affect more than 700,000 people worldwide.

The authors of the current paper, led by Tarek Sharshar, MD, PhD, Groupe Hospitalier Universitaire, Paris, explained that many patients whose symptoms are not controlled by cholinesterase inhibitors are treated with corticosteroids and an immunosuppressant, usually azathioprine. No specific dosing protocol for prednisone has been validated, but it is commonly gradually increased to 0.75 mg/kg on alternate days and reduced progressively when minimal manifestation status (MMS; no symptoms or functional limitations) is reached.  

They noted that this regimen leads to high and prolonged corticosteroid treatment – often for several years – with the mean daily prednisone dose exceeding 30 mg/day at 15 months and 20 mg/day at 36 months. As long-term use of corticosteroids is often associated with significant complications, reducing or even discontinuing prednisone treatment without destabilizing myasthenia gravis is therefore a therapeutic goal.


 

Evaluating dosage regimens

To investigate whether different dosage regimens could help wean patients with generalized myasthenia gravis from corticosteroid therapy without compromising efficacy, the researchers conducted this study in which the current recommended regimen was compared with an approach using higher initial corticosteroid doses followed by rapid tapering.

In the conventional slow-tapering group (control group), prednisone was given on alternate days, starting at a dose of 10 mg then increased by increments of 10 mg every 2 days up to 1.5 mg/kg on alternate days without exceeding 100 mg. This dose was maintained until MMS was reached and then reduced by 10 mg every 2 weeks until a dosage of 40 mg was reached, with subsequent slowing of the taper to 5 mg monthly. If MMS was not maintained, the alternate-day prednisone dose was increased by 10 mg every 2 weeks until MMS was restored, and the tapering resumed 4 weeks later.
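To make the escalation arithmetic concrete, the ramp-up phase of this conventional regimen can be sketched as follows (an illustrative model of the published schedule only, not clinical guidance; the 70-kg weight is a hypothetical example, not a figure from the trial):

```python
def slow_taper_ramp_up(weight_kg):
    """Alternate-day prednisone doses (mg) during escalation: start at
    10 mg, increase by 10 mg every 2 days, up to 1.5 mg/kg on alternate
    days without exceeding 100 mg (per the trial description)."""
    target = min(1.5 * weight_kg, 100.0)
    doses, dose = [], 10.0
    while dose < target:
        doses.append(dose)
        dose += 10.0
    doses.append(min(dose, target))  # final step, capped at the target
    return doses

# For a hypothetical 70-kg patient the 100-mg cap applies:
# 10, 20, ..., 100 mg on alternate days, i.e., nine 2-day increments.
```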

In the new rapid-tapering group, oral prednisone was immediately started at 0.75 mg/kg per day, and this was followed by an earlier and rapid decrease once improved myasthenia gravis status was attained. Three different tapering schedules were applied dependent on the improvement status of the patient.

First, if the patient reached MMS at 1 month, the dose of prednisone was reduced by 0.1 mg/kg every 10 days down to 0.45 mg/kg per day, then by 0.05 mg/kg every 10 days down to 0.25 mg/kg per day, and then in decrements of 1 mg, with the duration of each decrement adjusted according to the participant’s weight, the aim being complete cessation of corticosteroid therapy within 18-20 weeks of this third stage of tapering.

Second, if the state of MMS was not reached at 1 month but the participant had improved, a slower tapering was conducted, with the dosage reduced in a similar way to the first instance but with each reduction introduced every 20 days. If the participant reached MMS during this tapering process, the tapering of prednisone was similar to the sequence described in the first group.

Third, if MMS was not reached and the participant had not improved, the initial dose was maintained for the first 3 months; beyond that time, a decrease in the prednisone dose was undertaken as in the second group to a minimum dose of 0.25 mg/kg per day, after which the prednisone dose was not reduced further. If the patient improved, the tapering of prednisone followed the sequence described in the second category.
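For the first category (MMS reached at 1 month), the sequence of dose levels implied by the description above can be sketched as follows (an illustrative reading of the protocol, not clinical guidance; the final weight-dependent 1-mg decrement phase is omitted):

```python
def rapid_taper_levels():
    """Daily prednisone dose levels (mg/kg/day) after MMS at 1 month:
    start at 0.75, drop by 0.1 every 10 days to 0.45, then by 0.05
    every 10 days to 0.25, after which 1-mg decrements (weight-dependent,
    omitted here) run the dose to zero within 18-20 weeks."""
    levels, dose = [0.75], 0.75
    while dose > 0.45:
        dose = round(dose - 0.1, 2)   # round to avoid float drift
        levels.append(dose)
    while dose > 0.25:
        dose = round(dose - 0.05, 2)
        levels.append(dose)
    return levels

# 0.75 -> 0.65 -> 0.55 -> 0.45 -> 0.40 -> 0.35 -> 0.30 -> 0.25,
# i.e., seven 10-day steps (~70 days) from MMS down to 0.25 mg/kg/day.
```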

Reductions in prednisone dose could be accelerated in the case of severe prednisone adverse effects, according to the prescriber’s decision.

In the event of a myasthenia gravis exacerbation, the patient was hospitalized and the dose of prednisone was routinely doubled, or for a more moderate aggravation, the dose was increased to the previous dose recommended in the tapering regimen.

Azathioprine, up to a maximum dose of 3 mg/kg per day, was prescribed for all participants. In all, 117 patients were randomly assigned, and 113 completed the study.

The primary outcome was the proportion of participants having reached MMS without prednisone at 12 months and having not relapsed or taken prednisone between months 12 and 15. This was achieved by significantly more patients in the rapid-tapering group (39% vs. 9%; risk ratio, 3.61; P < .001).

Rapid tapering allowed sparing of a mean of 1,898 mg of prednisone over 1 year (5.3 mg/day) per patient.

The rate of myasthenia gravis exacerbation or worsening did not differ significantly between the two groups, nor did the use of plasmapheresis or IVIG or the doses of azathioprine.

The overall number of serious adverse events did not differ significantly between the two groups (slow tapering, 22% vs. rapid tapering, 36%; P = .15).

The researchers said it is possible that prednisone tapering would differ with another immunosuppressive agent but as azathioprine is the first-line immunosuppressant usually recommended, these results are relevant for a large proportion of patients.

They said the better outcome of the intervention group could have been related to one or more of four differences in prednisone administration: An immediate high dose versus a slow increase of the prednisone dose; daily versus alternate-day dosing; earlier tapering initiation; and faster tapering. However, the structure of the study did not allow identification of which of these factors was responsible.

“Researching the best prednisone-tapering scheme is not only a major issue for patients with myasthenia gravis but also for other autoimmune or inflammatory diseases, because validated prednisone-tapering regimens are scarce,” the authors said.

The rapid tapering of prednisone therapy appears to be feasible, beneficial, and safe in patients with generalized myasthenia gravis and “warrants testing in other autoimmune diseases,” they added.
 

 

 

Particularly relevant to late-onset disease

Commenting on the study, Raffi Topakian, MD, Klinikum Wels-Grieskirchen, Wels, Austria, said the results showed that in patients with moderate to severe generalized myasthenia gravis requiring high-dose prednisone, azathioprine, a widely used immunosuppressant, may have a quicker steroid-sparing effect than previously thought, and that rapid steroid tapering can be achieved safely, resulting in a reduction of the cumulative steroid dose over a year despite higher initial doses.

Dr. Topakian, who was not involved with the research, pointed out that the median age was advanced (around 56 years), and the benefit of a regimen that leads to a reduction of the cumulative steroid dose over a year may be disproportionately larger for older, sicker patients with many comorbidities who are at considerably higher risk for a prednisone-induced increase in cardiovascular complications, osteoporotic fractures, and gastrointestinal bleeding. 

“The study findings are particularly relevant for the management of late-onset myasthenia gravis (when first symptoms start after age 45-50 years), which is being encountered more frequently over the past years,” he said.

“But the holy grail of myasthenia gravis treatment has not been found yet,” Dr. Topakian noted. “Disappointingly, rapid tapering of steroids (compared to slow tapering) resulted in a reduction of the cumulative steroid dose only, but was not associated with better myasthenia gravis functional status or lower doses of steroids at 15 months. To my view, this finding points to the limited immunosuppressive efficacy of azathioprine.”

He added that the study findings should not be extrapolated to patients with mild presentations or to those with muscle-specific kinase myasthenia gravis.

Dr. Sharshar disclosed no relevant financial relationships. Disclosures for the study coauthors appear in the original article.
 

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews- 29(4)
Publications
Topics
Sections

 

The findings of a new randomized trial support the rapid tapering of prednisone in patients with generalized myasthenia gravis requiring combined corticosteroid and azathioprine therapy. The trial showed that the conventional slow tapering regimen enabled discontinuation of prednisone earlier than previously reported but the new rapid-tapering regimen enabled an even faster discontinuation.

Noting that although both regimens led to a comparable myasthenia gravis status and prednisone dose at 15 months, the authors stated: “We think that the reduction of the cumulative dose over a year (equivalent to 5 mg/day) is a clinically relevant reduction, since the risk of complications is proportional to the daily or cumulative doses of prednisone.

“Our results warrant testing of a more rapid-tapering regimen in a future trial. In the meantime, our trial provides useful information on how prednisone tapering could be managed in patients with generalized myasthenia gravis treated with azathioprine,” they concluded.

The trial was published online Feb. 8 in JAMA Neurology.

Myasthenia gravis is a disorder of neuromuscular transmission, resulting from autoantibodies to components of the neuromuscular junction, most commonly the acetylcholine receptor. The incidence ranges from 0.3 to 2.8 per 100,000, and it is estimated to affect more than 700,000 people worldwide.

The authors of the current paper, led by Tarek Sharshar, MD, PhD, Groupe Hospitalier Universitaire, Paris, explained that many patients whose symptoms are not controlled by cholinesterase inhibitors are treated with corticosteroids and an immunosuppressant, usually azathioprine. No specific dosing protocol for prednisone has been validated, but it is commonly gradually increased to 0.75 mg/kg on alternate days and reduced progressively when minimal manifestation status (MMS; no symptoms or functional limitations) is reached.  

They noted that this regimen leads to high and prolonged corticosteroid treatment – often for several years – with the mean daily prednisone dose exceeding 30 mg/day at 15 months and 20 mg/day at 36 months. As long-term use of corticosteroids is often associated with significant complications, reducing or even discontinuing prednisone treatment without destabilizing myasthenia gravis is therefore a therapeutic goal.


 

Evaluating dosage regimens

To investigate whether different dosage regimens could help wean patients with generalized myasthenia gravis from corticosteroid therapy without compromising efficacy, the researchers conducted this study in which the current recommended regimen was compared with an approach using higher initial corticosteroid doses followed by rapid tapering.

In the conventional slow-tapering group (control group), prednisone was given on alternate days, starting at a dose of 10 mg then increased by increments of 10 mg every 2 days up to 1.5 mg/kg on alternate days without exceeding 100 mg. This dose was maintained until MMS was reached and then reduced by 10 mg every 2 weeks until a dosage of 40 mg was reached, with subsequent slowing of the taper to 5 mg monthly. If MMS was not maintained, the alternate-day prednisone dose was increased by 10 mg every 2 weeks until MMS was restored, and the tapering resumed 4 weeks later.
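
The escalation arithmetic of this slow-tapering arm can be sketched in a few lines of Python. This is an illustration only, not code from the trial; the function name and default parameters are assumptions chosen to mirror the published description:

```python
# Illustrative sketch of the slow-tapering escalation phase described above.
# Not trial code; the function name and parameter defaults are assumptions.
def slow_taper_escalation(weight_kg, start_mg=10, step_mg=10, cap_mg=100):
    """Return successive alternate-day prednisone doses (mg) during escalation.

    The dose rises by step_mg every 2 days from start_mg until it reaches
    1.5 mg/kg on alternate days, without exceeding cap_mg.
    """
    target = min(1.5 * weight_kg, cap_mg)
    doses = []
    dose = start_mg
    while dose < target:
        doses.append(dose)
        dose += step_mg
    doses.append(min(dose, target))
    return doses

# For a 70-kg patient, 1.5 mg/kg = 105 mg, so the 100-mg cap applies:
print(slow_taper_escalation(70))
# [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
```

Nine 10-mg increments at 2-day intervals put this hypothetical 70-kg patient at the 100-mg plateau after about 18 days, where the dose was then held until MMS was reached.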

In the new rapid-tapering group, oral prednisone was immediately started at 0.75 mg/kg per day, followed by an earlier and rapid decrease once improved myasthenia gravis status was attained. Three different tapering schedules were applied, depending on the improvement status of the patient.

First, if the patient reached MMS at 1 month, the dose of prednisone was reduced by 0.1 mg/kg every 10 days down to 0.45 mg/kg per day, then by 0.05 mg/kg every 10 days down to 0.25 mg/kg per day, and then in decrements of 1 mg, with the duration of each decrement adjusted according to the participant’s weight, the aim being complete cessation of corticosteroid therapy within 18-20 weeks of this third stage of tapering.

Second, if MMS was not reached at 1 month but the participant had improved, a slower tapering was conducted, with the dosage reduced in a similar way to the first schedule but with each reduction introduced every 20 days. If the participant reached MMS during this tapering process, prednisone tapering switched to the sequence described for the first schedule.

Third, if MMS was not reached and the participant had not improved, the initial dose was maintained for the first 3 months; beyond that time, the prednisone dose was decreased as in the second schedule to a minimum of 0.25 mg/kg per day, after which it was not reduced further. If the patient improved, prednisone tapering followed the sequence described for the second schedule.
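
The per-kilogram dose sequence of the first (fastest) schedule can likewise be sketched. This is a minimal illustration of the dose steps only (the 10-day step timing and the final weight-adjusted 1-mg decrements are omitted), and the function name is an assumption:

```python
# Illustrative sketch of the rapid-tapering dose sequence when MMS is
# reached at 1 month. Not trial code. Doses are tracked in hundredths of
# mg/kg to avoid floating-point drift.
def rapid_taper_mg_per_kg():
    dose = 75          # start at 0.75 mg/kg per day
    doses = [dose]
    while dose > 45:   # reduce by 0.1 mg/kg per step down to 0.45 mg/kg
        dose -= 10
        doses.append(dose)
    while dose > 25:   # then by 0.05 mg/kg per step down to 0.25 mg/kg
        dose -= 5
        doses.append(dose)
    return [d / 100 for d in doses]

print(rapid_taper_mg_per_kg())
# [0.75, 0.65, 0.55, 0.45, 0.4, 0.35, 0.3, 0.25]
```

With each step lasting about 10 days in the trial, reaching 0.25 mg/kg per day takes roughly 70 days, after which the weight-adjusted 1-mg decrements aim for complete cessation within 18-20 weeks.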

Reductions in prednisone dose could be accelerated in the case of severe prednisone adverse effects, according to the prescriber’s decision.

In the event of a myasthenia gravis exacerbation, the patient was hospitalized and the dose of prednisone was routinely doubled, or for a more moderate aggravation, the dose was increased to the previous dose recommended in the tapering regimen.

Azathioprine, up to a maximum dose of 3 mg/kg per day, was prescribed for all participants. In all, 117 patients were randomly assigned, and 113 completed the study.

The primary outcome was the proportion of participants having reached MMS without prednisone at 12 months and having not relapsed or taken prednisone between months 12 and 15. This was achieved by significantly more patients in the rapid-tapering group (39% vs. 9%; risk ratio, 3.61; P < .001).

Rapid tapering allowed sparing of a mean of 1,898 mg of prednisone over 1 year (5.3 mg/day) per patient.

The rate of myasthenia gravis exacerbation or worsening did not differ significantly between the two groups, nor did the use of plasmapheresis or IVIG or the doses of azathioprine.

The overall number of serious adverse events did not differ significantly between the two groups (slow tapering, 22% vs. rapid tapering, 36%; P = .15).

The researchers said it is possible that prednisone tapering would differ with another immunosuppressive agent, but as azathioprine is the first-line immunosuppressant usually recommended, these results are relevant for a large proportion of patients.

They said the better outcome of the intervention group could have been related to one or more of four differences in prednisone administration: An immediate high dose versus a slow increase of the prednisone dose; daily versus alternate-day dosing; earlier tapering initiation; and faster tapering. However, the structure of the study did not allow identification of which of these factors was responsible.

“Researching the best prednisone-tapering scheme is not only a major issue for patients with myasthenia gravis but also for other autoimmune or inflammatory diseases, because validated prednisone-tapering regimens are scarce,” the authors said.

The rapid tapering of prednisone therapy appears to be feasible, beneficial, and safe in patients with generalized myasthenia gravis and “warrants testing in other autoimmune diseases,” they added.

Particularly relevant to late-onset disease

Commenting on the study, Raffi Topakian, MD, Klinikum Wels-Grieskirchen, Wels, Austria, said the results showed that in patients with moderate to severe generalized myasthenia gravis requiring high-dose prednisone, azathioprine, a widely used immunosuppressant, may have a quicker steroid-sparing effect than previously thought, and that rapid steroid tapering can be achieved safely, resulting in a reduction of the cumulative steroid dose over a year despite higher initial doses.

Dr. Topakian, who was not involved with the research, pointed out that the median age was advanced (around 56 years), and the benefit of a regimen that leads to a reduction of the cumulative steroid dose over a year may be disproportionately larger for older, sicker patients with many comorbidities who are at considerably higher risk for a prednisone-induced increase in cardiovascular complications, osteoporotic fractures, and gastrointestinal bleeding. 

“The study findings are particularly relevant for the management of late-onset myasthenia gravis (when first symptoms start after age 45-50 years), which has been encountered more frequently in recent years,” he said.

“But the holy grail of myasthenia gravis treatment has not been found yet,” Dr. Topakian noted. “Disappointingly, rapid tapering of steroids (compared to slow tapering) resulted in a reduction of the cumulative steroid dose only, but was not associated with better myasthenia gravis functional status or lower doses of steroids at 15 months. In my view, this finding points to the limited immunosuppressive efficacy of azathioprine.”

He added that the study findings should not be extrapolated to patients with mild presentations or to those with muscle-specific kinase myasthenia gravis.

Dr. Sharshar disclosed no relevant financial relationships. Disclosures for the study coauthors appear in the original article.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews - 29(4)

Article Source

FROM JAMA NEUROLOGY

Publish date: February 19, 2021