Remote electrical neuromodulation device helps reduce migraine days
A remote electrical neuromodulation (REN) device reduced the number of migraine days by a mean of 4.0 days per month, according to recent research published in the journal Headache. The prospective, randomized, double-blind, placebo-controlled, multicenter trial evaluated REN with Nerivio (Theranica Bio-Electronics Ltd.; Bridgewater, N.J.), reported Stewart J. Tepper, MD, of the Geisel School of Medicine at Dartmouth in Hanover, N.H., and colleagues.*
“The statistically significant results were maintained in separate subanalyses of the chronic and episodic subsamples, as well as in the separate subanalyses of participants who used and did not use migraine prophylaxis,” Dr. Tepper and colleagues wrote.
A nonpharmacological alternative
Researchers randomized 248 participants into active and placebo groups, with 95 participants in the active group and 84 participants in the placebo group meeting the criteria for a modified intention-to-treat (mITT) analysis. Most of the participants in the mITT dataset were women (85.9%), with an average age of 41.7 years and a baseline average of 12.2 migraine days and 15.6 headache days per month. Overall, 52.4% of participants in the mITT dataset had chronic migraine, 25.0% had migraine with aura, and 41.1% were taking preventative medication.
Dr. Tepper and colleagues observed participants during a 4-week baseline period, followed by 8 weeks in which participants used either the REN device every other day for 45 minutes or a placebo device that "produces electrical pulses of the same maximum intensity (34 mA) and overall energy, but with different pulse durations and much lower frequencies compared with the active device." Participants recorded their symptoms in a daily diary.
Researchers assessed the mean change in number of migraine days per month as a primary outcome, and evaluated participants who experienced episodic and chronic migraines separately in subgroup analyses. Secondary outcome measures included mean change in number of moderate or severe headache days, 50% reduction in mean number of headache days compared with baseline, Headache Impact Test short form (HIT-6) and Migraine Specific Quality of Life Questionnaire (MSQ) Role Function Domain total score mean change at 12 weeks compared with week 1, and reduction in mean number of days taking acute headache or migraine medication.
Participants receiving REN treatment had a significant reduction in mean migraine days per month compared with the placebo group (4.0 days vs. 1.3 days; 95% confidence interval, –3.9 days to –1.5 days; P < .001). In subgroup analyses, a significant reduction in migraine days was seen in participants receiving REN treatment with episodic migraine (3.2 days vs. 1.0 days; P = .003) and chronic migraine (4.7 days vs. 1.6 days; P = .001) compared with placebo.
Dr. Tepper and colleagues found a significant reduction in moderate and/or severe headache days among participants receiving REN treatment compared with placebo (3.8 days vs. 2.2 days; P = .005), a significant reduction in headache days overall compared with placebo (4.5 days vs. 1.8 days; P < .001), a significant percentage of patients who experienced 50% reduction in moderate and/or severe headache days compared with placebo (51.6% vs. 35.7%; P = .033), and a significant reduction in acute medication days compared with placebo (3.5 days vs. 1.4 days; P = .001). Dr. Tepper and colleagues found no serious device-related adverse events in either group.
The researchers noted that REN therapy is a “much-needed nonpharmacological alternative” to other preventive and acute treatments for migraine. “Given the previously well-established clinical efficacy and high safety profile in acute treatment of migraine, REN can cover the entire treatment spectrum of migraine, including both acute and preventive treatments,” they said.
‘A good place to start’
Commenting on the study, Alan M. Rapoport, MD, clinical professor of neurology at University of California, Los Angeles; past president of the International Headache Society; and editor-in-chief of Neurology Reviews, said the study was well designed, but cited the 8-week follow-up period as one area where he would have wanted to see more data.
As a medical device cleared by the Food and Drug Administration for acute treatment of migraine, the REN device also appears to be effective as a migraine preventative based on the results of the study, with "virtually no adverse events," he noted.
"I think this is a great treatment. I think it's a good place to start," Dr. Rapoport said. Given the low adverse event rate, he said he would be willing to offer the device to patients as a first option for preventing migraine, then either switch to another preventative option or add a medication in combination depending on how the patient responds. However, he noted that the device is not currently covered by insurance.
Now that a REN device has been shown to work in the acute setting and as a preventative, Dr. Rapoport said he is interested in seeing other devices that have been cleared by the FDA as migraine treatments evaluated in migraine prevention. “I think we need more patients tried on the devices so we get an idea of which ones work acutely, which ones work preventively,” he said.
The authors reported personal and institutional relationships in the form of advisory board positions, consultancies, grants, research principal investigator roles, royalties, speakers bureau positions, and stockholders for a variety of pharmaceutical companies, agencies, and other organizations. Several authors disclosed ties with Theranica, the manufacturer of the REN device used in the study. Dr. Rapoport is editor-in-chief of Neurology Reviews and a consultant for Theranica, but was not involved in studies associated with the REN device.
Correction, 2/10/23: An earlier version of this article misstated the reduction in number of migraine days.
FROM HEADACHE
Cognitive testing for older drivers: Is there a benefit?
Cognitive screening of older drivers was followed by a decrease in motor vehicle collisions, according to results from a large population-based study using data from Japan.
But the same study, published in the Journal of the American Geriatrics Society, also reported a concurrent increase in pedestrian and cycling injuries, possibly because more older former drivers were getting around by alternative means. That finding echoed a 2012 study from Denmark, which also looked at the effects of an age-based cognitive screening policy for older drivers, and saw more fatal road injuries among older people who were not driving.
While some governments, including those of Denmark, Taiwan, and Japan, have implemented age-based cognitive screening for older drivers, there has been little evidence to date that such policies improve road safety. Guidelines issued in 2010 by the American Academy of Neurology discourage age-based screening, advising instead that people diagnosed with cognitive disorders be carefully evaluated for driving fitness and recommending one widely used scale, the Clinical Dementia Rating, as useful in identifying potentially unsafe drivers.
Japan’s national screening policy: Did it work?
The new study, led by Haruhiko Inada, MD, PhD, an epidemiologist at Johns Hopkins University in Baltimore, used national crash data from Japan, where since 2017 all drivers 75 and older not only must take cognitive tests measuring temporal orientation and memory at license renewal, but are also referred for medical evaluation if they fail them. People receiving a subsequent dementia diagnosis can have their licenses suspended or revoked.
Dr. Inada and his colleagues looked at national data from nearly 603,000 police-reported vehicle collisions and nearly 197,000 pedestrian or cyclist road injuries between March 2012 and December 2019, all involving people aged 70 and older. To assess the screening policy’s impact, the researchers calculated estimated monthly collision or injury incidence rates per 100,000 person-years. This way, they could “control for secular trends that were unaffected by the policy, such as the decreasing incidence of motor vehicle collisions year by year,” the researchers explained.
After the screening was implemented, cumulative estimated collisions among drivers 75 or older decreased by 3,670 (95% confidence interval, 2,104-5,125), while reported pedestrian or cyclist injuries increased by an estimated 959 (95% CI, 24-1,834). Dr. Inada and colleagues found that crashes declined among men but not women, noting also that more older men than women are licensed to drive in Japan. Pedestrian and cyclist injuries were highest among men aged 80-84 and women aged 80 and older.
“Cognitively screening older drivers at license renewal and promoting voluntary surrender of licenses may prevent motor vehicle collisions,” Dr. Inada and his colleagues concluded. “However, they are associated with an increase in road injuries for older pedestrians and cyclists. Future studies should examine the effectiveness of mitigation measures, such as alternative, safe transportation, and accommodations for pedestrians and cyclists.”
No definitive answers
Two investigators who have studied cognitive screening related to road safety were contacted for commentary on the study findings.
Anu Siren, PhD, professor of gerontology at Tampere (Finland) University, who in 2012 reported higher injuries after implementation of older-driver cognitive screening in Denmark, commented that the new study, while benefiting from a much larger data set than earlier studies, still “fails to show that decrease in collisions is because ‘unfit’ drivers were removed from the road. But it does confirm previous findings about how strict screening policies make people shift from cars to unprotected modes of transportation,” which are riskier.
In studies measuring driving safety, the usual definition of risk is incidents per exposure, Dr. Siren noted. In Dr. Inada and colleagues’ study, “the incident measure, or numerator, is the number of collisions. The exposure measure or denominator is population. Because the study uses population and not driver licenses (or distance traveled) as an exposure measure, the observed decrease in collisions does not say much about how the collision risk develops after the implementation of screening.”
Older driver screening “is likely to cause some older persons to cease from driving and probably continue to travel as unprotected road users,” Dr. Siren continued. “Similar to what we found [in 2012], the injury rates for pedestrians and cyclists went up after the introduction of screening, which suggests that screening indirectly causes increasing number of injuries among older unprotected road users.”
Matthew Rizzo, MD, professor and chair of the department of neurological sciences at the University of Nebraska Medical Center and codirector of the Nebraska Neuroscience Alliance in Omaha, Neb., and the lead author of the 2010 AAN guidelines on cognitive impairment and driving risk, cautioned against ageism in designing policies meant to protect motorists.
“We find some erratic/weak effects of age here and there, but the big effects we consistently find are from cognitive and visual decline – which is somewhat correlated with age, but with huge variance,” Dr. Rizzo said. “It is hard to say what an optimal age threshold for risk would be, and if 75 is it.”
U.S. crash data from the last decade points to drivers 80 and older as significantly more accident-prone than those in their 70s, or even late 70s, Dr. Rizzo noted. Moreover, “willingness to get on the road, number of miles driven, type of road (urban, rural, highway, commercial, residential), type of vehicle driven, traffic, and environment (day, night, weather), et cetera, are all factors to consider in driving risk and restriction,” he said.
Dr. Rizzo added that the 2010 AAN guidelines might need to be revisited in light of newer vehicle safety systems and automation.
Dr. Inada and colleagues’ study was funded by Japanese government grants, and Dr. Inada and his coauthors reported no financial conflicts of interest. Dr. Siren and Dr. Rizzo reported no financial conflicts of interest.
, according to results from a large population-based study using data from Japan.
But the same study, published in the Journal of the American Geriatrics Society, also reported a concurrent increase in pedestrian and cycling injuries, possibly because more older former drivers were getting around by alternative means. That finding echoed a 2012 study from Denmark, which also looked at the effects of an age-based cognitive screening policy for older drivers, and saw more fatal road injuries among older people who were not driving.
While some governments, including those of Denmark, Taiwan, and Japan, have implemented age-based cognitive screening for older drivers, there has been little evidence to date that such policies improve road safety. Guidelines issued in 2010 by the American Academy of Neurology discourage age-based screening, advising instead that people diagnosed with cognitive disorders be carefully evaluated for driving fitness and recommending one widely used scale, the Clinical Dementia Rating, as useful in identifying potentially unsafe drivers.
Japan’s national screening policy: Did it work?
The new study, led by Haruhiko Inada, MD, PhD, an epidemiologist at Johns Hopkins University in Baltimore, used national crash data from Japan, where since 2017 all drivers 75 and older not only must take cognitive tests measuring temporal orientation and memory at license renewal, but are also referred for medical evaluation if they fail them. People receiving a subsequent dementia diagnosis can have their licenses suspended or revoked.
Dr. Inada and his colleagues looked at national data from nearly 603,000 police-reported vehicle collisions and nearly 197,000 pedestrian or cyclist road injuries between March 2012 and December 2019, all involving people aged 70 and older. To assess the screening policy’s impact, the researchers calculated estimated monthly collision or injury incidence rates per 100,000 person-years. This way, they could “control for secular trends that were unaffected by the policy, such as the decreasing incidence of motor vehicle collisions year by year,” the researchers explained.
After the screening was implemented, cumulative estimated collisions among drivers 75 or older decreased by 3,670 (95% confidence interval, 5,125-2,104), while reported pedestrian or cyclist injuries increased by an estimated 959 (95% CI, 24-1,834). Dr. Inada and colleagues found that crashes declined among men but not women, noting also that more older men than women are licensed to drive in Japan. Pedestrian and cyclist injuries were highest among men aged 80-84, and women aged 80 and older.
“Cognitively screening older drivers at license renewal and promoting voluntary surrender of licenses may prevent motor vehicle collisions,” Dr. Inada and his colleagues concluded. “However, they are associated with an increase in road injuries for older pedestrians and cyclists. Future studies should examine the effectiveness of mitigation measures, such as alternative, safe transportation, and accommodations for pedestrians and cyclists.”
No definitive answers
Two investigators who have studied cognitive screening related to road safety were contacted for commentary on the study findings.
Anu Siren, PhD, professor of gerontology at Tampere (Finland) University, who in 2012 reported higher injuries after implementation of older-driver cognitive screening in Denmark, commented that the new study, while benefiting from a much larger data set than earlier studies, still “fails to show that decrease in collisions is because ‘unfit’ drivers were removed from the road. But it does confirm previous findings about how strict screening policies make people shift from cars to unprotected modes of transportation,” which are riskier.
In studies measuring driving safety, the usual definition of risk is incidents per exposure, Dr. Siren noted. In Dr. Inada and colleagues’ study, “the incident measure, or numerator, is the number of collisions. The exposure measure or denominator is population. Because the study uses population and not driver licenses (or distance traveled) as an exposure measure, the observed decrease in collisions does not say much about how the collision risk develops after the implementation of screening.”
Older driver screening “is likely to cause some older persons to cease from driving and probably continue to travel as unprotected road users,” Dr. Siren continued. “Similar to what we found [in 2012], the injury rates for pedestrians and cyclists went up after the introduction of screening, which suggests that screening indirectly causes increasing number of injuries among older unprotected road users.”
Matthew Rizzo, MD, professor and chair of the department of neurological sciences at the University of Nebraska Medical Center and codirector of the Nebraska Neuroscience Alliance in Omaha, Neb., and the lead author of the 2010 AAN guidelines on cognitive impairment and driving risk, cautioned against ageism in designing policies meant to protect motorists.
“We find some erratic/weak effects of age here and there, but the big effects we consistently find are from cognitive and visual decline – which is somewhat correlated with age, but with huge variance,” Dr. Rizzo said. “It is hard to say what an optimal age threshold for risk would be, and if 75 is it.”
U.S. crash data from the last decade points to drivers 80 and older as significantly more accident-prone than those in their 70s, or even late 70s, Dr. Rizzo noted. Moreover, “willingness to get on the road, number of miles driven, type of road (urban, rural, highway, commercial, residential), type of vehicle driven, traffic, and environment (day, night, weather), et cetera, are all factors to consider in driving risk and restriction,” he said.
Dr. Rizzo added that the 2010 AAN guidelines might need to be revisited in light of newer vehicle safety systems and automation.
Dr. Inada and colleagues’ study was funded by Japanese government grants, and Dr. Inada and his coauthors reported no financial conflicts of interest. Dr. Siren and Dr. Rizzo reported no financial conflicts of interest.
, according to results from a large population-based study using data from Japan.
But the same study, published in the Journal of the American Geriatrics Society, also reported a concurrent increase in pedestrian and cycling injuries, possibly because more older former drivers were getting around by alternative means. That finding echoed a 2012 study from Denmark, which also looked at the effects of an age-based cognitive screening policy for older drivers, and saw more fatal road injuries among older people who were not driving.
While some governments, including those of Denmark, Taiwan, and Japan, have implemented age-based cognitive screening for older drivers, there has been little evidence to date that such policies improve road safety. Guidelines issued in 2010 by the American Academy of Neurology discourage age-based screening, advising instead that people diagnosed with cognitive disorders be carefully evaluated for driving fitness and recommending one widely used scale, the Clinical Dementia Rating, as useful in identifying potentially unsafe drivers.
Japan’s national screening policy: Did it work?
The new study, led by Haruhiko Inada, MD, PhD, an epidemiologist at Johns Hopkins University in Baltimore, used national crash data from Japan, where since 2017 all drivers 75 and older not only must take cognitive tests measuring temporal orientation and memory at license renewal, but are also referred for medical evaluation if they fail them. People receiving a subsequent dementia diagnosis can have their licenses suspended or revoked.
Dr. Inada and his colleagues looked at national data from nearly 603,000 police-reported vehicle collisions and nearly 197,000 pedestrian or cyclist road injuries between March 2012 and December 2019, all involving people aged 70 and older. To assess the screening policy’s impact, the researchers calculated estimated monthly collision or injury incidence rates per 100,000 person-years. This way, they could “control for secular trends that were unaffected by the policy, such as the decreasing incidence of motor vehicle collisions year by year,” the researchers explained.
After the screening was implemented, cumulative estimated collisions among drivers 75 or older decreased by 3,670 (95% confidence interval, 2,104-5,125), while reported pedestrian or cyclist injuries increased by an estimated 959 (95% CI, 24-1,834). Dr. Inada and colleagues found that crashes declined among men but not women, noting also that more older men than women are licensed to drive in Japan. Pedestrian and cyclist injuries were highest among men aged 80-84, and women aged 80 and older.
“Cognitively screening older drivers at license renewal and promoting voluntary surrender of licenses may prevent motor vehicle collisions,” Dr. Inada and his colleagues concluded. “However, they are associated with an increase in road injuries for older pedestrians and cyclists. Future studies should examine the effectiveness of mitigation measures, such as alternative, safe transportation, and accommodations for pedestrians and cyclists.”
No definitive answers
Two investigators who have studied cognitive screening related to road safety were contacted for commentary on the study findings.
Anu Siren, PhD, professor of gerontology at Tampere (Finland) University, who in 2012 reported increased injuries after Denmark implemented cognitive screening of older drivers, commented that the new study, while benefiting from a much larger data set than earlier studies, still “fails to show that decrease in collisions is because ‘unfit’ drivers were removed from the road. But it does confirm previous findings about how strict screening policies make people shift from cars to unprotected modes of transportation,” which are riskier.
In studies measuring driving safety, the usual definition of risk is incidents per exposure, Dr. Siren noted. In Dr. Inada and colleagues’ study, “the incident measure, or numerator, is the number of collisions. The exposure measure or denominator is population. Because the study uses population and not driver licenses (or distance traveled) as an exposure measure, the observed decrease in collisions does not say much about how the collision risk develops after the implementation of screening.”
Older driver screening “is likely to cause some older persons to cease from driving and probably continue to travel as unprotected road users,” Dr. Siren continued. “Similar to what we found [in 2012], the injury rates for pedestrians and cyclists went up after the introduction of screening, which suggests that screening indirectly causes increasing number of injuries among older unprotected road users.”
Matthew Rizzo, MD, professor and chair of the department of neurological sciences at the University of Nebraska Medical Center and codirector of the Nebraska Neuroscience Alliance in Omaha, Neb., and the lead author of the 2010 AAN guidelines on cognitive impairment and driving risk, cautioned against ageism in designing policies meant to protect motorists.
“We find some erratic/weak effects of age here and there, but the big effects we consistently find are from cognitive and visual decline – which is somewhat correlated with age, but with huge variance,” Dr. Rizzo said. “It is hard to say what an optimal age threshold for risk would be, and if 75 is it.”
U.S. crash data from the last decade points to drivers 80 and older as significantly more accident-prone than those in their 70s, or even late 70s, Dr. Rizzo noted. Moreover, “willingness to get on the road, number of miles driven, type of road (urban, rural, highway, commercial, residential), type of vehicle driven, traffic, and environment (day, night, weather), et cetera, are all factors to consider in driving risk and restriction,” he said.
Dr. Rizzo added that the 2010 AAN guidelines might need to be revisited in light of newer vehicle safety systems and automation.
Dr. Inada and colleagues’ study was funded by Japanese government grants, and Dr. Inada and his coauthors reported no financial conflicts of interest. Dr. Siren and Dr. Rizzo reported no financial conflicts of interest.
FROM THE JOURNAL OF THE AMERICAN GERIATRICS SOCIETY
Can a ‘smart’ skin patch detect early neurodegenerative diseases?
A new “smart patch” composed of microneedles that can detect proinflammatory markers via simulated skin interstitial fluid (ISF) may help diagnose neurodegenerative disorders such as Alzheimer’s disease and Parkinson’s disease very early on.
Originally developed to deliver medications and vaccines via the skin in a minimally invasive manner, the microneedle arrays were fitted with molecular sensors that, when placed on the skin, detect neuroinflammatory biomarkers such as interleukin-6 in as little as 6 minutes.
The literature suggests that these biomarkers of neurodegenerative disease are present years before patients become symptomatic, said study investigator Sanjiv Sharma, PhD.
“Neurodegenerative disorders such as Parkinson’s disease and Alzheimer’s disease are [characterized by] progressive loss in nerve cell and brain cells, which leads to memory problems and a loss of mental ability. That is why early diagnosis is key to preventing the loss of brain tissue in dementia, which can go undetected for years,” added Dr. Sharma, who is a lecturer in medical engineering at Swansea (Wales) University.
Dr. Sharma developed the patch with scientists at the Polytechnic of Porto School of Engineering in Portugal. In 2022, the team designed a microneedle patch to deliver a COVID vaccine, which is currently in testing.
The investigators describe their research on the patch’s ability to detect IL-6 in an article published in ACS Omega.
At-home diagnosis?
“The skin is the largest organ in the body – it contains more skin interstitial fluid than the total blood volume,” Dr. Sharma noted. “This fluid is an ultrafiltrate of blood and holds biomarkers that complement other biofluids, such as sweat, saliva, and urine. It can be sampled in a minimally invasive manner and used either for point-of-care testing or real-time using microneedle devices.”
Dr. Sharma and associates tested the microneedle patch in artificial ISF that contained the inflammatory cytokine IL-6. They found that the patch accurately detected IL-6 concentrations as low as 1 pg/mL in the fabricated ISF solution.
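Detection limits like the 1 pg/mL figure are commonly estimated from a calibration curve using the "3.3 sigma over slope" rule from the ICH Q2 validation guideline. The study does not state that this method was used; the sketch below, with invented calibration data, only illustrates how such a limit is typically derived:

```python
import statistics

# Hypothetical calibration data: IL-6 concentration (pg/mL) vs. sensor signal
conc   = [0.0, 1.0, 2.0, 5.0, 10.0]
signal = [0.03, 0.12, 0.25, 0.56, 1.13]

# Ordinary least-squares fit of signal on concentration
n = len(conc)
mean_x, mean_y = statistics.mean(conc), statistics.mean(signal)
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, signal))
         / sum((x - mean_x) ** 2 for x in conc))
intercept = mean_y - slope * mean_x

# Residual standard deviation of the fit (n - 2 degrees of freedom)
residuals = [y - (slope * x + intercept) for x, y in zip(conc, signal)]
s = (sum(r * r for r in residuals) / (n - 2)) ** 0.5

lod = 3.3 * s / slope  # limit of detection, pg/mL
print(round(lod, 2))   # ≈ 0.37 for this made-up data
```

The lower the residual noise relative to the calibration slope, the lower the concentration the sensor can reliably distinguish from blank readings.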
“In general, the transdermal sensor presented here showed simplicity in designing, short measuring time, high accuracy, and low detection limit. This approach seems a successful tool for the screening of inflammatory biomarkers in point of care testing wherein the skin acts as a window to the body,” the investigators reported.
Dr. Sharma noted that early detection of neurodegenerative diseases is crucial, as once symptoms appear, the disease may have already progressed significantly, and meaningful intervention is challenging.
The device has yet to be tested in humans, which is the next step, said Dr. Sharma.
“We will have to test the hypothesis through extensive preclinical and clinical studies to determine if bloodless, transdermal (skin) diagnostics can offer a cost-effective device that could allow testing in simpler settings such as a clinician’s practice or even home settings,” he noted.
Early days
Commenting on the research, David K. Simon, MD, PhD, professor of neurology at Harvard Medical School, Boston, said it is “a promising step regarding validation of a potentially beneficial method for rapidly and accurately measuring IL-6.”
However, he added, “many additional steps are needed to validate the method in actual human skin and to determine whether or not measuring these biomarkers in skin will be useful in studies of neurodegenerative diseases.”
He noted that one study limitation is that inflammatory cytokines such as IL-6 are highly nonspecific, and levels are elevated in various diseases associated with inflammation.
“It is highly unlikely that measuring IL-6 will be useful as a diagnostic tool. However, it does have potential as a biomarker for measuring the impact of treatments aimed at reducing inflammation. As the authors point out, it’s more likely that clinicians will require a panel of biomarkers rather than only measuring IL-6,” he said.
The study was funded by Fundação para a Ciência e Tecnologia. The investigators disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM ACS OMEGA
Biosimilar equal to natalizumab for relapsing remitting MS
An agent biologically similar to the humanized monoclonal antibody natalizumab is as effective and safe as the original reference drug for relapsing remitting multiple sclerosis (RRMS) – and has a similar level of immunogenicity, new research shows.
The investigators noted that these phase 3 trial findings are the final stage in the regulatory approval process.
“There will be a biosimilar that with respect to all parameters – efficacy, side effects, immunogenicity – doesn’t differ from the original drug and will probably be an option to consider to reduce treatment costs in MS,” said lead investigator Bernhard Hemmer, MD, a professor in the department of neurology, Technical University of Munich (Germany).
The findings were published online in JAMA Neurology.
Potential cost savings
Disease-modifying therapies (DMTs), particularly targeted biologics, have revolutionized the treatment of MS, including RRMS. Natalizumab, which was the first targeted biologic therapy approved for RRMS, is very effective and widely used, Dr. Hemmer said.
However, this and other DMTs are costly. Biosimilars, which are medicines clinically similar to an already marketed reference biologic medicine, can address this issue. In the areas of rheumatology and oncology, biosimilars have already demonstrated significant cost savings and improved treatment access.
The biosimilar natalizumab (biosim-NTZ), developed by Polpharma Biologics, is the first biosimilar monoclonal antibody therapy to be developed for MS.
Health authorities such as the Food and Drug Administration require comparative phase 3 studies to confirm there are no clinically relevant differences between a proposed biosimilar and its reference medicine.
The new multicenter, phase 3, double-blind, randomized trial – known as Antelope – included 264 adult patients with RRMS at 48 centers in seven Eastern European countries. Most study participants were women (61.4%), and their mean age was 36.7 years.
All study participants were randomly assigned to receive intravenous infusions every 4 weeks of 300 mg of biosim-NTZ or reference natalizumab (ref-NTZ) for a total of 12 infusions.
At week 24, 30 patients were switched from ref-NTZ to biosim-NTZ for the remainder of their infusions. Including such a population is required by regulatory agencies to ensure switching patients from a drug they’ve been taking to a new biosimilar does not introduce any concerns, said Dr. Hemmer.
Comparable efficacy, safety profile
The primary efficacy endpoint was the cumulative number of new active brain lesions on MRI.
At baseline, 48.1% of the biosimilar group and 45.9% of the reference drug group had at least one gadolinium-enhancing lesion. In addition, 96.9% of the biosimilar group had more than 15 T2 lesions, compared with 96.2% of the reference group.
At week 24, the mean difference between biosim-NTZ and ref-NTZ in the cumulative number of new active lesions was 0.17 (least-squares means, 0.34 vs. 0.45), with a 95% confidence interval of –0.61 to 0.94, which fell entirely within the prespecified equivalence margins of ±2.1.
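The equivalence logic behind that comparison is simple: the biosimilar is declared equivalent when the whole confidence interval for the treatment difference lies inside the prespecified margins. A minimal sketch using the values reported in the article:

```python
def within_equivalence_margin(ci_low: float, ci_high: float, margin: float) -> bool:
    """Equivalence holds when the entire CI lies inside (-margin, +margin)."""
    return -margin < ci_low and ci_high < margin

# Trial values: 95% CI for the difference in new active lesions, margin ±2.1
print(within_equivalence_margin(-0.61, 0.94, 2.1))  # True
```

Note the contrast with a superiority test: here a wide interval is penalized, so equivalence cannot be "shown" simply by failing to find a difference in an underpowered study.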
The annualized relapse rate for biosim-NTZ and ref-NTZ was similar at 24 weeks (0.21 vs. 0.15), as well as at 48 weeks (0.17 vs. 0.13). For Expanded Disability Status Scale scores, which were similar between treatment groups at baseline (mean, 3.4 vs. 3.2), change at 24 and 48 weeks was minimal and similar in both groups.
The safety profile was as expected for patients with RRMS receiving natalizumab. There were few adverse events of special interest, with similar proportions across all treatment groups.
The overall adverse-event profile for patients who switched from ref-NTZ to biosim-NTZ was similar to patients continuing ref-NTZ treatment and did not indicate any new or increased risks associated with switching.
Rates of treatment-emergent adverse events (TEAEs) were similar, at 64.9% for biosim-NTZ, 68.9% for ref-NTZ, and 73.3% for the switch group. The most-reported TEAEs among all treatment groups were nervous system disorders and infections and infestations.
Progressive multifocal leukoencephalopathy (PML), a rare and potentially fatal demyelinating disease of the central nervous system, is associated with some DMTs – notably ref-NTZ. It is caused by infection with the John Cunningham virus (JCV, also referred to as human polyomavirus 2), the researchers noted.
As per the study protocol, no participant had a JCV-positive index of more than 1.5 at baseline. Proportions of patients positive for anti-JCV antibodies were similarly distributed between treatment groups throughout the study.
Similar immunogenicity
There was strong concordance regarding positivity for treatment-emergent antidrug antibodies between the biosim-NTZ and ref-NTZ groups (79.4% and 74.0%). This was also the case for antinatalizumab-neutralizing antibodies (69.0% and 66.2%).
“There was nothing that indicated immunogenicity is different” between the two agents, said Dr. Hemmer.
While this might change “when you look at longer time periods,” antibodies to natalizumab usually develop “very early on,” he added.
Dr. Hemmer noted that this comparison of the proposed biosimilar with the reference drug had no real surprises.
“If the immunogenicity is the same, the mode of action is the same, and the dose is the same, you would expect to have a similar clinical effect and also a similar side-effect profile, which is indeed the case,” he said.
Dr. Hemmer added that he has no insight as to when the drug might be approved but believes developers expect that to occur sometime this year.
Welcome results
Commenting on the study results, Torge Rempe, MD, assistant professor in the department of neurology, University of Florida, Gainesville, and the William T. and Janice M. Neely professor for research in MS, said he welcomes these new results showing the biosimilar matched the reference medication.
“The authors report no significant difference in their primary endpoint of cumulative number of active lesions as well as their secondary clinical endpoints of annualized relapse rate and changes from baseline Expanded Disability Status Scale scores,” said Dr. Rempe, who was not involved with the research.
The study also showed the reported adverse events were similar between the biosimilar and reference natalizumab, he noted.
However, although no cases of PML were uncovered during the study period, further research is needed to determine long-term safety in this area, Dr. Rempe said.
Finally, he agreed that the development of biosimilars such as this one addresses the issue of high annual costs for DMTs, an area of concern in the field of MS.
The study was funded by Polpharma Biologics. Dr. Hemmer has reported receiving personal fees from Polpharma and Sandoz during the conduct of the study and personal fees from Novartis, Biocom, and TG Therapeutics outside the submitted work. He has also received a patent for genetic determinants of antibodies against interferon-beta and a patent for KIR4.1 antibodies in MS; served on scientific advisory boards for Novartis; served as a data monitoring and safety committee member for AllergyCare, Polpharma Biologics, Sandoz, and TG Therapeutics; and received speaker honoraria from Desitin, grants from Regeneron for MS research, and funding from the Multiple MS EU consortium, the CLINSPECT-M consortium, and the German Research Foundation. Dr. Rempe has reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
An agent biologically similar to the humanized monoclonal antibody natalizumab is as effective and safe as the original reference drug for relapsing remitting multiple sclerosis (RRMS) – and has a similar level of immunogenicity, new research shows.
The investigators noted that these phase 3 trial findings are the final stage in the regulatory approval process.
“There will be a biosimilar that with respect to all parameters – efficacy, side effects, immunogenicity – doesn’t differ from the original drug and will probably be an option to consider to reduce treatment costs in MS,” said lead investigator Bernhard Hemmer, MD, a professor in the department of neurology, Technical University of Munich (Germany).
The findings were published online in JAMA Neurology.
Potential cost savings
Disease-modifying therapies (DMTs), particularly targeted biologics, have revolutionized the treatment of MS, including RRMS. Natalizumab, which was the first targeted biologic therapy approved for RRMS, is very effective and widely used, Dr. Hemmer said.
However, this and other DMTs are costly. Biosimilars, which are medicines clinically similar to an already marketed reference biologic medicine, can address this issue. In the areas of rheumatology and oncology, biosimilars have already demonstrated significant cost savings and improved treatment access.
The biosimilar natalizumab (biosim-NTZ), developed by Polpharma Biologics, is the first biosimilar monoclonal antibody therapy to be developed for MS.
Health authorities such as the Food and Drug Administration require comparative phase 3 studies to confirm there are no clinically relevant differences between a proposed biosimilar and its reference medicine.
The new multicenter, phase 3, double-blind, randomized trial – known as Antelope – included 264 adult patients with RRMS at 48 centers in seven Eastern European countries. Most study participants were women (61.4%), and their mean age was 36.7 years.
All study participants were randomly assigned to receive intravenous infusions every 4 weeks of 300 mg of biosim-NTZ or reference natalizumab (ref-NTZ) for a total of 12 infusions.
At week 24, 30 patients were switched from ref-NTZ to biosim-NTZ for the remainder of their infusions. Including such a population is required by regulatory agencies to ensure switching patients from a drug they’ve been taking to a new biosimilar does not introduce any concerns, said Dr. Hemmer.
Comparable efficacy, safety profile
The primary efficacy endpoint was the cumulative number of new active brain lesions on MRI.
At baseline, 48.1% of the biosimilar group and 45.9% of the reference drug group had at least one gadolinium-enhancing lesion. In addition, 96.9% of the biosimilar group had more than 15 T2 lesions, compared with 96.2% of the reference group.
At week 24, the mean difference between biosim-NTZ and ref-NTZ in the cumulative number of new active lesions was 0.17 (least square means, 0.34 vs. 0.45), with a 95% confidence interval of –0.61 to 0.94 and a point estimate within the prespecified margins of ± 2.1.
The annualized relapse rate for biosim-NTZ and ref-NTZ was similar at 24 weeks (0.21 vs. 0.15), as well as at 48 weeks (0.17 vs. 0.13). For Expanded Disability Status Scale scores, which were similar between treatment groups at baseline (mean, 3.4 vs. 3.2), change at 24 and 48 weeks was minimal and similar in both groups.
The safety profile was as expected for patients with RRMS receiving natalizumab. There were few adverse events of special interest, with similar proportions across all treatment groups.
The overall adverse-event profile for patients who switched from ref-NTZ to biosim-NTZ was similar to patients continuing ref-NTZ treatment and did not indicate any new or increased risks associated with switching.
Rates of treatment-emergent adverse events (TEAEs) were similar, at 64.9% for biosim-NTZ, 68.9% for ref-NTZ, and 73.3% for the switch group. The most-reported TEAEs among all treatment groups were nervous system disorders and infections and infestations.
Progressive multifocal leukoencephalopathy (PML), a rare and potentially fatal demyelinating disease of the central nervous system, is associated with some DMTs – notably ref-NTZ. It is caused by infection with the John Cunningham virus (JCV) (also referred to as human polyomavirus), the researchers noted.
As per the study protocol, no participant had a JCV-positive index of more than 1.5 at baseline. Proportions of patients positive for anti-JCV antibodies were similarly distributed between treatment groups throughout the study.
Similar immunogenicity
There was strong concordance regarding positivity for treatment-emergent antidrug antibodies between the biosim-NTZ and ref-NTZ groups (79.4% and 74.0%). This was also the case for antinatalizumab-neutralizing antibodies (69.0% and 66.2%).
“There was nothing that indicated immunogenicity is different” between the two agents, said Dr. Hemmer.
While this might change “when you look at longer time periods,” antibodies to natalizumab usually develop “very early on,” he added.
Dr. Hemmer noted that this comparison of the proposed biosimilar with the reference drug had no real surprises.
“If the immunogenicity is the same, the mode of action is the same, and the dose is the same, you would expect to have a similar clinical effect and also a similar side-effect profile, which is indeed the case,” he said.
An agent biologically similar to the humanized monoclonal antibody natalizumab is as effective and safe as the original reference drug for relapsing-remitting multiple sclerosis (RRMS) – and has a similar level of immunogenicity, new research shows.
The investigators noted that these phase 3 trial findings are the final stage in the regulatory approval process.
“There will be a biosimilar that with respect to all parameters – efficacy, side effects, immunogenicity – doesn’t differ from the original drug and will probably be an option to consider to reduce treatment costs in MS,” said lead investigator Bernhard Hemmer, MD, a professor in the department of neurology, Technical University of Munich (Germany).
The findings were published online in JAMA Neurology.
Potential cost savings
Disease-modifying therapies (DMTs), particularly targeted biologics, have revolutionized the treatment of MS, including RRMS. Natalizumab, which was the first targeted biologic therapy approved for RRMS, is very effective and widely used, Dr. Hemmer said.
However, this and other DMTs are costly. Biosimilars, which are medicines clinically similar to an already marketed reference biologic medicine, can address this issue. In the areas of rheumatology and oncology, biosimilars have already demonstrated significant cost savings and improved treatment access.
The biosimilar natalizumab (biosim-NTZ), developed by Polpharma Biologics, is the first biosimilar monoclonal antibody therapy to be developed for MS.
Health authorities such as the Food and Drug Administration require comparative phase 3 studies to confirm there are no clinically relevant differences between a proposed biosimilar and its reference medicine.
The new multicenter, phase 3, double-blind, randomized trial – known as Antelope – included 264 adult patients with RRMS at 48 centers in seven Eastern European countries. Most study participants were women (61.4%), and their mean age was 36.7 years.
All study participants were randomly assigned to receive intravenous infusions every 4 weeks of 300 mg of biosim-NTZ or reference natalizumab (ref-NTZ) for a total of 12 infusions.
At week 24, 30 patients were switched from ref-NTZ to biosim-NTZ for the remainder of their infusions. Including such a population is required by regulatory agencies to ensure switching patients from a drug they’ve been taking to a new biosimilar does not introduce any concerns, said Dr. Hemmer.
Comparable efficacy, safety profile
The primary efficacy endpoint was the cumulative number of new active brain lesions on MRI.
At baseline, 48.1% of the biosimilar group and 45.9% of the reference drug group had at least one gadolinium-enhancing lesion. In addition, 96.9% of the biosimilar group had more than 15 T2 lesions, compared with 96.2% of the reference group.
At week 24, the mean difference between biosim-NTZ and ref-NTZ in the cumulative number of new active lesions was 0.17 (least square means, 0.34 vs. 0.45), with a 95% confidence interval of –0.61 to 0.94 and a point estimate within the prespecified margins of ± 2.1.
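The equivalence logic described here – a between-group difference judged by whether its confidence interval falls entirely inside prespecified margins – can be sketched as follows. The numbers are taken from the trial report, but the helper function is purely illustrative, not the authors' analysis code.

```python
# Illustrative equivalence check: the biosimilar is considered comparable
# when the entire 95% CI for the between-group difference in new active
# lesions lies inside the prespecified margin of +/- 2.1.

def within_equivalence_margin(ci_low: float, ci_high: float, margin: float) -> bool:
    """Return True if the whole confidence interval lies inside [-margin, +margin]."""
    return -margin <= ci_low and ci_high <= margin

# Values reported for the Antelope trial at week 24
diff = 0.17          # mean difference, biosim-NTZ vs. ref-NTZ
ci = (-0.61, 0.94)   # 95% confidence interval
margin = 2.1         # prespecified equivalence margin

print(within_equivalence_margin(*ci, margin))  # True: equivalence criterion met
```

Note that equivalence trials reverse the usual logic: rather than looking for a significant difference, they ask whether any plausible difference is small enough to be clinically irrelevant.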
The annualized relapse rate for biosim-NTZ and ref-NTZ was similar at 24 weeks (0.21 vs. 0.15), as well as at 48 weeks (0.17 vs. 0.13). For Expanded Disability Status Scale scores, which were similar between treatment groups at baseline (mean, 3.4 vs. 3.2), change at 24 and 48 weeks was minimal and similar in both groups.
The safety profile was as expected for patients with RRMS receiving natalizumab. There were few adverse events of special interest, with similar proportions across all treatment groups.
The overall adverse-event profile for patients who switched from ref-NTZ to biosim-NTZ was similar to patients continuing ref-NTZ treatment and did not indicate any new or increased risks associated with switching.
Rates of treatment-emergent adverse events (TEAEs) were similar, at 64.9% for biosim-NTZ, 68.9% for ref-NTZ, and 73.3% for the switch group. The most-reported TEAEs among all treatment groups were nervous system disorders and infections and infestations.
Progressive multifocal leukoencephalopathy (PML), a rare and potentially fatal demyelinating disease of the central nervous system, is associated with some DMTs – notably ref-NTZ. It is caused by infection with the John Cunningham virus (JCV) (also referred to as human polyomavirus), the researchers noted.
As per the study protocol, no participant had a JCV-positive index of more than 1.5 at baseline. Proportions of patients positive for anti-JCV antibodies were similarly distributed between treatment groups throughout the study.
Similar immunogenicity
There was strong concordance regarding positivity for treatment-emergent antidrug antibodies between the biosim-NTZ and ref-NTZ groups (79.4% and 74.0%). This was also the case for antinatalizumab-neutralizing antibodies (69.0% and 66.2%).
“There was nothing that indicated immunogenicity is different” between the two agents, said Dr. Hemmer.
While this might change “when you look at longer time periods,” antibodies to natalizumab usually develop “very early on,” he added.
Dr. Hemmer noted that this comparison of the proposed biosimilar with the reference drug had no real surprises.
“If the immunogenicity is the same, the mode of action is the same, and the dose is the same, you would expect to have a similar clinical effect and also a similar side-effect profile, which is indeed the case,” he said.
Dr. Hemmer added that he has no insight as to when the drug might be approved but believes developers expect that to occur sometime this year.
Welcome results
Commenting on the study results, Torge Rempe, MD, assistant professor in the department of neurology, University of Florida, Gainesville, and the William T. and Janice M. Neely Professor for Research in MS, said he welcomes these new results showing the biosimilar matched the reference medication.
“The authors report no significant difference in their primary endpoint of cumulative number of active lesions as well as their secondary clinical endpoints of annualized relapse rate and changes from baseline Expanded Disability Status Scale scores,” said Dr. Rempe, who was not involved with the research.
The study also showed the reported adverse events were similar between the biosimilar and reference natalizumab, he noted.
However, although no cases of PML were uncovered during the study period, further research is needed to determine long-term safety in this area, Dr. Rempe said.
Finally, he agreed that the development of biosimilars such as this one addresses the issue of high annual costs for DMTs, an area of concern in the field of MS.
The study was funded by Polpharma Biologics. Dr. Hemmer has reported receiving personal fees from Polpharma and Sandoz during the conduct of the study and personal fees from Novartis, Biocom, and TG Therapeutics outside the submitted work. He has also received a patent for genetic determinants of antibodies against interferon-beta and a patent for KIR4.1 antibodies in MS; served on scientific advisory boards for Novartis; served as a data monitoring and safety committee member for AllergyCare, Polpharma Biologics, Sandoz, and TG Therapeutics; and received speaker honoraria from Desitin, grants from Regeneron for MS research, and funding from the Multiple MS EU consortium, the CLINSPECT-M consortium, and the German Research Foundation. Dr. Rempe has reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM JAMA NEUROLOGY
Minorities with epilepsy blocked from receiving ‘highest quality of care’
Racial and ethnic minority patients with epilepsy are less likely than White patients to receive newer-generation antiseizure medications (ASMs), new research shows.
Even after controlling for epilepsy severity, comorbid conditions, and other factors that might affect medication choice, researchers found that newer medication use was 29% less likely in Black patients, 23% less likely in Native Hawaiian and other Pacific Islander patients, and 7% less likely in Hispanic patients, compared with White individuals.
“I hope that clinicians will see from our findings that minoritized patients with epilepsy face a myriad of barriers in receiving the highest quality of care, including ASM use,” said lead investigator Wyatt P. Bensken, PhD, adjunct assistant professor of Population and Quantitative Health Sciences at Case Western Reserve University, Cleveland. “Considering your patients’ barriers, and how that influences their care – including ASM selection – will be critical to helping reduce these population-level inequities.”
The study was published online in Neurology Clinical Practice.
A prompt for practice change
For the study, researchers used Medicaid claims for more than 78,000 people who had filled at least two prescriptions for an ASM between 2010 and 2014.
Most patients were White (53.4%); 22.6% were Black; 11.9% were Hispanic; 1.6% were Asian; 1.5% were Native Hawaiian or other Pacific Islander; 0.6% were American Indian or Alaskan Native; and 8.3% were classified as “other.”
One-quarter of participants were taking an older ASM, such as carbamazepine, phenytoin, and valproate. About 65% were taking second-generation ASMs, including gabapentin, levetiracetam, and zonisamide. A little less than 10% were taking lacosamide, perampanel, or another third-generation ASM.
Compared with White patients, newer medication prescriptions were significantly less likely in Black individuals (adjusted odds ratio, 0.71; 95% confidence interval, 0.68-0.75), Native Hawaiian or other Pacific Islanders (aOR, 0.77; 95% CI, 0.67-0.88), and Hispanic patients (aOR, 0.93; 95% CI, 0.88-0.99).
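The percentage figures quoted earlier (“29% less likely,” and so on) are the adjusted odds ratios re-expressed as a percent reduction, i.e. (1 − aOR) × 100. A minimal sketch of that conversion (the function name is ours, not the study's):

```python
# Re-express an adjusted odds ratio below 1.0 as the "X% less likely"
# phrasing used in the text: (1 - aOR) * 100, rounded to whole percent.

def percent_less_likely(aor: float) -> int:
    return round((1 - aor) * 100)

aors = {"Black": 0.71, "Native Hawaiian/Pacific Islander": 0.77, "Hispanic": 0.93}
for group, aor in aors.items():
    print(group, percent_less_likely(aor))  # 29, 23, and 7, respectively
```

Strictly speaking, an odds ratio only approximates a relative risk when the outcome is common, but this shorthand is how the figures in the article were derived.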
Third-generation ASMs were used by 10.7% of White patients versus 6% of Black individuals and 5.1% of American Indian or Alaskan Native patients.
Researchers also found that taking a second-generation ASM was associated with better treatment adherence (aOR, 1.17; 95% CI, 1.11-1.23) and that patients on newer ASMs were more than three times as likely to be under the care of a neurologist (aOR, 3.26; 95% CI, 3.13-3.41).
The findings draw attention to racial inequities surrounding access to medication and specialists and subspecialists, Dr. Bensken said. Identifying specific barriers and developing solutions is the long-range goal, he added.
“In the interim, increasing the attention to these inequities will, we hope, prompt changes across practices,” Dr. Bensken said.
A ‘wake-up call’
Commenting on the findings, Joseph Sirven, MD, professor of neurology at the Mayo Clinic Florida, Jacksonville, said the results were “striking” because newer ASMs are generally the go-to for most physicians who treat epilepsy.
“Use of first-generation ASMs is typically reserved [for] if one runs out of options,” Dr. Sirven said.
This study and others like it should serve as a “wake-up call” for clinicians, Dr. Sirven added.
“This study is important because it shows that whether we realize it or not, race and ethnicities are playing a role in ASM, and this is related to financial access to newer-generation drugs,” he said. “Similar findings are seen in impoverished countries where first-generation ASM drugs are routinely used because of drug pricing.”
More to explore
Also commenting on the study, Scott Mintzer, MD, a professor and director of the Epilepsy Monitoring Unit at Thomas Jefferson University, Philadelphia, said using first-generation ASMs as a proxy for quality of care is “a very innovative concept.”
“From that perspective, the finding that racial minority patients are more likely to be on a first-generation drug is not surprising. But after that it gets far more complicated to interpret,” he added.
Neither adherence nor care by a neurologist was different in a consistent direction within the various minority populations, Dr. Mintzer noted. In addition, Black patients were as likely to see a neurologist as White patients but still more likely to be on a first-generation drug.
There are also a few caveats to the findings that should be considered, Dr. Mintzer added. First, the sample included only Medicaid recipients, nearly 35% of whom had a comorbid psychosis. Those and other characteristics of the study pool suggest participants aren’t representative of the United States population as a whole. Second, significant shifts in ASM use have occurred since the study data cutoff in 2014, none of which are reflected in these findings.
“So, I don’t think we can really say how to address this yet,” Dr. Mintzer said. “There’s a lot to explore about whether this is still occurring, how generalizable these findings are, and what they might be due to, as there are a host of potential explanations, which the authors themselves acknowledge.”
The study was funded by the U.S. Centers for Disease Control and Prevention and the National Institute on Minority Health and Health Disparities. Dr. Bensken has received support for this work from NIMHD and serves on the Editorial Board of the journal Neurology. Dr. Sirven and Dr. Mintzer report no relevant financial relationships.
A version of this article originally appeared on Medscape.com.
FROM NEUROLOGY CLINICAL PRACTICE
Social isolation hikes dementia risk in older adults
Social isolation is associated with an increased risk for dementia in older adults, new research suggests. Results from a longitudinal study that included more than 5,000 United States–based seniors showed that nearly one-quarter were socially isolated.
After adjusting for demographic and health factors, social isolation was found to be associated with a 28% higher risk for developing dementia over a 9-year period, compared with non-isolation. In addition, this finding held true regardless of race or ethnicity.
“Social connections are increasingly understood as a critical factor for the health of individuals as they age,” senior study author Thomas K.M. Cudjoe, MD, Robert and Jane Meyerhoff Endowed Professor and assistant professor of medicine, Division of Geriatric Medicine and Gerontology, Johns Hopkins University School of Medicine, Baltimore, said in a press release. “Our study expands our understanding of the deleterious impact of social isolation on one’s risk for dementia over time,” Dr. Cudjoe added.
The findings were published online in the Journal of the American Geriatrics Society.
Upstream resources, downstream outcomes
Social isolation is a “multidimensional construct” characterized by factors such as social connections, social support, resource sharing, and relationship strain. It also affects approximately a quarter of older adults, the investigators noted.
Although prior studies have pointed to an association between socially isolated older adults and increased risk for incident dementia, no study has described this longitudinal association in a nationally representative cohort of U.S. seniors.
Dr. Cudjoe said he was motivated to conduct the current study because he wondered whether or not older adults throughout the United States were similar to some of his patients “who might be at risk for worse cognitive outcomes because they lacked social contact with friends, family, or neighbors.”
The study was also “informed by conceptual foundation that upstream social and personal resources are linked to downstream health outcomes, including cognitive health and function,” the researchers added.
They turned to 2011-2020 data from the National Health and Aging Trends Study, a nationally representative, longitudinal cohort of U.S. Medicare beneficiaries. The sample was drawn from the Medicare enrollment file and incorporated 95 counties and 655 zip codes.
Participants (n = 5,022; mean age, 76.4 years; 57.2% women; 71.7% White, non-Hispanic; 42.4% having more than a college education) were community-dwelling older adults who completed annual 2-hour interviews that included assessment of function, economic status, health, and well-being. To be included, they had to attend at least the baseline and first follow-up visits.
NHATS “includes domains that are relevant for the characterization of social isolation,” the investigators wrote. It used a typology of structural social isolation that is informed by the Berkman-Syme Social Network Index.
Included domains were living arrangements, discussion networks, and participation. All are “clinically relevant, practical, and components of a comprehensive social history,” the researchers noted.
They added that individuals classified as “socially isolated” often live alone, have no one or only one person that they can rely upon to discuss important matters, and have limited or no engagement in social or religious groups.
Social isolation in the study was characterized using questions about living with at least one other person, talking to two or more other people about “important matters” in the past year, attending religious services in the past month, and participating in the past month in such things as clubs, meetings, group activities, or volunteer work.
Wake-up call
Study participants received 1 point for each item/domain, with a sum score of 0 or 1 classified as “socially isolated” and 2 or more points considered “not socially isolated.” They were classified as having probable dementia based either on self-report or lower-than-mean performance in 2 or more cognitive domains, or a score indicating probable dementia on the AD8 Dementia Screening Interview.
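The scoring rule above can be made concrete with a short sketch. The four yes/no domains and the 0-or-1 cutoff come from the study description; the function and parameter names are ours, introduced only for illustration.

```python
# Illustrative NHATS-style social isolation score: one point per domain,
# with a total of 0 or 1 classified as "socially isolated".

def isolation_score(lives_with_someone: bool,
                    two_plus_confidants: bool,
                    attended_religious_services: bool,
                    participated_in_groups: bool) -> int:
    """Sum one point for each social-connection domain that applies."""
    return sum([lives_with_someone, two_plus_confidants,
                attended_religious_services, participated_in_groups])

def is_socially_isolated(score: int) -> bool:
    """Scores of 0 or 1 are classified as socially isolated."""
    return score <= 1

# Example: lives alone, has one confidant, no service attendance or groups
score = isolation_score(False, True, False, False)
print(score, is_socially_isolated(score))  # 1 True
```

A participant would need at least two of the four domains present to fall into the “not socially isolated” group.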
Covariates included demographic factors, education, and health factors. Mean follow-up was 5.1 years.
Results showed close to one-quarter (23.3%) of the study population was classified as socially isolated, with one-fifth (21.1%) developing dementia by the end of the follow-up period.
Compared with non-isolated older adults, those who were socially isolated were more likely to develop dementia during the follow-up period (25.9% vs. 19.6%, respectively).
After adjusting for demographic factors, social isolation was significantly associated with a higher risk for incident dementia (hazard ratio, 1.33; 95% confidence interval, 1.13-1.56). This association persisted after further adjustment for health factors (HR, 1.27; 95% CI, 1.08-1.49). Race and ethnicity had no bearing on the association.
In addition to the association between social isolation and dementia, the researchers also estimated the cause-specific hazard of death before dementia and found that, overall, 18% of participants died prior to dementia over the follow-up period. In particular, the social isolation–associated cause-specific HR of death before dementia was 1.28 (95% CI, 1.2-1.5).
Dr. Cudjoe noted that the mechanism behind the association between social isolation and dementia in this population needs further study. Still, he hopes that the findings will “serve as a wake-up call for all of us to be more thoughtful of the role of social connections on our cognitive health.”
Clinicians “should be thinking about and assessing the presence or absence of social connections in their patients,” Dr. Cudjoe added.
‘Instrumental role’
Commenting on the study, Nicole Purcell, DO, neurologist and senior director of clinical practice at the Alzheimer’s Association, said the study “contributes to the growing body of evidence that finds social isolation is a serious public health risk for many seniors living in the United States, increasing their risk for dementia and other serious mental conditions.”
Dr. Purcell, who was not involved with the study, added that “health care systems and medical professionals can play an instrumental role in identifying individuals at risk for social isolation.”
She noted that for those experiencing social isolation, “interaction with health care providers may be one of the few opportunities those individuals have for social engagement, [so] using these interactions to identify individuals at risk for social isolation and referring them to local resources and groups that promote engagement, well-being, and access to senior services may help decrease dementia risk for vulnerable seniors.”
Dr. Purcell added that the Alzheimer’s Association offers early-stage programs throughout the country, including support groups, education, art, music, and other socially engaging activities.
The study was funded by the National Institute on Aging, National Institute on Minority Health and Health Disparities, and Secunda Family Foundation. The investigators and Dr. Purcell have reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
, new research suggests. Results from a longitudinal study that included more than 5,000 United States–based seniors showed that nearly one-quarter were socially isolated.
After adjusting for demographic and health factors, social isolation was found to be associated with a 28% higher risk for developing dementia over a 9-year period, compared with non-isolation. In addition, this finding held true regardless of race or ethnicity.
“Social connections are increasingly understood as a critical factor for the health of individuals as they age,” senior study author Thomas K.M. Cudjoe, MD, Robert and Jane Meyerhoff Endowed Professor and assistant professor of medicine, Division of Geriatric Medicine and Gerontology, Johns Hopkins University School of Medicine, Baltimore, said in a press release. “Our study expands our understanding of the deleterious impact of social isolation on one’s risk for dementia over time,” Dr. Cudjoe added.
The findings were published online in the Journal of the American Geriatrics Society.
Upstream resources, downstream outcomes
Social isolation is a “multidimensional construct” characterized by factors such as social connections, social support, resource sharing, and relationship strain. It also affects approximately a quarter of older adults, the investigators noted.
Although prior studies have pointed to an association between socially isolated older adults and increased risk for incident dementia, no study has described this longitudinal association in a nationally representative cohort of U.S. seniors.
Dr. Cudjoe said he was motivated to conduct the current study because he wondered whether older adults throughout the United States were similar to some of his patients “who might be at risk for worse cognitive outcomes because they lacked social contact with friends, family, or neighbors.”
The study was also “informed by [a] conceptual foundation that upstream social and personal resources are linked to downstream health outcomes, including cognitive health and function,” the researchers added.
They turned to 2011-2020 data from the National Health and Aging Trends Study, a nationally representative, longitudinal cohort of U.S. Medicare beneficiaries. The sample was drawn from the Medicare enrollment file and incorporated 95 counties and 655 zip codes.
Participants (n = 5,022; mean age, 76.4 years; 57.2% women; 71.7% White, non-Hispanic; 42.4% having more than a college education) were community-dwelling older adults who completed annual 2-hour interviews that included assessment of function, economic and health status, and well-being. To be included, they had to attend at least the baseline and first follow-up visits.
NHATS “includes domains that are relevant for the characterization of social isolation,” the investigators wrote. It used a typology of structural social isolation that is informed by the Berkman-Syme Social Network Index.
Included domains were living arrangements, discussion networks, and participation. All are “clinically relevant, practical, and components of a comprehensive social history,” the researchers noted.
They added that individuals classified as “socially isolated” often live alone, have no one or only one person that they can rely upon to discuss important matters, and have limited or no engagement in social or religious groups.
Social isolation in the study was characterized using questions about living with at least one other person, talking to two or more other people about “important matters” in the past year, attending religious services in the past month, and participating in the past month in such things as clubs, meetings, group activities, or volunteer work.
Wake-up call
Study participants received 1 point for each item/domain, with a sum score of 0 or 1 classified as “socially isolated” and 2 or more points considered “not socially isolated.” They were classified as having probable dementia based either on self-report or lower-than-mean performance in 2 or more cognitive domains, or a score indicating probable dementia on the AD8 Dementia Screening Interview.
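The scoring rule described above is simple enough to sketch in code. The following is an illustrative Python rendering of the article's description, not the study's actual NHATS processing code; the function and parameter names are my own:

```python
# Sketch of the social-isolation scoring described in the article:
# one point per domain, with a sum of 0 or 1 classified as "socially isolated."
def isolation_score(lives_with_others: bool,
                    discussion_partners: int,
                    attended_religious_services: bool,
                    participated_in_groups: bool) -> int:
    """Return the number of social-connection domains met (0-4)."""
    return (int(lives_with_others)
            + int(discussion_partners >= 2)   # two or more confidants
            + int(attended_religious_services)
            + int(participated_in_groups))

def is_socially_isolated(score: int) -> bool:
    # 0 or 1 point -> "socially isolated"; 2 or more -> "not socially isolated"
    return score <= 1

# Example: lives alone, one confidant, no services or group activities
score = isolation_score(False, 1, False, False)
print(score, is_socially_isolated(score))  # → 0 True
```

The same classification could be applied per participant-year in the longitudinal data; the study used the baseline classification for its hazard models.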
Covariates included demographic factors, education, and health factors. Mean follow-up was 5.1 years.
Results showed close to one-quarter (23.3%) of the study population was classified as socially isolated, with one-fifth (21.1%) developing dementia by the end of the follow-up period.
Those who were socially isolated were more likely than non-isolated older adults to develop dementia during the follow-up period (25.9% vs. 19.6%).
After adjusting for demographic factors, social isolation was significantly associated with a higher risk for incident dementia (hazard ratio, 1.33; 95% confidence interval, 1.13-1.56). This association persisted after further adjustment for health factors (HR, 1.27; 95% CI, 1.08-1.49). Race and ethnicity had no bearing on the association.
In addition to the association between social isolation and dementia, the researchers also estimated the cause-specific hazard of death before dementia and found that, overall, 18% of participants died prior to dementia over the follow-up period. In particular, the social isolation–associated cause-specific HR of death before dementia was 1.28 (95% CI, 1.2-1.5).
Dr. Cudjoe noted that the mechanism behind the association between social isolation and dementia in this population needs further study. Still, he hopes that the findings will “serve as a wake-up call for all of us to be more thoughtful of the role of social connections on our cognitive health.”
Clinicians “should be thinking about and assessing the presence or absence of social connections in their patients,” Dr. Cudjoe added.
‘Instrumental role’
Commenting on the study, Nicole Purcell, DO, neurologist and senior director of clinical practice at the Alzheimer’s Association, said the study “contributes to the growing body of evidence that finds social isolation is a serious public health risk for many seniors living in the United States, increasing their risk for dementia and other serious mental conditions.”
Dr. Purcell, who was not involved with the study, added that “health care systems and medical professionals can play an instrumental role in identifying individuals at risk for social isolation.”
She noted that for those experiencing social isolation, “interaction with health care providers may be one of the few opportunities those individuals have for social engagement, [so] using these interactions to identify individuals at risk for social isolation and referring them to local resources and groups that promote engagement, well-being, and access to senior services may help decrease dementia risk for vulnerable seniors.”
Dr. Purcell added that the Alzheimer’s Association offers early-stage programs throughout the country, including support groups, education, art, music, and other socially engaging activities.
The study was funded by the National Institute on Aging, National Institute on Minority Health and Health Disparities, and Secunda Family Foundation. The investigators and Dr. Purcell have reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Diet packed with fast food found hard on the liver
The study finds that getting one-fifth or more of total daily calories from fast food can increase the risk of nonalcoholic fatty liver disease, which can lead to cirrhosis and its complications, including liver failure and liver cancer.
Although the magnitude of association was modest among the general population, “striking” elevations in steatosis were evident among persons with obesity and diabetes who consumed fast food, in comparison with their counterparts who did not have obesity and diabetes, the researchers reported.
“My hope is that this study encourages people to seek out more nutritious, healthy food options and provides information that clinicians can use to counsel their patients, particularly those with underlying metabolic risk factors, of the importance of avoiding foods that are high in fat, carbohydrates, and processed sugars,” lead investigator Ani Kardashian, MD, hepatologist with the University of Southern California, Los Angeles, said in an interview.
“At a policy level, public health efforts are needed to improve access to affordable, healthy, and nutritious food options across the U.S. This is especially important as more people have turned to fast foods during the pandemic and as the price of food has risen dramatically over the past year due to food inflation,” Dr. Kardashian added.
The study was published online in Clinical Gastroenterology and Hepatology.
More fast food, greater steatosis
The findings are based on data from 3,954 adults who participated in the 2017-2018 National Health and Nutrition Examination Survey (NHANES), underwent vibration-controlled transient elastography, and had 1- or 2-day dietary recall data available.
Steatosis, the primary outcome, was measured via controlled attenuation parameter (CAP). Two validated cutoffs were utilized (CAP ≥ 263 dB/m and CAP ≥ 285 dB/m).
Of those surveyed, 52% consumed any fast food, and 29% derived 20% or more of their daily calories from fast food.
Fast-food intake of 20% or more of daily calories was significantly associated with greater steatosis after multivariable adjustment, both as a continuous measure (4.6 dB/m higher CAP score) and with respect to the CAP ≥ 263 dB/m cutoff (odds ratio [OR], 1.45).
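To make the two-cutoff design concrete, here is a minimal sketch of how a CAP measurement maps onto the study's steatosis categories. The cutoff values come from the article; the function name and dictionary keys are my own illustration, not the authors' code:

```python
# CAP (controlled attenuation parameter) cutoffs from the study, in dB/m
CAP_CUTOFF_LOW = 263   # first validated steatosis cutoff
CAP_CUTOFF_HIGH = 285  # second, stricter validated cutoff

def meets_cutoffs(cap_db_m: float) -> dict:
    """Return whether a CAP measurement meets each steatosis cutoff."""
    return {
        "steatosis_263": cap_db_m >= CAP_CUTOFF_LOW,
        "steatosis_285": cap_db_m >= CAP_CUTOFF_HIGH,
    }

# A reading of 270 dB/m meets the lower cutoff but not the stricter one
print(meets_cutoffs(270))  # → {'steatosis_263': True, 'steatosis_285': False}
```

The odds ratios reported in the study compare the likelihood of meeting each cutoff between high and low fast-food consumers, after multivariable adjustment.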
“The negative effects are particularly severe in people who already have diabetes and obesity,” Dr. Kardashian told this news organization.
For example, with diabetes and fast-food intake of 20% or more of daily calories, the ORs of meeting the CAP ≥ 263 dB/m cutoff and the CAP ≥ 285 dB/m cutoff were 2.3 and 2.48, respectively.
The researchers said their findings are particularly “alarming,” given the overall increase in fast-food consumption over the past 50 years in the United States, regardless of socioeconomic status.
Diet coaching
The finding that fast food has more deleterious impact on those with obesity and diabetes “emphasizes that it is not just one insult but multiple factors that contribute to overall health,” said Nancy Reau, MD, section chief of hepatology at Rush University Medical Center in Chicago.
“This is actually great news, because diet is modifiable, vs. your genetics, which you currently can’t change. This doesn’t mean if you’re lean you can eat whatever you want, but if you are overweight, being careful with your diet does have impact, even if it doesn’t lead to substantial weight changes,” said Dr. Reau, who is not affiliated with the study.
For people who have limited options and need to eat fast food, “there are healthy choices at most restaurants; you just need to be smart about reading labels, watching calories, and ordering the healthier options,” Dr. Reau said in an interview.
Fast food and fatty liver go “hand in hand,” Lisa Ganjhu, DO, gastroenterologist and hepatologist at NYU Langone Health in New York, told this news organization.
“I counsel and coach my patients on healthy diet and exercise, and I’ve been pretty successful,” said Dr. Ganjhu, who was not involved with the study.
“If my patient is eating at McDonald’s a lot, I basically walk through the menu with them and help them find something healthy. When patients see the benefits of cutting out fat and reducing carbohydrates, they are more apt to continue,” Dr. Ganjhu said.
The study was funded by the University of Southern California. Dr. Kardashian, Dr. Reau, and Dr. Ganjhu have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Nearly 50% of patients with dementia experience falls
, suggests new research that also identifies multiple risk factors for these falls.
In a study of more than 5,500 participants, 45.5% of those with dementia experienced one or more falls, compared with 30.9% of their peers without dementia.
Vision impairment and living with a spouse were among the strongest predictors of future fall risk among participants living with dementia. Interestingly, high neighborhood social deprivation, which is reflected by such things as income and education, was associated with lower odds of falling.
Overall, the results highlight the need for a multidisciplinary approach to preventing falls among elderly individuals with dementia, said lead author Safiyyah M. Okoye, PhD, assistant professor, College of Nursing and Health Professions, Drexel University, Philadelphia.
“We need to consider different dimensions and figure out how we can try to go beyond the clinic in our interactions,” she said.
Dr. Okoye noted that in addition to reviewing medications that may contribute to falls and screening for vision problems, clinicians might also consider resources to improve the home environment and ensure that families have appropriate caregiving.
The findings were published online in Alzheimer’s & Dementia: The Journal of the Alzheimer’s Association.
No ‘silver bullet’
Every year, falls cause millions of injuries in older adults, and those with dementia are especially vulnerable. This population has twice the risk of falling and up to three times the risk of incurring serious fall-related injuries, such as fractures, the researchers noted.
Falls are a leading cause of hospitalization among those with dementia. Previous evidence has shown that persons with dementia are more likely to experience negative health consequences, such as delirium, while in hospital, compared with those without dementia. Even minor fall-related injuries are associated with the patient’s being discharged to a nursing home rather than returning home.
Dr. Okoye stressed that many factors contribute to falls, including health status; function, such as the ability to walk and balance; medications; home environment; and activity level.
“There are multidimensional aspects, and we can’t just find one silver bullet to address falls. It should be addressed comprehensively,” she said.
Existing studies “overwhelmingly” focus on factors related to health and function that could be addressed in the doctor’s office or with a referral, rather than on environmental and social factors, Dr. Okoye noted.
And even though the risk of falling is high among community-dwelling seniors with dementia, very few studies have addressed the risk of falls among these adults, she added.
The new analysis included a nationally representative sample of 5,581 community-dwelling adults who participated in both the 2015 and 2016 National Health and Aging Trends Study (NHATS). The NHATS is a population-based survey of health and disability trends and trajectories among Americans aged 65 years and older.
During interviews, participants were asked, personally or by proxy, about falls during the previous 12 months. Having fallen at baseline was evaluated as a possible predictor of falls in the subsequent 12 months.
To determine probable dementia, researchers asked whether a doctor had ever told the participants that they had dementia or Alzheimer’s disease. They also used a dementia screening questionnaire and neuropsychological tests of memory, orientation, and executive function.
Of the total sample, most (n = 5,093) did not have dementia.
Physical environmental factors that were assessed included conditions at home, such as clutter, tripping hazards, and structural issues, as well as neighborhood social and economic deprivation – such as income, education levels, and employment status.
Fall rates and counterintuitive findings
Results showed that significantly more of those with dementia than without experienced one or more falls (45.5% vs. 30.9%; P < .001).
In addition, a history of falling was significantly associated with subsequent falls among those with dementia (odds ratio, 6.20; 95% confidence interval, 3.81-10.09), as were vision impairment (OR, 2.22; 95% CI, 1.12-4.40) and living with a spouse versus alone (OR, 2.43; 95% CI, 1.09-5.43).
A possible explanation for higher fall risk among those living with a partner is that those living alone usually have better functioning, the investigators noted. Also, live-in partners tend to be of a similar age as the person with dementia and may have challenges of their own.
Interestingly, high neighborhood social deprivation was associated with lower odds of falling (OR, 0.55 for the highest deprivation scores; 95% CI, 0.31-0.98), a finding Dr. Okoye said was “counterintuitive.”
This result could be related to the social environment, she noted. “Maybe there are more people around in the house, more people with eyes on the person, or more people in the community who know the person. Despite the low economic resources, there could be social resources there,” she said.
The new findings underscore the idea that falling is a multidimensional phenomenon among older adults with dementia as well as those without dementia, Dr. Okoye noted.
Doctors can play a role in reducing falls among patients with dementia by asking about falls, possibly eliminating medications that are associated with risk of falling, and screening for and correcting vision and hearing impairments, she suggested.
They may also help determine household hazards for a patient, such as clutter and poor lighting, and ensure that these are addressed, Dr. Okoye added.
No surprise
Commenting on the study, David S. Knopman, MD, a clinical neurologist at Mayo Clinic, Rochester, Minn., said the finding that visual impairment and a prior history of falling are predictive of subsequent falls “comes as no surprise.”
Dr. Knopman, whose research focuses on late-life cognitive disorders, was not involved with the current study.
Risk reduction is “of course” a key management goal, he said. “Vigilance and optimizing the patient’s living space to reduce fall risks are the major strategies,” he added.
Dr. Knopman reiterated that falls among those with dementia are associated with higher mortality and often lead to loss of the capacity to live outside of an institution.
The study was supported by the National Institute on Aging. The investigators and Dr. Knopman report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Older adults with dementia fall significantly more often than their peers without dementia, suggests new research that also identifies multiple risk factors for these falls.
In a study of more than 5,500 participants, 45.5% of those with dementia experienced one or more falls, compared with 30.9% of their peers without dementia.
Vision impairment and living with a spouse were among the strongest predictors of future fall risk among participants living with dementia. Interestingly, high neighborhood social deprivation, which is reflected by such things as income and education, was associated with lower odds of falling.
Overall, the results highlight the need for a multidisciplinary approach to preventing falls among elderly individuals with dementia, said lead author Safiyyah M. Okoye, PhD, assistant professor, College of Nursing and Health Professions, Drexel University, Philadelphia.
“We need to consider different dimensions and figure out how we can try to go beyond the clinic in our interactions,” she said.
Dr. Okoye noted that in addition to reviewing medications that may contribute to falls and screening for vision problems, clinicians might also consider resources to improve the home environment and ensure that families have appropriate caregiving.
The findings were published online in Alzheimer’s and Dementia: The Journal of the Alzheimer’s Association.
No ‘silver bullet’
Every year, falls cause millions of injuries in older adults, and those with dementia are especially vulnerable. This population has twice the risk of falling and up to three times the risk of incurring serious fall-related injuries, such as fractures, the researchers noted.
Falls are a leading cause of hospitalization among those with dementia. Previous evidence has shown that persons with dementia are more likely to experience negative health consequences, such as delirium, while in hospital, compared with those without dementia. Even minor fall-related injuries are associated with the patient’s being discharged to a nursing home rather than returning home.
Dr. Okoye stressed that many factors contribute to falls, including health status; function, such as the ability to walk and balance; medications; home environment; and activity level.
“There are multidimensional aspects, and we can’t just find one silver bullet to address falls. It should be addressed comprehensively,” she said.
Existing studies “overwhelmingly” focus on factors related to health and function that could be addressed in the doctor’s office or with a referral, rather than on environmental and social factors, Dr. Okoye noted.
Yet even though the risk of falling is high among community-dwelling seniors with dementia, very few studies have examined fall risk in this group, she added.
The new analysis included a nationally representative sample of 5,581 community-dwelling adults who participated in both the 2015 and 2016 National Health and Aging Trends Study (NHATS). The NHATS is a population-based survey of health and disability trends and trajectories among Americans aged 65 years and older.
During interviews, participants were asked, personally or by proxy, about falls during the previous 12 months. Having fallen at baseline was evaluated as a possible predictor of falls in the subsequent 12 months.
To determine probable dementia, researchers asked whether a doctor had ever told the participants that they had dementia or Alzheimer’s disease. They also used a dementia screening questionnaire and neuropsychological tests of memory, orientation, and executive function.
Of the total sample, most (n = 5,093) did not have dementia.
Physical environmental factors that were assessed included conditions at home, such as clutter, tripping hazards, and structural issues, as well as neighborhood social and economic deprivation – such as income, education levels, and employment status.
Fall rates and counterintuitive findings
Results showed that significantly more of those with dementia than without experienced one or more falls (45.5% vs. 30.9%; P < .001).
In addition, a history of falling was significantly associated with subsequent falls among those with dementia (odds ratio, 6.20; 95% confidence interval, 3.81-10.09), as was vision impairment (OR, 2.22; 95% CI, 1.12-4.40) and living with a spouse versus alone (OR, 2.43; 95% CI, 1.09-5.43).
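To make those odds ratios concrete, the sketch below applies them to a hypothetical baseline risk using the standard odds-to-risk conversion. This is illustrative arithmetic only, not the study's adjusted model, and the 30% baseline is an assumption for demonstration.

```python
# Illustrative epidemiology arithmetic, not the study's actual model:
# applying a reported odds ratio to an assumed baseline risk to see how
# much it shifts the probability of falling.

def risk_with_odds_ratio(baseline_risk, odds_ratio):
    """Convert a baseline risk to the risk implied by an odds ratio."""
    baseline_odds = baseline_risk / (1 - baseline_risk)
    new_odds = baseline_odds * odds_ratio
    return new_odds / (1 + new_odds)

# Odds ratios reported in the article; the 30% baseline is hypothetical.
for label, odds_ratio in [("prior fall", 6.20),
                          ("vision impairment", 2.22),
                          ("living with spouse", 2.43)]:
    risk = risk_with_odds_ratio(0.30, odds_ratio)
    print(f"{label}: OR {odds_ratio} -> risk {risk:.0%} (from 30% baseline)")
```

The conversion shows why a large odds ratio (6.20 for a prior fall) translates into a dramatically higher probability, while the vision and spousal-living odds ratios roughly correspond to a risk in the vicinity of one in two under this assumed baseline.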
A possible explanation for higher fall risk among those living with a partner is that those living alone usually have better functioning, the investigators noted. Also, live-in partners tend to be of a similar age as the person with dementia and may have challenges of their own.
Interestingly, high neighborhood social deprivation was associated with lower odds of falling (OR, 0.55 for the highest deprivation scores; 95% CI, 0.31-0.98), a finding Dr. Okoye said was “counterintuitive.”
This result could be related to the social environment, she noted. “Maybe there are more people around in the house, more people with eyes on the person, or more people in the community who know the person. Despite the low economic resources, there could be social resources there,” she said.
The new findings underscore the idea that falling is a multidimensional phenomenon among older adults with dementia as well as those without dementia, Dr. Okoye noted.
Doctors can play a role in reducing falls among patients with dementia by asking about falls, possibly eliminating medications that are associated with risk of falling, and screening for and correcting vision and hearing impairments, she suggested.
They may also help determine household hazards for a patient, such as clutter and poor lighting, and ensure that these are addressed, Dr. Okoye added.
No surprise
Commenting on the study, David S. Knopman, MD, a clinical neurologist at Mayo Clinic, Rochester, Minn., said the finding that visual impairment and a prior history of falling are predictive of subsequent falls “comes as no surprise.”
Dr. Knopman, whose research focuses on late-life cognitive disorders, was not involved with the current study.
Risk reduction is “of course” a key management goal, he said. “Vigilance and optimizing the patient’s living space to reduce fall risks are the major strategies,” he added.
Dr. Knopman reiterated that falls among those with dementia are associated with higher mortality and often lead to loss of the capacity to live outside of an institution.
The study was supported by the National Institute on Aging. The investigators and Dr. Knopman report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM ALZHEIMER’S AND DEMENTIA
Hearing loss strongly tied to increased dementia risk
Hearing loss is strongly associated with an increased risk of dementia, new national data show. Investigators also found that even mild hearing loss was associated with increased dementia risk, although the association was not statistically significant, and that hearing aid use was tied to a 32% lower dementia prevalence.
“Every 10-decibel increase in hearing loss was associated with 16% greater prevalence of dementia, such that prevalence of dementia in older adults with moderate or greater hearing loss was 61% higher than prevalence in those with normal hearing,” said lead investigator Alison Huang, PhD, senior research associate in epidemiology at Johns Hopkins Bloomberg School of Public Health and core faculty in the Cochlear Center for Hearing and Public Health, Baltimore.
The findings were published online in JAMA.
Dose-dependent effect
For their study, researchers analyzed data on 2,413 community-dwelling participants in the National Health and Aging Trends Study, a nationally representative, continuous panel study of U.S. Medicare beneficiaries aged 65 and older.
Data from the study were collected during in-home interviews, setting it apart from previous work that relied on data collected in a clinical setting, Dr. Huang said.
“This study was able to capture more vulnerable populations, such as the oldest old and older adults with disabilities, typically excluded from prior epidemiologic studies of the hearing loss–dementia association that use clinic-based data collection, which only captures people who have the ability and means to get to clinics,” Dr. Huang said.
Weighted hearing loss prevalence was 36.7% for mild and 29.8% for moderate to severe hearing loss, and weighted prevalence of dementia was 10.3%.
Those with moderate to severe hearing loss were 61% more likely to have dementia than were those with normal hearing (prevalence ratio, 1.61; 95% confidence interval [CI], 1.09-2.38).
Dementia prevalence increased with severity of hearing loss: 6.19% for normal hearing (95% CI, 4.31-8.80), 8.93% for mild hearing loss (95% CI, 6.99-11.34), and 16.52% for moderate to severe hearing loss (95% CI, 13.81-19.64). However, only moderate to severe hearing loss showed a statistically significant association with dementia (P = .02).
Dementia prevalence increased 16% per 10-decibel increase in hearing loss (prevalence ratio, 1.16; P < .001).
Among the 853 individuals in the study with moderate to severe hearing loss, those who used hearing aids (n = 414) had a 32% lower prevalence of dementia compared with those who did not use the devices (prevalence ratio, 0.68; 95% CI, 0.47-1.00). Similar data were published in JAMA Neurology, suggesting that hearing aids reduce dementia risk.
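As a back-of-envelope check, the per-10-decibel figure compounds multiplicatively if one assumes a simple log-linear model (the study's actual regression may differ). The sketch below shows that compounding, and restates the hearing aid prevalence ratio as a percent reduction.

```python
# Rough arithmetic under an assumed multiplicative (log-linear) model:
# a prevalence ratio of 1.16 per 10 dB of hearing loss compounds across
# larger losses, and a prevalence ratio of 0.68 for hearing aid use
# corresponds to a 32% lower prevalence.

PR_PER_10DB = 1.16

def prevalence_ratio(db_worse):
    """Prevalence ratio vs. normal hearing after `db_worse` dB of loss."""
    return PR_PER_10DB ** (db_worse / 10)

for db in (10, 20, 30):
    print(f"{db} dB worse: prevalence ratio = {prevalence_ratio(db):.2f}")

print(f"Hearing aid use (PR 0.68): {1 - 0.68:.0%} lower prevalence")
```

Under this assumed model, 30 dB of additional loss yields a ratio of about 1.56, broadly in line with the 61% higher prevalence reported for moderate or greater hearing loss.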
“With this study, we were able to refine our understanding of the strength of the hearing loss–dementia association in a study more representative of older adults in the United States,” said Dr. Huang.
Robust association
Commenting on the findings, Justin S. Golub, MD, associate professor in the department of otolaryngology–head and neck surgery at Columbia University, New York, said the study supports earlier research and suggests a “robust” association between hearing loss and dementia.
“The particular advantage of this study was that it was high quality and nationally representative,” Dr. Golub said. “It is also among a smaller set of studies that have shown hearing aid use to be associated with lower risk of dementia.”
Although not statistically significant, researchers did find increasing prevalence of dementia among people with only mild hearing loss, and clinicians should take note, said Dr. Golub, who was not involved with this study.
“We would expect the relationship between mild hearing loss and dementia to be weaker than severe hearing loss and dementia and, as a result, it might take more participants to show an association among the mild group,” Dr. Golub said.
“Even though this particular study did not specifically find a relationship between mild hearing loss and dementia, I would still recommend people to start treating their hearing loss when it is early,” Dr. Golub added.
The study was funded by the National Institute on Aging. Dr. Golub reports no relevant financial relationships. Full disclosures for study authors are included in the original article.
A version of this article first appeared on Medscape.com.
Dietary zinc seen reducing migraine risk
Higher dietary zinc intake is associated with reduced odds of migraine, according to results from a cross-sectional study of more than 11,000 American adults.
For their research, published online in Headache, Huanxian Liu, MD, and colleagues at Chinese PLA General Hospital in Beijing analyzed publicly available data from the U.S. National Health and Nutrition Examination Survey to determine whether people self-reporting migraine or severe headache had lower zinc intake, compared with people without migraine. The data used in the analysis were collected between 1999 and 2004 and included information on foods and drinks consumed by participants in a 24-hour period, along with additional health information.
An inverse relationship
The investigators divided their study’s 11,088 participants (mean age, 46.5 years; 50% female) into quintiles based on dietary zinc consumption as inferred from foods eaten. They also considered zinc supplementation, for which data was available for 4,324 participants, of whom 2,607 reported use of supplements containing zinc.
Some 20% of the cohort (n = 2,236) reported migraine or severe headache within the previous 3 months. Pregnant women were excluded from analysis, and the investigators adjusted for a range of covariates, including age, sex, ethnicity, education level, body mass, smoking, diabetes, cardiovascular disease, and nutritional factors.
Dr. Liu and colleagues reported an inverse association between dietary zinc consumption and migraine, with the highest-consuming quintile of the cohort (15.8 mg or more of zinc per day) seeing the lowest risk of migraine (odds ratio, 0.70; 95% confidence interval, 0.52-0.94; P = .029), compared with the lowest-consuming quintile (5.9 mg or less daily). Among people getting high levels of zinc (19.3-32.5 mg daily) through supplements, the odds of migraine were lower still (ORs, 0.62 [95% CI, 0.46-0.83; P = .019] to 0.67 [95% CI, 0.49-0.91; P = .045]).
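For readers less used to odds ratios, the figures above can be restated as percent changes in the odds of migraine. The helper below is a hypothetical convenience function, not anything from the study; the numbers plugged in are the article's reported values for the highest versus lowest dietary zinc quintile.

```python
# Hypothetical helper: restate an odds ratio and its 95% CI as a percent
# change in the odds of migraine. Input values come from the article.

def pct_change_in_odds(odds_ratio):
    """Percent change in odds implied by an odds ratio (negative = lower)."""
    return (odds_ratio - 1.0) * 100

point, lo, hi = 0.70, 0.52, 0.94  # highest vs. lowest dietary zinc quintile
print(f"Change in odds: {pct_change_in_odds(point):.0f}% "
      f"(95% CI, {pct_change_in_odds(lo):.0f}% to {pct_change_in_odds(hi):.0f}%)")
```

So an odds ratio of 0.70 corresponds to roughly 30% lower odds of migraine, with a confidence interval spanning about a 48% to a 6% reduction.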
While the investigators acknowledged limitations of the study, including its cross-sectional design and use of a broad question to discern prevalence of migraine, the findings suggest that “zinc is an important nutrient that influences migraine,” they wrote, citing evidence for its antioxidant and anti-inflammatory properties.
The importance of nutritional factors
Commenting on the research findings, Deborah I. Friedman, MD, MPH, a headache specialist in Dallas, said that Dr. Liu and colleagues’ findings added to a growing information base about nutritional factors and migraine. For example, “low magnesium levels are common in people with migraine, and magnesium supplementation is a recommended preventive treatment for migraine.”
Dr. Friedman cited a recent study showing that vitamin B12 and magnesium supplementation in women, combined with high-intensity interval training, “silenced” the inflammation signaling pathway, helped migraine pain, and decreased levels of calcitonin gene-related peptide. A 2022 randomized trial found that alpha lipoic acid supplementation reduced migraine severity, frequency, and disability in women with episodic migraine.
Vitamin D levels are also lower in people with migraine, compared with controls, Dr. Friedman noted, and a randomized trial of 2,000 IU of vitamin D3 daily saw reduced monthly headache days, attack duration, severe headaches, and analgesic use, compared with placebo. Other nutrients implicated in migraine include coenzyme Q10, calcium, folic acid, vitamin B6, and vitamin B1.
“What should a patient with migraine do with all of this information? First, eat a healthy and balanced diet,” Dr. Friedman said. “Sources of dietary zinc include red meat, nuts, legumes, poultry, shellfish (especially oysters), whole grains, some cereals, and even dark chocolate. The recommended daily dosage of zinc is 9.5 mg in men and 7 mg in women. Most people get enough zinc in their diet; vegetarians, vegans, pregnant or breastfeeding women, and adults over age 65 may need to take supplemental zinc.”
Dr. Liu and colleagues’ work was supported by China’s National Natural Science Foundation. The investigators reported no financial conflicts of interest. Dr. Friedman has received financial support from Alder, Allergan, Amgen, Biohaven, Eli Lilly, Merck, Teva, and other pharmaceutical manufacturers.
, according to results from a cross-sectional study of more than 11,000 American adults.
For their research, published online in Headache, Huanxian Liu, MD, and colleagues at Chinese PLA General Hospital in Beijing, analyzed publicly available data from the U.S. National Health and Nutrition Examination Survey to determine whether people self-reporting migraine or severe headache saw lower zinc intake, compared with people without migraine. The data used in the analysis was collected between 1999 and 2004, and contained information on foods and drinks consumed by participants in a 24-hour period, along with additional health information.
An inverse relationship
The investigators divided their study’s 11,088 participants (mean age, 46.5 years; 50% female) into quintiles based on dietary zinc consumption as inferred from foods eaten. They also considered zinc supplementation, for which data was available for 4,324 participants, of whom 2,607 reported use of supplements containing zinc.
Some 20% of the cohort (n = 2,236) reported migraine or severe headache within the previous 3 months. Pregnant women were excluded from analysis, and the investigators adjusted for a range of covariates, including age, sex, ethnicity, education level, body mass, smoking, diabetes, cardiovascular disease, and nutritional factors.
Dr. Liu and colleagues reported an inverse association between dietary zinc consumption and migraine: the highest-consuming quintile of the cohort (15.8 mg or more of zinc per day) had the lowest odds of migraine (odds ratio, 0.70; 95% confidence interval, 0.52-0.94; P = .029), compared with the lowest-consuming quintile (5.9 mg or less daily). Among people getting high levels of zinc through supplements (19.3-32.5 mg daily), odds of migraine were lower still, ranging from an OR of 0.62 (95% CI, 0.46-0.83; P = .019) to an OR of 0.67 (95% CI, 0.49-0.91; P = .045).
While the investigators acknowledged limitations of the study, including its cross-sectional design and use of a broad question to discern prevalence of migraine, the findings suggest that “zinc is an important nutrient that influences migraine,” they wrote, citing evidence for its antioxidant and anti-inflammatory properties.
The importance of nutritional factors
Commenting on the research findings, Deborah I. Friedman, MD, MPH, a headache specialist in Dallas, said that Dr. Liu and colleagues’ findings added to a growing information base about nutritional factors and migraine. For example, “low magnesium levels are common in people with migraine, and magnesium supplementation is a recommended preventive treatment for migraine.”
Dr. Friedman cited a recent study showing that vitamin B12 and magnesium supplementation in women, combined with high-intensity interval training, "silenced" the inflammation signaling pathway, helped migraine pain, and decreased levels of calcitonin gene-related peptide. A 2022 randomized trial found that alpha lipoic acid supplementation reduced migraine severity, frequency, and disability in women with episodic migraine.
Vitamin D levels are also lower in people with migraine, compared with controls, Dr. Friedman noted, and a randomized trial of 2,000 IU of vitamin D3 daily saw reduced monthly headache days, attack duration, severe headaches, and analgesic use, compared with placebo. Other nutrients implicated in migraine include coenzyme Q10, calcium, folic acid, vitamin B6, and vitamin B1.
“What should a patient with migraine do with all of this information? First, eat a healthy and balanced diet,” Dr. Friedman said. “Sources of dietary zinc include red meat, nuts, legumes, poultry, shellfish (especially oysters), whole grains, some cereals, and even dark chocolate. The recommended daily dosage of zinc is 9.5 mg in men and 7 mg in women. Most people get enough zinc in their diet; vegetarians, vegans, pregnant or breastfeeding women, and adults over age 65 may need to take supplemental zinc.”
Dr. Liu and colleagues’ work was supported by China’s National Natural Science Foundation. The investigators reported no financial conflicts of interest. Dr. Friedman has received financial support from Alder, Allergan, Amgen, Biohaven, Eli Lilly, Merck, Teva, and other pharmaceutical manufacturers.
FROM HEADACHE