Could a Urinary Biomarker Panel Be a ‘Game Changer’ for Lupus Nephritis Management?
WASHINGTON — An investigational 12-protein panel of urinary biomarkers predicted histologically active lupus nephritis (LN) with 86% accuracy, according to research presented at the American College of Rheumatology (ACR) 2024 Annual Meeting.
The noninvasive biomarker panel “robustly predicts meaningful and actionable histological findings” in patients with active proliferative LN, Andrea Fava, MD, assistant professor of medicine in the Division of Rheumatology at Johns Hopkins Medicine in Baltimore, told attendees.
“In contrast to proteinuria, which can’t differentiate inflammation from damage, this panel for histological activity includes a set of 12 proteins linked to intrarenal inflammation,” said Fava, director of Lupus Translational Research at Johns Hopkins. A decline in the biomarker score at 3 months predicted a clinical response at 1 year, and persistent elevation of the score at 1 year predicted permanent loss of kidney function, “which makes it tempting as a treatment endpoint,” Fava said. “Upon further validation, this biomarker panel could aid in the diagnosis of lupus nephritis and guide treatment decisions.”
Alfred Kim, MD, PhD, an associate professor of medicine at Washington University in St. Louis, was not involved in the research but noted the potential value of a reliable biomarker panel.
“If we have urinary biomarkers that strongly associate with histologic activity, this would be a game changer in the management of LN,” Kim told Medscape Medical News. “Right now, the gold standard is to perform another kidney biopsy to determine if therapy is working. But this is invasive, and many patients do not want to do another kidney biopsy. Conversely, the easiest way to assess lupus nephritis activity is through a urinalysis, focusing on urinary protein levels,” but relying on proteinuria has limitations as well.
“The most important [limitation] is that proteinuria cannot distinguish treatable inflammation from chronic damage,” Fava said. Persistent histologic activity in patients without proteinuria predicts flares, but tracking histologic activity, as Kim noted, requires repeat biopsies.
“So we need better biomarkers because biomarkers that can reflect tissue biology in real time can guide personalized treatment, and that’s one of the main goals of the Accelerating Medicines Partnership [AMP],” he said. The AMP is a public-private partnership between the National Institutes of Health (NIH), the US Food and Drug Administration (FDA), multiple biopharmaceutical and life science companies, and nonprofit and other organizations. Lupus is one of the AMP’s funded projects.
Kim agreed that “effective biomarkers are a huge unmet need in LN.” Further, he said, “imagine a world where the diagnosis of LN can be made just through urinary biomarkers and obviate the need for biopsy. Both patients and providers will be ecstatic at this possibility.”
Fava described the background for how his research team determined what biomarkers to test. They had previously enrolled 225 patients with LN undergoing a clinically indicated kidney biopsy and collected urine samples from them at baseline and at 12, 24, and 52 weeks after their biopsy.
Of the 225 patients included, 9% had only mesangial LN (class I-II), 25% had pure membranous LN (class V), 24% had mixed LN (class III or IV with or without V), 38% had proliferative LN (class III or IV), and 4% had advanced sclerosis LN (class VI). From these samples, the researchers quantified 1200 proteins and examined how they correlated with histologic activity.
“What was interesting was that in patients who were classified as responders after 1 year, there were many of these proteins that declined as early as 3 months, suggesting that effective immunosuppression is reducing intrarenal inflammation, and we can capture it in real time,” Fava said.
Biomarker Panel Predicts Histologically Active LN
So they set out to determine whether they could develop a urinary biomarker for histologically active LN that could be useful in clinical decision-making. They focused on one that could detect active proliferative LN with an NIH activity index score > 2. Of their 179 participants, 47.5% were Black, 27.9% White, and 14.5% Asian, and 10.1% were of other races. The predominantly female (86.6%) cohort had an average age of 37 years. Among the LN classes, about one third (34.6%) had pure proliferative disease, 17.9% had mixed proliferative, 27.9% had pure membranous, 11.7% had class I or II, and 5% had class VI. Just over half the participants (55.7%) had not responded to treatment at 12 months, whereas 25% had a complete response, and 19.3% had a partial response.
However, both the 78 participants with an NIH activity index score > 2 and the 101 with a score ≤ 2 had a median score of 3 on the NIH chronicity index. And the urine protein-to-creatinine ratio — 2.8 in the group with an NIH activity index score > 2 and 2.4 in the other group — was nearly indistinguishable between the two groups, Fava said.
They then trained multiple algorithms on 80% of the data to find the best-performing set of proteins for predicting an NIH activity index score > 2. They reduced the number of proteins to balance the practicality and performance of the panel, Fava said, and ultimately identified a 12-protein panel that was highly predictive of an NIH activity index score > 2. The training set yielded an area under the curve (AUC) of 90%, and the panel was then validated on the held-out 20% of the data with an AUC of 93%.
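An AUC can be read as the probability that a randomly chosen biopsy with an activity index score > 2 receives a higher panel score than a randomly chosen biopsy with a score ≤ 2. A minimal pure-Python sketch of that rank-based calculation, using made-up panel scores rather than study data:

```python
def auc(pos_scores, neg_scores):
    """Rank-based AUC: the fraction of (positive, negative) pairs in
    which the positive case scores higher (ties count as half)."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos_scores
        for n in neg_scores
    )
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical panel scores for illustration only:
active = [0.91, 0.84, 0.72, 0.66]    # biopsies with activity index > 2
inactive = [0.70, 0.45, 0.38, 0.21]  # biopsies with activity index <= 2
print(auc(active, inactive))  # prints 0.9375
```

An AUC of 1.0 would mean the panel score separates the two groups perfectly; 0.5 would mean it carries no information.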
The 12-protein panel score outperformed anti-dsDNA, C3 complement, and proteinuria, with a sensitivity of 81%, a specificity of 90%, a positive predictive value of 87%, a negative predictive value of 86%, and an accuracy of 86%. The proteins with the greatest relative importance were CD163, cathepsin S, FOLR2, and CEACAM-1.
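The reported sensitivity, specificity, predictive values, and accuracy all derive from a single confusion matrix. The counts below are an assumption: they are back-calculated to be roughly consistent with the published percentages and the cohort split (78 biopsies with an activity index score > 2, 101 without), not figures from the study itself.

```python
# Illustrative confusion-matrix counts chosen to approximate the
# reported figures; these exact counts are an assumption, not
# published study data.
tp, fn = 63, 15   # of the 78 histologically active biopsies
tn, fp = 91, 10   # of the 101 inactive biopsies

sensitivity = tp / (tp + fn)                 # ~0.81
specificity = tn / (tn + fp)                 # ~0.90
ppv = tp / (tp + fp)                         # positive predictive value, ~0.86
npv = tn / (tn + fn)                         # negative predictive value, ~0.86
accuracy = (tp + tn) / (tp + fn + tn + fp)   # ~0.86

for name, value in [("sensitivity", sensitivity), ("specificity", specificity),
                    ("PPV", ppv), ("NPV", npv), ("accuracy", accuracy)]:
    print(f"{name}: {value:.0%}")
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on how common histologically active disease is in the cohort, so they would shift in a population with a different prevalence.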
“In contrast to proteinuria, these proteins were related to inflammatory processes found in the kidneys in patients with lupus nephritis, such as activation of macrophages, neutrophils and monocytes, lymphocytes, and complement,” Fava said.
When they looked at the trajectories of the probabilities from the biomarker panel at 3, 6, and 12 months, the probability of the NIH activity index score remaining > 2 stayed high in the nonresponders over 1 year, but the trajectory declined at 3 months in the responders, indicating a decrease in kidney inflammation (P < .001).
Can the Biomarker Panel Serve as a Treatment Endpoint?
Then, to determine whether the panel could act as a reliable treatment endpoint, the researchers followed the patients for up to 7 years. One third of the patients lost more than 40% of their kidney function during the follow-up. They found that a high urinary biomarker score at 12 months predicted future glomerular filtration rate loss, independent of proteinuria.
This panel was tested specifically for proliferative LN, so “we may need distinct panels for each [LN type] to capture most of these patients,” Kim said. “I think that’s where the gold mine is: A personalized medicine approach where a large biomarker panel identifies which smaller panel that patient best fits, then use that for monitoring.”
Kim did note an important potential limitation in the study regarding how samples are used in biomarker discovery and validation vs in clinical practice. “Most samples in research studies are frozen, then thawed, while urine is assayed within a couple hours after collection in the clinical setting,” he said. “Do sample processing differences create a situation where a biomarker works in a research project but not in the clinical setting?” But more likely, he said, the opposite may be the case, where frozen samples allow for more degradation of proteins and potentially useful LN biomarker candidates are never detected.
Another challenge, Kim added, albeit unrelated to the study findings, is that diagnostic companies are finding it difficult to get payers to cover new tests, an obstacle this panel could face if it undergoes further validation and then FDA qualification.
The research was funded by Exagen. Fava reported disclosures with Arctiva, AstraZeneca, Exagen, Novartis, UCB, Bristol Myers Squibb, Annexon Bio, and Bain Capital. His coauthors reported financial relationships with numerous pharmaceutical and life science companies, including Exagen, and some are employees of Exagen.
Kim reported research agreements with AstraZeneca, Bristol Myers Squibb, Novartis, and CRISPR Therapeutics; receiving royalties from Kypha; and receiving consulting/speaking fees from AbbVie, Amgen, Atara Bio, Aurinia, Cargo Tx, Exagen, GlaxoSmithKline, Hinge Bio, Kypha, and UpToDate.
A version of this article appeared on Medscape.com.
FROM ACR 2024
Digestive Disease Mortality Higher for US Indigenous Communities
Digestive disease mortality in the United States is disproportionately high among American Indian and Alaska Native communities, which experience the highest death rates and ongoing increases, according to a recent study.
Policymakers, healthcare providers, and communities need to respond with targeted interventions and collaborative efforts that address these inequities and advance digestive health equity, lead author Wafa A. Aldhaleei, MD, of Mayo Clinic, Rochester, Minnesota, and colleagues reported.
“Several studies have reported the epidemiological characteristics of certain digestive diseases such as pancreatitis, liver and biliary diseases, and inflammatory bowel disease,” the investigators wrote in Clinical Gastroenterology and Hepatology. “These studies provide insights into the US burden by sex and racial and ethnic disparities of various digestive diseases individually. However, little is known about racial disparities in the United States digestive diseases mortality burden.”
As part of the Global Burden of Disease Study, the investigators analyzed data from the Institute for Health Metrics and Evaluation's Global Health Data Exchange, including age-standardized digestive disease mortality rates for five racial and ethnic groups (Black, White, American Indian and Alaska Native, Asian and Pacific Islander, and Latino) between 2000 and 2019, with further subgroups based on sex, state, and county. Joinpoint regression analysis was employed to determine overall temporal trends by demography.
Results showed striking mortality rate differences across racial and ethnic groups. In 2019, digestive disease mortality rates were highest among American Indian and Alaska Native individuals, reaching 86.2 per 100,000 — over twice the rate seen in White (35.5 per 100,000), Black (33.6 per 100,000), and Latino (33.6 per 100,000) populations, and more than five times higher than in Asian and Pacific Islander individuals (15.6 per 100,000). Over the study period, American Indian and Alaska Native individuals experienced a significant 0.87% average annual increase in mortality rates, while White individuals saw a smaller increase of 0.12% annually. In contrast, Latino, Black, and Asian and Pacific Islander individuals had declining average annual rates.
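The average annual increases reported here describe a rate compounding year over year. As a sketch of that arithmetic (the 2000 baseline below is back-calculated from the published 2019 rate and trend as an illustration, not a figure from the study):

```python
def annual_percent_change(rate_start, rate_end, years):
    """Constant (compounded) annual percent change between two rates."""
    return ((rate_end / rate_start) ** (1 / years) - 1) * 100

# The 2019 AI/AN digestive disease mortality rate was 86.2 per 100,000.
# A hypothetical 2000 baseline of ~73.1 per 100,000 is consistent with
# the reported 0.87% average annual increase over the 19-year span:
print(round(annual_percent_change(73.1, 86.2, 19), 2))  # prints 0.87
```

Joinpoint regression, used in the study, generalizes this by fitting several such log-linear segments and estimating where the trend changes slope.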
Geographic disparities in digestive disease mortality were significant, with West Virginia recording the highest state-level rate in 2019 at 44.8 deaths per 100,000, well above the national rate of 34.5 per 100,000. Certain regions with high concentrations of American Indian and Alaska Native populations, such as the Southwest Tribes service area (including Arizona and New Mexico) and the Plains Indians service area (spanning Montana, North Dakota, and South Dakota), reported mortality rates exceeding 70 per 100,000, more than double the national average. In Alaska, the American Indian and Alaska Native population's mortality rate surged, with annual increases of up to 3.53% during some periods.
Analyses also revealed some notable sex-based trends. Among American Indian and Alaska Native individuals, males experienced a mortality rate increase of 0.87% annually, reaching 93.5 per 100,000 by 2019, while females saw an even sharper rise at 1.11% per year, with a mortality rate of 79.6 per 100,000 in 2019. For White individuals, the average annual percentage increase was 0.12% for males, bringing their rate to 40.2 per 100,000, and 0.30% for females, with a rate of 31.0 per 100,000 in 2019.
“Our study reveals persistent racial, ethnic, and geographic disparities in digestive diseases mortality in the United States,” the investigators concluded. “Targeted interventions and further research are needed to address these disparities and promote digestive health equity. Collaboration among researchers, policymakers, healthcare providers, and communities is essential to achieve this goal.” This research was conducted as part of the Global Burden of Disease, Injuries, and Risk Factors Study, coordinated by the Institute for Health Metrics and Evaluation. The investigators disclosed no conflicts of interest.
which experience the highest death rates and ongoing increases, according to a recent study.
Policymakers, healthcare providers, and communities need to respond with targeted interventions and collaborative efforts that address these inequities and advance digestive health equity, lead author Wafa A. Aldhaleei, MD, of Mayo Clinic, Rochester, Minnesota, and colleagues reported.
“Several studies have reported the epidemiological characteristics of certain digestive diseases such as pancreatitis, liver and biliary diseases, and inflammatory bowel disease,” the investigators wrote in Clinical Gastroenterology and Hepatology. “These studies provide insights into the US burden by sex and racial and ethnic disparities of various digestive diseases individually. However, little is known about racial disparities in the United States digestive diseases mortality burden.”
As part of the Global Burden of Disease Study, the investigators analyzed data from the Institute of Health Metrics and Evaluation Global Health Data Exchange, including age-standardized digestive disease mortality rates for five racial and ethnic groups (Black, White, American Indian and Alaska Native, Asian and Pacific Islander, and Latino) between 2000 and 2019, with further subgroups based on sex, state, and county. Joinpoint regression analysis was employed to determine overall temporal trends by demography.
Results showed striking mortality rate differences across racial and ethnic groups. In 2019, digestive disease mortality rates were highest among American Indian and Alaska Native individuals, reaching 86.2 per 100,000 — over twice the rate seen in White (35.5 per 100,000), Black (33.6 per 100,000), and Latino (33.6 per 100,000) populations, and more than five times higher than in Asian and Pacific Islander individuals (15.6 per 100,000). Over the study period, American Indian and Alaska Native individuals experienced a significant 0.87% average annual increase in mortality rates, while White individuals saw a smaller increase of 0.12% annually. In contrast, Latino, Black, and Asian and Pacific Islander individuals had declining average annual rates.
Geographic disparities in digestive disease mortality were significant, with West Virginia recording the highest state-level rate in 2019 at 44.8 deaths per 100,000, well above the national rate of 34.5 per 100,000. Certain regions with high concentrations of American Indian and Alaska Native populations, such as the Southwest Tribes service area (including Arizona and New Mexico) and the Plains Indians service area (spanning Montana, North Dakota, and South Dakota), reported mortality rates exceeding 70 per 100,000, more than double the national average. In Alaska, the American Indian and Alaska Native population’s mortality rate surged with annual increases of up to 3.53% during some periods.
Analyses also revealed some notable sex-based trends. Among American Indian and Alaska Native individuals, males experienced a mortality rate increase of 0.87% annually, reaching 93.5 per 100,000 by 2019, while females saw an even sharper rise at 1.11% per year, with a mortality rate of 79.6 per 100,000 in 2019. For White individuals, the average annual percentage increase was 0.12% for males, bringing their rate to 40.2 per 100,000, and 0.30% for females, with a rate of 31.0 per 100,000 in 2019.
“Our study reveals persistent racial, ethnic, and geographic disparities in digestive diseases mortality in the United States,” the investigators concluded. “Targeted interventions and further research are needed to address these disparities and promote digestive health equity. Collaboration among researchers, policymakers, healthcare providers, and communities is essential to achieve this goal.”
This research was conducted as part of the Global Burden of Disease, Injuries, and Risk Factors Study, coordinated by the Institute for Health Metrics and Evaluation. The investigators disclosed no conflicts of interest.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Why Do People Struggle to Prioritize Their Long-Term Health?
Understanding how people make health-related decisions requires a deeper exploration of their motivations, beliefs, and circumstances, Christopher Dye, DPhil, professor of epidemiology at the University of Oxford in England, and former director of strategy at the World Health Organization, said in an interview. “In public health, we tend to prescribe solutions. But unless we understand how people really make choices about health and why they are less interested in prevention and happier to wait until they become ill, then we are not in the position to shift away from curative treatments to preventive treatments.”
Despite the well-documented benefits of preventive measures, many people fail to engage in proactive health behaviors. This can be attributed to psychological biases and socioeconomic factors that shape how people prioritize their health.
“The choices people make have some to do with facts, but they also have much to do with values and perception. We need to understand and take these perceptions and values seriously,” Dye said.
The Paradox of Prevention
People often recognize prevention as the right course of action but fail to act. “We know it’s the right thing to do, but we don’t do it,” Dye said.
He explained that, when considering potential future threats, we assess two key factors: the severity of the danger and the cost of addressing it. Action is more likely when the danger is significant and the cost of mitigation is low.
This dynamic can be broken down into three critical questions:
What is the nature of the hazard? Is the threat severe, like Ebola, which has a case fatality rate of around 50% in untreated cases, or relatively milder, like COVID-19, with a fatality rate of less than 1% but a much broader spread? The nastier the hazard, the more likely we are to take it seriously.
How likely is it to happen? Even a severe threat will not prompt much concern if its likelihood is perceived as low. Our willingness to act depends heavily on how probable people think the hazard is.
When is it likely to happen? A threat looming in the immediate future is more compelling than one projected weeks, months, or years away. This is because people tend to heavily discount the value of future risks.
When these factors — severity, likelihood, and immediacy — combine with low mitigation costs, the incentives for action align.
However, cost is not limited to financial expense. It encompasses effort, willpower, access to information, and personal inclination. Similarly, the perception of threat is shaped not just by hard data and epidemiology but also by subjective values and cultural interpretations.
“We place a high value on now rather than later,” Theresa Marteau, PhD, a psychologist and behavioral scientist and director of the Behaviour and Health Research Unit at the University of Cambridge in England, said in an interview. “Treatment is about fixing a problem that we have now, rather than trying to avoid a problem sometime in the future. We also place a high value on certainty: I’m ill today, and I want to avoid that, as opposed to putting resources on a possible disease that might or might not occur.”
Investing in the Future: A Privilege of Stability
People often undervalue future health risks because of temporal discounting, a cognitive bias where immediate rewards are prioritized over long-term benefits. This tendency makes it challenging to address health issues that may only manifest years later.
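The effect of temporal discounting can be made concrete with a small sketch. The discount rate and benefit values below are hypothetical, chosen only to illustrate the bias described above, not drawn from any study cited here.

```python
# A minimal sketch of exponential temporal discounting: the present
# value of a future health benefit shrinks with delay. The 10% rate
# and the benefit of 100 are hypothetical, purely for illustration.
def present_value(benefit: float, annual_rate: float, years_away: float) -> float:
    return benefit / (1 + annual_rate) ** years_away

# A benefit "worth" 100 if enjoyed today, but delivered 10 years out,
# feels like far less to a decision-maker discounting at 10% per year:
print(round(present_value(100, 0.10, 10), 1))
```

Under these assumptions, a distant health gain is subjectively worth only a fraction of its face value, which is why immediate rewards compete so effectively with preventive behavior.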
From a public health perspective, this creates challenges. Warning individuals that harmful behaviors, such as smoking, may lead to severe health problems in a decade often falls on deaf ears. People naturally focus on immediate concerns, particularly when grappling with present challenges. For those living in poverty or social instability, the urgency of daily survival frequently outweighs the perceived benefits of preventive health measures.
“A cigarette during the day is just one brief source of pleasure, a short-term escape from all the other stuff happening in their lives, and there’s more of that stuff happening to poorer people than there is to richer people,” Dye said.
He said that long-term thinking comes more naturally to those with stability and resources. People who are financially secure, have stable jobs, supportive families, and comfortable homes are better equipped to invest for the future and prioritize their health.
“People value their health regardless of their social and economic circumstances,” said Marteau. “But they might not have the resources to engage in behavior-changing activities.”
Bringing the Future to the Present
Effective interventions often involve a combination of “sticks” (deterrents) and “carrots” (rewards), Dye explained. Both approaches aim to bridge the gap between immediate actions and future benefits by making preventive behaviors more appealing in the short term. “We need to bring the future into the present,” he added.
Raising the cost of unhealthy behaviors has proven effective. For example, increasing the price of cigarettes leads to significant reductions in smoking rates. When smoking becomes less affordable, individuals are more likely to quit. Dye said that this approach works only to a certain extent: at some point, the number of people quitting plateaus, and those from low socioeconomic backgrounds are the most likely to continue smoking.
Offering immediate rewards for preventive behaviors provides a powerful incentive. Tangible benefits tied to activities such as attending regular health checkups, receiving vaccinations, or joining fitness programs can motivate individuals to engage in health-preserving activities. “The key is ensuring these benefits are timely and meaningful, as delayed rewards are less effective in overcoming the natural bias toward the present,” said Dye.
Healthcare providers are best placed to help people engage in preventive behavior by referring patients to the right services, such as programs to stop smoking, weight loss programs and medications, or mental health providers, Marteau said. “It’s not telling people to stop smoking or change their diet. It’s about signposting them to effective services that will help them change their behavior.”
Dye and Marteau reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Common Herbicide a Player in Neurodegeneration?
new research showed.
Researchers found that glyphosate exposure even at regulated levels was associated with increased neuroinflammation and accelerated Alzheimer’s disease–like pathology in mice — an effect that persisted 6 months after a recovery period when exposure was stopped.
“More research is needed to understand the consequences of glyphosate exposure to the brain in humans and to understand the appropriate dose of exposure to limit detrimental outcomes,” said co–senior author Ramon Velazquez, PhD, with Arizona State University, Tempe.
The study was published online in The Journal of Neuroinflammation.
Persistent Accumulation Within the Brain
Glyphosate is the most heavily applied herbicide in the United States, with roughly 300 million pounds used annually in agricultural communities throughout the United States. It is also used for weed control in parks, residential areas, and personal gardens.
The Environmental Protection Agency (EPA) has determined that glyphosate poses no risks to human health when used as directed. But the World Health Organization’s International Agency for Research on Cancer disagrees, classifying the herbicide as “possibly carcinogenic to humans.”
In addition to the possible cancer risk, multiple reports have also suggested potential harmful effects of glyphosate exposure on the brain.
In earlier work, Velazquez and colleagues showed that glyphosate crosses the blood-brain barrier and infiltrates the brains of mice, contributing to neuroinflammation and other detrimental effects on brain function.
In their latest study, they examined the long-term effects of glyphosate exposure on neuroinflammation and Alzheimer’s disease–like pathology using a mouse model.
They dosed 4.5-month-old mice genetically predisposed to Alzheimer’s disease and nontransgenic control mice with 0, 50, or 500 mg/kg of glyphosate daily for 13 weeks, followed by a 6-month recovery period.
The high dose is similar to levels used in earlier research, and the low dose is close to the limit used to establish the current EPA acceptable dose in humans.
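For a rough sense of how these mouse doses map to human scale, the standard FDA body-surface-area conversion can be applied. This calculation is not from the study; the Km factors (mouse = 3, human = 37) are the conventional values from FDA dose-translation guidance, used here only as an illustration.

```python
# Hedged illustration: converting the study's mouse doses to a rough
# human-equivalent dose (HED) with standard FDA body-surface-area
# scaling factors. Not part of the study's own analysis.
KM = {"mouse": 3, "rat": 6, "human": 37}  # conventional Km factors

def human_equivalent_dose(animal_dose_mg_kg: float, species: str = "mouse") -> float:
    """HED = animal dose x (Km_animal / Km_human)."""
    return animal_dose_mg_kg * KM[species] / KM["human"]

print(round(human_equivalent_dose(50), 1))   # low dose used in the study
print(round(human_equivalent_dose(500), 1))  # high dose used in the study
```

Body-surface-area scaling typically shrinks a mouse dose by roughly an order of magnitude when expressed per kilogram of human body weight.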
Glyphosate’s metabolite, aminomethylphosphonic acid, was detectable and persisted in mouse brain tissue even 6 months after exposure ceased, the researchers reported.
Additionally, there was a significant increase in soluble and insoluble fractions of amyloid-beta (Abeta), Abeta42 plaque load and plaque size, and phosphorylated tau at Threonine 181 and Serine 396 in hippocampus and cortex brain tissue from glyphosate-exposed mice, “highlighting an exacerbation of hallmark Alzheimer’s disease–like proteinopathies,” they noted.
Glyphosate exposure was also associated with significant elevations in both pro- and anti-inflammatory cytokines and chemokines in brain tissue of transgenic and normal mice and in peripheral blood plasma of transgenic mice.
Glyphosate-exposed transgenic mice also showed heightened anxiety-like behaviors and reduced survival.
“These findings highlight that many chemicals we regularly encounter, previously considered safe, may pose potential health risks,” co–senior author Patrick Pirrotte, PhD, with the Translational Genomics Research Institute, Phoenix, Arizona, said in a statement.
“However, further research is needed to fully assess the public health impact and identify safer alternatives,” Pirrotte added.
Funding for the study was provided by the National Institute on Aging, the National Cancer Institute, and the Arizona State University (ASU) Biodesign Institute. The authors declared no relevant conflicts of interest.
A version of this article first appeared on Medscape.com.
new research showed.
Researchers found that glyphosate exposure even at regulated levels was associated with increased neuroinflammation and accelerated Alzheimer’s disease–like pathology in mice — an effect that persisted 6 months after a recovery period when exposure was stopped.
“More research is needed to understand the consequences of glyphosate exposure to the brain in humans and to understand the appropriate dose of exposure to limit detrimental outcomes,” said co–senior author Ramon Velazquez, PhD, with Arizona State University, Tempe.
The study was published online in The Journal of Neuroinflammation.
Persistent Accumulation Within the Brain
Glyphosate is the most heavily applied herbicide in the United States, with roughly 300 million pounds used annually in agricultural communities throughout the United States. It is also used for weed control in parks, residential areas, and personal gardens.
The Environmental Protection Agency (EPA) has determined that glyphosate poses no risks to human health when used as directed. But the World Health Organization’s International Agency for Research on Cancer disagrees, classifying the herbicide as “possibly carcinogenic to humans.”
In addition to the possible cancer risk, multiple reports have also suggested potential harmful effects of glyphosate exposure on the brain.
In earlier work, Velazquez and colleagues showed that glyphosate crosses the blood-brain barrier and infiltrates the brains of mice, contributing to neuroinflammation and other detrimental effects on brain function.
Glyphosate exposure, even at regulated levels, was associated with increased neuroinflammation and accelerated Alzheimer’s disease–like pathology in mice, an effect that persisted 6 months after exposure was stopped, new research showed.
“More research is needed to understand the consequences of glyphosate exposure to the brain in humans and to understand the appropriate dose of exposure to limit detrimental outcomes,” said co–senior author Ramon Velazquez, PhD, with Arizona State University, Tempe.
The study was published online in The Journal of Neuroinflammation.
Persistent Accumulation Within the Brain
Glyphosate is the most heavily applied herbicide in the United States, with roughly 300 million pounds used annually, largely in agricultural communities. It is also used for weed control in parks, residential areas, and personal gardens.
The Environmental Protection Agency (EPA) has determined that glyphosate poses no risks to human health when used as directed. But the World Health Organization’s International Agency for Research on Cancer disagrees, classifying the herbicide as “possibly carcinogenic to humans.”
In addition to the possible cancer risk, multiple reports have also suggested potential harmful effects of glyphosate exposure on the brain.
In earlier work, Velazquez and colleagues showed that glyphosate crosses the blood-brain barrier and infiltrates the brains of mice, contributing to neuroinflammation and other detrimental effects on brain function.
In their latest study, they examined the long-term effects of glyphosate exposure on neuroinflammation and Alzheimer’s disease–like pathology using a mouse model.
They dosed 4.5-month-old mice genetically predisposed to Alzheimer’s disease and non-transgenic control mice with either 0, 50, or 500 mg/kg of glyphosate daily for 13 weeks followed by a 6-month recovery period.
The high dose is similar to levels used in earlier research, and the low dose is close to the limit used to establish the current EPA acceptable dose in humans.
Glyphosate’s metabolite, aminomethylphosphonic acid, was detectable and persisted in mouse brain tissue even 6 months after exposure ceased, the researchers reported.
Additionally, there was a significant increase in soluble and insoluble fractions of amyloid-beta (Abeta), Abeta42 plaque load and plaque size, and phosphorylated tau at Threonine 181 and Serine 396 in hippocampus and cortex brain tissue from glyphosate-exposed mice, “highlighting an exacerbation of hallmark Alzheimer’s disease–like proteinopathies,” they noted.
Glyphosate exposure was also associated with significant elevations in both pro- and anti-inflammatory cytokines and chemokines in brain tissue of transgenic and normal mice and in peripheral blood plasma of transgenic mice.
Glyphosate-exposed transgenic mice also showed heightened anxiety-like behaviors and reduced survival.
“These findings highlight that many chemicals we regularly encounter, previously considered safe, may pose potential health risks,” co–senior author Patrick Pirrotte, PhD, with the Translational Genomics Research Institute, Phoenix, Arizona, said in a statement.
“However, further research is needed to fully assess the public health impact and identify safer alternatives,” Pirrotte added.
Funding for the study was provided by the National Institute on Aging, the National Cancer Institute, and the Arizona State University (ASU) Biodesign Institute. The authors have declared no relevant conflicts of interest.
A version of this article first appeared on Medscape.com.
FROM THE JOURNAL OF NEUROINFLAMMATION
Real-World Data Question Low-Dose Steroid Use in ANCA Vasculitis
TOPLINE:
Compared with a standard dosing regimen, a reduced-dose glucocorticoid regimen is associated with an increased risk for disease progression, relapse, death, or kidney failure in antineutrophil cytoplasmic antibody (ANCA)–associated vasculitis, particularly affecting patients receiving rituximab or those with elevated creatinine levels.
METHODOLOGY:
- The PEXIVAS trial demonstrated that a reduced-dose glucocorticoid regimen was noninferior to standard dosing in terms of death or end-stage kidney disease in ANCA-associated vasculitis. However, the trial did not include disease progression or relapse as a primary endpoint, and cyclophosphamide was the primary induction therapy.
- Researchers conducted this retrospective study across 19 hospitals (18 in France and one in Luxembourg) between January 2018 and November 2022 to compare the effectiveness of a reduced-dose glucocorticoid regimen, as used in the PEXIVAS trial, with a standard-dose regimen in patients with ANCA-associated vasculitis in the real-world setting.
- They included 234 patients aged > 15 years (51% men) with severe granulomatosis with polyangiitis (n = 141) or microscopic polyangiitis (n = 93) who received induction therapy with rituximab or cyclophosphamide; 126 and 108 patients received reduced-dose and standard-dose glucocorticoid regimens, respectively.
- Most patients (70%) had severe renal involvement.
- The primary composite outcome encompassed minor relapse, major relapse, disease progression before remission, end-stage kidney disease requiring dialysis for > 12 weeks or transplantation, and death within 12 months post-induction.
TAKEAWAY:
- The primary composite outcome occurred in a higher proportion of patients receiving reduced-dose glucocorticoid therapy than in those receiving standard-dose therapy (33.3% vs 18.5%; hazard ratio [HR], 2.20; 95% CI, 1.23-3.94).
- However, no significant association was found between reduced-dose glucocorticoids and the risk for death or end-stage kidney disease or the occurrence of serious infections.
- Among patients receiving reduced-dose glucocorticoids, serum creatinine levels > 300 μmol/L were associated with an increased risk for the primary composite outcome (adjusted HR, 3.02; 95% CI, 1.28-7.11).
- In the rituximab induction subgroup, reduced-dose glucocorticoid was associated with an increased risk for the primary composite outcome (adjusted HR, 2.36; 95% CI, 1.18-4.71), compared with standard-dose glucocorticoids.
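As a quick sanity check on the headline proportions, the absolute risk difference and the implied number needed to harm can be computed directly. This is illustrative arithmetic on the reported figures only, not part of the study's own analysis:

```python
# Illustrative arithmetic on the reported primary-outcome proportions
# (33.3% vs 18.5%); not drawn from the study's statistical analysis.
reduced_dose = 0.333   # primary outcome proportion, reduced-dose group
standard_dose = 0.185  # primary outcome proportion, standard-dose group

# Absolute risk difference: extra events per patient on the reduced-dose regimen
ard = reduced_dose - standard_dose

# Number needed to harm: patients treated per one additional event
nnh = 1 / ard

print(f"ARD = {ard:.3f}, NNH ~ {nnh:.0f}")  # ARD = 0.148, NNH ~ 7
```

In other words, roughly one additional primary-outcome event for every 7 patients managed with the reduced-dose regimen, taking the unadjusted proportions at face value.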
IN PRACTICE:
“Our data suggest increased vigilance when using the [reduced-dose glucocorticoid] regimen, especially in the two subgroups of patients at higher risk of failure, that is, those receiving [rituximab] as induction therapy and those with a baseline serum creatinine greater than 300 μmol/L,” the authors wrote.
SOURCE:
The study was led by Sophie Nagle, MD, National Referral Centre for Rare Autoimmune and Systemic Diseases, Department of Internal Medicine, Hôpital Cochin, Paris, France. It was published online on November 20, 2024, in Annals of the Rheumatic Diseases.
LIMITATIONS:
The retrospective nature of this study may have introduced inherent limitations and potential selection bias. The study lacked data on patient comorbidities, which could have influenced treatment choice and outcomes. Additionally, about a quarter of patients did not receive methylprednisolone pulses prior to oral glucocorticoids, unlike the PEXIVAS trial protocol. The group receiving standard-dose glucocorticoids showed heterogeneity in glucocorticoid regimens, and the minimum follow-up was only 6 months.
DISCLOSURES:
This study did not report any source of funding. The authors reported no relevant conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.
With Chemo, Blinatumomab Boosts DFS in Pediatric B-ALL
Among pediatric patients with B-ALL followed for a mean of 2.5 years (1.6-3.2 years), the 718 patients in the blinatumomab-plus-chemotherapy group had a 3-year disease-free survival (DFS) rate of 96.0 ± 1.2%, compared with 87.9 ± 2.1% among the 722 patients in the chemotherapy-only group, researchers reported at the American Society of Hematology (ASH) 2024 Annual Meeting.
“Our results demonstrate that blinatumomab added to chemotherapy represents a new treatment standard for most patients with NCI [National Cancer Institute] standard-risk [B-ALL],” said first author Rachel E. Rau, MD, Seattle Children’s Hospital, University of Washington, during a news briefing.
As Cynthia E. Dunbar, MD, chief of the Translational Stem Cell Biology Branch at the National Heart, Lung, and Blood Institute, noted in a news briefing: “B-cell ALL is the most common childhood cancer and one of the most treatable. However, some children still relapse following standard chemotherapy treatments and then have a much grimmer outcome.”
The AALL1731 study was initiated in 2019 with a recruitment goal of 2245 participants. Patients were aged over 1 year and under 10 years, had an initial white blood cell count < 50,000/μL, and were considered standard risk–high or standard risk–average.
The control group received standard-intensity chemotherapy (standard risk–average patients) or augmented Berlin-Frankfurt-Münster–based chemotherapy (standard risk–high patients). In addition, the blinatumomab groups received two cycles of the drug.
Randomization was terminated in 2024 at 1440 patients because of the positive results. Patients had a median age of 4.3 years (2.8-6.4), 52.6% were boys, 26% were Hispanic, and 5% were non-Hispanic Black.
The addition of blinatumomab improved DFS by 61% (hazard ratio, 0.39; 95% CI, 0.24-0.64; P < .0001).
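The 61% figure follows directly from the hazard ratio: relative improvement is one minus the HR. A minimal sketch of that arithmetic, illustrative only and not the study's statistical model:

```python
# Relative improvement implied by a hazard ratio (illustrative arithmetic only).
def improvement_pct(hazard_ratio: float) -> int:
    """Percent reduction in event hazard implied by an HR < 1."""
    return round((1 - hazard_ratio) * 100)

# Point estimate and 95% CI bounds reported for DFS
print(improvement_pct(0.39))  # 61
print(improvement_pct(0.24), improvement_pct(0.64))  # 76 36
```

Applying the same transformation to the confidence bounds shows the improvement is consistent with anything from roughly 36% to 76%.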
In the group of standard risk–average patients, 3-year DFS was 97.5 ± 1.3% in the blinatumomab group vs 90.2 ± 2.3% in the control group (HR, 0.33; 95% CI, 0.15-0.69). For standard risk–high patients, 3-year DFS was 94.1 ± 2.5% and 84.8 ± 3.8%, respectively.
Six deaths occurred in remission, all in standard risk–high patients and none during blinatumomab cycles. Among first courses of blinatumomab, 0.3% were associated with grade 3 or higher cytokine release syndrome and 0.7% with seizures.
“We did note higher rates of subsequent sepsis and catheter-related infections in our standard risk–average patients who received blinatumomab,” Rau said.
“The improvement in disease survival was secondary to significant reduction in bone marrow relapse,” Rau added. “We did not see a similar reduction in the more rare event of an isolated central nervous system relapse. This finding was not surprising given blinatumomab’s known limited activity in the central nervous system.”
Rau noted two challenges in terms of access to blinatumomab: its cost, about $225,000 according to a 2023 report, and its administration. The drug is administered via 4-week-long infusions. “The delivery method is very cumbersome,” she said.
“These are big problems that are going to take the combined efforts of pediatric oncology cancer consortia and pharmaceutical industry partners as well as government agencies,” she said. Fortunately, she said, in June 2024 the Food and Drug Administration approved blinatumomab for adult and pediatric patients 1 month and older with CD19-positive Philadelphia chromosome–negative B-ALL in the consolidation phase of multiphase chemotherapy.
“So it’s relatively easy, at least, to prescribe blinatumomab in the United States for our patients that we feel would benefit from it,” she said.
As for method of delivery, Rau said easier-to-deliver formulations are in development.
Rau has disclosed spousal employment (AbbVie), serving on advisory boards (Servier, Jazz), consulting, and receiving honoraria (Jazz). Other study authors report various disclosures including ties with Amgen, the maker of blinatumomab. Dunbar has reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM ASH 2024
High-Fiber Diet Linked to Improved Stem Cell Transplant, GvHD Outcomes
Importantly, the findings suggest that the standard recommendation of a low-fiber diet for patients following allogeneic hematopoietic cell transplantation (allo-HCT) may run counter to the diet’s potential benefits.
“Significant decrease of fiber intake during transplantation is detrimental. It’s a lost opportunity to promote a healthy gut microbiome, recover from treatment-related microbiota injury, and protect against GVHD,” first author Jenny Paredes, PhD, a staff scientist at City of Hope National Medical Center in Duarte, California, said in a press statement for the study presented at the American Society of Hematology (ASH) 2024 Annual Meeting.
Although the health benefits of dietary fiber on the gut microbiome are well documented, the effects have recently been shown to extend to outcomes after allo-HCT in general: researchers have found increased overall survival with greater gut microbiome diversity, including a higher abundance of butyrate producers and a lower abundance of Enterococcus, Paredes explained when presenting the findings.
Acute GvHD, a common and potentially life-threatening complication of allo-HCT, can have symptoms that mimic inflammatory bowel disease (IBD), including abdominal pain or cramps, nausea, vomiting, and diarrhea. The low-fiber diet recommendations, including avoidance of raw vegetables and fruits before and after the allo-HCT procedure, are designed to counter those effects, as well as reduce exposure to bacteria.
However, with data suggesting the potential benefits of dietary fiber could extend to the prevention of GvHD, Paredes and colleagues further investigated.
For the observational study, they evaluated all dietary data on 173 allo-HCT recipients at Memorial Sloan Kettering Cancer Center (MSKCC) from 10 days prior to transplantation to 30 days post-transplantation, representing 3837 patient-days in total.
Data collected from the patients also included rRNA sequencing of fecal samples and fecal short-chain fatty acid concentration.
Participants had a median age of 60, and 45% were female. The most common diseases being treated were leukemia (50%), myelodysplastic syndrome (25%), and non-Hodgkin lymphoma (8.7%).
After stratifying patients based on high- or low-fiber intake, those with high-fiber intake were found to have significantly higher rates of microbial α-diversity (P = .009), a higher abundance of butyrate producers (P = .03), and a higher concentration of butyrate (P = .02), a short-chain fatty acid that plays a key role in gut health.
Furthermore, the high-fiber group had significantly higher overall survival in an analysis extending to 24 months relative to day 12 of the study (P = .04).
Focusing on GvHD outcomes, the authors further evaluated data on 101 non-T-cell–depleted patients, and identified 29 patients without GvHD and 24 who developed lower gastrointestinal (GI) GvHD.
Patients with lower GI GvHD had significantly lower fecal concentrations of butyrate (P = .03) and acetate (P = .02).
However, patients in the high-fiber intake group had a significantly lower cumulative incidence of GvHD at day 100 (P = .034) and a lower incidence of lower GI GvHD (P = .04).
A separate preclinical analysis of a mouse model with GvHD further showed that a fiber-rich diet (12% cellulose) significantly increased the expression of genes associated with reduced GvHD, including IDO1 and CEACAM1, and those associated with enrichment of the bile acid pathway.
The findings suggest an opportunity to improve outcomes with relatively small dietary changes, Paredes said.
“Strategies to increase the fiber concentration in these diets paired with the safety that these patients need is what makes this study exciting,” she said in an interview.
“Increasing the fiber intake by 10 to 20 grams/day could potentially increase the microbiome diversity and abundance of butyrate producers, which have been correlated with higher overall survival rates post allo-HCT,” she continued.
“[For instance], that could be an avocado per day, or it could be a small salad per day, or a small vegetable soup per day,” she added. “I would encourage institutions to re-evaluate their menu planning and see how to include more fiber into the meals in a safe way.”
Ultimately, “I think that a dietary intervention outweighs the risks of a pharmacological intervention,” Paredes added.
The necessary duration of a high-fiber diet to produce the beneficial effects on allo-HCT outcomes would likely be over the course of the pre- and post-transplant periods, Paredes added.
“With the survival analysis extending from 5 days before transplantation to 12 days post, we are looking at an intervention that potentially could be around 20 days,” she said.
“We would love to take advantage of the pretransplantation window, in particular, and we can see that just increasing the fiber intake by about 20 grams during this window was shown to improve overall survival after 24 months,” Paredes added.
Importantly, however, some patients may not be appropriate for high-fiber dietary changes, Paredes cautioned.
“Patients that have developed IBD-like symptoms and severe GvHD patients, for example, or with lower GI-GvHD grades 3 and 4 would be not appropriate candidates for a high-fiber diet,” she said.
High-Fiber Diet Slows MM Disease Progression?
The potential important benefits of a high-fiber diet in blood diseases were further demonstrated in a separate study also by MSKCC researchers presented at the meeting, which showed encouraging signs that a plant-based diet rich in fiber could potentially slow disease progression in multiple myeloma (MM).
NUTRIVENTION included 20 patients with the two precancerous MM conditions, monoclonal gammopathy of undetermined significance (MGUS) and smoldering multiple myeloma (SMM), which can last for years without progressing to MM and which researchers have speculated could be a potential opportunity to intervene to prevent progression to cancer.
Patients were provided with a 12-week controlled diet plus health coaching for another 3 months; no meals or coaching were provided for the rest of the 1-year study period. Participants had a median age of 62 and, with being overweight/obesity a risk factor for MM, had a body mass index (BMI) of 25 kg/m2 or higher.
The trial met its endpoint of feasibility, with 91% adherence in the first 3 months. The rate of consumption of unprocessed plant foods increased from 20% at baseline to 92% on the intervention. Overall adherence was 58%. Insulin and anti-inflammatory markers also improved and, despite no calorie restriction, there was a 7% sustained reduction in BMI.
Notably, two patients in the study had stabilization of disease progression.
“We saw improvements in all spheres, including metabolism, microbiome, and immune system markers, and we also saw that two patients with progressive disease had the progression stabilize and slow down on the intervention,” principal investigator Urvi A. Shah, MD, said in a press statement.
“Even though it’s just two cases, to our knowledge, it has not been shown before in an intervention setting that you can improve diet and lifestyle and actually slow or change the trajectory of the disease,” she noted.
The researchers caution that findings in mice do not necessarily translate to humans but note another experiment in mice with SMM that showed animals fed a normal diet had progression to MM after a median of 12 weeks, compared with a median of 30 weeks among those fed a high-fiber diet.
Notably, all mice in the normal-diet group progressed to MM, whereas 40% of mice in the high-fiber group did not.
“We found that a high-fiber plant-based diet can improve BMI, improve insulin resistance [and] the microbiome through diversity and butyrate producers, and with the production of short-chain fatty acids, can have effects on inflammation, immunity, innate and adaptive antitumor immunity, and tumor cells or plasma cells,” Shah said during her presentation.
The study was supported by funding from the National Cancer Institute and private foundations. Paredes has reported no relevant financial relationships. Shah has reported relationships with Sanofi, Bristol Myers Squibb, and Janssen.
A version of this article first appeared on Medscape.com.
Importantly, the findings suggest that the standard recommendation of a low-fiber diet for patients following allo-HCT may work against these potential benefits.
“Significant decrease of fiber intake during transplantation is detrimental. It’s a lost opportunity to promote a healthy gut microbiome, recover from treatment-related microbiota injury, and protect against GVHD,” first author Jenny Paredes, PhD, a staff scientist at City of Hope National Medical Center in Duarte, California, said in a press statement for the study presented at the American Society of Hematology (ASH) 2024 Annual Meeting.
Although the health benefits of dietary fiber on the gut microbiome are well documented, the effects have recently been shown to extend to outcomes after allo-HCT more broadly: Researchers have found increased overall survival when gut microbiome diversity is higher, including a higher abundance of butyrate producers and a lower abundance of Enterococcus, Paredes explained when presenting the findings.
Acute GvHD, a common and potentially life-threatening complication of allo-HCT, can have symptoms that mimic inflammatory bowel disease (IBD), including abdominal pain or cramps, nausea, vomiting, and diarrhea. Standard low-fiber diet recommendations, including avoidance of raw vegetables and fruits before and after the allo-HCT procedure, are designed to counter those effects as well as to reduce exposure to bacteria.
However, with data suggesting the potential benefits of dietary fiber could extend to the prevention of GvHD, Paredes and colleagues further investigated.
For the observational study, they evaluated all dietary data on 173 allo-HCT recipients at Memorial Sloan Kettering Cancer Center (MSKCC) from 10 days prior to transplantation to 30 days post-transplantation, representing 3837 patient-days in total.
Data collected from the patients also included rRNA sequencing of fecal samples and fecal short-chain fatty acid concentration.
Participants had a median age of 60, and 45% were female. The most common diseases being treated were leukemia (50%), myelodysplastic syndrome (25%), and non-Hodgkin’s lymphoma (8.7%).
After stratifying patients based on high- or low-fiber intake, those with high-fiber intake were found to have significantly higher rates of microbial α-diversity (P = .009), a higher abundance of butyrate producers (P = .03), and a higher concentration of butyrate (P = .02), a short-chain fatty acid that plays a key role in gut health.
Furthermore, the high-fiber group had significantly higher overall survival in an analysis extending from day 12 of the study to 24 months (P = .04).
Focusing on GvHD outcomes, the authors further evaluated data on 101 non–T-cell–depleted patients, identifying 29 patients without GvHD and 24 who developed lower gastrointestinal (GI) GvHD.
Patients with lower GI GvHD had significantly lower fecal concentrations of butyrate (P = .03) and acetate (P = .02).
In addition, patients in the high-fiber intake group had a significantly lower cumulative incidence of GvHD at day 100 (P = .034) and a lower incidence of lower GI GvHD (P = .04).
A separate preclinical analysis of a mouse model with GvHD further showed that a fiber-rich diet (12% cellulose) significantly increased the expression of genes associated with reduced GvHD, including IDO1 and CEACAM1, and those associated with enrichment of the bile acid pathway.
The findings suggest an opportunity to improve outcomes with relatively small dietary changes, Paredes said.
“Strategies to increase the fiber concentration in these diets paired with the safety that these patients need is what makes this study exciting,” she said in an interview.
“Increasing the fiber intake by 10 to 20 grams/day could potentially increase the microbiome diversity and abundance of butyrate producers, which have been correlated with higher overall survival rates post allo-HCT,” she continued.
“[For instance], that could be an avocado per day, or it could be a small salad per day, or a small vegetable soup per day,” she added. “I would encourage institutions to re-evaluate their menu planning and see how to include more fiber into the meals in a safe way.”
Ultimately, “I think that a dietary intervention outweighs the risks of a pharmacological intervention,” Paredes added.
The necessary duration of a high-fiber diet to produce the beneficial effects on allo-HCT outcomes would likely be over the course of the pre- and post-transplant periods, Paredes added.
“With the survival analysis extending from 5 days before transplantation to 12 days post, we are looking at an intervention that potentially could be around 20 days,” she said.
“We would love to take advantage of the pretransplantation window, in particular, and we can see that just increasing the fiber intake by about 20 grams during this window was shown to improve overall survival after 24 months,” Paredes added.
Importantly, however, some patients may not be appropriate for high-fiber dietary changes, Paredes cautioned.
“Patients that have developed IBD-like symptoms and severe GvHD patients, for example, or with lower GI-GvHD grades 3 and 4 would be not appropriate candidates for a high-fiber diet,” she said.
High-Fiber Diet Slows MM Disease Progression?
The potentially important benefits of a high-fiber diet in blood diseases were further demonstrated in a separate study, also by MSKCC researchers and presented at the meeting, which showed encouraging signs that a plant-based diet rich in fiber could slow disease progression in multiple myeloma (MM).
NUTRIVENTION included 20 patients with the two precancerous MM conditions, monoclonal gammopathy of undetermined significance (MGUS) and smoldering multiple myeloma (SMM). These conditions can persist for years without progressing to MM, a window that researchers have speculated could offer an opportunity to intervene and prevent progression to cancer.
Patients were provided with a 12-week controlled diet plus health coaching for another 3 months; no meals or coaching were provided for the rest of the 1-year study period. Participants had a median age of 62 years and, because overweight/obesity is a risk factor for MM, all had a body mass index (BMI) of 25 kg/m2 or higher.
The trial met its endpoint of feasibility, with 91% adherence in the first 3 months. The rate of consumption of unprocessed plant foods increased from 20% at baseline to 92% on the intervention. Overall adherence was 58%. Insulin and anti-inflammatory markers also improved and, despite no calorie restriction, there was a 7% sustained reduction in BMI.
Notably, two patients in the study had stabilization of disease progression.
“We saw improvements in all spheres, including metabolism, microbiome, and immune system markers, and we also saw that two patients with progressive disease had the progression stabilize and slow down on the intervention,” principal investigator Urvi A. Shah, MD, said in a press statement.
“Even though it’s just two cases, to our knowledge, it has not been shown before in an intervention setting that you can improve diet and lifestyle and actually slow or change the trajectory of the disease,” she noted.
The researchers caution that findings in mice do not necessarily translate to humans but note that, in another experiment in mice with SMM, animals fed a normal diet progressed to MM after a median of 12 weeks, compared with a median of 30 weeks among those fed a high-fiber diet.
Notably, all mice in the normal-diet group progressed to MM, whereas 40% of mice in the high-fiber group did not.
“We found that a high-fiber plant-based diet can improve BMI, improve insulin resistance [and] the microbiome through diversity and butyrate producers, and with the production of short-chain fatty acids, can have effects on inflammation, immunity, innate and adaptive antitumor immunity, and tumor cells or plasma cells,” Shah said during her presentation.
The study was supported by funding from the National Cancer Institute and private foundations. Paredes has reported no relevant financial relationships. Shah has reported relationships with Sanofi, Bristol Myers Squibb, and Janssen.
A version of this article first appeared on Medscape.com.
FROM ASH 2024
24-Hour Urine Testing in Multiple Myeloma: Time to Stop?
Overall, evaluating patients’ responses using urine-free and traditional criteria led to nearly identical assessments. When comparing the two criteria, only 7 of 645 patients evaluated had discordant results.
The findings, presented at the American Society of Hematology (ASH) 2024 Annual Meeting, add weight to the push to drop the requirement to perform routine urine tests from International Myeloma Working Group (IMWG) response criteria for multiple myeloma, said the study’s lead author, Rahul Banerjee, MD, from Fred Hutch Cancer Center, University of Washington School of Medicine, Seattle.
“International guidelines for multiple myeloma, which haven’t been updated in almost a decade, currently recommend these refrigerated 24-hour urine assessments, which are cumbersome for patients and can create substantial disparities,” Banerjee said in an interview.
“The international community is actually in the midst of updating its guidelines (I am part of this effort), and our work will hopefully help lead the way for future guidelines that de-emphasize the need for 24-hour urine testing to only a few rare scenarios, such as AL amyloidosis,” Banerjee added.
Urine tests can help detect the presence of abnormal proteins, which can indicate the level of myeloma tumor burden. Performing these tests routinely can help physicians monitor the effectiveness of patients’ treatment in practice and clinical trials.
Some recent data, however, suggest that dropping urine testing from the response criteria would change the response assessment in fewer than 5% of patients. Still, it's not clear how urine-free criteria would affect assessments of progression-free survival.
In the current study, Banerjee and colleagues performed a secondary analysis of the STaMINA trial. In the original trial, patients were randomized to lenalidomide maintenance, tandem autologous hematopoietic cell transplantation followed by lenalidomide maintenance, or consolidation therapy (lenalidomide, bortezomib, and dexamethasone) followed by lenalidomide maintenance until disease progression.
The secondary analysis included 645 patients from the original trial who were evaluable 56 days following autologous hematopoietic cell transplantation. The analysis looked at patients across all groups, but excluded those with progressive disease, and compared patients’ responses using traditional IMWG criteria, which includes 24-hour urine assessments, and urine-free criteria. Response measurements included complete response, very good partial response, partial response, and stable disease.
Patients were a median age of 56 years, 41% were female, 17% were Black, and 7% were Hispanic; 26% had light-chain only disease. About half (49%) had received lenalidomide alone, 28% had received post-autologous stem cell transplantation consolidation followed by lenalidomide, and 24% had received tandem transplantation followed by lenalidomide.
The analysis showed that “urine-free response criteria worked just fine in terms of their prognostic value,” Banerjee said while presenting the findings.
Specifically, the complete response rate was 29.4% using the traditional criteria vs 29.7% using the urine-free criteria. The very good partial response rate was 37.0% with the traditional approach vs 36.6% with the urine-free approach. The partial response rate was 30.7% for both and the stable disease rate was 3.0% for both.
Achieving a complete response based on the urine-free criteria was highly prognostic for progression-free survival (P = .005) while achieving a very good partial response by either criterion was borderline prognostic for progression-free survival (P = .102).
Only 1.1% of patients — seven patients altogether — had discordant responses between traditional and urine-free response criteria, Banerjee noted. One patient, for instance, was downgraded from a very good partial response with traditional criteria to a partial response with urine-free criteria “because current response criteria rate urine [as] more important than serum-free light chains,” Banerjee explained. Two other patients who met all other stringent criteria for a complete response but still had urine paraprotein at Day 56 were classified as having a very good partial response using traditional criteria but as a complete response with the urine-free criteria.
The other four patients with discordant results were the most important, Banerjee said. These patients were missing urine protein electrophoresis values, which made them non-evaluable using traditional criteria, but became evaluable when using urine-free criteria. “This is, I think, the bane of our existence, right? We ask our patients to put their blood, soul, sweat, and tears into being in a clinical trial, and then they’re not evaluable,” he said.
Overall, these results strongly support the de-emphasis of 24-hour urine requirements in updated IMWG response criteria, said Banerjee. However, he noted, 24-hour urine testing still has a very important place in the screening process and in patients with monoclonal gammopathy of renal significance or AL amyloidosis.
“This study provides reassurance to those of us already not repeating urine tests that urine testing is unnecessary for tracking responses,” said Manni Mohyuddin, MD, from the Multiple Myeloma Program at Huntsman Cancer Institute and assistant professor at the University of Utah, Salt Lake City. “These assessments aren’t done consistently in practice outside of trials anyway, and I hope that this study will lead to a formal change in criteria and the omission of urine assessments in clinical trials.”
Funding for the study was provided by the National Heart, Lung, and Blood Institute; National Cancer Institute; Alliance for Clinical Trials in Oncology; ECOG-ACRIN Cancer Research Group; and SWOG; and contributions were provided by Celgene and Millennium Pharmaceuticals. Banerjee has reported consulting for Adaptive Biotechnologies, Bristol-Myers Squibb, Caribou Biosciences, Genentech, GSK, Johnson & Johnson/Janssen, Karyopharm, Legend Biotech, Pfizer, Sanofi, and SparkCures, and receiving research funding from AbbVie, Johnson & Johnson, Novartis, Pack Health, Prothena, and Sanofi. Mohyuddin has disclosed no personal payments and no consultation for industry. His institution has received research funding from Janssen for his role as a principal investigator on a trial.
A version of this article first appeared on Medscape.com.
“The international community is actually in the midst of updating its guidelines (I am part of this effort), and our work will hopefully help lead the way for future guidelines that de-emphasize the need for 24-hour urine testing to only a few rare scenarios, such as AL amyloidosis,” Banerjee added.
Urine tests can help detect the presence of abnormal proteins, which can indicate the level of myeloma tumor burden. Performing these tests routinely can help physicians monitor the effectiveness of patients’ treatment in practice and clinical trials.
Some recent data, however, suggest that dropping urine testing from the response criteria would change the response assessment in fewer than 5% of patients. Still, it's not clear how urine-free criteria would affect assessments of progression-free survival.
In the current study, Banerjee and colleagues performed a secondary analysis of the STaMINA trial. In the original trial, patients were randomized to lenalidomide maintenance, tandem autologous hematopoietic cell transplantation followed by lenalidomide maintenance, or consolidation therapy (lenalidomide, bortezomib, and dexamethasone) followed by lenalidomide maintenance until disease progression.
The secondary analysis included 645 patients from the original trial who were evaluable 56 days following autologous hematopoietic cell transplantation. The analysis looked at patients across all groups, but excluded those with progressive disease, and compared patients’ responses using traditional IMWG criteria, which includes 24-hour urine assessments, and urine-free criteria. Response measurements included complete response, very good partial response, partial response, and stable disease.
Patients were a median age of 56 years, 41% were female, 17% were Black, and 7% were Hispanic; 26% had light-chain only disease. About half (49%) had received lenalidomide alone, 28% had received post-autologous stem cell transplantation consolidation followed by lenalidomide, and 24% had received tandem transplantation followed by lenalidomide.
The analysis showed that “urine-free response criteria worked just fine in terms of their prognostic value,” Banerjee said while presenting the findings.
Specifically, the complete response rate was 29.4% using the traditional criteria vs 29.7% using the urine-free criteria. The very good partial response rate was 37.0% with the traditional approach vs 36.6% with the urine-free approach. The partial response rate was 30.7% for both and the stable disease rate was 3.0% for both.
Achieving a complete response based on the urine-free criteria was highly prognostic for progression-free survival (P = .005), while achieving a very good partial response by either criterion was borderline prognostic for progression-free survival (P = .102).
Only 1.1% of patients — seven patients altogether — had discordant responses between traditional and urine-free response criteria, Banerjee noted. One patient, for instance, was downgraded from a very good partial response with traditional criteria to a partial response with urine-free criteria “because current response criteria rate urine [as] more important than serum-free light chains,” Banerjee explained. Two other patients who met all other stringent criteria for a complete response but still had urine paraprotein at Day 56 were classified as having a very good partial response using traditional criteria but as a complete response with the urine-free criteria.
The other four patients with discordant results were the most important, Banerjee said. These patients were missing urine protein electrophoresis values, which made them non-evaluable using traditional criteria, but became evaluable when using urine-free criteria. “This is, I think, the bane of our existence, right? We ask our patients to put their blood, soul, sweat, and tears into being in a clinical trial, and then they’re not evaluable,” he said.
Overall, these results strongly support the de-emphasis of 24-hour urine requirements in updated IMWG response criteria, said Banerjee. However, he noted, 24-hour urine testing still has a very important place in the screening process and in patients with monoclonal gammopathy of renal significance or AL amyloidosis.
“This study provides reassurance to those of us already not repeating urine tests that urine testing is unnecessary for tracking responses,” said Manni Mohyuddin, MD, from the Multiple Myeloma Program at Huntsman Cancer Institute and assistant professor at the University of Utah, Salt Lake City. “These assessments aren’t done consistently in practice outside of trials anyway, and I hope that this study will lead to a formal change in criteria and the omission of urine assessments in clinical trials.”
Funding for the study was provided by the National Heart, Lung, and Blood Institute; National Cancer Institute; Alliance for Clinical Trials in Oncology; ECOG-ACRIN Cancer Research Group; and SWOG; and contributions were provided by Celgene and Millennium Pharmaceuticals. Banerjee has reported consulting for Adaptive Biotechnologies, Bristol-Myers Squibb, Caribou Biosciences, Genentech, GSK, Johnson & Johnson/Janssen, Karyopharm, Legend Biotech, Pfizer, Sanofi, and SparkCures, and receiving research funding from AbbVie, Johnson & Johnson, Novartis, Pack Health, Prothena, and Sanofi. Mohyuddin has disclosed no personal payments and no consultation for industry. His institution has received research funding from Janssen for his role as a principal investigator on a trial.
A version of this article first appeared on Medscape.com.
FROM ASH 2024
IVIG Prophylaxis in Multiple Myeloma Cuts Infections, Boosts Survival
Among 225 consecutive patients who received at least one treatment for relapsed and/or refractory multiple myeloma, those who received IVIG prophylaxis experienced a significantly longer duration of infection-free survival and an almost threefold longer median overall survival, compared with patients who did not receive IVIG prophylaxis.
IVIG supplementation has been shown to prevent severe infections in patients with multiple myeloma, but evidence on the best time to initiate IVIG prophylaxis among those receiving teclistamab remains less clear.
“Our institutional practice is to start IVIG about cycle 2 of therapy, which ended up being around 39 days,” but a key takeaway from the current findings is to “start IVIG within 30 days,” said lead investigator Heloise Cheruvalath, BA, a medical student at Medical College of Wisconsin, Milwaukee, who presented the findings.
The 225 patients included in the study had received at least one dose of standard-of-care teclistamab or an investigational B-cell maturation antigen (BCMA)–directed bispecific antibody (bsAb). IVIG was given as prophylaxis to 92 patients (41%) in the primary arm. The remaining 133 patients (59%) did not receive IVIG prophylaxis, but 29% received IVIG after a documented infection.
In total, there were 288 infections in 136 patients, and about 61% of infections required hospitalization. Median time to infection was 97 days, with the 12-month cumulative incidence of all-grade infections reaching 73% and the incidence of grade 3 or higher infections totaling 53%. Respiratory tract infections were the most common infection type, with COVID-19 accounting for 11% of cases, Cheruvalath noted.
Comparing patients who did and did not receive IVIG prophylaxis, median infection-free survival was significantly longer in the prophylaxis group — a median of 7.7 months vs 3 months — as was grade 3 or higher infection-free survival — a median of 14 months vs 7.5 months.
IVIG prophylaxis also led to a higher rate of 2-year progression-free survival in the prophylaxis vs nonprophylaxis group — at 38% vs 32% — as well as longer median progression-free survival — at 15 months vs 8 months.
After multivariate analysis, IVIG prophylaxis was no longer significantly associated with improved progression-free survival.
However, median overall survival did remain significantly better in the IVIG prophylaxis than the nonprophylaxis group after multivariate analysis — 44 months vs 16 months. The presence of high-risk and extramedullary disease was independently associated with worse overall survival.
The effects of IVIG prophylaxis were stronger for bacterial infections at earlier (30 days or sooner) vs later (31 days or later) time points, but timing of IVIG therapy did not appear to affect the incidence of viral infections.
A study limitation was lack of randomization; IVIG prophylaxis was given at the physician’s discretion. In addition, multiple myeloma treatment was not standardized, with 15% of IVIG patients and 38% of non-IVIG patients receiving an investigational BCMA bsAb.
“However, the majority of those who received primary IVIG prophylaxis were treated with standard-of-care teclistamab, making our results generalizable to current clinical practice,” Cheruvalath said.
Rahul Banerjee, MD, who was not involved with the research, noted he has already started providing routine IVIG prophylaxis based on earlier research from this group. “Before I did, my patients would often get very rare infections requiring protracted courses of antibiotics,” Banerjee, from Fred Hutch Cancer Center, University of Washington School of Medicine, Seattle, said in an interview. “Moving to IVIG before the infections start makes much more sense.”
Banerjee also commented that, in general, “the myeloma field has been moving from IV treatments to subcutaneous treatments to lower ‘time toxicity’ and IVIG is a notable exception to that trend, but perhaps it won’t be this way forever.”
Many patients with rheumatologic conditions receive subcutaneous immunoglobulin, in some cases, with kits they can self-administer at home, Banerjee said, and “I know some groups are starting to work on moving subcutaneous immunoglobulin to the oncologic setting.”
Funding was provided by the Advancing Healthier Wisconsin Endowment. Cheruvalath has reported no relevant disclosures. Banerjee has reported consulting for Adaptive Biotechnologies, Bristol Myers Squibb, Caribou Biosciences, Genentech, GSK, Johnson & Johnson/Janssen, Karyopharm, Legend Biotech, Pfizer, Sanofi, and SparkCures; and receiving research funding from AbbVie, Johnson & Johnson, Novartis, Pack Health, Prothena, and Sanofi.
A version of this article first appeared on Medscape.com.
FROM ASH 2024
LBCL: Bispecific Antibodies Fare Less Well in Real-World Analysis
In a presentation at the American Society of Hematology (ASH) 2024 Annual Meeting, researchers reported that among 172 patients treated with the bispecific antibodies epcoritamab or glofitamab who had evaluable responses over a median follow-up of 5 months, median progression-free survival was 2.7 months (95% CI, 2.0-3.9) and median overall survival was 7.2 months (95% CI, 6.1–not reached).
It’s important to consider the real-world nature of the study’s patient population, said first author Taylor R. Brooks, MD, of Cleveland Clinic, Ohio, in an interview. “Compared to pivotal trials, our cohort was enriched for patients with high-risk features, with almost three quarters having some comorbidity that would’ve excluded them from one of the [earlier] studies.”
He added that “though individuals eligible to receive these medicines may be more sick with high-risk disease, a sizable fraction will respond, and some will maintain remissions.”
According to Brooks, about one third of patients with diffuse LBCL relapse after standard front-line R-CHOP therapy. “The prognosis is poor for patients who are not candidates for aggressive salvage chemotherapy and for those who relapse after two or more lines,” Brooks said. “T cell–engaging bispecific antibodies have emerged as a promising option for patients with relapsed or refractory large B-cell lymphoma, given their favorable rates and duration of responses as well as their manageable rates of toxicities.”
The Food and Drug Administration (FDA) granted accelerated approval for epcoritamab and glofitamab in 2023.
“With increasing uptake into clinical practice following the FDA approvals, there is increasing interest in assessing the efficacy and safety of these drugs in real-world, nontrial settings,” Brooks said. “The goal of our study was to investigate outcomes and identify clinical factors associated with outcomes.”
The multicenter, retrospective, observational REALBiTE study tracked 209 patients with relapsed/refractory diffuse LBCL at 19 US centers (epcoritamab, n = 139; glofitamab, n = 70; median age at start of treatment, 67 years [58-76]; 62.2% male; 74.2% diffuse LBCL). The median number of lines of therapy was three (range, 1-12).
“Patients who received epcoritamab tended to be slightly older, were more likely to have a history of indolent non-Hodgkin lymphoma prior to their diagnosis of aggressive B-cell lymphoma and were more likely to have an elevated International Prognostic Index score at the start of bispecific therapy, suggesting that these patients may have been slightly older with higher-risk disease compared to those who received glofitamab,” Brooks said.
In total, 172 patients were response-evaluable. The overall response rate was 50.6% (complete response, 23.8%; partial response, 26.7%; stable disease, 5.8%; progressive disease, 43.6%).
The overall and complete response rates were “somewhat lower than what has been published in the pivotal trials of these medicines,” Brooks said. The low progression-free and overall survival rates “highlight the difficulty in managing this group of patients.”
Cytokine release syndrome (CRS) of any grade occurred in 39.2% of patients: 51% in the epcoritamab group and 28.6% in the glofitamab group. Grade ≥3 CRS occurred in 4.3% of patients, who were all taking epcoritamab.
“For epcoritamab, CRS was almost entirely of low grade, and most CRS events occurred around administration of the first full dose of the drug on day 15,” Brooks said. “Similarly, the CRS events for glofitamab were mostly of low grade, though events were observed to occur throughout the step-up dosing. Tocilizumab was administered in about one fifth of the patients.”
In addition, Brooks said, “we found that, among the 19 individuals with paired biopsy samples before and after bispecific therapy, nearly all — 89% — were found to have lost CD20 expression. We expected some patients to experience loss of this important target, but the rate at which we found this to be the case was surprisingly high.”
Brooks added that “clinicians should be acquainted with CRS, ICANS [immune effector cell-associated neurotoxicity syndrome], and mitigation strategies if they are prescribing these medicines. Appropriate and timely management using tocilizumab, steroids, and other adjunctive measures can effectively manage these complications and hopefully allow for the continued delivery of therapy.”
In an interview, Matthew Lunning, DO, associate professor at the University of Nebraska Medical Center/Fred & Pamela Buffett Cancer Center, Omaha, who didn’t take part in the new study, said the findings aren’t bad news. Instead, they’re “practical news,” because they offer insight into how the drugs work.
“The big lesson from this and other trials is the importance of assessing for CD20 expression prior to taking a bispecific off the shelf,” he said. “These are learnings that often come after approval.”
He added that it’s clear that, “in more heavily pretreated patients, more disease led to less optimal results and higher risk for toxicities.”
Lunning also noted that both epcoritamab and glofitamab “entered into a crowded and chaotic relapsed/refractory LBCL space based [on] high complete response rates with the opportunity for durability in those complete responses.”
Academic institutions were especially interested, as they can manage CRS and ICANS, but “significantly less enthusiasm has been seen in community practices that expect CRS/ICANS to be in the rear-view mirror if they are going to deliver any bispecific,” he said. “It is not that they don’t have the clinical acumen to manage CRS/ICANS. I believe it is the perception of the lack of supportive infrastructure necessary to manage these toxicities.”
There was no study funding. Brooks has reported no disclosures. Other authors have reported various disclosures including relationships with Novartis, AbbVie, Genentech, Genmab, Biogen, Amgen, and others. Lunning has disclosed ties with AbbVie, Genmab, Kite, Bristol-Myers Squibb, Regeneron, and ADC Therapeutics.
A version of this article first appeared on Medscape.com.
A version of this article first appeared on Medscape.com.
In a presentation at the American Society of Hematology (ASH) 2024 Annual Meeting, researchers reported that among the 172 patients treated with the drugs who had evaluable responses, over a median follow-up of 5 months, median progression-free survival was 2.7 months (95% CI, 2.0-3.9) and median overall survival was 7.2 months (95% CI, 6.1 to not reached).
It’s important to consider the real-world nature of the study’s patient population, said first author Taylor R. Brooks, MD, of Cleveland Clinic, Ohio, in an interview. “Compared to pivotal trials, our cohort was enriched for patients with high-risk features, with almost three quarters having some comorbidity that would’ve excluded them from one of the [earlier] studies.”
He added that “though individuals eligible to receive these medicines may be more sick with high-risk disease, a sizable fraction will respond, and some will maintain remissions.”
According to Brooks, about one third of patients with diffuse LBCL relapse after standard front-line R-CHOP therapy. “The prognosis is poor for patients who are not candidates for aggressive salvage chemotherapy and for those who relapse after two or more lines,” Brooks said. “T cell–engaging bispecific antibodies have emerged as a promising option for patients with relapsed or refractory large B-cell lymphoma, given their favorable rates and duration of responses as well as their manageable rates of toxicities.”
The Food and Drug Administration (FDA) granted accelerated approval for epcoritamab and glofitamab in 2023.
“With increasing uptake into clinical practice following the FDA approvals, there is increasing interest in assessing the efficacy and safety of these drugs in real-world, nontrial settings,” Brooks said. “The goal of our study was to investigate outcomes and identify clinical factors associated with outcomes.”
The multicenter, retrospective, observational REALBiTE study tracked 209 patients with relapsed/refractory diffuse LBCL at 19 US centers (epcoritamab, n = 139; glofitamab, n = 70; median age at start of treatment, 67 years [58-76]; 62.2% male; 74.2% diffuse LBCL). The median number of lines of therapy was three (range, 1-12).
“Patients who received epcoritamab tended to be slightly older, were more likely to have a history of indolent non-Hodgkin lymphoma prior to their diagnosis of aggressive B-cell lymphoma and were more likely to have an elevated International Prognostic Index score at the start of bispecific therapy, suggesting that these patients may have been slightly older with higher-risk disease compared to those who received glofitamab,” Brooks said.
In total, 172 patients were response-evaluable. The overall response rate was 50.6% (complete response, 23.8%; partial response, 26.7%; stable disease, 5.8%; progressive disease, 43.6%).
The overall and complete response rates were “somewhat lower than what has been published in the pivotal trials of these medicines,” Brooks said. The low progression-free and overall survival rates “highlight the difficulty in managing this group of patients.”
Cytokine release syndrome (CRS) of any grade occurred in 39.2% of patients: 51% in the epcoritamab group and 28.6% in the glofitamab group. Grade ≥3 CRS occurred in 4.3% of patients, all of whom were taking epcoritamab.
“For epcoritamab, CRS was almost entirely of low grade, and most CRS events occurred around administration of the first full dose of the drug on day 15,” Brooks said. “Similarly, the CRS events for glofitamab were mostly of low grade, though events were observed to occur throughout the step-up dosing. Tocilizumab was administered in about one fifth of the patients.”
In addition, Brooks said, “we found that, among the 19 individuals with paired biopsy samples before and after bispecific therapy, nearly all — 89% — were found to have lost CD20 expression. We expected some patients to experience loss of this important target, but the rate at which we found this to be the case was surprisingly high.”
Brooks added that “clinicians should be acquainted with CRS, ICANS [immune effector cell-associated neurotoxicity syndrome], and mitigation strategies if they are prescribing these medicines. Appropriate and timely management using tocilizumab, steroids, and other adjunctive measures can effectively manage these complications and hopefully allow for the continued delivery of therapy.”
In an interview, Matthew Lunning, DO, associate professor at the University of Nebraska Medical Center/Fred & Pamela Buffett Cancer Center, Omaha, who didn’t take part in the new study, said the findings aren’t bad news. Instead, they’re “practical news,” because they offer insight into how the drugs work.
“The big lesson from this and other trials is the importance of assessing for CD20 expression prior to taking a bispecific off the shelf,” he said. “These are learnings that often come after approval.”
He added that it’s clear that, “in more heavily pretreated patients, more disease led to less optimal results and higher risk for toxicities.”
Lunning also noted that both epcoritamab and glofitamab “entered into a crowded and chaotic relapsed/refractory LBCL space based on high complete response rates with the opportunity for durability in those complete responses.”
Academic institutions were especially interested, as they can manage CRS and ICANS, but “significantly less enthusiasm has been seen in community practices that expect CRS/ICANS to be in the rear-view mirror if they are going to deliver any bispecific,” he said. “It is not that they don’t have the clinical acumen to manage CRS/ICANS. I believe it is the perception of the lack of supportive infrastructure necessary to manage these toxicities.”
There was no study funding. Brooks has reported no disclosures. Other authors have reported various disclosures including relationships with Novartis, AbbVie, Genentech, Genmab, Biogen, Amgen, and others. Lunning has disclosed ties with AbbVie, Genmab, Kite, Bristol-Myers Squibb, Regeneron, and ADC Therapeutics.
A version of this article first appeared on Medscape.com.
FROM ASH 2024