Can Better Diet Improve Survival in Black Women With Ovarian Cancer?
TOPLINE:
Better prediagnosis diet quality was associated with lower mortality among women with high-grade serous ovarian carcinoma (HGSOC), but no significant survival association was found among the full study sample, which included women with multiple types of epithelial ovarian cancer (EOC).
METHODOLOGY:
- Researchers conducted a prospective cohort study among 483 self-identified Black women aged 20-79 years newly diagnosed with histologically confirmed EOC between December 2010 and December 2015.
- The study aimed to examine associations between dietary patterns and survival among Black women diagnosed with EOC using data from the African American Cancer Epidemiology Study.
- Dietary patterns were assessed using the Healthy Eating Index–2020 (HEI-2020) and Alternative Healthy Eating Index–2010 (AHEI-2010), based on dietary intake in the year prior to diagnosis collected via the validated Block 2005 Food Frequency Questionnaire (FFQ). Participant characteristics were summarized across quartiles of HEI-2020 and AHEI-2010 scores.
- The researchers obtained and summarized clinical characteristics, including tumor characteristics, first-line treatment regimen, debulking status, residual disease, and cancer antigen 125 levels, from medical records.
- The main outcome measure was overall survival, with hazard ratios (HRs) and 95% CIs estimated from multivariable Cox models for the association between adherence to dietary recommendations and overall mortality. Follow-up was conducted until October 2022, with data analyzed from March 2023 to June 2024.
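To make the modeling approach in the last bullet concrete, here is a minimal Python sketch of a Cox proportional hazards model over diet-score quartiles. The simulated data, column names, and single covariate are illustrative assumptions; this is not the study's code, dataset, or covariate set.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 483  # cohort size reported above

# Simulated stand-in data; the real analysis used questionnaire and
# medical-record data with many more covariates.
df = pd.DataFrame({
    "followup_years": rng.exponential(5.0, size=n),
    "died": rng.integers(0, 2, size=n),
    "hei_2020": rng.normal(60, 12, size=n),   # HEI-2020 total score
    "age_at_dx": rng.uniform(20, 79, size=n),
})

# Quartile 1 (lowest diet quality) serves as the reference category.
df["hei_q"] = pd.qcut(df["hei_2020"], 4, labels=["q1", "q2", "q3", "q4"])
df = pd.get_dummies(df.drop(columns="hei_2020"), columns=["hei_q"],
                    drop_first=True, dtype=float)

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")

# exp(coef) on each quartile indicator is the hazard ratio vs quartile 1.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```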
TAKEAWAY:
- No significant association was found between dietary patterns and overall mortality among women with EOC.
- Among women with HGSOC, the most lethal histotype of EOC, better adherence to the HEI-2020 was associated with decreased mortality in later quartiles vs the first quartile (HR, 0.63; 95% CI, 0.44-0.92).
- Similar results were observed with the AHEI-2010 among women with HGSOC for the second (HR, 0.62; 95% CI, 0.43-0.89) and fourth (HR, 0.67; 95% CI, 0.45-0.98) quartiles vs the first quartile.
- Women with moderate and high prediagnosis dietary quality had significantly lower mortality rates from HGSOC than those with the lowest prediagnosis dietary quality.
IN PRACTICE:
“Our findings suggest that prediagnosis dietary patterns (ie, the combination of foods and nutrients) are more important than individual components for ovarian cancer survival as shown by comparing results of dietary patterns with individual components,” the authors of the study wrote.
SOURCE:
This study was led by Tsion A. Armidie, MPH, Rollins School of Public Health, Emory University in Atlanta, Georgia. It was published online on October 18 in JAMA Network Open.
LIMITATIONS:
This study’s limitations included the potential for residual confounding, despite accounting for a wide array of covariates. The median time between diagnosis and FFQ completion was 5.8 months, which may have introduced measurement errors in dietary recall. Additionally, the study did not collect postdiagnostic dietary information, which could have provided further insights into the association between diet and survival.
DISCLOSURES:
This study was supported by grants from the National Cancer Institute. One coauthor reported receiving personal fees from Pfizer outside the submitted work. One coauthor reported receiving grants from the US Department of Defense during the conduct of the study and Bristol-Myers Squibb and Karyopharm outside the submitted work. One coauthor reported receiving personal fees from Ashcraft and Gerel outside the submitted work. One coauthor reported receiving personal fees from Epidemiologic Research & Methods outside the submitted work. Additional disclosures are noted in the original article.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
ICD-10-CM Codes for CCCA, FFA Now Available
The new ICD-10-CM codes for central centrifugal cicatricial alopecia (CCCA) and frontal fibrosing alopecia (FFA) mark a significant advance in the field of hair loss disorders.
“CCCA and FFA are conditions that require early diagnosis and intervention to prevent irreversible hair loss,” Maria Hordinsky, MD, professor of dermatology at the University of Minnesota, Minneapolis, and a member of the Board of Directors, Scarring Alopecia Foundation (SAF), said in an interview.
“The use of these new codes will make it easier for clinicians to identify affected patients and improve treatment outcomes. It also opens the door for more robust research efforts aimed at understanding the etiology and progression of CCCA and FFA, which could lead to new and more effective treatments in the future. Overall, this development represents a positive step toward improving care for individuals affected by these challenging conditions.”
The new codes — L66.81 for CCCA and L66.12 for FFA — were approved by the Centers for Disease Control and Prevention (CDC) on June 15, 2023, but not implemented until October 1, 2024.
Amy J. McMichael, MD, professor of dermatology at Wake Forest University School of Medicine, Winston-Salem, North Carolina, and a scientific advisor to SAF, told this news organization that Itisha Jefferson, a medical student at Loyola University Chicago’s Stritch School of Medicine, and her peers on the SAF’s Medical Student Executive Board played a pivotal role in advocating for the codes.
In 2022, Jefferson, who has CCCA, and her fellow medical students helped create the proposals that were ultimately submitted to the CDC.
“They were critical in working with the CDC leaders to get the necessary information submitted and processed,” McMichael said. “They were also amazing at corralling our dermatologist group for the development of the necessary presentations and helped to shepherd us to the finish line for all logistic issues.”
On March 8, 2023, McMichael and Hordinsky made their pitch for the codes in person at the CDC’s ICD-10 Coordination and Maintenance Committee meeting, with McMichael discussing CCCA and Hordinsky discussing FFA.
“We also discussed the lack of standardized tracking, which has contributed to misdiagnoses and inadequate treatment options,” Hordinsky recalled. “We highlighted the importance of having distinct codes for these conditions to improve clinical outcomes, ensure that patients have access to appropriate care, better tracking of disease prevalence, and greater epidemiologic monitoring with access to electronic medical records and other large real-world evidence datasets and databases, the results of which could contribute to health policy decision-making.”
To spread the word about the new codes, McMichael, Hordinsky, and other members of the SAF are working with the original team of medical students, some of whom are now dermatology residents, to develop an information guide to send to societies and organizations that were supportive of the codes. A publication in the dermatology literature is also planned.
For her part, Jefferson said that she will continue to advocate for patients with scarring alopecia as a medical student and when she becomes a physician. “I hope in the near future we will see an externally led FDA Patient-Focused Drug Development meeting for both CCCA and FFA, further advancing care and research for these conditions,” she said in an interview.
McMichael, Hordinsky, and Jefferson had no relevant disclosures to report.
A version of this article appeared on Medscape.com.
Are Three Cycles of Chemotherapy as Effective as Six for Retinoblastoma?
TOPLINE:
Three cycles of adjuvant chemotherapy after enucleation were noninferior to six cycles for 5-year disease-free survival (DFS) in patients with pathologically high-risk retinoblastoma; the three-cycle regimen also resulted in fewer adverse events and lower costs.
METHODOLOGY:
- The introduction of chemotherapy has increased survival rates for patients with retinoblastoma, but the optimal number of postoperative adjuvant cycles remains unclear due to scant randomized clinical trial data for high-risk patients.
- In the new trial, participants at two premier eye centers in China were randomly assigned to receive either three (n = 94) or six (n = 93) cycles of carboplatin, etoposide, and vincristine (CEV) chemotherapy after enucleation.
- The primary endpoint was 5-year DFS, and the secondary endpoints were overall survival, safety, economic burden, and quality of life.
- Patients were followed up every 3 months for the first 2 years and then every 6 months thereafter, with a median follow-up of 79 months.
- Adverse events were graded using the National Cancer Institute Common Terminology Criteria for Adverse Events (version 5.0).
TAKEAWAY:
- The 5-year DFS rates were 90.4% and 89.2% for the three- and six-cycle groups, respectively, meeting the noninferiority criterion (P = .003).
- The six-cycle group experienced a higher frequency of adverse events, including neutropenia, anemia, and nausea, than the three-cycle group.
- The quality-of-life scores were higher in the three-cycle group, particularly in physical, emotional, and social functioning parameters.
- The total, direct, and indirect costs were significantly lower in the three-cycle group than in the six-cycle group.
IN PRACTICE:
“A three-cycle CEV regimen demonstrated noninferiority, compared with a six-cycle approach, and proved to be an efficacious adjuvant chemotherapy regimen for individuals diagnosed with pathologically high-risk retinoblastoma,” the authors of the study wrote.
In an accompanying editorial, Ning Li, MD, and colleagues wrote that the findings “could lead to changes in clinical practice, reducing treatment burden and costs without compromising patient outcomes.”
SOURCE:
This study was led by Huijing Ye, MD, PhD, State Key Laboratory of Ophthalmology, Zhongshan Ophthalmic Center, Sun Yat-sen University in Guangzhou, China. Both the study and editorial were published online in JAMA.
LIMITATIONS:
The open-label design of the study might introduce bias, although an independent, blinded committee evaluated the clinical outcomes. The 12% noninferiority margin was notably substantial, considering the rarity of retinoblastoma and the wide range of survival rates. The criteria for adjuvant therapy, especially regarding choroidal invasion, were debatable and required further follow-up to clarify the prognosis related to various pathologic features.
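As a rough arithmetic check on the noninferiority claim in the TAKEAWAY (an approximation only; the trial's actual analysis used time-to-event methods), the reported 5-year DFS rates, randomized arm sizes, and 12% margin can be plugged into a simple two-proportion noninferiority test:

```python
# Approximate noninferiority check from the reported summary numbers.
from math import sqrt
from scipy.stats import norm

p3, n3 = 0.904, 94   # three-cycle arm: 5-year DFS, randomized n
p6, n6 = 0.892, 93   # six-cycle arm
margin = 0.12        # prespecified noninferiority margin (see LIMITATIONS)

diff = p3 - p6
se = sqrt(p3 * (1 - p3) / n3 + p6 * (1 - p6) / n6)
lower = diff - 1.96 * se        # lower bound of the 95% CI for the difference
z = (diff + margin) / se        # noninferiority z statistic
p_one_sided = 1 - norm.cdf(z)

print(f"diff = {diff:+.3f}, 95% CI lower = {lower:+.3f} (margin -0.120)")
print(f"noninferior: {lower > -margin}, one-sided P ~ {p_one_sided:.4f}")
```

The lower confidence bound (about -0.07) clears the -0.12 margin, consistent with the reported conclusion, though the exact P value differs from the survival-based result in the paper.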
DISCLOSURES:
This study was supported by the Sun Yat-Sen University Clinical Research 5010 Program and the Shanghai Committee of Science and Technology. No relevant conflict of interest was disclosed by the authors of the paper or the editorial.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Humans and Carbs: A Complicated 800,000-Year Relationship
Trying to reduce your carbohydrate intake means going against nearly a million years of evolution.
Humans are among a few species with multiple copies of certain genes that help us break down starch — carbs like potatoes, beans, corn, and grains — so that we can turn it into energy our bodies can use.
However, it’s been difficult for researchers to pinpoint when in human history we acquired multiple copies of these genes because they’re in a region of the genome that’s hard to sequence.
A recent study published in Science suggests that humans may have developed multiple copies of the gene for amylase — an enzyme that’s the first step in starch digestion — over 800,000 years ago, long before the agricultural revolution. This genetic change could have helped us adapt to eating starchy foods.
The study shows how “what your ancestors ate thousands of years ago could be affecting our genetics today,” said Kelsey Jorgensen, PhD, a biological anthropologist at The University of Kansas, Lawrence, who was not involved in the study.
The double-edged sword has sharpened over all those centuries. On one hand, the human body needs and craves carbs to function. On the other hand, our modern-day consumption of carbs, especially calorie-dense, nutritionally barren processed carbs, has long since passed “healthy.”
How Researchers Found Our Carb-Lover Gene
The enzyme amylase turns complex carbs into maltose, a sweet-tasting sugar made of two glucose molecules linked together. We make two kinds of amylase: salivary amylase, which breaks down carbs in our mouths, and pancreatic amylase, which is secreted into our small intestines.
Modern humans have multiple copies of both amylases. Past research showed that human populations with diets high in starch can have up to nine copies of the gene for salivary amylase, called AMY1.
To pinpoint when in human history we acquired multiple copies of AMY1, the new study used optical genome mapping and long-read sequencing to sequence and analyze the genes. The researchers sequenced 98 modern-day samples and 68 ancient DNA samples, including one from a Siberian person who lived 45,000 years ago.
The ancient DNA data in the study allowed the researchers to track how the number of amylase genes changed over time, said George Perry, PhD, an anthropological geneticist at The Pennsylvania State University-University Park (he was not involved in the study).
Based on the sequencing, the team analyzed changes in the genes in their samples to gauge evolutionary timelines. Perry noted that this was a “very clever approach to estimating the amylase copy number mutation rate, which in turn can really help in testing evolutionary hypotheses.”
The researchers found that even before farming, hunter-gatherers had between four and eight AMY1 genes in their cells. This suggests that people across Eurasia already had a number of these genes long before they started growing crops. (Recent research indicates that Neanderthals also ate starchy foods.)
“Even archaic hominins had these [genetic] variations and that indicates that they were consuming starch,” said Feyza Yilmaz, PhD, an associate computational scientist at The Jackson Laboratory in Bar Harbor, Maine, and a lead author of the study.
However, the research indicates that even more AMY1 copies were acquired after the agricultural revolution, around 4000 years ago. Yilmaz noted, “with the advance of agriculture, we see an increase in high amylase copy number haplotypes. So genetic variation goes hand in hand with adaptation to the environment.”
A previous study showed that species that share an environment with humans, such as dogs and pigs, also have copy number variation of amylase genes, said Yilmaz, indicating a link between genome changes and an increase in starch consumption.
Potential Health Impacts on Modern Humans
The duplications in the AMY1 gene could have allowed humans to better digest starches. And it’s conceivable that having more copies of the gene means being able to break down starches even more efficiently, and those with more copies “may be more prone to having high blood sugar, prediabetes, that sort of thing,” Jorgensen said.
Whether those with more AMY1 genes have more health risks is an active area of research. “Researchers tested whether there’s a correlation between AMY1 gene copies and diabetes or BMI [body mass index]. And so far, some studies show that there is indeed correlation, but other studies show that there is no correlation at all,” said Yilmaz.
Yilmaz pointed out that only 5%-10% of carb digestion happens in our mouths; the rest occurs in the small intestine, and many other factors are involved in eating and metabolism.
“I am really looking forward to seeing studies which truly figure out the connection between AMY1 copy number and metabolic health and also what type of factors play a role in metabolic health,” said Yilmaz.
It’s also possible that having more AMY1 copies could lead to more carb cravings as the enzyme creates a type of sugar in our mouths. “Previous studies show that there’s a correlation between AMY1 copy number and also the amylase enzyme levels, so the faster we process the starch, the taste [of starches] will be sweeter,” said Yilmaz.
However, the link between cravings and copy numbers isn’t clear. And we don’t exactly know what came first — did the starch in humans’ diet lead to more copies of amylase genes, or did the copies of the amylase genes drive cravings that lead us to cultivate more carbs? We’ll need more research to find out.
How Will Today’s Processed Carbs Affect Our Genes Tomorrow?
As our diet changes to increasingly include processed carbs, what will happen to our AMY1 genes is fuzzy. “I don’t know what this could do to our genomes in the next 1000 years or more than 1000 years,” Yilmaz noted, but she said from the evidence it seems as though we may have peaked in AMY1 copies.
Jorgensen noted that this research is focused on a European population. She wonders whether the pattern of AMY1 duplication will be repeated in other populations “because the rise of starch happened first in the Middle East and then Europe and then later in the Americas,” she said.
“There’s individual variation and then there’s population-wide variation,” Jorgensen pointed out. She speculates that the historical diet of different cultures could explain population-based variations in AMY1 genes — it’s something future research could investigate. Other populations may also experience genetic changes as much of the world shifts to a more carb-heavy Western diet.
Overall, this research adds to the growing evidence that humans have a long history of loving carbs — for better and, at least over our most recent history and immediate future, for worse.
A version of this article appeared on Medscape.com.
Rising Stroke Rates in Californians With Sickle Cell Disease
TOPLINE:
Rates of ischemic stroke among Californians with sickle cell disease (SCD) rose significantly in 2010-2019 compared with the preceding decade.
METHODOLOGY:
- Researchers analyzed data from the California Department of Health Care Access and Innovation (HCAI), covering emergency department and hospitalization records from 1991 to 2019.
- A total of 7636 patients with SCD were included in the study cohort.
- Cumulative incidence and rates for primary and recurrent strokes and transient ischemic attacks (TIAs) were determined before and after the STOP (Stroke Prevention Trial in Sickle Cell Anemia) trial.
- Patients with SCD were identified using ICD-9 and ICD-10 codes, with specific criteria for inclusion based on hospitalization records.
- The study utilized Fine and Gray methodology to calculate cumulative incidence functions, accounting for the competing risk for death.
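To make the competing-risks idea in the last bullet concrete, here is a small Python sketch of the nonparametric cumulative incidence function (the Aalen-Johansen estimator) on simulated data. The study itself used Fine and Gray's subdistribution-hazard regression, which extends this estimator to covariate modeling; the data below are invented, not the HCAI cohort.

```python
import numpy as np

def cumulative_incidence(time, event, cause=1):
    """Aalen-Johansen CIF for one cause; event == 0 means censored.

    CIF_k(t) = sum over event times t_i <= t of S(t_i-) * d_k(t_i) / n(t_i),
    where S is the all-cause Kaplan-Meier survival just before t_i.
    """
    out = []
    surv_prev, cif = 1.0, 0.0
    for t in np.unique(time[event > 0]):
        at_risk = np.sum(time >= t)
        d_any = np.sum((time == t) & (event > 0))     # events of any cause
        d_cause = np.sum((time == t) & (event == cause))
        cif += surv_prev * d_cause / at_risk
        surv_prev *= 1.0 - d_any / at_risk
        out.append((t, cif))
    return out

rng = np.random.default_rng(1)
n = 7636  # cohort size reported above
age_at_exit = rng.uniform(0.0, 60.0, size=n)
# 0 = censored, 1 = first ischemic stroke, 2 = death (competing risk)
event = rng.choice([0, 1, 2], size=n, p=[0.75, 0.10, 0.15])

for t, c in cumulative_incidence(age_at_exit, event, cause=1)[::400]:
    print(f"cumulative incidence of stroke by age {t:4.1f}: {c:.3f}")
```

Unlike a naive Kaplan-Meier estimate that censors deaths, this construction keeps death as a competing event, which is why the study's cited methodology matters for an SCD cohort with substantial mortality.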
TAKEAWAY:
- The cumulative incidence of first ischemic stroke in patients with SCD was 2.1% by age 20 and 13.5% by age 60.
- Ischemic stroke rates increased significantly in children and adults in the 2010-2019 period, compared with the preceding decade.
- Risk factors for stroke and TIA included increasing age, hypertension, and hyperlipidemia.
- The study found a significant increase in rates of intracranial hemorrhage in adults aged 18-30 years and TIAs in children younger than 18 years from 2010 to 2019, compared with the prior decade.
IN PRACTICE:
“Neurovascular complications, including strokes and transient ischemic attacks (TIAs), are common and cause significant morbidity in individuals with sickle cell disease (SCD). The STOP trial (1998) established chronic transfusions as the standard of care for children with SCD at high risk for stroke,” the study’s authors wrote.
SOURCE:
This study was led by Olubusola B. Oluwole, MD, MS, University of Pittsburgh in Pennsylvania, and was published online in Blood.
LIMITATIONS:
This study’s reliance on administrative data may have introduced systematic errors, particularly with the transition from ICD-9 to ICD-10 codes. The lack of laboratory results and medication data in the HCAI database limited the ability to fully assess patient conditions and treatments. Additionally, the methodology changes in 2014 likely underreported death rates in people without PDD/EDU encounters in the calendar year preceding their death.
DISCLOSURES:
The authors reported no relevant conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
New Data on DOAC Initiation After Stroke in AF: Final Word?
ABU DHABI, UAE — The long-standing debate as to when to start anticoagulation in patients with an acute ischemic stroke and atrial fibrillation (AF) looks as though it’s settled.
Results of the OPTIMAS trial, the largest trial to address this question, showed that early initiation of direct oral anticoagulant (DOAC) therapy was noninferior to delayed initiation for the composite outcome of recurrent ischemic stroke, symptomatic intracranial hemorrhage, unclassifiable stroke, or systemic embolism at 90 days.
In addition, a new meta-analysis, known as CATALYST, which included all four randomized trials now available on this issue, showed a clear benefit of earlier initiation (within 4 days) versus later (5 days and up) on its primary endpoint of new ischemic stroke, symptomatic intracerebral hemorrhage, and unclassified stroke at 30 days.
The results of the OPTIMAS trial and the meta-analysis were both presented at the 16th World Stroke Congress (WSC) 2024. The OPTIMAS trial was also simultaneously published online in The Lancet.
“Our findings do not support the guideline recommended practice of delaying DOAC initiation after ischemic stroke with AF regardless of clinical stroke severity, reperfusion or prior anticoagulation,” said OPTIMAS investigator David Werring, PhD, University College London in England.
Presenting the meta-analysis, Signild Åsberg, MD, Uppsala University, Uppsala, Sweden, said her group’s findings “support the early start of DOACs (within 4 days) in clinical practice.”
Werring pointed out that starting anticoagulation early also had important logistical advantages.
“This means we can start anticoagulation before patients are discharged from hospital, thus ensuring that this important secondary prevention medication is always prescribed, when appropriate. That’s going to be a key benefit in the real world.”
Clinical Dilemma
Werring noted that AF accounts for 20%-30% of ischemic strokes, which tend to be more severe than other stroke types. The pivotal trials of DOACs did not include patients within 30 days of an acute ischemic stroke, creating a clinical dilemma on when to start this treatment.
“On the one hand, we wish to start anticoagulation early to reduce early recurrence of ischemic stroke. But on the other hand, there are concerns that if we start anticoagulation early, it could cause intracranial bleeding, including hemorrhagic transformation of the acute infarct. Guidelines on this issue are inconsistent and have called for randomized controlled trials in this area,” he noted.
So far, three randomized trials on DOAC timing have been conducted, which Werring said suggested early DOAC treatment is safe. However, these trials have provided limited data on moderate to severe stroke, patients with hemorrhagic transformation, or those already taking oral anticoagulants — subgroups in which there are particular concerns about early oral anticoagulation.
The OPTIMAS trial included a broad population of patients with acute ischemic stroke associated with AF, including these critical subgroups.
The trial, conducted at 100 hospitals in the United Kingdom, included 3648 patients with AF and acute ischemic stroke who were randomly assigned to early (≤ 4 days from stroke symptom onset) or delayed (7-14 days) anticoagulation initiation with any DOAC.
There was no restriction on stroke severity, and patients with hemorrhagic transformation were allowed, with the exception of parenchymal hematoma type 2, a rare and severe type of hemorrhagic transformation.
Approximately 35% of patients had been taking an oral anticoagulant, mainly DOACs, prior to their stroke, and about 30% had revascularization with thrombolysis, thrombectomy, or both. Nearly 900 participants (25%) had moderate to severe stroke (National Institutes of Health Stroke Scale [NIHSS] score ≥ 11).
The primary outcome was a composite of recurrent ischemic stroke, symptomatic intracranial hemorrhage, unclassifiable stroke, or systemic embolism at 90 days. The initial analysis aimed to show noninferiority of early DOAC initiation, with a noninferiority margin of 2 percentage points, followed by testing for superiority.
Results showed that the primary outcome occurred in 3.3% of both groups (adjusted risk difference, 0.000; 95% CI, −0.011 to 0.012), with noninferiority criteria fulfilled. Superiority was not achieved.
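As a minimal illustration of that noninferiority-then-superiority logic, the sketch below applies the prespecified 2-percentage-point margin to the reported confidence interval. The confidence bounds and margin are as reported; the helper function itself is made up for illustration and is not the trial's covariate-adjusted analysis.

```python
# Minimal sketch of the noninferiority-then-superiority decision rule.
# The 95% CI bounds and the 2-percentage-point margin are as reported;
# the helper itself is illustrative, not the trial's adjusted analysis.

def assess(rd_lower, rd_upper, margin):
    """rd_lower/rd_upper: 95% CI for the risk difference (early minus
    delayed); margin: prespecified noninferiority margin."""
    noninferior = rd_upper < margin   # CI excludes the margin
    superior = rd_upper < 0.0         # CI lies entirely below zero
    return noninferior, superior

ni, sup = assess(rd_lower=-0.011, rd_upper=0.012, margin=0.02)
print(f"noninferior: {ni}, superior: {sup}")
# -> noninferior: True, superior: False
```

The upper bound of the reported interval (0.012, ie, 1.2 percentage points) sits below the 0.02 margin, establishing noninferiority, while the interval spans zero, so superiority is not shown.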
Symptomatic intracranial hemorrhage occurred in 0.6% of patients in the early DOAC initiation group vs 0.7% of those in the delayed group — a nonsignificant difference.
Applicable to Real-World Practice
A time-to-event analysis of the primary outcome showed that there were fewer outcomes in the first 30 days in the early DOAC initiation group, but the curves subsequently came together.
Subgroup analysis showed consistent results across the whole trial population, with no modification of the effect of early DOAC initiation according to stroke severity, reperfusion treatment, or previous anticoagulation.
Werring said that strengths of the OPTIMAS trial included a large sample size, a broad population with generalizability to real-world practice, and the inclusion of patients at higher bleeding risk than included in previous studies.
During the discussion, it was noted that the trial included few patients (about 3%) with very severe stroke (NIHSS score > 21), raising the question of whether the findings can be applied to this group.
Werring noted that there was no evidence of heterogeneity, and if anything, patients with more severe strokes may have had a slightly greater benefit with early DOAC initiation. “So my feeling is probably these results do generalize to the more severe patients,” he said.
In a commentary accompanying The Lancet publication of the OPTIMAS trial, Else Charlotte Sandset, MD, University of Oslo, in Norway, and Diana Aguiar de Sousa, MD, Central Lisbon University Hospital Centre, Lisbon, Portugal, noted that the “increasing body of evidence strongly supports the message that initiating anticoagulation early for patients with ischaemic stroke is safe. The consistent absence of heterogeneity in safety outcomes suggests that the risk of symptomatic intracranial haemorrhage is not a major concern, even in patients with large infarcts.”
Regardless of the size of the treatment effect, initiating early anticoagulation makes sense when it can be done safely, as it helps prevent recurrent ischemic strokes and other embolic events. Early intervention reduces embolization risk, particularly in high-risk patients, and allows secondary prevention measures to begin while patients are still hospitalized, they added.
CATALYST Findings
The CATALYST meta-analysis included four trials, namely, TIMING, ELAN, OPTIMAS, and START, of early versus later DOAC administration in a total of 5411 patients with acute ischemic stroke and AF. In this meta-analysis, early was defined as within 4 days of stroke and later as 5 days or more.
The primary outcome was a composite of ischemic stroke, symptomatic intracerebral hemorrhage, or unclassified stroke at 30 days. The rate was significantly lower in the early group (2.12%) than in the later group (3.02%), giving an odds ratio of 0.70 (95% CI, 0.50-0.98; P = .04).
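The reported odds ratio can be reproduced, to rounding, from the two event rates alone. The short sketch below shows the arithmetic; it is a back-of-the-envelope check, not the meta-analysis's actual trial-level pooling method.

```python
# Back-of-the-envelope check: the odds ratio implied by the two 30-day
# event rates. Reproduces the reported OR to rounding, but is not the
# meta-analysis's trial-level pooling.

def odds(p):
    return p / (1.0 - p)

p_early, p_later = 0.0212, 0.0302         # reported 30-day event rates
print(f"OR ~ {odds(p_early) / odds(p_later):.2f}")   # OR ~ 0.70
```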
The results were consistent across all subgroups, suggesting an advantage for early DOAC initiation.
Further analysis showed a clear benefit of early DOAC initiation on the ischemic stroke endpoint, with the curves separating early.
The rate of symptomatic intracerebral hemorrhage was low in both groups (0.45% in the early group and 0.40% in the later group) as was extracranial hemorrhage (0.45% vs 0.55%).
At 90 days, there were still lower event rates in the early group than the later one, but the difference was no longer statistically significant.
‘Practice Changing’ Results
Commenting on both studies, Craig Anderson, MD, The George Institute for Global Health, Sydney, Australia, who chaired the WSC session where the results of the OPTIMAS trial and the meta-analysis were presented, described these latest results as “practice changing.”
“When to start anticoagulation in acute ischemic stroke patients with AF has been uncertain for a long time. The dogma has always been that we should wait. Over the years, we’ve become a little bit more confident, but now we’ve got good data from randomized trials showing that early initiation is safe, with the meta-analysis showing benefit,” he said.
“These new data from OPTIMAS will reassure clinicians that there’s no excessive harm and, more importantly, no excessive harm across all patient groups. And the meta-analysis clearly showed an upfront benefit of starting anticoagulation early. That’s a very convincing result,” he added.
Anderson cautioned that there still may be concerns about starting DOACs early in some groups, including Asian populations that have a higher bleeding risk (these trials included predominantly White patients) and people who are older or frail, who may have extensive small vessel disease.
During the discussion, several questions centered on the lack of imaging data available on the patients in the studies. Anderson said imaging data would help reassure clinicians on the safety of early anticoagulation in patients with large infarcts.
“Stroke clinicians make decisions on the basis of the patient and on the basis of the brain, and we only have the patient information at the moment. We don’t have information on the brain — that comes from imaging.”
Regardless, he believes these new data will lead to a shift in practice. “But maybe, it won’t be as dramatic as we would hope because I think some clinicians may still hesitate to apply these results to patients at high risk of bleeding. With imaging data from the studies that might change.”
The OPTIMAS trial was funded by University College London and the British Heart Foundation. Werring reported consulting fees from Novo Nordisk, National Institute for Health and Care Excellence, and Alnylam; payments or speaker honoraria from Novo Nordisk, Bayer, and AstraZeneca/Alexion; participation on a data safety monitoring board for the OXHARP trial; and participation as steering committee chair for the MACE-ICH and PLINTH trials. Åsberg received institutional research grants and lecture fees to her institution from AstraZeneca, Boehringer Ingelheim, Bristol Myers Squibb, and Institut Produits Synthése. Sandset and de Sousa were both steering committee members of the ELAN trial. Anderson reported grant funding from Penumbra and Takeda China.
A version of this article appeared on Medscape.com.
FROM WSC 2024
Climate Change Linked to Lung Cancer in Never-Smokers
The incidence of lung cancer in never-smokers (LCINS) is increasing, and experts think climate change may be driving the uptick.
LCINS differs histologically and epidemiologically from smoking-related cancers, occurring almost always as adenocarcinomas and mostly affecting women and individuals of Asian ancestry, according to a study published in Nature Reviews Clinical Oncology in January 2024. Cases of LCINS are estimated to be the fifth most common cause of cancer-related deaths worldwide.
The potential culprits behind this uptick are varied and sometimes interrelated, and they underscore the need for continued emphasis on environmental hazards, the panelists agreed.
Focusing on climate change — and taking action at the individual level — is a good place to start, said Leticia M. Nogueira, PhD, scientific director of health services research in the Surveillance and Health Equity Science Department of the American Cancer Society.
Long-Term Exposure to Wildfires Linked to Increased Cancer Risk
Climate change is associated with climate-driven disasters such as more intense hurricanes and more frequent wildfires that can expose populations to environmental carcinogens, Nogueira explained.
Such weather events disrupt the care of patients with cancer and lead to poorer outcomes, according to her own research. They also contribute to the rising incidence of LCINS, she said.
In a population-based study published in The Lancet Planetary Health, long-term exposure to wildfires was associated with an increased risk for lung cancer and brain tumors. Individuals exposed to a wildfire within 50 km of their residential location in the prior decade had a 4.9% higher relative incidence of lung cancer and a 10% higher relative incidence of brain tumors.
“These findings are relevant on a global scale given the anticipated effects of climate change on wildfire frequency and severity,” the authors concluded, noting the study limitations and the need for further research.
How Clinicians Can Help
Nogueira urged attendees to take action to help improve healthcare outcomes.
“Let’s not forget that the healthcare system is one of the most emission-intensive industries in the world. Emissions from the US healthcare system exceed emissions from the entire UK, and we can be doing much better.
“There is something for each one of us here today to do: We can champion environmentally responsible efforts at our institutions, we can engage with disaster preparedness and response ... and we can document ongoing suffering to increase awareness and incentivize action,” she said.
In a commentary published in CA: A Cancer Journal for Clinicians, Nogueira and her colleagues further addressed the links between climate change and cancer and listed various sources of greenhouse gas emissions and proposed interventions, including those associated with the healthcare industry.
“If you look at this list and say ‘No way — there is no chance my institution will do any of that,’ let me ask you something: Are you allowed to smoke on campus? How do you think that happened? How do you think that started?” she said, invoking Archimedes’ famous quote, “Give me a lever long enough, and I shall move the world.”
“You most certainly have the power to make a difference,” Nogueira said. “So recognize where your points of influence are: move your lever, move the world.”
A version of this article appeared on Medscape.com.
Disc Degeneration in Chronic Low Back Pain: Can Stem Cells Help?
TOPLINE:
Allogeneic bone marrow–derived mesenchymal stromal cells (BM-MSCs) are safe but do not show efficacy in treating intervertebral disc degeneration (IDD) in patients with chronic low back pain.
METHODOLOGY:
- The RESPINE trial assessed the efficacy and safety of a single intradiscal injection of allogeneic BM-MSCs in the treatment of chronic low back pain caused by single-level IDD.
- Overall, 114 patients (mean age, 40.9 years; 35% women) with IDD-associated chronic low back pain persisting for 3 months or more despite conventional medical therapy and without previous surgery were recruited across four European countries from April 2018 to April 2021 and randomly assigned to receive either intradiscal injections of allogeneic BM-MSCs (n = 58) or sham injections (n = 56).
- The first co-primary endpoint was the rate of response to BM-MSC injections at 12 months after treatment, defined as an improvement of at least 20% or 20 mm on the Visual Analog Scale for pain or an improvement of at least 20% in the Oswestry Disability Index for functional status (illustrated in the sketch after this list).
- The second co-primary endpoint was structural efficacy, based on disc fluid content measured by quantitative T2 MRI between baseline and month 12.
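A minimal sketch of the composite responder rule follows. The function name, argument names, and example values are hypothetical; only the thresholds come from the trial description above.

```python
# Minimal sketch of the composite responder rule described above.
# Function and argument names and the example values are hypothetical;
# only the thresholds (>= 20% or >= 20 mm VAS; >= 20% ODI) come from
# the trial description.

def is_responder(vas_baseline, vas_12mo, odi_baseline, odi_12mo):
    """VAS in mm (0-100, higher = worse pain); ODI in % (0-100,
    higher = worse function). Improvement is a decrease."""
    vas_drop = vas_baseline - vas_12mo
    vas_ok = vas_drop >= 20 or (
        vas_baseline > 0 and vas_drop / vas_baseline >= 0.20
    )
    odi_ok = odi_baseline > 0 and (
        (odi_baseline - odi_12mo) / odi_baseline >= 0.20
    )
    return vas_ok or odi_ok

# Hypothetical patient: VAS 60 -> 45 mm (25% drop), ODI 40 -> 36 (10% drop)
print(is_responder(60, 45, 40, 36))   # True, via the VAS criterion
```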
TAKEAWAY:
- At 12 months post-intervention, 74% of patients in the BM-MSC group were classified as responders compared with 68.8% in the placebo group. However, the difference between the groups was not statistically significant.
- The probability of being a responder was higher in the BM-MSC group than in the sham group; however, the findings did not reach statistical significance.
- The average change in disc fluid content, indicative of disc regeneration, from baseline to 12 months was 37.9% in the BM-MSC group and 41.7% in the placebo group, with no significant difference between the groups.
- The incidence of adverse events and serious adverse events was not significantly different between the treatment groups.
IN PRACTICE:
“BM-MSC represents a promising opportunity for the biological treatment of IDD, but only high-quality randomized controlled trials, comparing it to standard care, can determine whether it is a truly effective alternative to spine fusion or disc replacement,” the authors wrote.
SOURCE:
The study was led by Yves-Marie Pers, MD, PhD, Clinical Immunology and Osteoarticular Diseases Therapeutic Unit, CHRU Lapeyronie, Montpellier, France. It was published online on October 11, 2024, in Annals of the Rheumatic Diseases.
LIMITATIONS:
MRI results were collected from only 55 patients across both trial arms, which may have affected the statistical power of the findings. Although patients were monitored for up to 24 months, the long-term efficacy and safety of BM-MSC therapy for IDD may not have been fully captured. Selection bias could not be excluded because of the difficulty in accurately identifying patients with chronic low back pain caused by single-level IDD.
DISCLOSURES:
The study was funded by the European Union’s Horizon 2020 Research and Innovation Programme. The authors declared no conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.
Duloxetine Bottles Recalled by FDA Because of Potential Carcinogen
The US Food and Drug Administration (FDA) has announced a voluntary manufacturer-initiated recall of more than 7000 bottles of duloxetine delayed-release capsules due to unacceptable levels of a potential carcinogen.
Duloxetine (Cymbalta) is a serotonin-norepinephrine reuptake inhibitor used to treat major depressive disorder, generalized anxiety disorder, fibromyalgia, chronic musculoskeletal pain, and neuropathic pain associated with diabetic peripheral neuropathy.
The recall is due to detection of the nitrosamine impurity N-nitroso duloxetine above the proposed interim limit.
Nitrosamines are common in water and foods, and exposure to some levels of the chemical is common. Exposure to nitrosamine impurities above acceptable levels and over long periods may increase cancer risk, the FDA reported.
“If drugs contain levels of nitrosamines above the acceptable daily intake limits, FDA recommends these drugs be recalled by the manufacturer as appropriate,” the agency noted on its website.
The recall was initiated by Breckenridge Pharmaceutical and covers 7107 bottles of 500-count, 20 mg duloxetine delayed-release capsules. The drug is manufactured by Towa Pharmaceutical Europe and distributed nationwide by BPI.
The affected bottles are from lot number 220128 with an expiration date of 12/2024 and NDC of 51991-746-05.
The recall was initiated on October 10 and is ongoing.
“Healthcare professionals can educate patients about alternative treatment options to medications with potential nitrosamine impurities if available and clinically appropriate,” the FDA advises. “If a medication has been recalled, pharmacists may be able to dispense the same medication from a manufacturing lot that has not been recalled. Prescribers may also determine whether there is an alternative treatment option for patients.”
The FDA has labeled this a “class II” recall, which the agency defines as “a situation in which use of or exposure to a violative product may cause temporary or medically reversible adverse health consequences or where the probability of serious adverse health consequences is remote.”
Nitrosamine impurities have prompted a number of drug recalls in recent years, including oral anticoagulants, metformin, and skeletal muscle relaxants.
The impurities may be found in drugs for a number of reasons, the agency reported. The source may be from a drug’s manufacturing process, chemical structure, or the conditions under which it is stored or packaged.
A version of this article appeared on Medscape.com.
Mepivacaine Reduces Pain During IUD Placement in Nulliparous Women
TOPLINE:
Mepivacaine instillation significantly reduced pain during intrauterine device (IUD) placement in nulliparous women. More than 90% of women in the intervention group reported tolerable pain compared with 80% of those in the placebo group.
METHODOLOGY:
- A double-blind, randomized, placebo-controlled trial was conducted at 12 centers in Sweden and involved 151 nulliparous women aged 18-31 years.
- Participants were randomly assigned to receive either 10 mL of mepivacaine 20 mg/mL or 10 mL of sodium chloride 9 mg/mL (0.9% saline; placebo) through a hydrosonography catheter 2 minutes before IUD placement.
- Pain scores were measured using a 100-mm visual analog scale (VAS) at baseline, after instillation, during IUD placement, and 10 minutes post placement.
- The primary outcome was the difference in VAS pain scores during IUD placement between the intervention and placebo groups.
TAKEAWAY:
- Mepivacaine instillation resulted in a statistically significant reduction in mean VAS pain scores during IUD placement, with a mean difference of 13.3 mm (95% CI, 5.75-20.87; P < .001).
- After adjusting for provider impact, the mean VAS pain score difference remained significant at 12.2 mm (95% CI, 4.85-19.62; P < .001).
- A higher proportion of women in the mepivacaine group than in the placebo group reported tolerable pain during IUD placement (93.3% vs 80.3%; P = .021); a rough reconstruction of this comparison is sketched after this list.
- No serious adverse effects were associated with mepivacaine instillation, and there were no cases of uterine perforation in either group.
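As referenced above, the sketch below reconstructs the tolerable-pain comparison from the reported percentages. The per-arm denominators (75 and 76, summing to the 151 randomized) are an assumed near-even split, not figures taken from the paper, so the result is only an approximation of the published test.

```python
# Approximate re-check of the tolerable-pain comparison using a
# Fisher exact test. Denominators are assumed (75 vs 76), chosen so
# that 70/75 = 93.3% and 61/76 = 80.3% match the reported percentages.
from scipy.stats import fisher_exact

table = [
    [70, 75 - 70],   # mepivacaine: tolerable vs not (assumed n = 75)
    [61, 76 - 61],   # placebo: tolerable vs not (assumed n = 76)
]
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, two-sided P = {p_value:.3f}")
# Lands in the neighborhood of the reported P = .021.
```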
IN PRACTICE:
“We argue that the pain reduction in our study is clinically important as a greater proportion of women in our intervention group, compared to the placebo group, reported tolerable pain during placement and to a higher extent rated the placement as easier than expected and expressed a willingness to choose IUD as contraception again,” the authors of the study wrote.
SOURCE:
This study was led by Niklas Envall, PhD; Karin Elgemark, MD; and Helena Kopp Kallner, MD, PhD, of the Department of Clinical Sciences, Danderyd Hospital, Karolinska Institutet in Stockholm, Sweden. It was published online in the American Journal of Obstetrics & Gynecology.
LIMITATIONS:
This study’s limitations included the exclusive focus on one type of IUD (LNG-IUS 52 mg, 4.4 mm), which may limit generalizability to other IUD types. Additionally, only experienced providers participated, which may not reflect settings with less experienced providers. Factors such as anticipated pain and patient anxiety were not systematically assessed, potentially influencing pain perception.
DISCLOSURES:
Envall received personal fees from Bayer for educational activities and an honorarium from Medsphere Corp USA for expert opinions on long-acting reversible contraception. Kallner received honoraria for consultancy work and lectures from multiple pharmaceutical companies, including AbbVie, Actavis, Bayer, and others. The study was funded by the Swedish Research Council. Additional disclosures are noted in the original article.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.