Small Bowel Dysmotility Brings Challenges to Patients With Systemic Sclerosis

TOPLINE:

Patients with systemic sclerosis (SSc) who exhibit abnormal small bowel transit are more likely to be men, experience more severe cardiac involvement, have a higher mortality risk, and show fewer sicca symptoms.

METHODOLOGY:

  • Researchers enrolled 130 patients with SSc and gastrointestinal (GI) symptoms (mean age at symptom onset, 56.8 years; 90% women; 81% White) who were seen at the Johns Hopkins Scleroderma Center, Baltimore, from October 2014 to May 2022.
  • Clinical data and serum samples were longitudinally collected from all actively followed patients at the time of enrollment and every 6 months thereafter (median disease duration, 8.4 years).
  • Participants underwent whole gut transit scintigraphy for the assessment of small bowel motility.
  • A cross-sectional analysis compared the clinical features of patients with (n = 22; mean age at symptom onset, 61.4 years) and without (n = 108; mean age at symptom onset, 55.8 years) abnormal small bowel transit.

TAKEAWAY:

  • Men with SSc (odds ratio [OR], 3.70; P = .038) and those with severe cardiac involvement (OR, 3.98; P = .035) were more likely to have abnormal small bowel transit.
  • Sicca symptoms were negatively associated with abnormal small bowel transit in patients with SSc (adjusted OR, 0.28; P = .043).
  • Patients with abnormal small bowel transit reported significantly worse scores on a patient-reported outcome measure (P = .028) and worse social functioning (P = .015) than those with normal transit.
  • A multivariate analysis showed that patients with abnormal small bowel transit had higher mortality than those with a normal transit (adjusted hazard ratio, 5.03; P = .005).

IN PRACTICE:

“Our findings improve our understanding of risk factors associated with abnormal small bowel transit in SSc patients and shed light on the lived experience of patients with this GI [gastrointestinal] complication,” the authors wrote. “Overall, these findings are important for patient risk stratification and monitoring and will help to identify a more homogeneous group of patients for future clinical and translational studies,” they added.

SOURCE:

The study was led by Jenice X. Cheah, MD, University of California, Los Angeles. It was published online on October 7, 2024, in Rheumatology.

LIMITATIONS:

The study may be subject to referral bias because it was conducted at a tertiary referral center and therefore may have included patients with more severe disease. Furthermore, the study was retrospective, and whole gut transit studies were not conducted in all patients seen at the center. Additionally, the cross-sectional design limited the ability to establish causality between clinical features and abnormal small bowel transit.

DISCLOSURES:

The study was supported by grants from the National Institute of Arthritis and Musculoskeletal and Skin Diseases. The authors declared no conflicts of interest.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.

A Single Jog Can Improve Glucose Metabolism in Young Adults

TOPLINE:

In healthy young adults, a single 30-minute bout of outdoor aerobic exercise significantly reduces fasting and 1-hour glucose levels during an oral glucose tolerance test (OGTT) the next day and improves insulin sensitivity.

METHODOLOGY:

  • Recent studies have identified 1-hour post-load glucose concentration during an OGTT as a specific and early predictor of diabetes, and exercise has long been known for its metabolic benefits in people with and without diabetes.
  • The researchers investigated the effect of a single bout of aerobic exercise on 1-hour post-load glucose levels during an OGTT in 32 young, healthy, normal-weight or marginally overweight individuals (mean age, 35 years; 14 women and 18 men) with a sedentary or moderately active lifestyle.
  • The participants underwent an initial OGTT after at least 4 days of physical inactivity, followed by a second OGTT the day after a single 30-minute bout of aerobic exercise.
  • The exercise session consisted of a light jog for 30 minutes, monitored using a metabolic holter to quantify energy expenditure and exercise intensity. The participants did not undertake any exercise outside the lab sessions.
  • Blood glucose levels were measured, and insulin sensitivity and secretion were estimated using surrogate indices derived from OGTT glucose and insulin assays, including the Matsuda index, oral glucose insulin sensitivity (OGIS) index, and quantitative insulin sensitivity check index (QUICKI), as well as the homeostasis model assessment (HOMA) of insulin resistance (HOMA-IR) and of beta-cell function (HOMA-B); standard definitions of these indices are shown after this list.
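
For readers unfamiliar with these surrogate indices, the standard definitions are reproduced below, with fasting glucose G0 in mg/dL, fasting insulin I0 in µIU/mL, and Ḡ and Ī the mean glucose and insulin during the OGTT. These are the conventional formulas, not necessarily the exact variants the investigators used:

\[
\text{HOMA-IR} = \frac{G_0 \times I_0}{405}, \qquad
\text{HOMA-B} = \frac{360 \times I_0}{G_0 - 63}
\]

\[
\text{Matsuda index} = \frac{10{,}000}{\sqrt{G_0 \times I_0 \times \bar{G} \times \bar{I}}}, \qquad
\text{QUICKI} = \frac{1}{\log_{10} G_0 + \log_{10} I_0}
\]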

TAKEAWAY:

  • Fasting and 1-hour post-load glucose levels during the OGTT were significantly lower the day after the exercise bout than at baseline.
  • Postexercise insulin levels also were significantly lower 1 hour after the glucose load, decreasing from 57.4 µIU/mL at baseline to 43.5 µIU/mL the day after exercise (P = .01).
  • Insulin sensitivity improved significantly after exercise, as indicated by increases in the Matsuda index (P = .02) and OGIS index (P = .04), along with a reduction in insulin resistance (P = .04).
  • The study found a trend toward increased beta-cell function the day after an exercise bout, as indicated by a nonsignificant increase in HOMA-B from 144.7 at baseline to 167.1 after exercise.

IN PRACTICE:

“Improvement in 1-hour post-load plasma glucose following a single session of aerobic physical activity suggests that exercise could have a direct effect on T2D [type 2 diabetes] risk and cardiovascular risk,” the authors wrote.

SOURCE:

The study was led by Simona Moffa, Fondazione Policlinico Universitario Agostino Gemelli IRCCS, Rome, and Gian Pio Sorice, Università Degli Studi di Bari “Aldo Moro,” Bari, Italy. It was published online in the Journal of Endocrinological Investigation.

LIMITATIONS:

The study had a limited sample size, which may affect the generalizability of the findings. C-peptide levels, which could have provided additional insights into insulin secretion, were not assessed in the study.

DISCLOSURES:

The study was supported by grants from Università Cattolica del Sacro Cuore. The authors declared no conflicts of interest.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.

Fewer Recurrent Cardiovascular Events Seen With TNF Inhibitor Use in Axial Spondyloarthritis

TOPLINE:

Tumor necrosis factor (TNF) inhibitors are associated with a reduced risk for recurrent cardiovascular events in patients with radiographic axial spondyloarthritis (axSpA) and a history of cardiovascular events.

METHODOLOGY:

  • The researchers conducted a nationwide cohort study using data from the Korean National Claims Database, including 413 patients diagnosed with cardiovascular events following a radiographic axSpA diagnosis.
  • Of all patients, 75 received TNF inhibitors (mean age, 51.9 years; 92% men) and 338 did not receive TNF inhibitors (mean age, 60.7 years; 74.9% men).
  • Patients were followed from the date of the first cardiovascular event to the date of recurrence, the last date with claims data, or up to December 2021.
  • The study outcome was recurrent cardiovascular events, defined as myocardial infarction or stroke occurring more than 28 days after the first event.
  • The effect of TNF inhibitor exposure on the risk for recurrent cardiovascular events was assessed using an inverse probability weighted Cox regression analysis; a minimal sketch of this approach appears after this list.
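
To make the analytic approach concrete, here is a minimal sketch of inverse probability weighted Cox regression in Python. This is not the authors' code: the file name, column names, and covariates are hypothetical, and a real analysis would adjust for many more confounders.

import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

# Hypothetical analysis file: one row per patient with a prior
# cardiovascular event; all column names are illustrative.
df = pd.read_csv("axspa_cohort.csv")
covariates = ["age", "male_sex"]

# 1) Propensity score: modeled probability of TNF inhibitor exposure.
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["tnf_exposed"])
ps = ps_model.predict_proba(df[covariates])[:, 1]

# 2) Inverse probability of treatment weights: exposed patients get 1/ps,
#    unexposed get 1/(1 - ps), balancing measured covariates across groups.
df["iptw"] = df["tnf_exposed"] / ps + (1 - df["tnf_exposed"]) / (1 - ps)

# 3) Weighted Cox model with robust variance; the coefficient on
#    tnf_exposed estimates the hazard ratio for recurrent events.
cph = CoxPHFitter()
cph.fit(
    df[["time_to_event", "event", "tnf_exposed", "iptw"]],
    duration_col="time_to_event",
    event_col="event",
    weights_col="iptw",
    robust=True,
)
cph.print_summary()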

TAKEAWAY:

  • The incidence of recurrent cardiovascular events in patients with radiographic axSpA was 32 per 1000 person-years.
  • The incidence was 19 per 1000 person-years in the patients exposed to TNF inhibitors, whereas it was 36 per 1000 person-years in those not exposed to TNF inhibitors.
  • Exposure to TNF inhibitors was associated with a 67% lower risk for recurrent cardiovascular events than non-exposure (P = .038).

IN PRACTICE:

“Our data add to previous knowledge by providing more direct evidence that TNFi [tumor necrosis factor inhibitors] could reduce the risk of recurrent cardiovascular events,” the authors wrote.

SOURCE:

The study was led by Oh Chan Kwon, MD, PhD, and Hye Sun Lee, PhD, Yonsei University College of Medicine, Seoul, South Korea. It was published online on October 4, 2024, in Arthritis Research & Therapy.

LIMITATIONS:

The lack of data on certain cardiovascular risk factors such as obesity, smoking, and lifestyle may have led to residual confounding. The patient count in the TNF inhibitor exposure group was not adequate to analyze each TNF inhibitor medication separately. The study included only Korean patients, limiting the generalizability to other ethnic populations. The number of recurrent stroke events was relatively small, making it infeasible to analyze myocardial infarction and stroke separately.

DISCLOSURES:

The study was funded by Yuhan Corporation as part of its “2023 Investigator Initiated Translation Research Program.” The authors declared no conflicts of interest.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.

Eggs: A Weighty Matter for Postmenopausal Women?

TOPLINE:

Eating more eggs is linked to weight gain in postmenopausal women, especially those with a high intake of Western foods such as processed and red meat, French fries, sweets, and desserts. Genetic predisposition to a high body mass index (BMI) also influences weight gain with higher egg intake.

METHODOLOGY:

  • Egg consumption and elevated body weight are each linked to an increased risk for serious chronic diseases; however, whether elevated body weight mediates the association between egg intake and an elevated risk for serious chronic diseases needs further assessment.
  • To investigate the association between eating eggs and weight gain, as well as the role of genetic susceptibility to an elevated BMI, researchers conducted a prospective study including 4439 postmenopausal women of European descent from the Women’s Health Initiative (WHI).
  • They measured the participants’ consumption of eggs and egg nutrients using a self-administered food frequency questionnaire.
  • Changes in the consumption of eggs and egg nutrients such as cholesterol, choline, and betaine were assessed between baseline and follow-up visits at 1, 3, 6, and 9 years.
  • The primary outcome was the change in body weight between baseline and each follow-up visit. Furthermore, an exploratory analysis evaluated how eating Western foods and genetic predisposition to a high BMI, assessed through a polygenic score (see the note after this list), influenced weight change.
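
As background, a polygenic score is a weighted sum over genotyped variants. The formula below is the standard construction; the specific variant set and weights used in the WHI analysis are not detailed here:

\[
\mathrm{PGS}_j = \sum_{i} \beta_i \, x_{ij},
\]

where x_ij ∈ {0, 1, 2} counts the BMI-increasing alleles that woman j carries at variant i, and beta_i is that variant's effect size from a reference genome-wide association study of BMI.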

TAKEAWAY:

  • An increased consumption of eggs was associated with weight gain, showing a positive linear trend at 1, 3, and 6 years. By the third year, women who had increased their egg consumption by two eggs per week gained 0.70 kg more weight (P = .0002) than women who had reduced their egg consumption by 2.4 eggs per week (P-linear < .0001).
  • An increase in the consumption of nutrients obtained from eggs, including cholesterol (P-linear < .0001) and choline (P-linear < .02), was positively associated with weight gain.
  • Women with a higher consumption of Western foods showed significant associations between changes in egg, cholesterol, and choline intake and weight gain.
  • A significant association was found between the BMI polygenic score and changes in body weight, with women most genetically predisposed to a higher BMI showing greater weight gain when their egg consumption increased by an average of 3.5 eggs per week.

IN PRACTICE:

“These results suggest that even relatively small increases or decreases in egg consumption could cause increases or decreases, respectively, in body weight among postmenopausal women, unless there are adequate compensating changes in factors such as dietary energy intake or physical activity,” the authors wrote. “Our results require confirmation,” they added.

SOURCE:

This study, led by James A. Greenberg, Department of Health and Nutrition Sciences, Brooklyn College, The City University of New York, was published online in Clinical Nutrition.

LIMITATIONS: 

This observational study was susceptible to residual confounding, meaning the observed associations may not be causal. Additionally, the results were based on data from postmenopausal American women of European descent, limiting generalizability to other populations.

DISCLOSURES:

The WHI program was funded by the National Heart, Lung, and Blood Institute, National Institutes of Health, Department of Health & Human Services. The authors declared no conflicts of interest.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.

Underutilized Mifepristone Shows Promise in Care of Early Pregnancy Loss

TOPLINE:

Mifepristone plus misoprostol reduces the need for subsequent uterine aspiration and emergency department visits in the management of early pregnancy loss. Despite its effectiveness, mifepristone remains underutilized, with only 8.6% of patients receiving it in 2022.

METHODOLOGY:

  • Researchers conducted a retrospective cohort study using national insurance claims data of US patients with commercial insurance.
  • More than 31,000 pregnant women (mean age, 32.7 years) with a diagnosis of early pregnancy loss between 2015 and 2022 were included.
  • Patients had a diagnosis of missed abortion (72.3%), spontaneous abortion (26.9%), or both (0.8%).
  • Researchers compared the outcomes of individuals who received a combination of mifepristone and misoprostol vs those who received misoprostol alone. The outcome measures included the need for subsequent procedural management (uterine aspiration), return visits to the emergency department or an outpatient clinic, hospitalizations, and complications within 6 weeks of initial diagnosis.

TAKEAWAY:

  • The use of mifepristone was more common in outpatient clinics than in emergency departments (3.4% vs 0.9%; P < .001).
  • The use of mifepristone plus misoprostol vs misoprostol alone was linked to a lower incidence of subsequent procedural management (10.5% vs 14.0%; P = .002) and fewer emergency department visits (3.5% vs 7.9%; P < .001).
  • The multivariable analysis showed that the use of mifepristone was linked to decreased odds of subsequent procedural management (adjusted odds ratio, 0.71; 95% CI, 0.57-0.87).
  • Despite its effectiveness, mifepristone was used in only 8.6% of those receiving medication management for early pregnancy loss in 2022.

IN PRACTICE:

“Continued efforts are needed to reduce barriers to mifepristone use for medication management of EPL,” the authors wrote.

“Any practitioner who cares for patients experiencing early pregnancy loss should consider mifepristone pretreatment to misoprostol to be the standard of care for medication management. Provision of the evidence-based standard of care with the use of mifepristone for early pregnancy loss is an opportunity to advocate for an essential strategy in improving sexual and reproductive health in the US,” wrote Sarita Sonalkar, MD, MPH, and Rachel McKean, MD, MPH, of the Perelman School of Medicine at the University of Pennsylvania, Philadelphia, in an invited commentary.

SOURCE:

The study was led by Lyndsey S. Benson, MD, MS, of the University of Washington School of Medicine, Seattle, and was published online in JAMA Network Open.

LIMITATIONS:

The study was limited by the accuracy of the diagnosis of early pregnancy loss and procedure codes because claims data are intended for billing purposes and may be incomplete or inaccurate. The use of de-identified data meant that specific gestational durations, exact dosing, or routes of misoprostol administration could not be determined. The findings may not be generalizable to those with public insurance or no insurance.

DISCLOSURES:

The study was supported in part by a Women’s Reproductive Health Research grant from the National Institutes of Health Eunice Kennedy Shriver National Institute of Child Health and Human Development. One author reported serving as an adviser and investigator, while another reported receiving personal fees and serving as an expert witness, contributing editor, and course instructor outside the submitted work.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.

Is Metformin An Unexpected Ally Against Long COVID?

TOPLINE:

Metformin use in adults with type 2 diabetes (T2D) is associated with a slightly lower incidence of long COVID and death within 180 days after SARS-CoV-2 infection.

METHODOLOGY:

  • Previous studies have shown that metformin use before and during SARS-CoV-2 infection reduces severe COVID-19 and postacute sequelae of SARS-CoV-2 (PASC), also referred to as long COVID, in adults.
  • A retrospective cohort analysis was conducted to evaluate the association between metformin use before and during SARS-CoV-2 infection and the subsequent incidence of PASC.
  • Researchers used data from the National COVID Cohort Collaborative (N3C) and National Patient-Centered Clinical Research Network (PCORnet) electronic health record (EHR) databases to identify adults (age, ≥ 21 years) with T2D prescribed a diabetes medication within the past 12 months.
  • Participants were categorized into those using metformin (metformin group) and those using other noninsulin diabetes medications such as sulfonylureas, dipeptidyl peptidase-4 inhibitors, or thiazolidinediones (the comparator group); those who used glucagon-like peptide 1 receptor agonists or sodium-glucose cotransporter-2 inhibitors were excluded.
  • The primary outcome was the incidence of PASC or death within 180 days after SARS-CoV-2 infection, defined using the International Classification of Diseases (ICD-10) U09.9 diagnosis code and/or a computable phenotype, ie, a predicted probability > 75% for PASC from a machine learning model trained on patients diagnosed using U09.9; a minimal sketch of this idea appears after this list.
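
As an illustration only (this is not the N3C/PCORnet implementation, and the file name, feature names, and classifier choice are hypothetical), a computable phenotype of this kind can be sketched as follows: train a classifier to predict the U09.9 label from EHR features, then flag patients whose predicted probability exceeds 0.75.

import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical feature table: one row per adult with T2D and
# SARS-CoV-2 infection; all column names are illustrative.
ehr = pd.read_csv("ehr_features.csv")
features = ["n_visits", "fatigue_dx", "dyspnea_dx", "age"]

# Train on the coded outcome: 1 if the patient ever received a
# U09.9 (post COVID-19 condition) diagnosis code, else 0.
model = GradientBoostingClassifier().fit(ehr[features], ehr["has_u099_code"])

# Score everyone; patients above the 75% probability threshold
# meet the computable phenotype even without a recorded code.
ehr["pasc_prob"] = model.predict_proba(ehr[features])[:, 1]
ehr["pasc_computable_phenotype"] = ehr["pasc_prob"] > 0.75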

TAKEAWAY:

  • Researchers identified 51,385 and 37,947 participants from the N3C and PCORnet datasets, respectively.
  • In the N3C dataset, metformin use was associated with a 21% lower risk for death or PASC than non-metformin use when PASC was defined by the U09.9 diagnosis code (P < .001) and a 15% lower risk when defined by the computable phenotype (P < .001).
  • In the PCORnet dataset, the risk for death or PASC was 13% lower using the U09.9 diagnosis code (P = .08) with metformin use vs non-metformin use, whereas the risk did not differ significantly between the groups when using the PASC computable phenotype (P = .58).
  • The incidence of PASC defined by the U09.9 diagnosis code was similar for the metformin and comparator groups across the two datasets (1.6% and 2.0%, respectively, in N3C; 2.2% and 2.6% in PCORnet).
  • However, when using the computable phenotype, the incidence rates of PASC for the metformin and comparator groups were 4.8% and 5.2% in N3C and 25.2% and 24.2% in PCORnet, respectively.

IN PRACTICE:

“The incidence of PASC was lower when defined by [International Classification of Diseases] code, compared with a computable phenotype in both databases,” the authors wrote. “This may reflect the challenges of clinical care for adults needing chronic medication management and the likelihood of those adults receiving a formal PASC diagnosis.” 

SOURCE:

The study was led by Steven G. Johnson, PhD, Institute for Health Informatics, University of Minnesota, Minneapolis. It was published online in Diabetes Care.

LIMITATIONS:

The use of EHR data had several limitations, including the inability to examine a dose-dependent relationship and the lack of information on whether medications were taken before, during, or after the acute infection. The outcome definition involved the need for a medical encounter and, thus, may not capture data on all patients experiencing symptoms of PASC. The analysis focused on the prevalent use of chronic medications, limiting the assessment of initiating metformin in those diagnosed with COVID-19.

DISCLOSURES:

The study was supported by the National Institutes of Health Agreement as part of the RECOVER research program. One author reported receiving salary support from the Center for Pharmacoepidemiology and owning stock options in various pharmaceutical and biopharmaceutical companies. Another author reported receiving grant support and consulting contracts, being involved in expert witness engagement, and owning stock options in various pharmaceutical, biopharmaceutical, diabetes management, and medical device companies.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.

Vitamin D in Pregnancy Results in Stronger Bones for Kids

Article Type
Changed
Wed, 10/09/2024 - 11:40

 

TOPLINE:

Gestational supplementation with 1000 IU/d of cholecalciferol (vitamin D3) from early pregnancy until delivery increases bone mineral content, bone mineral density (BMD), and bone mineral apparent density in children at age 6-7 years.

METHODOLOGY:

  • The double-blinded, placebo-controlled MAVIDOS trial of gestational vitamin D supplementation previously showed increased BMD at age 4 years (but no difference at birth), and it is unclear how the effect may persist or change over time.
  • In the original trial, researchers randomized 1134 pregnant women with singleton pregnancies from three UK hospitals from 2008 to 2014, and the 723 children born to mothers recruited in Southampton were invited to continue in offspring follow-up.
  • Mothers were randomly assigned to receive either 1000 IU/d of vitamin D or placebo from 14-17 weeks’ gestation until delivery; women in the placebo arm could take up to 400 IU/d of vitamin D.
  • In this post hoc analysis, among 454 children who were followed up at age 6-7 years, 447 had a usable whole body and lumbar spine dual-energy x-ray absorptiometry scan (placebo group: n = 216; 48% boys; 98% White mothers; vitamin D group: n = 231; 56% boys; 96% White mothers).
  • Offspring follow-up measures at birth and 4 and 6-7 years were bone area, bone mineral content, BMD, and bone mineral apparent density, derived from a dual-energy x-ray absorptiometry scan of whole body less head (WBLH), as well as fat and lean mass.

TAKEAWAY:

  • The effect of gestational vitamin D supplementation on bone outcomes in children was similar at ages 4 and 6-7 years.
  • At age 6-7 years, gestational vitamin D supplementation resulted in higher WBLH bone mineral content (0.15 SD; 95% CI, 0.04-0.26) and BMD (0.18 SD; 95% CI, 0.06-0.31) than placebo; effects are expressed as standardized differences in SD units (see the sketch after this list).
  • The WBLH bone mineral apparent density (0.18 SD; 95% CI, 0.04-0.32) was also higher in the vitamin D group.
  • The lean mass was greater in the vitamin D group (0.09 SD; 95% CI, 0.00-0.17) than in the placebo group.
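A minimal sketch of how a between-group difference is expressed in SD units, using invented numbers rather than MAVIDOS data (the trial's estimates likely come from adjusted regression models, so this pooled-SD version is a simplification):

    # Standardize a between-group difference into SD units; all values are
    # illustrative stand-ins, not trial data.
    import numpy as np

    rng = np.random.default_rng(1)
    placebo = rng.normal(500.0, 50.0, size=216)    # illustrative BMC values (g)
    vitamin_d = rng.normal(507.0, 50.0, size=231)

    n1, n2 = len(placebo), len(vitamin_d)
    pooled_sd = np.sqrt((placebo.var(ddof=1) * (n1 - 1) +
                         vitamin_d.var(ddof=1) * (n2 - 1)) / (n1 + n2 - 2))
    effect_sd_units = (vitamin_d.mean() - placebo.mean()) / pooled_sd
    print(round(effect_sd_units, 2))  # on the order of the 0.15 SD reported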

IN PRACTICE:

“These findings suggest that pregnancy vitamin D supplementation may be an important population health strategy to improve bone health,” the authors wrote.

SOURCE:

This study was led by Rebecca J. Moon, PhD, MRC Lifecourse Epidemiology Centre, University of Southampton, Southampton General Hospital, England. It was published online in The American Journal of Clinical Nutrition.

LIMITATIONS: 

Only individuals with baseline vitamin D levels of 25-100 nmol/L were eligible, excluding those with severe deficiency who might have benefited the most from supplementation. The participants were mostly White, well-educated, and commonly overweight, which may limit generalizability to other populations. Only 47% of the original cohort participated in the follow-up phase. Differences in maternal age, smoking status, and education between participants who remained in the study and those who did not may have introduced bias and affected generalizability.

DISCLOSURES:

The study was supported by Versus Arthritis UK, the Medical Research Council, the Bupa Foundation, the National Institute for Health and Care Research Southampton Biomedical Research Centre, and other sources. Some authors disclosed receiving travel reimbursement, speaker or lecture fees, honoraria, research funding, or personal or consultancy fees from the Alliance for Better Bone Health and various pharmaceutical, biotechnology, medical device, healthcare, and food and nutrition companies outside the submitted work.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.

Frailty, Not Just Advanced Age, Affects ANCA Vasculitis Outcomes

Article Type
Changed
Tue, 10/08/2024 - 15:07

 

TOPLINE:

Older adults with antineutrophil cytoplasmic antibody (ANCA)–associated vasculitis face higher risks for end-stage renal disease or death and for severe infections. However, frailty, more than age, predicts severe infections within 2 years of diagnosis.

METHODOLOGY:

  • Researchers conducted a retrospective cohort study using data from the Mass General Brigham ANCA-associated vasculitis cohort in the United States.
  • They included 234 individuals (median age, 75 years) with incident ANCA-associated vasculitis who were treated from January 2002 to December 2019.
  • Baseline frailty was measured using a claims-based frailty index, with data collected in the year before treatment initiation; individuals were categorized as nonfrail, prefrail, mildly frail, or moderately to severely frail (see the categorization sketch after this list).
  • Frailty, either mild or moderate to severe, was noted in 44 of 118 individuals aged ≥ 75 years and in 25 of 116 individuals aged 65-74 years.
  • The outcomes of interest were the incidences of end-stage renal disease or death and severe infections within 2 years of diagnosis. The association of age and frailty with clinical outcomes was assessed in those aged 65-74 years and ≥ 75 years.
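A minimal categorization sketch is below. The cutpoints follow commonly cited claims-based frailty index thresholds and are an assumption; this summary does not give the study's exact cutoffs.

    # Hypothetical claims-based frailty index (CFI) categorization; the
    # 0.15/0.25/0.35 cutpoints are assumed, not taken from the paper.
    def frailty_category(cfi: float) -> str:
        if cfi < 0.15:
            return "nonfrail"
        if cfi < 0.25:
            return "prefrail"
        if cfi < 0.35:
            return "mildly frail"
        return "moderately to severely frail"

    print(frailty_category(0.28))  # "mildly frail"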

TAKEAWAY:

  • Frailty was a significant predictor of severe infections within 2 years of ANCA-associated vasculitis diagnosis (adjusted hazard ratio [aHR], 8.46; 95% CI, 3.95-18.14), showing a stronger association than seen for chronological age ≥ 75 years (aHR, 2.52; 95% CI, 1.26-5.04).
  • The incidence of severe infections was higher in those with vs without frailty both among those aged 65-74 years (38.9 vs 0.8 cases per 100 person-years) and among those aged ≥ 75 years (61.9 vs 12.3 cases per 100 person-years); see the rate arithmetic after this list.
  • Older age (≥ 75 years) was associated with an increased risk for end-stage renal disease or death (aHR, 4.50; 95% CI, 1.83-11.09); however, frailty was not.
  • The effect of frailty on end-stage renal disease or death varied by age, with a larger difference observed in individuals aged 65-74 years (frail vs nonfrail, 7.5 vs 2.0 cases per 100 person-years) than in those aged ≥ 75 years (13.5 vs 16.0 cases per 100 person-years).
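The rates above are simple person-time rates. A worked line of arithmetic, with made-up counts chosen only to land near the reported scale:

    # Cases per 100 person-years; the counts below are illustrative, not
    # figures from the study.
    def rate_per_100_py(cases: int, person_years: float) -> float:
        return cases / person_years * 100.0

    print(round(rate_per_100_py(7, 18.0), 1))  # 38.9 cases per 100 person-years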

IN PRACTICE:

“Our results highlight the fact that assessment of frailty in the care of older adults with ANCA-associated vasculitis can distinguish a group of patients at increased risk of severe infections who might benefit from interventions to minimize infection risk and optimize outcomes,” the authors wrote.

“Incorporating frailty screening into the management of ANCA-associated vasculitis provides an opportunity to offer personalized, evidence-based care to frail older adults,” Alexandra Legge, MD, Dalhousie University, Halifax, Nova Scotia, Canada, wrote in an associated comment published online in The Lancet Rheumatology.

SOURCE:

This study was led by Sebastian E. Sattui, MD, Division of Rheumatology and Clinical Immunology, Department of Medicine, University of Pittsburgh in Pennsylvania. It was published online on September 18, 2024, in The Lancet Rheumatology.

LIMITATIONS:

The study’s observational design and single-center setting limited the generalizability of the findings. Residual confounding may have been possible despite adjustments for relevant baseline factors. The requirement of at least one healthcare encounter before baseline may have underrepresented individuals without frailty. Differences in treatment patterns after the baseline period were not controlled for.

DISCLOSURES:

The study was supported by the National Institutes of Health and the National Institute of Arthritis and Musculoskeletal and Skin Diseases. Some authors reported receiving grants, research support, consulting fees, honoraria, or royalties or serving on advisory boards of pharmaceutical companies and other sources outside of the submitted work.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.

Hypothyroidism Treatment Does Not Affect Cognitive Decline in Menopausal Women

Article Type
Changed
Fri, 10/04/2024 - 10:54

 

TOPLINE:

Women with hypothyroidism treated with levothyroxine show no significant cognitive decline across the menopausal transition compared with those without thyroid disease.

METHODOLOGY:

  • Levothyroxine, the primary treatment for hypothyroidism, has been linked to perceived cognitive deficits, yet it is unclear whether this is due to the underlying condition being inadequately treated or other factors.
  • Using data collected from the Study of Women’s Health Across the Nation, which encompasses five ethnic/racial groups from seven centers across the United States, researchers compared cognitive function over time between women with levothyroxine-treated hypothyroidism and those without thyroid disease (a model sketch follows this list).
  • Participants underwent cognitive testing across three domains — processing speed, working memory, and episodic memory — which were assessed over a mean follow-up of 13 years.
  • Further analyses assessed the impact of abnormal levels of thyroid-stimulating hormone on cognitive outcomes.
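The longitudinal comparison described above is consistent with a linear mixed-effects analysis; the exact model is not specified in this summary, so the sketch below, with simulated data and invented variable names, is an assumption about the general approach:

    # Hedged sketch: simulated repeated cognitive scores analyzed with a
    # random-intercept mixed model; a group-by-time interaction tests whether
    # decline differs between levothyroxine-treated and comparison women.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n, visits = 100, 5
    df = pd.DataFrame({
        "id": np.repeat(np.arange(n), visits),
        "years": np.tile(np.arange(visits) * 3.0, n),
        "levothyroxine": np.repeat(rng.integers(0, 2, size=n), visits),
    })
    df["score"] = (55 - 0.2 * df["years"] + 1.5 * df["levothyroxine"]
                   + rng.normal(0, 3, size=len(df)))

    model = smf.mixedlm("score ~ years * levothyroxine", df,
                        groups=df["id"]).fit()
    print(model.summary())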

TAKEAWAY:

  • Of 2033 women included, 227 (mean age, 49.8 years) had levothyroxine-treated hypothyroidism and 1806 (mean age, 50.0 years) did not have thyroid disease; the proportion of women with premenopausal or early perimenopausal status at baseline was higher in the hypothyroidism group (54.2% vs 49.8%; P = .010).
  • At baseline, levothyroxine-treated women had higher scores for processing speed (mean score, 56.5 vs 54.4; P = .006) and working memory (mean score, 6.8 vs 6.4; P = .018) than those without thyroid disease; however, no difference in episodic memory was observed between the groups.
  • Over the study period, there was no significant difference in cognitive decline between the groups.
  • There was no significant effect of levothyroxine-treated hypothyroidism on working memory or episodic memory, although an annual decline in processing speed was observed (P < .001).
  • Sensitivity analyses determined that abnormal levels of thyroid-stimulating hormone did not affect cognitive outcomes in women with hypothyroidism.

IN PRACTICE:

When cognitive decline is observed in these patients, the authors advised that “clinicians should resist anchoring on inadequate treatment of hypothyroidism as the cause of these symptoms and may investigate other disease processes (eg, iron deficiency, B12 deficiency, sleep apnea, celiac disease).”

SOURCE:

The study, led by Matthew D. Ettleson, Section of Endocrinology, Diabetes, and Metabolism, University of Chicago, was published online in Thyroid.

LIMITATIONS:

The cognitive assessments in the study were not designed to provide a thorough evaluation of all aspects of cognitive function. The study may not have been adequately powered to detect small effects of levothyroxine-treated hypothyroidism on cognitive outcomes. The higher levels of education attained by the study population may have acted as a protective factor against cognitive decline, potentially biasing the results.

DISCLOSURES:

The Study of Women’s Health Across the Nation was supported by grants from the National Institutes of Health (NIH), DHHS, through the National Institute on Aging, the National Institute of Nursing Research, and the NIH Office of Research on Women’s Health. The authors declared no conflicts of interest.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.

Race Adjustments in Algorithms Boost CRC Risk Prediction

Article Type
Changed
Mon, 10/07/2024 - 02:40

 

TOPLINE:

Accounting for racial disparities, including in the quality of family history data, enhanced the predictive performance of a colorectal cancer (CRC) risk prediction model.

METHODOLOGY:

  • The medical community is reevaluating the use of race adjustments in clinical algorithms due to concerns about the exacerbation of health disparities, especially as reported family history data are known to vary by race.
  • To understand how adjusting for race affects the accuracy of CRC prediction algorithms, researchers studied data from community health centers across 12 states as part of the Southern Community Cohort Study.
  • Researchers compared two screening algorithms that modeled 10-year CRC risk: a race-blind algorithm and a race-adjusted algorithm that included Black race as a main effect and an interaction with family history (see the model sketch after this list).
  • The primary outcome was the development of CRC within 10 years of enrollment, assessed using data collected from surveys at enrollment and follow-ups, cancer registry data, and National Death Index reports.
  • The researchers compared the algorithms’ predictive performance using such measures as area under the receiver operating characteristic curve (AUC) and calibration and also assessed how adjusting for race changed the proportion of Black participants identified as being at high risk for CRC.
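To make the two specifications concrete, the sketch below fits hypothetical race-blind and race-adjusted logistic models on simulated data; the variable names, data, and model form are illustrative assumptions, not the study's actual covariates.

    # Hypothetical race-blind vs race-adjusted 10-year CRC risk models.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 2000
    df = pd.DataFrame({
        "age": rng.uniform(40, 74, size=n),
        "family_history": rng.integers(0, 2, size=n),
        "black": rng.integers(0, 2, size=n),
    })
    logit = -4 + 0.02 * df["age"] + 0.5 * df["family_history"] + 0.2 * df["black"]
    df["crc"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    race_blind = smf.logit("crc ~ age + family_history", df).fit(disp=0)
    # Race-adjusted: Black race as a main effect plus a race-by-family-history
    # interaction, mirroring the specification described above
    race_adjusted = smf.logit("crc ~ age + family_history * black", df).fit(disp=0)
    print(race_adjusted.params)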

TAKEAWAY:

  • The study sample included 77,836 adults aged 40-74 years with no history of CRC at baseline.
  • Despite having higher cancer rates, Black participants were more likely to report unknown family history (odds ratio [OR], 1.69; P < .001) and less likely to report known positive family history (OR, 0.68; P < .001) than White participants.
  • The interaction term between race and family history was 0.56, indicating that reported family history was less predictive of CRC risk in Black participants than in White participants (P = .010); the arithmetic after this list shows how the term scales the family history odds ratio.
  • Compared with the race-blinded algorithm, the race-adjusted algorithm increased the fraction of Black participants among the predicted high-risk group (66.1% vs 74.4%; P < .001), potentially enhancing access to screening.
  • The race-adjusted algorithm improved the goodness of fit (P < .001) and showed a small improvement in AUC among Black participants (0.611 vs 0.608; P = .006).
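Because the interaction is multiplicative on the odds scale, the family history odds ratio for Black participants is the base odds ratio scaled by 0.56. A worked line of arithmetic, assuming a hypothetical base odds ratio of 2.0:

    # Interpreting the 0.56 race-by-family-history interaction (multiplicative
    # on the odds scale); the base OR of 2.0 is a hypothetical stand-in.
    base_or_family_history = 2.0      # assumed OR in White participants
    interaction = 0.56                # reported interaction term
    print(base_or_family_history * interaction)  # 1.12: weaker predictor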

IN PRACTICE:

“Our analysis found that removing race from colorectal screening predictors could reduce the number of Black patients recommended for screening, which would work against efforts to reduce disparities in colorectal cancer screening and outcomes,” the authors wrote.

SOURCE:

The study, led by Anna Zink, PhD, the University of Chicago Booth School of Business, Chicago, was published online in Proceedings of the National Academy of Sciences of the USA.

LIMITATIONS:

The study did not report any limitations.

DISCLOSURES:

The study was supported by the National Cancer Institute of the National Institutes of Health. The authors declared no conflicts of interest.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
