Commentary: Endocrine therapy and mammography, May 2023

Dr. Roesch scans the journals, so you don't have to!

Erin Roesch, MD
Uptake of endocrine therapy for prevention, as well as adherence to it in the adjuvant setting, is often limited by the patient's fear or experience of adverse effects. Studies focused on finding the minimal effective dose of endocrine therapy while decreasing toxicity can therefore lead to better uptake and improved adherence. The 10-year results from the TAM-01 trial, which evaluated 3 years of low-dose tamoxifen 5 mg daily (babytam) among 500 women with ductal carcinoma in situ (DCIS), lobular carcinoma in situ, or atypical ductal hyperplasia, were recently presented. There was a 42% reduced risk for recurrence with low-dose tamoxifen vs placebo, and in the DCIS cohort there was a 50% reduction in recurrence risk.1

Serrano and colleagues performed a multicenter, double-blind, phase 2b randomized trial investigating three dosing schedules of exemestane (25 mg once daily, three times weekly, or once weekly) given for 4-6 weeks before surgery among 180 postmenopausal women with stage 0-II estrogen receptor–positive breast cancer (BC). Among adherent patients (89% of the population), exemestane 25 mg three times weekly was noninferior to once-daily dosing in reducing serum estradiol (mean change in estradiol, -92% and -91%, respectively; difference in percentage change, 2.0%; P for noninferiority = .02), whereas once-weekly dosing was less effective. Adverse effects were similar, although given the short exposure in this study, it will be important to explore longer-term differences because aromatase inhibitor–related toxicities may arise later. These data support further exploration of alternative endocrine therapy schedules in the prevention setting, as well as in adjuvant treatment for women who are unable to tolerate the standard dose.

Screening mammography reduces mortality from BC, and advances in technique, such as digital breast tomosynthesis (DBT), have led to lower recall rates and higher cancer detection rates compared with digital mammography (DM), notably among younger women and those with dense breast tissue.2 A retrospective study including over 2.5 million screening mammograms among women 40-79 years of age showed that, compared with DM, DBT had a lower recall rate (8.9% vs 10.3%; adjusted odds ratio [OR] 0.92; P < .001) as well as a higher positive predictive value of recall (5.9% vs 4.3%; adjusted OR 1.33; P < .001), higher cancer detection rate (5.3 vs 4.5 per 1000 screening mammograms; adjusted OR 1.24; P < .001), and higher biopsy rate (17.6 vs 14.5 per 1000 screening mammograms; adjusted OR 1.33; P < .001) (Conant et al). These data add to the growing body of evidence showing the superiority of DBT over DM for BC screening and support routine use of this technique in clinical practice for our patients.

The initial treatment strategy for metastatic hormone receptor–positive (HR+)/human epidermal growth factor receptor 2–negative (HER2-) BC involves endocrine therapy in combination with a cyclin-dependent kinase (CDK) 4/6 inhibitor. The three PALOMA trials demonstrated a progression-free survival (PFS) benefit with palbociclib plus endocrine therapy, and a pooled analysis of these studies reported consistent improvement in PFS with palbociclib plus endocrine therapy vs endocrine therapy alone in older patients.3 A retrospective study evaluated real-world outcomes of palbociclib plus letrozole vs letrozole alone among 796 women ≥ 65 years of age with HR+/HER2- metastatic BC. First-line palbociclib plus letrozole compared with letrozole alone significantly improved median real-world PFS (22.2 vs 15.8 months; adjusted hazard ratio [HR] 0.59; P < .001) and overall survival (not reached vs 43.4 months; adjusted HR 0.55; P < .001). The real-world best tumor response rate was also higher (52.4% vs 22.1%; OR 2.0; P < .001) (Rugo et al). This study highlights the effectiveness of palbociclib plus letrozole in older adults with HR+/HER2- metastatic BC and shows how examining a real-world population adds value to the existing data from randomized clinical trials.

Additional References

  1. De Censi A, Lazzeroni M, Puntoni M, et al. 10-year results of a phase 3 trial of low-dose tamoxifen in non-invasive breast cancer. Presented at the 2022 San Antonio Breast Cancer Symposium; December 6-10, 2022; San Antonio, Texas. Abstract GS4-08. https://www.sabcs.org/Portals/SABCS2016/2022%20SABCS/Friday.pdf?ver=2022-11-22-205358-350
  2. Conant EF, Barlow WE, Herschorn SD, et al; Population-based Research Optimizing Screening Through Personalized Regimens (PROSPR) Consortium. Association of digital breast tomosynthesis vs digital mammography with cancer detection and recall rates by age and breast density. JAMA Oncol. 2019;5:635-642. doi:10.1001/jamaoncol.2018.7078
  3. Rugo HS, Turner NC, Finn RS, et al. Palbociclib plus endocrine therapy in older women with HR+/HER2- advanced breast cancer: a pooled analysis of randomised PALOMA clinical studies. Eur J Cancer. 2018;101:123-133. doi:10.1016/j.ejca.2018.05.017

 

Author and Disclosure Information

Erin E. Roesch, MD, Associate Staff, Department of Medical Oncology, Cleveland Clinic, Cleveland, Ohio
Erin E. Roesch, MD, has disclosed the following relevant financial relationships:
Serve(d) as a speaker or a member of a speakers bureau for: Puma Biotechnology


Commentary: Three New AD Treatments and a Study of Food Allergy, May 2023

Dr. Feldman scans the journals, so you don’t have to!

Steven R. Feldman, MD, PhD
Silverberg and colleagues present the results of two phase 3 clinical trials of lebrikizumab. Considering what we already know about interleukin 13 (IL-13) blockade with dupilumab and tralokinumab, it isn't surprising that lebrikizumab was effective and had few side effects. The Investigator Global Assessment (IGA) success rates in the 40% range seem roughly similar to those of dupilumab. While "40% success" doesn't sound great, real-life success rates are much higher — at least with dupilumab — than you'd expect on the basis of this IGA success rate. A minor limitation of dupilumab treatment is the side effect of conjunctivitis (minor in that most patients can be treated with saline eye drops); conjunctivitis was also seen with lebrikizumab in these phase 3 studies. Lebrikizumab appears to be another good tool in our toolbox for patients with moderate to severe atopic dermatitis, but it's not a quantum leap forward in atopic dermatitis management.

Torrelo and colleagues described the efficacy and safety of baricitinib in combination with topical corticosteroids in pediatric patients with moderate to severe atopic dermatitis. At the high dose of 4 mg daily, the IGA success rate was about 40%, similar to what we expect for adults treated with dupilumab and less than what we might expect with upadacitinib.

Studies have already been done on efficacy and safety of baricitinib in adults with atopic dermatitis. But baricitinib is indicated for the treatment of adult patients with severe alopecia areata and is not currently indicated as a treatment for anyone with atopic dermatitis, at least not in the United States. At this time, I think the most useful aspect of Torrelo and colleagues' findings is being able to tell our adult patients with alopecia areata that baricitinib was safe enough that they could test it in children as young as 2 years old with eczema.

Perälä and colleagues' report comparing topical tacrolimus and topical corticosteroids (1% hydrocortisone acetate or, if needed, 0.1% hydrocortisone butyrate ointment) in young children with atopic dermatitis is fascinating. They saw patients back at 1 week and followed them for 3 years. In just 1 week, both groups had massive and similar improvement in their atopic dermatitis, and that improvement continued throughout the study. Here are some take-home points:

  • Atopic dermatitis responds rapidly to low-to-medium–strength topical steroids.
  • Bringing patients back at 1 week may have been a critical aspect of this study, as adherence to topicals can be abysmal; bringing patients back at 1 week probably enables them to use their treatment much better than they would otherwise.
  • If we need a nonsteroidal topical, we have an excellent one available at low cost in the form of topical tacrolimus.

Perälä and colleagues also did this study to see whether good treatment of atopic dermatitis in these young children would have long-term benefits on atopic airway issues. Because the researchers didn't have a placebo group (and considered it unethical to have one), we cannot tell whether the topical treatment provided any benefit in that regard.

Yamamoto-Hanada and colleagues examined whether "enhanced" topical steroid treatment would prevent food allergy in children with eczema compared with standard topical steroid treatment. Perhaps a better word than "enhanced" would be "aggressive." The enhanced treatment entailed having infants receive alclometasone dipropionate over the whole face and betamethasone valerate over the whole body except the face and scalp. While the researchers saw a reduction in egg allergy (from roughly 40% to 30%), they also saw reduced body weight and height. A key take-home message is that with extensive use of topical steroids, we can see systemic effects.

 

Author and Disclosure Information

Steven R. Feldman, MD, PhD
Professor of Dermatology, Pathology and Social Sciences & Health Policy Wake Forest University School of Medicine, Winston-Salem, NC
 


New ABIM fees to stay listed as ‘board certified’ irk physicians

Abdul Moiz Hafiz, MD, was flabbergasted when he received a phone call from his institution’s credentialing office telling him that he was not certified for interventional cardiology – even though he had passed that exam in 2016.

Dr. Hafiz, who directs the Advanced Structural Heart Disease Program at Southern Illinois University, phoned the American Board of Internal Medicine (ABIM), where he learned that to restore his credentials, he would need to pay $1,225 in maintenance of certification (MOC) fees.

Like Dr. Hafiz, many physicians have been dismayed to learn that the ABIM is now listing as “not certified” physicians who have passed board exams but have not paid annual MOC fees of $220 for the first certificate and $120 for each additional certificate.

Even doctors who are participating in mandatory continuing education outside the ABIM’s auspices are finding themselves listed as “not certified.” Some physicians learned of the policy change only after applying for hospital privileges or for jobs that require ABIM certification.

Now that increasing numbers of physicians are employed by hospitals and health care organizations that require ABIM certification, many doctors have no option but to pony up the fees if they want to continue to practice medicine.

“We have no say in the matter,” said Dr. Hafiz, “and there’s no appeal process.”

The change affects nearly 330,000 physicians. Responses to the policy on Twitter included accusations of extortion and denunciations of the ABIM’s “money grab policies.”

Sunil Rao, MD, director of interventional cardiology at NYU Langone Health and president of the Society for Cardiovascular Angiography and Interventions (SCAI), has heard from many SCAI members who had experiences similar to Dr. Hafiz’s. While Dr. Rao describes some of the Twitter outrage as “emotional,” he does acknowledge that the ABIM’s moves appear to be financially motivated.

“The issue here was that as soon as they paid the fee, all of a sudden, ABIM flipped the switch and said they were certified,” he said. “It certainly sounds like a purely financial kind of structure.”

Richard Baron, MD, president and CEO of the ABIM, said doctors are misunderstanding the policy change.

“No doctor loses certification solely for failure to pay fees,” Dr. Baron told this news organization. “What caused them to be reported as not certified was that we didn’t have evidence that they had met program requirements. They could say, ‘But I did meet program requirements, you just didn’t know it.’ To which our answer would be, for us to know it, we have to process them. And our policy is that we don’t process them unless you are current on your fees.”

This is not the first time ABIM policies have alienated physicians.

Last year, the ABIM raised its MOC fees from $165 to $220. That also prompted a wave of outrage. Other grievances go further back. At one time, being board certified was a lifetime credential. However, in 1990 the ABIM made periodic recertification mandatory.

The process, which came to be known as “maintenance of certification,” had to be completed every 10 years, and fees were charged for each certification. At that point, said Dr. Baron, the relationship between the ABIM and physicians changed from a one-time interaction to a career-long relationship. He advises doctors to check in periodically on their portal page at the ABIM or download the app so they will always know their status.

Many physicians would prefer not to be bound to a lifetime relationship with the ABIM. There is an alternative certifying board, the National Board of Physicians and Surgeons (NBPAS), but it is accepted by only a limited number of hospitals.

“Until the NBPAS gains wide recognition,” said Dr. Hafiz, “the ABIM is going to continue to have basically a monopoly over the market.”

The value of MOC itself has been called into question. “There are no direct data supporting the value of the MOC process in either improving care, making patient care safer, or making patient care higher quality,” said Dr. Rao. This feeds frustration in a clinical community already dealing with onerous training requirements and expensive board certification exams and adds to the perception that it is a purely financial transaction, he said. (Studies examining whether the MOC system improves patient care have shown mixed results.)

The true value of the ABIM to physicians, Dr. Baron contends, is that the organization is an independent third party that differentiates those doctors from people who don’t have their skills, training, and expertise. “In these days, where anyone can be an ‘expert’ on the Internet, that’s more valuable than ever before,” he said.
 

A version of this article first appeared on Medscape.com.


BMI is a flawed measure of obesity. What are alternatives?

“BMI is trash. Full stop.” This controversial tweet, which received thousands of likes and retweets, was cited in a recent article in which one doctor considered when physicians might stop using body mass index (BMI) to diagnose obesity.

BMI has for years been the consensus default method for assessing whether a person is overweight or has obesity, and is still widely used as the gatekeeper metric for treatment eligibility for certain weight-loss agents and bariatric surgery.

But growing appreciation of the limitations of BMI is causing many clinicians to consider alternative measures of obesity that can better assess both the amount of adiposity as well as its body location, an important determinant of the cardiometabolic consequences of fat.

Alternative metrics include waist circumference and/or waist-to-height ratio (WHtR); imaging methods such as CT, MRI, and dual-energy x-ray absorptiometry (DXA); and bioelectrical impedance to assess fat volume and location. All have made some inroads on the tight grip BMI has had on obesity assessment.

Chances are, however, that BMI will not fade away anytime soon given how entrenched it has become in clinical practice and for insurance coverage, as well as its relative simplicity and precision.

“BMI is embedded in a wide range of guidelines on the use of medications and surgery. It’s embedded in Food and Drug Administration regulations and for billing and insurance coverage. It would take extremely strong data and years of work to undo the infrastructure built around BMI and replace it with something else. I don’t see that happening [anytime soon],” commented Daniel H. Bessesen, MD, a professor at the University of Colorado at Denver, Aurora, and chief of endocrinology for Denver Health.

“It would be almost impossible to replace all the studies that have used BMI with investigations using some other measure,” he said.
 

BMI Is ‘imperfect’

The entrenched position of BMI as the go-to metric doesn’t keep detractors from weighing in. As noted in a commentary on current clinical challenges surrounding obesity recently published in Annals of Internal Medicine, the journal’s editor-in-chief, Christine Laine, MD, and senior deputy editor Christina C. Wee, MD, listed six top issues clinicians must deal with, one of which, they say, is the need for a better measure of obesity than BMI.

“Unfortunately, BMI is an imperfect measure of body composition that differs with ethnicity, sex, body frame, and muscle mass,” noted Dr. Laine and Dr. Wee.

BMI is based on a person’s weight in kilograms divided by the square of their height in meters. A “healthy” BMI is between 18.5 and 24.9 kg/m2, overweight is 25-29.9, and 30 or greater is considered to represent obesity. However, certain ethnic groups have lower cutoffs for overweight or obesity because of evidence that such individuals can be at higher risk of obesity-related comorbidities at lower BMIs.

“BMI was chosen as the initial screening tool [for obesity] not because anyone thought it was perfect or the best measure but because of its simplicity. All you need is height, weight, and a calculator,” Dr. Wee said in an interview.

Numerous online calculators are available, including one from the Centers for Disease Control and Prevention where height in feet and inches and weight in pounds can be entered to generate the BMI.
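
For readers who want to see the arithmetic spelled out, the calculation is easy to reproduce in a few lines of code. The following is a minimal sketch, not a clinical tool; the function names are our own, and the cutoffs are the standard ones quoted above (recall that some ethnic groups use lower cutoffs).

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by the square of height in meters."""
    return weight_kg / height_m ** 2


def bmi_category(value: float) -> str:
    """Map a BMI value to the standard categories quoted above."""
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "healthy"
    if value < 30:
        return "overweight"
    return "obesity"


# Example: 90 kg at 1.75 m -> BMI 29.4, "overweight"
value = bmi(90, 1.75)
print(f"{value:.1f} {bmi_category(value)}")
```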

BMI is also inherently limited by being “a proxy for adiposity” and not a direct measure, added Dr. Wee, who is also director of the Obesity Research Program of Beth Israel Deaconess Medical Center, Boston.

As such, BMI can’t distinguish between fat and muscle because it relies on weight only to gauge adiposity, noted Tiffany Powell-Wiley, MD, an obesity researcher at the National Heart, Lung, and Blood Institute in Bethesda, Md. Another shortcoming of BMI is that it “is good for distinguishing population-level risk for cardiovascular disease and other chronic diseases, but it does not help as much for distinguishing risk at an individual level,” she said in an interview.

These and other drawbacks have prompted researchers to look for other useful metrics. WHtR, for example, has recently made headway as a potential BMI alternative or complement.
 

 

 

The case for WHtR

Concern about overreliance on BMI despite its limitations is not new. In 2015, an American Heart Association scientific statement from the group’s Obesity Committee concluded that “BMI alone, even with lower thresholds, is a useful but not an ideal tool for identification of obesity or assessment of cardiovascular risk,” especially for people from Asian, Black, Hispanic, and Pacific Islander populations.

The writing panel also recommended that clinicians measure waist circumference annually and use that information along with BMI “to better gauge cardiovascular risk in diverse populations.”

Momentum for moving beyond BMI alone has continued to build following the AHA statement.

In September 2022, the National Institute for Health and Care Excellence, which sets policies for the United Kingdom’s National Health Service, revised its guidance for assessment and management of people with obesity. The updated guidance advises clinicians assessing “adults with BMI below 35 kg/m2” to “measure and use their WHtR, as well as their BMI, as a practical estimate of central adiposity and use these measurements to help to assess and predict health risks.”

NICE released an extensive literature review with the revision, and based on the evidence, said that “using waist-to-height ratio as well as BMI would help give a practical estimate of central adiposity in adults with BMI under 35 kg/m2. This would in turn help professionals assess and predict health risks.”

However, the review added that, “because people with a BMI over 35 kg/m2 are always likely to have a high WHtR, the committee recognized that it may not be a useful addition for predicting health risks in this group.” The 2022 NICE review also said that it is “important to estimate central adiposity when assessing future health risks, including for people whose BMI is in the healthy-weight category.”
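
WHtR itself is even simpler than BMI: waist circumference divided by height, measured in the same units. The sketch below illustrates how the NICE caveat might be applied in practice; it is illustrative only, the helper name is our own, and the 0.5 cutoff is the widely cited “waist less than half your height” rule of thumb, assumed here rather than taken from this article.

```python
def waist_to_height_ratio(waist_cm: float, height_cm: float) -> float:
    """WHtR: waist circumference divided by height, in the same units."""
    return waist_cm / height_cm


def central_adiposity_flag(waist_cm: float, height_cm: float, bmi: float) -> str:
    """Hypothetical helper: per the NICE caveat, WHtR adds little when BMI >= 35.
    The 0.5 threshold is an assumed rule of thumb, not a figure from this article."""
    if bmi >= 35:
        return "WHtR not informative (BMI >= 35)"
    ratio = waist_to_height_ratio(waist_cm, height_cm)
    flag = "increased central adiposity" if ratio >= 0.5 else "no central adiposity flag"
    return f"WHtR {ratio:.2f}: {flag}"


# Example: waist 94 cm, height 175 cm, BMI 29.4
print(central_adiposity_flag(94, 175, 29.4))  # WHtR 0.54: increased central adiposity
```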

This new emphasis by NICE on measuring and using WHtR as part of obesity assessment “represents an important change in population health policy,” commented Dr. Powell-Wiley. “I expect more professional organizations will endorse use of waist circumference or waist-to-height ratio now that NICE has taken this step,” she predicted.

Waist circumference and WHtR may become standard measures of adiposity in clinical practice over the next 5-10 years.

The recent move by NICE to highlight a complementary role for WHtR “is another acknowledgment that BMI is an imperfect tool for stratifying cardiometabolic risk in a diverse population, especially in people with lower BMIs” because of its variability, commented Jamie Almandoz, MD, medical director of the weight wellness program at UT Southwestern Medical Center, Dallas.
 

WHtR vs. BMI

Another recent step forward for WHtR came with the publication of a post hoc analysis of data collected in the PARADIGM-HF trial, a study that had the primary purpose of comparing two medications for improving outcomes in more than 8,000 patients with heart failure with reduced ejection fraction.

The new analysis showed that “two indices that incorporate waist circumference and height, but not weight, showed a clearer association between greater adiposity and a higher risk of heart failure hospitalization,” compared with BMI.

WHtR was one of the two indices identified as being a better correlate for the adverse effect of excess adiposity compared with BMI.

The authors of the post hoc analysis did not design it to compare WHtR with BMI. Instead, their goal was to better understand what’s known as the “obesity paradox” in people with heart failure with reduced ejection fraction: the recurring observation that patients with lower BMIs fare worse, with higher rates of mortality and adverse cardiovascular outcomes, than patients with higher BMIs.

The new analysis showed that this paradox disappeared when WHtR was substituted for BMI as the obesity metric.

This “provides meaningful data about the superiority of WHtR, compared with BMI, for predicting heart failure outcomes,” said Dr. Powell-Wiley, although she cautioned that the analysis was limited by scant data in diverse populations and did not look at other important cardiovascular disease outcomes. While Dr. Powell-Wiley does not think that WHtR needs assessment in a prospective, controlled trial, she called for analysis of pooled prospective studies with more diverse populations to better document the advantages of WHtR over BMI.

The PARADIGM-HF post hoc analysis shows again how flawed BMI is for health assessment and the relative importance of an individualized understanding of a person’s body composition, Dr. Almandoz said in an interview. “As we collect more data, there is increasing awareness of how imperfect BMI is.”
 

 

 

Measuring waist circumference is tricky

Although WHtR looks promising as a substitute for or add-on to BMI, it has its own limitations, particularly the challenge of accurately measuring waist circumference.

Measuring waist circumference “not only takes more time but requires the assessor to be well trained about where to put the tape measure and making sure it’s measured at the same place each time,” even when different people take serial measurements from individual patients, noted Dr. Wee. Determining waist circumference can also be technically difficult when done on larger people, she added, and collectively these challenges make waist circumference “less reproducible from measurement to measurement.”

“It’s relatively clear how to standardize measurement of weight and height, but there is a huge amount of variability when the waist is measured,” agreed Dr. Almandoz. “And waist circumference also differs by ethnicity, race, sex, and body frame. There are significant differences in waist circumference levels that associate with increased health risks” between, for example, White and South Asian people.

Another limitation of waist circumference and WHtR is that they “cannot differentiate between visceral and abdominal subcutaneous adipose tissue, which are vastly different regarding cardiometabolic risk,” commented Ian Neeland, MD, director of cardiovascular prevention at the University Hospitals Harrington Heart & Vascular Institute, Cleveland.
 

The imaging option

“Waist-to-height ratio is not the ultimate answer,” Dr. Neeland said in an interview. He instead endorsed “advanced imaging for body fat distribution,” such as CT or MRI scans, as his pick for what should be the standard obesity metric, “given that it is much more specific and actionable for both risk assessment and response to therapy. I expect slow but steady advancements that move away from BMI cutoffs, for example for bariatric surgery, given that BMI is an imprecise and crude tool.”

But although imaging with methods like CT and MRI may provide the best accuracy and precision for tracking the volume of a person’s cardiometabolically dangerous fat, they are also hampered by relatively high cost and, for CT and DXA, the issue of radiation exposure.

“CT, MRI, and DXA scans give more in-depth assessment of body composition, but should we expose people to the radiation and the cost?” Dr. Almandoz wondered.

“Height, weight, and waist circumference cost nothing to obtain,” creating a big relative disadvantage for imaging, said Naveed Sattar, MD, professor of metabolic medicine at the University of Glasgow.

“Data would need to show that imaging gives clinicians substantially more information about future risk” to justify its price, Dr. Sattar emphasized.
 

BMI’s limits mean adding on

Regardless of which alternatives to BMI end up being used most, experts generally agree that BMI alone looks increasingly inadequate.

“Over the next 5 years, BMI will come to be seen as a screening tool that categorizes people into general risk groups” that also needs “other metrics and variables, such as age, race, ethnicity, family history, blood glucose, and blood pressure to better describe health risk in an individual,” predicted Dr. Bessesen.

The endorsement of WHtR by NICE “will lead to more research into how to incorporate WHtR into routine practice. We need more evidence to translate what NICE said into practice,” said Dr. Sattar. “I don’t think we’ll see a shift away from BMI, but we’ll add alternative measures that are particularly useful in certain patients.”

“Because we live in diverse societies, we need to individualize risk assessment and couple that with technology that makes analysis of body composition more accessible,” agreed Dr. Almandoz. He noted that the UT Southwestern weight wellness program where he practices has, for about the past decade, routinely collected waist circumference and bioelectrical impedance data as well as BMI on all people seen in the practice for obesity concerns. Making these additional measurements on a routine basis also helps strengthen patient engagement.

“We get into trouble when we make rigid health policy and clinical decisions based on BMI alone without looking at the patient holistically,” said Dr. Wee. “Patients are more than arbitrary numbers, and clinicians should make clinical decisions based on the totality of evidence for each individual patient.”

Dr. Bessesen, Dr. Wee, Dr. Powell-Wiley, and Dr. Almandoz reported no relevant financial relationships. Dr. Neeland has reported being a consultant for Merck. Dr. Sattar has reported being a consultant or speaker for Abbott Laboratories, Afimmune, Amgen, AstraZeneca, Boehringer Ingelheim, Eli Lilly, Hanmi Pharmaceuticals, Janssen, MSD, Novartis, Novo Nordisk, Pfizer, Roche Diagnostics, and Sanofi.

A version of this article originally appeared on Medscape.com.


Disrupted gut microbiome a key driver of major depression?

Article Type
Changed
Fri, 04/28/2023 - 00:43

Major depressive disorder (MDD) is linked to disruptions in energy and lipid metabolism, possibly caused by the interplay of the gut microbiome and blood metabolome, new research suggests.

Investigators found that MDD had specific metabolic “signatures” consisting of 124 metabolites that spanned energy and lipid pathways, with some involving the tricarboxylic acid cycle in particular. These changes in metabolites were consistent with differences in composition of several gut microbiota.

The researchers found that fatty acids and intermediate and very large lipoproteins changed in association with the depressive disease process. However, high-density lipoproteins and metabolites in the tricarboxylic acid cycle did not.

“As we wait to establish causal influences through clinical trials, clinicians should advise patients suffering from mood disorders to modify their diet by increasing the intake of fresh fruits, vegetables, and whole grains, as these provide the required fuel/fiber to the gut microbiota for their enrichment, and more short-chain fatty acids are produced for the optimal functioning of the body,” study investigator Najaf Amin, PhD, DSc, senior researcher, Nuffield Department of Population Health, Oxford University, England, told this news organization.

“At the same time, patients should be advised to minimize the intake of sugars and processed foods, which are known to have an inverse impact on the gut microbiome and are associated with higher inflammation,” she said.

The study was published online in JAMA Psychiatry.
 

MDD poorly understood

Although most antidepressants target the monoamine pathway, “evidence is increasing for a more complex interplay of multiple pathways involving a wide range of metabolic alterations spanning energy and lipid metabolism,” the authors wrote.

Previous research using the Nightingale proton nuclear magnetic resonance (NMR) metabolomics platform showed a “shift” toward decreased levels of high-density lipoproteins (HDLs) and increased levels of very low-density lipoproteins (VLDLs) and triglycerides among patients with depression.

The gut microbiome, which is primarily modulated by diet, “has been shown to be a major determinant of circulating lipids, specifically triglycerides and HDLs, and to regulate mitochondrial function,” the investigators noted. Patients with MDD are known to have disruptions in the gut microbiome.

The researchers therefore investigated whether the gut microbiome may “explain part of the shift in VLDL and HDL levels observed in patients with depression and if the metabolic signatures of the disease based on Nightingale metabolites can be used as a tool to infer the association between gut microbiome and depression.”

Dr. Amin called depression “one of the most poorly understood diseases, as underlying mechanisms remain elusive.”

Large-scale genetic studies “have shown that the contribution of genetics to depression is modest,” she continued. On the other hand, initial animal studies suggest the gut microbiome “may potentially have a causal influence on depression.”

Several studies have evaluated the influence of gut microbiome on depression, “but, due to small sample sizes and inadequate control for confounding factors, most of their findings were not reproducible.”

Harnessing the power of the UK Biobank, the investigators studied 58,257 individuals who were between the ages of 37 and 73 years at recruitment. They used data on NMR spectroscopy–based plasma metabolites in depression. Individuals who didn’t report depression at baseline served as controls.

Logistic regression analysis was used to test the association of metabolite levels with depression in four models, each with an increasing number of covariates.
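As a concrete illustration of that modeling step, the following sketch fits one metabolite against depression status with progressively richer adjustment, in the spirit of the four nested models. The file name, metabolite, and covariate sets are hypothetical stand-ins, not the study’s actual specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ukb_depression_metabolites.csv")  # hypothetical file: one row per participant

# Four nested logistic models, each adding covariates (illustrative choices only).
models = {
    "model 1": "depressed ~ citrate",
    "model 2": "depressed ~ citrate + age + sex",
    "model 3": "depressed ~ citrate + age + sex + bmi + smoking",
    "model 4": "depressed ~ citrate + age + sex + bmi + smoking + lipid_med_use",
}

for name, formula in models.items():
    fit = smf.logit(formula, data=df).fit(disp=False)
    beta, p = fit.params["citrate"], fit.pvalues["citrate"]
    # Odds ratio per unit (per SD, if the metabolite was standardized beforehand).
    print(f"{name}: OR = {np.exp(beta):.2f}, P = {p:.2g}")
```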

To identify patterns of correlation in the “metabolic signatures of MDD and the human gut biome,” they regressed the metabolic signatures of MDD on the metabolic signatures of the gut microbiota and then regressed the metabolic signature of gut microbiota on the metabolic signatures of MDD.

Bidirectional 2-sample Mendelian randomization was used to ascertain the direction of the association observed between metabolites and MDD.
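Two-sample Mendelian randomization treats genetic variants as instruments, combining per-variant effects on the exposure and the outcome taken from separate samples. Below is a minimal sketch of the standard inverse-variance-weighted (IVW) estimator with made-up summary statistics; the study’s actual MR pipeline may use additional methods and sensitivity analyses.

```python
import numpy as np

# Hypothetical per-SNP summary statistics from two independent samples:
# beta_exp: SNP effect on the exposure (e.g., abundance of a gut genus)
# beta_out: SNP effect on the outcome (e.g., MDD); se_out: its standard error
beta_exp = np.array([0.12, 0.08, 0.15, 0.10])
beta_out = np.array([0.030, 0.018, 0.041, 0.022])
se_out = np.array([0.010, 0.009, 0.012, 0.011])

# IVW estimate: average of per-SNP Wald ratios (beta_out / beta_exp),
# weighted by beta_exp**2 / se_out**2.
weights = beta_exp**2 / se_out**2
ivw = np.sum(weights * beta_out / beta_exp) / np.sum(weights)
ivw_se = 1.0 / np.sqrt(np.sum(weights))
print(f"IVW causal estimate: {ivw:.3f} (SE {ivw_se:.3f})")
```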

Individuals with lifetime and recurrent MDD were compared with controls (6,811 vs. 51,446 and 4,370 vs. 62,508, respectively).

Participants with lifetime MDD were significantly younger (median [IQR] age, 56 [49-62] years vs. 58 [51-64] years) and were more likely to be female in comparison with controls (54% vs. 35%).

‘Novel findings’

In the fully adjusted analysis, metabolic signatures of MDD were found to consist of 124 metabolites that spanned energy and lipid metabolism pathways.

The investigators noted that these “novel findings” included 49 metabolites encompassing those involved in the tricarboxylic acid cycle – citrate and pyruvate.

The findings revealed that fatty acids and intermediate and very large lipoproteins changed in association with the disease process. On the other hand, HDL and the metabolites in the tricarboxylic acid cycle did not.

“We observed that the genera Sellimonas, Eggerthella, Hungatella, and Lachnoclostridium were more abundant, while genera Ruminococcaceae ... Coprococcus, Lachnospiraceae ... Eubacterium ventriosum, Subdoligranulum, and family Ruminococcaceae were depleted in the guts of individuals with more symptoms of depression,” said Dr. Amin. “Of these, genus Eggerthella showed statistical evidence of being involved in the causal pathway.”

These microbes are involved in the synthesis of important neuroactive compounds, such as the neurotransmitters gamma-aminobutyric acid, glutamate, and serotonin, and the short-chain fatty acid butyrate, she noted.

Butyrate produced by the gut can cross the blood-brain barrier, enter the brain, and affect transcriptional and translational activity or be used by the cells for generating energy, she added. “So basically, butyrate can influence depression through several routes – i.e., via immune regulation, genomic transcript/translation, and/or affecting energy metabolism.”
 

No causality

Commenting on the study, Emeran Mayer, MD, distinguished research professor of medicine, G. Oppenheimer Center for Neurobiology of Stress and Resilience and UCLA Brain Gut Microbiome Center, called it the “largest, most comprehensive and best validated association study to date providing further evidence for an association between gut microbial taxa, previously identified in patients with MDD, blood metabolites (generated by host and by microbes) and questionnaire data.”

However, “despite its strengths, the study does not allow [us] to identify a causal role of the microbiome alterations in the observed microbial and metabolic changes (fatty acids, Krebs cycle components),” cautioned Dr. Mayer, who was not involved with the study.

Moreover, “causality of gut microbial changes on the behavioral phenotype of depression cannot been inferred,” he concluded.

Metabolomics data were provided by the Alzheimer’s Disease Metabolomics Consortium. The study was funded wholly or in part by grants from the National Institute on Aging and the Foundation for the National Institutes of Health. It was further supported by a grant from ZonMW Memorabel. Dr. Amin reports no relevant financial relationships. The other authors’ disclosures are listed in the original article. Dr. Mayer is a scientific advisory board member of Danone, Axial Therapeutics, Viome, Amare, Mahana Therapeutics, Pendulum, Bloom Biosciences, and APC Microbiome Ireland.
 

A version of this article originally appeared on Medscape.com.


Walnuts linked to improved attention, psychological maturity in teens

Article Type
Changed
Fri, 04/28/2023 - 00:44

Walnuts have been associated with better cognitive development and psychological maturation in teens, new research shows. Adolescents who consumed walnuts for at least 100 days showed improved sustained attention and fluid intelligence as well as a reduction in symptoms of attention-deficit/hyperactivity disorder (ADHD), compared with matched controls who did not consume the nuts. However, there were no statistically significant changes between the groups in other parameters, such as working memory and executive function.

Clinicians should advise adolescents “to eat a handful of walnuts three times a week for the rest of their lives. They may have a healthier brain with better cognitive function,” said senior investigator Jordi Julvez, PhD, group leader at the Institute of Health Research Pere Virgili, Barcelona, and associated researcher at the Barcelona Institute for Global Health.

The study was published online in eClinicalMedicine.
 

Rich source of omega-3s

Adolescence is “a period of refinement of brain connectivity and complex behaviors,” the investigators noted.  

Previous research suggests polyunsaturated fatty acids (PUFAs) are key in central nervous system architecture and function during times of neural development, with three specific PUFAs playing an “essential developmental role.”

Two omega-3 fatty acids – docosahexaenoic acid (DHA) and eicosapentaenoic acid (EPA) – are PUFAs that must be obtained through diet, mainly from seafood. Walnuts are “among the richest sources” of plant-derived omega-3 fatty acids, particularly alpha-linolenic acid (ALA), a precursor for the longer-chain EPA and DHA.

ALA independently “has positive effects on brain function and plasticity,” the authors wrote. In addition, walnut constituents – particularly polyphenols and other bioactive compounds – “may act synergistically with ALA to foster brain health.”

Earlier small studies have found positive associations between walnut consumption and cognitive function in children, adolescents, and young adults, but to date, no randomized controlled trial has focused on the effect of walnut consumption on adolescent neuropsychological function.

The researchers studied 771 healthy adolescents (aged 11-16 years, mean age 14) drawn from 12 Spanish high schools. Participants were instructed to follow healthy eating recommendations and were randomly assigned 1:1 to the intervention (n = 386) or the control group (n = 385).

At baseline and after 6 months, they completed neuropsychological tests and behavioral rating scales. The Attention Network Test assessed attention, and the N-back test was used to assess working memory. The Tests of Primary Mental Abilities assessed fluid intelligence. Risky decision-making was tested using the Roulettes Task.
 

Fruit and nuts

Participants also completed the Strengths and Difficulties Questionnaire, which provided a total score of problem behavior. Teachers filled out the ADHD DSM-IV form list to provide additional information about ADHD behaviors.

The intervention group received 30 g/day of raw California walnut kernels to incorporate into their daily diet. Walnuts are estimated to contain about 9 g of ALA per 100 g.

All participants received a seasonal fruit calendar and were asked to eat at least one piece of seasonal fruit daily.

Parents reported their child’s daily walnut consumption, with adherence defined as 100 or more days of eating walnuts during the 6-month period.

All main analyses were based on an intention-to-treat (ITT) method (participants were analyzed according to their original group assignment, regardless of their adherence to the intervention).

The researchers also conducted a secondary per-protocol analysis, comparing the intervention and control groups to estimate the effect if all participants had adhered to their assigned intervention. They censored data for participants who reported eating walnuts for less than 100 days during the 6-month trial period.
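
To make the distinction between these two analysis populations concrete, here is a minimal sketch in Python. All of the data, the adherence effect, and the simple difference-in-means estimator are invented for illustration; the trial's actual statistical models were more sophisticated. The only details taken from the article are the arm sizes (386 and 385) and the 100-day adherence cutoff.

```python
import random

random.seed(0)

# One hypothetical participant record. "change" is an invented outcome
# (e.g., change in an attention score); all effect sizes are illustrative.
def simulate_participant(group):
    if group == "walnut":
        days = random.randint(0, 180)          # self-reported days of walnut intake
        effect = 2.0 if days >= 100 else 0.0   # assume benefit only with adherence
    else:
        days = 0
        effect = 0.0
    return {"group": group, "days": days, "change": effect + random.gauss(0, 3)}

trial = [simulate_participant("walnut") for _ in range(386)] + \
        [simulate_participant("control") for _ in range(385)]

def mean_change(records):
    return sum(r["change"] for r in records) / len(records)

# Intention-to-treat: everyone is analyzed in the arm they were randomized to.
itt_walnut = [r for r in trial if r["group"] == "walnut"]
control = [r for r in trial if r["group"] == "control"]
itt_effect = mean_change(itt_walnut) - mean_change(control)

# Per-protocol: censor intervention participants reporting < 100 days of intake.
pp_walnut = [r for r in itt_walnut if r["days"] >= 100]
pp_effect = mean_change(pp_walnut) - mean_change(control)

print(f"ITT effect estimate:          {itt_effect:.2f}")
print(f"Per-protocol effect estimate: {pp_effect:.2f}")
```

Because nonadherent participants dilute the intervention arm, the ITT estimate sits closer to zero than the per-protocol estimate, which is the same qualitative pattern the trial reported.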

Secondary outcomes included changes in height, weight, waist circumference, and BMI, as well as red blood cell proportions of omega-3 fatty acids (DHA, EPA, and ALA) at baseline and after 6 months.
 

Adherence counts

Most participants had “medium” or “high” levels of adherence to the Mediterranean diet, with “no meaningful differences” at baseline between the intervention and control groups in lifestyle characteristics or mean scores in all primary endpoints.

In the ITT analysis, there were no statistically significant differences in primary outcomes between the groups following the intervention. As for secondary outcomes, red blood cell (RBC) ALA significantly increased in the walnut group but not the control group (coefficient, 0.04%; 95% confidence interval [CI], 0.03%-0.06%; P < .0001).

However, there were differences in primary outcomes between the groups in the per-protocol analysis: The adherence-adjusted effect on improvement in attention score was −11.26 ms (95% CI, −19.92 to −2.60; P = .011) for the intervention versus the control group.

The per-protocol analysis also showed an improvement in fluid intelligence score (1.78; 95% CI, 0.90-2.67; P < .0001) and a reduction in ADHD symptom score (−2.18; 95% CI, −3.70 to −0.67; P = .0050).

“Overall, no significant differences were found in the intervention group in relation to the control group,” Dr. Julvez said in a news release. “But if the adherence factor is considered, then positive results are observed, since participants who most closely followed the guidelines – in terms of the recommended dose of walnuts and the number of days of consumption – did show improvements in the neuropsychological functions evaluated.”

Adolescence “is a time of great biological changes. Hormonal transformation occurs, which in turn is responsible for stimulating the synaptic growth of the frontal lobe,” he continued, adding that this brain region “enables neuropsychological maturation of more complex emotional and cognitive functions.”

“Neurons that are well nourished with these types of fatty acids will be able to grow and form new, stronger synapses,” he said.
 

Food as medicine

Uma Naidoo, MD, director of nutritional and lifestyle psychiatry at Massachusetts General Hospital, Boston, “commends” the researchers for conducting an RCT with a “robust” sample size and said she is “excited to see research like this furthering functional nutrition for mental health,” as she believes that “food is medicine.”

Dr. Naidoo, a professional chef, nutritional biologist, and author of the book “This Is Your Brain on Food,” said the findings “align” with her own approach to nutritional psychiatry and are also “in line” with her clinical practice.

However, although these results are “promising,” more research is needed across more diverse populations to “make sure these results are truly generalizable,” said Dr. Naidoo, a faculty member at Harvard Medical School, Boston, who was not involved with the study.

She “envisions a future where the research is so advanced that we can ‘dose’ these healthy whole foods for specific psychiatric symptoms and conditions.”

This study was supported by Instituto de Salud Carlos III (co-funded by the European Union Regional Development Fund, “A way to make Europe”). The California Walnut Commission supported the trial by supplying the walnuts free of charge for the Walnuts Smart Snack Dietary Intervention Trial. Dr. Julvez holds a Miguel Servet-II contract awarded by the Instituto de Salud Carlos III (co-funded by the European Union Social Fund). The other authors’ disclosures are listed in the original article. Dr. Naidoo reports no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Meta-analysis examines cancer risk concern for JAK inhibitors

Article Type
Changed
Fri, 04/28/2023 - 00:45

Janus kinase (JAK) inhibitors may be associated with a higher risk for cancer relative to tumor necrosis factor (TNF) inhibitors, according to a meta-analysis reported at the annual meeting of the British Society for Rheumatology.

Across all phase 2, 3, and 4 trials and long-term extension studies in rheumatoid arthritis, psoriatic arthritis, psoriasis, axial spondyloarthritis, inflammatory bowel disease, and atopic dermatitis, the risk ratio (RR) for any cancer developing with JAK inhibitors was 1.63 compared with anti-TNF therapy (95% confidence interval [CI], 1.27-2.09).

By comparison, JAK inhibitor use was not significantly associated with any greater risk for cancer than methotrexate (RR, 1.06; 95% CI, 0.58-1.94) or placebo (RR, 1.16; 95% CI, 0.75-1.80).
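
As a rough illustration of where such risk ratios and confidence intervals come from, the sketch below computes an RR and a Wald-type 95% CI on the log scale from a single 2×2 table. The event counts are invented for demonstration only; the meta-analysis pooled many trials with more sophisticated methods.

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of group A vs group B with a Wald 95% CI on the log scale."""
    risk_a, risk_b = events_a / n_a, events_b / n_b
    rr = risk_a / risk_b
    # Large-sample standard error of log(RR): sqrt(1/a - 1/n_a + 1/b - 1/n_b).
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Invented counts: 120 cancers among 10,000 JAK inhibitor patients vs.
# 74 among 10,000 TNF inhibitor patients (illustrative only).
rr, lo, hi = risk_ratio_ci(120, 10_000, 74, 10_000)
print(f"RR {rr:.2f} (95% CI, {lo:.2f}-{hi:.2f})")   # RR 1.62 (95% CI, 1.22-2.16)
```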

“Our data suggests that rather than JAK inhibitors necessarily being harmful, it could be more a case of TNF inhibitors being protective,” said Christopher Stovin, MBChB, a specialist registrar in rheumatology at the Princess Royal University Hospital, King’s College Hospital NHS Trust, London.

“We should stress that these are rare events in our study, roughly around 1 in every 100 patient-years of exposure,” Dr. Stovin said.

“Despite having over 80,000 years of patient exposure, the median follow-up duration for JAK inhibitors was still only 118 weeks, which for cancers [that] obviously have long latency periods is still a relatively small duration of time,” the researcher added.

“People worry about the drugs. But there is a possibility that [a] disturbed immune system plays a role per se in development of cancers,” consultant rheumatologist Anurag Bharadwaj, MD, DM, said in an interview.

“Although there are studies which attribute increased risk of cancer to different DMARDs [disease-modifying antirheumatic drugs] and biologics like TNF, but on other hand, it’s maybe that we are giving these drugs to patients who have got more serious immunological disease,” suggested Dr. Bharadwaj, who serves as the clinical lead for rheumatology at Basildon (England) Hospital, Mid & South Essex Foundation Trust.

“So, a possibility may be that the more severe or the more active the immunological inflammatory disease, the higher the chance of cancer, and these are the patients who go for the stronger medications,” Dr. Bharadwaj said.

There is an “immunological window of opportunity” when treating these inflammatory diseases, said Dr. Bharadwaj, noting that the first few months of treatment are vital. “For all immunological diseases, the more quickly you bring the immunological abnormality down, the chances of long-term complications go down, including [possibly that the] chances of cancer go down, chances of cardiovascular disease go down, and chances of lung disease go down. Hit it early, hit it hard.”

Concern over a possible higher risk for cancer with JAK inhibitors than with TNF inhibitors was raised following the release of data from the ORAL Surveillance trial, a postmarketing trial of tofacitinib (Xeljanz) that had been mandated by the Food and Drug Administration.

“This was a study looking at the coprimary endpoints of malignancy and major adverse cardiovascular events, and it was enriched with patients over the age of 50, with one additional cardiac risk factor, designed to amplify the detection of these rare events,” Dr. Stovin said.

“There was a signal of an increased risk of malignancy in the tofacitinib group, and this led to the FDA issuing a [boxed warning for all licensed JAK inhibitors] at that time,” he added.

Dr. Stovin and colleagues aimed to determine what, if any, cancer risk was associated with all available JAK inhibitors relative to placebo, TNF inhibitors, and methotrexate.

In all, data from 62 randomized controlled trials and 14 long-term extension studies were included in the meta-analysis, accounting for 82,366 patient-years of follow-up. The JAK inhibitors analyzed included tofacitinib, baricitinib (Olumiant), upadacitinib (Rinvoq), filgotinib (Jyseleca), and peficitinib (Smyraf). (Filgotinib and peficitinib have not been approved by the FDA.)

The researchers performed sensitivity analyses that excluded cancers detected within the first 6 months of treatment, trials that used higher than licensed JAK inhibitor doses, and patients with diagnoses other than rheumatoid arthritis, but the results remained largely unchanged, Dr. Stovin reported.

“Perhaps not surprisingly, when we removed ORAL Surveillance” from the analysis comparing JAK inhibitors and TNF inhibitors, “we lost statistical significance,” he said.

“Longitudinal observational data is needed but currently remains limited,” Dr. Stovin concluded.

Dr. Stovin and Dr. Bharadwaj reported no relevant financial relationships. The meta-analysis was independently supported. Dr. Bharadwaj was not involved in the study and provided comment ahead of the presentation.

A version of this article first appeared on Medscape.com.

Small study finds IPL-radiofrequency combination effective for dry eye disease

Article Type
Changed
Fri, 04/28/2023 - 00:44

Combining intense pulsed light (IPL) with topical radiofrequency (RF) for dry eye disease related to meibomian gland dysfunction resulted in about a doubling of meibomian gland expression and improved meibum quality in both upper and lower eyelids, results from an ongoing, novel study showed.

Dry eye disease affects a large proportion of people in the United States “and the factors that contribute to that are certainly not going away,” lead study author James G. Chelnis, MD, said at the annual conference of the American Society for Laser Medicine and Surgery, where he presented the results during an abstract session. “Prepandemic, we used to have meetings in person; now most are on a computer screen,” a common risk factor for worsening dry eyes, he said. Telltale dry eye symptoms include blurry vision, irritation, and corneal damage – mostly caused by meibomian gland dysfunction – which affects the quality and quantity of meibum secreted. Common treatments include warm compresses, doxycycline, and artificial tears.

While some studies have shown IPL is helpful in treating dry eye disease caused by meibomian gland dysfunction, little information is available on its use alone or in combination with topical RF to preserve and improve the function of meibomian glands, said Dr. Chelnis, an ophthalmic plastic surgeon in New York City. “The theory here is that the radiofrequency would be able to vibrate the water molecules inside the meibomian glands, which would allow you to turn over the meibum faster, as well as improve the blink reflex response by building supporting collagen,” he said. “Our novel study explores the ability of this combined modality treatment to improve upon meibomian gland health.”
 

Study design, results

Dr. Chelnis and his colleagues enrolled 11 individuals with a previous diagnosis of dry eye disease and meibomian gland dysfunction and with Ocular Surface Disease Index (OSDI) survey scores higher than 23, which indicates at least moderate dry eye symptoms. Inclusion criteria were age 22 years or older, signs of meibomian gland dysfunction on biomicroscopy, a modified meibomian gland score over 12 in the lower eyelid of at least one eye, and skin types I-IV.

All patients received four treatments (each 2 weeks apart) of IPL to the lower eyelid, surrounding malar region, and nose, followed by 7 minutes of topical RF treatments at 1 MHz and 4 MHz extending to the inferior, lateral, and superior orbital rim. Evaluation of meibomian gland expression and quality of meibum upon expression was conducted following each treatment session, with a final evaluation 4 weeks after the final treatment session.

Meibum quality was evaluated on a 0-3 scale: clear (0), cloudy (1), inspissated (2), or blocked (3).

Following treatment, meibomian gland expression and meibum quality improved in all eyelids in all 11 patients. Specifically, in the right eye, the number of upper lid expressible glands increased from an average of 13 to 27.9 and the number of lower lid expressible glands increased from an average of 14.6 to 28.2; and in the left eye, the number of upper lid expressible glands increased from an average of 13.3 to 27.3 and the number of lower lid expressible glands increased from an average of 14.8 to 26.8 (P < .001 for all associations).

The overall percentage improvement in meibomian gland expression in the right eye was 82.7% for the upper lids and 136.6% for the lower lids, and in the left eye, 82.9% for the upper lids, and 112.2% for the lower lids.

When comparing upper against lower lids, meibomian gland expression increased 124.4% and 82.8%, respectively. Meibum quality improved in all four eyelids, although upper eyelids displayed a superior improvement compared with lower eyelids.
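
For clarity on the arithmetic, a percentage improvement is a relative change from baseline, (post − pre) / pre × 100. A cohort's reported improvement can be computed either as the relative change of the group means or as the mean of each patient's own relative change, and the two need not agree; the abstract does not specify which was used here. The toy numbers below are invented purely to show the difference.

```python
# Invented per-patient expressible-gland counts (pre, post); illustration only.
patients = [(10, 24), (16, 30), (13, 29), (14, 27), (12, 30)]

# Option 1: relative change of the group means.
mean_pre = sum(pre for pre, _ in patients) / len(patients)
mean_post = sum(post for _, post in patients) / len(patients)
change_of_means = (mean_post - mean_pre) / mean_pre * 100

# Option 2: mean of each patient's own relative change.
mean_of_changes = sum((post - pre) / pre * 100 for pre, post in patients) / len(patients)

print(f"Relative change of group means: {change_of_means:.1f}%")   # 115.4%
print(f"Mean of per-patient changes:    {mean_of_changes:.1f}%")   # 118.7%
```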

“We are finding that combining IPL plus RF produces a more complete and comprehensive improvement in the quality of their meibomian gland health, and as such, their dry eyes,” with “a large decrease in their symptom profile,” he concluded.

More patients to be studied

Dr. Chelnis acknowledged certain limitations of the study, including the small number of patients, but he and his colleagues have added an additional clinical site to expand the sample size. “Larger scale studies are needed to evaluate long-term effectiveness of IPL plus RF as well as a comparison with other treatment options.”

During a question-and-answer session, Mathew M. Avram, MD, JD, director of laser, cosmetics, and dermatologic surgery at Massachusetts General Hospital, Boston, who served as one of the abstract session moderators, asked Dr. Chelnis to comment on the mechanism of action of the IPL-RF combination in improving meibomian gland health.

“It’s not fully understood, but part of it is improved vascularity at the lid margin,” said Dr. Chelnis, who holds a faculty position in the department of ophthalmology at Icahn School of Medicine at Mount Sinai, New York. “Your ocular surface is sort of like your screen door; it catches everything that’s in the environment. An increase in vascularity and immunologic cytokines occurs in response to that. If you’re looking at the eye with a slit lamp, you can see a lot of vascularity that occurs at the lid margin and crowds the meibomian glands. When you decrease that crowding and immunogenic response, you move towards a normally functioning lid margin.”

Dr. Chelnis disclosed that he is a consultant to or an adviser for Lumenis, Horizon Therapeutics, and Soniquence.

Painful Nodules With a Crawling Sensation

Article Type
Changed
Wed, 04/26/2023 - 11:03

The Diagnosis: Cutaneous Furuncular Myiasis

Histopathology of the punch biopsy showed an undulating chitinous exoskeleton and pigmented spines (setae) protruding from the exoskeleton with associated superficial perivascular lymphohistiocytic infiltrates on hematoxylin and eosin stain (Figure 1). Live insect larvae were observed and extracted, which immediately relieved the crawling sensation (Figure 2). Light microscopy of the larva showed a row of hooks surrounding a tapered body with a head attached anteriorly (Figure 3).

FIGURE 1. A and B, Histopathology showed an undulating chitinous exoskeleton and pigmented spines (setae) protruding from exoskeleton with associated superficial perivascular lymphohistiocytic infiltrates (H&E, original magnifications ×4 and ×40).

Myiasis is a parasitic infestation of host organs and tissue by the larvae of dipterous flies. There are 5 types of myiasis based on the location of the infestation: wound myiasis occurs when eggs infest an open wound; furuncular myiasis results when eggs are introduced through healthy skin via a mosquito vector; plaque myiasis involves eggs deposited on clothing by various flies; creeping myiasis involves the Gasterophilus fly delivering the larva intradermally; and body cavity myiasis may develop in the orbit, nasal cavity, urogenital system, and gastrointestinal tract.1-3

FIGURE 2. An insect larva was extracted from a lesion on the arm, which immediately relieved the crawling sensation experienced by the patient, characteristic of furuncular myiasis.

Furuncular myiasis infestation occurs via a complex life cycle in which mosquitoes act as a vector and transfer the eggs to the human or animal host.1-3 Botfly larvae then penetrate the skin and reside within the subdermis to mature, and adults emerge after 1 month to repeat the cycle.1 Dermatobia hominis and Cordylobia anthropophaga are the most common causes of furuncular myiasis.2,3 Furuncular myiasis commonly presents in travelers returning from tropical countries. Initially, an itchy erythematous papule develops; after the larvae mature, lesions can appear boil-like with a small central punctum.1-3 Dermoscopy can be used to visualize larval anatomy, such as the furuncular-like lesion, spines, and the posterior breathing spiracle at the central punctum.4

FIGURE 3. Light microscopy of the larva showed a row of hooks surrounding a tapered body with a head attached anteriorly (original magnification ×40).

Our patient’s recent travel to the Amazon in Brazil, clinical history, and histopathologic findings ruled out other diagnoses in the differential, such as cutaneous larva migrans, gnathostomiasis, loiasis, and tungiasis.

Extraction of the intact larva from the nodule is curative. A localized injection of anesthetic into the skin can be used to bulge the larva outward for easier extraction, and a single 15-mg dose of ivermectin can also treat the infestation.1-3

References
  1. John DT, Petri WA, Markell EK, et al. Markell and Voge’s Medical Parasitology. 9th ed. Saunders Elsevier; 2006.
  2. Caissie R, Beaulieu F, Giroux M, et al. Cutaneous myiasis: diagnosis, treatment, and prevention. J Oral Maxillofac Surg. 2008;66:560-568.
  3. Lachish T, Marhoom E, Mumcuoglu KY, et al. Myiasis in travelers. J Travel Med. 2015;22:232-236.
  4. Mello C, Magalhães R. Triangular black dots in dermoscopy of furuncular myiasis. JAAD Case Rep. 2021;12:49-50.
Author and Disclosure Information

Dr. Yousefian is from the Center for Clinical and Cosmetic Research, Aventura, Florida, and the University of Incarnate Word, San Antonio, Texas. Drs. Foss, Ambur, Dunn, and Nathoo are from the Department of Dermatology, Kansas City University Graduate Medical Education Consortium, Missouri, and Advanced Dermatology and Cosmetic Surgery, Orlando, Florida.

The authors report no conflict of interest.

Correspondence: Faraz Yousefian, DO, 2925 Aventura Blvd, Ste 205, Aventura, FL 30180 (yousefian.faraz@gmail.com).

Issue
Cutis - 111(4)
Page Number
E30-E32

A 20-year-old man presented with progressively enlarging, painful lesions on the arm, accompanied by a crawling sensation, of 3 weeks’ duration. The lesions appeared after a recent trip to Brazil, where he had been hiking in the Amazon. He noted that the pain occurred suddenly and that there had been some serous drainage from the lesions. He denied any trauma to the area and reported no history of similar eruptions, treatments, or systemic symptoms. Physical examination revealed 2 tender erythematous nodules on the posterior aspect of the left arm, each measuring 0.6 cm in diameter, with associated crust and a reported crawling sensation. No drainage was seen. A punch biopsy was performed.


Diagnosis by dog: Canines detect COVID in schoolchildren with no symptoms

Article Type
Changed
Fri, 04/28/2023 - 00:44

Scent-detecting dogs have long been used to sniff out medical conditions ranging from low blood sugar and cancer to malaria, impending seizures, and migraines – not to mention explosives and narcotics.

Recently, the sensitivity of the canine nose has been tested as a strategy for screening for SARS-CoV-2 infection in schoolchildren showing no outward symptoms of the virus. A pilot study led by Carol A. Glaser, DVM, MD, of the California Department of Public Health in Richmond, found that trained dogs had an accuracy of more than 95% for detecting the odor of volatile organic compounds, or VOCs, produced by COVID-infected individuals.

The authors believe that odor-based diagnosis with dogs could eventually provide a rapid, inexpensive, and noninvasive way to screen large groups for COVID-19 without the need for antigen testing.

“This is a new program with research ongoing, so it would be premature to consider it from a consumer’s perspective,” Dr. Glaser said in an interview. “However, the data look promising and we are hopeful we can continue to pilot various programs in various settings to see where, and if, dogs can be used for biomedical detection.”

In the lab and in the field

In a study published online in JAMA Pediatrics, Dr. Glaser’s group found that after 2 months’ training on COVID-19 scent samples in the laboratory, the dogs detected the presence of the virus more than 95% of the time. Antigen tests were used as a comparative reference.

In medical terms, the dogs achieved greater than 95% accuracy on two key measures of test performance: sensitivity (the ability to correctly detect the presence of disease) and specificity (the ability to correctly rule out disease and identify an uninfected person as negative).

Next, the researchers piloted field tests in 50 visits at 27 schools from April 1 to May 25, 2022, to compare dogs’ detection ability with that of standard laboratory antigen testing. Participants in the completely voluntary screening numbered 1,558 and ranged in age from 9 to 17 years. Of these, 56% were girls and 89% were students. Almost 70% were screened at least twice.

Overall, the field test compared 3,897 paired antigen-vs.-dog screenings. The dogs accurately signaled the presence of 85 infections and correctly ruled out 3,411, for an overall accuracy of 90%. In 383 cases, however, they inaccurately signaled the presence of infection (false positives), and they missed 18 actual infections (false negatives). That translated to a field sensitivity of 83%, considerably lower than their laboratory performance.
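
These field numbers form a simple 2 × 2 confusion matrix, so the reported rates are easy to verify. The short sketch below (Python; the variable names are ours, and the counts are taken directly from the paragraph above) reproduces the reported sensitivity and overall accuracy:

# Field results: dog screening vs paired antigen testing (counts from the article)
true_positives = 85      # infections the dogs correctly signaled
false_negatives = 18     # infections the dogs missed
false_positives = 383    # dogs signaled infection; antigen test was negative
true_negatives = 3411    # infections the dogs correctly ruled out

total = true_positives + false_negatives + false_positives + true_negatives  # 3,897 paired screenings

sensitivity = true_positives / (true_positives + false_negatives)  # 85/103 ~ 82.5%, reported as 83%
specificity = true_negatives / (true_negatives + false_positives)  # 3411/3794 ~ 89.9%
accuracy = (true_positives + true_negatives) / total               # 3496/3897 ~ 89.7%, reported as 90%

print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, accuracy {accuracy:.1%}")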

Direct screening of individuals with dogs outside the laboratory involved circumstantial factors that likely contributed to the decreased sensitivity and specificity, the authors acknowledged. These included distractions such as noise and the presence of excitable young children, as well as environmental conditions such as wind and competing odors. What about dog phobia and dog hair allergy? “Dog screening takes only a few seconds per student and the dogs do not generally touch the participant as they run a line and sniff at ankles,” Dr. Glaser explained.

As for allergies, the rapid, ankle-level screening occurred in outdoor settings. “The chance of allergies is very low. This would be similar to someone who is out walking on the sidewalk and walks by a dog,” Dr. Glaser said.

Last year, a British trial of almost 4,000 adults tested six dogs trained to detect differences in VOCs between COVID-infected and uninfected individuals. Given samples from both groups, the dogs were able to distinguish between infected and uninfected samples with a sensitivity for detecting the virus ranging from 82% to 94% and a specificity for ruling it out of 76% to 92%. And they were able to smell the VOCs even when the viral load was low. The study also tested organic sensors, which proved even more accurate than the canines.

According to lead author James G. Logan, PhD, a disease control expert at the London School of Hygiene & Tropical Medicine in London, “Odour-based diagnostics using dogs and/or sensors may prove a rapid and effective tool for screening large numbers of people. Mathematical modelling suggests that dog screening plus a confirmatory PCR test could detect up to 89% of SARS-CoV-2 infections, averting up to 2.2 times as much transmission compared to isolation of symptomatic individuals only.”
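
The article does not spell out the model behind the 89% figure, but it is consistent with the standard rule for serial screening, in which combined sensitivity is the product of the stage sensitivities. A hedged illustration only, taking the upper dog sensitivity from the British trial and a purely assumed PCR sensitivity:

# Illustration only: the modelling inputs are not given in the article.
dog_sensitivity = 0.94   # upper bound reported for the dogs in the British trial
pcr_sensitivity = 0.95   # assumed value for confirmatory PCR, not from the article
combined_sensitivity = dog_sensitivity * pcr_sensitivity  # serial testing: product of sensitivities
print(f"combined sensitivity ~ {combined_sensitivity:.0%}")  # ~ 89%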

Funding was provided by the Centers for Disease Control and Prevention Foundation (CDCF) to Early Alert Canines for the purchase and care of the dogs and the support of the handlers and trainers. The CDCF had no other role in the study. Coauthor Carol A. Edwards of Early Alert Canines reported receiving grants from the CDCF.

Article Source

FROM JAMA PEDIATRICS
