Apremilast has neutral effect on vascular inflammation in psoriasis study
BOSTON – Treatment with apremilast had a neutral effect on aortic vascular inflammation in patients with moderate-to-severe psoriasis but was associated with changes in adiposity and in markers of lipid and glucose metabolism, in a study presented at the 2022 American Academy of Dermatology annual meeting.
In the phase 4, open-label, single-arm trial, participants also lost subcutaneous and visceral fat after 16 weeks on the oral medication, a phosphodiesterase 4 (PDE4) inhibitor, and maintained that loss at 52 weeks.
People with psoriasis have an increased risk of obesity, type 2 diabetes, and cardiovascular events. Patients with more significant psoriasis “tend to die about 5 years younger than they should, based on their risk factors for mortality,” Joel Gelfand, MD, MSCE, professor of dermatology and epidemiology and vice chair of clinical research in dermatology at the University of Pennsylvania Perelman School of Medicine, Philadelphia, told this news organization.
He led the research and presented the findings at the AAD meeting March 26. “As a result, there has been a keen interest in understanding how psoriasis therapies impact cardiovascular risk, the idea being that by controlling inflammation, you may lower the risk of these patients developing cardiovascular disease over time,” he said.
Previous trials looking at the effect of psoriasis therapies on vascular inflammation “have been, for the most part, inconclusive,” Michael Garshick, MD, a cardiologist at NYU Langone Health, told this news organization. Dr. Garshick was not involved with the research. A 2021 systematic review of psoriasis clinical trials reported that the tumor necrosis factor (TNF) blocker adalimumab (Humira) and phototherapy had the greatest effect on cardiometabolic markers, while ustekinumab (Stelara), an interleukin (IL)-12 and IL-23 antagonist, was the only treatment that improved vascular inflammation. These variable findings make this area “ripe for study,” noted Dr. Garshick.
To observe how apremilast, which is approved by the FDA for treating psoriasis and psoriatic arthritis, affected vascular inflammation, adiposity, and blood-based cardiometabolic markers, Dr. Gelfand organized an open-label study in adults with moderate-to-severe psoriasis. All participants were 18 years or older, had psoriasis for at least 6 months, and were candidates for systemic therapy. All patients underwent FDG PET/CT scans to assess aortic vascular inflammation and had blood work at baseline. Of the 70 patients originally enrolled, 60 remained in the study at week 16, including 57 who underwent imaging a second time. Thirty-nine participants remained in the study until week 52, and all but one had another scan.
The average age of participants was 47 years, and their mean BMI was 30. Most participants were White (83%), and 77% were male. The study population had lived with psoriasis for an average of 16 years, and 8 patients also had psoriatic arthritis. At baseline, participants had a mean Psoriasis Area and Severity Index (PASI) score of 18.62, a mean Dermatology Life Quality Index (DLQI) score of 11.60, and a mean affected body surface area (BSA) of 22%. The mean TBRmax, the marker of vascular inflammation, was 1.61.
Treatment responses were as expected for apremilast: by week 16, 35% of patients had achieved PASI 75 and 65% reported DLQI scores of 5 or less. At 52 weeks, 31% of the cohort had achieved PASI 75 and 67% reported DLQI scores of 5 or less. All psoriasis endpoints had improved from baseline (P = .001).
Throughout the study period, there was no significant change in TBRmax. However, in a sensitivity analysis, the 16 patients with a baseline TBRmax of 1.6 or higher had an absolute reduction of 0.21 in TBR by week 52. “That suggests that maybe a subset of people who have higher levels of aortic inflammation at baseline may experience some reduction that portend, potentially, some health benefits over time,” Dr. Gelfand said. “Ultimately, I wouldn’t hang my hat on the finding,” he said, noting that additional research comparing the treatment to placebo is necessary.
Both visceral and subcutaneous adipose tissue (VAT and SAT) decreased by week 16, and this reduction was maintained through week 52. In the first 16 weeks of the study, VAT decreased by 5.32% (P = .0009), and SAT decreased by 5.53% (P = .0005). From baseline to 52 weeks, VAT decreased by 5.52% (P = .0148), and SAT decreased by 5.50% (P = .0096). There were no significant differences between week 16 and week 52 in VAT or SAT.
Of the 68 blood biomarkers analyzed, there were significant decreases in the inflammatory markers ferritin (P = .015) and IL-1 beta (P = .006), the lipid metabolism biomarker HDL-cholesterol efflux (P = .008), and ketone bodies (P = .006). There were also increases in the inflammatory marker IL-8 (P = .003), the lipid metabolism marker ApoA (P = .05), and insulin (P = .05). Ferritin was the only biomarker that was reduced at both week 16 and week 52.
“If you want to be a purist, this was a negative trial,” said Dr. Garshick, because apremilast was not found to decrease vascular inflammation; however, he noted that the biomarker changes “were hopeful secondary endpoints.” It could be, he said, that another outcome measure may be better able to show changes in vascular inflammation compared with FDG. “It’s always hard to figure out what a good surrogate endpoint is in cardiovascular trials,” he noted, “so it may be that FDG/PET is too noisy or not reliable enough to see the outcome that we want to see.”
Dr. Gelfand reports consulting fees/grants from Amgen, AbbVie, BMS, Boehringer Ingelheim, Janssen Biologics, Novartis Corp, Pfizer, and UCB (DSMB). He serves as the Deputy Editor for the Journal of Investigative Dermatology and the Chief Medical Editor at Healio Psoriatic Disease and receives honoraria for both roles. Dr. Garshick has received consulting fees from AbbVie.
A version of this article first appeared on Medscape.com.
AT AAD 2022
Protease inhibitors increase small-for-gestational-age but not other pregnancy risks
Pregnant women with HIV can be reassured that protease inhibitors are safer than previously thought in terms of risk to the fetus, according to research from the National Perinatal Epidemiology Unit (NPEU) at Oxford Population Health, a research institute based at the University of Oxford (England).
Antiretroviral therapy (ART) is recommended for all pregnant women living with HIV and plays a crucial role both in improving maternal health and in reducing transmission of HIV from mother to child. However, there has been a critical lack of evidence about the effects of ART on the risk of adverse pregnancy outcomes, with particular concern about protease inhibitors.
Current guidelines recommend that protease inhibitor-based therapies should be used in pregnancy only if first-line treatments (such as integrase and reverse-transcriptase based treatments) are either unsuitable or unavailable. These guidelines also often advise against the use of a specific protease inhibitor, lopinavir/ritonavir, citing an increased risk of preterm birth. However, such advice may restrict treatment options for pregnant women with HIV on the basis of limited evidence.
Largest review to date
The NPEU researchers, therefore, conducted the largest systematic review to date of adverse perinatal outcomes after a range of antiretroviral therapies. It included 34 cohort studies published between 1980 and 2020 and involving over 57,000 pregnant women with HIV in 22 different countries. The review, published in eClinicalMedicine, looked for evidence of 11 perinatal outcomes:
- Preterm birth, very preterm birth, and spontaneous preterm birth
- Low birth weight, very low birth weight, term low birth weight, and preterm low birth weight
- Small for gestational age and very small for gestational age
- Stillbirth, and neonatal death
Using pairwise random-effects meta-analyses, researchers compared protease inhibitor versus non-protease inhibitor-based ART, as well as specifically looking at the comparative risks associated with different protease inhibitor regimens.
They found that protease inhibitor-based ART significantly increased the risk of small or very small for gestational age babies, with relative risks of 1.24 (95% confidence interval, 1.08-1.43; I² = 66.7%) and 1.40 (95% CI, 1.09-1.81; I² = 0.0%), respectively. However, there were no significant differences in other adverse pregnancy outcomes for protease inhibitors, compared with other therapies.
In addition, researchers found no significant differences in perinatal outcomes between ART regimens containing lopinavir/ritonavir, atazanavir/ritonavir, or darunavir/ritonavir, which are the most frequently used protease inhibitors.
No increased risk of preterm birth
Senior author Dr. Joris Hemelaar, senior clinical research fellow at the NPEU and honorary consultant in obstetrics at the John Radcliffe Hospital, Oxford (England), said: “Antiretroviral therapy in pregnancy has clear benefits for maternal health and prevention of HIV transmission to the child, but our study has shown for the first time that protease inhibitors are associated with babies being small or very small for their gestational age.”
“However, there was no increased risk of preterm birth, or any other adverse pregnancy outcomes. This means protease inhibitors remain an important option for pregnant women living with HIV if other treatments are unsuitable, for example due to drug resistance, or unavailable. The evidence presented here indicates that the commonly used protease inhibitors atazanavir, lopinavir, and darunavir are comparable with regard to perinatal outcomes, which should inform international treatment guidelines.”
Over 70% of the studies assessed were conducted in high-income countries, and Dr. Hemelaar added that there is an urgent need for more research on pregnancy outcomes after different ART in low- to middle-income countries, where the burden of HIV is highest.
Professor Yvonne Gilleece, a spokesperson for the British HIV Association (BHIVA) and immediate past chair of the BHIVA guidelines on the management of HIV in pregnancy and the postpartum period, commented: “Pregnancy is a unique life situation in which we must consider the safety of both the birthing parent and the baby. Due to ongoing under-representation of all women in clinical trials, but particularly pregnant women, we do not have enough evidence on which to base all our management decisions. This systematic review includes large numbers of pregnant women living with HIV and can, therefore, improve an informed discussion regarding the safety of the use of protease inhibitors during pregnancy.”
Dr. Hemelaar told Medscape UK: “Many international treatment guidelines cite adverse pregnancy outcomes, in particular preterm birth, associated with protease inhibitor (PI) drugs as a reason for caution for their use in pregnancy. However, PI drugs are not associated with preterm birth in our analysis. This suggests that PI drugs may not be as detrimental as previously thought (and we found no differences between different PI drugs used), and, hence, these drugs may have a more favourable profile for use in pregnancy.
“However, many other aspects of treatment, including the extent to which the virus can be suppressed, adverse drug effects, adherence to drug prescriptions, antiretroviral drug resistance, drug interactions, drug cost, and availability, should also be taken into account by clinicians and guideline development committees.”
A version of this article first appeared on Medscape UK.
FROM ECLINICALMEDICINE
Mutation testing recommended for advanced and refractory thyroid cancer
A new consensus statement focuses on a definition of advanced thyroid cancer and outlines strategies for mutation testing and targeted treatment.
Mutation testing should not be pursued if cancer burden and disease threat are low, since most thyroid cancers have a very good prognosis and are highly treatable. But 15% of differentiated thyroid cancer cases are locally advanced, and radioiodine refractory differentiated thyroid cancer has a 10-year survival below 50%.
More generally, advanced thyroid cancer has not been well defined clinically. Physicians with experience diagnosing advanced disease may recognize it, but there is no widely accepted definition. “This may be the first time that an expert group of physicians has attempted to define what advanced thyroid cancer is,” said David Shonka, MD, who is a coauthor of the consensus statement, which was published online in Head & Neck. He is an associate professor of otolaryngology/head and neck surgery at the University of Virginia, Charlottesville.
“All patients with advanced thyroid disease and most patients with incurable radioiodine refractory differentiated thyroid cancer should undergo somatic mutational testing,” the authors wrote. “Next-generation sequencing can reveal targetable mutations and potentially give patients affected by advanced thyroid carcinoma systemic treatment options that can prolong survival. These new innovative approaches are changing the landscape of clinical care for patients with advanced thyroid cancer.”
For differentiated thyroid cancer and medullary thyroid carcinoma, the authors created a definition that combines structural factors on imaging, along with surgical findings, and biochemical, histologic, and molecular factors. Anaplastic thyroid cancer should always be considered advanced, even after a complete resection and incidental pathological identification.
The statement also summarizes recent advances in thyroid cancer that have revealed molecular markers which contribute to oncogenesis. Initially, those approaches were applied to indeterminate fine needle biopsies to improve diagnosis. More recent studies used them to match patients to targeted therapies. There are Food and Drug Administration–approved therapies targeting the BRAF and RET mutations, but advanced thyroid cancer is also included in some “basket” trials that test targeted agents against driver mutations across multiple tumor types.
Radioiodine refractory differentiated thyroid cancer had few treatments as recently as 10 years ago. But recent research has shown that multikinase inhibitors improve outcomes, and a range of mutations have been found in this type of thyroid cancer, including BRAF V600E, RET, PIK3CA, and PTEN, and fusions involving RET, NTRK, and ALK. Other mutations have been linked to more aggressive disease. Efforts to personalize treatment also include microsatellite stability status, tumor mutational burden, and programmed death–ligand 1 status as indicators for immunotherapy. “With discovery of many other molecular targets, and emerging literature showcasing promise of matched targeted therapies, we recommend that all patients with advanced thyroid cancer have comprehensive genomic profiling on tumor tissue through (next generation sequencing),” the authors wrote.
These newer and novel therapies have presented physicians with options outside of surgery, chemotherapy, or radiotherapy, which have low efficacy against advanced thyroid cancer. “It is an area in which there has been substantial change. Even 5-7 years ago, patients with advanced thyroid cancer that was not responsive to radioactive iodine or surgery really didn’t have a lot of options. This is really an exciting and growing field,” Dr. Shonka said.
He specifically cited anaplastic thyroid cancer, which like radioiodine refractory differentiated thyroid cancer has had few treatment options until recently. “Now, if you see a patient with anaplastic thyroid cancer, your knee-jerk reaction should be ‘let’s do molecular testing on this, this is definitely advanced disease.’ If they have a BRAF mutation, that’s targetable, and we can treat this patient with combination therapy that actually improves their survival. So, there’s some exciting stuff happening and probably more coming down the road as we develop new drugs that can target these mutations that we’re identifying.”
Dr. Shonka has no relevant financial disclosures.
FROM HEAD & NECK
Trichotillomania: What you should know about this common hair-pulling disorder
Trichotillomania is a chronic psychiatric disorder that causes people to repeatedly pull out their own hair. Not only does it result in alopecia with no other underlying cause, but it can have significant psychosocial ramifications and rare, but serious, complications. With reported prevalence rates of up to approximately 2%, it’s probable that you’ll come upon a patient suffering with this disorder at your practice, if you haven’t already.
To find out more about the best methods for diagnosing and treating this disorder, we spoke with Jon E. Grant, JD, MD, MPH, a leading trichotillomania researcher and part of the department of psychiatry and behavioral neuroscience at the University of Chicago.
Defining trichotillomania
What were the earliest descriptions of trichotillomania in medical literature?
The first real discussion of it probably goes back to Hippocrates, but from a modern medical perspective, discussion began in the 19th century with reports from the French dermatologist François Hallopeau.
They didn’t really call them disorders then – it was long before the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) – but they described this in young men who kept pulling their hair for unclear reasons. These early case reports don’t provide a lot of psychological perspective, but they seem consistent with what we see now.
What are the diagnostic criteria for trichotillomania?
The current DSM-5 criteria are recurrent pulling out of hair, an inability to stop it, the pulling resulting in some noticeable thinning or hair loss, and that it causes some level of distress or some type of impairment in functioning.
At what age do most people experience an onset of symptoms?
Generally speaking, it’s in early adolescence, post puberty, around 12-15 years of age. Having said that, we do see children as young as 1-2 years who are pulling their hair, and we occasionally see somebody far older who is doing it for the first time, a sort of geriatric onset.
Overlap and differences with other disorders
You’ve written that although trichotillomania is grouped with obsessive-compulsive disorder (OCD) in the DSM-5, the thinking around that has recently shifted. Why is that?
At first, it was noticed that many of these people pulled their hair repetitively in an almost ritualized manner, perhaps every night before bed. That looked like a compulsion of OCD.
When DSM-5 came out in 2013, they grouped it with OCD. Yet people shifted to thinking that it’s kind of a cousin of OCD because it has this compulsive quality but doesn’t really have obsessive thinking that drives it. Many people just pull their hair. They’re not even always aware of it: sometimes yes, sometimes no.
We know that it has some links to OCD. You’ll see more OCD in folks with trichotillomania, but it clearly is not just the same as OCD. One of the biggest pieces of evidence for that is that our first-line treatment for OCD – a selective serotonin reuptake inhibitor antidepressant – does not really help hair pulling.
Having said that, if people are looking for help with trichotillomania, they often are best served by therapists and doctors who have a familiarity with OCD and have kept it on their radar over the past couple of decades.
How does trichotillomania overlap with skin picking disorder, which is another condition that you’ve closely researched?
It does have some overlap with skin picking in the sense that it often seems familial. For example, the mother may pull her hair and the child picks their skin.
It also has a fair amount of comorbidity with skin picking. Many people who pull will pick a little bit or did at some point. Many people who pick pulled their hair at some point. It seems closely related to nail biting as well.
Studies have also shown that one of the things that runs in the histories of most families of people with trichotillomania might be substance abuse – alcohol or drug addiction.
All of this has led people to believe that there might be subtypes of trichotillomania: one that’s more like an OCD and one that’s more like an addiction. That’s similar to the debate with other mental health conditions, that there are probably multiple types of depression, multiple types of schizophrenia.
Is there a component of this that could be defined as self-harm?
That’s been its own debate. It doesn’t seem to have the same developmental trajectory that we see with self-harm, or even some of the personality features.
However, there may be a small segment of folks with trichotillomania that might more appropriately fit that category. For example, those with family histories of trauma, higher rates of posttraumatic stress disorder, or borderline personality. But it wouldn’t be the majority.
The problem is, if you look at some of the pediatrician data, they often group picking, pulling, and cutting. I think that’s far too all-inclusive.
A gap in clinician education
Are adolescent patients likely to self-report this behavior, or is it something that physicians need to suss out for themselves?
Clearly, if child psychologists, psychiatrists, or pediatricians see young people with patches of alopecia – eyebrows or eyelashes missing, head hair with spots – in addition to a dermatologic assessment, they should simply ask, “Do you pull your hair?”
But it’s interesting that with the internet, young people are much more likely to disclose and actually come forward and tell their parents that they think they have trichotillomania.
I also hear from a lot of the adolescents that they have to educate their doctors about trichotillomania because so often physicians don’t know much about it and will assume that it’s self-injury or just a symptom of anxiety. It’s a little bit of a flip from what we might have seen 20 years ago.
I’ve seen several patients who’ve said, basically, “I’m tired of no professionals seeming to know about this. I shouldn’t have to be educating my doctors about this.” I tell them that I completely agree. It’s a shame because if a doctor doesn’t know about it, then how can they get the appropriate care?
What are the complications that accompany trichotillomania?
A small percentage, maybe about 10%, will ingest their hair, much like people who bite and swallow their fingernails. The concern there is that because hair is nondigestible, it could create an intestinal plug that could rupture and be potentially life-threatening. That makes it all the more important to ask those who pull their hair what they do with the hair once they pull it.
However, with most people, the real problem is with self-esteem. Young people may not want to socialize, go on dates, or do other things they would normally do because of it. In adults, you may find that they’re far more educated than their job allows but don’t want to go to an interview because they don’t want to have somebody sit there and look at them and notice that perhaps they don’t have any eyebrows, or that they’re wearing a wig. Those psychosocial implications are huge for so many people.
Treatment options
In a 2021 study, you showed that nearly one-quarter of people with trichotillomania do naturally recover from it. What characteristics do they seem to have?
It’s interesting because we see natural recovery across many mental health problems: alcohol addiction, gambling, OCD. The question then becomes why it is that some people can seemingly just stop doing a behavior? Can we learn from those people?
We did see that those who naturally recovered were less likely to have some other mental health comorbidities. It seems like when you have other things such as skin picking or OCD plus trichotillomania, that it probably speaks to something that perhaps synergistically is keeping it going. But this is just a first study; learning how to harness and understand it is the next step.
What’s the goal of treating trichotillomania?
The desired goal is zero pulling. The realistic goal is more likely significantly reduced pulling that then leads to greater function in life and greater quality of life.
One doesn’t have to go from 100 to 0 in order to do that. I always tell people that maybe every now and then, every few months, when something is going on in life, you might find yourself pulling a hair or two. That’s okay. If you’re not pulling every day and it’s significantly reduced, we’ll call that a success. I think that setting reasonable goals at this point is really important.
And what would the treatment pathway look like for most patients?
The standard approach is probably some type of habit-reversal therapy, of which there have been many variants over the years. It involves doing something different with your hand, identifying the triggers that may set you off, and then doing something in response to those triggers that is not pulling and might neutralize whatever that anxious or stressed feeling is. That could be different with each person.
At this point, there is no drug approved by the U.S. Food and Drug Administration for trichotillomania. Our best approaches have included N-acetylcysteine, a glutamate modulator, which we’ve done research in.
That’s kind of a go-to option for people because its side-effect profile is generally innocuous. The data show that it could be beneficial in many people with very few, if any, side effects. That would be one “medication,” although it’s actually an over-the-counter supplement. But we’re constantly looking for better and better treatments.
Do you have any final advice for clinicians or researchers?
Given how common it is, I don’t think clinicians should just see it as an innocuous little habit that people should be able to stop on their own. Clinicians should educate themselves about trichotillomania and know where the person should get the appropriate care.
From the research perspective, given the fact that we see this in animals of multiple species – that they overgroom – this seems to be deeply ingrained in us as animals. So when it comes to the underlying neuroscience, people should pay more attention because it probably has a lot to do with our understanding of habit and compulsive behaviors. It arguably can cut across a lot of different behaviors.
A version of this article first appeared on Medscape.com.
The Molting Man: Anasarca-Induced Full-Body Desquamation
Edema blisters are a common but often underreported entity most commonly seen on the lower extremities in the setting of acute edema.1 Reported risk factors and associations include chronic venous insufficiency, congestive heart failure, hereditary angioedema, and medications (eg, amlodipine).1,2 We report a newly described variant that we have termed anasarca-induced desquamation in which a patient sloughed the entire cutaneous surface of the body after gaining almost 40 pounds over 5 days.
Case Report
A 50-year-old man without a home was found minimally responsive in a yard. His core body temperature was 25.5 °C. He was profoundly acidotic (pH, <6.733 [reference range, 7.35–7.45]; lactic acid, 20.5 mmol/L [reference range, 0.5–2.2 mmol/L]) at admission. His medical history was notable for diabetes mellitus, hypertension, alcohol abuse, and pulmonary embolism. The patient was resuscitated with rewarming and intravenous fluids in the setting of acute renal insufficiency. By day 5 of the hospital stay, he had a net positive intake of 21.8 L and an 18-kg (39.7-lb) weight gain.
Dermatology was consulted for skin sloughing. Physical examination revealed nonpainful desquamation of the vermilion lip, periorbital skin, right shoulder, and hips without notable mucosal changes. Two 4-mm punch biopsies of the shoulder revealed an intracorneal split with desquamation of the stratum corneum and a mild dermal lymphocytic infiltrate, consistent with exfoliation secondary to edema or staphylococcal scalded skin syndrome (Figure 1). No staphylococcal growth was noted on blood, urine, nasal, wound, and ocular cultures throughout the hospital stay.
As the patient’s anasarca improved with diuretics and continuous renal replacement therapy, the entire cutaneous surface—head to toe—underwent desquamation, including the palms and soles. He was managed with supportive skin care. The skin healed completely with residual hypopigmentation (Figures 2 and 3).
Comment
Anasarca-induced desquamation represents a more diffuse form of a known entity: edema blisters. Occurring most commonly in the setting of acute exacerbation of chronic venous insufficiency, edema blisters can mimic other vesiculobullous conditions, such as bullous pemphigoid and herpes zoster.3
Pathogenesis of Edema Blisters—Edema develops in the skin when the capillary filtration rate, determined by the hydrostatic and oncotic pressures of the capillaries and interstitium, exceeds venous and lymphatic drainage. The appearance of edema blisters in the acute setting likely is related to the speed at which edema develops in the skin.1 Although edema blisters often are described as tense, there is a paucity of histologic data on the anatomical level of the split in the skin.
In our patient, desquamation occurred within the stratum corneum and likely was multifactorial. His weight gain of nearly 40 lb, the result of intravenous fluid administration and low urine output, was undeniably a contributing factor. The anasarca was aggravated by hypoalbuminemia (2.1 g/dL) in the setting of known liver disease. Other possible contributing factors were hypotension, which required vasopressor therapy and led to hypoperfusion of the skin, and treatment of hypothermia, with resulting reactive vasodilation and capillary leak.
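As a brief aside (the original report describes this balance qualitatively rather than as a formula), the filtration balance referred to above is the classic Starling relationship, in which net capillary filtration $J_v$ depends on the filtration coefficient $K_f$, the capillary and interstitial hydrostatic pressures $P_c$ and $P_i$, the reflection coefficient $\sigma$, and the capillary and interstitial oncotic pressures $\pi_c$ and $\pi_i$:

$$ J_v = K_f \left[ (P_c - P_i) - \sigma (\pi_c - \pi_i) \right] $$

Edema accumulates when $J_v$ persistently exceeds lymphatic and venous drainage. In this patient, large-volume intravenous fluids would raise $P_c$ while hypoalbuminemia (2.1 g/dL) would lower $\pi_c$, both shifting the balance toward filtration.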
Management—Treatment of acute edema blisters is focused on the underlying cause of the edema. In a study of 13 patients with edema blisters, all had blisters on the legs that resolved with treatment, such as diuretics or compression therapy.1
Anasarca-induced desquamation is an inherently benign condition that mimics potentially fatal disorders, such as Stevens-Johnson syndrome, staphylococcal scalded skin syndrome, and toxic shock syndrome. Therefore, patients presenting with diffuse superficial desquamation should be assessed for the mucosal changes of Stevens-Johnson syndrome and a history of acute edema in the affected areas to avoid potentially harmful empiric treatments, such as corticosteroids and intravenous antibiotics.
Conclusion
Anasarca-induced desquamation represents a more diffuse form of edema blisters. This desquamation can mimic potentially fatal conditions such as Stevens-Johnson syndrome and staphylococcal scalded skin syndrome.
- Bhushan M, Chalmers RJ, Cox NH. Acute oedema blisters: a report of 13 cases. Br J Dermatol. 2001;144:580-582. doi:10.1046/j.1365-2133.2001.04087.x
- Fabiani J, Bork K. Acute edema blisters on a skin swelling: an unusual manifestation of hereditary angioedema. Acta Derm Venereol. 2016;96:556-557. doi:10.2340/00015555-2252
- Chen SX, Cohen PR. Edema bullae mimicking disseminated herpes zoster. Cureus. 2017;9:E1780. doi:10.7759/cureus.1780
Practice Points
- The appearance of anasarca-induced desquamation can be similar to staphylococcal scalded skin syndrome and Stevens-Johnson syndrome.
- Histopathologic evaluation of this condition shows desquamation localized to the stratum corneum without epidermal necrosis.
- Careful evaluation, including bacterial culture, is required to rule out an infectious cause.
- Early diagnosis of anasarca-induced desquamation reduces the potential for providing harmful empiric treatment, such as systemic steroids and intravenous antibiotics, especially in patients known to have comorbidities.
Study suggests keto diet increases tumor growth in ovarian cancer
A ketogenic diet fed to mice with epithelial ovarian cancer led to significantly increased tumor growth and gut microbiome alterations, according to a study recently presented at the annual meeting of the Society of Gynecologic Oncology.
“The keto diet is very popular, especially among patients who believe it may treat cancer by starving tumors of the fuel they need to grow, altering the immune system, and other anticancer effects,” said study leader Mariam AlHilli, MD, of the Cleveland Clinic.
The findings are surprising because in other studies the high-fat, zero-carb ketogenic diet has demonstrated tumor-suppressing effects. It has been under study as a possible adjuvant therapy for other cancers, such as glioblastoma, colon cancer, prostate cancer, and pancreatic cancer.
“While we don’t know yet whether these findings extend to patients, the results in animals indicate that instead of being protective, the keto diet appears to promote ovarian cancer growth and progression,” Dr. AlHilli said.
In the present study, tumor-bearing mice were fed a keto diet consisting of 10% protein, 0% carbohydrates, and 90% fat; the high-fat diet was 10% protein, 15% carbohydrates, and 75% fat; and the control diet was 10% protein, 77% carbohydrates, and 13% fat. Epithelial ovarian cancer tumor growth was monitored weekly.
Over the 6- to 10-week course of the study, a 9.1-fold increase from baseline in tumor growth was observed in the keto diet–fed mice (n = 20). Among mice fed a high-fat diet (n = 20) that included some carbohydrates, tumor growth increased 2.0-fold from baseline, and among control mice (n = 20) fed a low-fat, high-carbohydrate diet, tumor growth increased 3.1-fold.
The investigators observed several hallmarks of tumor progression: tumor-associated macrophages were significantly enriched, activated lymphoid cells (natural killer cells) were significantly reduced (P < .001), and M2:M1 macrophage polarization trended higher. Also, in keto diet–fed mice, gene set enrichment analysis revealed that epithelial ovarian cancer tumors had increased angiogenesis and inflammatory responses, an enhanced epithelial-to-mesenchymal transition phenotype, and altered lipid metabolism. Compared with high-fat diet–fed mice, the keto-fed mice had increases in lipid catalytic activity and catabolism, as well as decreases in lipid synthesis.
“The tumor increase could be mediated by the gut microbiome or by gene alterations or by metabolite levels that influence tumor growth. It’s possible that each cancer type is different. The composition of the diet may be a factor, as well as how tumors metabolize fat and ketones,” Dr. AlHilli said.
The results need to be confirmed in additional preclinical animal studies and models, she added.
The study was funded by a K12 Grant and internal funding from Cleveland Clinic. Dr. AlHilli declared no relevant disclosures.
FROM SGO 2022
AI model predicts ovarian cancer responses
Dr. Glassman described her research in a presentation given at the annual meeting of the Society of Gynecologic Oncology.
While the AI model correctly identified all excellent-response patients, it misclassified about a third of poor-response patients as excellent responders. The smaller number of images in the poor-response category, Dr. Glassman speculated, may explain the misclassification.
Researchers took 435 representative still-frame images from pretreatment laparoscopic surgical videos of 113 patients with pathologically proven high-grade serous ovarian cancer. They used 70% of the images to train the model, 10% for validation, and 20% for testing. The model was developed with images from four anatomical locations (diaphragm, omentum, peritoneum, and pelvis) and trained with deep learning and neural networks to extract morphological disease patterns and correlate them with one of two outcomes: excellent response or poor response. An excellent response was defined as progression-free survival (PFS) of 12 months or more, and a poor response as PFS of 6 months or less. In the retrospective image set, after 32 gray-zone patients were excluded, 75 patients (66%) had durable responses to therapy and 6 (5%) had poor responses.
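To make the reported workflow concrete, the sketch below illustrates, in rough outline, the kind of pipeline described: still-frame images labeled by PFS-based response category, a 70%/10%/20% train/validation/test split, and a pretrained convolutional network fine-tuned as a binary classifier. This is not the authors' code; the directory layout, backbone, loss weighting, and hyperparameters are assumptions for illustration only.

```python
# Illustrative sketch only (not the authors' code). Assumes frames are organized as
# frames_by_response/excellent/*.png and frames_by_response/poor/*.png, with folder
# assignment derived from PFS as in the study (>=12 months excellent, <=6 months poor,
# 6-12 months excluded as a gray zone).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, random_split
from torchvision import datasets, models, transforms


def pfs_to_folder(pfs_months: float):
    """Assign a response folder from PFS in months; gray-zone cases are excluded."""
    if pfs_months >= 12:
        return "excellent"
    if pfs_months <= 6:
        return "poor"
    return None  # 6-12 months: excluded


tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
# ImageFolder assigns class indices alphabetically: "excellent" -> 0, "poor" -> 1.
dataset = datasets.ImageFolder("frames_by_response", transform=tfm)

# 70% train / 10% validation / 20% test, mirroring the split reported in the study.
n = len(dataset)
n_train, n_val = int(0.7 * n), int(0.1 * n)
train_ds, val_ds, test_ds = random_split(
    dataset,
    [n_train, n_val, n - n_train - n_val],
    generator=torch.Generator().manual_seed(0),
)
train_loader = DataLoader(train_ds, batch_size=16, shuffle=True)

# Pretrained backbone with a new 2-class head; the weighted loss up-weights the
# rare poor-response class (class index 1). Weights are hypothetical.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)
criterion = nn.CrossEntropyLoss(weight=torch.tensor([1.0, 5.0]))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for images, labels in train_loader:  # one illustrative pass over the training split
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```

A weighted loss (or resampling) is one common way to handle the class imbalance that Dr. Glassman cited as a likely source of the misclassified poor responders.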
The PFS was 19 months in the excellent-response group and 3 months in the poor-response group.
Clinicians have often observed differences in gross morphology within the single histologic diagnosis of high-grade serous ovarian cancer. The research intent was to determine whether AI could detect these distinct morphological patterns in the still-frame images taken at the time of laparoscopy and correlate them with eventual clinical outcomes. Dr. Glassman and colleagues are currently validating the model with a much larger cohort and will look into clinical testing.
“The big-picture goal,” Dr. Glassman said in an interview, “would be to utilize the model to predict which patients would do well with traditional standard of care treatments and those who wouldn’t do well so that we can personalize the treatment plan for those patients with alternative agents and therapies.”
Once validated, the model could also be employed to identify patterns of disease in other gynecologic cancers or distinguish between viable and necrosed malignant tissue.
The study’s predominant limitation was its small sample size, which is being addressed in a larger ongoing study.
Funding was provided by a T32 grant, MD Anderson Cancer Center Support Grant, MD Anderson Ovarian Cancer Moon Shot, SPORE in Ovarian Cancer, the American Cancer Society, and the Ovarian Cancer Research Alliance. Dr. Glassman declared no relevant financial relationships.
FROM SGO 2022
Poverty-related stress linked to aggressive head and neck cancer
A humanized mouse model suggests that chronic stress may fuel head and neck cancer growth. In the study, stressed animals showed immunophenotypic changes and a greater propensity toward tumor growth and metastasis.
The findings were published in Head and Neck.
Led by Heather A. Himburg, PhD, associate professor of radiation oncology with the Medical College of Wisconsin, Milwaukee, researchers conducted a study of head and neck cancer models in which tumor cells were implanted into a mouse with a humanized immune system.
Their theory was that psychosocial stress may contribute to the growth of head and neck tumors. The stress of poverty, social deprivation, and social isolation can lead to up-regulation of proinflammatory markers in circulating blood leukocytes, which has been tied to worse outcomes in hematologic malignancies and breast cancer. Many such studies examined social adversity and found an association with greater tumor growth rates and treatment resistance.
Other researchers have used mouse models to study the phenomenon, but the results have been inconclusive. For example, some research linked the beta-adrenergic pathway to head and neck cancer, but clinical trials of beta-blockers showed no benefit, and even potential harm, for patients with head and neck cancers. Those results imply that this pathway does not drive tumor growth and metastasis in the presence of chronic stress.
Previous research used immunocompromised or nonhumanized mice. However, neither type of model reproduces the human tumor microenvironment, which may help explain subsequent clinical failures. In the new study, researchers describe results from a preclinical model created using a human head and neck cancer xenograft in a mouse with a humanized immune system.
How the study was conducted
The animals were randomly assigned either to normal housing, with two or three animals from the same litter per cage, or to social isolation from littermates. There were five male and five female animals in each arm, and the animals were housed under their respective conditions for 4 weeks before tumor implantation.
The isolated animals experienced increased growth and metastasis of the xenografts, compared with controls. The results are consistent with findings in immunodeficient or syngeneic mice, but the humanized nature of the new model could lead to better translation of findings into clinical studies. “The humanized model system in this study demonstrated the presence of both human myeloid and lymphoid lineages as well as expression of at least 40 human cytokines. These data indicate that our model is likely to well-represent the human condition and better predict human clinical responses as compared to both immunodeficient and syngeneic models,” the authors wrote.
The researchers also found that chronic stress may act through an immunoregulatory effect, since there was greater human immune infiltrate in the tumors of stressed animals. An increased presence of regulatory components, such as myeloid-derived suppressor cells or regulatory T cells, or eroded function of tumor-infiltrating lymphocytes might explain this finding. The researchers also identified a proinflammatory shift in peripheral blood mononuclear cells in the stressed group. When they analyzed samples from patients with annual household incomes below $45,000, they found a similar pattern. “This suggests that chronic socioeconomic stress may induce a similar proinflammatory immune state as our chronic stress model system,” the authors wrote.
Tumors also differed between the two groups of mice. Tumors in stressed animals had a higher percentage of cancer stem cells, which is associated with more aggressive disease and worse disease-free survival. The researchers suggested that the up-regulated levels of the chemokine SDF-1 seen in stressed animals may drive the higher proportion of stem cells through effects on the CXCR4 receptor, which is expressed by stem cells in various organs and may promote migration, proliferation, and cell survival.
The study was funded by an endowment from Advancing a Healthier Wisconsin and a grant from the National Center for Advancing Translational Sciences. The authors reported no conflicts of interest.
FROM HEAD & NECK
Steroids counter ataxia telangiectasia
SEATTLE – Dexamethasone sodium phosphate (DSP) encapsulated in patients’ own red blood cells appeared to slow neurodegeneration in young children with ataxia telangiectasia in a double-blind, placebo-controlled phase 3 trial, with no apparent steroid side effects.
The disease is an autosomal recessive disorder caused by mutations in the ATM gene, which is critical to the response to cellular insults such as DNA breaks, oxidative damage, and other forms of stress. The result is clinical manifestations that range from a suppressed immune system to organ damage and neurological symptoms that typically lead patients to be wheelchair bound by their teenage years.
“It’s really multisystem and a very, very difficult disease for people to live with,” Howard M. Lederman, MD, PhD, said in an interview. Dr. Lederman is a coauthor of the study, which was presented by Stefan Zielen, PhD, professor at Goethe University Frankfurt (Germany), at the 2022 annual meeting of the American Academy of Neurology.
Various therapies have been developed to improve immunodeficiency, lung disease, and some of the other clinical aspects of the condition, but there is no treatment for its neurological effects. “There’s not really been a good animal model, which has been a big problem in trying to test drugs and design treatment trials,” said Dr. Lederman, professor of pediatrics and medicine at Johns Hopkins University, Baltimore.
The new results may change that. “In the children under the age of 9, there was really a very clear slowdown in the neurodegeneration, and specifically the time that it took for them to lose the ability to ambulate. It’s very exciting, because it’s the first time that anybody has really shown in a double-blind, placebo-controlled, large phase 3 study that any drug has been able to do this. And there were really no steroid side effects, which is the other really remarkable thing about this study,” said Dr. Lederman.
The therapy grew out of a study by researchers in Italy who treated pediatric ataxia telangiectasia patients with corticosteroids and found some transitory improvements in gross motor function, but concerns about long-term steroid exposure limited its application. EryDel, which specializes in encapsulating therapeutics in red blood cells, became interested and developed a formulation in which the patient’s own red blood cells are loaded with DSP. Once reinfused into the patient, the cells slowly release the steroid.
It isn’t clear how dexamethasone works. There are data suggesting that it might lead to transcription of small pieces of the ATM protein, “but that has really not been nailed down in any way at this point. Corticosteroids act on all kinds of cells in all kinds of ways, and so there might be a little bit of this so-called mini-ATM that’s produced, but that may or may not be related to the way in which corticosteroids have a beneficial effect on the rate of neurodegeneration,” said Dr. Lederman.
The treatment process is not easy. Children must have 50-60 cc of blood removed. Red blood cells treated to become porous are exposed to DSP, and then resealed. Then the cells are reinfused. “The whole process takes from beginning to end probably about 3 hours, with a really experienced team of people doing it. And it’s limiting because it’s not easy to put in an IV and take 50 or 60 cc of blood out of children much younger than 5 or 6. The process is now being modified to see whether we could do it with 20 to 30 cc instead,” said Dr. Lederman.
A ‘promising and impressive’ study
The study is promising, according to Nicholas Johnson, MD, who comoderated the session where the study was presented. “They were able to show a slower rate of neurological degeneration or deterioration on both the lower and higher dose compared with the placebo. This is promising and impressive, in the sense that it’s a really large [trial] for a rare condition,” Dr. Johnson, vice chair of research at Virginia Commonwealth University, Richmond, said in an interview.
The study included 164 patients from Europe, Australia, Israel, Tunisia, India, and the United States, who received low-dose DSP (5-10 mg), high-dose DSP (14-22 mg), or placebo. Mean ages in each group ranged from 9.6 to 10.4 years.
In an intention-to-treat analysis, modified International Cooperative Ataxia Rating Scale (mICARS) scores trended toward improvement in the low-dose (–1.37; P = .0847) and high-dose groups (–1.40; P = .0765) when determined by central raters during the COVID-19 pandemic. There was also a trend toward improvement when determined by local raters in the low-dose group (–1.73; P = .0720) and a statistically significant change in the high-dose group (–2.11; P = .0277). The researchers noted some inconsistency between local and central raters, attributed to variable video quality and language challenges for the central raters.
An intention-to-treat analysis of a subgroup of 89 patients aged 6-9 years, compared with natural history data from 245 patients, found an annual mICARS deterioration of 3.7 points in the natural history cohort versus 0.92 points in the high-dose group, a reduction of 75% (P = .020). On the Clinical Global Impression of Change, 51.7% of the high-dose group had minimal or significant improvement from baseline, compared with 29.0% of the low-dose group and 27.6% of the placebo group.
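The 75% figure follows directly from the two annual deterioration rates reported above. Below is a minimal arithmetic sketch using the rounded published values; the variable names are illustrative and not taken from the study.

```python
# Illustrative arithmetic only, using the rounded rates reported above.
natural_history_rate = 3.7   # mICARS points lost per year, natural history cohort
high_dose_rate = 0.92        # mICARS points lost per year, high-dose DSP group

relative_reduction = (natural_history_rate - high_dose_rate) / natural_history_rate
print(f"Relative reduction in annual deterioration: {relative_reduction:.0%}")  # ~75%
```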
AT AAN 2022
Postpartum HCV treatment rare in infected mothers with opioid use disorder
Despite the availability of effective direct-acting antivirals, very few mothers with opioid use disorder (OUD) who were diagnosed with hepatitis C virus (HCV) infection during pregnancy received follow-up care or treatment for the infection within 6 months of giving birth, a retrospective study of Medicaid maternity patients found.
The study pooled data on 23,780 Medicaid-enrolled pregnant women with OUD who had a live birth or stillbirth during 2016-2019 and were followed for 6 months after delivery. Among these women – drawn from six states in the Medicaid Outcomes Distributed Research Network – the pooled average probability of HCV testing during pregnancy was 70.3% (95% confidence interval, 61.5%-79.1%). Of those tested, 30.9% (95% CI, 23.8%-38.0%) were positive. At 60 days postpartum, just 3.2% (95% CI, 2.6%-3.8%) had a follow-up visit or treatment for HCV, and in the subset of patients followed for 6 months, only 5.9% (95% CI, 4.9%-6.9%) had any HCV follow-up visit or medication within 6 months of delivery.
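To make the drop-off concrete, the back-of-the-envelope sketch below applies the pooled probabilities to the full cohort. It is illustrative only: it assumes the follow-up percentage refers to women who tested HCV positive, a denominator the pooled estimates above do not make explicit, and the rounded figures will not match the study's exact counts.

```python
# Back-of-the-envelope care cascade, for illustration only.
# Assumes the follow-up percentage applies to women who tested HCV positive;
# the pooled estimates do not report exact denominators.
cohort = 23_780                  # Medicaid-enrolled pregnant women with OUD
tested = cohort * 0.703          # 70.3% screened for HCV during pregnancy
positive = tested * 0.309        # 30.9% of those tested were positive
followed_6mo = positive * 0.059  # 5.9% with any follow-up visit or medication by 6 months

print(f"Tested during pregnancy:    ~{tested:,.0f}")
print(f"Tested HCV positive:        ~{positive:,.0f}")
print(f"Any follow-up by 6 months:  ~{followed_6mo:,.0f}")
```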
While HCV screening and diagnosis rates varied across states, postpartum follow-up rates were universally low. The results point to a need to improve the cascade of postpartum HCV care and, if ongoing clinical research establishes its safety, perhaps ultimately to treat HCV during pregnancy, as is already done safely for HIV, according to Marian P. Jarlenski, PhD, MPH, an associate professor of public health policy and management at the University of Pittsburgh. The study was published in Obstetrics & Gynecology.
HCV infection has risen substantially in people of reproductive age in tandem with an increase in OUDs. HCV is transmitted from an infected mother to her baby in about 6% of cases, according to the Centers for Disease Control and Prevention, which in 2020 expanded its HCV screening recommendations to include all pregnant women. Currently no treatment for HCV during pregnancy has been approved.
In light of those recent recommendations, Dr. Jarlenski said in an interview that her group was “interested in looking at high-risk screened people and estimating what proportion received follow-up care and treatment for HCV. What is the promise of screening? The promise is that you can treat. Otherwise why screen?”
She acknowledged, however, that the postpartum period is a challenging time for a mother to seek health information or care for herself, whether she’s a new parent or has other children in the home. Nevertheless, the low rate of follow-up and treatment was unexpected. “Even the 70% rate of screening was low – we felt it should have been closer to 100% – but the follow-up rate was surprisingly low,” Dr. Jarlenski said.
Mishka Terplan, MD, MPH, medical director of Friends Research Institute in Baltimore, was not surprised at the low follow-up rate. “The cascade of care for hep C is demoralizing,” said Dr. Terplan, who was not involved in the study. “We know that hep C is syndemic with OUD and other opioid crises and we know that screening is effective for identifying hep C and that antiviral medications are now more effective and less toxic than ever before. But despite this, we’re failing pregnant women and their kids at every step along the cascade. We do a better job with initial testing than with the follow-up testing. We do a horrible job with postpartum medication initiation.”
He pointed to the systemic challenges mothers face in getting postpartum HCV care. “They may be transferred to a subspecialist for treatment, and this transfer is compounded by issues of insurance coverage and eligibility.” With the onus on new mothers to submit the paperwork, “the idea that mothers would be able to initiate much less continue postpartum treatment is absurd,” Dr. Terplan said.
He added that the children born to HCV-positive mothers need surveillance as well, but data suggest that the rates of newborn testing are also low. “There’s a preventable public health burden in all of this.”
The obvious way to increase uptake of curative therapy would be to treat women while they are receiving antenatal care. A small phase 1 trial found that all pregnant participants who were HCV positive and received antivirals in their second trimester were safely treated and gave birth to healthy babies.
“If larger trials prove this treatment is safe and effective, then these results should be communicated to care providers and pregnant patients,” Dr. Jarlenski said. Otherwise, the public health potential of universal screening in pregnancy will not be realized.
This research was supported by the National Institute on Drug Abuse and by the Delaware Division of Medicaid and Medical Assistance and the University of Delaware Center for Community Research & Service. Dr. Jarlenski disclosed no competing interests. One coauthor disclosed grant funding through her institution from Gilead Sciences and Organon unrelated to this work. Dr. Terplan reported no relevant competing interests.
FROM OBSTETRICS & GYNECOLOGY