Stroke scale cutoff may not be an ideal guide for ordering CTA to detect large vessel occlusions


In emergency department stroke consultations, the National Institutes of Health Stroke Scale (NIHSS) alone does not appear to be a reliable guide for ordering diagnostic tests for a large vessel occlusion (LVO), according to a large body of data presented at the 2023 annual meeting of the American Academy of Neurology.

If the goal is not to miss any LVOs, there is no NIHSS score below which these do not occur, according to Theresa Sevilis, DO, regional medical director, TeleSpecialists, Fort Myers, Fla.

For example, her evaluation of a large and nationally representative dataset shows that more than 10% of the LVOs eventually identified and accepted for intervention would have been missed with a cutoff NIHSS score of 6 or higher. With the cutoff moved to an NIHSS score of 4 or greater, 6% of the LVOs among the 23,166 strokes evaluated would still have gone undetected.

“The current guidelines do not address low NIHSS score largely due to a paucity of data,” according to Dr. Sevilis, who showed data indicating that there is great variation among institutions in ordering computed tomography angiography (CTA). She noted that CTA is the current imaging standard for detecting LVO.

Large prospective dataset

The data for this study were derived from the TeleCare database, which captures acute stroke consultations in the emergency departments of 227 facilities in 27 states. Stroke consultations over a 6-month period, from July through December 2021, were evaluated. The prospectively collected data were subjected to a multivariate analysis to determine the odds ratios for a CTA being performed and an LVO being found at each NIHSS score from 0 to 5; scores of 6 or above served as the reference.

“Only consults performed within 24 hours [of presentation] were included,” Dr. Sevilis said.

After exclusion of cases in which no NIHSS score was captured (less than 1% of the total), more than 10,500 of the 23,166 cases underwent CTA, a rate of 45.5%. Of the study population, 24.6% had an NIHSS score of 6 or above.

“When you are discussing when to perform CTA in patients with a low NIHSS score, you are discussing the majority of patients,” Dr. Sevilis said.

Of those with an NIHSS score below 6, 28.2% had a score of 0. Not surprisingly, these patients were the least likely to have a CTA performed (OR, 0.14) and the least likely to have an LVO detected (OR, 0.1). With the exception of a score of 1, the likelihood of CTA and of LVO detection climbed incrementally with higher stroke scores. The odds ratios for CTA and LVO, respectively, were 0.16 and 0.09 for a score of 1; 0.27 and 0.16 for a score of 2; 0.33 and 0.14 for a score of 3; 0.49 and 0.24 for a score of 4; and 0.71 and 0.27 for a score of 5.

In the group with an NIHSS score of 6 or above, 24.1% were found to have an LVO. Of these, fewer than half were accepted for mechanical thrombectomy. Among patients with lower NIHSS scores who had an LVO, the acceptance rate for intervention again fell incrementally by score: about 35% for an NIHSS score of 3 or 4, and 25% for a score of 0-2.

The interpretation of these data “depends on goals,” Dr. Sevilis said. “If the goal is to not miss a single LVO, then it is important to consider the balance between benefits and risks.”

No consistent cutoff

In participating facilities, the protocol for considering CTA to detect and treat LVOs ranges from neurologist discretion to NIHSS score cutoffs of 2, 4, and 6, according to Dr. Sevilis. While the data suggest that a cutoff of 4 or above might be reasonable, she said that NIHSS scoring is not a useful tool for those “who do not want to miss any LVOs.”

“These data are based on emergency room stroke consultations and not on confirmed strokes,” Dr. Sevilis emphasized. Indeed, she noted that the final discharge diagnosis was not available. Recognizing that the analysis was not performed on a population with confirmed strokes is particularly important for understanding the limited rate of CTAs performed even in patients with relatively high NIHSS scores. She noted this could be explained by many factors, including suspicion of hemorrhage or clinical features that took the workup in a different direction.

Reconsidering protocols

Based on the large sample size, Dr. Sevilis contended that it is likely that these data are representative, but she considers this study a first step toward considering protocols and developing guidelines for addressing stroke alerts in the emergency department.

A more important step will be ongoing trials designed specifically to generate data to answer this question. Pascal Jabbour, MD, chief of the division of neurovascular and endovascular neurosurgery, Thomas Jefferson University Hospitals, Philadelphia, is participating in one of these trials. He agreed with the premise that better evidence-based criteria are needed when evaluating acute stroke patients with a potential LVO.

The trial in which he is a coinvestigator, called ENDOLOW, is testing the hypothesis that outcomes will be better if acute stroke patients with an LVO and a low baseline NIHSS score (< 5) are treated with immediate thrombectomy rather than medical management. If this hypothesis is confirmed in the randomized ENDOLOW trial, it will provide an evidence base for an approach already being practiced at some centers.

“There should be a very low threshold for CTA,” said Dr. Jabbour in an interview. This imaging “takes less than 2 minutes and it can provide the basis for a life-saving endovascular thrombectomy if a LVO is found.”

It is already well known that LVO is not restricted to patients with an elevated NIHSS score, he said.

For determining whether to order a CTA, “I do not agree with NIHSS score of 6 or above. There is no absolute number below which risk of missing a LVO is eliminated,” Dr. Jabbour said. He also argued against relying on NIHSS score without considering other clinical features, particularly cortical signs, which should raise suspicion of a LVO regardless of NIHSS score.

One problem is that NIHSS scores are not static. Decompensation can be rapid, with the NIHSS score climbing quickly. When this happens, the delay in treatment might lead to a preventable adverse outcome.

“There is a change in the paradigm now that we have more evidence of a benefit from aggressive treatment in the right candidates,” according to Dr. Jabbour, referring to the recently published SELECT2 trial. In that trial, on which Dr. Jabbour served as a coauthor, patients with LVO and large territory infarct were randomized to thrombectomy or medical care within 24 hours of a stroke. It was stopped early for efficacy because of the increased functional independence (20% vs. 7%) in the surgical intervention group.

If the ongoing trials establish better criteria for ruling in or out the presence of LVO in patients with acute stroke, Dr. Jabbour predicted that guidelines will be written to standardize practice.

Dr. Sevilis reports no potential conflicts of interest. Dr. Jabbour has financial relationships with Cerenovus, Medtronic, and Microvention.
 

FROM AAN 2023

Novel fluorescence guidance improves lumpectomy outcomes


As many as 40% of lumpectomies leave positive margins that necessitate a second surgery, but a novel fluorescent imaging agent used along with a direct visualization system may improve complete resection rates, new phase 3 findings show.

Pegulicianine (Lumisight), an investigational and activatable fluorescent imaging agent used with a novel direct visualization system, helped identify residual tumor or circumvent second surgeries in about 10% of patients in the trial.

Use of the agent and direct visualization system – both from Lumicell and currently under review by the Food and Drug Administration – could provide more complete resection for patients with early breast cancer and avert the need for reexcisions, the investigators write.

The findings were published online in NEJM Evidence and were subsequently presented at the annual meeting of the American Society of Breast Surgeons.

Local recurrence following lumpectomy increases the risk of dying from breast cancer, and the risk of local recurrence is directly linked to inadequate tumor removal during lumpectomy. In about 20%-40% of lumpectomies, positive margins are identified after surgery.

To improve patient outcomes, investigators assessed whether a novel fluorescence-guided surgery system helped surgeons perform more complete resections during lumpectomy.

In the Novel Surgical Imaging for Tumor Excision (INSITE) trial, 392 patients were randomly assigned to undergo pegulicianine fluorescence-guided surgery (n = 357) or standard lumpectomy (n = 35).

To prevent surgeons from performing a smaller-than-standard lumpectomy in anticipation of using the pegulicianine fluorescence-guided system, patients were randomly assigned to the pegulicianine fluorescence-guided surgery group or the control group, and group assignments were revealed only after the surgeon had completed the standard lumpectomy.

“Randomization was not designed to provide a control group for analysis of device performance,” the authors explain. “In this study design, each patient undergoing pegulicianine fluorescence-guided surgery served as her own control,” they write. The investigators compared final margin pathology after standard lumpectomy and after guided surgery. Those in the control group were included in the safety analysis.

Study participants were women aged 18 years or older who were undergoing lumpectomy for stage I–III breast cancer and/or ductal carcinoma in situ. All patients received pegulicianine 1.0 mg/kg via a 3-minute intravenous infusion 2-6 hours before surgery.

The agent produces a signal at sites of residual tumor, and a handheld probe illuminates the cavity during surgery. A tumor detection algorithm then analyzes and displays the images to the surgeon in real time – an overall process that adds about 7 minutes to the operative procedure, the authors say.

Investigators identified invasive cancers in 316 patients and in situ cancers in 76 patients. Among the 357 patients in the treatment group, 27 (7.6%) were found to have residual tumor after standard lumpectomy. In 22 of these patients, the cavity orientations involved had been deemed negative on standard margin evaluation, the authors report.

With use of pegulicianine fluorescence-guided surgery, positive margins were converted to negative margins for 9 of 62 patients (14.5%), potentially averting a second surgery in those patients.

Overall, the authors say that pegulicianine fluorescence-guided surgery removed residual tumor (27 of 357) or avoided second surgeries (9 of 357) in 10% of patients in the trial.

The current trial findings confirm results regarding the safety and efficacy of pegulicianine fluorescence-guided surgery and the direct visualization system that were reported in a prior multicenter feasibility study, the authors say.

Pegulicianine fluorescence-guided surgery met the prespecified thresholds for removal of residual tumor and for specificity (85.2%) but did not meet the prespecified threshold for sensitivity (49.3%).

The rate of serious adverse events with pegulicianine was 0.5% (two patients), similar to that of other contrast agents. Administration of the agent was stopped because of adverse events for six patients, the investigators write.

Serious adverse events included grade 3 hypersensitivity in one patient and an anaphylactic reaction in another. The other four adverse events included an allergic reaction, milder hypersensitivity, nausea, and pegulicianine extravasation. All adverse events resolved, and patients proceeded to standard lumpectomy.

Overall, the trial findings “suggest that a more complete breast cancer resection may be achieved” with pegulicianine fluorescence-guided surgery and the direct visualization system, lead investigator Barbara Smith, MD, PhD, director of the breast program at Massachusetts General Hospital and professor of surgery at Harvard Medical School, both in Boston, said in a press release. “Given the low complication rate, minimal added operative time and, most importantly, the discovery of additional cancer left behind after a lumpectomy, the Lumicell [system] has the potential to be a critical adjunct to enhance standard practice for breast cancer patients.”

The system also has the potential to reduce “the patient burden of additional surgery” and decrease “costs associated with a return to the operating room,” the authors conclude.

The INSITE trial was funded by Lumicell and the National Institutes of Health. Dr. Smith reported unpaid research collaboration with Lumicell.

A version of this article first appeared on Medscape.com.


Use of the agent and direct visualization system – both from Lumicell and currently under review by the Food and Drug Administration – could provide more complete resection for patients with early breast cancer and avert the need for reexcisions, the investigators write.

The findings were published online in NEJM Evidence and were subsequently presented at the annual meeting of the American Society of Breast Surgeons.

Local recurrence following lumpectomy increases the risk of dying from breast cancer, and the risk of local recurrence is directly linked to inadequate tumor removal during lumpectomy. In about 20%-40% of lumpectomies, positive margins are identified after surgery.

To improve patient outcomes, investigators assessed whether a novel fluorescence-guided surgery system helped surgeons perform more complete resections during lumpectomy.

In the Novel Surgical Imaging for Tumor Excision (INSITE) trial, 392 patients were randomly assigned to undergo pegulicianine fluorescence-guided surgery (n = 357) or standard lumpectomy (n = 35).

To prevent surgeons from performing a smaller-than-standard lumpectomy in anticipation of using the pegulicianine fluorescence-guided system, patients were randomly assigned to the pegulicianine fluorescence-guided surgery group or the control group, and group assignments were revealed only after the surgeon completed the standard lumpectomy.

“Randomization was not designed to provide a control group for analysis of device performance,” the authors explain. “In this study design, each patient undergoing pegulicianine fluorescence-guided surgery served as her own control.” The investigators compared final margin pathology after standard lumpectomy and after guided surgery; patients in the control group were included in the safety analysis.

Study participants were women aged 18 years or older who were undergoing lumpectomy for stage I–III breast cancer and/or ductal carcinoma in situ. All patients received pegulicianine 1.0 mg/kg via a 3-minute intravenous infusion 2-6 hours before surgery.

The agent produces a signal at sites of residual tumor, and a handheld probe illuminates the cavity during surgery. A tumor detection algorithm then analyzes and displays the images to the surgeon in real time – an overall process that adds about 7 minutes to the operative procedure, the authors say.

Investigators identified invasive cancers in 316 patients and in situ cancers in 76 patients. Among the 357 patients in the treatment group, 27 (7.6%) were found to have residual tumor after standard lumpectomy. In 22 of these patients, the involved cavity orientations had been deemed negative on standard margin evaluation, the authors report.

With use of pegulicianine fluorescence-guided surgery, positive margins were converted to negative margins for 9 of 62 patients (14.5%), potentially averting a second surgery in those patients.

Overall, the authors say that pegulicianine fluorescence-guided surgery removed residual tumor (27 of 357) or avoided second surgeries (9 of 357) in 10% of patients in the trial.

The current trial findings confirm results regarding the safety and efficacy of pegulicianine fluorescence-guided surgery and the direct visualization system that were reported in a prior multicenter feasibility study, the authors say.

Pegulicianine fluorescence-guided surgery met the prespecified thresholds for removal of residual tumor and for specificity (85.2%) but fell short of the prespecified threshold for sensitivity (49.3%).
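The specificity and sensitivity figures above are standard confusion-matrix quantities. The sketch below is purely illustrative: the function names and margin counts are hypothetical, not INSITE data, and are chosen only so the ratios land near the reported 85.2% and 49.3%.

```python
# Sensitivity and specificity from confusion-matrix counts.
# All counts here are hypothetical illustrations, not INSITE data;
# they are picked so the ratios approximate the reported figures.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of truly positive margins that the device flags."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Fraction of truly negative margins that the device does not flag."""
    return true_neg / (true_neg + false_pos)

print(f"sensitivity = {sensitivity(34, 35):.1%}")    # ~49.3%
print(f"specificity = {specificity(575, 100):.1%}")  # ~85.2%
```

The trade-off the trial reports is visible in this arithmetic: a device can be tuned to rarely flag clean margins (high specificity) while still missing roughly half of the truly involved ones (low sensitivity).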

The rate of serious adverse events with pegulicianine was 0.5% (two patients), similar to that of other contrast agents. Administration of the agent was stopped because of adverse events for six patients, the investigators write.

Serious adverse events included grade 3 hypersensitivity in one patient and an anaphylactic reaction in another. The other four adverse events included an allergic reaction, milder hypersensitivity, nausea, and pegulicianine extravasation. All adverse events resolved, and patients proceeded to standard lumpectomy.

Overall, the trial findings “suggest that a more complete breast cancer resection may be achieved” with pegulicianine fluorescence-guided surgery and the direct visualization system, lead investigator Barbara Smith, MD, PhD, director of the breast program at Massachusetts General Hospital and professor of surgery at Harvard Medical School, both in Boston, said in a press release. “Given the low complication rate, minimal added operative time and, most importantly, the discovery of additional cancer left behind after a lumpectomy, the Lumicell [system] has the potential to be a critical adjunct to enhance standard practice for breast cancer patients.”

The system also has the potential to reduce “the patient burden of additional surgery” and decrease “costs associated with a return to the operating room,” the authors conclude.

The INSITE trial was funded by Lumicell and the National Institutes of Health. Dr. Smith reported unpaid research collaboration with Lumicell.

A version of this article first appeared on Medscape.com.

FROM NEJM EVIDENCE

Study shows higher obesity-related cancer mortality in areas with more fast food

Article Type
Changed
Sun, 05/07/2023 - 00:56

Communities with easy access to fast food were 77% more likely to have high levels of obesity-related cancer mortality, based on data from a new cross-sectional study of more than 3,000 communities.

Although increased healthy eating has been associated with reduced risk of obesity and with reduced cancer incidence and mortality, access to healthier eating remains a challenge in communities with less access to grocery stores and healthy food options (food deserts) and/or easy access to convenience stores and fast food (food swamps), Malcolm Seth Bevel, PhD, of the Medical College of Georgia, Augusta, and colleagues wrote in their paper, published in JAMA Oncology.

In addition, data on the association between food deserts and swamps and obesity-related cancer mortality are limited, they said.

“We felt that the study was important given the fact that obesity is an epidemic in the United States, and multiple factors contribute to obesity, especially adverse food environments,” Dr. Bevel said in an interview. “Also, I lived in these areas my whole life, and saw how it affected underserved populations. There was a story that needed to be told, so we’re telling it,” he said.

In the study, the researchers analyzed food access and cancer mortality data from 3,038 counties across the United States. The food access data came from the U.S. Department of Agriculture Food Environment Atlas (FEA) for the years 2012, 2014, 2015, 2017, and 2020. Data on obesity-related cancer mortality came from the Centers for Disease Control and Prevention for the years 2010 to 2020.

Food desert scores were calculated through data from the FEA, and food swamp scores were based on the ratio of fast-food restaurants and convenience stores to grocery stores and farmers markets in a modification of the Retail Food Environment Index score.

The researchers used an age-adjusted, multiple regression model to determine the association between food desert and food swamp scores and obesity-related cancer mortality rates. Higher food swamp and food desert scores (defined as 20.0 to 58.0 or higher) were used to classify counties as having fewer healthy food resources. The primary outcome was obesity-related cancer mortality, defined as high or low (71.8 or higher per 100,000 individuals and less than 71.8 per 100,000 individuals, respectively).
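As a rough sketch of the two county-level measures described above: the food swamp score is built from the ratio of unhealthy to healthy food outlets (a modification of the Retail Food Environment Index), and counties are dichotomized at 71.8 obesity-related cancer deaths per 100,000. The function names and outlet counts below are hypothetical, and the raw ratio is only a simplification of the paper's scoring, shown for intuition.

```python
# Illustrative sketch of the two county-level measures; the outlet
# counts and function names are hypothetical, and this raw ratio is a
# simplification of the paper's modified Retail Food Environment Index.

def food_swamp_ratio(fast_food, convenience, grocery, farmers_markets):
    """Unhealthy food outlets divided by healthy food outlets."""
    healthy = grocery + farmers_markets
    if healthy == 0:
        return float("inf")  # a county with no healthy outlets at all
    return (fast_food + convenience) / healthy

def mortality_class(rate_per_100k, cutoff=71.8):
    """Dichotomize obesity-related cancer mortality at the study cutoff."""
    return "high" if rate_per_100k >= cutoff else "low"

ratio = food_swamp_ratio(fast_food=120, convenience=45,
                         grocery=20, farmers_markets=2)
print(round(ratio, 2), mortality_class(83.4))  # 7.5 high
```

A higher ratio means unhealthy outlets dominate the local food environment; the study's regression then asks whether counties with high scores are more likely to fall in the "high" mortality class.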

Overall, high rates of obesity-related cancer mortality were 77% more likely in the counties that met the criteria for high food swamp scores (adjusted odds ratio 1.77). In addition, researchers found a positive dose-response relationship among three levels of both food desert scores and food swamp scores and obesity-related cancer mortality.

A total of 758 counties had obesity-related cancer mortality rates in the highest quartile. Compared to counties with low rates of obesity-related cancer mortality, counties with high rates of obesity-related cancer mortality also had a higher percentage of non-Hispanic Black residents (3.26% vs. 1.77%), higher percentage of adults older than 65 years (15.71% vs. 15.40%), higher rates of adult obesity (33.0% vs. 32.10%), and higher rates of adult diabetes (12.50% vs. 10.70%).

Possible explanations for the results include the lack of interest in grocery stores in neighborhoods with a population with a lower socioeconomic status, which can create a food desert, the researchers wrote in their discussion. “Coupled with the increasing growth rate of fast-food restaurants in recent years and the intentional advertisement of unhealthy foods in urban neighborhoods with [people of lower income], the food desert may transform into a food swamp,” they said.

The findings were limited by several factors including the study design, which did not allow for showing a causal association of food deserts and food swamps with obesity-related cancer mortality, the researchers noted. Other limitations included the use of groups rather than individuals, the potential misclassification of food stores, and the use of county-level data on race, ethnicity, and income, they wrote.

The results indicate that “food swamps appear to be a growing epidemic across the U.S., likely because of systemic issues, and should draw concern and conversation from local and state officials,” the researchers concluded.

Community-level investments can benefit individual health

Dr. Bevel said he was not surprised by the findings, as he has seen firsthand the lack of healthy food options and growth of unhealthy food options, especially for certain populations in certain communities. “Typically, these are people who have lower socioeconomic status, primarily non-Hispanic Black or African American or Hispanic American,” he said. “I have watched people have to choose between getting fruits/vegetables versus their medications or running to fast food places to feed their families. What is truly surprising is that we’re not talking about people’s lived environment enough for my taste.”

“I hope that our data and results can inform local and state policymakers to truly invest in all communities, such as funding for community gardens, and realize that adverse food environments, including the barriers in navigating these environments, have significant consequences on real people,” said Dr. Bevel. “Also, I hope that the results can help clinicians realize that a patient’s lived environment can truly affect their obesity and/or obesity-related cancer status; being cognizant of that is the first step in holistic, comprehensive care,” he said. 

“One role that oncologists might be able to play in improving patients’ access to healthier food is to create and/or implement healthy lifestyle programs with gardening components to combat the poorest food environments that their patients likely reside in,” said Dr. Bevel. Clinicians also could consider the innovative approach of “food prescriptions” to help reduce the effects of deprived, built environments, he noted.

Looking ahead, next steps for research include determining the severity of association between food swamps and obesity-related cancer by varying factors such as cancer type, and examining any potential racial disparities between people living in these environments and obesity-related cancer, Dr. Bevel added.

Data provide foundation for multilevel interventions

The current study findings “raise a clarion call to elevate the discussion on food availability and access to ensure an equitable emphasis on both the importance of lifestyle factors and the upstream structural, economic, and environmental contexts that shape these behaviors at the individual level,” Karriem S. Watson, DHSc, MS, MPH, of the National Institutes of Health, Bethesda, Md., and Angela Odoms-Young, PhD, of Cornell University, Ithaca, N.Y., wrote in an accompanying editorial.

The findings provide a foundation for studies of obesity-related cancer outcomes that take the community environment into consideration, they added.

The causes of both obesity and cancer are complex, and the study findings suggest that the links between unhealthy food environments and obesity-related cancer may go beyond dietary consumption alone and extend to social and psychological factors, the editorialists noted.

“Whether dealing with the lack of access to healthy foods or an overabundance of unhealthy food, there is a critical need to develop additional research that explores the associations between obesity-related cancer mortality and food inequities,” they concluded.

The study received no outside funding. The researchers and the editorialists had no financial conflicts to disclose.

FROM JAMA ONCOLOGY

Autism and bone health: What you need to know

Article Type
Changed
Fri, 05/05/2023 - 12:06

Many years ago, at the conclusion of a talk I gave on bone health in teens with anorexia nervosa, I was approached by a colleague, Ann Neumeyer, MD, medical director of the Lurie Center for Autism at Massachusetts General Hospital, Boston, who asked about bone health in children with autism spectrum disorder (ASD).

When I explained that there was little information about bone health in this patient population, she suggested that we learn and investigate together. Ann explained that she had observed that some of her patients with ASD had suffered fractures with minimal trauma, raising her concern about their bone health.

This was the beginning of a partnership that led us down the path of many grant submissions, some of which were funded and others that were not, to explore and investigate bone outcomes in children with ASD.

Over the years it has become very clear that these patients are at high risk for low bone density at multiple sites. This applies to prepubertal children as well as older children and adolescents. One study showed that 28% and 33% of children with ASD aged 8-14 years had very low bone density (z scores of ≤ –2) at the spine and hip, respectively, compared with 0% of typically developing controls.
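The "very low" threshold above is a z score, i.e., how many standard deviations a patient's bone mineral density (BMD) falls below the age- and sex-matched reference mean. A minimal sketch of that arithmetic follows; the reference mean, SD, and patient value are made-up illustration numbers, not data from the study.

```python
# Illustrative only: how a bone-density z score is computed and flagged.
# The reference mean, SD, and patient BMD below are hypothetical numbers,
# not values from the cited study.

def bmd_z_score(bmd: float, ref_mean: float, ref_sd: float) -> float:
    """z score: patient BMD relative to an age/sex-matched reference."""
    return (bmd - ref_mean) / ref_sd

def is_very_low(z: float, cutoff: float = -2.0) -> bool:
    """'Very low' bone density per the study's threshold of z <= -2."""
    return z <= cutoff

# Example with made-up numbers: reference mean 0.85 g/cm^2, SD 0.10.
z = bmd_z_score(bmd=0.63, ref_mean=0.85, ref_sd=0.10)
print(round(z, 2), is_very_low(z))  # -2.2 True
```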

Studies that have used sophisticated imaging techniques to determine bone strength have shown that it is lower at the forearm and lower leg in children with ASD versus neurotypical children.

These findings are of particular concern during the childhood and teenage years when bone is typically accrued at a rapid rate. A normal rate of bone accrual at this time of life is essential for optimal bone health in later life. While children with ASD gain bone mass at a similar rate as neurotypical controls, they start at a deficit and seem unable to “catch up.”

Further, people with ASD are more prone to certain kinds of fracture than those without the condition. For example, both children and adults with ASD have a high risk for hip fracture, while adult women with ASD have a higher risk for forearm and spine fractures. There is some protection against forearm fractures in children and adult men, probably because of markedly lower levels of physical activity, which would reduce fall risk.

Many of Ann’s patients with ASD had unusual or restricted diets, low levels of physical activity, and were on multiple medications. We have since learned that some factors that contribute to low bone density in ASD include lower levels of weight-bearing physical activity; lower muscle mass; low muscle tone; suboptimal dietary calcium and vitamin D intake; lower vitamin D levels; higher levels of the hormone cortisol, which has deleterious effects on bone; and use of medications that can lower bone density.

In order to mitigate the risk for low bone density and fractures, it is important to optimize physical activity while considering the child’s ability to safely engage in weight-bearing sports.

High-impact sports like gymnastics and jumping, or cross-impact sports like soccer, basketball, field hockey, and lacrosse, are particularly useful in this context, but many patients with ASD are not able to easily engage in typical team sports.

For such children, a prescribed amount of time spent walking, as well as weight and resistance training, could be helpful. The latter would also help increase muscle mass, a key modulator of bone health.

Other strategies include ensuring sufficient intake of calcium and vitamin D through diet and supplements. This can be a particular challenge for children with ASD on specialized diets, such as a gluten-free or dairy-free diet, which are deficient in calcium and vitamin D. Health care providers should check for intake of dairy and dairy products, as well as serum vitamin D levels, and prescribe supplements as needed.

All children should get at least 600 IU of vitamin D and 1,000-1,300 mg of elemental calcium daily. That said, many children with ASD need much higher doses of vitamin D (1,000-4,000 IU or more) to maintain levels in the normal range. This is particularly true for dark-skinned children and children with obesity, as well as those who have medical disorders that cause malabsorption.
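The dosing arithmetic above can be sketched as a simple intake-versus-target comparison. This is an illustration of the numbers cited in the text only, not clinical software; the function name and the example intake figures are hypothetical.

```python
# Minimal sketch of the dosing arithmetic in the text: compare a child's
# estimated dietary intake with the cited daily targets (>= 600 IU
# vitamin D; 1,000-1,300 mg elemental calcium) and report the shortfall
# a supplement would need to cover. Names and example values are
# hypothetical illustrations, not clinical guidance.

VITAMIN_D_MIN_IU = 600           # daily minimum cited in the text
CALCIUM_MIN_MG, CALCIUM_MAX_MG = 1000, 1300  # daily range cited

def supplement_gap(dietary_d_iu: int, dietary_ca_mg: int) -> dict:
    """Return how far dietary intake falls short of the cited minimums."""
    return {
        "vitamin_d_iu": max(0, VITAMIN_D_MIN_IU - dietary_d_iu),
        "calcium_mg": max(0, CALCIUM_MIN_MG - dietary_ca_mg),
    }

# Example: a dairy-free diet providing 200 IU vitamin D and 300 mg calcium.
print(supplement_gap(200, 300))  # {'vitamin_d_iu': 400, 'calcium_mg': 700}
```

Note that, as the text explains, many children with ASD need vitamin D doses well above this floor to keep serum levels in range, so the gap is a starting point, not a prescription.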

Higher cortisol levels in the ASD patient population are harder to manage. Efforts to ease anxiety and depression may help reduce cortisol levels. Medications such as proton pump inhibitors and glucocorticosteroids can compromise bone health.

In addition, certain antipsychotics can cause marked elevations in prolactin which, in turn, can lower levels of estrogen and testosterone, which are very important for bone health. In such cases, the clinician should consider switching patients to a different, less detrimental medication or adjust the current medication so that patients receive the lowest possible effective dose.

Obesity is associated with increased fracture risk and with suboptimal bone accrual during childhood, so ensuring a healthy diet is important. This includes avoiding sugary beverages and reducing intake of processed food and juice.

Sometimes, particularly when a child has low bone density and a history of several low-trauma fractures, medications such as bisphosphonates should be considered to increase bone density.

Above all, as physicians who manage ASD, it is essential that we raise awareness about bone health among our colleagues, patients, and their families to help mitigate fracture risk.

Madhusmita Misra, MD, MPH, is chief of the Division of Pediatric Endocrinology at Mass General for Children, Boston.

A version of this article first appeared on Medscape.com.


The federal government paid private doctors twice by mistake for veterans’ care

Article Type
Changed
Mon, 05/08/2023 - 07:50

The U.S. federal government wrote duplicate checks to private doctors who treated veterans, costing taxpayers up to $128 million in extra payments over 5 years, a new report by a federal watchdog revealed in April.
 

Private doctors were paid twice in nearly 300,000 cases from 2017 to 2021 involving veterans who were eligible for both Veterans Health Administration (VHA) and Medicare benefits, according to the report by the Department of Health & Human Services (HHS) Office of Inspector General (OIG).

The doctors were paid by Medicare for medical services that the VHA had authorized and already paid for, the OIG reported after it conducted a 5-year audit.

Duplicate Medicare payments have doubled from $22 million in 2019 when the Veterans Community Care Program was implemented to $45 million in 2021, according to the OIG report. The program allows veterans to seek care from private doctors when the VHA can’t provide the care they need.

Roughly 1.9 million veterans every year receive government-paid health care from private doctors.

The OIG said it decided to audit Medicare’s claims because “duplicate payments were a long-standing issue.”

The problem dates back to a 1979 General Accounting Office (now the Government Accountability Office) report that found Medicare and the Department of Veterans Affairs made duplicate payments of more than $72,000 for certain medical services provided to veterans, the OIG reported.

The HHS OIG’s audit examined $19.2 billion in Medicare payments for 36 million claims for individuals who enrolled in Medicare and were eligible for VA services. About 90% of those claims were for doctor evaluations and visits, according to the OIG report.

The OIG found “these duplicate payments occurred because CMS did not implement controls to address duplicate payments for services provided to individuals with Medicare and VHA benefits.”

Specifically, the OIG found that the CMS and the VHA were not sharing enrollment, claims, and payment data with each other, as required by federal law.

If CMS had access to that information, the agency could have compared the VHA claims data with existing Medicare claims data to identify duplicate claims, the OIG claimed.

The OIG recommended that CMS take the following four steps to fix the problem, which CMS has agreed to do, according to the report:

  • Integrate VHA enrollment, claims, and payment data into the CMS centralized claims data system so it can identify potential fraud, waste, and abuse under the Medicare program.
  • Issue guidance to medical professionals on not billing Medicare for a medical service that was authorized by the VHA.
  • Establish a comprehensive data-sharing agreement with the VHA.
  • Establish an internal process (such as system edits) to address duplicate payments.
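The last recommendation — an internal "system edit" to catch duplicates — amounts to joining the two agencies' claims data on a shared key. The sketch below shows one plausible form of such a check, flagging Medicare claims that match a VHA-authorized claim on beneficiary, service date, and procedure code; all field names and records are hypothetical, and a real CMS implementation would be far more involved.

```python
# Illustrative sketch of the kind of automated "system edit" the OIG
# recommends: flag Medicare claims whose (beneficiary, service date,
# procedure code) key also appears among VHA-authorized claims.
# Field names and example records are hypothetical.

def find_duplicates(vha_claims, medicare_claims):
    """Return Medicare claims that match a VHA-authorized claim."""
    vha_keys = {
        (c["beneficiary_id"], c["service_date"], c["procedure_code"])
        for c in vha_claims
    }
    return [
        c for c in medicare_claims
        if (c["beneficiary_id"], c["service_date"], c["procedure_code"])
        in vha_keys
    ]

vha = [
    {"beneficiary_id": "V123", "service_date": "2021-03-14",
     "procedure_code": "99213"},
]
medicare = [
    {"beneficiary_id": "V123", "service_date": "2021-03-14",
     "procedure_code": "99213"},  # duplicate of the VHA-paid claim
    {"beneficiary_id": "V999", "service_date": "2021-03-14",
     "procedure_code": "99213"},  # no VHA match; not flagged
]
print(find_duplicates(vha, medicare))  # only the V123 claim is flagged
```

A check like this is only possible once the two agencies share data — which is why the data-sharing agreement is listed alongside the system edits.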

“CMS previously informed [the OIG] that establishing a long-term solution to address duplicate payments will take time,” the OIG reported.

A version of this article first appeared on Medscape.com.


Expert discusses which diets are best, based on the evidence

Article Type
Changed
Mon, 05/08/2023 - 08:36

– Primary care providers can draw from a wide range of diets to give patients evidence-based advice on how to lose weight, prevent diabetes, and achieve other health goals, according to a speaker at the annual meeting of the American College of Physicians.

“Evidence from studies can help clinicians and their patients develop a successful dietary management plan and achieve optimal health,” said internist Michelle Hauser, MD, clinical associate professor at Stanford (Calif.) University. She also discussed evidence-based techniques to support patients in maintaining dietary modifications.
 

Predominantly plant‐based diets

Popular predominantly plant‐based diets include the Mediterranean diet, a healthy vegetarian diet, a predominantly whole-food plant‐based (WFPB) diet, and the Dietary Approaches to Stop Hypertension (DASH) diet.

The DASH diet was originally designed to help patients manage their blood pressure, but evidence suggests that it also can help adults with obesity lose weight. In contrast to the DASH diet, the Mediterranean diet is not low-fat and not very restrictive. Yet the evidence suggests that the Mediterranean diet is not only helpful for losing weight but also can reduce the risk of various chronic diseases, including obesity, type 2 diabetes, cardiovascular disease (CVD), and cancer, Dr. Hauser said. In addition, data suggest that the Mediterranean diet may reduce all-cause mortality and lower cholesterol levels.

“I like to highlight all these protective effects to my patients, because even if their goal is to lose weight, knowing that hard work pays off in additional ways can keep them motivated,” Dr. Hauser stated.

A healthy vegetarian diet and a WFPB diet are similar, and both are helpful in weight loss and management of total cholesterol and LDL‐C levels. Furthermore, healthy vegetarian and WFPB diets may reduce the risk of type 2 diabetes, CVD, and some cancers. Cohort study data suggest that progressively more vegetarian diets are associated with lower BMIs.

“My interpretation of these data is that predominantly plant-based diets rich in whole foods are healthful and can be done in a way that is sustainable for most,” said Dr. Hauser. However, this generally requires a lot of support at the outset to address gaps in knowledge, skills, and other potential barriers.

For example, she referred one obese patient at risk of diabetes and cardiovascular disease to a registered dietitian to develop a dietary plan. The patient also attended a behavioral medicine weight management program to learn strategies such as using smaller plates, and his family attended a healthy cooking class together to improve meal planning and cooking skills.
 

Time‐restricted feeding

There are numerous variations of time-restricted feeding, commonly referred to as intermittent fasting, but the principles are similar – limiting food intake to a specific window of time each day or week.

Although some studies have shown that time-restricted feeding may help patients reduce adiposity and improve lipid markers, most studies comparing time-restricted feeding to a calorie-restricted diet have shown little to no difference in weight-related outcomes, Dr. Hauser said.

These data suggest that time-restricted feeding may help patients with weight loss only if the time restriction helps them reduce calorie intake. She also warned that time-restricted feeding might trigger late-night cravings and might not be helpful for individuals prone to food cravings.
 

 

 

Low‐carbohydrate and ketogenic diets

Losing muscle mass can prevent some people from dieting, but evidence suggests that a high-fat, very low-carbohydrate diet – also called a ketogenic diet – may help patients reduce weight and fat mass while preserving fat‐free mass, Dr. Hauser said.

The evidence regarding the usefulness of a low-carbohydrate (non-keto) diet is less clear because most studies compared it to a low-fat diet, and these two diets might lead to a similar extent of weight loss.
 

Rating the level of scientific evidence behind different diet options

Nutrition studies do not provide the same level of evidence as drug studies, said Dr. Hauser, because it is easier to conduct a randomized controlled trial of a drug versus placebo. Diets have many more variables, and it also takes much longer to observe most outcomes of a dietary change.

In addition, clinical trials of dietary interventions are typically short and focus on disease markers such as serum lipids and hemoglobin A1c levels. To obtain reliable information on the usefulness of a diet, researchers need to collect detailed health and lifestyle information from hundreds of thousands of people over several decades, which is not always feasible. “This is why meta-analyses of pooled dietary study data are more likely to yield dependable findings,” she noted.
 

Getting to know patients is essential to help them maintain diet modifications

When developing a diet plan for a patient, it is important to consider the sustainability of a dietary pattern. “The benefits of any healthy dietary change will only last as long as they can be maintained,” said Dr. Hauser. “Counseling someone on choosing an appropriate long-term dietary pattern requires getting to know them – taste preferences, food traditions, barriers, facilitators, food access, and time and cost restrictions.”

In an interview after the session, David Bittleman, MD, an internist at Veterans Affairs San Diego Health Care System, agreed that getting to know patients is essential for successfully advising them on diet.

“I always start developing a diet plan by trying to find out what [a patient’s] diet is like and what their goals are. I need to know what they are already doing in order to make suggestions about what they can do to make their diet healthier,” he said.

When asked about her approach to supporting patients in the long term, Dr. Hauser said that she recommends sequential, gradual changes and advises patients to prioritize the dietary modifications they are confident they can maintain.

Dr. Hauser and Dr. Bittleman report no relevant financial relationships.



AT INTERNAL MEDICINE 2023

Long-term impact of childhood trauma explained

Article Type
Changed
Fri, 05/05/2023 - 10:01

Dysregulated stress systems may help explain why childhood trauma has such a dramatic and enduring psychiatric impact, new research suggests.

“We already knew childhood trauma is associated with the later development of depressive and anxiety disorders, but it’s been unclear what makes sufferers of early trauma more likely to develop these psychiatric conditions,” study investigator Erika Kuzminskaite, PhD candidate, department of psychiatry, Amsterdam University Medical Center (UMC), the Netherlands, told this news organization.

Pauline Anderson
Erika Kuzminskaite

“The evidence now points to unbalanced stress systems as a possible cause of this vulnerability, and now the most important question is, how we can develop preventive interventions,” she added.

The findings were presented as part of the Anxiety and Depression Association of America Anxiety & Depression conference.
 

Elevated cortisol, inflammation

The study included 2,779 adults from the Netherlands Study of Depression and Anxiety (NESDA). Two thirds of participants were female.

Participants retrospectively reported childhood trauma, defined as emotional, physical, or sexual abuse or emotional or physical neglect, before the age of 18 years. Severe trauma was defined as multiple types or increased frequency of abuse.

Of the total cohort, 48% reported experiencing some childhood trauma – 21% reported severe trauma, 27% reported mild trauma, and 42% reported no childhood trauma.

Among those with trauma, 89% had a current or remitted anxiety or depressive disorder, and 11% had no psychiatric sequelae. Among participants who reported no trauma, 68% had a current or remitted disorder, and 32% had no psychiatric disorders.

At baseline, researchers assessed markers of major bodily stress systems, including the hypothalamic-pituitary-adrenal (HPA) axis, the immune-inflammatory system, and the autonomic nervous system (ANS). They examined these markers separately and cumulatively.

In one model, investigators found that levels of cortisol and inflammation were significantly elevated in those with severe childhood trauma compared to those with no childhood trauma. The effects were largest for the cumulative markers for HPA-axis, inflammation, and all stress system markers (Cohen’s d = 0.23, 0.12, and 0.25, respectively). There was no association with ANS markers.
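The Cohen's d values above are standardized mean differences between groups. As a rough illustration of how such an effect size is computed (using made-up group summaries, not the study's actual data), a minimal sketch:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference between two groups, using the pooled SD."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical example: severe-trauma group vs. no-trauma group on a
# cumulative stress-marker index (illustrative numbers only).
d = cohens_d(10.23, 1.0, 100, 10.00, 1.0, 100)
print(round(d, 2))  # 0.23 -- a small effect, comparable to the cumulative HPA-axis marker above
```

By convention, d around 0.2 is considered a small effect, which puts the reported group differences in context.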

The results were partially explained by lifestyle, said Ms. Kuzminskaite, who noted that people with severe childhood trauma tend to have a higher body mass index, smoke more, and have other unhealthy habits that may represent a “coping” mechanism for trauma.

Those who experienced childhood trauma also have higher rates of other disorders, including asthma, diabetes, and cardiovascular disease. Ms. Kuzminskaite noted that people with childhood trauma have at least double the risk of cancer in later life.

When researchers adjusted for lifestyle factors and chronic conditions, the association for cortisol was reduced and that for inflammation disappeared. However, the cumulative inflammatory markers remained significant.

Another model examined lipopolysaccharide (LPS)-stimulated immune-inflammatory markers by childhood trauma severity. This provides a more “dynamic” measure of stress systems than looking only at static circulating levels in the blood, as was done in the first model, said Ms. Kuzminskaite.

“These levels should theoretically be more affected by experiences such as childhood trauma and they are also less sensitive to lifestyle.”

Here, researchers found significant positive associations with childhood trauma, especially severe trauma, after adjusting for lifestyle and health-related covariates (cumulative index d = 0.19).

“Almost all people with childhood trauma, especially severe trauma, had LPS-stimulated cytokines upregulated,” said Ms. Kuzminskaite. “So again, there is this dysregulation of immune system functioning in these subjects.”

And again, the strongest effect was for the cumulative index of all cytokines, she said.

Personalized interventions

Ms. Kuzminskaite noted the importance of learning the impact of early trauma on stress responses. “The goal is to eventually have personalized interventions for people with depression or anxiety related to childhood trauma, or even preventative interventions. If we know, for example, something is going wrong with a patient’s stress systems, we can suggest some therapeutic targets.”

Investigators in Amsterdam are examining the efficacy of mifepristone, a progesterone and glucocorticoid receptor blocker that is used along with misoprostol for medication abortion and to treat hyperglycemia in patients with Cushing syndrome. “The drug is supposed to reset the stress system functioning,” said Ms. Kuzminskaite.

It’s still important to target unhealthy lifestyle habits “that are really impacting the functioning of the stress systems,” she said. Lifestyle interventions could improve the efficacy of treatments for depression, for example, she added.

Luana Marques, PhD, associate professor, department of psychiatry, Harvard Medical School, Boston, said such research is important.

“It reveals the potentially extensive and long-lasting impact of childhood trauma on functioning. The findings underscore the importance of equipping at-risk and trauma-exposed youth with evidence-based skills for managing stress,” she said.

No conflicts of interest were reported.

A version of this article first appeared on Medscape.com.


AT ADAA 2023

Normal CRP during RA flares: An ‘underappreciated, persistent phenotype’

Article Type
Changed
Tue, 05/09/2023 - 13:22

Even when C-reactive protein (CRP) levels are normal, patients with seropositive rheumatoid arthritis (RA) could still be experiencing significant disease that persists over time, researchers from University College London have found.

Similar levels of joint erosion and disease activity were observed over a 5-year period when researchers compared patients who had high CRP levels (> 5 mg/L) with patients whose CRP levels were consistently normal (< 5 mg/L) at the time of an ultrasound-proven disease flare.

“Our data suggests that the phenotype of normal CRP represents at least 5% of our cohort,” Bhavika Sethi, MBChB, reported in a virtual poster presentation at the annual meeting of the British Society for Rheumatology.

“They are more likely to require biologic treatment, and this continues on even though they have equivalent DAS28 [disease activity score in 28 joints] and risk of joint damage” to high-CRP patients, she said.

These patients are a significant minority, Dr. Sethi added, and “we need to think about how we provide care for them and allocate resources.”
 

Diagnostic delay and poor outcomes previously seen

The study is a continuation of a larger project, the corresponding author for the poster, Matthew Hutchinson, MBChB, told this news organization.

A few years ago, Dr. Hutchinson explained, a subset of patients with normal CRP levels during RA flares was identified; these patients were more likely to have experienced diagnostic delay and worse outcomes than those with high CRP levels.

The aim of the current study was to see whether those findings persisted by longitudinally assessing patient records and seeing what happened 1, 2, and 5 years later. They evaluated 312 patients with seropositive RA, of whom 28 had CRP < 5 mg/L as well as active disease, which was determined on the basis of a DAS28 > 4.5. Of those 28 patients, 16 had persistently low CRP (< 5 mg/L) despite active disease. All patients who were taking tocilizumab were excluded from the study because of its CRP-lowering properties.

“Our project was showing that this group of people exist, trying to characterize them a little better” and that the study serves as a “jumping-off point” for future research, Dr. Hutchinson said.

The study was also conducted to “make people more aware of [patients with normal CRP during flare], because treating clinicians could be falsely reassured by a normal CRP,” he added. “Patients in front of them could actually be undertreated and have worse outcomes if [it is] not picked up,” Dr. Hutchinson suggested.

In comparison with those with high CRP levels, those with normal CRP levels were more likely to be receiving biologic treatment at 5 years (76.6% vs. 44.4%; P = .0323).

At 5 years, DAS28 was similar (P = .9615) among patients with normal CRP levels and those with high CRP levels, at a median of 2.8 and 3.2, respectively. A similar percentage of patients in these two groups also had joint damage (63.3% vs. 71.4%; P = .7384).
 

Don’t rely only on CRP to diagnose and manage RA flares

“CRP is a generic inflammatory marker in most people,” Dr. Hutchinson said. “In the majority of situations when either there is inflammation or an infection, certainly if it’s systemic infection or inflammation, you will find CRP being elevated on the blood tests.”

For someone presenting with joint pain, high CRP can be a useful indicator that it’s more of an inflammatory process than physical injury, he added. CRP is also frequently used to calculate DAS28 to monitor disease activity.
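As background, a commonly cited formulation of the DAS28-CRP score (the poster abstract does not specify which DAS28 variant was used, so this is offered only for context) combines the 28-joint tender and swollen counts (TJC28, SJC28), the patient's global health assessment (GH, 0-100 mm visual analog scale), and CRP in mg/L:

DAS28-CRP = 0.56 × √TJC28 + 0.28 × √SJC28 + 0.36 × ln(CRP + 1) + 0.014 × GH + 0.96

Because the ln(CRP + 1) term is near zero when CRP is normal, a patient can still reach a high score on the strength of joint counts and global health alone, consistent with the normal-CRP flares described in this study.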

“This study highlights that CRP may be normal during flares in some people with RA,” Jeffrey A. Sparks, MD, told this news organization.

“These patients may still require advanced therapies and can accrue damage,” the rheumatologist from Brigham and Women’s Hospital and Harvard University, Boston, added.

“Clinicians should not only rely on CRP to diagnose and manage RA flares,” said Dr. Sparks, who was not involved in the study.

The study was independently supported. Dr. Hutchinson and Dr. Sethi report no relevant financial relationships. Dr. Sparks is supported by the National Institute of Arthritis and Musculoskeletal and Skin Diseases, the R. Bruce and Joan M. Mickey Research Scholar Fund, and the Llura Gund Award for Rheumatoid Arthritis Research and Care; he has received research support from Bristol-Myers Squibb and has performed consultancy for AbbVie, Amgen, Boehringer Ingelheim, Bristol-Myers Squibb, Gilead, Inova Diagnostics, Janssen, Optum, and Pfizer.

*Correction, 5/9/2023: This article has been updated to correct the units for C-reactive protein from mg/dL to mg/L.

A version of this article first appeared on Medscape.com.

Article Source

AT BSR 2023


LAA closure outcomes improve with CCTA: Swiss-Apero subanalysis

Article Type
Changed
Fri, 05/05/2023 - 10:02

The largest multicenter randomized trial to date of CT angiography before left atrial appendage closure (LAAC) to treat atrial fibrillation has added to the evidence that the imaging technique on top of transesophageal echocardiography achieves a higher degree of short- and long-term success than TEE alone.

The results come from a subanalysis of the Swiss-Apero trial, a randomized comparison of the Watchman and Amulet devices for LAAC, whose results were published in Circulation.

“Our observational data support the use of CT for LAAC procedure planning,” senior investigator Lorenz Räber, MD, PhD, said in an interview. “This is not very surprising given the high variability of the LAA anatomy and the associated complexity of the procedure.” Dr. Räber is director of the catheterization laboratory at Inselspital, Bern (Switzerland) University Hospital.

The study, published online in JACC: Cardiovascular Interventions, included 219 LAAC procedures in which the operators performed coronary CT angiography (CCTA) beforehand. When the investigators designed the study, LAAC procedures were typically planned using TEE alone, and so participating operators were blinded to preprocedural CCTA imaging. Soon after the study launch, European cardiology societies issued a consensus statement that included CCTA as an option for procedure planning. So the Swiss-Apero investigators changed the subanalysis protocol to unblind the operators – that is, they were permitted to plan LAAC procedures with CCTA imaging in addition to TEE. In this subanalysis, most patients underwent implantation with operators blinded to CCTA (57.9% vs. 41.2%).
 

Study results

The subanalysis determined that operator unblinding to preprocedural CCTA resulted in better LAAC success, both in the short term, at 93.5% vs. 81.1% (P = .009; adjusted odds ratio, 2.76; 95% confidence interval, 1.05-7.29; P = .040), and in the long term, at 83.7% vs. 72.4% (P = .050; aOR, 2.12; 95% CI, 1.03-4.35; P = .041).

Dr. Räber noted that this is only the third study to date to examine the potential impact of preprocedural CCTA plus TEE. One was a small study of 24 consecutive LAAC procedures with the Watchman device that compared TEE alone with CCTA plus TEE and found better outcomes in the group that had both imaging modalities. A larger, single-center cohort study of 485 LAAC Watchman procedures found that CCTA resulted in faster operation times and higher device implantation success rates, but no significant difference in procedural complications.

Dr. Räber explained why his group’s subanalysis may have found a clinical benefit with CCTA on top of TEE. “Our study was much larger, as compared to the randomized clinical trial, and there was no selection bias as in the second study mentioned before, as operators did not have the option to decide whether or not to assess the CCTA prior to the procedure,” he said. “Finally, in the previous studies there was no random allocation of device type” – that is, Amulet versus Watchman.

One study limitation Dr. Räber noted was that significantly more patients in the blinded group were discharged with dual-antiplatelet therapy. “The lower rate of procedure complications observed in unblinded procedures was mostly driven by a lower number of major bleedings and in particular of pericardial tamponade,” he said. “We cannot therefore exclude that the higher percentage of patients under dual-antiplatelet therapy in the CCTA-blinded group might have favored this difference.”

However, he noted the investigators corrected their analysis to account for differences between the groups. “Importantly, the numerical excess in major procedural bleeding was observed within both the single-antiplatelet therapy and dual-antiplatelet therapy subgroups of the TEE-only group.”

In an accompanying editorial, coauthors Brian O’Neill, MD, and Dee Dee Wang, MD, both with the Center for Structural Heart Disease at Henry Ford Hospital in Detroit, noted that the Swiss-Apero subanalysis “reinforced” the benefit of CCTA before LAAC.

“This study demonstrated, for the first time, improved short- and long-term procedural success using CT in addition to TEE for left atrial appendage occlusion,” Dr. O’Neill said in an interview. “This particular study may serve as a guide to an adequately powered randomized trial of CT versus TEE in left atrial appendage occlusion.” He added that future LAAC trials should incorporate preprocedural CCTA.

Dr. O’Neill noted that, as a subanalysis of a randomized trial, the “results are hypothesis generating.” However, he added, “the results are in line with several previous studies of CT versus TEE in left atrial appendage occlusion.”

Dr. Räber disclosed financial relationships with Abbott Vascular, Boston Scientific, Biotronik, Infraredx, Heartflow, Sanofi, Regeneron, Amgen, AstraZeneca, CSL Behring, Canon, Occlutech, and Vifor. Dr. O’Neill disclosed financial relationships with Edwards Lifesciences, Medtronic, and Abbott Vascular.


Article Source

FROM JACC: CARDIOVASCULAR INTERVENTIONS


Controlled hyperthermia: Novel treatment of BCCs without surgery continues to be refined

Article Type
Changed
Fri, 05/05/2023 - 10:03

Treating superficial and nodular basal cell cancers (BCCs) with an apoptotic process induced by controlled hyperthermia resulted in strong histologic clearance of tumors, an interim report from an ongoing study showed.

“For 2,000 years, it’s been known that heat can kill cancers,” Christopher B. Zachary, MD, said at the annual conference of the American Society for Laser Medicine and Surgery, where the study was presented during an abstract session. The technique relies on an apoptotic reaction “rather than a destructive reaction coming from excessive heat,” he noted.


Dr. Zachary, professor and chair emeritus of the department of dermatology at the University of California, Irvine, and colleagues, evaluated a novel, noninvasive technique of controlled hyperthermia and mapping protocol (CHAMP) designed to help clinicians with margin assessment and treatment of superficial and nodular BCCs. For this prospective study, which was first described at the 2022 ASLMS annual conference and is being conducted at three centers, 73 patients with biopsy-proven superficial and nodular BCCs have been scanned with the VivoSight Dx optical coherence tomography (OCT) device to map BCC tumor margins.

The BCCs were treated with the Sciton long-pulsed 1,064-nm Nd:YAG laser, using a 4-mm beam diameter scan pattern with no overlap and an 8-millisecond pulse duration. Patients were randomized to either standard treatment, with 120-140 J/cm2 pulses delivered until tissue graying and contraction were observed, or to the CHAMP controlled hyperthermia technique, with repeated 25 J/cm2 pulses under thermal camera imaging to maintain a consistent temperature of 55° C for 60 seconds. Patients were rescanned by OCT at 3-12 months for any signs of residual tumor and, if positive, were retreated. Finally, lesions were excised for evidence of histologic clearance.

To date, 48 patients have completed the study. Among the 26 patients treated with the CHAMP method, 22 (84.6%) were histologically clear, as were 19 of the 22 (86.4%) in the standard treatment group. Ulceration was uncommon with the CHAMP method, and patients healed with modest erythema, Dr. Zachary said.



Pretreatment OCT mapping of BCCs indicated that tumors extended beyond their 5-mm clinical margins in 11 cases (15%). “This will be of interest to those who treat BCCs by Mohs or standard excision,” he said. Increased vascularity measured by dynamic OCT was noted in most CHAMP patients immediately after irradiation, which suggests that apoptosis was the primary mechanism of tumor response instead of vascular destruction.

“The traditional technique for using the long-pulsed 1,064-nm Nd:YAG laser to cause damage and destruction of BCC is 120-140 J/cm2 at one or two passes until you get to an endpoint of graying and contraction of tissue,” Dr. Zachary said. “That’s opposed to the ‘Low and Slow’ approach [where you use] multiple pulses at 25 J/cm2 until you achieve an optimal time and temperature. If you treat above 60° C, you tend to get epidermal blistering, prolonged healing, and interestingly, absence of pain. I think that’s because you kill off the nerve fibers. With the low fluence multiple scan technique, you’re going for an even flat-top heating.”

Currently, he and his colleagues consider 55° C for 60 seconds “the optimal parameters,” he said, but “it could be 45 degrees at 90 seconds or 2 minutes. We don’t know yet.”

In an interview at the meeting, one of the abstract session moderators, Mathew M. Avram, MD, JD, director of laser, cosmetics, and dermatologic surgery at Massachusetts General Hospital, Boston, said that he was encouraged by the study results as investigations into effective, noninvasive treatment of BCC continue to move forward. “Details matter such as the temperature [of energy delivery] and noninvasive imaging to delineate the appropriate margins,” said Dr. Avram, who has conducted research on the 1,064-nm long-pulsed Nd:YAG laser as an alternative treatment for nonfacial BCCs in patients who are poor surgical candidates.

Dr. Mathew M. Avram

“Hopefully, at some point,” he said, such approaches will “become the standard of care for many BCCs that we are now treating surgically. I don’t think this will happen in the next 3 years, but I think in the long term, it will emerge as the treatment of choice.”

The study is being funded by Michelson Diagnostics. Sciton provided the long-pulsed 1,064-nm lasers devices being used in the trial. Dr. Zachary reported having no relevant disclosures. Dr. Avram disclosed that he has received consulting fees from Sciton.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

Treating superficial and nodular basal cell cancers (BCCs) with an apoptotic process induced by controlled hyperthermia resulted in strong histologic clearance of tumors, an interim report from an ongoing study showed.

“For 2,000 years, it’s been known that heat can kill cancers,” Christopher B. Zachary, MD, said at the annual conference of the American Society for Laser Medicine and Surgery, where the study was presented during an abstract session. The controlled hyperthermia approach, he noted, relies on an apoptotic reaction “rather than a destructive reaction coming from excessive heat.”

Dr. Christopher B. Zachary

Dr. Zachary, professor and chair emeritus of the department of dermatology at the University of California, Irvine, and colleagues, evaluated a novel, noninvasive technique of controlled hyperthermia and mapping protocol (CHAMP) designed to help clinicians with margin assessment and treatment of superficial and nodular BCCs. For this prospective study, which was first described at the 2022 ASLMS annual conference and is being conducted at three centers, 73 patients with biopsy-proven superficial and nodular BCCs have been scanned with the VivoSight Dx optical coherence tomography (OCT) device to map BCC tumor margins.

The BCCs were treated with the Sciton long-pulsed 1,064-nm Nd:YAG laser, using a 4-mm beam diameter scan pattern with no overlap and an 8-millisecond pulse duration. Lesions were randomized to either the standard approach of 120-140 J/cm² pulses delivered until tissue graying and contraction were observed, or the CHAMP controlled hyperthermia technique, in which repeated 25 J/cm² pulses were delivered under thermal camera imaging to maintain a consistent temperature of 55° C for 60 seconds. Patients were rescanned by OCT at 3 to 12 months for any signs of residual tumor and, if positive, were retreated. Finally, lesions were excised for evidence of histologic clearance.

To date, 48 patients have completed the study. Among the 26 patients treated with the CHAMP method, 22 (84.6%) were histologically clear, as were 19 of the 22 (86.4%) in the standard treatment group. Ulceration was uncommon with the CHAMP method, and patients healed with modest erythema, Dr. Zachary said.
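With arms of 26 and 22 patients, the reported clearance rates carry wide statistical uncertainty. The percentages can be verified, and a confidence interval attached, with a short sketch; the Wilson score interval used here is our choice for illustration, not a method the investigators report:

```python
from math import sqrt

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

# Interim clearance results as reported
for label, k, n in [("CHAMP", 22, 26), ("standard", 19, 22)]:
    lo, hi = wilson_ci(k, n)
    print(f"{label}: {k}/{n} = {k/n:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

Both intervals span roughly 65%-95%, so the two arms are statistically indistinguishable at this interim stage.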



Pretreatment OCT mapping of BCCs indicated that tumors extended beyond their 5-mm clinical margins in 11 cases (15%). “This will be of interest to those who treat BCCs by Mohs or standard excision,” he said. Increased vascularity measured by dynamic OCT was noted in most CHAMP patients immediately after irradiation, which suggests that apoptosis was the primary mechanism of tumor response instead of vascular destruction.

“The traditional technique for using the long-pulsed 1,064-nm Nd:YAG laser to cause damage and destruction of BCC is 120-140 J/cm² at one or two passes until you get to an endpoint of graying and contraction of tissue,” Dr. Zachary said. “That’s opposed to the ‘Low and Slow’ approach [where you use] multiple pulses at 25 J/cm² until you achieve an optimal time and temperature. If you treat above 60° C, you tend to get epidermal blistering, prolonged healing, and interestingly, absence of pain. I think that’s because you kill off the nerve fibers. With the low fluence multiple scan technique, you’re going for an even flat-top heating.”

Currently, he and his colleagues consider 55° C for 60 seconds to be “the optimal parameters,” he said, but “it could be 45 degrees at 90 seconds or two minutes. We don’t know yet.”
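One way to compare such time-temperature combinations is the CEM43 thermal-dose metric (cumulative equivalent minutes at 43° C) used in the hyperthermia literature. The investigators do not cite this metric, so the sketch below is purely illustrative:

```python
def cem43(temp_c, minutes):
    """Cumulative equivalent minutes at 43 °C.

    Uses the conventional breakpoint model: R = 0.5 at or above
    43 °C, R = 0.25 below it.
    """
    r = 0.5 if temp_c >= 43 else 0.25
    return minutes * r ** (43 - temp_c)

# Trial protocol: 55 °C held for 60 seconds (1 minute)
print(cem43(55, 1.0))   # → 4096.0 CEM43 minutes
# Speculative alternative: 45 °C for 2 minutes
print(cem43(45, 2.0))   # → 8.0 CEM43 minutes
```

Under this model the two settings differ in thermal dose by a factor of about 500, which underscores Dr. Zachary’s point that the optimal parameters are not yet settled.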

In an interview at the meeting, one of the abstract session moderators, Mathew M. Avram, MD, JD, director of laser, cosmetics, and dermatologic surgery at Massachusetts General Hospital, Boston, said that he was encouraged by the study results as investigations into effective, noninvasive treatment of BCC continue to move forward. “Details matter such as the temperature [of energy delivery] and noninvasive imaging to delineate the appropriate margins,” said Dr. Avram, who has conducted research on the 1,064-nm long-pulsed Nd:YAG laser as an alternative treatment for nonfacial BCCs in patients who are poor surgical candidates.

Dr. Mathew M. Avram

“Hopefully, at some point,” he said, such approaches will “become the standard of care for many BCCs that we are now treating surgically. I don’t think this will happen in the next 3 years, but I think in the long term, it will emerge as the treatment of choice.”

The study is being funded by Michelson Diagnostics. Sciton provided the long-pulsed 1,064-nm laser devices being used in the trial. Dr. Zachary reported having no relevant disclosures. Dr. Avram disclosed that he has received consulting fees from Sciton.

AT ASLMS 2023
