At 5 years, TAVI valves perform better than surgical ones
In a pooled analysis from two randomized trials, transcatheter aortic valve implantation (TAVI) was associated with significantly less bioprosthetic valve dysfunction (BVD) than surgical prosthetic valve replacement, according to data presented as a late-breaker at the Cardiovascular Research Technologies conference, sponsored by MedStar Heart & Vascular Institute.
“The difference in valve performance was driven by a twofold lower SVD [structural valve deterioration] and a threefold lower severe PPM [prosthesis-patient mismatch] for TAVI versus surgery,” reported Steven J. Yakubov, MD.
The data were pooled from the CoreValve U.S. Pivotal and SURTAVI randomized trials. Of the patients participating in these two trials, 5-year follow-up data were available for 1,128 randomized to CoreValve/Evolut TAVI and 971 randomized to surgical prosthetic valve replacement.
The major focus of the study was on the cumulative incidence of BVD, but the study also included separate analyses on the relationship between BVD and clinical outcomes. Preprocedural indicators for BVD at 5 years were also analyzed.
SVD was defined as a mean gradient increase of at least 10 mm Hg from discharge to 30 days, along with at least 20 mm Hg at last echo or new-onset aortic regurgitation. Nonstructural valve deterioration (NSVD) was defined as severe PPM at discharge or 30 days or severe paravalvular regurgitation through 5 years. In addition to these two components, the BVD endpoint also included thrombosis and endocarditis.
Surgical valve deterioration high at 5 years
On the basis of these definitions, the rate of BVD at 5 years was 14.2% in the surgery group and 7.8% in the TAVI group, translating into a 50% risk reduction in favor of TAVI (hazard ratio, 0.50; P < .001).
Thrombosis and endocarditis occurred at low rates in both groups, but every other component of BVD favored TAVI significantly, not just numerically. This included SVD (2.2% vs. 4.4%; P = .004) and both components of NSVD: severe PPM (3.7% vs. 11.8%; P < .001) and severe paravalvular regurgitation (0.2% vs. 1.2%; P = .02).
When stratified by annular diameter, the relative advantage of TAVI over surgery was greatest in those valves with diameters of up to 23 mm. In this group, the lower relative rate in the TAVI group (8.6% vs. 19.7%) represented a nearly 70% reduction in risk of valve deterioration at 5 years (HR, 0.31; P < .001).
However, the advantage at 5 years also remained substantial and significant in larger valves (8.1% vs. 12.6%), translating into a 40% risk reduction in favor of TAVI (HR, 0.60; P = .002).
Independent of type of valve replacement, BVD at 5 years was associated with worse outcomes, including significantly increased risks for all-cause mortality (HR, 1.46; P = .004), cardiovascular mortality (HR, 1.84; P < .001), and hospitalization for valve disease or worsening heart failure (HR, 1.67; P = .001).
The baseline characteristics that were statistically associated with BVD at 5 years on multivariate analysis in pooled data from both the TAVI and surgical groups included age (P = .02), a creatinine clearance less than 30 mL/min per 1.73 m² (P = .006), and a relatively low baseline left ventricular ejection fraction (P < .001).
BVD criteria validated for outcome prediction
The four components of valve performance employed in this analysis (SVD, NSVD, thrombosis, and endocarditis) were drawn from consensus documents issued by the Valve Academic Research Consortium and the European Association of Percutaneous Cardiovascular Interventions, but the relative importance of these components for predicting valve survival was previously unknown, according to Dr. Yakubov.
“This is the first analysis to validate clinical criteria for valve performance and its association with clinical outcomes,” said Dr. Yakubov, medical director of cardiovascular studies, OhioHealth Research Institute at Riverside Methodist Hospital, Columbus.
This is also the first study to employ randomized data to prove an advantage of TAVI over surgery in long-term follow-up.
A 10-year follow-up is planned for the patients who participated in these two trials, but the lower rate of BVD in the TAVI arm at 5 years is already a threat to surgical valve replacement, acknowledged several surgeons who served as panelists in the session where these results were presented.
“I think that these data are a reflection of the fact that we [surgeons] are not being as aggressive as we should be,” said Gregory P. Fontana, MD, who is national director, cardiothoracic surgery, HCA Healthcare, and is affiliated with Los Robles Health System, Thousand Oaks, Calif. “We need to be employing larger prostheses.”
A very similar comment was made by Michael J. Reardon, MD, a professor of cardiothoracic surgery at Houston Methodist Hospital. Pointing to the higher rate of paravalvular leak (PVL) as an example of a common postsurgical complication, he agreed that surgeons should be moving to bigger valve sizes.
While adjustments in valve size might address the steeper rise in NSVD subtypes of BVD observed in the surgical group, Dr. Reardon and others pointed out that late BVD events also rose at a greater pace in the surgical group. These data suggest that other improvements in technique might also be needed to keep surgical valve replacement competitive.
Dr. Yakubov reported financial relationships with Medtronic and Boston Scientific, both of which provided funding for this study. Dr. Fontana reported financial relationships with Abbott and Medtronic. Dr. Reardon reported financial relationships with Abbott, Boston Scientific, Medtronic, and Gore Medical.
FROM CRT 2023
Skin reactions from melanoma targeted and immune therapies range from pruritus to SJS
SAN DIEGO – A
“These skin reactions can cause pain, itching, and emotional and social distress that may severely impact activities of daily living,” Aleksandr Itkin, MD, a dermatologist at Scripps MD Anderson Cancer Center, San Diego, said at the annual Cutaneous Malignancy Update. An estimated 30%-50% of patients on immune checkpoint inhibitors (ICIs) experience cutaneous adverse events, he said, which leads to dose reduction or discontinuation of ICIs in 20% of cases.
Clinicians first observed these side effects in 2011, with the Food and Drug Administration approval of ipilimumab, a human cytotoxic T-lymphocyte antigen 4 (CTLA-4)–blocking antibody, for metastatic melanoma, followed by the programmed death receptor-1 (PD-1) inhibitors nivolumab and pembrolizumab, which were approved in 2014 for the same indication.
Since then, more ICIs showing similar adverse cutaneous reactions have been approved by the FDA. These include avelumab, atezolizumab in combination with cobimetinib and vemurafenib, and a combination of relatlimab, an anti-LAG-3 antibody, with nivolumab.
Among the targeted therapies, the BRAF inhibitors vemurafenib and dabrafenib alone or in combination with MAPK pathway inhibitors cobimetinib and trametinib, which are a first-line therapy for V600 BRAF mutated metastatic melanoma, are associated with their own set of cutaneous reactions. The oncolytic modified herpes simplex virus T-VEC (talimogene laherparepvec), approved by the FDA in 2015 for the treatment of unresectable stage IIIB-IV metastatic melanoma, also results in cutaneous reactions that have been found useful in assessing the therapeutic outcome of this agent.
According to a 2020 CME article on the dermatologic adverse events that occur after treatment initiation with ICIs, the time of onset of psoriasiform rash is within the first 3 weeks, maculopapular rash and pruritus in the first 4-6 weeks, lichenoid eruption in the first 7-12 weeks, and bullous pemphigoid in weeks 13-15. The most severe reactions – SJS, toxic epidermal necrolysis (TEN), and drug reaction with eosinophilia and systemic symptoms (DRESS) – usually occur after 2-3 months of treatment.
A subsequent retrospective cohort study of patients in the United States treated with ICIs for a variety of systemic malignancies and matched controls found that the ICI-treated group had a significantly higher incidence of pruritus, mucositis, erythroderma, maculopapular eruption, vitiligo, lichen planus, bullous pemphigoid, Grover disease, rash, other nonspecific eruptions, and drug eruption or other nonspecific drug reaction. Patients with melanoma and renal cell carcinoma and those receiving combination therapy were at a higher risk of cutaneous immune-related adverse events.
Another study, a prospective trial of 617 patients with various advanced cancers (including melanoma), found that both severe and mild skin toxicities were significantly associated with improved progression-free and overall survival.
According to Dr. Itkin, erythema multiforme, SJS, and TEN have been reported with anti-PD-1, anti-CTLA-4, and BRAF inhibitors. In TEN induced by vemurafenib, an in vitro analysis showed cross-activation of lymphocytes with dabrafenib and with sulfamethoxazole. “This means that you may want to avoid sulfonamides in patients with serious hypersensitivity to vemurafenib, and vice versa,” he said at the meeting hosted by Scripps MD Anderson Cancer Center.
Acneiform eruptions
In addition, the use of MAPK inhibitors can trigger acneiform eruptions. In one study, 77% of patients on trametinib alone developed an acneiform eruption, compared with only 10% of those on trametinib in combination with dabrafenib. “Inhibition of the MAPK pathway leads to decreased proliferative markers, further leading to decreased keratinocyte replication, increased inflammatory cytokine, apoptosis, thinning and abnormal epidermal differentiation, follicular rupture, and papule/pustule formation,” he said. For these cases, “treatment options are similar to what we use for regular acne except for here, use of systemic steroids is sometimes needed, especially in more severe cases. The reaction may be so severe as to lead to dose reduction or discontinuation of antineoplastic treatment.”
Effects on nail, hair
Paronychia and onycholysis are additional potential adverse events of MEK inhibitors and BRAF inhibitors alone or in combination, Dr. Itkin continued. Onycholysis is associated with dabrafenib alone or in combination with trametinib, while vemurafenib has been shown to induce acute paronychia and brittle nails. He said that secondary infections in these cases can be treated with the options familiar to dermatologists in their daily practice: oral doxycycline, azole antifungals, vinegar soaks, topical superpotent corticosteroids under occlusion, nail avulsion, and phenol nail matrix ablation.
Dr. Itkin noted that while PD-1 and PD-L1 inhibitors can cause hair repigmentation, CTLA-4 and PD-1 inhibitors are more likely to cause vitiligo. Appearance of vitiligo is regarded as a good prognostic factor in the treatment of melanoma with various checkpoint inhibitors alone or in combination with each other or with radiation therapy. “About 5% of melanoma patients treated with ipilimumab will develop vitiligo,” he said.
ICI-induced vitiligo differs from conventional vitiligo in that there is no family or personal history of autoimmunity; it presents as a flecked pattern of lesions on photo-exposed skin, and it lacks the Koebner phenomenon. In addition, induction of squamous neoplasms can occur with BRAF inhibitors, especially in patients with a high frequency of RAS mutations.
He said that coadministration of MEK inhibitors such as trametinib and cobimetinib may prevent induction of keratinocytic neoplasms.
Dr. Itkin reported having no relevant financial disclosures.
SAN DIEGO – A
“These skin reactions can cause pain, itching, and emotional and social distress that may severely impact activities of daily living,” Aleksandr Itkin, MD, a dermatologist at Scripps MD Anderson Cancer Center, San Diego, said at the annual Cutaneous Malignancy Update. An estimated 30%-50% of patients on immune checkpoint inhibitors (ICIs) experience cutaneous adverse events, he said, which leads to dose reduction or discontinuation of ICIs in 20% of cases.
Clinicians first observed these side effects in 2011, with the Food and Drug Administration approval of ipilimumab, a human cytotoxic T-lymphocyte antigen 4 (CTLA-4)–blocking antibody, for metastatic melanoma, followed by the programmed death receptor-1 (PD-1) inhibitors nivolumab and pembrolizumab, which were approved in 2014 for the same indication.
Since then, more ICIs showing similar adverse cutaneous reactions have been approved by the FDA. These include avelumab, atezolizumab in combination with cobimetinib and vemurafenib, and a combination of relatlimab, an anti-LAG-3 antibody, with nivolumab.
Among the targeted therapies, the BRAF inhibitors vemurafenib and dabrafenib alone or in combination with MAPK pathway inhibitors cobimetinib and trametinib, which are a first-line therapy for V600 BRAF mutated metastatic melanoma, are associated with their own set of cutaneous reactions. The oncolytic modified herpes simplex virus T-VEC (talimogene laherparepvec), approved by the FDA in 2015 for the treatment of unresectable stage IIIB-IV metastatic melanoma, also results in cutaneous reactions that have been found useful in assessing the therapeutic outcome of this agent.
According to a 2020 CME article on the dermatologic adverse events that occur after treatment initiation with ICIs, the time of onset of psoriasiform rash is within the first 3 weeks, maculopapular rash and pruritus in the first 4-6 weeks, lichenoid eruption in the first 7-12 weeks, and bullous pemphigoid in weeks 13-15. The most severe reactions – SJS, toxic epidermal necrolysis (TEN), and drug reaction with eosinophilia and systemic symptoms (DRESS) – usually occur after 2-3 months of treatment.
A subsequent retrospective cohort study of patients in the United States treated with ICIs for a variety of systemic malignancies and matched controls found that the ICI-treated group had a significantly higher incidence of pruritus, mucositis, erythroderma, maculopapular eruption, vitiligo, lichen planus, bullous pemphigoid, Grover disease, rash, other nonspecific eruptions, and drug eruption or other nonspecific drug reaction. Patients with melanoma and renal cell carcinoma and those receiving combination therapy were at a higher risk of cutaneous immune-related adverse events.
Another study, a prospective trial of 617 patients with various advanced cancers (including melanoma), found that both severe and mild skin toxicities were significantly associated with improved progression-free and overall survival.
According to Dr. Itkin, erythema multiforme, SJS, and TEN have been reported with anti-PD1, anti-CTLA4, and BRAF inhibitors. In TEN induced by vemurafenib, an in vitro analysis showed cross-activation of lymphocytes with dabrafenib and with sulfamethoxazole. “This means you that may want to avoid sulfonamides in patients with serious hypersensitivity to vemurafenib, and vice versa,” he said at the meeting hosted by Scripps MD Anderson Cancer Center.
Acneiform eruptions
In addition, the use of MAPK inhibitors can trigger acneiform eruptions. In one study, 77% of patients on trametinib developed acneiform eruption, but only 10% of those on trametinib in combination with dabrafenib developed acneiform eruption. “Inhibition of the MAPK pathway leads to decreased proliferative markers, further leading to decreased keratinocyte replication, increased inflammatory cytokine, apoptosis, thinning and abnormal epidermal differentiation, follicular rupture, and papule/pustule formation,” he said. For these cases, “treatment options are similar to what we use for regular acne except for here, use of systemic steroids is sometimes needed, especially in more severe cases. The reaction may be so severe as to lead to dose reduction or discontinuation of antineoplastic treatment.”
Effects on nail, hair
Paronychia and onycholysis are additional potential adverse events of MEK inhibitors and BRAF inhibitors alone or in combination, Dr. Itkin continued. Onycholysis is associated with dabrafenib alone or in combination with trametinib, while vemurafenib has been shown to induce acute paronychia and brittle nails. He said that secondary infections in these cases can be treated with the options familiar to dermatologists in their daily practice: oral doxycycline, azole antifungals, vinegar soaks, topical superpotent corticosteroids under occlusion, nail avulsion, and phenol nail matrix ablation.
Dr. Itkin noted that while PD-1 and PD-L1 inhibitors can cause hair repigmentation, CTLA-4 and PD-1 inhibitors are more likely to cause vitiligo. Appearance of vitiligo is regarded as a good prognostic factor in the treatment of melanoma with various checkpoint inhibitors alone or in combination with each other or with radiation therapy. “About 5% of melanoma patients treated with ipilimumab will develop vitiligo,” he said.
ICI-induced vitiligo differs from conventional vitiligo in that there is no family or personal history of autoimmunity; it presents as a flecked pattern of lesion on photo-exposed skin, and it lacks the Koebner phenomenon. In addition, induction of squamous neoplasms can occur with BRAF inhibitors, especially in patients with a high frequency of RAS mutations.
He said that coadministration of MEK inhibitors such as trametinib and cobimetinib may prevent induction of keratinocytic neoplasms.
Dr. Itkin reported having no relevant financial disclosures.
SAN DIEGO – A
“These skin reactions can cause pain, itching, and emotional and social distress that may severely impact activities of daily living,” Aleksandr Itkin, MD, a dermatologist at Scripps MD Anderson Cancer Center, San Diego, said at the annual Cutaneous Malignancy Update. An estimated 30%-50% of patients on immune checkpoint inhibitors (ICIs) experience cutaneous adverse events, he said, which leads to dose reduction or discontinuation of ICIs in 20% of cases.
Clinicians first observed these side effects in 2011, with the Food and Drug Administration approval of ipilimumab, a human cytotoxic T-lymphocyte antigen 4 (CTLA-4)–blocking antibody, for metastatic melanoma, followed by the programmed death receptor-1 (PD-1) inhibitors nivolumab and pembrolizumab, which were approved in 2014 for the same indication.
Since then, more ICIs showing similar adverse cutaneous reactions have been approved by the FDA. These include avelumab, atezolizumab in combination with cobimetinib and vemurafenib, and a combination of relatlimab, an anti-LAG-3 antibody, with nivolumab.
Among the targeted therapies, the BRAF inhibitors vemurafenib and dabrafenib alone or in combination with MAPK pathway inhibitors cobimetinib and trametinib, which are a first-line therapy for V600 BRAF mutated metastatic melanoma, are associated with their own set of cutaneous reactions. The oncolytic modified herpes simplex virus T-VEC (talimogene laherparepvec), approved by the FDA in 2015 for the treatment of unresectable stage IIIB-IV metastatic melanoma, also results in cutaneous reactions that have been found useful in assessing the therapeutic outcome of this agent.
According to a 2020 CME article on the dermatologic adverse events that occur after treatment initiation with ICIs, the time of onset of psoriasiform rash is within the first 3 weeks, maculopapular rash and pruritus in the first 4-6 weeks, lichenoid eruption in the first 7-12 weeks, and bullous pemphigoid in weeks 13-15. The most severe reactions – SJS, toxic epidermal necrolysis (TEN), and drug reaction with eosinophilia and systemic symptoms (DRESS) – usually occur after 2-3 months of treatment.
A subsequent retrospective cohort study of patients in the United States treated with ICIs for a variety of systemic malignancies and matched controls found that the ICI-treated group had a significantly higher incidence of pruritus, mucositis, erythroderma, maculopapular eruption, vitiligo, lichen planus, bullous pemphigoid, Grover disease, rash, other nonspecific eruptions, and drug eruption or other nonspecific drug reaction. Patients with melanoma and renal cell carcinoma and those receiving combination therapy were at a higher risk of cutaneous immune-related adverse events.
Another study, a prospective trial of 617 patients with various advanced cancers (including melanoma), found that both severe and mild skin toxicities were significantly associated with improved progression-free and overall survival.
According to Dr. Itkin, erythema multiforme, SJS, and TEN have been reported with anti-PD1, anti-CTLA4, and BRAF inhibitors. In TEN induced by vemurafenib, an in vitro analysis showed cross-activation of lymphocytes with dabrafenib and with sulfamethoxazole. “This means you that may want to avoid sulfonamides in patients with serious hypersensitivity to vemurafenib, and vice versa,” he said at the meeting hosted by Scripps MD Anderson Cancer Center.
Acneiform eruptions
In addition, the use of MAPK inhibitors can trigger acneiform eruptions. In one study, 77% of patients on trametinib developed acneiform eruption, but only 10% of those on trametinib in combination with dabrafenib developed acneiform eruption. “Inhibition of the MAPK pathway leads to decreased proliferative markers, further leading to decreased keratinocyte replication, increased inflammatory cytokine, apoptosis, thinning and abnormal epidermal differentiation, follicular rupture, and papule/pustule formation,” he said. For these cases, “treatment options are similar to what we use for regular acne except for here, use of systemic steroids is sometimes needed, especially in more severe cases. The reaction may be so severe as to lead to dose reduction or discontinuation of antineoplastic treatment.”
Effects on nail, hair
Paronychia and onycholysis are additional potential adverse events of MEK inhibitors and BRAF inhibitors alone or in combination, Dr. Itkin continued. Onycholysis is associated with dabrafenib alone or in combination with trametinib, while vemurafenib has been shown to induce acute paronychia and brittle nails. He said that secondary infections in these cases can be treated with the options familiar to dermatologists in their daily practice: oral doxycycline, azole antifungals, vinegar soaks, topical superpotent corticosteroids under occlusion, nail avulsion, and phenol nail matrix ablation.
Dr. Itkin noted that while PD-1 and PD-L1 inhibitors can cause hair repigmentation, CTLA-4 and PD-1 inhibitors are more likely to cause vitiligo. Appearance of vitiligo is regarded as a good prognostic factor in the treatment of melanoma with various checkpoint inhibitors alone or in combination with each other or with radiation therapy. “About 5% of melanoma patients treated with ipilimumab will develop vitiligo,” he said.
ICI-induced vitiligo differs from conventional vitiligo in that there is no family or personal history of autoimmunity; it presents as a flecked pattern of lesions on photo-exposed skin, and it lacks the Koebner phenomenon. In addition, induction of squamous neoplasms can occur with BRAF inhibitors, especially in patients with a high frequency of RAS mutations.
He said that coadministration of MEK inhibitors such as trametinib and cobimetinib may prevent induction of keratinocytic neoplasms.
Dr. Itkin reported having no relevant financial disclosures.
AT MELANOMA 2023
IBD: More patients on vedolizumab vs. anti-TNFs at 2 years
COPENHAGEN – More patients with inflammatory bowel disease remain on vedolizumab than on anti-TNF agents at 2 years, according to the first meta-analysis of their real-world effectiveness.
The results applied mostly to bionaive patients, and the benefit of vedolizumab over both TNF inhibitors – infliximab (Remicade) and adalimumab (Humira) – was more evident in ulcerative colitis than in Crohn’s disease, noted the researchers, led by Tsz Hong Yiu, MD, a clinician and researcher at the University of Sydney.
“It appears that patients are more likely to stay on vedolizumab than either infliximab or adalimumab, especially in bionaive patients, which could suggest either a better tolerance to the treatment or a better response,” Dr. Yiu said in an interview at the annual Congress of the European Crohn’s and Colitis Organisation.
The 2-year follow up data were particularly encouraging, noted Dr. Yiu, with more patients persisting on vedolizumab than both anti-TNF alpha drugs overall with respect to both ulcerative colitis and Crohn’s disease.
In a head-to-head comparison, 15% more patients stayed on vedolizumab than anti-TNF alpha drugs overall, at 1-year follow-up for both ulcerative colitis and Crohn’s disease (risk ratio, 1.15). At 2 years of follow-up, 12% more patients remained on vedolizumab in comparison with anti-TNF alpha drugs overall (RR, 1.12), again for both forms of inflammatory bowel disease (IBD).
“This may provide early evidence that supports vedolizumab as a first-line biologic agent in patients with inflammatory bowel disease,” said Dr. Yiu, noting that further research was required to validate the correlation of persistence with clinical effectiveness.
Commenting on the motivation for the study, senior author Rupert Leong, MD, a gastroenterologist at Concord Repatriation General Hospital, Sydney, said, “We wanted to identify the drug with the highest effectiveness, which is the real-world benefit of the drug to patients, rather than efficacy, which refers to clinical trial data.”
“Importantly, clinical trial data are usually only 1 year, whereas persistence collects data often for several years. This is relevant in chronic diseases that can affect patients over several decades, because the true benefit of a drug cannot be implied from a short-term clinical trial,” he explained.
Persistence was chosen as the primary end-point because it is a measure that incorporates a drug’s efficacy and side-effect profile but also the patient’s perspective, added Dr. Yiu. “So, a patient may value mild side effects over treatment effectiveness and decide to cease treatment.”
A prior meta-analysis looking at loss of response found that 33% of people taking infliximab and 41% of people taking adalimumab became resistant to the biologics after a median follow up of 1 year. “The most common cause of loss of response to anti-TNF inhibitors is due to immunogenicity,” remarked Dr. Yiu. “These findings suggested that alternative biologics with high effectiveness should be considered.”
Data from the 2019 VARSITY study also informed the researchers’ decision to conduct a real-world study. VARSITY investigators found vedolizumab had increased efficacy over adalimumab in ulcerative colitis; however, data on the real-world effectiveness of vedolizumab, compared with adalimumab and infliximab, in both ulcerative colitis and Crohn’s disease remained unknown.
Dr. Leong pointed out the difficulty in selecting the correct treatment given the increasing numbers of biological agents available. “The paucity of head-to-head studies meant use of cohort studies is considered both relevant and informative, not least because long-term follow-up data can reveal secondary loss of response of these monoclonal antibodies, while pooling data further increases the statistical power and determines consistency.”
As such, the researchers conducted a systematic review and meta-analysis of six observational studies evaluating persistence, as a surrogate marker for clinical response, of vedolizumab versus infliximab and adalimumab among participants aged over 18 years with a diagnosis of either ulcerative colitis or Crohn’s disease from 2017 to July 2022.
Overall, the study found that 1-year persistence of vedolizumab was 71.2% in ulcerative colitis and 76% in Crohn’s disease, which was significantly higher than with infliximab (56.4% in ulcerative colitis, 53.7% in Crohn’s disease), and likewise with adalimumab (53.7% in ulcerative colitis, 55.6% in Crohn’s disease).
Results of 2-year persistence were pooled from four studies and found that vedolizumab had a 2-year persistence of 66% in ulcerative colitis and 61% in Crohn’s disease. By comparison, infliximab had a persistence of 49.7% for ulcerative colitis and 59.1% for Crohn’s disease, and adalimumab had a persistence of 31.4% for ulcerative colitis and 56% for Crohn’s disease.
In ulcerative colitis specifically, vedolizumab performed better than both adalimumab and infliximab with an RR of 1.41 (95% confidence interval, 1.14-1.74) and 1.15 (95% CI, 1.06-1.25) respectively, and an RR of 1.23 (95% CI, 1.14-1.33) was generated when adalimumab and infliximab results were combined after 1 year of follow-up.
In Crohn’s disease specifically, vedolizumab had a slightly higher 1-year persistence over anti-TNF inhibitors combined (RR, 1.10; 95% CI, 1.02-1.19), but there were insufficient data to support individual analysis.
In a subgroup of bionaive patients, vedolizumab had a higher 1-year persistence (RR, 1.14; 95% CI, 1.07-1.22) but did not show a statistically significant advantage in bioexperienced patients (RR, 1.04; 95% CI, 0.80-1.35), compared with anti-TNF inhibitors.
Dr. Yiu remarked that they were unable to identify any randomized controlled trials (RCTs) directly comparing infliximab versus vedolizumab in IBD at the time of their systematic review. However, he drew attention to a recent research article that compared the effectiveness, persistence, and side-effect profile of vedolizumab and infliximab in a small cohort of ulcerative colitis patients. “In this study, vedolizumab showed overall superiority over infliximab, which is in keeping with our study’s findings.”
Commenting on the study, Viraj Kariyawasam, MD, gastroenterologist and head of IBD at Blacktown and Mount Druitt hospital in Sydney, said the findings were “very important in defining the place of vedolizumab in the treatment of ulcerative colitis, and more so in Crohn’s disease.”
“Despite vedolizumab being considered a lower-efficacy drug, compared to infliximab, in Crohn’s disease by most practicing clinicians, and still favoring anti-TNF in the treatment of Crohn’s disease, the study highlights the superior persistence of vedolizumab,” he said in an interview.
“This is likely associated with efficacy over the two most used anti-TNF agents. With the knowledge we have about reduced efficacy of vedolizumab after the use of anti-TNF, or as a second- or third-line agent, and its superior persistence as a first-line biologic with already published safety data, vedolizumab should be considered and preferred as a first-line agent in the treatment of both ulcerative colitis and Crohn’s disease.”
Dr. Yiu has declared no conflicts of interest. Dr. Leong declares he is an advisory board member of AbbVie, Aspen, BMS, Celgene, Celltrion, Chiesi, Ferring, Glutagen, Hospira, Janssen, Lilly, MSD, Novartis, Pfizer, Prometheus Biosciences, Takeda; research grant recipient of Celltrion, Shire, Janssen, Takeda, Gastroenterological Society of Australia, NHMRC, Gutsy Group, Pfizer, Joanna Tiddy grant University of Sydney. One coauthor is an advisory board member of AbbVie and has received speaker fees from AbbVie and Takeda. Dr. Kariyawasam has educational grants and/or speaker fees from Janssen, AbbVie, and Takeda.
AT ECCO 2023
Cannabis tied to lower IBD mortality, hospital costs
COPENHAGEN – Mortality rate, length of hospital stay, and cost of hospitalization all drop significantly in patients with inflammatory bowel disease (IBD) concurrently using cannabis, shows a study that supports wider availability of the substance for specified medical use.
Inpatient mortality dropped by more than 70% in those patients concurrently using cannabis for another indication, compared with those not taking the drug, while total cost of hospitalization dropped by more than $11,000.
The findings were presented as a poster by Neethi Dasu, DO, a PGY-6 gastroenterology fellow at Jefferson Health Hospital, N.J., at the annual congress of the European Crohn’s and Colitis Organisation. Dr. Dasu worked with coinvestigator Brian Blair, DO, FACOI, gastroenterology program director and IBD specialist at the same hospital.
“Not only do patients spend less time in hospital, but they also have a decrease in mortality and hospital cost, which can be significant for patients with IBD, a chronic condition that often burdens them with high health care spend,” Dr. Dasu said.
The researcher also highlighted that with annual U.S. health care spending on IBD having increased significantly in recent years, getting patients well and out of the hospital in a timely manner is key and that “cannabis might help in this aim.”
Cannabis use is legalized in some U.S. states for medical treatment of several chronic, debilitating disorders, especially cancer. Currently, there is no direct Food and Drug Administration approval for use for IBD. “Utilizing it would be considered off-label and investigational,” Dr. Dasu pointed out.
Patients report that cannabis, as a pain control treatment, is effective for acute flares and chronic IBD, said Dr. Dasu. “It is an excellent agent for pain control that is not a narcotic, as with opioids, which can cause dependence and addiction. These could ultimately harm patients in the long term,” she added in an interview. “Opioids can also cause drowsiness and side effects, which harm a person’s quality of life.”
Patients with IBD using cannabis concurrently
Dr. Dasu and her coresearchers aimed to see if outcomes including mortality and pain could be modified with “a very accessible and cost-efficient agent that does not cause long term addiction or adverse events.”
She added that previous studies had evaluated the clinical response in patients with IBD and concomitant cannabis use, but that their study was novel because it looked at inpatient outcomes as well as overall hospital cost.
Dr. Dasu and colleagues analyzed data over the years 2015-2019, from the Nationwide Inpatient Sample (NIS), a large publicly available all-payer inpatient care database, which encompasses approximately 7 million inpatient hospitalizations annually in the United States.
All included patients had IBD, either ulcerative colitis or Crohn’s disease, were aged 18 years and over, and used cannabis for a concurrent indication.
Odds ratios were calculated for in-hospital mortality, average length of hospital stay, and hospital charges, after adjusting for age, gender, race, primary insurance payer status, hospital type and size (number of beds), hospital region, hospital teaching status, and other demographic characteristics.
Of the 1,198,839 patients with IBD, 29,445 used cannabis for a different indication. Participants had an average age of 38.7 years.
Highly significant drop in mortality and hospital costs
Inpatient mortality showed a significant decrease of 72% (odds ratio, 0.28; confidence interval, 0.19-0.41; P < .0001) in those who concurrently used cannabis, compared with those who did not. Hospital length of stay also dropped, by 0.17 days (95% CI, –0.35 to –0.01; P < .041), and this translated into a significant reduction in the total cost of hospitalization, from $39,309.00 (IBD without cannabis use) to $28,254.30 (IBD with cannabis use), a savings of $11,054.70 (95% CI, –$13,681.15 to –$8,427.24; P < .0001).
As a chronic inflammatory disease, IBD involves immune dysregulation leading to symptoms of nausea, vomiting, bleeding, and abdominal pain, although the pathophysiologic mechanism is not fully understood, Dr. Dasu noted. She added that studies in mice had shown that cannabis acts via cannabinoid 1 and 2 receptors, located in the nervous system, to decrease pain, nausea, and vomiting. “Mechanisms of cannabis’s analgesic effect also involves inhibition of the release of neurotransmitters involved in pain and inflammation.”
Asked how she felt about the future for cannabis treatment in IBD, Dr. Dasu remarked that it would most likely require decriminalizing marijuana use on a federal level, although individual states currently offer exemptions.
“Further research should be done to evaluate the medical benefits of cannabis use in patients with IBD, with studies warranted to investigate the factors that may be driving these differences, as well as investigations into the effect of cannabis on remission rates, rates of hospitalization, potential complications, and quality of life,” concluded Dr. Dasu.
Commenting on the study, Mary-Jane Williams, MD, a gastroenterology fellow at East Carolina University Health Medical Center, Greenville, N.C., told this news organization that the study was “a pleasant breath of information on the topic of cannabis use in IBD,” adding that providers often face questions about cannabis use from patients.
“Modulation of the endocannabinoid system ... plays a key role in the pathogenesis of IBD including pain control, limiting intestinal inflammation, and decreasing intestinal motility,” Dr. Williams said, adding that, “Its use in IBD has promising improvement in the therapeutic effect and overall quality of life.”
“This study highlights and supports substantial therapeutic effects of cannabis in the management of IBD patients, be it their pain control, improving nausea, appetite and sleep, remission rates, earlier time to recovery, shortened hospitalization and faster endoscopic improvement,” she pointed out. She noted the need for further studies, but added that most organizations, including the Crohn’s and Colitis Foundation, support policies that facilitate the conduct of clinical research using objective parameters and the potential development of cannabinoid-based medications for the management of patients with IBD.
Dr. Dasu, Dr. Blair, and Dr. Williams have declared no financial disclosures.
COPENHAGEN – Mortality rate, length of hospital stay, and cost of hospitalization all drop significantly in patients with inflammatory bowel disease (IBD) who concurrently use cannabis, according to a study that supports wider availability of the substance for specified medical use.
Inpatient mortality dropped by more than 70% in those patients concurrently using cannabis for another indication, compared with those not taking the drug, while total cost of hospitalization dropped by more than $11,000.
The findings were presented as a poster by Neethi Dasu, DO, a PGY-6 gastroenterology fellow at Jefferson Health Hospital, N.J., at the annual congress of the European Crohn’s and Colitis Organization. Dr. Dasu worked with coinvestigator Brian Blair, DO, FACOI, gastroenterology program director and IBD specialist at the same hospital.
“Not only do patients spend less time in hospital, but they also have a decrease in mortality and hospital cost, which can be significant for patients with IBD, a chronic condition that often burdens them with high health care spending,” Dr. Dasu said.
The researcher also highlighted that with annual U.S. health care spending on IBD having increased significantly in recent years, getting patients well and out of the hospital in a timely manner is key and that “cannabis might help in this aim.”
Cannabis use is legalized in some U.S. states for medical treatment of several chronic, debilitating disorders, especially cancer. Currently, there is no direct Food and Drug Administration approval for use for IBD. “Utilizing it would be considered off-label and investigational,” Dr. Dasu pointed out.
Patients report that cannabis is effective for pain control in acute flares and chronic IBD, said Dr. Dasu. “It is an excellent agent for pain control that is not a narcotic, as with opioids, which can cause dependence and addiction. These could ultimately harm patients in the long term,” she added in an interview. “Opioids can also cause drowsiness and side effects, which harm a person’s quality of life.”
Patients with IBD using cannabis concurrently
Dr. Dasu and her coresearchers aimed to see if outcomes including mortality and pain could be modified with “a very accessible and cost-efficient agent that does not cause long term addiction or adverse events.”
She added that previous studies had evaluated the clinical response in patients with IBD and concomitant cannabis use, but that their study was novel because it looked at inpatient outcomes as well as overall hospital cost.
Dr. Dasu and colleagues analyzed data over the years 2015-2019, from the Nationwide Inpatient Sample (NIS), a large publicly available all-payer inpatient care database, which encompasses approximately 7 million inpatient hospitalizations annually in the United States.
All included patients had IBD, either ulcerative colitis or Crohn’s disease, were aged 18 years and over, and used cannabis for a concurrent indication.
Odds ratios were calculated for in-hospital mortality, average length of hospital stay, and hospital charges, after adjusting for age, gender, race, primary insurance payer status, hospital type and size (number of beds), hospital region, hospital teaching status, and other demographic characteristics.
Of the 1,198,839 patients with IBD, 29,445 used cannabis for a different indication. Participants had an average age of 38.7 years.
Highly significant drop in mortality and hospital costs
Inpatient mortality showed a significant decrease of 72% (odds ratio, 0.28; 95% confidence interval, 0.19-0.41; P < .0001) in those who concurrently used cannabis, compared with those who did not. Hospital length of stay also dropped by 0.17 days (95% CI, –0.35 to –0.01; P < .041), and this translated into a significant drop in the total cost of hospitalization from $39,309.00 (IBD without cannabis use) to $28,254.30 (IBD with cannabis use), resulting in an $11,054.70 savings (95% CI, –$13,681.15 to –$8,427.24; P < .0001).
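The headline figures above line up with simple arithmetic on the reported values; a minimal cross-check, using only the numbers stated in the study as reported:

```python
# Cross-check of the reported figures (all values taken from the study as reported).
odds_ratio = 0.28            # adjusted odds ratio for inpatient mortality, cannabis vs. none
mortality_reduction = (1 - odds_ratio) * 100
print(f"Relative reduction in odds of inpatient mortality: {mortality_reduction:.0f}%")  # 72%

cost_without = 39_309.00     # mean total hospitalization cost, IBD without cannabis use
cost_with = 28_254.30        # mean total hospitalization cost, IBD with cannabis use
savings = cost_without - cost_with
print(f"Savings per hospitalization: ${savings:,.2f}")  # $11,054.70
```

Note that the 72% figure is a reduction in the adjusted *odds* of death, not a direct difference in mortality rates.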
As a chronic inflammatory disease, IBD involves immune dysregulation leading to symptoms of nausea, vomiting, bleeding, and abdominal pain; however, the pathophysiologic mechanism is not fully understood. She added that studies in mice had shown that cannabis acts via cannabinoid 1 and 2 receptors, located in the nervous system, to decrease pain, nausea, and vomiting. “Mechanisms of cannabis’s analgesic effect also involves inhibition of the release of neurotransmitters involved in pain and inflammation.”
Asked how she felt about the future for cannabis treatment in IBD, Dr. Dasu remarked that it would most likely require decriminalizing marijuana use on a federal level, although individual states currently offer exemptions.
“Further research should be done to evaluate the medical benefits of cannabis use in patients with IBD, with studies warranted to investigate the factors that may be driving these differences, as well as investigations into the effect of cannabis on remission rates, rates of hospitalization, potential complications, and quality of life,” concluded Dr. Dasu.
Commenting on the study, Mary-Jane Williams, MD, a gastroenterology fellow at East Carolina University Health Medical Center, Greenville, N.C., told this news organization that the study was “a pleasant breath of information on the topic of cannabis use in IBD,” adding that providers often face questions about cannabis use from patients.
“Modulation of the endocannabinoid system ... plays a key role in the pathogenesis of IBD including pain control, limiting intestinal inflammation, and decreasing intestinal motility,” Dr. Williams said, adding that, “Its use in IBD has promising improvement in the therapeutic effect and overall quality of life.”
“This study highlights and supports substantial therapeutic effects of cannabis in the management of IBD patients, be it their pain control, improving nausea, appetite and sleep, remission rates, earlier time to recovery, shortened hospitalization and faster endoscopic improvement,” she pointed out. She noted the need for further studies but added that most organizations, including the Crohn’s and Colitis Foundation, support policies that facilitate the conduct of clinical research using objective parameters and the potential development of cannabinoid-based medications for patients with IBD.
Dr. Dasu, Dr. Blair, and Dr. Williams have declared no financial disclosures.
AT ECCO 2023
Risk of stent infection low, but may be underreported
Infections of coronary stents appear to be uncommon, but it is not clear if they are often missed, underreported, or truly rare, according to a new analysis.
In a search of multiple databases, 79 cases of coronary stent infections (CSI) were found in 65 published reports, according to Venkatakrishnan Ramakumar, MBBS, MD, department of cardiology, All India Institute of Medical Sciences, New Delhi.
Over the period of evaluation, which had no defined starting point but stretched to November 2021, the 79 infections reported worldwide occurred over a span in which millions of percutaneous coronary intervention (PCI) procedures were performed. In the United States alone, the current estimated annual number of PCIs is 600,000, according to an article published in the Journal of the American Heart Association.
If the number of reported CSI cases represented even a modest fraction of those that occurred, the risk would still be almost negligible. Yet, Dr. Ramakumar insisted that there has been little attention paid to the potential for CSI, creating a situation in which many or almost all cases are simply being missed.
“We do not know how many infections have gone unrecognized,” Dr. Ramakumar said in presenting his results at the Cardiovascular Research Technologies conference, sponsored by MedStar Heart & Vascular Institute. And even if they are identified and promptly treated, there “is the potential for a publication bias,” he added, referring to the reluctance of investigators to submit and publishers to accept manuscripts with negative results.
Regardless of the frequency with which they occur, CSI is associated with bad outcomes, according to the data evaluated by Dr. Ramakumar. On the basis of in-hospital mortality, the primary endpoint of this analysis, the rate of death in patients developing CSI was 30.3%.
Successful treatment varied by hospital type
This risk was not uniform. Rather, rates of in-hospital mortality and the proportion of patients treated successfully varied substantially by type of hospital. At private teaching hospitals, for example, successful treatment – whether medical alone or followed by bailout surgery – was achieved in 80% of cases. The rates fell to 40% at public teaching hospitals and to 25% at private nonteaching hospitals.
The full-text articles included in this analysis were evaluated and selected by two reviewers working independently. A CSI diagnosis made clinically or with imaging and treatment outcomes were among criteria for the case studies to be included. Dr. Ramakumar said the study, which he claimed is the largest systematic review of CSI ever conducted, has been registered with PROSPERO, an international prospective registry of systematic reviews.
The presenting symptom was fever in 72% of cases and chest pain in the others, although there was one asymptomatic CSI reported. On angiography, 62% had a concomitant mycotic aneurysm. Intramyocardial abscess (13.9%), rupture (11.3%), and coronary fistula (7.5%) were also common findings, but no angiographic abnormalities could be identified in 53% of patients.
Following PCI, most CSI developed within 8 days (43%) or the first month (23%), but CSI was reported more than 6 months after the procedure in 19%. Complex PCI accounted for 51% of cases. Of stent types, 56% were drug eluting and 13% were bare metal.
When comparing characteristics of those who survived CSI with those who did not, most (89%) of those with a non–ST-segment elevated acute coronary syndrome ultimately survived, while survival from CSI in those with structural heart disease was only 17%.
Microbiological findings were not a criterion for study inclusion, but Staphylococcus species accounted for 65% of the infections for which positive cultures were reported. Pseudomonas accounted for 13%. Less than 4% (3.8%) tested positive for multiple pathogens. A small proportion of patients had unusual infectious organisms.
As part of this analysis, the investigators developed an artificial intelligence model to predict CSI based on patient characteristics and other variables. However, the specificity of only around 70% led Dr. Ramakumar to conclude that it does not yet have practical value.
However, he believes that better methodology to detect CSI is needed, and he proposed a diagnostic algorithm that he believes would both improve detection rates and accelerate the time to diagnosis.
Algorithm proposed for detection of CSI
In this algorithm, the first step in symptomatic patients with a positive blood culture suspected of CSI is imaging, such as transthoracic echocardiography, to identify features of infective endocarditis or endarteritis. If initial imaging is positive, confirmatory imaging, such as PET, should be adequate to establish the diagnosis and initiate treatment.
If initial imaging is negative, alternative diagnoses should be considered, but Dr. Ramakumar advised repeat imaging after 48 hours if symptoms persist and no other causes are found.
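The decision flow described above can be sketched as a simple branching function. This is only an illustration of the proposed algorithm as summarized here; the function and parameter names are hypothetical, not from the presentation:

```python
def evaluate_suspected_csi(symptomatic: bool,
                           blood_culture_positive: bool,
                           initial_imaging_positive: bool,
                           repeat_imaging_positive: bool = False) -> str:
    """Illustrative sketch of the proposed CSI diagnostic flow (names are hypothetical)."""
    # Entry criterion: symptoms plus a positive blood culture raise suspicion of CSI
    if not (symptomatic and blood_culture_positive):
        return "CSI not suspected; routine workup"
    # Initial imaging (e.g., transthoracic echo) showing endocarditis/endarteritis features
    if initial_imaging_positive:
        return "Confirm with further imaging (e.g., PET); treat for CSI"
    # Negative initial imaging: consider alternatives, repeat imaging after 48 h
    # if symptoms persist and no other cause is found
    if repeat_imaging_positive:
        return "Treat for CSI"
    return "Consider alternative diagnoses"
```

A positive repeat scan at 48 hours would route the patient back to treatment; otherwise the clinician continues to pursue alternative explanations.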
Dr. Ramakumar acknowledged the many limitations of this analysis, including the small sample size and the challenges of assembling coherent data from case reports with variable types of information submitted during different eras of PCI evolution. However, reiterating that CSI might be frequently missed, he emphasized that this problem might be bigger than currently understood.
It is difficult to rule out any possibility that CSI is frequently missed, but Andrew Sharp, MD, PhD, a consultant interventional cardiologist at the University Hospital of Wales, Cardiff, is skeptical.
“One might think this is a potential problem, but I cannot think of one patient in whom this has occurred,” Dr. Sharp said in an interview. He said he is fairly confident that such infections are extremely rare.
“When there is infection associated with a foreign body, such as a pacemaker, they do not typically resolve by themselves,” he explained. “Often the device has to be removed. If this was true for CSI, then I think we would be aware of these complications.”
However, he praised the investigators for taking a systematic approach to CSI. An invited panelist during the CRT featured research session where these data were presented, Dr. Sharp said he was more interested in understanding why stent infections do not occur more often, now that data are available to suggest they are rare.
“Is there something in the coronary environment, such as the consistent blood flow, that protects against infection?” he asked. CSI is a valid area of further research, according to Dr. Sharp, but he does not consider infected stents to be a common threat based on his own sizable case series.
Dr. Ramakumar and Dr. Sharp reported no potential conflicts of interest.
Infections of coronary stents appear to be uncommon, but it is not clear if they are often missed, underreported, or truly rare, according to a new analysis.
In a search of multiple databases, 79 cases of coronary stent infections (CSI) were found in 65 published reports, according to Venkatakrishnan Ramakumar, MBBS, MD, department of cardiology, All India Institute of Medical Sciences, New Delhi.
Over the period of evaluation, which had no defined starting point but stretched to November 2021, the 79 infections reported worldwide occurred when millions of percutaneous coronary intervention (PCI) procedures were performed. In the United States alone, the current estimated annual number of PCIs is 600,000, according to an article published in the Journal of the American Heart Association.
If the number of reported CSI cases represented even a modest fraction of those that occurred, the risk would still be almost negligible. Yet, Dr. Ramakumar insisted that there has been little attention paid to the potential for CSI, creating a situation in which many or almost all cases are simply being missed.
“We do not know how many infections have gone unrecognized,” Dr. Ramakumar said in presenting his results at the Cardiovascular Research Technologies conference, sponsored by MedStar Heart & Vascular Institute. And even if they are identified and promptly treated, there “is the potential for a publication bias,” he added, referring to the reluctance of investigators to submit and publishers to accept manuscripts with negative results.
Regardless of the frequency with which they occur, CSI is associated with bad outcomes, according to the data evaluated by Dr. Ramakumar. On the basis of in-hospital mortality, the primary endpoint of this analysis, the rate of death in patients developing CSI was 30.3%.
Successful treatment varied by hospital type
This risk was not uniform. Rather, rates of in-hospital mortality and proportion of patients treated successfully varied substantially by type of hospital. At private teaching hospitals for example, successful treatment – whether medical alone or followed by bailout surgery – was 80%. The rates fell to 40% at public teaching hospitals and then to 25% at private nonteaching hospitals.
The full-text articles included in this analysis were evaluated and selected by two reviewers working independently. A CSI diagnosis made clinically or with imaging and treatment outcomes were among criteria for the case studies to be included. Dr. Ramakumar said the study, which he claimed is the largest systematic review of CSI ever conducted, has been registered with PROSPERO, an international prospective registry of systematic reviews.
The presenting symptom was fever in 72% of cases and chest pain in the others, although there was one asymptomatic CSI reported. On angiography, 62% had a concomitant mycotic aneurysm. Intramyocardial abscess (13.9%), rupture (11.3%), and coronary fistula (7.5%) were also common findings, but no angiographic abnormalities could be identified in 53% of patients.
Following PCI, most CSI developed within 8 days (43%) or the first month (23%), but CSI was reported more than 6 months after the procedure in 19%. Complex PCI accounted for 51% of cases. Of stent types, 56% were drug eluting and 13% were bare metal.
When comparing characteristics of those who survived CSI with those who did not, most (89%) of those with a non–ST-segment elevated acute coronary syndrome ultimately survived, while survival from CSI in those with structural heart disease was only 17%.
Microbiological findings were not a criterion for study inclusion, but Staphylococcus species accounted for 65% of the infections for which positive cultures were reported. Pseudomonas accounted for 13%. Less than 4% (3.8%) tested positive for multiple pathogens. A small proportion of patients had unusual infectious organisms.
As part of this analysis, the investigators developed an artificial intelligence model to predict CSI based on patient characteristics and other variables. However, the specificity of only around 70% led Dr. Ramakumar to conclude that it does not yet have practical value.
However, he believes that better methodology to detect CSI is needed, and he proposed a diagnostic algorithm that he believes would both improve detection rates and accelerate the time to diagnosis.
Algorithm proposed for detection of CSI
In this algorithm, the first step in symptomatic patients with a positive blood culture suspected of CSI is imaging, such as transthoracic echocardiography, to identify features of infective endocarditis or endarteritis. If the imaging is positive, further imaging, such as PET, that supports the diagnosis, should be adequate to support a diagnosis and treatment.
If initial imaging is negative, alternative diagnoses should be considered, but Dr. Ramakumar advised repeat imaging after 48 hours if symptoms persist and no other causes are found.
Dr. Ramakumar acknowledged the many limitations of this analysis, including the small sample size and the challenges of assembling coherent data from case reports with variable types of information submitted during different eras of PCI evolution. However, reiterating that CSI might be frequently missed, he emphasized that this problem might be bigger than currently understood.
It is difficult to rule out any possibility that CSI is frequently missed, but Andrew Sharp, MD, PhD, a consultant interventional cardiologist at the University Hospital of Wales, Cardiff, is skeptical.
“One might think this is a potential problem, but I cannot think of one patient in whom this has occurred,” Dr. Sharp said in an interview. He is fairly confident that they are extremely rare.
“When there is infection associated with a foreign body, such as a pacemaker, they do not typically resolve by themselves,” he explained. “Often the device has to be removed. If this was true for CSI, then I think we would be aware of these complications.”
However, he praised the investigators for taking a look at CSI in a systematic approach. An invited panelist during the CRT featured research, which is where these data were presented, Dr. Sharp was more interested in understanding why they do not occur now that data are available to suggest they are rare.
“Is there something in the coronary environment, such as the consistent blood flow, that protects against infection?” he asked. CSI is a valid area of further research, according to Dr. Sharp, but he does not consider infected stents to be a common threat based on his own sizable case series.
Dr. Ramakumar and Dr. Sharp reported no potential conflicts of interest.
Infections of coronary stents appear to be uncommon, but it is not clear if they are often missed, underreported, or truly rare, according to a new analysis.
In a search of multiple databases, 79 cases of coronary stent infections (CSI) were found in 65 published reports, according to Venkatakrishnan Ramakumar, MBBS, MD, department of cardiology, All India Institute of Medical Sciences, New Delhi.
Over the period of evaluation, which had no defined starting point but stretched to November 2021, the 79 infections reported worldwide occurred when millions of percutaneous coronary intervention (PCI) procedures were performed. In the United States alone, the current estimated annual number of PCIs is 600,000, according to an article published in the Journal of the American Heart Association.
If the number of reported CSI cases represented even a modest fraction of those that occurred, the risk would still be almost negligible. Yet, Dr. Ramakumar insisted that there has been little attention paid to the potential for CSI, creating a situation in which many or almost all cases are simply being missed.
“We do not know how many infections have gone unrecognized,” Dr. Ramakumar said in presenting his results at the Cardiovascular Research Technologies conference, sponsored by MedStar Heart & Vascular Institute. And even if they are identified and promptly treated, there “is the potential for a publication bias,” he added, referring to the reluctance of investigators to submit and publishers to accept manuscripts with negative results.
Regardless of the frequency with which they occur, CSI is associated with bad outcomes, according to the data evaluated by Dr. Ramakumar. On the basis of in-hospital mortality, the primary endpoint of this analysis, the rate of death in patients developing CSI was 30.3%.
Successful treatment varied by hospital type
This risk was not uniform. Rather, rates of in-hospital mortality and proportion of patients treated successfully varied substantially by type of hospital. At private teaching hospitals for example, successful treatment – whether medical alone or followed by bailout surgery – was 80%. The rates fell to 40% at public teaching hospitals and then to 25% at private nonteaching hospitals.
The full-text articles included in this analysis were evaluated and selected by two reviewers working independently. A CSI diagnosis made clinically or with imaging and treatment outcomes were among criteria for the case studies to be included. Dr. Ramakumar said the study, which he claimed is the largest systematic review of CSI ever conducted, has been registered with PROSPERO, an international prospective registry of systematic reviews.
The presenting symptom was fever in 72% of cases and chest pain in the others, although there was one asymptomatic CSI reported. On angiography, 62% had a concomitant mycotic aneurysm. Intramyocardial abscess (13.9%), rupture (11.3%), and coronary fistula (7.5%) were also common findings, but no angiographic abnormalities could be identified in 53% of patients.
Following PCI, most CSI developed within 8 days (43%) or the first month (23%), but CSI was reported more than 6 months after the procedure in 19%. Complex PCI accounted for 51% of cases. Of stent types, 56% were drug eluting and 13% were bare metal.
When comparing characteristics of those who survived CSI with those who did not, most (89%) of those with a non–ST-segment elevated acute coronary syndrome ultimately survived, while survival from CSI in those with structural heart disease was only 17%.
Microbiological findings were not a criterion for study inclusion, but Staphylococcus species accounted for 65% of the infections for which positive cultures were reported. Pseudomonas accounted for 13%. Less than 4% (3.8%) tested positive for multiple pathogens. A small proportion of patients had unusual infectious organisms.
As part of this analysis, the investigators developed an artificial intelligence model to predict CSI based on patient characteristics and other variables. However, the specificity of only around 70% led Dr. Ramakumar to conclude that it does not yet have practical value.
However, he believes that better methodology to detect CSI is needed, and he proposed a diagnostic algorithm that he believes would both improve detection rates and accelerate the time to diagnosis.
Algorithm proposed for detection of CSI
In this algorithm, the first step in symptomatic patients with a positive blood culture suspected of CSI is imaging, such as transthoracic echocardiography, to identify features of infective endocarditis or endarteritis. If the imaging is positive, further imaging, such as PET, that supports the diagnosis, should be adequate to support a diagnosis and treatment.
If initial imaging is negative, alternative diagnoses should be considered, but Dr. Ramakumar advised repeat imaging after 48 hours if symptoms persist and no other causes are found.
Dr. Ramakumar acknowledged the many limitations of this analysis, including the small sample size and the challenges of assembling coherent data from case reports with variable types of information submitted during different eras of PCI evolution. However, reiterating that CSI might be frequently missed, he emphasized that this problem might be bigger than currently understood.
It is difficult to rule out any possibility that CSI is frequently missed, but Andrew Sharp, MD, PhD, a consultant interventional cardiologist at the University Hospital of Wales, Cardiff, is skeptical.
“One might think this is a potential problem, but I cannot think of one patient in whom this has occurred,” Dr. Sharp said in an interview. He is fairly confident that they are extremely rare.
“When there is infection associated with a foreign body, such as a pacemaker, they do not typically resolve by themselves,” he explained. “Often the device has to be removed. If this was true for CSI, then I think we would be aware of these complications.”
However, he praised the investigators for taking a systematic approach to CSI. An invited panelist during the CRT featured research session, where these data were presented, Dr. Sharp said that, given data suggesting these infections are rare, he was more interested in understanding why they do not occur.
“Is there something in the coronary environment, such as the consistent blood flow, that protects against infection?” he asked. CSI is a valid area of further research, according to Dr. Sharp, but he does not consider infected stents to be a common threat based on his own sizable case series.
Dr. Ramakumar and Dr. Sharp reported no potential conflicts of interest.
FROM CRT 2023
A better MS measure?
“When you measure disability, what you really want to know is how things are changing in the patient’s life and not your perception of how they’re changing,” said Mark Gudesblatt, MD, who presented a study comparing a technique called quantitative gait analysis with other measures at a poster session during the annual meeting held by the Americas Committee for Treatment and Research in Multiple Sclerosis (ACTRIMS).
The device, called Protokinetics, has been used in clinical studies of Alzheimer’s disease, Parkinson’s disease, Huntington’s disease, stroke, Friedreich’s ataxia, and other conditions. It is a digitized carpet that senses weight change and pressure as the individual walks.
“We can actually measure performance, and the performance is not just how fast you walk 25 feet. We’re measuring things that underlie how you walk: step length, step length variability, velocity, weight shift, how much time you spend on one leg. So it’s like listening to a symphony. We’re not measuring just the trumpets or the violins, we’re measuring everything,” said Dr. Gudesblatt, who is medical director of the Comprehensive MS Center at South Shore Neurologic Associates, Patchogue, N.Y.
Commonly used measures include the Expanded Disability Status Scale (EDSS), the 25-foot timed walk (25’TW), and the Timed Up and Go (TUG).
Those measures are useful but don’t really measure up to clinical need, Dr. Gudesblatt said. “What you want is no evidence of disease activity, whether that’s multiple dimensions of thinking or multiple dimensions of walking, or changes on an MRI that are not the radiologist’s impression. Patients always say: ‘Doc, I’m worse.’ And we say: ‘Well, your exam is unchanged, your MRI has not changed.’ But they are worse for a reason – either their perception or their performance. So you can measure this very granularly, and you can relate it to their fear of falling, their balance confidence. This ups the game,” said Dr. Gudesblatt.
“And here’s where it gets even more interesting. You can use this for signatures of disease,” he added. The data can, for example, suggest that instead of Parkinson’s disease, a patient may have a Parkinson’s variant. “What we’re doing is showing how the 25-foot timed walk and Timed Up and Go are very traditional, conservative measures. They’re equivalent to the Pony Express. They’re good, but not where you want to be.”
Technology provides more sensitive, but more complex data
Digital tools to measure a variety of functions, including gait, cognition, and upper limb function are becoming increasingly common in MS, according to Catherine Larochelle, MD, PhD, who was asked for comment. “They are easily providing measures that are likely more sensitive and diverse and probably more meaningful about the daily functional status of a person than our usual EDSS,” said Dr. Larochelle, who is an associate professor at Université de Montréal.
The next step is to determine how best to use the complex data that such devices generate. “Lots of research is being done to better understand how to use the rich but complex data obtained with these tools to provide useful information to people with MS and their clinical team, to help guide shared clinical decisions, and likely accelerate and improve outcomes in clinical trials. So this is a very exciting new era in terms of clinical neurological assessment,” said Dr. Larochelle.
Granular gait analysis
Dr. Gudesblatt and colleagues analyzed retrospective data from 105 people with MS (69% female; average age, 53.7 years). Participants underwent all tests on the same day. The digital gait analysis captured velocity, double support, cadence, functional ambulation profile, gait variability index, and walk ratio over three trials conducted at preferred walking speed (PWS) and during dual task walking.
There was a statistically significant relationship (P ≤ .01) between TUG and 25’TW (R² = 0.62). There were also significant relationships between 25’TW and digital parameters measured at PWS: velocity (R² = 0.63); double support (R² = 0.74); cadence (R² = 0.56); and gait variability index (R² = 0.54). During dual task walking, there were relationships between 25’TW and velocity (R² = 0.53); double support (R² = 0.30); cadence (R² = 0.43); and gait variability index (R² = 0.46).
TUG values were significantly associated with gait parameters during PWS: velocity (R² = 0.71); double support (R² = 0.75); cadence (R² = 0.43); gait variability index (R² = 0.45); and walk ratio (R² = 0.06). During dual task walking, TUG values were significantly associated with velocity (R² = 0.55), double support (R² = 0.21), cadence (R² = 0.45), and gait variability index (R² = 0.39).
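For readers less familiar with the statistic, the R² values reported above are squared correlations: the proportion of variance in one measure explained by the other. A minimal sketch with synthetic walking data (invented for illustration, not the study’s measurements):

```python
import numpy as np

# Synthetic example only: made-up TUG times and gait velocities,
# not data from the Gudesblatt study.
rng = np.random.default_rng(0)
tug = rng.normal(10, 2, 100)                          # TUG times (seconds)
velocity = -0.05 * tug + rng.normal(1.2, 0.05, 100)   # gait velocity (m/s)

r = np.corrcoef(tug, velocity)[0, 1]  # Pearson correlation coefficient
r_squared = r ** 2                    # shared variance between the measures
```

A high R² (as seen for velocity and double support here) means the digital parameter tracks the traditional timed test closely; a low R² (such as walk ratio, 0.06) means the parameter captures information the timed test misses.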
“With the availability of multiple effective disease-modifying therapies and the future potential of restorative or reparative treatments, more granular, validated, standardized outcome measures are urgently needed,” said Dr. Gudesblatt. Analysis of the gait cycle can provide clinically useful information not adequately captured by the current, more traditional approaches to measuring outcomes in MS.
Dr. Gudesblatt and Dr. Larochelle have no relevant financial disclosures.
FROM ACTRIMS FORUM 2023
500 more steps a day tied to 14% lower CVD risk in older adults
Older adults who added a quarter mile of steps to their day had a 14% lower risk of cardiovascular events within 4 years, according to a study of more than 400 individuals.
“Aging is such a dynamic process, but most studies of daily steps and step goals are conducted on younger populations,” lead author Erin E. Dooley, PhD, an epidemiologist at the University of Alabama at Birmingham, said in an interview.
The impact of more modest step goals in older adults has not been well studied, Dr. Dooley said.
The population in the current study ranged from 71 to 92 years, with an average age of 78 years. The older age and relatively short follow-up period show the importance of steps and physical activity in older adults, she said.
Dr. Dooley presented the study at the Epidemiology and Prevention/Lifestyle and Cardiometabolic Health meeting.
She and her colleagues analyzed a subsample of participants in the Atherosclerosis Risk in Communities (ARIC) study, an ongoing study conducted by the National Heart, Lung, and Blood Institute. The study population included 452 adults for whom step data were available at visit 6 of the ARIC study between 2016 and 2017. Participants wore an accelerometer on the waist for at least 10 hours a day for at least 3 days. The mean age of the participants was 78.4 years, 59% were women, and 20% were Black.
Outcomes were measured through December 2019 and included fatal and nonfatal cardiovascular disease (CVD) events of coronary heart disease, stroke, and heart failure.
Overall, each additional 500 steps per day was linked to a 14% reduction in risk of a CVD event (hazard ratio, 0.86; 95% confidence interval, 0.76-0.98). The mean step count was 3,447 steps per day, and 34 participants (7.5%) experienced a CVD event over 1,269 person-years of follow-up.
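If one assumes the per-500-step hazard ratio compounds multiplicatively across increments (a simplifying assumption for illustration, not a claim made by the investigators), the point estimate scales as follows:

```python
# Hypothetical extrapolation of the reported per-500-step hazard ratio.
# Assumes a multiplicative (log-linear) dose-response, which the study
# abstract does not itself assert.
HR_PER_500 = 0.86

def implied_hr(extra_steps):
    """Implied hazard ratio for a given number of additional daily steps."""
    increments = extra_steps / 500      # number of 500-step increments
    return HR_PER_500 ** increments

print(round(implied_hr(500), 2))   # 0.86 -> 14% lower risk
print(round(implied_hr(1000), 2))  # 0.74 -> roughly 26% lower risk
```

Under this assumption, modest additional gains accrue with each increment, consistent with the larger benefit observed in the most active quartile.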
The cumulative risk of CVD was significantly higher (11.5%) in the quartile of adults with the lowest step count (defined as fewer than 2,077 steps per day), compared with 3.5% in those with the highest step count (defined as at least 4,453 steps per day).
In addition, adults in the highest quartile of steps had a 77% lower risk of a proximal CVD event (within 3.5 years) over the study period (HR, 0.23).
Additional research is needed to explore whether increased steps prevent or delay CVD and whether low step counts may be a biomarker for underlying disease, the researchers noted in their abstract.
However, the results support the value of even a modest increase in activity to reduce CVD risk in older adults.
Small steps may get patients started
Dr. Dooley said she was surprised at the degree of benefits on heart health from 500 steps, and noted that the findings have clinical implications.
“Steps may be a more understandable metric for physical activity for patients than talking about moderate to vigorous intensity physical activity,” she said in an interview. “While we do not want to diminish the importance of higher intensity physical activity, encouraging small increases in the number of daily steps can also have great benefits for heart health.
“Steps are counted using a variety of devices and phones, so it may be helpful for patients to show clinicians their activity during well visits,” Dr. Dooley said. “Walking may also be more manageable for people as it is low impact. Achievable goals are also important. This study suggests that, for older adults, around 3,000 steps or more was associated with reduced CVD risk,” although the greatest benefits were seen with the most active group who averaged 4,500 or more steps per day.
“More research is needed to show how steps may change over time, and how this relates to CVD and heart health,” she said. “At this time, we only had a single measure of physical activity.”
Study fills research gap for older adults
“Currently, the majority of the literature exploring a relationship between physical activity and the risk for developing cardiovascular disease has evaluated all adults together, not only those who are 70 years of age and older,” Monica C. Serra, PhD, of the University of Texas, San Antonio, said in an interview. “This study allows us to start to target specific cardiovascular recommendations for older adults.”
“It is always exciting to see results from physical activity studies that continue to support prior evidence that even small amounts of physical activity are beneficial to cardiovascular health,” said Dr. Serra, who is also vice chair of the program committee for the meeting. “These results suggest that even if only small additions in physical activity are achievable, they may have cumulative benefits in reducing cardiovascular disease risk.” For clinicians, the results also provide targets that are easy for patients to understand, said Dr. Serra. Daily step counts allow clinicians to provide specific and measurable goals to help their older patients increase physical activity.
“Small additions in total daily step counts may have clinically meaningful benefits to heart health, so promoting their patients to make any slight changes that are able to be consistently incorporated into their schedule should be encouraged. This may be best monitored by encouraging the use of an activity tracker,” she said.
Although the current study adds to the literature with objective measures of physical activity utilizing accelerometers, these devices are not as sensitive at picking up activities such as bicycling or swimming, which may be more appropriate for some older adults with mobility limitations and chronic conditions, Dr. Serra said. Additional research is needed to assess the impact of other activities on CVD in the older population.
The meeting was sponsored by the American Heart Association. The study received no outside funding. Dr. Dooley and Dr. Serra had no financial conflicts to disclose.
Older adults who added a quarter mile of steps to their day showed a reduction in risk of cardiovascular events by 14% within 4 years, according to a study in more than 400 individuals.
“Aging is such a dynamic process, but most studies of daily steps and step goals are conducted on younger populations,” lead author Erin E. Dooley, PhD, an epidemiologist at the University of Alabama at Birmingham, said in an interview.
The impact of more modest step goals in older adults has not been well studied, Dr. Dooley said.
The population in the current study ranged from 71 to 92 years, with an average age of 78 years. The older age and relatively short follow-up period show the importance of steps and physical activity in older adults, she said.
Dr. Dooley presented the study at the Epidemiology and Prevention/Lifestyle and Cardiometabolic Health meeting.
She and her colleagues analyzed a subsample of participants in Atherosclerosis Risk in Communities (ARIC) study, an ongoing study conducted by the National Heart, Lung, and Blood Institute. The study population included 452 adults for whom step data were available at visit 6 of the ARIC study between 2016 and 2017. Participants wore an accelerometer on the waist for at least 10 hours a day for at least 3 days. The mean age of the participants was 78.4 years, 59% were women, and 20% were Black.
Outcomes were measured through December 2019 and included fatal and nonfatal cardiovascular disease (CVD) events of coronary heart disease, stroke, and heart failure.
Overall, each additional 500 steps per day was linked to a 14% reduction in risk of a CVD event (hazard ratio, 0.86; 95% confidence interval, 0.76-0.98). The mean step count was 3,447 steps per day, and 34 participants (7.5%) experienced a CVD event over 1,269 person-years of follow-up.
The cumulative risk of CVD was significantly higher (11.5%) in the quartile of adults with the lowest step count (defined as fewer than 2,077 steps per day), compared with 3.5% in those with the highest step count (defined as at least 4,453 steps per day).
In addition, adults in the highest quartile of steps had a 77% reduced risk of a proximal CVD (within 3.5 years) event over the study period (HR, 0.23).
Additional research is needed to explore whether increased steps prevent or delay CVD and whether low step counts may be a biomarker for underlying disease, the researchers noted in their abstract.
However, the results support the value of even a modest increase in activity to reduce CVD risk in older adults.
Small steps may get patients started
Dr. Dooley said she was surprised at the degree of benefits on heart health from 500 steps, and noted that the findings have clinical implications.
“Steps may be a more understandable metric for physical activity for patients than talking about moderate to vigorous intensity physical activity,” she said in an interview. “While we do not want to diminish the importance of higher intensity physical activity, encouraging small increases in the number of daily steps can also have great benefits for heart health.
“Steps are counted using a variety of devices and phones, so it may be helpful for patients to show clinicians their activity during well visits,” Dr. Dooley said. “Walking may also be more manageable for people as it is low impact. Achievable goals are also important. This study suggests that, for older adults, around 3,000 steps or more was associated with reduced CVD risk,” although the greatest benefits were seen with the most active group who averaged 4,500 or more steps per day.
More research is needed to show how steps may change over time, and how this relates to CVD and heart health,” she said. “At this time, we only had a single measure of physical activity.”
Study fills research gap for older adults
“Currently, the majority of the literature exploring a relationship between physical activity and the risk for developing cardiovascular disease has evaluated all adults together, not only those who are 70 year of age and older,” Monica C. Serra, PhD, of the University of Texas, San Antonio, said in an interview. “This study allows us to start to target specific cardiovascular recommendations for older adults.”.
“It is always exciting to see results from physical activity studies that continue to support prior evidence that even small amounts of physical activity are beneficial to cardiovascular health,” said Dr. Serra, who is also vice chair of the program committee for the meeting. “These results suggest that even if only small additions in physical activity are achievable, they may have cumulative benefits in reducing cardiovascular disease risk.” For clinicians, the results also provide targets that are easy for patients to understand, said Dr. Serra. Daily step counts allow clinicians to provide specific and measurable goals to help their older patients increase physical activity.
“Small additions in total daily step counts may have clinically meaningful benefits to heart health, so promoting their patients to make any slight changes that are able to be consistently incorporated into their schedule should be encouraged. This may be best monitored by encouraging the use of an activity tracker,” she said.
Although the current study adds to the literature with objective measures of physical activity utilizing accelerometers, these devices are not as sensitive at picking up activities such as bicycling or swimming, which may be more appropriate for some older adults with mobility limitations and chronic conditions, Dr. Serra said. Additional research is needed to assess the impact of other activities on CVD in the older population.
The meeting was sponsored by the American Heart Association. The study received no outside funding. Dr. Dooley and Dr. Serra had no financial conflicts to disclose.
Older adults who added a quarter mile of steps to their day showed a reduction in risk of cardiovascular events by 14% within 4 years, according to a study in more than 400 individuals.
“Aging is such a dynamic process, but most studies of daily steps and step goals are conducted on younger populations,” lead author Erin E. Dooley, PhD, an epidemiologist at the University of Alabama at Birmingham, said in an interview.
The impact of more modest step goals in older adults has not been well studied, Dr. Dooley said.
The population in the current study ranged from 71 to 92 years, with an average age of 78 years. The older age and relatively short follow-up period show the importance of steps and physical activity in older adults, she said.
Dr. Dooley presented the study at the Epidemiology and Prevention/Lifestyle and Cardiometabolic Health meeting.
She and her colleagues analyzed a subsample of participants in Atherosclerosis Risk in Communities (ARIC) study, an ongoing study conducted by the National Heart, Lung, and Blood Institute. The study population included 452 adults for whom step data were available at visit 6 of the ARIC study between 2016 and 2017. Participants wore an accelerometer on the waist for at least 10 hours a day for at least 3 days. The mean age of the participants was 78.4 years, 59% were women, and 20% were Black.
Outcomes were measured through December 2019 and included fatal and nonfatal cardiovascular disease (CVD) events of coronary heart disease, stroke, and heart failure.
Overall, each additional 500 steps per day was linked to a 14% reduction in risk of a CVD event (hazard ratio, 0.86; 95% confidence interval, 0.76-0.98). The mean step count was 3,447 steps per day, and 34 participants (7.5%) experienced a CVD event over 1,269 person-years of follow-up.
The cumulative risk of CVD was significantly higher (11.5%) in the quartile of adults with the lowest step count (defined as fewer than 2,077 steps per day), compared with 3.5% in those with the highest step count (defined as at least 4,453 steps per day).
In addition, adults in the highest quartile of steps had a 77% reduced risk of a proximal CVD (within 3.5 years) event over the study period (HR, 0.23).
Additional research is needed to explore whether increased steps prevent or delay CVD and whether low step counts may be a biomarker for underlying disease, the researchers noted in their abstract.
However, the results support the value of even a modest increase in activity to reduce CVD risk in older adults.
Small steps may get patients started
Dr. Dooley said she was surprised at the degree of benefits on heart health from 500 steps, and noted that the findings have clinical implications.
“Steps may be a more understandable metric for physical activity for patients than talking about moderate to vigorous intensity physical activity,” she said in an interview. “While we do not want to diminish the importance of higher intensity physical activity, encouraging small increases in the number of daily steps can also have great benefits for heart health.
“Steps are counted using a variety of devices and phones, so it may be helpful for patients to show clinicians their activity during well visits,” Dr. Dooley said. “Walking may also be more manageable for people as it is low impact. Achievable goals are also important. This study suggests that, for older adults, around 3,000 steps or more was associated with reduced CVD risk,” although the greatest benefits were seen with the most active group who averaged 4,500 or more steps per day.
More research is needed to show how steps may change over time, and how this relates to CVD and heart health,” she said. “At this time, we only had a single measure of physical activity.”
Study fills research gap for older adults
“Currently, the majority of the literature exploring a relationship between physical activity and the risk for developing cardiovascular disease has evaluated all adults together, not only those who are 70 years of age and older,” Monica C. Serra, PhD, of the University of Texas, San Antonio, said in an interview. “This study allows us to start to target specific cardiovascular recommendations for older adults.”
“It is always exciting to see results from physical activity studies that continue to support prior evidence that even small amounts of physical activity are beneficial to cardiovascular health,” said Dr. Serra, who is also vice chair of the program committee for the meeting. “These results suggest that even if only small additions in physical activity are achievable, they may have cumulative benefits in reducing cardiovascular disease risk.” For clinicians, the results also provide targets that are easy for patients to understand, said Dr. Serra. Daily step counts allow clinicians to provide specific and measurable goals to help their older patients increase physical activity.
“Small additions in total daily step counts may have clinically meaningful benefits to heart health, so prompting their patients to make any slight changes that can be consistently incorporated into their schedule should be encouraged. This may be best monitored by encouraging the use of an activity tracker,” she said.
Although the current study adds to the literature with objective measures of physical activity utilizing accelerometers, these devices are not as sensitive at picking up activities such as bicycling or swimming, which may be more appropriate for some older adults with mobility limitations and chronic conditions, Dr. Serra said. Additional research is needed to assess the impact of other activities on CVD in the older population.
The meeting was sponsored by the American Heart Association. The study received no outside funding. Dr. Dooley and Dr. Serra had no financial conflicts to disclose.
FROM EPI/LIFESTYLE 2023
Cutting calories may benefit cognition in MS
SAN DIEGO – , new research suggests.
Although this was just one small 12-week trial, “we were still able to see an amelioration in certain measures, for example, measures of fatigue as well as measures of cognitive function” in participants following the diet, said study investigator Laura Piccio, MD, PhD, associate professor, Washington University, St. Louis, and the University of Sydney.
Overall, the results underscore the importance of patients with MS maintaining an ideal body weight, Dr. Piccio said.
The findings were presented at the annual meeting held by the Americas Committee for Treatment and Research in Multiple Sclerosis (ACTRIMS).
High adherence rate
Obesity, which is associated with increased inflammation, has previously been linked to the development of MS. Release of adipokines from adipose tissue “shifts the balance” toward a proinflammatory milieu; and a chronic low-grade inflammatory state may promote autoimmunity, Dr. Piccio noted.
The current study included 42 adult patients (85.7% women; mean age, 48.2 years) with relapsing-remitting MS. Their mean baseline body mass index was 28.7, which falls in the overweight range, and the mean weight was 80.7 kg. The median Expanded Disability Status Scale (EDSS) score was 2.0.
Researchers randomly assigned participants to an intermittent calorie restriction (iCR) group or to a control group. For 2 days per week, the diet group ate 25% of what they normally would. For example, they might consume 500 calories from salads and non-starchy vegetables with a light dressing, Dr. Piccio said. The control group was not restricted in their eating.
In addition to the baseline assessment, the patients had study visits at weeks 6 and 12. Researchers adjusted for age, sex, and use of MS disease-modifying therapy.
Calorie reduction turned out to be a feasible intervention. “We had a pretty high adherence to the diet,” with 17 members of each group completing the study, Dr. Piccio reported. “So it shows this diet is possible,” she added.
Participants in the iCR group demonstrated a significant decrease in weight, BMI, and waist circumference at weeks 6 and 12 compared with baseline. They lost an average of 2.2 kg (about 5 pounds) over the course of the trial.
Serum leptin levels were also significantly decreased in the iCR group – and several lipids affected by the diet were positively correlated with adiponectin. Calorie restriction also affected T-cell subtypes.
“We definitely had an impact on body weight and also changes in certain inflammatory markers,” said Dr. Piccio.
Maintain healthy weight
The diet affected clinical measures, too. The score on the Symbol Digit Modalities Test (SDMT) increased significantly with iCR at 6 weeks (mean increase, 3.5; 95% confidence interval [CI], 0.6-6.3; P = .01) and 12 weeks (mean increase, 6.2; 95% CI, 3.4-9.5; P = .00004) compared with baseline.
There were no significant differences on the SDMT in the control group over time. In addition, the mean score on this test at 12 weeks was significantly higher in the iCR group compared with the control group.
Researchers also noted benefits of the diet on some patient-reported outcomes, such as certain subscales of the Modified Fatigue Impact Scale.
However, Dr. Piccio stressed that these results should be viewed with caution. “There could be many other factors driving this change in a small study like this,” she said. For example, just being on a diet might make individuals feel and function better. Dr. Piccio added that it is not clear what happens when participants return to their normal diet and their original body weight.
She noted that it is probably important to “get to a healthy body weight and to maintain it” – and it may not matter whether that’s through intermittent fasting or changing diet in other ways. “Anything you can do in order to keep your body weight within a normal range is important,” Dr. Piccio said.
Superb study
Commenting on the study findings, ACTRIMS program committee chair Catherine Larochelle, MD, PhD, clinician-scientist at Centre Hospitalier de l’Université de Montréal, said results from this “superb” study suggest that cognition can be positively influenced by healthy dietary habits.
“This is very promising and exciting,” Dr. Larochelle said. However, she cautioned that the data need to be reproduced and confirmed in other cohorts.
Overall, Dr. Larochelle noted that diet is becoming a “hot topic” in the field of MS. “This effervescent field of research should lead to new nonpharmacological therapeutic approaches to complement existing disease-modifying therapies and improve meaningful outcomes for people with MS,” she said.
The study was funded by the National MS Society in the United States. Dr. Piccio and Dr. Larochelle have reported no relevant financial relationships.
A version of this article originally appeared on Medscape.com.
AT ACTRIMS FORUM 2023
High level of psychiatric morbidity in prodromal MS
SAN DIEGO – new research reveals. Results of a population-based study show the relative risk of psychiatric morbidity, including depression and anxiety, was up to 88% higher in patients with MS, compared with their counterparts without the disease.
These results are an incentive to “keep exploring” to get a “clearer picture” of the MS prodrome, said study investigator Anibal Chertcoff, MD, who is trained both as a neurologist and psychiatrist and is a postdoctoral fellow at the University of British Columbia, Vancouver.
With a better understanding of this phase, it might be possible to “push the limits to get an earlier diagnosis of MS,” said Dr. Chertcoff.
The findings were presented at the annual meeting held by the Americas Committee for Treatment and Research in Multiple Sclerosis (ACTRIMS).
Psychiatric morbidity during the prodromal phase of MS
Psychiatric comorbidities are common in MS. Emerging research suggests psychiatric disorders may be present before disease onset.
Using administrative and clinical data, the investigators collected information on MS cases and healthy matched controls who had no demyelinating disease claims. They used a clinical cohort of patients attending an MS clinic and a much larger administrative cohort in which an algorithm identified MS cases using diagnostic codes and prescription data for disease-modifying therapies.
The administrative cohort consisted of 6,863 MS cases and 31,865 controls while the clinical cohort had 966 cases and 4,534 controls. The majority (73%) of cases and controls were female. The mean age at the first demyelinating claim was 44 years.
The study’s primary outcome was prevalence of psychiatric morbidity using diagnostic codes for depression, anxiety, bipolar disorder, and schizophrenia. In the 5 years pre-MS onset, 28% of MS cases and 14.9% of controls had psychiatric morbidity.
The researchers plotted psychiatric morbidity in both MS cases and controls over time on a graph. “In terms of the prevalence of psychiatric morbidity, in each year the difference between the groups, at least visually, seems to increase with time as it gets closer to MS onset,” said Dr. Chertcoff.
The analysis showed the relative risk of psychiatric morbidity over the 5 years before MS onset was 1.88 (95% confidence interval, 1.80-1.97) in the administrative cohort, and 1.57 (95% CI, 1.36-1.80) in the clinical cohort.
Secondary analyses showed individuals with MS had more yearly physician visits, visits to psychiatrists, psychiatric hospital admissions, and prescription fills for psychiatric medication, compared with controls. This, said Dr. Chertcoff, illustrates the burden psychiatric morbidity during the prodromal phase of MS places on health care resources.
It’s possible that low-grade inflammation, which is linked to MS, is also pushing these psychiatric phenomena, said Dr. Chertcoff. He noted that the prevalence of depression is significantly higher not only in MS, but in a wide range of other inflammatory conditions.
In addition to psychiatric complaints, MS patients experience other symptoms, including pain, sleep disturbances, fatigue, and gastrointestinal issues during the MS prodrome, said Dr. Chertcoff.
Patients with MS are often seeing other physicians – including psychiatrists during the prodromal phase of the disease. Neurologists, Dr. Chertcoff said, could perhaps “raise awareness” among these other specialists about the prevalence of psychiatric morbidities during this phase.
He hopes experts in the field will consider developing research criteria for the MS prodrome similar to what has been done in Parkinson’s disease.
When does MS start?
Commenting on the research findings, Mark Freedman, MD, professor of medicine (Neurology), University of Ottawa, and director of the multiple sclerosis research unit, Ottawa Hospital-General Campus, said the study illustrates the increased research attention the interplay between MS and psychiatric disorders is getting.
He recalled “one of the most compelling” recent studies that looked at a large group of children with MS and showed their grades started falling more than 5 years before developing MS symptoms. “You could see their grades going down year by year by year, so an indicator that a young brain, which should be like a sponge and improving, was actually faltering well before the symptoms.”
Results from this new study continue to raise the question of when MS actually starts, said Dr. Freedman.
The study received funding from the U.S. National MS Society, the MS Society of Canada, and the Michael Smith Foundation. Dr. Chertcoff and Dr. Freedman reported no relevant financial relationships.
A version of this article originally appeared on Medscape.com.
AT ACTRIMS FORUM 2023
MOGAD: Immunotherapy predicts fewer relapses
SAN DIEGO – The authors note that many MOGAD patients never experience a relapse and it is difficult to predict which ones will.
MOGAD can cause optic neuritis, transverse myelitis, and acute disseminated encephalomyelitis (ADEM). It was first described in 2007, and the best approaches to therapy are not yet understood. The new study is at least a starting point for understanding treatment outcomes, according to Philippe Bilodeau, MD, who presented the study during a poster session at the annual meeting held by the Americas Committee for Treatment and Research in Multiple Sclerosis (ACTRIMS).
Predicting which patients will relapse
“I think one of the biggest unanswered clinical questions in MOGAD is trying to determine who’s going to go on to have relapsing MOGAD. About 30% to 40% of patients with MOGAD will never have a second attack. So one of the big questions is: How can we identify patients who would benefit from immunotherapy, and how can we identify patients who will have a more benign disease course and may not need to be started on a treatment,” said Dr. Bilodeau, a neurology resident at Massachusetts General Hospital/Brigham and Women’s Hospital, Boston.
The researchers analyzed data from 143 patients seen at Massachusetts General or Brigham and Women’s Hospital who had presented with their first attack. Over a follow-up period of 5 years, the relapse rate was 61.8%. The researchers examined various factors, including age of onset, high MOG titer, attack type, and male sex, and found that only male sex came close to predicting relapse, though it fell short of statistical significance (hazard ratio [HR], 0.61; P = .07).
However, treatment with mycophenolate, azathioprine, intravenous immunoglobulins (IVIG), rituximab, or tocilizumab strongly predicted a lower probability of relapse (HR, 0.25; P < .0001).
The most effective treatment for relapsing MOGAD
In a separate poster, his team examined a subset of the cohort: 88 patients who were treated with mycophenolate mofetil, B-cell depletion with rituximab, or IVIG during a first or second relapse. The analysis also included every relapse experienced by any patient during the course of their disease. “Using a negative binomial regression, we looked at the annualized relapse rates and incidence rate ratios between the different treatments. No matter how you looked at the data – even if you looked at total time on IVIG, if you looked at time on monotherapy, excluding if they were on prednisone at the same time or if they were on both IVIG and rituximab, if you only consider patients that were on high-dose IVIG – IVIG was by far the best treatment and rituximab was always the least effective, and mycophenolate was always between IVIG and rituximab. So I think in that cohort, we can say with some confidence that IVIG is the most effective treatment for relapsing MOGAD,” said Dr. Bilodeau.
Other studies had suggested efficacy of individual treatments, but “I think what hadn’t been done is taking one cohort and comparing those treatments head to head, so that’s what we were trying to do,” said Dr. Bilodeau.
Both studies carry the usual caveats of retrospective research and therefore cannot prove causality. “We need to find more covariates to make sure that there’s no confounding [factor] explaining this and to make sure that there aren’t other demographic or clinical factors that explain the association. But as it stands, I think at this time starting treatment with immunotherapy is the only thing that we know will reduce the risk of having a future relapse. There’s a lot of further analysis that we need to do,” said Dr. Bilodeau.
He said that the study also provided some preliminary insight into treatment of pediatric disease. “We have interesting data from that analysis that pediatric-onset MOGAD actually had a particularly good response to [mycophenolate], more so than in adults,” he said.
“At this point, I think a rational approach if you have someone coming in with a first relapse is, you have to assess their risk tolerance. If they’re a very risk-averse patient, I think it’s reasonable to start them on treatment. I think it’s reasonable to monitor their titer. There’s some data that if they seroconvert to negative, you might be able to stop immunotherapy. If someone has established relapsing disease, and they have adult onset [disease], IVIG should be the first-line treatment. If they’re pediatric onset, either [mycophenolate] or IVIG are probably good first-line treatments,” he said.
‘A good beginning’
The studies are a good first step toward a better understanding of MOGAD treatment, according to Michael Cossoy, MD, who attended the poster session and was asked to comment on the research.
“It’s interesting because MOG antibody-associated disease is so relatively new that we don’t have a great idea yet about who needs to be treated. Should we put them on some immunosuppressive therapy or should we wait? At the moment this is a bit of a tautology. You know that if you put people on therapy from the very first event, some of those people are not going to have a second event. And some of the people are, but you’ve decreased the risk of them having that second (event) if your treatment is effective. So that’s what they’ve shown, which is great. But the question is, can you predict who’s going to have a second event and know who to put on treatment and not put on treatment? It’s too early to know, but this is a good start,” said Dr. Cossoy, assistant professor of ophthalmology at the University of Manitoba.
Dr. Bilodeau and Dr. Cossoy have no relevant financial disclosures.