What Is the Mechanism of Alemtuzumab-Induced Autoimmunity?

Updated Wed, 01/16/2019 - 15:39
Investigators examine whether reduced thymic function and depletion of B and T cells explain increased autoimmune risk after treatment.

SAN DIEGO—The monoclonal antibody alemtuzumab can be an effective treatment for people living with multiple sclerosis (MS), but the agent is also associated with an increased risk for developing other autoimmune diseases, leaving clinicians with a conundrum.

Alemtuzumab is an efficacious treatment in MS that can slow the rate of brain atrophy over the long term, said Alasdair Coles, MD, Professor of Neuroimmunology at the University of Cambridge, United Kingdom, at ACTRIMS 2018 Forum. “But one or two years after each cycle of alemtuzumab, patients are at high risk of autoimmune diseases. This is the not-too-worrying thyroid disease, but there are some troubling and potentially serious complications at lower frequency.”

Autoimmune thyroid disease can affect as many as 40% of patients treated with alemtuzumab, but immune thrombocytopenia (3%) and autoimmune renal disease (0.1%) also are reported. About one in 10 people treated with the monoclonal antibody for MS also develop de novo asymptomatic autoantibodies.

“People ask, ‘Why doesn’t MS come back as part of this generic mechanism?’ I don’t know the answer to that question,” said Dr. Coles.

In the United States, alemtuzumab is indicated for treatment of relapsing-remitting MS in adults who have failed to respond adequately to two or more previous therapies. In contrast, “this has become a first-line treatment in the UK,” said Dr. Coles. “Unfortunately, we can offer no proven treatment to prevent this autoimmunity.”

Considering Proposed Mechanisms

Dr. Coles and other researchers are investigating the cellular mechanism underlying the paradoxical autoimmunity associated with alemtuzumab. Some have suggested that faulty immune B cells could be the culprit, but “there is no difference in B cell reconstitution between those who do and do not get autoimmunity,” said Dr. Coles. “So, we do not think that autoimmunity after alemtuzumab is primarily a B cell problem.” Other investigators have suggested that the mechanism is alemtuzumab-associated depletion of a key immune regulatory cell. One such example is depletion of T cells as part of an autoimmune cascade that involves CD52-high-expressing cells and sialic acid-binding immunoglobulin-like lectin 10 (SIGLEC-10). “We do not believe this,” said Dr. Coles. “We cannot replicate the finding of reduced CD52 high cells in type 1 diabetes or MS, nor the binding of SIGLEC-10 to CD52.”

Along with his colleague Joanne Jones, MD, PhD, also at the University of Cambridge, Dr. Coles and his team instead propose that autoimmunity after alemtuzumab therapy is associated with a homeostatic proliferation of T cells in the context of a defective thymus. “We see thymic function reduced after alemtuzumab for a few months. We do not know if alemtuzumab is having a direct impact on the thymus or if it is an indirect effect through a cytokine storm at the time of administering alemtuzumab.”

In addition, in contrast with B cells, both CD4-positive and CD8-positive T cells are clonally restricted after alemtuzumab treatment, said Dr. Coles. “These are the only changes that distinguish patients who do and do not develop autoimmunity,” he said. “Those who develop autoimmunity have reduced clonality and impaired thymic function, compared with those who do not.”

The theory is that T-cell reconstitution after alemtuzumab comes preferentially from expansion of peripheral T cells rather than from thymus-derived naïve T cells, leading to an overrepresentation of autoreactive T cells and, in turn, to B-cell- and antibody-mediated autoimmunity.

The Bigger Picture

The autoimmune phenomenon is not unique to alemtuzumab or MS. “This turns out to be one of a family of clinical situations where the reconstitution of the depleted lymphocyte repertoire leads to autoimmunity,” Dr. Coles said. A similar effect was seen years ago when very lymphopenic patients with HIV were given antiviral therapy. About 10% of treated patients had this effect. Furthermore, about 10% of patients who undergo bone marrow transplant may experience similar autoimmune concerns.

“What we do think is true is that we have tapped into a classical expression of autoimmunity,” Dr. Coles said. “Alemtuzumab is a fantastic opportunity to study the mechanisms underlying lymphopenia-associated autoimmunity.”

A Tantalizing Prospect

“It is a tantalizing prospect that susceptible individuals might be identified in the future prior to treatment,” Dr. Coles said. “We looked at IL-21. We showed that after treatment, and perhaps more interestingly, before treatment with alemtuzumab, serum levels of IL-21 are greater in those who subsequently develop autoimmune disease. This [finding] suggests [that] some individuals are prone to developing autoimmune disease and could be identified potentially prior to treatment with alemtuzumab.”

More research is needed, including the development of more sensitive IL-21 assays for use in this population, Dr. Coles said. “Please do not attempt to predict the risk of autoimmunity after alemtuzumab using the current commercial assays. This is a source of some frustration for me.” One potential route to a more diverse lymphocyte repertoire after alemtuzumab is reconstitution through the thymus, Dr. Coles said. “If we can direct reconstitution through the thymic reconstitution, we should be able to prevent autoimmunity.”

Dr. Coles receives honoraria for travel and speaking from Sanofi Genzyme, which markets alemtuzumab.

—Damian McNamara

Issue: Neurology Reviews - 26(4)
Pages: 76-77

European Commission expands denosumab indication

Updated Fri, 01/04/2019 - 10:21

The European Commission has expanded the indication for denosumab (Xgeva), making it available for the prevention of skeletal-related events in adults with multiple myeloma and other advanced malignancies involving bone.

The European approval is based on the monoclonal antibody’s strong performance in a phase 3, international trial looking specifically at prevention of skeletal-related events in multiple myeloma patients.

During the trial, the drug demonstrated noninferiority to zoledronic acid in delaying the time to first skeletal-related event (hazard ratio, 0.98; 95% confidence interval, 0.85-1.14), according to Amgen, which markets denosumab. The median time to first skeletal-related event was 22.8 months for denosumab versus 24.0 months for zoledronic acid.

The denosumab indication was expanded to include prevention of skeletal-related events by the Food and Drug Administration in the United States in January 2018.


Disseminated Vesicles and Necrotic Papules

Updated Thu, 01/10/2019 - 13:49

The Diagnosis: Lues Maligna

Laboratory evaluation demonstrated a total CD4 count of 26 cells/μL (reference range, 443-1471 cells/μL) and a viral load of 1,770,111 copies/mL (reference range, 0 copies/mL), as well as a positive rapid plasma reagin (RPR) test with a titer of 1:8 (reference range, nonreactive). A reactive treponemal antibody test confirmed that the RPR result was a true positive. Viral culture and direct fluorescence antibody testing of an active vesicle for varicella-zoster virus and herpes simplex virus (HSV) were negative. Serum immunoglobulin titers for varicella-zoster virus demonstrated low IgM and positive IgG, indicating immunity without recent infection. Blood and lesional skin tissue cultures were negative for other infectious etiologies, including bacterial and fungal elements. A lumbar puncture was not performed.

Biopsy of a papulonodule on the left arm demonstrated a lichenoid lymphohistiocytic infiltrate with superficial and deep inflammation (Figure 1). Neutrophils also were noted within a follicle, with ballooning and acantholysis within the follicular epithelium. Additional stains for Mycobacterium, HSV-1, HSV-2, and Treponema were negative. In the clinical setting, this histologic pattern was most consistent with secondary syphilis. Pityriasis lichenoides et varioliformis acuta also was included in the histopathologic differential diagnosis by a dermatopathologist (M.C.).

Figure 1. Lues maligna. Punch biopsy of the left forearm demonstrated lichenoid lymphohistiocytic infiltrate with superficial and deep inflammation (H&E, original magnification ×100).

Based on the clinical, microbiologic, and histopathologic findings, a diagnosis of lues maligna (cutaneous secondary syphilis) with a vesiculonecrotic presentation was made. The patient's low RPR titer was attributed to profound immunosuppression, while a confirmation of syphilis infection was made with treponemal antibody testing. Histopathologic examination was consistent with lues maligna and did not demonstrate evidence of any other infectious etiologies.

Following 7 days of intravenous penicillin, the patient demonstrated dramatic improvement of all skin lesions and was discharged receiving once-weekly intramuscular penicillin for 4 weeks. In accordance with the diagnosis, the patient demonstrated rapid improvement of the lesions following appropriate antibiotic therapy.

After the diagnosis of lues maligna was made, the patient disclosed a sexual encounter with a male partner 6 weeks prior to the current presentation, after which he developed a self-resolving genital ulcer suspicious for a primary chancre.

Increasing rates of syphilis transmission have been attributed to males aged 15 to 44 years who have sexual encounters with other males.1 Although syphilis commonly is known as the great mimicker, classic syphilology texts hold that a cutaneous eruption that includes vesicles in an adult argues against syphilis.2 However, rare reports of secondary syphilis presenting as vesicles, pustules, bullae, and pityriasis lichenoides et varioliformis acuta-like eruptions have been documented.2-4

Initial screening for suspected syphilis involves sensitive, but not specific, nontreponemal RPR testing reported in the form of a titer. Nontreponemal titers in human immunodeficiency virus-positive individuals can be unusually high or low, fluctuate rapidly, and/or be unresponsive to antibiotic therapy.1

Lues maligna is a rare form of malignant secondary syphilis that most commonly presents in human immunodeficiency virus-positive hosts.5 Although lues maligna often presents with ulceronodular lesions, 2 cases presenting with vesiculonecrotic lesions also have been reported.6 Patients often experience systemic symptoms including fever, fatigue, and joint pain. Rapid plasma reagin titers can range from 1:8 to 1:128 in affected individuals.6 Diagnosis depends on serologic and histologic confirmation while ruling out viral, fungal, and bacterial etiologies. Characteristic red-brown lesions of secondary syphilis involving the palms and soles (Figure 2) also aid in diagnosis.1 Additionally, identification of the Jarisch-Herxheimer reaction following treatment and rapid response to antibiotic therapy are helpful diagnostic findings.6,7 While histopathologic examination of lues maligna typically does not reveal evidence of spirochetes, it also is important to rule out other infectious etiologies.7

Figure 2. Scattered erythematous indurated nodules with overlying scaling on the bilateral palms in a patient with lues maligna.

Our case emphasizes the importance of early recognition and treatment of the variable clinical, laboratory, and histologic presentations of lues maligna.

References
  1. Syphilis fact sheet. Centers for Disease Control and Prevention website. https://www.cdc.gov/std/syphilis/stdfact-syphilis.htm. Updated June 13, 2017. Accessed March 22, 2018.
  2. Lawrence P, Saxe N. Bullous secondary syphilis. Clin Exp Dermatol. 1992;17:44-46.
  3. Pastuszczak M, Woźniak W, Jaworek AK, et al. Pityriasis lichenoides-like secondary syphilis and neurosyphilis in an HIV-infected patient. Postepy Dermatol Alergol. 2013;30:127-130.
  4. Schnirring-Judge M, Gustaferro C, Terol C. Vesiculobullous syphilis: a case involving an unusual cutaneous manifestation of secondary syphilis [published online November 24, 2010]. J Foot Ankle Surg. 2011;50:96-101.
  5. Pföhler C, Koerner R, von Müller L, et al. Lues maligna in a patient with unknown HIV infection. BMJ Case Rep. 2011. pii: bcr0520114221. doi: 10.1136/bcr.05.2011.4221.
  6. Don PC, Rubinstein R, Christie S. Malignant syphilis (lues maligna) and concurrent infection with HIV. Int J Dermatol. 1995;34:403-407.
  7. Tucker JD, Shah S, Jarell AD, et al. Lues maligna in early HIV infection case report and review of the literature. Sex Transm Dis. 2009;36:512-514.
Author and Disclosure Information

From the Department of Dermatology, Henry Ford Health System, Detroit, Michigan.

The authors report no conflict of interest.

Correspondence: David Oberlin, MD, 3031 W Grand Blvd, Ste 800, Detroit, MI 48202 (doberli1@hfhs.org).

Issue
Cutis - 101(4)
Publications
Topics
Page Number
238, 251-252
Sections
Author and Disclosure Information

From the Department of Dermatology, Henry Ford Health System, Detroit, Michigan.

The authors report no conflict of interest.

Correspondence: David Oberlin, MD, 3031 W Grand Blvd, Ste 800, Detroit, MI 48202 (doberli1@hfhs.org).

Author and Disclosure Information

From the Department of Dermatology, Henry Ford Health System, Detroit, Michigan.

The authors report no conflict of interest.

Correspondence: David Oberlin, MD, 3031 W Grand Blvd, Ste 800, Detroit, MI 48202 (doberli1@hfhs.org).

Article PDF
Article PDF
Related Articles

The Diagnosis: Lues Maligna

Laboratory evaluation demonstrated a total CD4 count of 26 cells/μL (reference range, 443-1471 cells/μL) with a viral load of 1,770,111 copies/mL (reference range, 0 copies/mL), as well as a positive rapid plasma reagin (RPR) test with a titer of 1:8 (reference range, nonreactive). A reactive treponemal antibody test confirmed a true positive RPR test result. Viral culture as well as direct fluorescence antibodies for varicella-zoster virus and an active vesicle of herpes simplex virus (HSV) were negative. Serum immunoglobulin titers for varicella-zoster virus demonstrated low IgM with a positive IgG demonstrating immunity without recent infection. Blood and lesional skin tissue cultures were negative for additional infectious etiologies including bacterial and fungal elements. A lumbar puncture was not performed.

Biopsy of a papulonodule on the left arm demonstrated a lichenoid lymphohistiocytic infiltrate with superficial and deep inflammation (Figure 1). Neutrophils also were noted within a follicle with ballooning and acantholysis within the follicular epithelium. Additional staining for Mycobacterium, HSV-1, HSV-2, and Treponema were negative. In the clinical setting, this histologic pattern was most consistent with secondary syphilis. Pityriasis lichenoides et varioliformis acuta also was included in the histopathologic differential diagnosis by a dermatopathologist (M.C.).

Figure 1. Lues maligna. Punch biopsy of the left forearm demonstrated lichenoid lymphohistiocytic infiltrate with superficial and deep inflammation (H&E, original magnification ×100).

Based on the clinical, microbiologic, and histopathologic findings, a diagnosis of lues maligna (cutaneous secondary syphilis) with a vesiculonecrotic presentation was made. The patient's low RPR titer was attributed to profound immunosuppression, while a confirmation of syphilis infection was made with treponemal antibody testing. Histopathologic examination was consistent with lues maligna and did not demonstrate evidence of any other infectious etiologies.

Following 7 days of intravenous penicillin, the patient demonstrated dramatic improvement of all skin lesions and was discharged receiving once-weekly intramuscular penicillin for 4 weeks. In accordance with the diagnosis, the patient demonstrated rapid improvement of the lesions following appropriate antibiotic therapy.


The Diagnosis: Lues Maligna

Laboratory evaluation demonstrated a total CD4 count of 26 cells/μL (reference range, 443-1471 cells/μL) with a viral load of 1,770,111 copies/mL (reference range, 0 copies/mL), as well as a positive rapid plasma reagin (RPR) test with a titer of 1:8 (reference range, nonreactive). A reactive treponemal antibody test confirmed a true-positive RPR result. Viral culture of an active vesicle as well as direct fluorescent antibody testing for varicella-zoster virus and herpes simplex virus (HSV) were negative. Serum immunoglobulin titers for varicella-zoster virus demonstrated low IgM with positive IgG, indicating immunity without recent infection. Blood and lesional skin tissue cultures were negative for additional infectious etiologies, including bacterial and fungal elements. A lumbar puncture was not performed.

Biopsy of a papulonodule on the left arm demonstrated a lichenoid lymphohistiocytic infiltrate with superficial and deep inflammation (Figure 1). Neutrophils also were noted within a follicle with ballooning and acantholysis within the follicular epithelium. Additional staining for Mycobacterium, HSV-1, HSV-2, and Treponema was negative. In the clinical setting, this histologic pattern was most consistent with secondary syphilis. Pityriasis lichenoides et varioliformis acuta also was included in the histopathologic differential diagnosis by a dermatopathologist (M.C.).

Figure 1. Lues maligna. Punch biopsy of the left forearm demonstrated lichenoid lymphohistiocytic infiltrate with superficial and deep inflammation (H&E, original magnification ×100).

Based on the clinical, microbiologic, and histopathologic findings, a diagnosis of lues maligna (cutaneous secondary syphilis) with a vesiculonecrotic presentation was made. The patient's low RPR titer was attributed to profound immunosuppression; treponemal antibody testing confirmed the syphilis infection. Histopathologic examination was consistent with lues maligna and did not demonstrate evidence of any other infectious etiology.

Following 7 days of intravenous penicillin, the patient demonstrated dramatic improvement of all skin lesions and was discharged on once-weekly intramuscular penicillin for 4 weeks. The rapid response to appropriate antibiotic therapy further supported the diagnosis.

After the diagnosis of lues maligna was made, the patient disclosed a sexual encounter with a male partner 6 weeks prior to the current presentation, after which he developed a self-resolving genital ulcer suspicious for a primary chancre.

Increasing rates of syphilis transmission have been attributed to males aged 15 to 44 years who have sexual encounters with other males.1 Although syphilis commonly is known as the great mimicker, classic syphilology texts hold that an adult cutaneous eruption containing vesicles is not syphilis.2 However, rare reports of secondary syphilis presenting as vesicles, pustules, bullae, and pityriasis lichenoides et varioliformis acuta-like eruptions have been documented.2-4

Initial screening for suspected syphilis relies on nontreponemal RPR testing, which is sensitive but not specific and is reported as a titer. Nontreponemal titers in human immunodeficiency virus-positive individuals can be unusually high or low, fluctuate rapidly, and/or be unresponsive to antibiotic therapy.1
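Because RPR titers are reciprocal doubling dilutions, titer changes are usually interpreted on a log2 scale, with a fourfold (two-dilution-step) change conventionally treated as significant. A small illustrative sketch (the helper names are assumptions, not part of any laboratory standard):

```python
import math

def titer_steps(titer: str) -> int:
    """Convert an RPR titer such as '1:8' to its doubling-dilution
    step, i.e. log2 of the reciprocal (1:8 -> 3, 1:128 -> 7)."""
    reciprocal = int(titer.split(":")[1])
    return int(math.log2(reciprocal))

def significant_change(before: str, after: str) -> bool:
    """A fourfold or greater change (two or more doubling-dilution
    steps) between titers is conventionally considered meaningful."""
    return abs(titer_steps(after) - titer_steps(before)) >= 2

print(titer_steps("1:8"))                  # 3 dilution steps
print(significant_change("1:8", "1:32"))   # True: a fourfold rise
print(significant_change("1:8", "1:16"))   # False: only twofold
```

On this scale, the patient's 1:8 titer sits at the low end of the 1:8 to 1:128 range reported for lues maligna.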

Lues maligna is a rare, malignant form of secondary syphilis that most commonly presents in human immunodeficiency virus-positive hosts.5 Although lues maligna often presents with ulceronodular lesions, 2 cases presenting with vesiculonecrotic lesions also have been reported.6 Patients often experience systemic symptoms including fever, fatigue, and joint pain. Rapid plasma reagin titers can range from 1:8 to 1:128 in affected individuals.6 Diagnosis depends on serologic and histologic confirmation while ruling out viral, fungal, and bacterial etiologies. Characteristic red-brown lesions of secondary syphilis involving the palms and soles (Figure 2) also aid in diagnosis.1 Additionally, identification of the Jarisch-Herxheimer reaction following treatment and rapid response to antibiotic therapy are helpful diagnostic findings.6,7 Because histopathologic examination of lues maligna typically does not reveal spirochetes, it also is important to rule out other infectious etiologies.7

Figure 2. Scattered erythematous indurated nodules with overlying scaling on the bilateral palms in a patient with lues maligna.

Our case emphasizes the importance of early recognition and treatment of the variable clinical, laboratory, and histologic presentations of lues maligna.

References
  1. Syphilis fact sheet. Centers for Disease Control and Prevention website. https://www.cdc.gov/std/syphilis/stdfact-syphilis.htm. Updated June 13, 2017. Accessed March 22, 2018.
  2. Lawrence P, Saxe N. Bullous secondary syphilis. Clin Exp Dermatol. 1992;17:44-46.
  3. Pastuszczak M, Woźniak W, Jaworek AK, et al. Pityriasis lichenoides-like secondary syphilis and neurosyphilis in a HIV-infected patient. Postepy Dermatol Alergol. 2013;30:127-130.
  4. Schnirring-Judge M, Gustaferro C, Terol C. Vesiculobullous syphilis: a case involving an unusual cutaneous manifestation of secondary syphilis [published online November 24, 2010]. J Foot Ankle Surg. 2011;50:96-101.
  5. Pföhler C, Koerner R, von Müller L, et al. Lues maligna in a patient with unknown HIV infection. BMJ Case Rep. 2011. pii: bcr0520114221. doi: 10.1136/bcr.05.2011.4221.
  6. Don PC, Rubinstein R, Christie S. Malignant syphilis (lues maligna) and concurrent infection with HIV. Int J Dermatol. 1995;34:403-407.
  7. Tucker JD, Shah S, Jarell AD, et al. Lues maligna in early HIV infection: case report and review of the literature. Sex Transm Dis. 2009;36:512-514.
Issue
Cutis - 101(4)
Page Number
238, 251-252
Display Headline
Disseminated Vesicles and Necrotic Papules

A 30-year-old man who had contracted human immunodeficiency virus from a male sexual partner 4 years prior presented to the emergency department with fevers, chills, night sweats, and rhinorrhea of 2 weeks' duration. He reported that he had been off highly active antiretroviral therapy for 2 years. Physical examination revealed numerous erythematous, papulonecrotic, crusted lesions on the face, neck, chest, back, arms, and legs that had developed over the past 4 days. Fluid-filled vesicles also were noted on the arms and legs, while erythematous, indurated nodules with overlying scaling were noted on the bilateral palms and soles. The patient reported that he had been vaccinated for varicella-zoster virus as a child without primary infection.

PubMed ID
29763482

Patients With MS May Not Receive Appropriate Medicines From Primary Care Doctors

Article Type
Changed
Wed, 01/16/2019 - 15:39
Patients treated for MS by primary care doctors tend to have less education than patients treated by neurologists.

SAN DIEGO—Patients with multiple sclerosis (MS) who are treated by primary care physicians are significantly less likely to receive disease-modifying therapies (DMTs) than patients treated by neurologists, even though they have more symptoms, according to a study reported at the ACTRIMS 2018 Forum.

Approximately 85% of patients treated by neurologists at MS centers receive DMTs, compared with 51% of those treated at primary care offices.

In addition, patients treated at primary care practices report more kinds of symptoms. “This [finding] suggests there is a critical need for neurologists, especially MS specialists, to reach out and collaborate with these primary care providers and provide education about how to manage MS and improve both the treatment and the outcomes,” said lead study author Michael T. Halpern, MD, PhD, Associate Professor of Public Health at Temple University in Philadelphia.

Michael T. Halpern, MD, PhD


Dr. Halpern and colleagues analyzed data from the Sonya Slifka Longitudinal MS Study. They focused on patients with MS who received care at MS centers (376 patients, all treated by neurologists), neurology practices (552 patients), and primary care practices (55 patients).

In the three groups, most of the patients were female (from 77% to 82%). Compared with patients treated at MS centers, those who were treated at primary care practices were more likely to be white (98% vs 82%), to have less than a college education (69% vs 42%), and to have Medicaid or veteran coverage, or be uninsured (22% vs 11%).

The rate of patients receiving DMTs was 84% at MS centers and 79% at neurology practices. About 51% of patients treated by primary care doctors received DMTs, even though they reported more symptoms in areas such as vision, walking, bowel, speech, and numbness, compared with patients in the other groups.
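The reported gap can be sanity-checked with an unadjusted two-proportion z-test. This is purely illustrative: the counts below are reconstructed from the reported percentages and group sizes (84% of 376 MS-center patients vs 51% of 55 primary-care patients), and the study's actual statistical analysis is not described in the article.

```python
import math

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Unadjusted two-proportion z-test; returns (z, two-sided p)
    using the pooled standard error and a normal approximation."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Counts reconstructed from the article's percentages (illustrative):
z, p = two_proportion_z(round(0.84 * 376), 376, round(0.51 * 55), 55)
print(f"z = {z:.1f}, p = {p:.2g}")  # a large z; p well below .001
```

Even with the small primary-care group, a difference of this size is far beyond chance under this simple test, consistent with the article's "significantly less likely" characterization.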

The study does not indicate why the patients with MS who are treated by primary care physicians are not receiving appropriate therapies, and it is not known whether the absence of treatment makes their conditions worse.

Nevertheless, it has been well documented that DMTs can reduce disease progression and relapses, Dr. Halpern said. “Individuals with MS who are not being appropriately treated are more likely to experience symptoms, relapses, and faster [accumulation of] disability.”

Primary care doctors may not be providing appropriate treatment because they lack the training and expertise to properly prescribe MS medications, said Dr. Halpern. Whatever the explanation, MS subspecialists and primary care doctors clearly need to collaborate more, he said.

—Randy Dotinga

Issue
Neurology Reviews - 26(4)
Page Number
79

Posttransplant cyclophosphamide helped reduce GVHD rates

Article Type
Changed
Fri, 01/04/2019 - 10:21

 

– The combination of mycophenolate mofetil, tacrolimus, and posttransplant cyclophosphamide outperformed a contemporary control regimen at improving graft-versus-host disease (GVHD)-free and relapse-free survival in a multicenter trial.

The trial’s primary aim was to compare rates of post–hematopoietic stem cell transplant GVHD-free and relapse-free survival (GRFS) in the three study arms, compared with the tacrolimus/methotrexate group, who were receiving a “contemporary control,” Javier Bolaños-Meade, MD, said during a late-breaking abstract session of the combined annual meetings of the Center for International Blood & Marrow Transplant Research and the American Society for Blood and Marrow Transplantation.

The mycophenolate mofetil/tacrolimus/posttransplant cyclophosphamide group had a hazard ratio of 0.72 for reaching the primary endpoint – GRFS (95% confidence interval, 0.55-0.94; P = .04), compared with patients receiving the control regimen. In the study, GRFS was defined as the amount of time elapsed between transplant and any of: grade III-IV acute GVHD, chronic GVHD severe enough to require systemic therapy, disease relapse or progression, or death. Rates of grade III-IV acute GVHD and GVHD-free survival also were superior with mycophenolate mofetil/tacrolimus/posttransplant cyclophosphamide, compared with the control (P = .006 and .01, respectively).
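GRFS is a composite time-to-event endpoint: a patient's GRFS time is the time to the first qualifying event, or censoring at last follow-up if no event occurs. A minimal sketch of that definition (the function and parameter names are assumptions for illustration, not the trial's actual analysis code):

```python
from typing import Optional

def grfs_time(acute_gvhd_3_4: Optional[float],
              severe_chronic_gvhd: Optional[float],
              relapse_or_progression: Optional[float],
              death: Optional[float],
              last_followup: float) -> tuple[float, bool]:
    """Time from transplant to the first GRFS event as defined in the
    trial: grade III-IV acute GVHD, chronic GVHD requiring systemic
    therapy, relapse/progression, or death. Each argument is the day
    of that event, or None if it was not observed. Returns
    (time, event_observed); event-free patients are censored at
    last follow-up."""
    events = [t for t in (acute_gvhd_3_4, severe_chronic_gvhd,
                          relapse_or_progression, death) if t is not None]
    if events:
        return min(events), True
    return last_followup, False

# A patient who relapses at day 180 with no prior GVHD event:
print(grfs_time(None, None, 180.0, None, 365.0))  # (180.0, True)
```

Pairs of (time, event) values like these are what feed the Kaplan-Meier and hazard-ratio estimates quoted above.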

The phase 2 trial enrolled adults aged 18-75 years who had a malignant disease and a matched donor and were slated to receive reduced intensity conditioning. The study randomized 273 patients 1:1:1 to the three experimental regimens, while 224 patients received the control tacrolimus/methotrexate regimen. In the experimental arms, 92 patients received mycophenolate mofetil/tacrolimus/posttransplant cyclophosphamide; 89 patients received tacrolimus/methotrexate/maraviroc, and 92 patients received tacrolimus/methotrexate/bortezomib.

“According to predetermined parameters for success, tacrolimus/mycophenolate mofetil/cyclophosphamide was superior to control in GRFS, severe acute GVHD, chronic GVHD requiring immunosuppression, and GVHD-free survival, without a negative impact on treatment-related mortality, relapse/progression, overall survival or disease-free survival,” Dr. Bolaños-Meade said.

Patients could be included in the study if they had acute leukemia, chronic myelogenous leukemia, or myelodysplastic syndrome; patients with these diagnoses could have no circulating blasts and had to have less than 10% blasts in bone marrow. Patients with chronic lymphocytic leukemia and lymphoma with sensitive disease at the time of transplant were also eligible. All patients received peripheral blood stem cells, and underwent reduced intensity conditioning.

Permissible conditioning regimens included fludarabine/busulfan dosed at 8 mg/kg or less, fludarabine/cyclophosphamide with or without total body irradiation (TBI), fludarabine/TBI at 200 cGy, or fludarabine/melphalan dosed at less than 150 mg/m2 of body surface area. Alemtuzumab and anti-thymocyte globulin were not permitted.

 

 


Patients had to have a cardiac ejection fraction greater than 40%. For inclusion, patients had to have estimated creatinine clearance greater than 40 mL/min, bilirubin less than two times the upper limit of normal, and ALT/AST less than 2.5 times the upper limit of normal. Inclusion criteria also required adequate pulmonary function, defined as hemoglobin-corrected diffusing capacity for carbon monoxide of at least 40% and forced expiratory volume in one second of 50% or greater.

Patients’ donors had to be either siblings, or 7/8 or 8/8 human leukocyte antigen-matched unrelated donors.

Data on the control patients receiving tacrolimus/methotrexate also were collected prospectively, from centers that were not participating in the three-arm clinical trial. These patients also received reduced intensity conditioning and a peripheral blood stem cell transplant. This arm of the study was run through the Center for International Blood & Marrow Transplant Research. “I want to stress that the entry criteria were the same as for the intervention arms of the study,” Dr. Bolaños-Meade said.

Using a baseline rate of 23% for the GRFS endpoint, Dr. Bolaños-Meade and his collaborators established the sizes of the intervention and control arms so that the study would have 86%-88% power to detect a 20% improvement in the rate of GRFS over the contemporary control GVHD prophylaxis.
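A power calculation of this kind can be approximated with a two-proportion normal approximation. The sketch below assumes the 20% improvement is absolute (23% vs 43% GRFS), uses the reported arm sizes (roughly 92 experimental vs 224 control), and fixes two-sided alpha at .05; the trial's actual design likely used a time-to-event method, so the resulting number is illustrative only and need not match the 86%-88% figure.

```python
import math

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def power_two_proportions(p1: float, p2: float, n1: int, n2: int) -> float:
    """Approximate power of a two-sided two-proportion z-test at
    alpha = .05, using pooled SE under the null and unpooled SE
    under the alternative (normal approximation)."""
    z_a = 1.959963984540054  # Phi^{-1}(0.975), i.e. two-sided alpha = .05
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se0 = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    se1 = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return norm_cdf((abs(p2 - p1) - z_a * se0) / se1)

# Assumed reading of the design: 23% control GRFS vs 43% in a
# 92-patient experimental arm compared against the 224-patient control.
pw = power_two_proportions(0.23, 0.43, 92, 224)
print(round(pw, 2))  # about 0.93 under these simplified assumptions
```

That this simple approximation lands in the same neighborhood as the stated 86%-88% suggests the design's power claim is plausible; smaller arms or a smaller true effect would pull the power down quickly.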
 

 


Across all study arms, patients were a median of 64 years old and most (58%-67%) were men. A little more than half of the patients had a Karnofsky Performance Status of 90%-100%. The Hematopoietic Cell Transplantation–Comorbidity Index was 3 or greater in about 40% of patients in the intervention arms, and in 62% of those in the control arm.

The phase 2 study was not designed to compare each experimental arm against the others, but only to compare each experimental arm to the control, said Dr. Bolaños-Meade, of the department of oncology at Johns Hopkins University, Baltimore.

“The comparisons that were made in this study ... have a limited power to really show superiority,” he said, adding that the National Clinical Trials Network is beginning a phase 3 trial that directly compares posttransplant cyclophosphamide to tacrolimus/methotrexate.

Dr. Bolaños-Meade reported serving on the data safety monitoring board of Incyte.
 

SOURCE: Bolaños-Meade J et al. 2018 BMT Tandem Meetings, Abstract LBA1.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

 

– The combination of mycophenolate mofetil, tacrolimus, and posttransplant cyclophosphamide outperformed other prophylaxis regimens at reducing graft versus host disease with relapse-free survival in a multicenter trial.

The trial’s primary aim was to compare rates of post–hematopoietic stem cell transplant GVHD-free and relapse-free survival (GRFS) in the three study arms, compared with the tacrolimus/methotrexate group, who were receiving a “contemporary control,” Javier Bolaños-Meade, MD, said during a late-breaking abstract session of the combined annual meetings of the Center for International Blood & Marrow Transplant Research and the American Society for Blood and Marrow Transplantation.

The mycophenolate mofetil/tacrolimus/posttransplant cyclophosphamide group had a hazard ratio of 0.72 for reaching the primary endpoint – GRFS (95% confidence interval, 0.55-0.94; P = .04), compared with patients receiving the control regimen. In the study, GRFS was defined as the amount of time elapsed between transplant and any of: grade III-IV acute GVHD, chronic GVHD severe enough to require systemic therapy, disease relapse or progression, or death. Grade III-IV acute GVHD and GVHD survival were superior with mycophenolate mofetil/tacrolimus/posttransplant cyclophosphamide, compared with the control (P = .006 and .01, respectively).

The phase 2 trial enrolled adults aged 18-75 years who had a malignant disease and a matched donor, and were slated to receive reduced intensity conditioning. The study randomized patients 1:1:1 to one of three experimental regimens and 224 to the control tacrolimus/methotrexate regimen. In the experimental arms, 92 patients received mycophenolate mofetil/tacrolimus/posttransplant cyclophosphamide; 89 patients received tacrolimus/methotrexate/maraviroc, and 92 patients received tacrolimus/methotrexate/bortezomib.

“According to predetermined parameters for success, tacrolimus/mycophenolate mofetil/cyclophosphamide was superior to control in GRFS, severe acute GVHD, chronic GVHD requiring immunosuppression, and GVHD-free survival, without a negative impact on treatment-related mortality, relapse/progression, overall survival or disease-free survival,” Dr. Bolaños-Meade said.

Patients could be included in the study if they had acute leukemia, chronic myelogenous leukemia, or myelodysplastic syndrome; patients with these diagnoses could have no circulating blasts and had to have less than 10% blasts in bone marrow. Patients with chronic lymphocytic leukemia and lymphoma with sensitive disease at the time of transplant were also eligible. All patients received peripheral blood stem cells, and underwent reduced intensity conditioning.

Permissible conditioning regimens included fludarabine/busulfan dosed at 8 mg/kg or less, fludarabine/cyclophosphamide with or without total body irradiation (TBI), fludarabine/TBI at 200 cGy, or fludarabine/melphalan dosed at less than 150 mg/m2 of body surface area. Alemtuzumab and anti-thymocyte globulin were not permitted.

 

 


Patients had to have a cardiac ejection fraction greater than 40%. For inclusion, patients had to have estimated creatinine clearance greater than 40 mL/min, bilirubin less than two times the upper limit of normal, and ALT/AST less than 2.5 times the upper limit of normal. Inclusion criteria also required adequate pulmonary function, defined as hemoglobin-corrected diffused capacity of carbon monoxide of at least 40% and forced expiratory volume in one second of 50% or greater.

Patients’ donors had to be either siblings, or 7/8 or 8/8 human leukocyte antigen-matched unrelated donors.

The patients receiving tacrolimus/methotrexate who served as controls were also collected prospectively, from centers that were not participating in the three-arm clinical trial. These patients also received reduced intensity conditioning and a peripheral blood stem cell transplant. This arm of the study was run through the Center for International Blood & Marrow Transplant Research. “I want to stress that the entry criteria were the same as for the intervention arms of the study,” Dr. Bolaños-Meade said.

Using a baseline rate of 23% for the GRFS endpoint, Dr. Bolaños-Meade and his collaborators established the size of the intervention and control arm so that the study would have 86%-88% power to detect a 20% improvement in the rate of GRFS over the contemporary control GVHD prophylaxis.
 

 


Across all study arms, patients were a median of 64 years old and most (58%-67%) were men. A little more than half of the patients had a Karnofsky Performance Status of 90%-100%. The Hematopoietic Cell Transplantation–Comorbidity Index was 3 or greater in about 40% of patients in the intervention arms, and in 62% of those in the control arm.

The phase 2 study was not designed to compare each experimental arm against the others, but only to compare each experimental arm to the control, said Dr. Bolaños-Meade, of the department of oncology at Johns Hopkins University, Baltimore.

“The comparisons that were made in this study ... have a limited power to really show superiority,” he said, adding that the National Clinical Trials Network is beginning a phase 3 trial that directly compares posttransplant cyclophosphamide to tacrolimus/methotrexate.

Dr. Bolaños-Meade reported serving on the data safety monitoring board of Incyte.
 

SOURCE: Bolaños-Meade J et al. 2018 BMT Tandem Meetings, Abstract LBA1.

 

– The combination of mycophenolate mofetil, tacrolimus, and posttransplant cyclophosphamide outperformed other prophylaxis regimens at reducing graft versus host disease with relapse-free survival in a multicenter trial.

The trial’s primary aim was to compare rates of post–hematopoietic stem cell transplant GVHD-free and relapse-free survival (GRFS) in the three study arms, compared with the tacrolimus/methotrexate group, who were receiving a “contemporary control,” Javier Bolaños-Meade, MD, said during a late-breaking abstract session of the combined annual meetings of the Center for International Blood & Marrow Transplant Research and the American Society for Blood and Marrow Transplantation.

The mycophenolate mofetil/tacrolimus/posttransplant cyclophosphamide group had a hazard ratio of 0.72 for reaching the primary endpoint – GRFS (95% confidence interval, 0.55-0.94; P = .04), compared with patients receiving the control regimen. In the study, GRFS was defined as the amount of time elapsed between transplant and any of: grade III-IV acute GVHD, chronic GVHD severe enough to require systemic therapy, disease relapse or progression, or death. Grade III-IV acute GVHD and GVHD survival were superior with mycophenolate mofetil/tacrolimus/posttransplant cyclophosphamide, compared with the control (P = .006 and .01, respectively).

The phase 2 trial enrolled adults aged 18-75 years who had a malignant disease and a matched donor, and were slated to receive reduced intensity conditioning. The study randomized patients 1:1:1 to one of three experimental regimens and 224 to the control tacrolimus/methotrexate regimen. In the experimental arms, 92 patients received mycophenolate mofetil/tacrolimus/posttransplant cyclophosphamide; 89 patients received tacrolimus/methotrexate/maraviroc, and 92 patients received tacrolimus/methotrexate/bortezomib.

“According to predetermined parameters for success, tacrolimus/mycophenolate mofetil/cyclophosphamide was superior to control in GRFS, severe acute GVHD, chronic GVHD requiring immunosuppression, and GVHD-free survival, without a negative impact on treatment-related mortality, relapse/progression, overall survival or disease-free survival,” Dr. Bolaños-Meade said.

Patients could be included in the study if they had acute leukemia, chronic myelogenous leukemia, or myelodysplastic syndrome; patients with these diagnoses could have no circulating blasts and had to have less than 10% blasts in bone marrow. Patients with chronic lymphocytic leukemia and lymphoma with sensitive disease at the time of transplant were also eligible. All patients received peripheral blood stem cells, and underwent reduced intensity conditioning.

Permissible conditioning regimens included fludarabine/busulfan dosed at 8 mg/kg or less, fludarabine/cyclophosphamide with or without total body irradiation (TBI), fludarabine/TBI at 200 cGy, or fludarabine/melphalan dosed at less than 150 mg/m2 of body surface area. Alemtuzumab and anti-thymocyte globulin were not permitted.

 

 


Patients had to have a cardiac ejection fraction greater than 40%. For inclusion, patients had to have estimated creatinine clearance greater than 40 mL/min, bilirubin less than two times the upper limit of normal, and ALT/AST less than 2.5 times the upper limit of normal. Inclusion criteria also required adequate pulmonary function, defined as hemoglobin-corrected diffused capacity of carbon monoxide of at least 40% and forced expiratory volume in one second of 50% or greater.

Patients’ donors had to be either siblings, or 7/8 or 8/8 human leukocyte antigen-matched unrelated donors.

The patients receiving tacrolimus/methotrexate who served as controls were enrolled prospectively at centers that were not participating in the three-arm clinical trial. These patients also received reduced intensity conditioning and a peripheral blood stem cell transplant. This arm of the study was run through the Center for International Blood & Marrow Transplant Research. “I want to stress that the entry criteria were the same as for the intervention arms of the study,” Dr. Bolaños-Meade said.

Using a baseline rate of 23% for the GRFS endpoint, Dr. Bolaños-Meade and his collaborators sized the intervention and control arms so that the study would have 86%-88% power to detect a 20% improvement in the rate of GRFS over the contemporary control GVHD prophylaxis.
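The arithmetic behind a calculation like this can be sketched with the classic two-proportion sample-size formula. This is a generic illustration only, not the trial's actual statistical plan: it assumes the "20% improvement" is an absolute increase (23% to 43%), a two-sided alpha of 0.05, the mid-range 87% power, and equal arms, whereas the real design used an unbalanced control arm and a time-to-event GRFS endpoint.

```python
from math import ceil, sqrt
from statistics import NormalDist


def n_per_arm(p_control, p_treat, alpha=0.05, power=0.87):
    """Per-group sample size for comparing two proportions (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # critical value, two-sided test
    z_beta = NormalDist().inv_cdf(power)            # quantile for the desired power
    p_bar = (p_control + p_treat) / 2               # pooled proportion under the null
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p_control * (1 - p_control)
                                 + p_treat * (1 - p_treat))) ** 2
    return ceil(numerator / (p_control - p_treat) ** 2)


# Illustrative inputs: 23% baseline GRFS, a hypothetical 20-point absolute
# improvement, and the mid-range 87% power quoted for the trial.
print(n_per_arm(0.23, 0.43))
```

The result is in the same general range as the trial's ~90-patient experimental arms; real trial designs diverge from this textbook formula because of censoring, accrual patterns, and unequal allocation.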

Across all study arms, patients were a median of 64 years old and most (58%-67%) were men. A little more than half of the patients had a Karnofsky Performance Status of 90%-100%. The Hematopoietic Cell Transplantation–Comorbidity Index was 3 or greater in about 40% of patients in the intervention arms, and in 62% of those in the control arm.

The phase 2 study was not designed to compare each experimental arm against the others, but only to compare each experimental arm to the control, said Dr. Bolaños-Meade, of the department of oncology at Johns Hopkins University, Baltimore.

“The comparisons that were made in this study ... have a limited power to really show superiority,” he said, adding that the National Clinical Trials Network is beginning a phase 3 trial that directly compares posttransplant cyclophosphamide to tacrolimus/methotrexate.

Dr. Bolaños-Meade reported serving on the data safety monitoring board of Incyte.

SOURCE: Bolaños-Meade J et al. 2018 BMT Tandem Meetings, Abstract LBA1.

REPORTING FROM THE 2018 BMT TANDEM MEETINGS

Vitals

Key clinical point: A cyclophosphamide-containing posttransplant regimen bested controls for reducing graft-versus-host disease rates.

Major finding: The hazard ratio for GVHD-free and relapse-free survival was 0.72 for those receiving cyclophosphamide, compared with controls (P = .04).

Study details: Trial of 497 patients, with 273 randomized among three intervention posttransplant regimens for GVHD prophylaxis and 224 prospectively enrolled controls receiving tacrolimus and methotrexate.

Disclosures: Dr. Bolaños-Meade reported serving on the data safety monitoring board of Incyte.

Source: Bolaños-Meade J et al. 2018 BMT Tandem Meetings, Abstract LBA1.


Troubleshooting Gait and Voice Problems After DBS for Parkinson’s Disease

Article Type
Changed
Mon, 01/07/2019 - 10:40
The approach to symptom control may change according to whether the symptom results from the disease or the stimulation.

LAS VEGAS—In patients with Parkinson’s disease, deep brain stimulation (DBS) can improve gait significantly and reduce vocal tremor. Some patients may fail to improve following implantation, however, and others who do improve may later worsen. In such cases, neurologists can address problems with gait and voice tremor using various steps to optimize DBS treatment, according to two lectures delivered at the 21st Annual Meeting of the North American Neuromodulation Society.

Refractory Gait Impairment

Gait impairment and freezing of gait may persist for years in some patients, despite DBS treatment at the traditional frequency of 130 Hz. Studies by Moreau and colleagues indicate that stimulation at 60 Hz improves these outcomes in previously refractory patients, said Helen M. Brontë-Stewart, MD, MSE, the John E. Cahill Family Professor and Director of the Stanford Movement Disorders Center at Stanford University School of Medicine in California. Research by Ricchi et al shows that low-frequency DBS also reduces gait impairment and freezing of gait in the early stages after implantation.

Helen M. Brontë-Stewart, MD, MSE

The factors that predict which patients will benefit from low-frequency DBS of the subthalamic nucleus (STN) are increased age, severe axial phenotype at five years after surgery, and lower preoperative levodopa responsiveness. But low-frequency DBS may not be adequate to improve other motor signs such as tremor, said Dr. Brontë-Stewart. Improvements on low-frequency DBS also may not last long.

The literature about which part of the STN should be stimulated for more effective treatment contains mixed results. Several investigations, including a 2011 study by McNeely et al, showed that high-frequency DBS is most efficacious when applied to the dorsolateral margin of STN. Other studies, including one performed by Dr. Brontë-Stewart and colleagues, indicate that stimulating the ventral area of the STN is more effective. Khoo et al found that 60-Hz stimulation was superior to 130-Hz stimulation for axial motor signs in Parkinson’s disease. “Clearly, we do not have consensus,” said Dr. Brontë-Stewart.

Postsurgical Gait Worsening

If a patient’s gait worsens shortly after DBS surgery, one possible explanation is that the leads were misplaced. Gait also could worsen if high-frequency DBS is applied outside the STN, especially in the anterior, medial, and dorsal regions, said Dr. Brontë-Stewart. If a patient’s gait and akinesia worsen with high-frequency STN DBS, but his or her tremor and rigidity improve, the cause may be diffusion of the stimulatory field into the pallido-fugal fibers before decussation of the pallido-pedunculopontine nucleus (PPN) pathway.

“The combination of STN DBS and medication may lead to lower-extremity dyskinesias,” which may account for gait worsening in some patients, said Dr. Brontë-Stewart. “It is important to look at these patients off medication. It may show that the dyskinesias are interfering with the gait studies, and whether the medication is affecting their cognition, which may also worsen gait.”

Patients’ gait and balance may worsen years after implantation. For example, stimulation-resistant axial symptoms may emerge after five years of DBS even if treatment remains effective for appendicular symptoms. This outcome may follow progression of the disease into nondopaminergic networks. Another possible cause is increased voltage that involves pallido-fugal pathways, thus enlarging the field of stimulation, said Dr. Brontë-Stewart.

For patients with delayed worsening, Dr. Brontë-Stewart advises that “if you reprogram DBS, focusing on gait symmetry, you can improve gait, including freezing of gait. Many of us program DBS for appendicular symptoms, and we fail to do this for gait…. Perhaps use bipolar or interleaving programming to restrict field extension.”

Preoperative improvement in Unified Parkinson’s Disease Rating Scale Part III scores in response to levodopa treatment is the best predictor of the effect of DBS on gait and freezing of gait. Improvement in freezing of gait following STN DBS has, in turn, been related to reduced medication dosing and lack of worsening of cognition, concluded Dr. Brontë-Stewart.

An Initial Approach to Vocal Tremor

The literature suggests that in patients with Parkinson’s disease, STN DBS often results in deterioration of speech that may not improve when the stimulation is stopped. Predictors of vocal problems include presurgical dysarthria, duration and severity of presurgical disease progression, and contact placement around the left STN.

“There is no large evidence base upon which to work when you are trying to … deal with someone who comes to you with speech problems,” said Bryan T. Klassen, MD, Assistant Professor of Neurology at the Mayo Clinic in Rochester, Minnesota. Addressing potential speech problems before implantation “should be a major part of any DBS protocol,” he added. A neurologist should document a patient’s pre-existing speech issues carefully. At Mayo Clinic, all patients scheduled to undergo implantation visit a speech pathologist first, and the examination is recorded.

In addition, patients need to understand that vocal tremor may be a symptom of Parkinson’s disease and may not result from DBS. On the other hand, neurologists also should inform patients that inserting the leads may cause dysarthria even before the battery for the device is implanted. Patients ultimately may have to choose between optimal tremor control and optimal speech, said Dr. Klassen.

Disease-Related Vocal Abnormalities

When a patient presents with speech problems, the neurologist must determine whether they result from the disease or from stimulation. Symptoms that have responded insufficiently to DBS are likely related to the disease, as are symptoms consistent with disease progression, such as gradually progressive dysarthria. These symptoms may respond to more aggressive stimulation. A patient with worsening hypokinetic dysarthria, however, may not improve, and could worsen, with more aggressive stimulation.

No clear criteria exist to help a neurologist determine whether abnormal speech should be considered nonresponsive to stimulation. This determination relies on clinical judgment and should be communicated clearly to the patient, said Dr. Klassen. At that point, the neurologist and patient may consider speech therapy.

Stimulation-Related Vocal Abnormalities

DBS implantation itself sometimes causes dysarthria that may improve over the course of weeks or months. Implantation also may worsen pre-existing dysarthria. “That [side effect] does not necessarily have to limit what or how you are stimulating for tremor control,” said Dr. Klassen. If the symptom results from stimulation, it will improve when stimulation is stopped. It may take as little as a few seconds or as long as several weeks for vocal abnormalities to improve, but tremor worsens while stimulation is turned off.

A neurologist should locate the source of any stimulation-dependent vocal abnormality so that he or she can focus the stimulation field on that source. Although the left lead tends to be implicated in vocal abnormalities more often than the right lead, the neurologist needs to determine the leads’ contributions empirically by turning the leads off individually. “Depending on the washout [period], that may take more time than you would like,” said Dr. Klassen.

A review of the initial monopolar thresholds can indicate which regions along the electrode tend to affect speech the most. Postoperative imaging may help in this determination. If the patient has a prolonged washout period, the neurologist can give him or her “homework,” said Dr. Klassen. To do this, the neurologist sets the DBS device to run several programs and asks the patient to record his or her experiences in a notebook.

Optimizing the Stimulation Settings

Vocal abnormalities that arise after surgery may indicate that the stimulation parameters need to be modified. First, neurologists must choose the optimal active contact along the electrode. Eccentric steering or multiple-source current steering may reduce vocal tremor by better defining the distribution of current.

To reduce the volume of tissue activated, the neurologist can increase the pulse width, reduce the amplitude, or switch to a bipolar configuration. If a particular setting causes side effects, reducing the voltage may increase tolerability, albeit at the expense of efficacy. Switching from a high frequency to a low frequency also may reduce vocal tremor. If it is impossible to control limb tremor and vocal abnormalities optimally with a single setting, the patient may choose the setting that provides the most acceptable overall control.

Another option is to allow the patient to switch as necessary between a program optimized for tremor control and one optimized for speech. A patient may also choose to turn stimulation on and off as needed. Finally, adjunctive speech therapy can reduce vocal tremor, said Dr. Klassen.

—Erik Greb

Issue
Neurology Reviews - 26(4)
Page Number
9-14


Cutting Edge Technology in Dermatology: Virtual Reality and Artificial Intelligence

Article Type
Changed
Thu, 03/28/2019 - 14:39

The clinical practice of dermatology is changing at a rapid pace. Advances in technology and new inventions in rapid diagnostics are revolutionizing how physicians approach medical care. In 2009, the Health Information Technology for Economic and Clinical Health Act1 ushered in the era of electronic medical records, along with a series of associated challenges.2 In 2014, the potential reach of medical expertise in the United States was expanded with the creation of the Interstate Medical Licensure Compact, which offers an expedited pathway to licensure for physicians seeking to practice in multiple states as a way to increase access to health care in underserved or rural areas via telemedicine.3 In early 2017, a computer algorithm was able to perform on par with board-certified dermatologists when distinguishing between clinical images of biopsy-proven benign and malignant skin lesions.4 Recently, Microsoft announced a partnership with rural telecommunications providers to bring high-speed broadband Internet service to millions of Americans using television white space technology, which can improve access to health care services through the implementation of telemedicine and other connected health technologies in rural communities.5

Given these advances, how does today’s dermatologist integrate into the future of the specialty? If leveraged properly, current technologies such as teledermatology and patient portals integrated with electronic medical records can be beneficial to dermatology practices by improving access to care, facilitating triage of patients, and improving communication between patients and health care team members. Herein, we discuss some of the emerging technologies that have the potential to shape clinical dermatology practice and remove barriers to care.

Virtual Reality

Teledermatology can be practiced through live video or, more commonly, via a store-and-forward method in which dermatologists review clinical photographs and the patient's history asynchronously with the in-office visit.6 Virtual reality has the potential to augment teledermatology services by enabling a live, interactive visit that more closely models the traditional face-to-face visit. Virtual reality already is available for patients at home with the use of a commercially marketed headset and a smartphone, and the marriage of virtual reality and telemedicine has the potential to transform health care.

Virtual reality also can be used to deliver an essential component of the physical examination of a patient: sensory information from palpation. Haptic feedback, also known as haptics, is used to relay force and tactile information to the user of a device (eg, a haptic glove).7 In dermatology, this information pertains to the skin texture, skin profile, and physical properties (eg, stiffness, temperature).8 Assessing the texture of the skin surface can help when distinguishing epidermal processes such as psoriasis versus atopic dermatitis or when evaluating edema, induration, and depth of a leg ulcer.9

One model for conducting a teledermatology encounter that captures sensory information would consist of a haptic probe located at a referring medical provider’s office for examining patients and a master robot that controls the probe located at the consulting dermatologist’s facility.8 Another model converts 2-dimensional images taken from traditional full-body optical imaging systems into virtual 3-dimensional (3D) images that can be felt using a haptic device.10,11 In this method, the user is able to both visualize and touch the skin surface at the same time. Currently, 3D imaging of skin lesions is available in the form of a specialized handheld imager that allows the dermatologist to appreciate the texture and elevation of single lesions when viewing clinical photographs. Additionally, full-body 3D mapping of the skin surface is available for monitoring pigmented lesions or other diseases of the skin.12,13

Artificial Intelligence and Machine Learning

Computer algorithms can be helpful in assisting physicians with disease diagnosis. Machine learning is a subfield of artificial intelligence (AI) in which computer programs learn automatically from experience without explicit programming instructions. A machine learning algorithm uses a labeled data set known as a training set to create a function that can make predictions about new inputs in the future.14 The algorithm can successively compare its predicted outputs with the correct outputs and modify its function as errors are found; for example, a database of images of healthy skin as well as the skin of psoriasis patients can be fed into a machine learning algorithm that picks up features such as color and skin texture from the labeled photographs, allowing it to learn how to diagnose psoriasis.15
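
The training loop described above can be sketched with a tiny perceptron: the classifier compares its predicted label with the correct label and adjusts its function (the weights) only when it makes an error. The two features (redness, roughness) and the labeled examples are hypothetical, and this is an illustration of supervised learning in general, not the algorithm used in the cited psoriasis study.

```python
# Tiny perceptron sketch of the learning loop described above: the
# algorithm compares its predicted label with the correct label and
# adjusts its function (the weights) whenever it makes an error.
# Features (redness, roughness) and data are hypothetical.

def predict(w, b, features):
    """Predict 1 (psoriasis) or 0 (healthy) from a weighted sum."""
    return 1 if sum(wi * xi for wi, xi in zip(w, features)) + b > 0 else 0

def train(training_set, epochs=20, lr=0.1):
    """Learn weights by correcting each misclassification."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for features, label in training_set:  # label: 1=psoriasis, 0=healthy
            error = label - predict(w, b, features)  # nonzero only on mistakes
            if error:
                w = [wi + lr * error * xi for wi, xi in zip(w, features)]
                b += lr * error
    return w, b

training_set = [
    ((0.9, 0.8), 1), ((0.8, 0.9), 1),  # labeled psoriasis images
    ((0.2, 0.1), 0), ((0.1, 0.2), 0),  # labeled healthy-skin images
]
w, b = train(training_set)
print(predict(w, b, (0.85, 0.75)))  # classify a new image's features → 1
```

A real system replaces the two hand-picked numbers with thousands of image-derived features and a far larger training set, but the error-driven update is the same idea.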

In one instance, researchers at Stanford University (Stanford, California) used approximately 130,000 images representing over 2000 different skin diseases in order to train their machine learning algorithm to recognize benign and malignant skin lesions.4 Although the algorithm was able to match the performance of experienced dermatologists in many diagnostic categories, further testing in a real-world clinical setting still needs to be done. In the future, nondermatologists may have the option to consult with decision-support systems that include image analysis software, such as the one developed at Stanford University, for making decisions in triage or diagnosis, which may be critical in areas where access to a dermatologist is limited.4 Future AI systems also may provide supplemental assistance in managing patients to dermatology trainees until they have the experience of a more established dermatologist.

Currently, AI cannot match general human intelligence and life experience; therefore, physicians will continue to make the final decisions when it comes to diagnosis and treatment. In the future, AI algorithms may integrate into clinical dermatology practice, leading to more accurate triage of lesions, potentially streamlined referral to dermatologists for skin conditions that require prompt consultation, and improved quality of care.

Summary

In conclusion, emerging technologies have the power to augment and revolutionize dermatology practice. Savvy dermatologists may incorporate new tools in a way that works for their practice, leading to increased efficiency and improved patient outcomes. Eventually, the technology that is most beneficial to clinical practice will likely be adopted by and integrated into mainstream dermatologic care, making it available for the majority of clinicians to use.

References
  1. Health Information Technology for Economic and Clinical Health (HITECH) Act, Pub L No. 111-5, 123 Stat 226 (2009).
  2. Holmgren AJ, Adler-Milstein J. Health information exchange in US hospitals: the current landscape and a path to improved information sharing. J Hosp Med. 2017;12:193-198.
  3. The IMLC. Interstate Medical Licensure Compact website. http://www.imlcc.org. Accessed March 21, 2018.
  4. Esteva A, Kuprel B, Novoa RA, et al. Dermatologist-level classification of skin cancer with deep neural networks. Nature. 2017;542:115-118.
  5. The Associated Press. Microsoft eyes buffer zone in TV airwaves for rural internet. ABC News website. http://abcnews.go.com/amp/Technology/wireStory/microsoft-announces-rural-broadband-initiative-48562282. Published July 11, 2017. Accessed March 21, 2018.
  6. Practice guidelines for dermatology. American Telemedicine Association website. https://higherlogicdownload.s3.amazonaws.com/AMERICANTELEMED/3c09839a-fffd-46f7-916c-692c11d78933/UploadedImages/SIGs/Teledermatology.Final.pdf. Published April 28, 2016. Accessed March 27, 2018.
  7. Lee O, Lee K, Oh C, et al. Prototype tactile feedback system for examination by skin touch. Skin Res Technol. 2014;20:307-314.
  8. Waldron KJ, Enedah C, Gladstone H. Stiffness and texture perception for teledermatology. Stud Health Technol Inform. 2005;111:579-585.
  9. Cox NH. A literally blinded trial of palpation in dermatologic diagnosis. J Am Acad Dermatol. 2007;56:949-951.
  10. Kim K. Roughness based perceptual analysis towards digital skin imaging system with haptic feedback. Skin Res Technol. 2016;22:334-340.
  11. Kim K, Lee S. Perception-based 3D tactile rendering from a single image for human skin examinations by dynamic touch. Skin Res Technol. 2015;21:164-174.
  12. How it works. 3Derm website. https://www.3derm.com. Accessed March 21, 2018.
  13. Vectra 3D. Canfield Scientific website. http://www.canfieldsci.com/imaging-systems/vectra-wb360-imaging-system. Accessed March 21, 2018.
  14. Alpaydin E. Introduction to Machine Learning. Cambridge, MA: MIT Press; 2014.
  15. Shrivastava VK, Londhe ND, Sonawane RS, et al. Computer-aided diagnosis of psoriasis skin images with HOS, texture and color features: a first comparative study of its kind. Comput Methods Programs Biomed. 2016;126:98-109.
Author and Disclosure Information

Dr. Prado is from Orange Park Medical Center, Florida. Dr. Kovarik is from the Department of Dermatology, Perelman School of Medicine, University of Pennsylvania, Philadelphia.

The authors report no conflict of interest.

Correspondence: Carrie Kovarik, MD, Department of Dermatology, Perelman School of Medicine, University of Pennsylvania, 3600 Spruce St, 2 Maloney Bldg, Philadelphia, PA 19104 (carrie.kovarik@uphs.upenn.edu).

Issue
Cutis - 101(4)
Page Number
236-237

Consanguineous parentage raises risk of mood disorders, psychoses in offspring

Findings should stimulate research
Article Type
Changed
Fri, 01/18/2019 - 17:32

Children of consanguineous parents are three times more likely to be prescribed medications for common mood disorders than the children of nonrelated parents, according to a study published April 4.

In JAMA Psychiatry, researchers reported the results of a retrospective populationwide cohort study involving 363,960 individuals born in Northern Ireland between 1971 and 1986, 609 (0.2%) of whom were born to parents who were either first or second cousins.

The analysis showed a clear relationship between the degree of consanguinity and the likelihood of being prescribed psychotropic medications. After adjusting for known mental health risk factors, including birth weight, children of parents who were first cousins had threefold higher odds of being prescribed antidepressant or anxiolytic medicines (odds ratio, 3.01; 95% confidence interval, 1.24-7.31) and twofold higher odds of receiving antipsychotics (OR, 2.13; 95% CI, 1.29-3.51), compared with the children of nonconsanguineous parents.

“The results illustrate a clear increasing, stepwise association between level of consanguinity and mental ill health, suggesting a quasi–dose-response association, supporting a causal association between consanguineous parents and mental health of progeny,” wrote Aideen Maguire, PhD, and colleagues from Queen’s University Belfast (Northern Ireland).

Overall, more than one-third (35.8%) of children born to first-cousin consanguineous unions were prescribed antidepressant or anxiolytic medication, and 8.5% received antipsychotic medications, compared with one-quarter (26%) and 2.7% of nonrelated offspring.
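
For reference, a crude (unadjusted) odds ratio can be computed directly from raw proportions like these. The sketch below only illustrates the arithmetic behind the measure; it will not reproduce the published figures of 3.01 and 2.13, which were adjusted for risk factors such as birth weight.

```python
# How an unadjusted odds ratio is formed from two raw proportions:
# convert each proportion to odds, then take their ratio.

def odds(p):
    """Convert a proportion (0 < p < 1) to odds."""
    return p / (1 - p)

def odds_ratio(p_exposed, p_unexposed):
    """Crude odds ratio comparing exposed vs unexposed groups."""
    return odds(p_exposed) / odds(p_unexposed)

# Antidepressant/anxiolytic prescribing: 35.8% vs 26% of offspring
print(round(odds_ratio(0.358, 0.26), 2))   # → 1.59
# Antipsychotic prescribing: 8.5% vs 2.7% of offspring
print(round(odds_ratio(0.085, 0.027), 2))  # → 3.35
```

The gap between these crude values and the adjusted ORs reported in the study reflects the covariate adjustment in the published models.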

Children of parents who were second cousins had an elevated but not statistically significant risk of receiving psychotropic medications (OR, 1.31; 95% CI, 0.63-2.71). None of these associations were affected by whether the births were singleton or multiple births.


“Despite the recent debate around the physical genetic risk of consanguineous parents, more research is required on the psychological effects of consanguineous parents on progeny,” the authors wrote.

The analysis also showed that participants aged 38-41 years were 15% more likely to receive antipsychotic medication, compared with those aged 26-29 years. The odds also were higher among fourth-born progeny, compared with first-born children, and in those from rural as opposed to urban areas.

Researchers also looked at whether deprivation or living in rural areas was associated with a higher likelihood of consanguineous pairings but found no such interactions. The incidence of consanguineous marriages found in the study was consistent with previous estimates in this population.

The authors suggested several possible explanations for the association between consanguineous parentage and mood disorders. The first was that psychiatric disorders are known to be heritable, suggesting that inherited genetic variants play a major role.

“As a form of assortative mating, consanguinity increases polygenic loading and thus is likely associated with a higher risk of mental disorder in progeny,” the authors wrote.

They also speculated that having consanguineous parents is associated with social stigma, particularly in Western societies where these partnerships are taboo. Offspring in these societies may experience discrimination that can affect mental health outcomes. However, they also noted that the children in the cohort might not have known about their parents’ consanguineous status.

“This study demonstrates the ability of populationwide data linkage to explore hard-to-reach populations, and we call on other countries with similar large-scale administrative data sources to use their data to explore the effects of consanguinity on offspring,” the authors concluded.

The study was funded by the Centre of Excellence for Public Health Northern Ireland, with additional assistance from the Honest Broker Service. No conflicts of interest were declared.

SOURCE: Maguire A et al. JAMA Psychiatry. 2018 Apr 4. doi: 10.1001/jamapsychiatry.2018.0133.

While Charles Darwin – himself the product of a consanguineous marriage – found no evidence of a higher prevalence of consanguineous parentage among the inmates of asylums in England, there is known to be a higher risk of recessively inherited single-gene disorders among the offspring of consanguineous couples. However, until now, it was not known whether this also included an elevated risk of psychiatric disorders.

The findings of this study should stimulate further research efforts toward a greater understanding of the genetic contribution to common complex psychiatric conditions. The increase in whole-genome sequencing could provide entire genomes to help guide genetic counseling, not only for medical but also psychiatric conditions.

The study also should raise awareness about the difficulties and challenges associated with determining consanguinity, amid the potential stigma associated with cousin marriage.

Alison Shaw, DPhil, is affiliated with the department of social anthropology at the University of Oxford (England). These comments are taken from an accompanying editorial (JAMA Psychiatry. 2018 Apr 4. doi: 10.1001/jamapsychiatry.2018.0513). No conflicts of interest were declared.





FROM JAMA PSYCHIATRY

Vitals

Key clinical point: Children of consanguineous parents are at significantly greater risk of mood disorders.

Major finding: Children of first-cousin parents have a threefold greater risk of mood disorders.

Study details: A retrospective populationwide cohort study involving 363,960 individuals.

Disclosures: The study was funded by the Centre of Excellence for Public Health Northern Ireland, with additional assistance from the Honest Broker Service. No conflicts of interest were declared.

Source: Maguire A et al. JAMA Psychiatry. 2018 Apr 4. doi: 10.1001/jamapsychiatry.2018.0133.


Sarcopenia, body fat linked with mortality in nonmetastatic breast cancer

Weight control, exercise reduce death risk
Article Type
Changed
Thu, 12/15/2022 - 17:47

Among women with nonmetastatic breast cancer, low muscle mass and excess body fat are significantly associated with worse survival, investigators report.

An observational study of 3,241 women diagnosed with stage II or III breast cancer showed that low muscle mass (sarcopenia) was independently associated with a 41% increase in risk for overall mortality, and that high total adipose tissue (TAT) was associated with a 35% increase in overall mortality.

Women with sarcopenia and high TAT had a nearly twofold higher risk for death, reported Bette J. Caan, DrPH, of Kaiser Permanente in Oakland, Calif., and her colleagues.

Although low muscle mass was found to be a significant risk factor for death, neither poor muscle quality, as measured by radiodensity, nor body mass index (BMI) was significantly associated with overall mortality, the investigators reported in a study published online in JAMA Oncology.

“Both muscle and adiposity represent modifiable risk factors in patients with breast cancer. In addition to weight loss, we should also consider interventions to improve muscle mass, such as resistance training or protein supplementation. In the era of precision medicine, the direct measurement of muscle and adiposity will help to guide treatment plans and interventions to optimize survival outcomes,” they wrote.

Although moderate to severe obesity measured by high BMI has been associated with worse outcomes for patients with breast cancer and other malignancies, the evidence is mixed for those who are merely overweight or have borderline obesity, the authors noted.

BMI is a simple index calculated from weight and height that does not measure body composition, and “low BMI can mask excess adiposity while high BMI can mask low muscularity,” they wrote.

To determine whether associations between measures of body composition could be prognostic for overall mortality, the investigators conducted a retrospective cohort study with patients from Kaiser Permanente Northern California and the Dana-Farber Cancer Institute in Boston.

The cohort included 3,241 women diagnosed with stage II or III invasive breast cancer during 2005-2013 in California and during 2000-2012 in Boston. All of the patients included had either abdominal or pelvic CT scans or PET-CT scans at the time of diagnosis.

The investigators looked at the associations between sarcopenia, TAT, and low muscle radiodensity, and created hazard ratio (HR) estimates of the effects of the various interactions on overall mortality, adjusted for sociodemographics, tumor characteristics, treatment, BMI, and other body composition measures.

They found that after a median follow-up of 6 years, patients with sarcopenia had a significantly greater risk for overall mortality than did patients without sarcopenia (HR, 1.41; 95% confidence interval, 1.18-1.69).

Additionally, patients in the highest tertile of TAT also had significantly higher overall mortality, compared with patients in the lowest tertile (HR, 1.35; CI, 1.08-1.69).

As noted before, poor muscle quality was not significantly associated with overall mortality.

Looking at both sarcopenia and TAT, the authors found that the highest risk for death was in those patients with both sarcopenia and high TAT (HR, 1.88; CI, 1.30-2.73).

However, they also found that BMI was not an independent predictor of overall mortality and did not identify those patients who were at risk because of their body composition.

“We demonstrate that sarcopenia is not a condition restricted to patients with later-stage disease but rather is highly prevalent among patients with nonmetastatic disease across all levels of BMI. Our findings are likely generalizable across many other nonmetastatic cancers because the associations with muscle and improved survival for those with metastatic cancer has been observed across a variety of solid tumors,” Dr. Caan and her associates wrote in their conclusion.

The article did not report a funding source for the study. The investigators reported having no conflicts of interest to disclose.

SOURCE: Caan BJ et al. JAMA Oncol. 2018 Apr 5. doi: 10.1001/jamaoncol.2018.0137.


Obesity is highly prevalent among breast cancer survivors, and in addition to its effects on cancer development and outcomes, it also can affect treatment efficacy and adverse effects and complicate clinical management of breast cancer from obesity-related comorbidities such as hypertension and diabetes. As such, the American Society of Clinical Oncology made obesity and cancer one of its core priorities in 2013 and launched the Obesity & Cancer Initiative with activities ranging from education and awareness to clinical guidance, promotion of research, and policy and advocacy.

Despite its limitations, body mass index remains an easy tool to help health care clinicians identify patients at greater risk for poor outcomes and adverse effects and guide their recommendations, as well as to educate patients in self-assessing their weight status. Weight management and control are likely to have many benefits for breast cancer survivors but should always be tailored to individual patients’ needs. When CT imaging is available, the study by Caan et al. suggests that body composition measures can be useful in identifying women at higher risk of mortality. Their findings are an important reminder that weight loss and/or weight control programs must always incorporate physical activity with the goal of not just reducing adiposity, but also maintaining and increasing muscle mass, which would not only reduce the risk of death, but might also help improve quality of life after a cancer diagnosis.

Elisa V. Bandera, MD, PhD, is with the Rutgers Cancer Institute of New Jersey, Rutgers University, New Brunswick. Esther M. John, PhD, is with Stanford (Calif.) University. Both editorialists reported having no conflicts of interest to disclose. Their remarks are adapted from an accompanying invited commentary (JAMA Oncol. doi: 10.1001/jamaoncol.2018.0137).




FROM JAMA ONCOLOGY

Vitals

Key clinical point: Helping women with nonmetastatic breast cancer control weight and improve muscle strength could lower their risk of death.

Major finding: Women with sarcopenia and high total adipose tissue had a hazard ratio of 1.88 for overall mortality.

Study details: Retrospective cohort study of 3,241 women diagnosed with stage II or III invasive breast cancer in California and Massachusetts.

Disclosures: The article did not report a funding source for the study. The investigators reported having no conflicts of interest to disclose.

Source: Caan BJ et al. JAMA Oncol. 2018 Apr 5. doi: 10.1001/jamaoncol.2018.0137.


Any detectable high-sensitivity cardiac troponin T (hs-cTn) level is associated with adverse outcomes

Article Type
Changed
Fri, 09/14/2018 - 11:54

Clinical question: What is the association between high-sensitivity cardiac troponin T (hs-cTn) levels and outcomes in patients with chest pain?

Background: There are few data on the link between hs-cTn levels and outcomes in patients with chest pain but no myocardial infarction or any other condition that can cause acute increases in troponin levels.

Study design: Observational cohort study.

Setting: Patients older than 25 years with chest pain presenting to the emergency department at a university hospital in Sweden.


Synopsis: This study included 19,460 patients with chest pain who had at least one hs-cTn level obtained during their ED visit. Compared with patients who had hs-cTn levels less than 5 ng/L, the adjusted hazard ratios for all-cause mortality were 2.00, 2.92, 4.07, 6.77, and 9.68 in patients with hs-cTn levels of 5-9, 10-14, 15-29, 30-49, and 50 or greater ng/L, respectively. The yearly rates of MI were 0.3% and 4.5% in patients with hs-cTn levels less than 5 ng/L and 50 ng/L or greater, respectively. The yearly rates of hospitalization for heart failure were 0.1%, 1%, 2.8%, and 20% in patients with hs-cTn levels less than 5, 5-9, 10-14, and 50 or greater ng/L, respectively. There was a clear, graded association between any detectable hs-cTn level and the risk of cardiovascular and noncardiovascular mortality, MI, and heart failure.

Bottom line: In patients with chest pain and stable troponin levels, any detectable hs-cTn level is associated with an elevated risk of death, MI, and hospitalization for heart failure.

Citation: Roos A et al. Stable high-sensitivity cardiac troponin T levels and outcomes in patients with chest pain. J Am Coll Cardiol. 2017 Oct 31;70(18):2226-36. doi: 10.1016/j.jacc.2017.08.064.

 

Dr. Clarke is assistant professor of medicine in the division of hospital medicine, Emory University, Atlanta.

