LSC phenotypes correlate with prognosis in AML


Crowd at ASCO 2015

© ASCO/Rodney White

CHICAGO—Researchers say they have identified 3 leukemia stem cell (LSC) phenotypes that are correlated with cytogenetic/molecular abnormalities and prognosis in acute myeloid leukemia (AML).

The investigators believe this knowledge could aid risk stratification of AML patients, particularly those without identifiable cytogenetic or molecular risk factors.

The findings may also pave the way for scientists to identify novel therapeutic targets on LSCs and monitor LSCs in response to therapy.

Jonathan Michael Gerber, MD, of Levine Cancer Institute in Charlotte, North Carolina, presented the findings at the 2015 ASCO Annual Meeting (abstract 7000*).

In previous research, Dr Gerber and his colleagues identified 3 distinct LSC phenotypes in AML (a simple decision-rule sketch follows the list):

  • LSCs that are CD34-negative
  • LSCs that are CD34-positive, CD38-negative, and have intermediate levels of aldehyde dehydrogenase (ALDHint)
  • LSCs that are CD34-positive, CD38-negative, and have high levels of ALDH (ALDHhigh).
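
For readers who prefer a decision rule to prose, here is a minimal sketch of the grouping above in Python. The function name and the discrete encoding are ours for illustration; the actual assignments rely on flow-cytometry gating of continuous marker levels, not on the investigators' code.

```python
def classify_lsc_phenotype(cd34_positive: bool,
                           cd38_negative: bool,
                           aldh_level: str) -> str:
    """Assign one of the 3 LSC phenotypes described above.

    aldh_level is 'int' or 'high' (aldehyde dehydrogenase activity).
    This discrete encoding is an illustrative simplification of
    continuous flow-cytometry measurements.
    """
    if not cd34_positive:
        return "CD34-negative"
    if cd38_negative and aldh_level == "int":
        return "CD34+/CD38-/ALDHint"
    if cd38_negative and aldh_level == "high":
        return "CD34+/CD38-/ALDHhigh"
    raise ValueError("profile outside the 3 phenotypes reported here")

# Example: a CD34-positive, CD38-negative sample with high ALDH activity
print(classify_lsc_phenotype(True, True, "high"))  # CD34+/CD38-/ALDHhigh
```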

With the current study, the researchers wanted to determine if these phenotypes correlate with cytogenetic/molecular features and treatment outcomes.

So they analyzed diagnostic samples from 98 patients with newly diagnosed AML who had normal or unfavorable cytogenetics. The patients were enrolled in a phase 2 trial comparing FLAM and 7+3 (Zeidner et al, Haematologica 2015).

Dr Gerber and his colleagues identified 22 patients with CD34- LSCs, 43 with ALDHint LSCs, and 33 with ALDHhigh LSCs.

Risk factors

“We found that leukemia stem cell phenotype indeed correlated quite strongly with cytogenetic and molecular risk factors,” Dr Gerber said.

NPM1 mutations were more common in patients with CD34- LSCs (64%) than those with ALDHint LSCs (14%) or ALDHhigh LSCs (6%, P<0.001). NPM1 mutations were the sole abnormality in 50% of patients with CD34- LSCs.

Poor-risk cytogenetics and/or FLT3-ITD mutations were more common in patients with ALDHhigh LSCs (85%) than those with ALDHint LSCs (35%) or CD34- LSCs (18%, P<0.001).

Nine percent of patients in the CD34- LSC group fell into the European LeukemiaNet poor-risk category, compared to 73% of patients in the ALDHhigh LSC group.

Only 2 patients had 11q23 abnormalities, and both had CD34- LSCs. Fifty-five percent of patients with ALDHhigh LSCs had prior myelodysplastic syndromes or myeloproliferative neoplasms.

Prognosis

“We found that leukemia stem cell phenotype correlated strongly with outcomes as well,” Dr Gerber said. “It turned out that CD34- patients fared more favorably overall.”

Patients with CD34- LSCs had the highest complete response rate (86%), followed by those with ALDHint LSCs (67%) and ALDHhigh LSCs (45%, P<0.01).

Patients in the CD34- group also had a higher rate of event-free survival at 2 years (46%) than patients in the ALDHint group (26%) or the ALDHhigh group (0%). The median event-free survival was 13 months, 11.3 months, and 2.2 months, respectively (P<0.01).

The rate of overall survival at 2 years was best for the CD34- group (76%), followed by the ALDHint group (38%) and the ALDHhigh group (34%). The median overall survival was not reached, 18.7 months, and 9.4 months, respectively (P=0.02).
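
As an aside on methodology: 2-year survival rates like these are typically Kaplan-Meier estimates. Below is a minimal sketch using the Python lifelines package on entirely made-up follow-up data; it illustrates the estimator only and does not reproduce the study's numbers.

```python
from lifelines import KaplanMeierFitter

# Made-up follow-up times (months) and event flags
# (1 = event observed, 0 = censored) for one hypothetical group.
durations = [2, 5, 9, 13, 18, 24, 24, 30, 36, 40]
events    = [1, 1, 1, 1,  1,  0,  1,  0,  0,  0]

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events)

# Estimated probability of surviving beyond 24 months, analogous
# to the 2-year overall survival figures quoted above.
print(kmf.survival_function_at_times(24))
```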

Dr Gerber also noted that ALDHhigh patients fared much better if they underwent hematopoietic stem cell transplant.

“There is 0% leukemia-free survival at the 2-year mark for the ALDHhigh patients who were not transplanted,” he said. “Those that were transplanted fared about the same as everyone else in the series. So it was very striking that there were no chemotherapy survivors in that group.”

In closing, Dr Gerber said this research suggests the 3 LSC phenotypes are mutually exclusive and correlate with cytogenetic and molecular risk factors as well as outcomes in patients with AML.

“This [discovery] may allow for rapid risk stratification in this explosive disease, facilitate enrollment onto induction protocols . . . , and allow us to divert those ALDHhigh, very high-risk patients earlier to novel therapies and/or transplant, given that they’re not really helped much by conventional chemotherapy.”

*Information in the abstract differs from that presented at the meeting.


Biochemist Irwin Rose dies at 88


Irwin “Ernie” Rose, PhD

Photo courtesy of UCI

Biochemist and Nobel laureate Irwin “Ernie” Rose, PhD, has passed away at the age of 88.

Dr Rose and colleagues from Israel won the Nobel Prize in Chemistry in 2004 for their discovery of ubiquitin-mediated protein degradation.

This research has wide-ranging implications for medicine and led to the development of anticancer drugs such as bortezomib, which is approved in the US to treat multiple myeloma and mantle cell lymphoma.

According to his friends and colleagues, Dr Rose was humble, generous, and endlessly curious.

“Ernie was not interested in personal fame and was oblivious to the politics of science,” said Ann Skalka, PhD, of Fox Chase Cancer Center in Philadelphia, Pennsylvania.

“His total satisfaction came from solving intricate biochemical puzzles. Although Ernie was an intellectual leader on the project that ultimately won him the Nobel, he took no personal credit. He was rather surprised at being recognized, but all of us at Fox Chase knew that the Nobel Committee had gotten it right.”

Dr Rose was born in Brooklyn, New York, on July 16, 1926. His scientific ambitions began to take shape after he moved to Spokane, Washington, at 13. While in high school, he spent summers working at a local hospital. And this inspired him to pursue a career that involved “solving medical problems.”

Dr Rose attended Washington State College for his undergraduate work and went on to earn a doctoral degree at the University of Chicago, after a brief stint in the Navy. He spent the better part of his career as a research scientist at the Fox Chase Cancer Center.

There, during the late 1970s and early 1980s, Dr Rose helped reveal how ubiquitin molecules facilitate the breakdown of old and damaged proteins. The discovery of this process fostered a new understanding of the molecular activity present in cancers and other diseases.

For the work, Dr Rose shared the 2004 Nobel Prize in Chemistry with Avram Hershko, MD, PhD, and Aaron Ciechanover, MD, PhD, of the Israel Institute of Technology.

“Ernie had a genius for asking the right questions,” said Jonathan Chernoff, MD, PhD, of Fox Chase Cancer Center.

“In the mid-1950s, when many scientists were interested in how proteins are synthesized, Ernie became fascinated with the opposite issue—how are proteins degraded? With the collaboration of his Israeli colleagues, he cracked that problem with the discovery of the ubiquitin conjugating system.”

After retiring to Laguna Woods, California, in 1997, Dr Rose accepted a special research position with the University of California, Irvine (UCI).

There, he studied the mechanisms of fumarase, an enzyme involved in the citric acid cycle, the cellular pathway by which higher organisms convert food into energy. And he quickly became a beloved colleague and mentor to students and faculty.

“[B]oth prior to and after winning the Nobel Prize, he would help any student or young postdoctoral researcher who was having a hard time with an experiment,” said Ralph Bradshaw, PhD, a former professor at UCI.

“It was a lot of fun working with him,” said James Nowick, PhD, of UCI. “He worked with his own hands, not relying on others, with old instrumentation, and was able to do literally superb science.”

“He was the quintessential scientist—perseverant, soft-spoken, and interested in science for science’s sake,” Dr Chernoff said. “We will miss him very much.”

Dr Rose died in his sleep on June 2 in Deerfield, Massachusetts. He is survived by his wife, Zelda; their sons, Howard, Frederic, and Robert; and 5 grandchildren. Dr Rose’s daughter, Sarah, died in 2005.


Nutrition source could improve platinum-based nanodrugs


Parenteral nutrition source

A parenteral nutrition source can reduce the toxicity and increase the bioavailability of platinum-based anticancer nanodrugs, according to preclinical research published in Scientific Reports.

Many of the side effects of platinum-based drugs occur when they settle in healthy tissue.

To deliver these drugs in a more targeted way, researchers have created nanoscale delivery systems engineered to make the drugs accumulate at tumor sites.

However, tests of these nanodrugs show that between 1% and 10% of the drugs are delivered to the tumor site, with most of the remainder being diverted to the liver and spleen.

“The body’s immune system, especially the liver and spleen, has been one of the biggest stumbling blocks in developing nanoscale chemotherapy drug delivery systems,” said Chien Ho, PhD, of Carnegie Mellon University in Pittsburgh, Pennsylvania.

“When the drugs collect in those organs, they become less available to treat the cancer and can also cause toxicity.”

But Dr Ho and his colleagues have found evidence to suggest that Intralipid, a fat emulsion used as a parenteral nutrition source, can help prevent that.

While developing cellular nanotags to help detect organ rejection, Dr Ho noticed that Intralipid reduced the amount of nanoparticles that were being cleared by the liver and spleen by about 50%. As a result, the nanoparticles remained in the bloodstream for longer periods of time.

So he and his colleagues decided to see if Intralipid had the same effect on platinum-based anticancer nanodrugs.

In the newly published study, the researchers administered a single clinical dose of Intralipid to Sprague Dawley rats. One hour later, they gave both the Intralipid-treated rats and untreated controls a dose of a platinum-based chemotherapy drug incorporated into a nanoparticle.

Twenty-four hours after the drug was administered, rats pretreated with Intralipid had experienced reduced accumulation of the platinum-based drug compared to controls.

Drug accumulation decreased by 20.4% in the liver, 42.5% in the spleen, and 31.2% in the kidney. Consequently, in these organs, the toxic side effects of the nanodrug were significantly decreased compared to controls.

Furthermore, Intralipid pretreatment allowed more of the drug to remain available and active in the body for longer periods of time.

After 5 hours, the drug’s bioavailability was 18.7% higher in Intralipid-treated rats than in controls. After 24 hours, bioavailability was 9.4% higher in Intralipid-treated rats than in controls.

The researchers believe this increased bioavailability will allow more of the drug to reach the tumor site and could perhaps allow clinicians to reduce the dosage needed to treat a patient. The team is now investigating the possibility of bringing this research to a clinical trial.


Letter to the Editor

In reference to “Redesigning an inpatient pediatric service using lean to improve throughput efficiency”

I read Beck et al.'s article, “Redesigning an Inpatient Pediatric Service Using Lean to Improve Throughput Efficiency,” with great interest.[1] Redesigning the rounding process using Lean not only created a standard workflow (seeing dischargeable patients first, holding an interdisciplinary huddle, and completing the discharge checklist at the bedside) but also added a second attending physician, thereby decreasing the workload. Stein et al. demonstrated that restructured floor-based patient care, including unit-based teams, interdisciplinary bedside rounds, unit-level performance reporting, and unit-level nurse and physician coleadership, improved workflow with an average of 12.9 patients per physician.[2] Another study showed that increased workload was associated with prolonged length of stay and recommended a limit of 15 patients per physician per day.[3]

I want to point out the number of patients per physician in these studies. Today's hospitalists in community hospitals are expected to see more than 18 patients per day, under added pressure to decrease costs, readmission rates, lengths of stay, and time to discharge while increasing productivity and patient satisfaction. Michtalik et al.'s survey showed that 40% of hospitalists reported exceeding the number of patients they considered safe; regardless of assistance, physicians reported that they could safely see 15 patients per shift if their effort was 100% clinical.[4] Therefore, despite the outstanding results of the above studies, I am uncertain whether similar interventions would be as successful in community hospitals with higher patient loads.

We need further studies to determine the optimal number of patients per hospitalist in nonteaching community hospitals. Another concern is how to adapt the successful examples of academic centers to nonteaching community hospitals in the absence of interns. Expecting hospitalists to fill the intern role raises concerns about job satisfaction, especially given high burnout rates.

Initiatives to eliminate waste and redesign the rounding process will likely become the norm over the next several years. We need to define center-specific patient-to-hospitalist ratios, with well-defined roles and responsibilities for hospitalists. What works in the presence of residents may not work in nonteaching community hospitals. Caution should be taken when restructuring hospital medicine.

References
  1. Beck MJ, Gosik KBS. Redesigning an inpatient pediatric service using Lean to improve throughput efficiency. J Hosp Med. 2015;10(4):220-227.
  2. Stein J, Payne C, Methvin A, et al. Reorganizing a hospital ward as an accountable care unit. J Hosp Med. 2015;10(1):36-40.
  3. Elliott DJ, Young RS, Brice J, Aguiar R, Kolm P. Effect of hospitalist workload on the quality and efficiency of care. JAMA Intern Med. 2014;174(5):786-793.
  4. Michtalik HJ, Yeh HC, Pronovost PJ, Brotman DJ. Impact of attending physician workload on patient care: a survey of hospitalists. JAMA Intern Med. 2013;173(5):375-377.

Tight glycemic control: Somewhat fewer CV events, same mortality


Tight glycemic control modestly reduced the rate of major cardiovascular events but didn’t improve mortality in an extended follow-up of a clinical trial involving 1,791 veterans with type 2 diabetes, which was published online June 3 in the New England Journal of Medicine.

At the conclusion of the treatment phase of the Veterans Affairs Diabetes Trial in 2008, the primary outcome – the rate of a first major CV event – was nonsignificantly lower with intensive glycemic control than with standard glycemic control. Researchers now report the findings after an additional 7.5 years of follow-up of 92% of the participants in that multicenter, unblinded, randomized controlled trial.

During the treatment phase of the study, median glycated hemoglobin level differed by 1.5 percentage points between patients who received intensive therapy (6.9%) and patients who received standard therapy (8.4%). During follow-up, this difference declined to only 0.2-0.3 percentage points. “Even with the support of a dedicated research team, only approximately half the participants [achieved] a glycated hemoglobin level of less than 7%,” said Dr. Rodney A. Hayward of the VA Center for Clinical Management Research, VA Ann Arbor (Mich.) Healthcare System, and his associates.

During extended follow-up, there were 253 major CV events in the group randomly assigned to intensive therapy and 288 in the group assigned to standard therapy. Tight glycemic control using a multidrug regimen was associated with a significant, though modest, 17% relative reduction in the primary composite outcome of heart attack, stroke, new or worsening congestive heart failure, death from CV causes, or amputation due to ischemic gangrene. This represents 8.6 CV events prevented per 1,000 person-years.
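
To unpack the “8.6 CV events prevented per 1,000 person-years” figure: it is the difference between the two arms’ event rates, each expressed per 1,000 person-years of follow-up. The article reports the event counts but not the person-year totals, so the denominators in this Python sketch are hypothetical values chosen only to make the arithmetic land near the published difference.

```python
events_intensive, events_standard = 253, 288  # reported event counts

# Person-years of follow-up per arm: NOT reported in this article;
# hypothetical denominators used purely to illustrate the calculation.
py_intensive, py_standard = 5_740, 5_470

rate_intensive = events_intensive / py_intensive * 1_000  # ~44 per 1,000 PY
rate_standard = events_standard / py_standard * 1_000     # ~53 per 1,000 PY

print(f"events prevented per 1,000 person-years: "
      f"{rate_standard - rate_intensive:.1f}")  # ~8.6
```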

However, there was no evidence of any reduction in either cardiovascular or all-cause mortality. In addition, treatment effects were no different between patients at high and those at low cardiovascular risk, the investigators said (N. Engl. J. Med. 2015 June 3 [doi:10.1056/NEJMoa1414266]).

“In the absence of a reduction in total mortality, a small to moderate reduction in the rate of CV events needs to be weighed against potential harm due to overly aggressive care and the burden, long-term safety profile, and side effects of treatment, including weight gain and hypoglycemia,” they added.

Vitals

Key clinical point: Tight glycemic control cut the rate of major cardiovascular events by 17% but didn’t improve mortality in patients with type 2 diabetes.

Major finding: Compared with standard glycemic control, tight glycemic control prevented 8.6 CV events per 1,000 person-years.

Data source: Extended follow-up of an unblinded, multicenter, randomized, controlled trial involving 1,791 veterans with type 2 diabetes.

Disclosures: This study was supported by the VA Cooperative Studies Program, the National Institute of Diabetes and Digestive and Kidney Diseases, and the National Institutes of Health. Dr. Hayward reported having no relevant financial disclosures; two of his associates reported ties to Amgen, AstraZeneca, Merck, and Novo Nordisk.

No Advantage to Routine Thrombectomy Prior to Percutaneous Coronary Intervention for STEMI


Clinical question: Does the use of routine thrombectomy for patients presenting with ST-segment elevation myocardial infarction improve outcomes?

Bottom line: For patients with ST-segment elevation myocardial infarction (STEMI) undergoing primary percutaneous coronary intervention (PCI), the routine use of manual thrombectomy improves some electrocardiographic and angiographic outcomes, but ultimately does not result in improved cardiovascular morbidity or mortality. Moreover, thrombectomy may increase the risk of stroke. (LOE = 1b)

Reference: Jolly SS, Cairns JA, Yusuf S, et al, for the TOTAL Investigators. Randomized trial of primary PCI with or without routine manual thrombectomy. N Engl J Med. 2015;372(15):1389–1398.

Study design: Randomized controlled trial (nonblinded)

Funding source: Industry + govt

Allocation: Concealed

Setting: Inpatient (any location) with outpatient follow-up

Synopsis

Manual thrombectomy with aspiration of thrombus prior to PCI is thought to prevent distal embolization and improve microvascular perfusion. Whether this results in clinical benefit is unclear. In this study, the investigators randomized patients presenting with STEMI to undergo either routine thrombus aspiration followed by PCI or PCI alone. Those who had a previous history of coronary-artery bypass grafting or those who had received fibrinolytics were excluded. The 2 groups were balanced at baseline, with almost 80% of patients in each group noted to have a high thrombus burden. A modified intention-to-treat analysis was used that included only those patients who actually underwent PCI for the index STEMI.

Although electrocardiographic and angiographic outcomes improved with thrombectomy (eg, increased ST-segment resolution, decreased distal embolization), no clinical benefit was found. Specifically, for the primary outcome of cardiovascular death, recurrent myocardial infarction, cardiogenic shock, or New York Heart Association class IV heart failure within 180 days of randomization, there were no significant differences detected between the 2 groups. The components of the composite outcome taken individually were also similar in each group. These results persisted across prespecified analyses of the as-treated population, per-protocol population, and the subgroup with high thrombus burden. Additionally, patients in the thrombectomy group were more likely to have a stroke within 30 days and 180 days, although the number of events was relatively small (for 30 days: 0.7% vs 0.3%, P = .02; for 180 days: 1% vs 0.5%, P = .002).
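
For readers curious how a P value like the .02 above arises from two event proportions, here is a minimal sketch using Fisher's exact test from SciPy. The per-arm enrollment is not given in this summary, so the sample sizes below are hypothetical placeholders; the computed P value therefore will not match the published one exactly.

```python
from scipy.stats import fisher_exact

# Hypothetical arm sizes (the trial's actual enrollment is not
# stated in this summary); 0.7% vs 0.3% stroke rates at 30 days.
n_thrombectomy, n_pci_alone = 5000, 5000
strokes_thrombectomy = round(0.007 * n_thrombectomy)  # 35
strokes_pci_alone = round(0.003 * n_pci_alone)        # 15

# 2x2 table: [events, non-events] per arm
table = [[strokes_thrombectomy, n_thrombectomy - strokes_thrombectomy],
         [strokes_pci_alone, n_pci_alone - strokes_pci_alone]]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio {odds_ratio:.2f}, P = {p_value:.4f}")
```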

Dr. Kulkarni is an assistant professor of hospital medicine at Northwestern University in Chicago.

Temporary IVC Filter Added to Anticoagulation Does Not Decrease Pulmonary Embolism Recurrence Risk

Clinical question: Does the insertion of a retrievable inferior vena cava filter in addition to anticoagulation prevent the recurrence of pulmonary embolism in high-risk patients?

Bottom line: For patients with pulmonary embolism (PE) who are at high risk of recurrence or who have poor cardiopulmonary reserve, the addition of a retrievable inferior vena cava (IVC) filter to anticoagulation does not decrease the risk of recurrent PE as compared with anticoagulation alone. Although this study was underpowered to detect a difference if one truly exists, the authors postulate that such a difference would likely be small and thus clinically irrelevant. (LOE = 1b-)

Reference: Mismetti P, Laporte S, Pellerin O, et al, for the PREPIC2 Study Group. Effect of a retrievable inferior vena cava filter plus anticoagulation vs anticoagulation alone on risk of recurrent pulmonary embolism. JAMA. 2015;313(16):1627–1635.

Study design: Randomized controlled trial (nonblinded)

Funding source: Government

Allocation: Concealed

Setting: Inpatient (any location) with outpatient follow-up

Synopsis

The utility of retrievable IVC filters added to anticoagulation for the prevention of recurrent PE is unknown. This study included adults who were hospitalized for acute PE associated with lower extremity venous thrombosis and had at least one additional criterion for severity (older than 75 years, active cancer, chronic cardiopulmonary conditions, recent stroke with leg paralysis, iliocaval or bilateral venous thromboses, or evidence of right ventricular dysfunction or myocardial injury).

The patients were randomized, using concealed allocation, to receive a filter plus anticoagulation or anticoagulation alone. Both groups were anticoagulated for at least 6 months, and filters were retrieved at 3 months. More patients in the filter group had chronic respiratory failure at baseline, but the groups were otherwise well matched. Analysis was by intention to treat.

At 3 months, the rate of recurrent PE did not differ between the 2 groups (3% in filter group vs 1.5% in control group; P = .50; RR with filter 2.00; 95% CI 0.51-7.89). Additionally, there were no differences detected in venous thromboembolism recurrence, major bleeding, or death at either 3 or 6 months. Complications in the filter group included access site hematomas, filter thromboses, and filter retrieval failures. The authors based their analysis on an expected PE recurrence rate of 8% in the control group but the actual rate was much lower. Although this results in an underpowered study, the authors note that the point estimate of the relative risk still favors the control group and if filters did confer a small advantage it would likely not be clinically meaningful.
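
The reported relative risk and its wide confidence interval can be reproduced almost exactly from the published rates. The sketch below assumes roughly 200 patients per arm (not stated in this summary) and uses the standard log-scale interval; it is an illustration, not the trial's own analysis.

```python
# Minimal sketch: relative risk with a log-scale (Katz) 95% CI.
# Event counts (6/200 vs 3/200) are assumptions back-calculated from
# the reported 3% vs 1.5% rates, assuming ~200 patients per arm.
import math

def relative_risk_ci(x1, n1, x2, n2, z=1.96):
    """Relative risk and its log-scale 95% confidence interval."""
    rr = (x1 / n1) / (x2 / n2)
    se_log = math.sqrt(1/x1 - 1/n1 + 1/x2 - 1/n2)
    lo = rr * math.exp(-z * se_log)
    hi = rr * math.exp(+z * se_log)
    return rr, lo, hi

rr, lo, hi = relative_risk_ci(6, 200, 3, 200)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # RR = 2.00 (0.51-7.89)
```

The width of that interval, driven by so few events, is the authors' underpowering caveat made concrete.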

Dr. Kulkarni is an assistant professor of hospital medicine at Northwestern University in Chicago.

Single agent can treat resistant MM

Sagar Lonial, MD

© ASCO/Todd Buchanan

CHICAGO—The anti-CD38 monoclonal antibody daratumumab can be effective as a stand-alone therapy for some heavily pretreated patients with multiple myeloma (MM), results of an ongoing phase 2 trial suggest.

The study, known as SIRIUS or MMY2002, included more than 100 patients who had received 3 or more prior lines of therapy.

Roughly 30% of these subjects responded to daratumumab, with a median response duration of about 7 months.

The median progression-free survival was close to 4 months, and the estimated 1-year overall survival rate was 65%.

Serious adverse events (AEs) occurred in 30% of patients.

“These findings speak to the potential of daratumumab as an effective and tolerable option for people with multiple myeloma who have exhausted other available treatment options,” said study investigator Sagar Lonial, MD, of Emory University School of Medicine in Atlanta, Georgia.

Dr Lonial presented these findings at the 2015 ASCO Annual Meeting (abstract LBA8512). The research was funded by Janssen Research & Development, the company developing daratumumab.

In part 1 of this study, 34 patients were randomized to receive either 8 mg/kg of daratumumab once every 4 weeks or 16 mg/kg once a week for 8 weeks, then once every 2 weeks for 16 weeks and once every 4 weeks after that, until disease progression or unacceptable toxicity.

In part 2, an additional 90 patients were enrolled to receive 16 mg/kg of daratumumab on the same dosing schedule as in part 1.

Dr Lonial reported results for all patients in parts 1 and 2 who received 16 mg/kg of daratumumab. These 106 patients had received a median of 5 prior lines of therapy, including a proteasome inhibitor and an immunomodulatory drug.

According to an independent review committee, 29.2% of patients responded to daratumumab. Eighteen patients had a partial response, 10 had a very good partial response, and 3 had a stringent complete response. The median duration of response was 7.4 months.
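
The headline rate follows directly from that breakdown. A minimal sketch, with a Wilson 95% interval added as our own illustration (the abstract does not report one):

```python
# Minimal sketch: the 29.2% overall response rate is
# (18 PR + 10 VGPR + 3 sCR) / 106 evaluable patients.
# The Wilson interval is our addition for context, not a trial result.
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score interval for a binomial proportion."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

responses = 18 + 10 + 3
n = 106
print(f"ORR = {responses / n:.1%}")      # 29.2%
lo, hi = wilson_ci(responses, n)
print(f"95% CI ~ {lo:.1%} to {hi:.1%}")  # ~21.4% to 38.5%
```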

“It is particularly noteworthy to see this level of response with a single agent in this heavily pretreated population,” Dr Lonial said. “Ninety-seven percent of patients in this study were refractory to their last line of therapy, and 95% were double-refractory to both a [proteasome inhibitor] and an [immunomodulatory drug].”

The median overall survival has not been reached, and the estimated 1-year overall survival rate is 65%. The median progression-free survival was 3.7 months.

After a median follow-up of 9.4 months, 45.2% of responders remain on therapy.

The most common AEs were fatigue (39.6%), anemia (33%), nausea (29.2%), thrombocytopenia (25.5%), neutropenia (22.6%), back pain (22.6%), and cough (20.8%).

Thirty percent of patients experienced serious AEs, and 4.7% discontinued treatment due to AEs, none of which were considered drug-related.

Infusion-related reactions (IRRs) were reported in 42.5% of patients and were predominantly grade 1 or 2 (4.7% grade 3; no grade 4). These occurred mainly during the first infusion.

The most common IRRs included nasal congestion (12%), throat irritation (7%), cough (6%), dyspnea (6%), chills (6%), and vomiting (6%)—all of which were treated with standard of care and slower infusion rates.

Cancer survivors mirror spouses’ moods

Cancer patient receiving chemotherapy

Photo by Rhoda Baer

Cancer survivors’ moods are impacted—both positively and negatively—by their spouses’ moods, according to research published in Cancer Epidemiology, Biomarkers & Prevention.

In the study, cancer survivors whose spouses reported depressed moods were more likely to be depressed after about a year of follow-up, and survivors whose spouses reported better mental and physical health-related quality of life (HRQOL) were less likely to be depressed.

However, survivors’ moods did not have the same impact on their spouses.

“We were surprised that the effects of the spouses on the survivors were so much larger in this study than the effect of the survivors on their spouses,” said study author Kristin Litzelman, PhD, of the National Cancer Institute in Bethesda, Maryland. “We expected to see a more reciprocal relationship.”

Dr Litzelman and her colleagues conducted this research in an attempt to understand how cancer survivors and their families influence one another. The team hoped to identify ways to improve the healthcare both parties receive and thereby improve their health and well-being.

The researchers analyzed data from 910 cancer patients and their spouses, comparing them to 910 couples without any kind of cancer-related health problem.

The team used statistical models to assess how each spouse’s quality of life or depression at one time point was associated with his or her partner’s risk of depression around 11 months later. The researchers took into account a person’s previously reported mood, demographic characteristics, and other factors.

The results showed that, when spouses reported feeling depressed, cancer survivors were about 4 times more likely to report being depressed 11 months later (odds ratio [OR]=4.27). This association was stronger among female cancer survivors (OR=9.49) than male survivors (OR=3.98).

Cancer survivors whose spouses reported better HRQOL had roughly 30% lower odds of depressed mood per 10-point improvement in HRQOL score. The ORs were 0.72 for mental health and 0.68 for physical health. The associations between spousal HRQOL and survivor depressed mood were similar for male and female survivors.
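
To make those odds ratios concrete, the sketch below uses the usual reading of a per-10-point OR, assuming the models are linear on the log-odds scale, so effects compound multiplicatively across larger score changes.

```python
# Minimal sketch: interpreting the per-10-point odds ratios.
# Assumes log-odds-linear models, under which ORs compound
# multiplicatively over larger HRQOL changes.
for label, or_per_10 in [("mental HRQOL", 0.72), ("physical HRQOL", 0.68)]:
    print(f"{label}: {1 - or_per_10:.0%} lower odds of depressed mood per "
          f"10 points; per 20 points, OR = {or_per_10 ** 2:.2f}")
# mental HRQOL: 28% lower odds per 10 points; per 20 points, OR = 0.52
# physical HRQOL: 32% lower odds per 10 points; per 20 points, OR = 0.46
```

Averaging the two per-10-point effects is what yields the "roughly 30%" figure above.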

The researchers noted that cancer survivors’ moods did not have a significant impact on their spouses’ risk of depressed mood 11 months later.

And the team did not see mood associations in couples without any cancer-related health problems.

“This finding certainly needs to be backed up by other studies, but it highlights the importance of family well-being in cancer survivor outcomes,” Dr Litzelman said. “Our research highlights that spouses need to take care of themselves, not just for their own sake, but also for the sake of the cancer survivor.”

“Our findings also suggest that, when caring for cancer survivors, clinicians may want to assess the well-being of spousal caregivers. Future research could test whether including caregivers in the survivorship care plan might help to improve outcomes for both caregivers and for cancer survivors.”

CDS systems often can’t tell if imaging is appropriate

Doctor and patient

Photo courtesy of NIH

Tools that help physicians decide whether to use diagnostic imaging can help reduce the use of unnecessary tests.

But new research suggests these tools may not be able to determine which tests are necessary most of the time.

The tools in question are computerized clinical decision support (CDS) systems, which match a patient’s characteristics against appropriateness criteria to produce algorithmic treatment recommendations.

In a study published in JAMA, CDS systems did increase orders of imaging tests rated as “appropriate.”

However, the systems were not able to assign appropriateness ratings for a majority of tests because no appropriateness criteria were available for a particular test, or because the systems themselves were not able to find matching criteria.

“The increase in orders rated as appropriate is promising, but the number of tests that were not rated indicates there is room for further improvement of these tools,” said study author Peter S. Hussey, PhD, of the RAND Corporation in Boston, Massachusetts.

Study details

Dr Hussey and his colleagues used data from the Medicare Imaging Demonstration to evaluate the relationship of CDS system use with the proportion of imaging orders matched to appropriateness criteria, the appropriateness of ordered images, and the proportion of orders that changed after feedback.

The team compared 2 time periods during which clinicians used computerized radiology order entry systems and CDS systems for MRI, CT, and nuclear medicine procedures.

During a 6-month baseline period, the CDS systems tracked whether orders were linked with appropriateness criteria but did not provide clinicians with feedback on the appropriateness of orders.

During the 18-month intervention period, the CDS systems provided feedback indicating whether the order was linked to appropriateness criteria and, if so, the appropriateness rating, any recommendations for alternative orders, and a link to documentation supporting each rating.

National medical specialty societies developed the appropriateness criteria using expert panels that reviewed evidence and completed a structured rating process. The same appropriateness criteria were loaded into the CDS tools for all participating clinicians.

In all, 3340 clinicians placed 117,348 orders for advanced diagnostic imaging procedures.

Results

The CDS systems could not match most orders to appropriateness criteria. The systems did not identify relevant criteria for 63.3% of orders made during the baseline period and 66.5% of orders made during the intervention period.

Of the orders the CDS systems could rate, 73.7% in the baseline period and 81% in the intervention period were rated appropriate, while 11.1% and 6.4%, respectively, were rated inappropriate.
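
Because roughly two-thirds of orders could not be matched at all, the share of all orders the systems could actually flag was small. A back-of-the-envelope sketch combining the reported intervention-period rates (our arithmetic, not the study's):

```python
# Minimal sketch: only matched orders received a rating, so multiply
# the rating rates by the matched fraction to get shares of ALL orders.
unmatched = 0.665          # intervention-period orders with no matching criteria
matched = 1 - unmatched    # 33.5% of orders
print(f"rated appropriate:   {matched * 0.810:.1%} of all orders")  # ~27.1%
print(f"rated inappropriate: {matched * 0.064:.1%} of all orders")  # ~2.1%
```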

Of the orders that were initially rated as inappropriate, 4.8% were changed, and 1.9% were canceled.

When the CDS systems suggested an alternative for inappropriate orders, 9.9% of the orders were changed, and 0.4% were canceled. When the systems did not provide an alternative, 1.4% of inappropriate orders were changed, and 2.8% were canceled.

“In response to these findings, we recommend that clinical decision support efforts should focus on tools that help clinicians perform their work more efficiently and effectively,” said study author Katherine Kahn, MD, of the University of California, Los Angeles.

“We need a more comprehensive set of evidence-based guidelines that cover a greater proportion of advanced imaging orders for Medicare patients, and provide better methods for communicating feedback to clinicians.”
