Cardiac Risk in Diabetes Often Overestimated

DENVER – Diabetes patients with stable symptoms of coronary artery disease appear to have a lower cardiac event risk than previously thought.

The yearly rate of cardiovascular death or nonfatal MI was just 2.4% in a series of 444 consecutive diabetes outpatients with symptoms suggestive of coronary artery disease (CAD) who underwent exercise treadmill or pharmacologic stress single-photon emission computed tomography (SPECT) myocardial perfusion imaging. The cardiovascular death rate of 0.4% per year and the nonfatal MI rate of 2.0% per year were surprisingly low, given that 39% of subjects had known CAD and the rest had symptoms suggestive of CAD, Dr. Jamieson M. Bourque noted at the annual meeting of the American Society of Nuclear Cardiology.

The explanation may be found at least in part in contemporary evidence-based intensive medical management for risk reduction in this traditionally high-risk population, added Dr. Bourque of the University of Virginia, Charlottesville.

Of the 444 symptomatic diabetes patients, 78.5% had no inducible ischemia on stress SPECT myocardial perfusion imaging, 16.5% had 1%-9% left ventricular ischemia, and 5% had left ventricular ischemia of at least 10%. Again, these are lower rates than would be expected based on historical data taken from the era before aggressive risk factor modification in patients with diabetes and CAD symptoms.

During a median 2.4 years of follow-up, the combined rate of cardiovascular death, nonfatal MI, or revascularization more than 4 weeks after myocardial perfusion imaging was 32% in patients with at least 10% left ventricular ischemia on their presenting SPECT study, 14% in those with 1%-9% ischemia, and 8% in those with no ischemia.

Patients who achieved at least 10 METs (metabolic equivalents) on the treadmill during testing had the best prognosis. The sole event that occurred in this subgroup was a late revascularization.

In all, 60% of the hard cardiac events in this study occurred in patients with no perfusion defects. This points to the need for improved patient selection and risk stratification techniques in diabetes patients, according to Dr. Bourque.

He declared having no financial conflicts.

Article Source

FROM THE ANNUAL MEETING OF THE AMERICAN SOCIETY OF NUCLEAR CARDIOLOGY


Vitals

Major Finding: The annual combined rate of cardiovascular death or nonfatal MI was 2.4% in a prospective series of diabetes patients with stable symptoms suggestive of CAD.

Data Source: A consecutive series of 444 patients followed for a median of 2.4 years.

Disclosures: No conflicts of interest.

Coronary Flow Reserve Enhances Risk Assessment


DENVER – The added prognostic value of measuring coronary flow reserve in addition to myocardial perfusion when predicting the risk of cardiac events is a major emerging theme in cardiac nuclear imaging.

The impetus for developing PET quantitation of coronary flow reserve as a tool for evaluating cardiac event risk in patients with known or suspected CAD lies in the relatively recent recognition that a normal or low-risk conventional single-photon emission computed tomography (SPECT) myocardial perfusion imaging study is no guarantee of a low event risk, Dr. George A. Beller noted at the annual meeting of the American Society of Nuclear Cardiology.

"In order to identify those patients with normal or low-risk perfusion scans who really are high risk, we need to do more than just look at relative perfusion," he explained. "Myocardial flow reserve as assessed by PET is emerging now as a really good adjunct to just looking at relative uptake."

Italian investigators recently utilized dynamic SPECT imaging to assess coronary flow reserve in 58 patients with a normal myocardial perfusion study. In the 20 patients with normal coronary flow reserve as well as normal perfusion, the cardiac event rate was just 0.7% per year. In the 38 with normal perfusion along with an abnormally low coronary flow reserve, however, the event rate was 5.2% per year (J. Nucl. Cardiol. 2011;18:612-9).

"This is a substantial sevenfold increase in event rate in those with what we would consider a normal scan with normal perfusion," commented Dr. Beller of the University of Virginia, Charlottesville.

PET offers several key advantages over SPECT for assessment of coronary flow reserve, the most important being that it provides accurate quantification of the extent of abnormal regional myocardial blood flow reserve. In addition, it readily detects microvascular dysfunction, and it also picks up balanced ischemia, which often results in a false-negative SPECT myocardial perfusion imaging study, he continued.

A recent study by investigators at the University of Ottawa Heart Institute National Cardiac PET Center showed that quantitation of coronary flow reserve using rubidium-82 PET predicted hard cardiac events independently of summed stress scores for myocardial ischemia.

In 704 consecutive patients prospectively followed for a median of 387 days after testing, those with a summed stress score of less than 4, indicative of normal myocardial perfusion, plus normal coronary flow reserve had a 1.3% incidence of cardiac death or MI. In contrast, patients with a summed stress score below 4 and abnormally low coronary flow reserve had a significantly higher 2.0% event rate, while those with a summed stress score of 4 or higher plus abnormal coronary flow reserve had an 11.1% cardiac event rate. Patients with abnormal myocardial perfusion and normal coronary flow reserve had a 1.1% incidence of cardiac events (J. Am. Coll. Cardiol. 2011;58:740-8).

Similarly, investigators at Johns Hopkins University, Baltimore, recently demonstrated that a finding of globally impaired myocardial flow reserve was a potent independent predictor of near-term cardiovascular events.

They used rubidium-82 PET to quantify global myocardial flow reserve in 275 patients referred for perfusion imaging and subsequently followed for an average of 1 year. In an age-adjusted multivariate analysis, a finding of regional perfusion defects was independently associated with a 2.5-fold increased risk of cardiac events, confirming a long-established relationship. In addition, a global myocardial flow reserve below the median value was an independent predictor of cardiac events, with an associated 2.9-fold increased risk (J. Nucl. Med. 2011;52:726-32).

On the basis of these and other compelling studies reported in the last year or two, Dr. Beller offered the following algorithm for noninvasive testing to predict the risk of cardiac events in patients with stable CAD: Those who can perform more than 10 METs (metabolic equivalents) of exercise on the treadmill are at very low risk and warrant being placed on optimal medical therapy, with crossover to an invasive strategy only if symptoms worsen. Patients with less than 5% left ventricular ischemia and normal coronary flow reserve also belong in this low-risk category.

A high-risk test is defined by 10%-15% left ventricular ischemia or markedly reduced coronary flow reserve, even in the presence of only a mild perfusion defect. These are patients for whom consideration should be given to an invasive strategy coupled with optimal medical therapy.

For patients with mild ischemia – that is, 5%-9% – a normal ejection fraction, and either a normal or only a small focal area of diminished coronary flow reserve, optimal medical therapy and follow-up stress imaging is an appropriate approach, Dr. Beller added.
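The stratification Dr. Beller described can be summarized as a simple decision rule. The sketch below is our illustrative encoding of the thresholds above; the function and parameter names are hypothetical, and it is in no way a clinical tool:

```python
def triage(mets=None, lv_ischemia_pct=0.0,
           cfr_normal=True, cfr_markedly_reduced=False, ef_normal=True):
    """Illustrative encoding of the risk stratification described above.

    mets: exercise capacity achieved on the treadmill, if exercise was used
    lv_ischemia_pct: percent of the left ventricle that is ischemic
    cfr_normal / cfr_markedly_reduced: coronary flow reserve findings
    ef_normal: whether the ejection fraction is normal
    """
    if mets is not None and mets > 10:
        return "very low risk: optimal medical therapy"
    if lv_ischemia_pct >= 10 or cfr_markedly_reduced:
        return "high risk: consider invasive strategy plus optimal medical therapy"
    if lv_ischemia_pct < 5 and cfr_normal:
        return "low risk: optimal medical therapy"
    if ef_normal:
        return "mild ischemia: optimal medical therapy, follow-up stress imaging"
    return "indeterminate: individualized assessment"
```

Note how a markedly reduced coronary flow reserve alone is enough to place a patient in the high-risk branch, even when the perfusion defect is only mild.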

He declared having no financial conflicts.

Article Source

EXPERT ANALYSIS FROM THE ANNUAL MEETING OF THE AMERICAN SOCIETY OF NUCLEAR CARDIOLOGY


Nuclear Cardiology Works Toward Reduced Radiation Exposure


DENVER – Nuclear medicine specialists are feeling the brunt of increased public anxiety and regulatory concern regarding patient radiation exposures.

Nuclear cardiologists, in particular, find themselves in the crosshairs as a result of recent evidence of inappropriate use of myocardial perfusion imaging. The profession has responded with a campaign aimed at defining appropriate-use scenarios for practitioners and encouraging adoption of newer techniques that reduce radiation exposure while retaining high image quality.

"Based on these recommendations, we expect that for the population of patients referred for SPECT or PET myocardial perfusion imaging, on average a total radiation exposure of 9 mSv or less can be achieved in 50% of studies by 2014," Dr. Manuel D. Cerqueira said at the annual meeting of the American Society of Nuclear Cardiology.

Meeting that goal will require, for example, doing fewer separate-day stress/rest technetium-99m myocardial perfusion imaging tests, which typically entail 13-16 mSv of radiation exposure. The American Society of Nuclear Cardiology (ASNC) recommendations also urge consideration of stress echocardiography as an alternative to nuclear imaging in younger patients, because the diagnostic accuracy may be comparable and radiation exposure is avoided altogether. In addition, the ASNC report discourages thallium-201–based imaging protocols, which involve 22-31 mSv of radiation exposure, noted Dr. Cerqueira, who was first author of the recommendations (J. Nucl. Cardiol. 2010;17:709-18) and is professor of radiology and medicine at Case Western Reserve University, Cleveland.

Something that has nuclear medicine specialists and interventional radiologists greatly concerned, according to Robert W. Atcher, Ph.D., is a proposed Nuclear Regulatory Commission (NRC) policy change that would lower the occupational radiation exposure limit from 5 rem to 2 rem (roentgen equivalent man).

"The most affected groups in our field are technologists with a heavy PET [positron emission tomography] patient flow, cyclotron engineers, maintenance personnel, and radiochemists synthesizing PET tracers and therapeutic compounds. This is potentially devastating to us, because if we lower the limit then we have to double the number of people who are responsible for doing the same number of imaging studies, with no way to collect any more reimbursement to handle that task. In essence, they’re threatening to devastate our ability to do imaging," said Dr. Atcher, director of the National Isotope Development Center at the U.S. Department of Energy.

The NRC got an earful from concerned physicians and nonphysician scientists at hearings on the proposed changes held last year in Los Angeles, Houston, and Washington. The agency has not yet announced whether it plans to go ahead.

Another instance of what Dr. Atcher characterized as "regulatory overreacting" involves congressional interest in requiring hospitalization for patients who have received iodine-131. He and others have testified that there is no scientific evidence of risk to patients’ families or the general public if current guidelines for I-131 use are followed. Congressional representatives were also told that hospitalizing I-131 recipients would cost in excess of $600 million annually. In addition, critics of the idea pointed out that the risk of acquiring a serious methicillin-resistant Staphylococcus aureus infection during a hospital stay is quite real, said Dr. Atcher, a former president of the Society for Nuclear Medicine.

The new mantra at ASNC is "patient-centered imaging." The group’s recommendations for reducing radiation exposure from myocardial perfusion imaging emphasize appropriate patient selection, the use of standardized imaging protocols, radiotracers with shorter half-lives, weight-based dosing, and improved imaging systems.

Dr. E. Gordon DePuey highlighted the many new methods of optimizing image quality that have reached the market. These include resolution recovery and noise modeling software that provides superior image quality with shortened radiation exposure time. "All vendors now offer software that does this," he pointed out.

Also, hardware enhancements such as cardiofocal collimation are making a big difference. This particular technology allows half-time SPECT (single-photon emission computed tomography) with 100% myocardial radiation count density, explained Dr. DePuey of Columbia University, New York.

"All these new hardware and software methods out there are major advancements in nuclear cardiology. They need to be very seriously considered and incorporated in your practice, because they are really the keys to allowing you to decrease the radiation dose to your patients," he said.

Dr. Cerqueira drew attention to a large study that concluded myocardial perfusion imaging accounts for 22% of the total effective radiation dose accumulated from all nuclear medicine imaging procedures. Abdominal CT was the second biggest contributor, at 18% (N. Engl. J. Med. 2009;361:849-57).

Also concerning was a recent study using the very large UnitedHealthcare patient database. It showed that myocardial perfusion imaging accounted for 80% of the cumulative effective radiation dose from all cardiac imaging procedures in women age 18-34 years (J. Am. Coll. Cardiol. 2010;56:702-11).

"That’s a young population of women of childbearing age where you really wouldn’t expect a great many of these myocardial perfusion imaging studies to be done," Dr. Cerqueira commented.

He noted that the ASNC recommendations urge reserving myocardial perfusion imaging for patients in whom it has the greatest clinical utility: those at intermediate risk of coronary artery disease, patients requiring prognostic or management information, and those with persistent unexplained symptoms.

Dr. DePuey disclosed that he serves as an adviser to UltraSPECT and Dogwood Pharmaceuticals. The other speakers declared having no relevant financial interests.

Meeting/Event
Author and Disclosure Information

Publications
Topics
Legacy Keywords
Nuclear medicine, patient radiation exposures, Nuclear cardiologists, myocardial perfusion imaging, reduce radiation exposure, retaining high image quality, SPECT, PET, myocardial perfusion imaging, Dr. Manuel D. Cerqueira, the American Society of Nuclear Cardiology,

Author and Disclosure Information

Author and Disclosure Information

Meeting/Event
Meeting/Event

DENVER – Nuclear medicine specialists are feeling the brunt of increased public anxiety and regulatory concern regarding patient radiation exposures.

Nuclear cardiologists, in particular, find themselves in the crosshairs as a result of recent evidence of inappropriate overutilization of myocardial perfusion imaging. The profession has responded with a campaign aimed at defining appropriate use scenarios for practitioners and encouraging adoption of newer techniques that reduce radiation exposure while retaining high image quality.

"Based on these recommendations, we expect that for the population of patients referred for SPECT or PET myocardial perfusion imaging, on average a total radiation exposure of 9 mSv or less can be achieved in 50% of studies by 2014," Dr. Manuel D. Cerqueira said at the annual meeting of the American Society of Nuclear Cardiology.

Meeting that goal will require, for example, doing fewer separate-day, stress/rest technetium-99 myocardial perfusion imaging tests, which typically entail 13-16 mSv of radiation exposure. Also, the American Society of Nuclear Cardiology (ASNC) recommendations urge consideration of stress echocardiography as an alternative to nuclear imaging in younger patients because the diagnostic accuracy may be comparable and they can avoid radiation exposure altogether. In addition, the ASNC report discourages thallium-201–based imaging protocols, which involve 22-31 mSv of radiation exposure, noted Dr. Cerqueira, who was first author of the recommendations (J. Nucl. Cardiol. 2010;17:709-18), and is professor of radiology and medicine at Case Western Reserve University, Cleveland.

Something that has nuclear medicine specialists and interventional radiologists greatly concerned, according to Robert W. Atcher, Ph.D., is a proposed Nuclear Regulatory Commission (NRC) policy change that would lower the occupational radiation exposure limit from 5 to 2 rem (Roentgen equivalent man).

"The most affected groups in our field are technologists with a heavy PET [positron emission tomography] patient flow, cyclotron engineers, maintenance personnel, and radiochemists synthesizing PET tracers and therapeutic compounds. This is potentially devastating to us, because if we lower the limit then we have to double the number of people who are responsible for doing the same number of imaging studies, with no way to collect any more reimbursement to handle that task. In essence, they’re threatening to devastate our ability to do imaging," said Dr. Atcher, director of the National Isotope Development Center at the U.S. Department of Energy.

The NRC got an earful from concerned physicians and nonphysician scientists at hearings on the proposed changes held last year in Los Angeles, Houston, and Washington. The agency has not yet announced whether it plans to go ahead.

Another instance of what Dr. Atcher characterized as "regulatory overreacting" involves congressional interest in requiring hospitalization for patients who have received iodine-131. He and others have testified that there is no scientific evidence of risk to patients’ families or the general public if current guidelines for I-131 use are followed. Congressional representatives were also told that hospitalizing I-131 recipients would cost in excess of $600 million annually. In addition, critics of the idea pointed out that the risk of acquiring a serious methicillin-resistant Staphylococcus aureus infection during a hospital stay is quite real, said Dr. Atcher, a former president of the Society of Nuclear Medicine.

The new mantra at ASNC is "patient-centered imaging." The group’s recommendations for reducing radiation exposure from myocardial perfusion imaging emphasize appropriate patient selection, the use of standardized imaging protocols, radiotracers with shorter half-lives, weight-based dosing, and improved imaging systems.

Dr. E. Gordon DePuey highlighted the many new methods of optimizing image quality that have reached the market. These include resolution recovery and noise modeling software that provides superior image quality with shortened radiation exposure time. "All vendors now offer software that does this," he pointed out.

Also, hardware enhancements such as cardiofocal collimation are making a big difference. This particular technology allows half-time SPECT (single-photon emission computed tomography) with 100% myocardial radiation count density, explained Dr. DePuey of Columbia University, New York.

"All these new hardware and software methods out there are major advancements in nuclear cardiology. They need to be very seriously considered and incorporated in your practice, because they are really the keys to allowing you to decrease the radiation dose to your patients," he said.

Dr. Cerqueira drew attention to a large study that concluded myocardial perfusion imaging accounts for 22% of the total cumulative effective radiation dose from all medical imaging procedures; abdominal CT was the second biggest contributor, at 18% (N. Engl. J. Med. 2009;361:849-57).

Also concerning was a recent study using the very large UnitedHealthcare patient database. It showed that myocardial perfusion imaging accounted for 80% of the cumulative effective radiation dose from all cardiac imaging procedures in women age 18-34 years (J. Am. Coll. Cardiol. 2010;56:702-11).

"That’s a young population of women of childbearing age where you really wouldn’t expect a great many of these myocardial perfusion imaging studies to be done," Dr. Cerqueira commented.

He noted that the ASNC recommendations urge reserving myocardial perfusion imaging for patients in whom it has the greatest clinical utility: those at intermediate risk of coronary artery disease, patients requiring prognostic or management information, and those with persistent unexplained symptoms.

Dr. DePuey disclosed that he serves as an adviser to UltraSPECT and Dogwood Pharmaceuticals. The other speakers declared having no relevant financial interests.

Display Headline
Nuclear Cardiology Works Toward Reduced Radiation Exposure

FROM THE ANNUAL MEETING OF THE AMERICAN SOCIETY OF NUCLEAR CARDIOLOGY

B Vitamins Fail to Cut Fracture Risk in Women

Article Type
Changed
Fri, 01/18/2019 - 11:27
Display Headline
B Vitamins Fail to Cut Fracture Risk in Women

SAN DIEGO – Combined daily supplementation with folic acid and vitamins B6 and B12 proved to be a bust for reduction of nonvertebral fracture risk in a large, randomized, double-blind clinical trial conducted in women with known cardiovascular disease or multiple risk factors.

A secondary analysis of fracture outcomes in 5,442 female health professionals over age 42 years who participated in the Women’s Antioxidant and Folic Acid Cardiovascular Study (WAFACS) showed an overall nonvertebral fracture incidence of 7.6% during an average 7.3 years of treatment and follow-up in the supplementation group and a similar 6.9% rate in placebo-treated controls, Dr. Douglas C. Bauer reported at the annual meeting of the American Society for Bone and Mineral Research.

Dr. Douglas C. Bauer

Participants in WAFACS had known cardiovascular disease or three or more cardiovascular risk factors. The previously reported primary study end points were cardiovascular event rates and all-cause mortality, where supplements had no effect (JAMA 2008;299:2027-36).

The new retrospective secondary analysis of WAFACS was undertaken because some prior observational studies had concluded that elevated homocysteine and low vitamin B12 levels are associated with increased fracture risk. The supplement regimen utilized in the trial was designed to lower homocysteine and boost vitamin B12 levels. It consisted of daily folic acid at 2.5 mg, vitamin B6 at 50 mg, and vitamin B12 at 1 mg.

In all, 80% of participants reported greater than 66% compliance with therapy. Even among these highly compliant participants, rates of hip, wrist, and total nonspine fractures were similar in the supplement and placebo groups, noted Dr. Bauer of the University of California, San Francisco.

WAFACS was funded by the National Heart, Lung, and Blood Institute. Dr. Bauer reported having no financial conflicts.


FROM THE ANNUAL MEETING OF THE AMERICAN SOCIETY FOR BONE AND MINERAL RESEARCH

Vitals

Major Finding: During an average 7.3 years of treatment and follow-up, the overall nonvertebral fracture incidence was 7.6% in the supplementation group, compared with 6.9% in placebo-treated controls.

Data Source: Secondary analysis of the 5,442-subject randomized, double-blind, placebo-controlled Women’s Antioxidant and Folic Acid Cardiovascular Study.

Disclosures: WAFACS was funded by the National Heart, Lung, and Blood Institute. Dr. Bauer reported having no financial conflicts.

In Coronary Artery Bypass, BIMA May Be Best

Article Type
Changed
Tue, 12/13/2016 - 12:08
Display Headline
In Coronary Artery Bypass, BIMA May Be Best

COLORADO SPRINGS – Using bilateral internal mammary artery (BIMA) grafts provided a significant long-term survival advantage over single internal mammary artery (SIMA) grafts in coronary artery bypass surgery patients with normal or moderately impaired left ventricular function, according to a large retrospective study with lengthy follow-up. When the preoperative left ventricular ejection fraction (EF) was less than 30%, however, survival did not differ between the two procedures.

"BIMA grafting is the operation of choice in patients with a life expectancy beyond 1-2 decades," Dr. David Galbut said at the annual meeting of the Western Thoracic Surgical Association. He reported on 4,537 consecutive patients who had CABG with internal mammary artery grafting during 1972-1994 at three Florida hospitals. BIMA grafts were performed in 48% of the patients, an exceptionally high BIMA rate. In contrast, the Society of Thoracic Surgeons database shows that 4% of patients undergoing CABG nationally receive BIMA grafts. The reason for the 12-fold higher BIMA rate in the Florida study is that Dr. Galbut and colleagues had a decades-long conviction that BIMA has major clinical advantages.

In the Florida study, 233 patients had an EF below 30%, another 1,256 had an EF of 30%-50%, and 3,048 had a normal EF. In the low EF group, 87 BIMA patients were matched to an equal number of SIMA patients on the basis of 14 preoperative variables. In like manner, propensity scores were used to match 448 BIMA patients in the moderately impaired EF group and 1,137 BIMA patients with a normal EF to similar SIMA patients.

Many surgeons are reluctant to use BIMA grafting out of concern that it will result in increased complications. This wasn’t the case in the Florida series: operative morbidity – including sternal wound infection rates – was similar with BIMA and SIMA, according to Dr. Galbut of the Aventura (Fla.) Medical Center.

Dr. David Galbut

The 20-year survival rate in BIMA patients with moderately impaired EF was 33.1%, significantly better than the 19% survival in matched SIMA patients. In the normal EF group, the 20-year survival rate was 38.1% with BIMA and 35.8% with SIMA.

The general strategy the surgeons followed in BIMA grafting was to run the left internal mammary artery (LIMA) graft to the left anterior descending coronary artery. The LIMA is the dominant vessel in most patients and would therefore be the most durable conduit. The right internal mammary artery graft was placed wherever it fit best.

Discussant Dr. Anthony P. Furnary observed that retrospective studies can’t prove causality, even when they’re large, painstakingly performed, and feature more than 2 decades of follow-up. Limitations in propensity score matching may account for much or all of the long-term survival advantage observed with BIMA grafting. Although patients were matched on 14 preoperative variables, the year of surgery wasn’t among them.

The 22-year study period saw the introduction of many myocardial protection techniques. If more SIMA patients were operated on in earlier years, they might well have missed out on these therapies, said Dr. Furnary of the Providence Heart and Vascular Institute, Portland, Ore.

Dr. Galbut had no disclosures.



Vitals

Major Finding: The 20-year survival rate in BIMA patients in the moderately impaired EF group was 33.1%, significantly better than the 19% in matched SIMA patients. In the normal EF group, the 20-year survival rate was 38.1% with BIMA and 35.8% with SIMA.

Data Source: A retrospective study of 4,537 consecutive patients who underwent CABG with internal mammary artery grafting during 1972-1994 at three Florida hospitals.

Disclosures: Dr. Galbut declared having no financial conflicts.

Resection Works Well for High-Risk NSCLC Patients

Article Type
Changed
Tue, 12/13/2016 - 12:08
Display Headline
Resection Works Well for High-Risk NSCLC Patients

COLORADO SPRINGS – Patients with stage 1a non–small cell lung cancer deemed medically inoperable or high risk can undergo surgical resection safely and with excellent results, judging by a single-center retrospective study.

Indeed, their perioperative morbidity and mortality and 5-year recurrence-free survival rates were similar to those of low-risk patients undergoing tumor resection, Dr. Andrea S. Wolf said at the annual meeting of the Western Thoracic Surgical Association.

"These outcomes in high-risk patients provide the standard to which nonoperative therapies should be compared," said Dr. Wolf of Brigham and Women’s Hospital, Boston.

Resection Works Well for High-Risk NSCLC Patients

COLORADO SPRINGS – Patients with stage 1a non–small cell lung cancer deemed medically inoperable or high risk can undergo surgical resection safely and with excellent results, judging by results of a single-center retrospective study.

Indeed, their perioperative morbidity and mortality and 5-year recurrence-free survival rates were similar to those of low-risk patients undergoing tumor resection, Dr. Andrea S. Wolf said at the annual meeting of the Western Thoracic Surgical Association.

"These outcomes in high-risk patients provide the standard to which nonoperative therapies should be compared," said Dr. Wolf of Brigham and Women’s Hospital, Boston.

Of the 170,000 new cases of non–small cell lung cancer (NSCLC) diagnosed annually in the United States, 80% are deemed inoperable because of extensive malignancy or severe comorbidities, including chronic obstructive pulmonary disease, which affects roughly half of patients with NSCLC. Stereotactic body radiation therapy (SBRT), which uses advanced imaging techniques to deliver a targeted radiation dose to a tumor, is making substantial inroads in these inoperable patients. But Dr. Wolf’s study suggests that surgery is feasible for many patients considered high risk for pulmonary resection.

She reviewed the records of 66 patients with stage 1a disease considered high risk and 158 low-risk controls, all of whom underwent surgical resection at Brigham and Women’s Hospital during 1997-2006. None had pure bronchoalveolar carcinoma. Patients were deemed high risk if they were aged 80 years or older, or if their forced expiratory volume in 1 second (FEV1) was 50% or less of the predicted amount. Forty percent of the high-risk patients met the age criterion, 60% met the diminished pulmonary function standard, and 5% fulfilled both. Pathologic findings were similar in the high- and low-risk groups, with a median tumor size of 1.5 cm.

With a median 6 years of follow-up, the local recurrence rate was 18% in the high-risk population and 16% in the low-risk cohort. The distant recurrence rate was 15% in both groups.

The 5-year overall survival rate was 54% in the high-risk group and significantly better at 68% in the low-risk group (P = .04). However, there was no significant difference in 5-year recurrence-free survival: 73% and 77% in the high- and low-risk groups, respectively.

Perioperative mortality occurred in 2% of low-risk patients and none of the high-risk patients. The perioperative major morbidity rate was 14% in the high-risk group and 8% in the low-risk group. Similarly, there were no significant between-group differences in the rates of any individual major complications, which included MI, pulmonary embolus, and reoperation for bleeding.

"Your rationale is right on target," discussant Dr. Joseph B. Shrager told Dr. Wolf. "It’s highly important in this era of SBRT to document the excellent results we can get with surgery in very-high-risk patients. And zero deaths, which is what you showed here and was equivalent to the experience with low-risk patients, is certainly admirable."

That being said, he added that he was disappointed with the Boston surgeons’ low use of anatomic resection in the high-risk group. Only 18% underwent lobectomy and another 6% received segmentectomy, while 76% had a wedge resection.

"There were less than one-tenth as many segmentectomies as wedges in the high-risk patients. So, really, what you’ve shown is that a lesser operation – or you might even say our least-good operation – can safely be done in high-risk patients. The question now is, is that lesser operation better than SBRT? Because if it’s not, then SBRT will probably win that argument. I have to say, I think we have a better chance of winning out over SBRT with surgery if we’re comparing it to segmentectomy than if we’re comparing it to wedge," said Dr. Shrager, professor and chief of the division of thoracic surgery at Stanford (Calif.) University.

Dr. Wolf replied that, like Dr. Shrager, she and her coinvestigators were "surprised" at the high rate of wedge resection because thoracic surgeons at Brigham and Women’s Hospital tend to promote anatomic resection whenever possible. She suspects some of the wedges were performed in an effort to spare parenchyma when a tumor bordered segmental boundaries.

Dr. Shrager also took the Boston surgeons to task for the fact that only 38% of high-risk patients in the series underwent lymph node sampling.

"Short of a proven survival advantage for surgery over SBRT, which we don’t have yet, all we can say is at least we’re providing better lymph node staging. So why not more lymph node sampling?" Dr. Shrager asked.

Dr. Wolf said this, too, came as a surprise to her and her colleagues. The most likely explanation is that in many instances patients and surgeons sought smaller, quicker operations in an effort to spare the patient. But given the compelling evidence showing that lymph node sampling is critical for accurate staging and for determining the need for adjunctive therapy, that’s not an adequate excuse.
"Going forward, we’re very interested in making sure nodes are sampled, even with wedge resections," she said.

Dr. Wolf declared no conflicts.

Vitals

Major Finding: High-risk surgical candidates with stage 1a non–small-cell lung cancer underwent pulmonary resection with zero perioperative mortality, little major morbidity, and a 5-year recurrence-free survival rate similar to that of low-risk patients.

Data Source: Retrospective analysis of the Brigham and Women’s Hospital experience.

Disclosures: Dr. Wolf declared having no financial conflicts.

Paraesophageal Hernia Repair Boosts Lung Function


COLORADO SPRINGS – Improvements in pulmonary function tests and subjective complaints of breathlessness appear to be underappreciated benefits of the surgical repair of giant paraesophageal hernias.

Symptom assessment of these patients has generally focused on reflux and dysphagia, but these hernias also adversely affect pulmonary function. Repair most benefits patients who are older, have bigger hernias, and have worse baseline pulmonary function, said Dr. Philip W. Carrott Jr. of Virginia Mason Medical Center, Seattle.

"Patients with giant paraesophageal hernia and coexistent dyspnea or positional breathlessness should be reviewed by an experienced surgeon for elective repair, even when pulmonary comorbidities exist," Dr. Carrott concluded at the annual meeting of the Western Thoracic Surgical Association.

He based this advice on a single-center, retrospective, cohort study involving 120 patients who had pulmonary function tests preoperatively and again at a median of 106 days after surgery.

The overall group averaged 10% increases over baseline (P less than .001) in forced vital capacity (FVC), forced expiratory volume in 1 second (FEV1), vital capacity, and volume-adjusted mid-expiratory flow (IsoFEF25-75), as well as a 2.9% increase in the diffusing capacity of the lung (DLCO).

The larger a patient’s hernia, as expressed by the percentage of intrathoracic stomach (ITS) on preoperative contrast studies, the greater the improvement in pulmonary function tests after surgery. Hernia size was the strongest predictor of improvement. For example, FVC improved by an average of 4.7% of the reference value in patients with the smallest hernias (ITS less than 50%), compared with a 6.0% gain in patients with a preoperative ITS of 50%-74%, a 9.1% improvement in those with an ITS of 75%-99%, and a 14.9% gain in patients with 100% ITS.

The postoperative improvement in lung function increased with each decade of patient age.

Patients with the worst preoperative lung function tended to have the biggest hernias – and the greatest objective and subjective improvements after surgery. For example, 36% of subjects had a reduced baseline FEV1, defined as no more than 75% of the reference value. Their vital capacity improved by 0.45 L, compared with 0.23 L in patients without a reduced baseline FEV1. And their DLCO improved by 1.23 mL CO/min/mm Hg, compared with just 0.23 in patients whose baseline FEV1 was more than 75% of the reference value.

Of 63 patients who reported preoperative dyspnea, 47 (75%) noted subjective improvement in their respiratory function after hernia repair. Intriguingly, so did 30 of 57 patients (53%) not complaining of dyspnea at baseline.

Dr. Carrott and his fellow researchers postulate that restoring efficient diaphragmatic function is just part of the explanation. "The stomach probably has a paradoxical motion during respiration, such that the abdominal positive pressure is pushing against the negative effect of the lungs and chest wall," he said.

Study participants averaged 74 years of age, with a median of four preoperative symptoms. The most common were heartburn in 59%, early satiety in 54%, dyspnea in 52%, dysphagia in 47%, chest pain in 40%, and regurgitation in 39%.

Major comorbidities included pulmonary disease in 29% of subjects, heart disease in 35%, and obesity in 39%. An open Hill repair with no hiatal reinforcement was performed in 99% of patients, and 97% of the operations were elective.

Despite the substantial prevalence of comorbid conditions, there was no operative mortality. The mean length of stay was 4 days. One-third of patients had complications, including six cases of arrhythmia, four instances of nausea delaying discharge, three cases of pneumonia, and two cases each of ileus, wound infection, and delirium.

Discussant Dr. Sean C. Grondin observed that paraesophageal hernia is a relatively common disease in the practice of most thoracic surgeons. And although the study provides some support for the notion that surgical repair may improve pulmonary function, its retrospective nature and only moderate size render it less than fully convincing.

"I think it still falls a little short just yet of providing conclusive evidence. At this time I would caution surgeons from telling patients that they’ll get a definitive improvement in their dyspnea after paraesophageal hernia repair, although it’s certainly a possibility," said Dr. Grondin of the University of Calgary (Alta.).

Dr. Ross M. Bremner of St. Joseph’s Hospital and Medical Center, Phoenix, commented: "I’ve long been telling my patients who have their entire stomach in the chest that they’re likely to get some improvement in their pulmonary function from repair. Now at least I have some data to show them they can get at least 10%-15% improvement."

Dr. Carrott reported that he had no disclosures.


Vitals

Major Finding: Forced vital capacity improved by an average of 4.7%, compared with reference values in patients with the smallest paraesophageal hernias, those with a preoperative intrathoracic stomach (ITS) of less than 50%. FVC improved 6.0% in patients with a 50%-74% ITS, 9.1% in those with 75%-99% ITS, and 15% in patients with 100% ITS.

Data Source: A single-center, retrospective, cohort study involving 120 patients who underwent repair of paraesophageal hernia and had pulmonary function measured preoperatively and again a median of 106 days post-surgery.

Disclosures: No financial conflicts of interest.

Experts: Medical Dermatology Is Losing Ground

Procedural Dermatology Is "Easy Target"

By 2020, the practice of dermatology will likely be narrower in scope, be less medically oriented, and place greater emphasis on dermatologic surgery and cosmetic interventions, according to two medical dermatologists.

These trends will apply worldwide but will be most pronounced in the United States, where they are already far more advanced. The dermatologists also predicted that European dermatologists will continue to practice a more traditional style of dermatology that includes inpatient care for severely ill patients.


"Cosmetic and procedural dermatology will no doubt continue to grow. You can see it growing at virtually every meeting you go to in the U.S. Why? There’s immediate gratification, and it’s the path of least resistance. You’re basically on your own, in business for yourself," said Dr. Stephen I. Katz, a dermatologist and director of the National Institute of Arthritis and Musculoskeletal and Skin Diseases.

What he finds particularly worrisome, he said, is that most cosmetic and procedural dermatology is not science based. These fields are thriving at a time of exciting advances in understanding the biologic basis of skin diseases, with the discoveries having far-reaching implications for both diagnosis and treatment. Yet aesthetic dermatology and dermatologic surgery are not pulling their fair share of the load, he said.

"With few exceptions, cosmetic and procedural dermatology is not accompanied by a research base. It’s absolutely ridiculous, for example, that we don’t have a robust research interest in wound healing," said Dr. Katz.

In an interview, Dr. Joel Schlessinger, a dermatologist in private practice in Omaha, Neb., said that although he too laments the lack of basic science research in cosmetic dermatology, he believes there is room for hope.

"Recently, over 100 residents competed for selective positions as adjunct faculty at the Third Annual Cosmetic Surgery Forum by submitting research on cosmetic dermatology," said Dr. Schlessinger.

And although several university programs have undertaken research on collagen fillers and collagen formation, "until the federal government realizes the importance that cosmetic dermatology [plays] in patients' lives, no significant basic science research will be funded at the university level. It is understandable, but not practical, for all research dollars to be in life-and-death disease states. Hopefully, the paradigm will change among strategic decisions at the government and university [levels], thereby allowing more serious research in cosmetic dermatology," he added.

"Somehow, I think, we've lost our way," said Dr. Katz. "Our specialty has already lost STDs, connective tissue disease, psoriasis, eczema, melanoma, wound healing – they’re already gone. Yes, there are psoriasis centers, but rheumatology is taking over much of psoriasis because rheumatologists are more comfortable with the more complicated biologic therapies."

The Marginalization of Academic Dermatology in the U.S.

He said that he had hoped academic dermatology would play a leadership role in keeping the specialty on track as a broad-based discipline that encompasses a wide range of subspecialty interests, but in this he has been disappointed. Academic dermatology, in his view, has undergone marginalization and trivialization.

"In the U.S., many dermatology programs are struggling to continue an academic focus. They’re basically no more than practices that happen to be located at a university, trying to meet their financial overhead with very little in the way of an academic dimension. Still, there are maybe 10-15 programs in the U.S. that have continued to focus on generating new knowledge in the university setting, often with PhD scientists leading the way," said Dr. Katz.

The National Institutes of Health now awards far more grant money for research into skin biology and skin disease to PhD scientists than to either MD or MD/PhD scientists. "The PhD scientists are so successful because they’re not encumbered by seeing patients," Dr. Katz said during a panel discussion at the World Congress of Dermatology in Seoul, South Korea.

His fellow panelist, Dr. Georg Stingl, reported that a substantial reduction in the number of hospital beds reserved for dermatology is ongoing throughout Europe, but those beds "have not yet disappeared."

"I predict the scope of dermatology in continental Europe 10 years from now will remain broader than that in the U.S., [the United Kingdom], and Asia, but it will be subject to territorial battles with other disciplines. Yet if our discipline is shrinking, it’s not the fault of others. It’s our own fault. It’s because of what we are doing," said Dr. Stingl, professor of dermatology and chairman of the division of immunology, allergy, and infectious diseases at the Medical University of Vienna.

Traditional fields of European dermatology that are now at risk of being lost to other specialties include venereology, dermato-oncology, type 1 allergy, and dermatopathology. Phlebology has already been largely taken over by vascular surgeons. On the other hand, European dermatology will see expansion of genodermatology, aesthetic dermatology, and dermatologic surgery, he added.

Dermatologic Role Changing in Japan

In Japan, as elsewhere throughout the world, the traditional dermatologic role as caregiver for severely ill patients is being taken over by other specialties.

"Traditional dermatologists are an endangered species," said Dr. Masayuki Amagai, professor of dermatology at Keio University, Tokyo. For example, Japanese dermatologists traditionally have cared for melanoma patients with terminal disease as well as for those who have early-stage disease. Now, however, more patients with advanced melanoma are being seen in integrated oncology centers.

Following the same model, it is likely that the near future will bring a new sort of integrated immunologic disease center for patients with Crohn’s disease, rheumatoid arthritis, psoriasis, and other conditions that share common inflammatory mechanisms. Dermatologists who hunger to take on the most interesting and challenging cases will want to become a part of such centers, where they will work alongside rheumatologists and gastroenterologists, Dr. Amagai said.

The good news in the United States, according to Dr. Katz, is that the recent skin biology discoveries will eventually translate into major clinical advances. And dermatology continues to attract the best and brightest medical school graduates, he said. In 2004-2007, 5.8% of U.S. dermatology residency positions were held by MD/PhDs, a rate nearly threefold greater than the average for other residency programs.

His wish list for the dermatology specialty includes a better-organized research agenda, including research programs in the cosmetic and procedural aspects of the specialty. He would also like to see the development of a clinical research consortium in dermatology, similar to the way pediatric research is conducted. "Dermatology departments are just too small to not work together," said Dr. Katz.

More effort should be placed on educating dermatologists about health services research, comparative effectiveness studies, and clinical outcomes research.

"These are areas we're not very good at. We need to be part of that whole scenario because reimbursement is going to be based on these types of studies," he said.

The speakers declared having no financial conflicts.

Procedural Dermatology Is "Easy Target"

Dr. Katz is correct that procedural dermatology will continue to grow, as the baby boomers age, develop more cancers, and want to maintain their youthful appearance. The "immediate gratification" comment is true of any surgical subspecialty in which a physician can help a patient by performing a procedure, which generally results in a beneficial event, compared with the medical treatment of a chronic disease that may ebb and flow for decades. Either way, we all have the same objective in mind: helping patients.

Procedural and cosmetic dermatology are easy targets, but I think that in some ways, they’re straw men. Calling a spade a spade, a lot of people go into dermatology because of the lifestyle. General dermatologists work an average of 37.5 hours a week, with women physicians working fewer hours than do men. And your "average" dermatologist spends only about 10% of his or her time doing cosmetic procedures.

In addition, it’s hard to get federal funding for dermatology because the ubiquitous diseases (like acne or psoriasis) aren’t really that dangerous, and the dangerous diseases aren’t that ubiquitous.

The concept that most cosmetic and procedural dermatology is not science based is interesting, and I have two responses. Many discoveries in medicine – such as the discovery of penicillin – are not actually related to an understanding of the biological basis of disease, but rather are serendipitous and based on careful observation of various phenomena.

The extraordinary work of Dr. Jeffrey A. Klein (a dermatologic surgeon who practices in San Juan Capistrano, Calif.) delineating the benefits of tumescent anesthesia – not just for liposuction, but for a wide range of surgical procedures across many specialties – was painstaking and groundbreaking. His self-funded research has been a significant advance in modern therapeutics.

Companies like Allergan have poured tens of millions of dollars into the development of drugs that can change people’s lives. Botox might be a dirty word to some, but for a hundred medical conditions, it can be a game changer.

In the real world outside the NIH, clinicians who desire to perform research have to do so after their clinics are finished, on weekends, or on borrowed time. How many chairs of departments of dermatology allow their procedural dermatologists protected time to follow their research interests?

I don’t believe that procedural dermatology is the root cause of the degradation of our specialty or of the loss of venereal, connective tissue, and other diseases. These were being lost long before procedural dermatology saw the light of day. I would say that those who have helped develop the skin surgical specialties should be acknowledged as the saviors of the specialty at large.

Christopher Zachary, M.D., is professor and chair of the department of dermatology at the University of California, Irvine. He has received support and honoraria from Merz, Allergan, and Medicis. His comments are based on an interview with this news organization.


Display Headline
Experts: Medical Dermatology Is Losing Ground

Denosumab Fares Well in Extension Trials

Article Type
Changed
Fri, 01/18/2019 - 11:26
Display Headline
Denosumab Fares Well in Extension Trials

SAN DIEGO – Denosumab produced progressive increases in bone mineral density at key skeletal sites, along with a sustained reduction in biomarkers of bone turnover, through 8 years in a long-term extension of a clinical trial.

These 8-year continuous treatment data are both highly reassuring and clinically relevant, Dr. Michael McClung observed at the annual meeting of the American Society for Bone and Mineral Research.

Dr. Michael McClung

"Because the effects of denosumab on inhibiting bone remodeling are actually quite quickly reversible upon discontinuing therapy, long-term treatment with denosumab is anticipated to be important. Furthermore, denosumab, like other antiresorptive agents, is not the cure for osteoporosis and so maintaining bone strength with long-term therapy becomes necessary," according to Dr. McClung, founding director of the Oregon Osteoporosis Center, Portland.

Other investigators presented 5-year denosumab (Prolia) safety and efficacy updates from the ongoing extension of the landmark 7,868-patient phase III Fracture Reduction Evaluation of Denosumab in Osteoporosis Every 6 Months (FREEDOM) trial. The original 3-year results of FREEDOM were instrumental in winning marketing approval of the novel RANK ligand inhibitor for the treatment of postmenopausal women with osteoporosis at high risk for fracture. The FREEDOM trial is being extended to 10 years of denosumab therapy in the original treatment arm and 7 years in placebo-treated patients who crossed over to open-label active therapy.

Dr. McClung reported on the 200 postmenopausal women with osteoporosis or low bone mass who completed a 4-year phase 2 study of denosumab by subcutaneous injection of 60 mg once every 6 months and then enrolled in a 4-year extension study.

During the first 4 years on denosumab, the women showed an average 8% gain in bone mineral density at the lumbar spine and a 5% gain at the total hip. By 8 years, these gains had increased to 17% at the lumbar spine compared with baseline and 7% at the total hip. Reductions in the bone turnover markers serum C-telopeptide of type I collagen and bone-specific alkaline phosphatase remained stable throughout the 8 years of treatment, he added.

Also at the meeting, Dr. Steven R. Cummings presented an analysis of years 4 and 5 of the extension of the FREEDOM trial in which he concluded that denosumab significantly reduced the estimated risk of new vertebral fractures by 36% and nonvertebral fractures by 51% compared with placebo.

What’s unusual about his analysis is that there is actually no placebo arm in the extension phase of the trial. For purposes of comparison, he employed a novel "virtual twin" simulation method he and his colleagues created and have subsequently validated (Stat. Med. 2010;29:1127-36). The statistical tool uses data from the initial placebo-controlled phase of the trial to project expected outcomes for a cohort of "virtual twins" had they remained on placebo during the study extension.

Based on the virtual twin method, the projected yearly incidence of nonvertebral fractures in years 4 and 5 was 2.6% in the virtual twins on placebo, compared with 1.3% in the real denosumab-treated group. The projected incidence of new vertebral fractures in the extension phase was 2.2% per year in the virtual twins, compared with 1.4% in women on denosumab, according to Dr. Cummings, professor of medicine, epidemiology, and biostatistics at the University of California, San Francisco.

"Long-term treatment with denosumab is anticipated to be important."

He was first author of the landmark core 3-year FREEDOM trial, in which denosumab reduced the risk of new nonvertebral, vertebral, and hip fractures by 20%, 68%, and 40%, respectively, compared with placebo (N. Engl. J. Med. 2009;361:756-65).

Dr. Henry G. Bone presented an updated safety analysis of denosumab through 5 years of the extended FREEDOM trial.

"The bottom line for this report is that through the 4th and 5th years of this program, the long-term extension and crossover data do not indicate any progression of infection or malignancy," said Dr. Bone, head, endocrinology division, St. John Hospital and Medical Center, Detroit.

For example, the denosumab prescribing information cautions that cellulitis and erysipelas occurred more often in the active treatment arm than in placebo-treated controls during the first 3 years of the trial: four cases in year 1 in the denosumab arm, one case in year 2, and eight in year 3, compared with just one case in 3 years of placebo. The trend suggested a potential worsening problem over time with denosumab. But there were only two cases of the serious skin infections in year 4 of denosumab therapy and one in year 5, along with one case during 2 years of therapy in patients crossed over from placebo.

 

 

"I think that’s informative about what the long-term trend might be," he commented.

The safety analysis included 2,343 women on the RANK ligand for the full 5 years and 2,207 crossed over to open-label denosumab after 3 years on placebo.

An increased risk of endocarditis is also mentioned in the prescribing information based on three cases that occurred in the first 3 years; however, no further cases have occurred during the extension phase. Moreover, upon closer inspection one of the three cases of endocarditis appears suspect, as it didn’t result in hospitalization or antibiotic therapy.

Malignancy rates have remained similar and stable over time in both study arms, with no evidence of a concentration of cases in any organ system, according to Dr. Bone.

Dr. McClung and Dr. Bone disclosed that they serve as consultants to and are on the speakers bureau of Amgen, which sponsored the clinical trials. Dr. Cummings is also a consultant to the company.

Meeting/Event
Author and Disclosure Information

Publications
Topics
Legacy Keywords
denosumab bone, women bone density, women osteoporosis, bone mineral density, bone turnover biomarkers, denosumab safety
Author and Disclosure Information

Author and Disclosure Information

Meeting/Event
Meeting/Event

SAN DIEGO – Denosumab produced progressive increases in bone mineral density at key skeletal sites, along with a sustained reduction in biomarkers of bone turnover, through 8 years in a long-term extension of a clinical trial.

These 8-year continuous treatment data are both highly reassuring and clinically relevant, Dr. Michael McClung observed at the annual meeting of the American Society for Bone and Mineral Research.

Dr. Michael McClung

"Because the effects of denosumab on inhibiting bone remodeling are actually quite quickly reversible upon discontinuing therapy, long-term treatment with denosumab is anticipated to be important. Furthermore, denosumab, like other antiresorptive agents, is not the cure for osteoporosis and so maintaining bone strength with long-term therapy becomes necessary," according to Dr. McClung, founding director of the Oregon Osteoporosis Center, Portland.

Other investigators presented 5-year denosumab (Prolia) safety and efficacy updates from the ongoing extension of the landmark 7,868-patient phase III Fracture Reduction Evaluation of Denosumab in Osteoporosis Every 6 Months (FREEDOM) trial. The original 3-year results of FREEDOM were instrumental in winning marketing approval of the novel RANK ligand inhibitor for the treatment of postmenopausal women with osteoporosis at high risk for fracture. The FREEDOM trial is being extended to 10 years of denosumab therapy in the original treatment arm and 7 years in placebo-treated patients who crossed over to open-label active therapy.

Dr. McClung reported on the 200 postmenopausal women with osteoporosis or low bone mass who completed a 4-year phase 2 study of denosumab by subcutaneous injection of 60 mg once every 6 months and then enrolled in a 4-year extension study.

During the first 4 years on denosumab, the women showed an average 8% gain in bone mineral density at the lumbar spine and a 5% gain at the total hip. By 8 years, these gains had increased to 17% at the lumbar spine compared with baseline and 7% at the total hip. Reductions in the bone turnover markers serum C-telopeptide of type I collagen and bone-specific alkaline phosphatase remained stable throughout the 8 years of treatment, he added.

Also at the meeting, Dr. Steven R. Cummings presented an analysis of years 4 and 5 of the extension of the FREEDOM trial in which he concluded that denosumab significantly reduced the estimated risk of new vertebral fractures by 36% and nonvertebral fractures by 51% compared with placebo.

What’s unusual about his analysis is that there is actually no placebo arm in the extension phase of the trial. For purposes of comparison, he employed a novel "virtual twin" simulation method he and his colleagues created and have subsequently validated (Stat. Med. 2010;29:1127-36). The statistical tool uses data from the initial placebo-controlled phase of the trial to project expected outcomes for a cohort of "virtual twins" had they remained on placebo during the study extension.

Based on the virtual twin method, the projected yearly incidence of nonvertebral fractures in years 4 and 5 was 2.6% in the virtual twins on placebo, compared with 1.3% in the real denosumab-treated group. The projected incidence of new vertebral fractures in the extension phase was 2.2% per year in the virtual twins, compared with 1.4% in women on denosumab, according to Dr. Cummings, professor of medicine, epidemiology, and biostatistics at the University of California, San Francisco.
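The reported risk reductions follow directly from these incidences; a minimal arithmetic sketch (the relative risk reduction is simply 1 minus the ratio of incidences, so the rounded incidences reproduce the published figures only approximately):

```python
# Relative risk reduction implied by the yearly fracture incidences
# reported for years 4-5 of the FREEDOM extension (virtual-twin
# placebo projection vs. actual denosumab-treated group).

def relative_risk_reduction(placebo_pct: float, treated_pct: float) -> float:
    """Percent risk reduction: (1 - treated/placebo) * 100."""
    return (1.0 - treated_pct / placebo_pct) * 100.0

# Nonvertebral fractures: 2.6%/yr projected on placebo vs. 1.3%/yr on denosumab
print(round(relative_risk_reduction(2.6, 1.3)))  # 50 (published: 51%)

# New vertebral fractures: 2.2%/yr vs. 1.4%/yr
print(round(relative_risk_reduction(2.2, 1.4)))  # 36 (published: 36%)
```

The one-point difference on the nonvertebral estimate reflects rounding in the reported incidences; the published reductions were computed from unrounded data.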

"Long-term treatment with denosumab is anticipated to be important."

He was first author of the landmark core 3-year FREEDOM trial, in which denosumab reduced the risk of new nonvertebral, vertebral, and hip fractures by 20%, 68%, and 40%, respectively, compared with placebo (N. Engl. J. Med. 2009;361:756-65).

Dr. Henry G. Bone presented an updated safety analysis of denosumab through 5 years of the extended FREEDOM trial.

"The bottom line for this report is that through the 4th and 5th years of this program, the long-term extension and crossover data do not indicate any progression of infection or malignancy," said Dr. Bone, head, endocrinology division, St. John Hospital and Medical Center, Detroit.

For example, the denosumab prescribing information cautions that cellulitis and erysipelas occurred more often in the active treatment arm than in placebo-treated controls during the first 3 years of the trial: four cases in year 1 in the denosumab arm, one case in year 2, and eight in year 3, compared with just one case in 3 years of placebo. The trend suggested a potential worsening problem over time with denosumab. But there were only two cases of the serious skin infections in year 4 of denosumab therapy and one in year 5, along with one case during 2 years of therapy in patients crossed over from placebo.

"I think that’s informative about what the long-term trend might be," he commented.

The safety analysis included 2,343 women on the RANK ligand inhibitor for the full 5 years and 2,207 who crossed over to open-label denosumab after 3 years on placebo.

An increased risk of endocarditis is also mentioned in the prescribing information based on three cases that occurred in the first 3 years; however, no further cases have occurred during the extension phase. Moreover, upon closer inspection one of the three cases of endocarditis appears suspect, as it didn’t result in hospitalization or antibiotic therapy.

Malignancy rates have remained similar and stable over time in both study arms, with no evidence of a concentration of cases in any organ system, according to Dr. Bone.

Dr. McClung and Dr. Bone disclosed that they serve as consultants to and are on the speakers bureau of Amgen, which sponsored the clinical trials. Dr. Cummings is also a consultant to the company.

Display Headline
Denosumab Fares Well in Extension Trials

Article Source

EXPERT ANALYSIS FROM THE ANNUAL MEETING OF THE AMERICAN SOCIETY FOR BONE AND MINERAL RESEARCH

Need for Pharmacologic Stress Test Often Overestimated

Article Type
Changed
Fri, 01/18/2019 - 11:25
Display Headline
Need for Pharmacologic Stress Test Often Overestimated

DENVER – Physicians making referrals for cardiac stress testing often underestimate their patients’ ability to exercise to target heart rate, according to Dr. Michael Ross.

Here’s what can happen as a result: In a prospective series of 120 consecutive patients referred for pharmacologic myocardial perfusion imaging stress testing by primary care physicians, surgeons, and cardiologists, 60% of the patients were able to mount the treadmill and exercise to 85% of their estimated maximum heart rate, he reported at the annual meeting of the American Society of Nuclear Cardiology.
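The 85% threshold is conventionally applied to an age-predicted maximum heart rate; a minimal sketch, assuming the common 220-minus-age formula (the study's exact estimation method is not specified in the report):

```python
def target_heart_rate(age_years: int, fraction: float = 0.85) -> float:
    """Target heart rate as a fraction of the age-predicted maximum,
    using the common 220-minus-age convention (an assumption here)."""
    return fraction * (220 - age_years)

print(target_heart_rate(60))  # 136.0 bpm for a 60-year-old
```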

Primary care physicians were significantly more likely than were cardiologists or surgeons to order a pharmacologic stress test in patients who did not need one because they were able to complete the less costly exercise stress test.

Is that because primary care physicians don’t know their patients and their physical capacities as well as other physicians do? Highly unlikely. Instead, it appears they are more concerned that if they order an exercise stress test and a patient can’t complete it, they’ll have to reorder the test – this time using pharmacologic stress – with the attendant inconvenience and delay, according to Dr. Ross of Northwestern University, Chicago.

In a multivariate logistic regression analysis, the only independent predictors of failure to reach target heart rate were being on a beta-blocker and having diabetes.
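As a back-of-the-envelope illustration of what an association like this looks like for a single predictor, an unadjusted odds ratio can be computed from a 2x2 table; the counts below are invented for illustration only (not study data), and the actual analysis was a multivariate logistic regression, which adjusts each predictor for the others:

```python
# Unadjusted odds ratio from a 2x2 table: exposure (e.g., beta-blocker
# use) vs. outcome (failure to reach 85% of maximum heart rate).
# All counts below are hypothetical, for illustration only.

def odds_ratio(exp_fail: int, exp_ok: int, unexp_fail: int, unexp_ok: int) -> float:
    """OR = (a*d) / (b*c) for the standard 2x2 layout."""
    return (exp_fail * unexp_ok) / (exp_ok * unexp_fail)

# Hypothetical split: 20 of 50 beta-blocker users failed vs. 10 of 70 non-users
print(odds_ratio(20, 30, 10, 60))  # 4.0
```

A multivariate model reports the analogous adjusted odds ratio for each predictor, holding the other predictors fixed.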

Dr. Ross said he had no relevant financial disclosures.

Article Source

FROM THE ANNUAL MEETING OF THE AMERICAN SOCIETY OF NUCLEAR CARDIOLOGY