Genetic analysis links PCSK9 inhibition and CV mortality
Genetically low LDL cholesterol caused by PCSK9 variation was associated with reduced cardiovascular mortality, but not all-cause mortality, in a large cohort of individuals.
“We tested the hypothesis that genetically low LDL cholesterol due to PCSK9 [proprotein convertase subtilisin/kexin type 9] variation is causally associated with low cardiovascular and all-cause mortality in a general population of Northern European ancestry,” wrote Marianne Benn, MD, DMSc, and colleagues. The findings were published in the Journal of the American College of Cardiology.
The researchers conducted a large-scale genetic analysis of 109,566 persons from the Copenhagen City Heart Study and Copenhagen General Population Study. In addition, the team included a validation cohort of 431,043 individuals from the UK Biobank.
The median duration of follow-up was 10 years (range, 0-42 years), and the median age at study entry was 57 years.
Study participants were genotyped for several PCSK9 variants, and a weighted allele score, based on each variant's effect on LDL cholesterol, its allele frequency, and the number of variant alleles carried, was calculated for each subject.
Weighted scores were categorized into five stepwise noncontinuous score ranges, with lower levels of LDL cholesterol linked to higher allele scores.
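For readers who want the mechanics, a weighted allele score of this kind can be sketched in a few lines of Python. This is a minimal illustration only; the variant names, per-allele effect sizes, and allele counts below are hypothetical placeholders, and the study's additional weighting by allele frequency is omitted for simplicity.

```python
# Hypothetical sketch of a weighted allele score; not the study's actual
# variants or effect sizes. The study additionally weighted by allele
# frequency, which this simplification omits.
variants = [
    # (variant, LDL effect in mmol/L per variant allele, variant alleles carried)
    ("variant_A", -0.50, 1),
    ("variant_B", -0.20, 0),
    ("variant_C", -0.10, 2),
]

# Score = sum of (per-allele LDL effect x number of variant alleles carried);
# a more negative score corresponds to genetically lower LDL cholesterol.
score = sum(effect * count for _, effect, count in variants)
print(f"weighted allele score: {score:.2f} mmol/L")  # -0.70 mmol/L here
```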
After analysis, the researchers found that an increasing number of PCSK9 variant alleles was associated with stepwise lower levels of LDL cholesterol of up to 0.61 mmol/L (P for trend < .001) and with reduced CV mortality (P = .001), but not with reduced all-cause mortality (P = .11).
“Our genetic data did not show a reduction in risk of all-cause mortality, and only showed a reduction in risk of all-cause mortality in statin trials and not in the PCSK9-inhibitor trials meta-analyzed,” the researchers wrote. “This may be explained by the low frequency of cardiovascular disease in the 2 populations studied.”
One key limitation was the homogeneous makeup of the study population. Dr. Benn and colleagues acknowledged this could limit the generalizability of the results.
“Long-term LDL cholesterol treatment (e.g., with PCSK9 inhibitors) may translate into reductions in cardiovascular mortality,” they concluded.
The study was supported by the Danish Council for Independent Research, Medical Sciences, and the Johan Boserup and Lise Boserup Fund. The authors reported no conflicts of interest.
SOURCE: Benn M et al. J Am Coll Cardiol. 2019 Jun 17. doi: 10.1016/j.jacc.2019.03.517.
One question that remains from the current study is whether prolonged inhibition of PCSK9 in patients with increased LDL cholesterol levels will reduce cardiovascular mortality in the context of primary and secondary prevention.
The recent development of PCSK9 inhibitors was heavily influenced by genetic analyses showing that naturally occurring variants in the PCSK9 gene can lower LDL levels and reduce rates of coronary heart disease. Because these variants are rare, however, their impact on mortality on a large-scale basis has remained unclear.
Although numerous clinical trials have shown that PCSK9 inhibition can reduce cardiovascular events in both chronic and high-risk patients, no study has clearly shown an effect on cardiovascular death. The relationship between lipid levels and clinical outcomes is difficult to assess, however, owing to confounding factors. Genetic analyses of the kind reported here, which follow large populations over extended periods of time, can help circumvent these challenges.
The genetic analysis by Dr. Benn and colleagues showed an association between long-term exposure to lower levels of LDL cholesterol, by means of functional variants in the PCSK9 gene, and reduced cardiovascular mortality. These findings, alongside other studies, provide further support for the relationship between PCSK9 inhibition and prevention of cardiovascular mortality.
Gregory G. Schwartz, MD, PhD, and Matthew R.G. Taylor, MD, PhD, are with the University of Colorado in Aurora. Dr. Schwartz reported financial affiliations with Resverlogix, Roche, Sanofi, and The Medicines Company. These comments are adapted from their editorial (J Am Coll Cardiol. 2019 Jun 17. doi: 10.1016/j.jacc.2019.03.518).
FROM THE JOURNAL OF THE AMERICAN COLLEGE OF CARDIOLOGY
Rivaroxaban tied to higher GI bleeding than other NOACs
SAN DIEGO – Patients on rivaroxaban had significantly higher rates of GI bleeding, compared with those taking apixaban or dabigatran, results from a large population-based study showed.
“This may be due to the fact that rivaroxaban is administered as a single daily dose as opposed to the other two non–vitamin K anticoagulants [NOACs], which are given twice daily,” lead study author Arnar B. Ingason said at the annual Digestive Disease Week. “This may lead to a greater variance in plasma drug concentration, making these patients more susceptible to bleeding.”
Mr. Ingason, a medical student at the University of Iceland, Reykjavik, said that although several studies have compared warfarin with NOACs, it remains unclear which NOAC has the most favorable GI profile. In an effort to improve the research in this area, he and his associates performed a nationwide, population-based study during March 2014–Jan. 2018 to compare the GI bleeding risk of patients receiving rivaroxaban with that of a combined pool of patients receiving either apixaban or dabigatran. They drew from the Icelandic Medicine Registry, which contains all outpatient drug prescriptions in the country. Next, the researchers linked the personal identification numbers of patients to the Landspitali University diagnoses registry, which includes more than 90% of all patients hospitalized for GI bleeding. They used 1:1 nearest-neighbor propensity score matching, and they compared rates of GI bleeding with Kaplan-Meier survival estimates and Cox regression. The study outcome of interest was any clinically relevant GI bleeding.
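As a rough illustration of this pipeline (not the authors' code), the matching and survival steps might look like the following Python sketch. The dataframe, file name, and column names are assumptions, and this simple version matches with replacement, unlike a typical published analysis.

```python
# Hypothetical sketch of 1:1 nearest-neighbor propensity-score matching
# followed by Kaplan-Meier and Cox analyses, as described above.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("cohort.csv")  # hypothetical: one row per patient
covariates = ["age", "sex", "charlson_score", "prior_bleeding"]

# 1. Propensity score: probability of receiving rivaroxaban given covariates.
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["rivaroxaban"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. 1:1 nearest-neighbor matching on the propensity score (with replacement
# here, for brevity).
treated = df[df["rivaroxaban"] == 1]
control = df[df["rivaroxaban"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]])

# 3. Compare GI bleeding using Kaplan-Meier estimates and a Cox model.
km = KaplanMeierFitter()
km.fit(matched["followup_years"], matched["gi_bleed"], label="matched cohort")

cox = CoxPHFitter()
cox.fit(matched[["followup_years", "gi_bleed", "rivaroxaban"]],
        duration_col="followup_years", event_col="gi_bleed")
cox.print_summary()  # hazard ratio for rivaroxaban vs. apixaban/dabigatran
```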
Mr. Ingason reported that the baseline characteristics were similar between the rivaroxaban group and the apixaban/dabigatran group. They matched for several variables, including age, sex, Charlson score, the proportion being anticoagulant naive, moderate to severe renal disease, moderate to severe liver disease, any prior bleeding, and any prior thrombotic events.
During the study period, 3,473 patients received rivaroxaban, 1,901 received apixaban, and 1,086 received dabigatran. After propensity score matching, the researchers compared 2,635 patients who received rivaroxaban with 2,365 patients who received either apixaban or dabigatran. They found that patients in the rivaroxaban group had significantly higher rates of GI bleeding than those in the apixaban/dabigatran group (1.2 vs. 0.6 events per 100 patient-years). This yielded a hazard ratio of 2.02, “which means that patients receiving rivaroxaban are twice as likely to get GI bleeding compared to patients on apixaban or dabigatran,” Mr. Ingason said. When the researchers examined the entire unmatched cohort of patients, the rivaroxaban group also had significantly higher rates of GI bleeding, compared with the apixaban/dabigatran group (1.0 vs. 0.6 events per 100 patient-years; HR, 1.75).
Mr. Ingason and his colleagues observed that patients in the rivaroxaban group had higher rates of GI bleeding, compared with the apixaban/dabigatran group, throughout the entire follow-up period. At the end of year 4, the rivaroxaban group had a 4% cumulative event rate of GI bleeding, compared with 1.8% for the apixaban/dabigatran group, a significant difference (P = .0057).
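The headline figures are internally consistent, as a quick back-of-the-envelope check using the rates reported above shows:

```python
# Values taken from the article; simple consistency check, not a reanalysis.
riva_rate = 1.2    # GI bleeds per 100 patient-years, rivaroxaban (matched)
other_rate = 0.6   # GI bleeds per 100 patient-years, apixaban/dabigatran

print(riva_rate / other_rate)  # 2.0, in line with the reported HR of 2.02
print(4.0 / 1.8)               # ~2.2-fold higher 4-year cumulative event rate
```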
When a meeting attendee asked Mr. Ingason why patients taking apixaban or dabigatran were combined into one group, he said that it was done to increase the power of their study. “Our theory was that rivaroxaban was different because it is administered as a single daily dose, while the others are given twice daily,” he said. The researchers reported having no financial disclosures.
REPORTING FROM DDW 2019
An 89-year-old woman presented with an ulceration overlying a cardiac pacemaker
Cardiac implantable electronic devices (CIEDs) – cardiac pacemakers and implantable cardioverter defibrillators – are an established treatment for the management of cardiac dysrhythmias in millions of patients. Complications occur in up to 15% of patients, and some may present first to the dermatologist.
The differential diagnosis includes pocket infection, ruptured epidermal cyst, allergic contact dermatitis, swelling and dermatitis caused by local venous obstruction and pressure, and impending skin erosion/device extrusion.
Erosion and extrusion are major complications with significant morbidity and mortality. The two main causes are pressure necrosis and infection. Pressure necrosis is influenced by the size of the device, the complexity of the connections, and the technical skill with which the pacemaker chest wall pocket is created.
After extrusion, the pacemaker should be considered contaminated and removed, and the necrotic tissue debrided. If infected, a prolonged course of appropriate antibiotic therapy is indicated. A bacterial culture in the patient presented here was negative.
Pocket infection of CIEDs is rare and may manifest as erythema, tenderness, drainage, erosion, or pruritus above the site of the pacemaker, along with systemic symptoms and signs, including fever, chills, or malaise. Some may have just the systemic symptoms. Fewer than half of patients with CIED infection present within 1 year of their last procedure.
Ruptured epidermal cysts usually manifest as acute swelling, inflammation, and tenderness of previously long-standing asymptomatic epidermal cysts. There may be drainage of malodorous keratinous and purulent debris. They are typically not infected. Treatment includes incision and drainage for fluctuant lesions or intralesional corticosteroid injection for early, nonfluctuant cases.
Allergic contact dermatitis to metal may be seen with implantable devices. Patch testing to various metal allergens can be helpful in determining if any allergy is present.
This case and photo were submitted by Michael Stierstorfer, MD, East Penn Dermatology, North Wales, Pa.
Dr. Bilu Martin is a board-certified dermatologist in private practice at Premier Dermatology, in Aventura, Fla. More diagnostic cases are available at mdedge.com/dermatology. To submit a case for possible publication, send an email to dermnews@mdedge.com.
Heart Valve Replacement for High-Risk Patients
Left ventricular outflow tract (LVOT) obstruction is a life-threatening complication that can put transcatheter mitral valve replacement (TMVR) out of reach for many patients. But researchers from the National Heart, Lung, and Blood Institute (NHLBI) and Emory University in Atlanta, Georgia, have developed a novel technique to essentially slice through the obstacle, increasing treatment options for high-risk patients.
TMVR is a less invasive alternative to open-heart surgery: physicians replace the mitral valve by inserting an artificial valve via a catheter. In > 50% of patients, though, the native mitral leaflet is pushed back, blocking blood flow. In open surgery, surgeons can cut out the leaflets when they replace valves because they can see the problem directly in the open chest, says study author Jaffar Khan, MD, a clinician at NHLBI.
The researchers describe their new method, LAMPOON, as “a transcatheter mimic of surgical chord-sparing leaflet resection.” LAMPOON involves intentional laceration of the anterior mitral valve leaflet. The operator inserts 2 catheters through the patient’s groin, up to the heart. A thread-sized electrified wire woven through the catheter splits open the leaflet.
In the LAMPOON study, the researchers evaluated the procedure’s results in 30 patients at high risk for surgical valve replacement and prohibitive risk of LVOT obstruction during TMVR.
Procedural survival was 100%, and 30-day survival was 93% (compared with 38% reported with other methods). In all, 73% of patients met the primary outcome: a successful LAMPOON procedure followed by a successful TMVR without reintervention. No one had a stroke.
Every year > 20,000 people in the US die of heart valve disease. The researchers hope their innovative technique will help reduce that number.
Lowering hyperuricemia improved endothelial function but failed as an antihypertensive
MADRID – Using allopurinol to reduce hyperuricemia in young adults with prehypertension or stage 1 hypertension failed to significantly lower blood pressure, but it did significantly improve endothelial function, as measured by increased flow-mediated arterial dilation, in a single-center crossover study with 82 participants.
The finding of improved endothelial function suggests that reducing hyperuricemia may be a new way to manage hypertension or prevent progression to stage 1 hypertension, improve cardiovascular health, and ultimately cut cardiovascular events, Angelo L. Gaffo, MD, said at the European Congress of Rheumatology. The BP-lowering effect of allopurinol was strongest in people who entered the study with the highest serum urate levels, greater than 6.5 mg/dL, suggesting that the next step in developing this approach should be to target people with serum urate levels in this range, said Dr. Gaffo, a rheumatologist at the University of Alabama at Birmingham.
“It’s just a matter of finding the right population to see the blood pressure reduction effect,” Dr. Gaffo said in an interview.
He and his associates designed the SURPHER (Serum Urate Reduction to Prevent Hypertension) study to assess the impact of allopurinol treatment in people aged 18-40 years with prehypertension or stage 1 hypertension as defined by U.S. BP standards at the time they launched the study in 2016 (Contemp Clin Trials. 2016 Sep;50:238-44). Enrolled participants had to be nonsmokers; have an estimated glomerular filtration rate of greater than 60 mL/min per 1.73 m2; have a serum urate level of at least 5.0 mg/dL in men and at least 4.0 mg/dL in women; and be without diabetes, antihypertensive medications, prior urate-lowering treatment, or a history of gout. The 99 people who started the study averaged 28 years old, nearly two-thirds were men, 40% were African Americans, and 52% were white. The participants’ average body mass index was nearly 31 kg/m2, and their average BP was 127/81 mm Hg. Average serum urate levels were 6.4 mg/dL in men and 4.9 mg/dL in women. Participants received 300 mg/day allopurinol or placebo, and after 4 weeks crossed to the alternate regimen, with 82 people completing the full protocol. While on allopurinol, serum urate levels fell by an average of 1.3 mg/dL, a statistically significant drop; on placebo, the levels showed no significant change from baseline.
The primary endpoint, the change in BP on allopurinol treatment, showed no statistically significant difference overall, compared with when participants received placebo. The results also showed no significant impact of allopurinol treatment, compared with placebo, on serum levels of high-sensitivity C-reactive protein, a measure of inflammation. However, for the secondary endpoint of change in endothelial function, measured as change in flow-mediated dilation (FMD), allopurinol had a statistically significant effect: average FMD increased from 10.3% at baseline to 14.5% on the drug, a 41% relative increase, whereas on placebo average FMD showed a slight reduction. Allopurinol treatment was safe and well tolerated during the study.
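The 41% figure is simply the relative change in FMD, as this quick check shows:

```python
# Values taken from the article; arithmetic illustration only.
baseline_fmd = 10.3  # average FMD (%) at baseline
on_drug_fmd = 14.5   # average FMD (%) on allopurinol

absolute_change = on_drug_fmd - baseline_fmd      # 4.2 percentage points
relative_change = absolute_change / baseline_fmd  # ~0.41
print(f"{relative_change:.0%}")                   # ~41% relative increase
```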
The results also showed that among people with a baseline serum urate level of greater than 6.5 mg/dL (15 of the 82 study completers), systolic BP fell by an average of about 5 mm Hg.
The results suggested that the concept of reducing hyperuricemia in people with early-stage hypertension or prehypertension might be viable for people with higher serum urate levels than most of those enrolled in SURPHER, Dr. Gaffo said. He noted that prior study results in obese adolescents showed that treating hyperuricemia was able to produce a meaningful BP reduction (Hypertension. 2012 Nov;60[5]:1148-56).
SURPHER received no commercial funding. Dr. Gaffo has received research funding from Amgen and AstraZeneca.
REPORTING FROM EULAR 2019 CONGRESS
Teva expands its recall of losartan lots
The Food and Drug Administration has announced that Teva Pharmaceuticals has expanded its voluntary recall of losartan potassium tablets, according to a release.
The recall for this and other angiotensin II receptor blockers was initiated by Teva on April 25, 2019, because of detection of unacceptable levels of the possibly cancer-causing impurity N-Nitroso-N-methyl-4-aminobutyric acid (NMBA). Teva expanded this recall on June 10, with another update issued on June 12.
Losartan is not the only ARB found to contain NMBA; a full list of all ARBs affected can be found on the FDA website and currently includes more than 1,100 lots being recalled. The list can be searched and sorted by such considerations as medicine in question, company involved, and lot number.
Which antidiabetic for elderly patients? It depends on their CV risk
SAN FRANCISCO – SGLT2 inhibitors did a better job than GLP-1 receptor agonists at preventing heart failure hospitalizations in elderly patients with type 2 diabetes, but at the cost of more strokes, myocardial infarctions, and deaths among those without preexisting cardiovascular disease, according to Harvard University investigators.
Using Medicare claims data and propensity scoring, they matched 43,609 elderly patients who started a sodium-glucose cotransporter 2 (SGLT2) inhibitor for type 2 diabetes, 77% of whom were taking canagliflozin (Invokana), to 43,609 who started a glucagonlike peptide–1 (GLP-1)–receptor agonist, 60% of whom were taking liraglutide (Victoza).
Patients were paired by age, comorbidities, diabetes severity, and dozens of other variables, more than 120 in all. The data window ran from April 2013 through December 2016.
The idea was to compare the drugs directly in order to help clinicians decide which class to choose for older patients as second-line therapy, an important consideration at a time when there’s not much guidance specifically for the elderly, and manufacturers are issuing dueling placebo-controlled trials.
Both classes have shown cardiovascular benefits, but studies were mostly in younger people with preexisting cardiovascular disease (CVD). “The comparative impact of these agents in the older population has not yet been established,” lead investigator Elisabetta Patorno, MD, DrPH, of Harvard University, Boston, said at the annual scientific sessions of the American Diabetes Association.
General themes are emerging from Dr. Patorno’s work; it seems that deciding between the two classes has a lot to do with whether the main concern is heart failure or cardiovascular events. Even so, she said, it’s too early to incorporate the observations into guidelines. The analysis is ongoing, and there are plans to compare impacts on renal disease and other problems.
In the meantime, she and her colleagues found that initiating an SGLT2 inhibitor versus a GLP-1 receptor agonist in the elderly was associated with a 34% decreased risk of heart failure hospitalization (2.5 fewer hospitalizations per 1,000 patient-years), with an even larger drop among people who had preexisting CVD.
There was, however, a 41% increased risk of lower limb amputations (0.8 more events per 1,000 patient-years) and a 62% increase in diabetic ketoacidosis (DKA; 1 more event per 1,000 patient-years), problems previously associated with the class.
Results were comparable – fewer heart failure hospitalizations but more amputations and DKA – when SGLT2 initiation was compared to initiation with dipeptidyl peptidase-4 (DPP-4) inhibitors, another second-line option for type 2 diabetes that includes sitagliptin (Januvia), among others.
There was a 25% increased relative risk of the composite primary outcome of myocardial infarction, stroke, and all-cause mortality when patients without baseline CVD were started on an SGLT2 inhibitor instead of a GLP-1 receptor agonist (3.7 more events per 1,000 patient-years). There was no increased risk among patients who already had CVD.
SGLT2 inhibitor initiation actually had a protective effect, compared with DPP-4 inhibitors, with a 23% decreased risk of the composite outcome (6.5 fewer events per 1,000 patient-years) among patients both with and without baseline CVD. The findings were all statistically significant.
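To put the relative and absolute differences above on the same footing, the implied comparator event rates can be back-calculated; these derived rates are back-of-the-envelope arithmetic, not figures reported by the investigators:

```python
# Back-calculated comparator rates implied by the reported differences;
# illustrative arithmetic, not numbers from the study itself.

# Heart failure: a 34% lower risk equal to 2.5 fewer hospitalizations per
# 1,000 patient-years implies a comparator rate of about 2.5 / 0.34.
print(2.5 / 0.34)  # ~7.4 hospitalizations per 1,000 patient-years

# Composite MI/stroke/death (no baseline CVD): a 25% relative increase equal
# to 3.7 more events per 1,000 patient-years implies a comparator rate of
# about 3.7 / 0.25.
print(3.7 / 0.25)  # ~14.8 events per 1,000 patient-years
```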
The average age in the study was 71.5 years; 45% of the subjects were men; 40% had a history of cardiovascular disease; and 60% were on metformin and 24% on insulin at study entry.
The work was funded by the National Institutes of Health. Dr. Patorno disclosed research grants from Boehringer Ingelheim and GlaxoSmithKline. Other investigators reported relationships with numerous pharmaceutical companies.
SAN FRANCISCO – SGLT2 inhibitors did a better job than GLP-1 receptor agonists at preventing heart failure hospitalizations in elderly patients with type 2 diabetes, but at the cost of more strokes, myocardial infarctions, and deaths among those without preexisting cardiovascular disease, according to Harvard University investigators.
Using Medicare claims data and propensity scoring, they matched 43,609 elderly patients who started a sodium-glucose cotransporter 2 (SGLT2) inhibitor for type 2 diabetes, 77% of whom were taking canagliflozin (Invokana), to 43,609 who started a glucagonlike peptide–1 (GLP-1)–receptor agonist, 60% of whom were taking liraglutide (Victoza).
Patients were paired by age, comorbidities, diabetes severity, and dozens of other variables, more than 120 in all. The data window ran from April 2013 through December 2016.
The idea was to compare the drugs directly in order to help clinicians decide which class to choose for older patients as second-line therapy, an important consideration at a time when there’s not much guidance specifically for the elderly, and manufacturers are issuing dueling placebo-controlled trials.
Both classes have shown cardiovascular benefits, but studies were mostly in younger people with preexisting cardiovascular disease (CVD). “The comparative impact of these agents in the older population has not yet been established,” lead investigator Elisabetta Patorno, MD, DrPH, of Harvard University, Boston, said at the annual scientific sessions of the American Diabetes Association.
General themes are emerging from Dr. Patorno’s work; it seems that deciding between the two classes has a lot to do with whether the main concern is heart failure or cardiovascular events. Even so, she said, it’s too early to incorporate the observations into guidelines. The analysis is ongoing, and there are plans to compare impacts on renal disease and other problems.
In the meantime, she and her colleagues found that initiating an SGLT2 inhibitor versus a GLP-1 receptor agonist in the elderly was associated with a 34% decreased risk of heart failure hospitalization (2.5 fewer hospitalizations per 1,000 patient years), with an even larger drop among people who had preexisting CVD.
There was, however, a 41% increased risk of lower limb amputations (0.8 more events per 1,000 patient years) and a 62% increase in diabetic ketoacidosis (DKA, 1 more event), problems previously associated with the class.
Results were comparable – fewer heart failure hospitalizations but more amputations and DKA – when SGLT2 initiation was compared to initiation with dipeptidyl peptidase-4 (DPP-4) inhibitors, another second-line option for type 2 diabetes that includes sitagliptin (Januvia), among others.
There was a 25% increased relative risk of the composite primary outcome of myocardial infarction, stroke, and all-cause mortality when patients without baseline CVD were started on an SGLT2 inhibitor instead of a GLP-1 receptor agonist (3.7 more events per 1,000 patient years). There was no increased risk among patients who already had CVD.
SGLT2 initiation actually had a protective effect, compared with dipeptidyl peptidase-4 inhibitors, with a 23% decreased risk of the composite outcome (6.5 fewer events) among patients both with and without baseline CVD. The findings were all statistically significant.
The average age in the study was 71.5 years; 45% of the subjects were men; 40% had a history of cardiovascular disease; and 60% were on metformin and 24% on insulin at study entry.
The work was funded by the National Institutes of Health. Dr. Patorno disclosed research grants from Boehringer Ingelheim and GlaxoSmithKline. Other investigators reported relationships with numerous pharmaceutical companies.
REPORTING FROM ADA 2019
Clinical pulmonary medicine. Cardiovascular medicine and surgery. Chest infections. Interprofessional team.
Clinical Pulmonary Medicine
Pulmonary embolism in pregnancy: A diagnostic conundrum
Pulmonary embolism (PE) is the sixth leading cause of maternal mortality in the United States. The clinical signs and symptoms of PE are usually nonspecific and often overlap with the normal physiologic changes of pregnancy. Because the D-dimer test has low specificity in pregnancy, pregnant patients with suspected PE often undergo CT pulmonary angiography (CTPA) or ventilation-perfusion scanning, both of which expose the mother and fetus to radiation.
To determine whether a pregnancy-adapted YEARS algorithm (Van der Hulle T, et al. Lancet. 2017;390[10091]:289) can safely be used to avoid diagnostic imaging, the Artemis Study Investigators prospectively evaluated three criteria from the YEARS algorithm in combination with a D-dimer level (Van der Pol, et al. N Engl J Med. 2019;380[12]:1139). The three criteria were clinical signs of deep-vein thrombosis (DVT), hemoptysis, and PE as the most likely diagnosis. PE was considered ruled out when none of the three criteria was present and the D-dimer was less than 1,000 ng/mL, or when one or more criteria were met and the D-dimer was less than 500 ng/mL. In patients with a D-dimer greater than 1,000 ng/mL, or greater than 500 ng/mL with one or more YEARS criteria present, PE could not be ruled out, and these patients underwent CTPA. The protocol was modified only for patients with clinical signs of DVT at baseline: these patients first underwent compression ultrasonography, and if a clot was found, CTPA was not performed and anticoagulation therapy was started. Those with negative ultrasound studies were then classified by D-dimer level as above. Of the 299 patients in whom PE was not ruled out and who underwent CTPA, 16 (5.4%) were confirmed to have PE at baseline.
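The rule-out logic is compact enough to restate as a short sketch. The function below encodes the two D-dimer thresholds described above; the names are ours, and the compression-ultrasonography step for patients with signs of DVT is omitted for brevity.

```python
# Sketch of the pregnancy-adapted YEARS rule-out logic described above.
# Function and argument names are ours; thresholds follow the algorithm.
# The baseline compression-ultrasonography step for patients with clinical
# signs of DVT is omitted for brevity.
def pe_ruled_out(signs_of_dvt: bool, hemoptysis: bool,
                 pe_most_likely: bool, d_dimer_ng_ml: float) -> bool:
    """Return True if PE is ruled out without CT pulmonary angiography."""
    n_criteria = sum([signs_of_dvt, hemoptysis, pe_most_likely])
    if n_criteria == 0:
        return d_dimer_ng_ml < 1000   # no YEARS items: higher threshold
    return d_dimer_ng_ml < 500        # >= 1 YEARS item: lower threshold

# A patient with no YEARS items and a D-dimer of 800 ng/mL avoids CTPA:
assert pe_ruled_out(False, False, False, 800)
# With hemoptysis and the same D-dimer, CTPA is required:
assert not pe_ruled_out(False, True, False, 800)
```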
Among the remaining 195 patients, in whom PE was ruled out on the basis of the study protocol, one patient (0.51%) was diagnosed with venous thromboembolism during 3 months of follow-up. Using the pregnancy-adapted YEARS algorithm, CTPA was avoided in 39% of patients, of whom 65% were in their first trimester, when radiation exposure is most harmful to the fetus.
Muhammad Adrish, MD, FCCP
Steering Committee Member
Munish Luthra, MD, FCCP
Steering Committee Member
Cardiovascular Medicine and Surgery
Physical examination of low cardiac output in the ICU
Rapid evaluation of shock requires identifying signs of tissue hypoperfusion and differentiating between cardiogenic, obstructive, hypovolemic, and vasodilatory etiologies. Cardiac abnormalities may also contribute to mixed shock states in a broad array of critically ill patients. Left ventricular dysfunction in inpatients correlates with the physical exam, with a positive likelihood ratio of 2.0 and a negative likelihood ratio of 0.41 (Simel DL, Rennie D, eds. The Rational Clinical Examination: Evidence-Based Clinical Diagnosis. 2009). Accurate clinical assessment of cardiac output, however, is a fraught endeavor. In a recently published large series of patients with unplanned ICU admission, atrial fibrillation, systolic blood pressure (BP) < 90 mm Hg, altered consciousness, capillary refill time > 4.5 seconds at the sternum, or skin mottling over the knee predicted low cardiac output with specificity > 90%. Yet of 280 patients with a cardiac index < 2.2 L/min/m2, fewer than half had any one of these findings (Hiemstra, et al. Intensive Care Med. 2019;45[2]:190).
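To see what a positive likelihood ratio of 2.0 and a negative likelihood ratio of 0.41 mean at the bedside, recall that posttest odds equal pretest odds multiplied by the likelihood ratio. A small worked example follows; the 30% pretest probability is an assumed figure for illustration.

```python
# How the quoted likelihood ratios shift probability: posttest odds equal
# pretest odds times the LR. The 30% pretest probability is an assumed
# example, not a figure from the cited studies.
def posttest_probability(pretest_p: float, lr: float) -> float:
    odds = pretest_p / (1 - pretest_p)   # convert probability to odds
    post_odds = odds * lr                # apply the likelihood ratio
    return post_odds / (1 + post_odds)   # convert back to probability

# Assume a 30% pretest probability of LV dysfunction:
print(posttest_probability(0.30, 2.0))    # positive exam -> ~46%
print(posttest_probability(0.30, 0.41))   # negative exam -> ~15%
```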
Regarding determination of shock etiology, in a small series of patients with systolic blood pressure < 90 mm Hg, physical exam findings of relatively warm skin temperature and rapid capillary refill had 89% sensitivity for vasodilatory shock, and jugular venous pressure ≥8 had 82% sensitivity for cardiogenic etiologies (Vazquez, et al. J Hosp Med. 2010;5[8]:471). Thus, while physical exam findings may inform bedside shock assessment, their accuracy is limited. Critical care physicians should consider additional assessment techniques, such as echocardiography or invasive hemodynamic monitoring, if diagnostic uncertainty persists (Vincent, et al. N Engl J Med. 2013;369[18]:1726).
Benjamin Kenigsberg, MD
Steering Committee Member
Dr. David Bowton and Dr. Steven Hollenberg contributed to the article.
Chest Infections
Lung infections in the transplant recipients
The increase in lung transplantation over the years has led to lung transplant recipients presenting to pulmonologists outside of specialized centers, and one of the most common presentations is infection. Infections account for more than 25% of all posttransplant deaths (Yusen, et al. J Heart Lung Transplant. 2014;33[10]:1009).
Multiple factors contribute to this increased infection risk, including donor lung colonization, disruption of local host defenses, constant contact with environmental pathogens, and heavy immunosuppression (Remund KF, et al. Proc Am Thorac Soc. 2009;6[1]:94).
The timing of infectious manifestations after transplantation varies with the causative organism. Based on the time of onset, infections can be categorized as occurring within the first month posttransplant, from 1 to 6 months, or beyond 6 months. During the first month, infections related to allograft colonization, preexisting infections in the recipient, and surgical- and hospital-acquired nosocomial pathogens are most common. Patients are at the highest risk for opportunistic infections during the first 6 months. As immunosuppression is lowered after 6 months, the causative organisms tend to be more common pathogens (Green M. Am J Transplant. 2013;13[suppl 4]:3-8).
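This timeline amounts to a simple lookup by time since transplant, restated in the sketch below; the category labels are ours.

```python
# Restates the posttransplant infection timeline above as a small lookup.
# Category labels are ours; the time windows follow the text.
def infection_risk_window(months_since_transplant: float) -> str:
    if months_since_transplant <= 1:
        return ("nosocomial: allograft colonization, preexisting recipient "
                "infections, surgical- and hospital-acquired pathogens")
    if months_since_transplant <= 6:
        return "opportunistic: period of highest immunosuppression"
    return "community: more common pathogens as immunosuppression is lowered"

print(infection_risk_window(4))  # -> opportunistic window
```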
Early, aggressive initiation of empiric antimicrobial therapy, together with a proactive, invasive diagnostic approach to identify the potential pathogen, is imperative in these patients. Early bronchoscopy with bronchoalveolar lavage remains the most sensitive test for identifying pathogens, and therapy can then be tailored to the identified organism.
As part of the Chest Infections NetWork, we would like to raise awareness of lung infections in unique subgroups, such as lung transplant recipients. Treating infections in such patients requires a high index of suspicion in the setting of an atypical presentation.
Raed Alalawi, MD, FCCP
Steering Committee Member
Interprofessional Team
Extracorporeal Membrane Oxygenation (ECMO) in Near Fatal Asthma
Near fatal asthma (NFA) is defined as acute severe asthma characterized by acute respiratory failure with hypercapnia and/or respiratory acidosis requiring ventilator support. NFA refractory to conventional medical management and ventilator therapy can be fatal, and mortality is substantial when invasive ventilation is needed (Marquette CH, et al. Am Rev Respir Dis. 1992;146[1]:76). Sedatives can exacerbate bronchospasm, and positive pressure ventilation can worsen dynamic hyperinflation, impairing hemodynamics and gas exchange and leading to barotrauma; invasive ventilation can thus run contrary to the goals of management. Outside of conventional therapies such as IV steroids and inhaled beta-agonists, the data supporting other therapies, including IV beta-agonists, MgSO4, methylxanthines, mucolytics, heliox, and volatile anesthetics, are scant. In contrast, venovenous ECMO can provide adequate gas exchange, prevent lung injury induced by mechanical ventilation, and serve as an effective bridging strategy to avoid aggressive ventilation in refractory NFA (Yeo HJ, et al. Critical Care. 2017;21[1]:297).
Early use of ECMO to permit spontaneous breathing while the circuit accomplishes the required ventilation and oxygenation is therefore appealing. Avoiding mechanical ventilation not only prevents complications such as barotrauma but also may reduce delirium, malnutrition, and neuromuscular dysfunction. “Awake” ECMO has been performed successfully in obstructive airway disease (Langer T, et al. Critical Care. 2016;20[1]:150). The invasive nature of ECMO and the inherent risk of large-cannula dislodgement limit this approach; however, its safety has been demonstrated by the ambulation of ECMO patients for physical therapy (Abrams D, et al. Ann Cardiothorac Surg. 2019;8[1]:44). Alternatively, extracorporeal carbon dioxide removal (ECCO2R) systems use smaller catheters to remove CO2 adequately while oxygen is supplemented via nasal cannula (Pisani L, et al. Respiratory Care. 2018;63[9]:1174). In select cases of acute severe asthma refractory to conventional medical therapy, extracorporeal support, especially ECCO2R, should be considered early rather than as rescue therapy.
Robert Baeten, DMSc, PA-C, FCCP
Steering Committee Member
Munish Luthra, MD, FCCP
Steering Committee Member
Having unmet social needs ups cardiovascular risk
WASHINGTON – Having unmet social needs ups a person’s cardiovascular risk, according to a study.
“Although there have been great medical interventions and our technology keeps improving, we can’t prevent the burden of cardiovascular disease. It’s the social factors that are playing this role,” said Ana Palacio, MD, MPH, of the University of Miami during her presentation of the study findings at the annual meeting of the Society of General Internal Medicine.
“We need to address issues at the patient’s home, such as food, isolation, and transportation, to help them prevent cardiovascular risk,” she added.
The study was designed to determine the effect of patient-reported social determinants of health (SDH) on the Framingham risk score (FRS). Researchers also wanted to assess the relationship between the SDH score and individual cardiovascular risk factors, including blood pressure, hemoglobin A1c, LDL cholesterol, body mass index, tobacco use, and physical activity.
Results showed that several SDH factors significantly increased the FRS, including being born outside of the United States, living alone, having a high social isolation score, and having a low geocode-based median household income (P less than .01). The calculated SDH score ranged from 0 to 59.
Higher SDH scores were associated with higher FRS as well as with poorer blood pressure and diabetes control. Additionally, patients with financial strain, poor health literacy, stress, lack of education, or a low median household income were more likely to have a sedentary lifestyle. Black or Hispanic patients who were born outside the United States and had a low median household income were at higher risk of obesity.
The retrospective cohort study originally involved 11,113 primary care patients who received care at the University of Miami Health System between Sept. 16, 2016, and Sept. 10, 2017, and answered an SDH survey. Of this group, 2,876 patients had complete electronic health record data from which a score could be compiled. This population had a mean age of 53.8 years and was 61% female; 38% were Hispanic and 9% were black. The mean household income was $53,677, and 87% reported speaking English.
The study examined a total of 11 self-reported and census-based SDH factors. The self-reported factors were race/ethnicity, education, financial strain, stress, tobacco use and physical activity, social isolation, years living in the United States, health literacy, and delayed care. The remaining factors were based on an area deprivation index and census-driven median household income.
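How such factors might combine into a single 0-59 score can be sketched as a weighted checklist. The weights and field names below are purely hypothetical, since the study’s actual scoring scheme is not described here.

```python
# Hypothetical illustration of combining SDH factors into one score.
# The study's actual weights and coding are not given in the article;
# every weight and field name below is an assumption for illustration.
SDH_WEIGHTS = {
    "born_outside_us": 8,
    "lives_alone": 6,
    "high_social_isolation": 10,
    "low_geocoded_income": 9,
    "financial_strain": 7,
    "poor_health_literacy": 5,
}

def sdh_score(patient: dict) -> int:
    """Sum the weights of the factors present (True) for a patient."""
    return sum(w for factor, w in SDH_WEIGHTS.items() if patient.get(factor))

print(sdh_score({"lives_alone": True, "financial_strain": True}))  # -> 13
```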
“The most surprising finding was how much weight the social factors have in adding to the Framingham risk score, in taking a patient from a medium score to a higher score because of their social environment,” said Dr. Palacio.
The study was funded by the Precision Medicine and Health Disparities Collaborative and was supported by the National Institute on Minority Health and Health Disparities and National Human Genome Research Institute of the National Institutes of Health.
REPORTING FROM SGIM 2019
Updated systematic review of aspirin primary prevention shows benefits, risks
Using daily aspirin treatment for the primary prevention of cardiovascular events remains an individualized decision that needs to balance a person’s risks for ischemic events and bleeding, according to results from a new systematic review of 15 randomized, aspirin-prevention trials, including results from 3 major trials that researchers reported during 2018.
“The findings suggest that the decision to use aspirin for primary prevention should be tailored to the individual patients based on estimated atherosclerotic cardiovascular disease risk and perceived bleeding risk, as well as patient preferences regarding the types of event prevented versus potential bleeding caused,” Jawahar L. Mehta, MD, and his associates wrote in an article published on June 10 in the Journal of the American College of Cardiology.
The authors also concluded that if a person decides to use aspirin for primary prevention, then a low dose of 100 mg/day or less is recommended.
This new systematic review follows two reviews published earlier in 2019 that reached roughly similar conclusions after analyzing largely the same randomized trial data, including the same three major trials from 2018. One of these prior reviews included data from 13 trials and a total of 164,225 people (JAMA. 2019 Jan 22;321[3]:277-87). The second review had data from 11 trials with 157,248 people (Eur Heart J. 2019 Feb 14;40[7]:607-17). The newly published review used data collected by 15 trials from 165,502 people.
The three 2018 trials that triggered the updated data assessments were the ARRIVE trial, with 12,546 people randomized (Lancet. 2018 Sep 22;392[10152]:1036-46); the ASPREE trial, with 19,114 people randomized (N Engl J Med. 2018 Oct 18;379[16]:1509-18); and the ASCEND trial, with 15,480 people randomized (N Engl J Med. 2018 Oct 18;379[16]:1529-39).
As stated in the new report from Dr. Mehta, a professor of medicine at the University of Arkansas for Medical Sciences in Little Rock, and his associates, the recent trial results from 2018 added new data from more than 45,000 additional subjects, a development that warranted a reappraisal of the evidence for aspirin’s efficacy and safety for primary prevention in contemporary practice.
The major findings from the analysis by Dr. Mehta and his associates were that, in adults without a history of cardiovascular disease, daily aspirin use reduced the incidence of MIs, with a number needed to treat (NNT) of 357; reduced ischemic stroke (NNT, 500); reduced transient ischemic attack (NNT, 370); and reduced the overall, combined rate of all major adverse cardiovascular events (NNT, 263). But on the safety side, daily aspirin led to an increased rate of major bleeding episodes, with a number needed to harm (NNH) of 222; increased intracranial bleeds (NNH, 1,000); and increased gastrointestinal bleeds (NNH, 385).
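These NNT and NNH figures are simply reciprocals of absolute risk differences over the trial follow-up. The short check below converts the quoted numbers back to absolute differences; it is illustrative arithmetic, not a reanalysis.

```python
# NNT/NNH is the reciprocal of the absolute risk difference. A quick check
# of what the quoted numbers imply (illustrative arithmetic only):
def abs_risk_difference(nnt: float) -> float:
    return 1.0 / nnt

for label, nnt in [("MI (NNT)", 357), ("ischemic stroke (NNT)", 500),
                   ("major bleeding (NNH)", 222), ("GI bleeding (NNH)", 385)]:
    print(f"{label:>22}: {abs_risk_difference(nnt):.4%} over the trial period")
# e.g., an NNT of 357 for MI corresponds to an absolute risk reduction of
# about 0.28% over the trials' mean follow-up of roughly 6.4 years.
```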
The analysis “demonstrates a potential reduction of net benefit with aspirin in the contemporary era,” the authors concluded. They also noted that the benefits from aspirin prevention were, as expected, “more pronounced” among people with a higher estimated risk from atherosclerotic cardiovascular disease.
The systematic review findings came against the backdrop of a recently released primary prevention guideline from the American College of Cardiology and American Heart Association (J Am Coll Cardiol. 2019 Mar. doi: 10.1016/j.jacc.2019.03.010). The guideline said that aspirin prophylaxis for primary prevention “might be considered” for adults aged 40-70 years, but should not be used for people who are older than 70, and also should not be given to people with an increased risk for bleeding. In general, the experts who produced this guideline said that aspirin prophylaxis should be infrequent.
The new analysis also found no reduction in the incidence of cancer or cancer-related death linked with aspirin use for primary prevention. The systematic review published earlier in 2019 in JAMA also found no link between aspirin use and cancer incidence or mortality. The review from the European Heart Journal did not report on the link between aspirin use and cancer incidence or mortality.
Dr. Mehta has been a consultant to AstraZeneca, Bayer, Boehringer Ingelheim, Medimmune, and Pfizer, and has received grant support from AstraZeneca, Bayer, and Boehringer Ingelheim.
On Twitter @mitchelzoler
SOURCE: Abdelaziz HK et al. J Am Coll Cardiol. 2019 Jun 10. doi: 10.1016/j.jacc.2019.03.501.
The three trials published in 2018 that added important new data on primary prevention for cardiovascular disease with aspirin must ideally be interpreted within the context of the totality of evidence on this subject. This was achieved in the analysis reported by Dr. Mehta and his associates, as well as in other more recent publications.
Making a decision about using aspirin for primary prevention in individuals based on trial data is very challenging because it requires weighing a modest potential benefit that people gain from daily aspirin for preventing a first cardiovascular event against the modest risk of an adverse bleeding event. It does not suffice simply to compare the number of cardiovascular and bleeding events, because those two types of events do not have the same immediate or long-term consequences. Each patient must make a personal choice between the risks and benefits.
The greatest potential benefit from aspirin prophylaxis seems to be in people with increased cardiovascular risk but with no increased bleeding risk. In general, this means people aged 50-59 years, and possibly those aged 60-69 years whose estimated 10-year cardiovascular disease risk exceeds 10%. It may make more sense to first focus on other risk-reducing steps, such as smoking cessation, blood pressure control, and statin treatment. After that, prophylactic aspirin may be reasonable for people who retain a 10-year cardiovascular disease risk of more than 10% and who are not at increased bleeding risk. It therefore seems prudent to avoid starting aspirin for primary prevention once people reach age 70 years, although those who have been taking aspirin safely before reaching 70 might reasonably consider continuing the prophylaxis.
This and similar reviews continue to have major limitations. The duration of the trials they reviewed, a mean of 6.4 years, is insufficient to understand the full effect from aspirin prophylaxis. Also, none of the recent reviews used a patient-level meta-analysis, which could better help us understand aspirin’s action in key subgroups, such as women, patients with diabetes, and patients on treatments such as statins that reduce their cardiovascular risk.
Michael Pignone, MD, is professor and chair of medicine at the University of Texas Dell Medical School in Austin. He had no disclosures. He made these comments in an editorial that accompanied the report (J Am Coll Cardiol. 2019 Jun 10. doi: 10.1016/j.jacc.2019.03.502).
The three trials published in 2018 that added important new data on primary prevention for cardiovascular disease with aspirin must ideally be interpreted within the context of the totality of evidence on this subject. This was achieved in the analysis reported by Dr. Mehta and his associates, as well as in other more recent publications.
Making a decision about using aspirin for primary prevention in individuals based on trial data is very challenging because it requires weighing a modest potential benefit that people gain from daily aspirin for preventing a first cardiovascular event against the modest risk of an adverse bleeding event. It does not suffice simply to compare the number of cardiovascular and bleeding events, because those two types of events do not have the same immediate or long-term consequences. Each patient must make a personal choice between the risks and benefits.
The greatest potential benefit from aspirin prophylaxis seems to be in people with increased cardiovascular risk but with no increased bleeding risk. In general, this means people aged 50-59 years old, and also possibly those aged 60-69 years old if their estimated 10-year cardiovascular disease risk is more than 10%. It may make more sense to first focus on other risk-reducing steps, such as smoking cessation, blood pressure control, and statin treatment. After that, prophylactic aspirin may be reasonable for people who retain a 10-year cardiovascular disease risk of more than 10% who are also not at increased bleeding risk. That seems to make it prudent to avoid aspirin for primary prevention once people reach the age of 70 years, although people who have been taking aspirin safely for a period of time before reaching 70 might reasonably consider continuing the prophylaxis for a period of time.
This and similar reviews continue to have major limitations. The duration of the trials they reviewed, a mean of 6.4 years, is insufficient to understand the full effect from aspirin prophylaxis. Also, none of the recent reviews used a patient-level meta-analysis, which could better help us understand aspirin’s action in key subgroups, such as women, patients with diabetes, and patients on treatments such as statins that reduce their cardiovascular risk.
Michael Pignone, MD, is professor and chair of medicine at the University of Texas Dell Medical School in Austin. He had no disclosures. He made these comments in an editorial that accompanied the report (J Am Coll Cardiol. 2019 Jun 10. doi: 10.1016/j.jacc.2019.03.502).
The three trials published in 2018 that added important new data on primary prevention for cardiovascular disease with aspirin must ideally be interpreted within the context of the totality of evidence on this subject. This was achieved in the analysis reported by Dr. Mehta and his associates, as well as in other more recent publications.
Making a decision about using aspirin for primary prevention in individuals based on trial data is very challenging because it requires weighing a modest potential benefit that people gain from daily aspirin for preventing a first cardiovascular event against the modest risk of an adverse bleeding event. It does not suffice simply to compare the number of cardiovascular and bleeding events, because those two types of events do not have the same immediate or long-term consequences. Each patient must make a personal choice between the risks and benefits.
The greatest potential benefit from aspirin prophylaxis seems to be in people with increased cardiovascular risk but with no increased bleeding risk. In general, this means people aged 50-59 years old, and also possibly those aged 60-69 years old if their estimated 10-year cardiovascular disease risk is more than 10%. It may make more sense to first focus on other risk-reducing steps, such as smoking cessation, blood pressure control, and statin treatment. After that, prophylactic aspirin may be reasonable for people who retain a 10-year cardiovascular disease risk of more than 10% who are also not at increased bleeding risk. That seems to make it prudent to avoid aspirin for primary prevention once people reach the age of 70 years, although people who have been taking aspirin safely for a period of time before reaching 70 might reasonably consider continuing the prophylaxis for a period of time.
This and similar reviews continue to have major limitations. The duration of the trials they reviewed, a mean of 6.4 years, is insufficient to understand the full effect from aspirin prophylaxis. Also, none of the recent reviews used a patient-level meta-analysis, which could better help us understand aspirin’s action in key subgroups, such as women, patients with diabetes, and patients on treatments such as statins that reduce their cardiovascular risk.
Michael Pignone, MD, is professor and chair of medicine at the University of Texas Dell Medical School in Austin. He had no disclosures. He made these comments in an editorial that accompanied the report (J Am Coll Cardiol. 2019 Jun 10. doi: 10.1016/j.jacc.2019.03.502).
Using daily aspirin treatment for the primary prevention of cardiovascular events remains an individualized decision that needs to balance a person’s risks for ischemic events and bleeding, according to results from a new systematic review of 15 randomized, aspirin-prevention trials, including results from 3 major trials that researchers reported during 2018.
“The findings suggest that the decision to use aspirin for primary prevention should be tailored to the individual patients based on estimated atherosclerotic cardiovascular disease risk and perceived bleeding risk, as well as patient preferences regarding the types of event prevented versus potential bleeding caused,” Jawahar L. Mehta, MD, and his associates wrote in an article published on June 10 in the Journal of the American College of Cardiology.
The authors also concluded that if a person decides to use aspirin for primary prevention, then a low dose of 100 mg/day or less is recommended.
This new systematic review follows two reviews published earlier in 2019 that reached roughly similar conclusions after analyzing largely the same randomized trial data, including the same three major trials from 2018. One of these prior reviews included data from 13 trials and a total of 164,225 people (JAMA. 2019 Jan 22;321[3]:277-87). The second review had data from 11 trials with 157,248 people (Eur Heart J. 2019 Feb 14;40[7]:607-17). The newly published review used data collected by 15 trials from 165,502 people.
The three 2018 trials that triggered the updated data assessments were the ARRIVE trial, with 12,546 people randomized (Lancet. 2018 Sep 22;392[10152]:1036-46), the ASPREE trial, with 19,114 people randomized (New Engl J Med. 2018 Oct 18;379[16]:1509-18), and the ASCEND trial, with 15,480 people randomized (New Engl J Med. 2018 Oct 18;379[16]:1529-39).
As stated in the new report from Dr. Mehta, a professor of medicine at the University of Arkansas for Medical Sciences in Little Rock, and his associates, the recent trial results from 2018 added new data from more than 45,000 additional subjects, a development that warranted a reappraisal of the evidence for aspirin’s efficacy and safety for primary prevention in contemporary practice.
The major findings from the analysis by Dr. Mehta and his associates were that in adults without a history of cardiovascular disease, daily aspirin use reduced the incidence of MIs, with a number needed to treat (NNT) of 357; reduced ischemic stroke (NNT, 500), reduced transient ischemic attack (NNT, 370), and reduced the overall, combined rate of all major adverse cardiovascular events (NNT, 263). But on the safety side, daily aspirin led to an increased rate of major bleeding episodes, with a number needed to harm (NNH) of 222, increased intracranial bleeds (NNH, 1,000), and an increase in gastrointestinal bleeds (NNH, 385).
The analysis “demonstrates a potential reduction of net benefit with aspirin in the contemporary era,” the authors concluded. They also noted that the benefits from aspirin prevention were, as expected, “more pronounced” among people with a higher estimated risk from atherosclerotic cardiovascular disease.
The systematic review findings came against the backdrop of a recently released primary prevention guideline from the American College of Cardiology and American Heart Association (J Am Coll Card. 2019 Mar. doi: 10.1016/j.jacc.2019.03.010). The guideline said that aspirin prophylaxis for primary prevention “might be considered” for adults aged 40-70 years, but should not be used for people who are older than 70, and also should not be given to people with an increased risk for bleeding. In general, the experts who produced this guideline said that aspirin prophylaxis should be infrequent.
The new analysis also found no reduction in the incidence of cancer or cancer-related death linked with aspirin use for primary prevention. The systematic review published earlier in 2019 in JAMA also found no link between aspirin use and cancer incidence or mortality. The review from the European Heart Journal did not report on the link between aspirin use and cancer incidence or mortality.
Dr. Mehta has been a consultant to AstraZeneca, Bayer, Boehringer Ingelheim, Medimmune, and Pfizer, and has received grant support from AstraZeneca, Bayer, and Boehringer Ingelheim.
On Twitter @mitchelzoler
SOURCE: Abdelaziz HK et al. J Am Coll Cardiol. 2019 Jun 10. doi: 10.1016/j.jacc.2019.03.501.
Using daily aspirin treatment for the primary prevention of cardiovascular events remains an individualized decision that needs to balance a person’s risks for ischemic events and bleeding, according to results from a new systematic review of 15 randomized, aspirin-prevention trials, including results from 3 major trials that researchers reported during 2018.
“The findings suggest that the decision to use aspirin for primary prevention should be tailored to the individual patients based on estimated atherosclerotic cardiovascular disease risk and perceived bleeding risk, as well as patient preferences regarding the types of event prevented versus potential bleeding caused,” Jawahar L. Mehta, MD, and his associates wrote in an article published on June 10 in the Journal of the American College of Cardiology.
The authors also concluded that if a person decides to use aspirin for primary prevention, then a low dose of 100 mg/day or less is recommended.
This new systematic review follows two reviews published earlier in 2019 that reached roughly similar conclusions after analyzing largely the same randomized trial data, including the same three major trials from 2018. One of these prior reviews included data from 13 trials and a total of 164,225 people (JAMA. 2019 Jan 22;321[3]:277-87). The second review had data from 11 trials with 157,248 people (Eur Heart J. 2019 Feb 14;40[7]:607-17). The newly published review used data from 165,502 people enrolled in 15 trials.
The three 2018 trials that triggered the updated data assessments were the ARRIVE trial, with 12,546 people randomized (Lancet. 2018 Sep 22;392[10152]:1036-46); the ASPREE trial, with 19,114 people randomized (N Engl J Med. 2018 Oct 18;379[16]:1509-18); and the ASCEND trial, with 15,480 people randomized (N Engl J Med. 2018 Oct 18;379[16]:1529-39).
Dr. Mehta, a professor of medicine at the University of Arkansas for Medical Sciences, Little Rock, and his associates noted in their report that these 2018 trials added data from more than 45,000 subjects, a development that warranted a reappraisal of the evidence for aspirin’s efficacy and safety for primary prevention in contemporary practice.
The major findings from the analysis by Dr. Mehta and his associates were that, in adults without a history of cardiovascular disease, daily aspirin use reduced the incidence of MI (number needed to treat [NNT], 357), ischemic stroke (NNT, 500), transient ischemic attack (NNT, 370), and the overall, combined rate of all major adverse cardiovascular events (NNT, 263). On the safety side, daily aspirin increased the rate of major bleeding episodes (number needed to harm [NNH], 222), intracranial bleeds (NNH, 1,000), and gastrointestinal bleeds (NNH, 385).
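Because NNT and NNH are simply the reciprocals of the absolute risk difference, the review’s figures can be restated as expected events per 10,000 people treated. The short Python sketch below performs that conversion using the values reported above; it is an illustrative back-of-the-envelope calculation, not part of the authors’ analysis.

```python
# Convert the review's reported NNT/NNH values into expected events
# per 10,000 people treated with daily aspirin. NNT (or NNH) is the
# reciprocal of the absolute risk difference, so expected events per
# 10,000 treated = 10,000 / NNT. Values from Abdelaziz et al.; the
# restatement itself is illustrative only.

nnt = {  # benefits: events prevented
    "MI": 357,
    "ischemic stroke": 500,
    "transient ischemic attack": 370,
    "major adverse cardiovascular events": 263,
}
nnh = {  # harms: events caused
    "major bleeding": 222,
    "intracranial bleeding": 1_000,
    "gastrointestinal bleeding": 385,
}

for label, n in nnt.items():
    print(f"{label}: ~{10_000 / n:.0f} prevented per 10,000 treated")
for label, n in nnh.items():
    print(f"{label}: ~{10_000 / n:.0f} caused per 10,000 treated")
```

Framed this way, daily aspirin would be expected to prevent roughly 38 major adverse cardiovascular events while causing roughly 45 major bleeds per 10,000 people treated, which makes the narrow benefit-harm margin concrete.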
The analysis “demonstrates a potential reduction of net benefit with aspirin in the contemporary era,” the authors concluded. They also noted that the benefits from aspirin prevention were, as expected, “more pronounced” among people with a higher estimated risk from atherosclerotic cardiovascular disease.
The systematic review findings came against the backdrop of a recently released primary prevention guideline from the American College of Cardiology and the American Heart Association (J Am Coll Cardiol. 2019 Mar. doi: 10.1016/j.jacc.2019.03.010). The guideline said that aspirin prophylaxis for primary prevention “might be considered” for adults aged 40-70 years, but should not be used in people older than 70 years or in people with an increased risk for bleeding. In general, the experts who produced the guideline said that aspirin prophylaxis should be infrequent.
The new analysis also found no reduction in the incidence of cancer or cancer-related death linked with aspirin use for primary prevention. The systematic review published earlier in 2019 in JAMA likewise found no link between aspirin use and cancer incidence or mortality; the European Heart Journal review did not report on this outcome.
Dr. Mehta has been a consultant to AstraZeneca, Bayer, Boehringer Ingelheim, MedImmune, and Pfizer, and has received grant support from AstraZeneca, Bayer, and Boehringer Ingelheim.
SOURCE: Abdelaziz HK et al. J Am Coll Cardiol. 2019 Jun 10. doi: 10.1016/j.jacc.2019.03.501.
FROM JACC
Key clinical point: Cumulative trial results continue to show that aspirin for primary prevention cuts cardiovascular disease events while increasing major bleeds.
Major finding: Aspirin prophylaxis cut cardiovascular events with an NNT of 263, but increased major bleeds with an NNH of 222.
Study details: Systematic review of data from 165,502 people enrolled in 15 randomized trials.
Disclosures: Dr. Mehta has been a consultant to AstraZeneca, Bayer, Boehringer Ingelheim, MedImmune, and Pfizer, and has received grant support from AstraZeneca, Bayer, and Boehringer Ingelheim.
Source: Abdelaziz HK et al. J Am Coll Cardiol. 2019 Jun 10. doi: 10.1016/j.jacc.2019.03.501.