Posttransplant skin conditions vary widely by ethnicity

A new study finds that the risk of skin cancers in organ transplant recipients may vary widely by ethnicity.

“The most important findings from our study are the high rates of keratinocyte neoplasms observed in our white Northern European patients, but also in those of Far East Asian descent. Dermatologists should also appreciate the high risk of Kaposi’s sarcoma (KS) in patients originating from Sub-Saharan Africa,” Jonathan Kentley, MBBS, of Royal London Hospital, said in an interview. He presented the study findings at the annual meeting of the American Academy of Dermatology.

“As the immune system plays a pivotal role in the surveillance and destruction of skin cancer, iatrogenic immunosuppression has a profound impact on morbidity and mortality in these patients,” he noted. “This presents a significant health issue for transplant recipients, and they are at an increased risk of almost every skin cancer. Squamous cell carcinoma (SCC), in particular, has been intensively studied, and some literature suggests that transplant recipients are at a more than 100-times increased risk of SCC.”

For the study, Dr. Kentley and colleagues sought to better understand ethnic differences in skin disorders in patients who have received organ transplants, since many previous studies have included few nonwhite subjects.

They analyzed an organ transplant center database for the years 1989-2016, tracking 1,304 consecutive patients, 1,125 of whom had skin problems. The overall population was 64% male with a median age in the early 40s, and almost all (1,276) had undergone renal transplants; a relative handful underwent liver, lung, heart, and pancreas transplants.

The majority of patients (885) were white Northern Europeans, but there were also significant numbers of people with South Asian (202), black African/Caribbean (131), and white Mediterranean (52) heritage. A small number were Far East Asian (26) and Middle Eastern (8). The median follow-up time for the ethnic groups varied from about 5 years to about 12 years.

The researchers found that basal cell carcinoma was most common in white Northern European patients, at nearly 25%, with other groups under 10%. SCC was common in white Northern European patients and Far East Asians, both at nearly 25%.

KS was by far the most common in black African/Caribbean patients, at nearly 11%. According to Dr. Kentley, researchers found the number of KS cases to be surprisingly high in this group, “compounded by the fact that we have had a number of additional cases in the past year after we had collected the data for this study.” He attributed the higher number of KS cases in these patients to an increased seroprevalence of its causative agent, human herpesvirus-8, in Sub-Saharan Africa. The rate of KS in the second most commonly affected group – white Mediterranean patients – was almost 2%.

Viral warts were common in most groups, with rates of nearly 60% in both white groups (white Northern European and white Mediterranean) and about 65% in Far East Asians. Porokeratosis was by far the most common in white Northern Europeans, at nearly 8%, and sebaceous hyperplasia was common in all groups (more than 20% to about 27%) except the black African/Caribbean and South Asian groups.

All these results were statistically significant, with P values less than .05.

“Our study has confirmed the increased risk of keratinocyte cancers in patients of white Northern European descent, as well as providing more information on the increased risk in patients of Far East Asian descent,” Dr. Kentley said. “We have also confirmed the propensity of black African/Caribbean patients to develop Kaposi’s sarcoma in the first 5 years post transplant and highlighted that white Mediterranean patients are also at high risk. Beyond this, we have been able to review the prevalence of rare malignancies, such as Merkel cell carcinoma and appendageal tumors, and highlight that white Northern European patients remain at high risk of developing these conditions.”

As for the impact on clinical practice, “the patterns of skin disease susceptibility we have identified have important implications for rational design of transplant skin surveillance programs, targeted patient (and provider) education, and optimized clinical management,” Dr. Kentley said. “Ultimately, this is likely to have a significant impact on strategic deployment of limited dermatology health care resources.”

Specifically, the study suggests that all organ transplant patients receive a baseline skin assessment visit and nurse-led targeted education. Black African/Caribbean patients should be followed up for at least 5 years after transplant.

In the United States, at least 724,000 people have undergone organ transplants since 1988, with most getting kidney transplants, according to the United Network for Organ Sharing (UNOS).

No study funding was reported. The authors had no disclosures.

SOURCE: Kentley J et al. AAD 2018, Session F055.

REPORTING FROM AAD 18

Vitals


Key clinical point: Skin disorders after organ transplant differ widely by ethnicity.

Major finding: Posttransplant basal cell and squamous cell carcinomas were most common in white Northern Europeans (at nearly 25%), while Kaposi’s sarcoma was higher than expected (nearly 11%) in black African/Caribbean patients.

Study details: Analysis of 1,125 patients from a single transplant center who received organ transplants and developed skin problems over a median follow-up time of 5 to more than 12 years, depending on ethnicity.

Disclosures: No study funding was reported. The authors had no disclosures.

Source: Kentley J et al. AAD 2018, Session F055.


Microneedling improved acne scars in small study of patients with darker skin

Microneedling treatment produced statistically significant improvements in pigmentation-associated acne scars in a study of patients with dark skin, and did not contribute to more pigmentation, the study authors reported.

Most patients were pleased with the results. “Microneedling is an effective and safe treatment for acne scars associated with pigmentation in dark-skinned patients, without adding any risk of causing worsening of pigmentation,” the study’s lead author, Firas Al Qarqaz, MD, said in an interview.

The study was published online in the Journal of Cosmetic Dermatology.

Dr. Al Qarqaz, of the department of dermatology, Jordan University of Science and Technology, Irbid, Jordan, pointed out that patients with darker skin and acne scars pose a unique challenge because some current treatments “can improve the scars but carry a risk of worsening the pigmentation and making skin/scars darker, which can be as troublesome to patients as their original scars.” Indeed, a review of microneedling as a treatment for dermatologic conditions in patients with darker skin noted that conventional resurfacing procedures can be limited in this patient population, because of concerns of adverse effects, including dyspigmentation (J Am Acad Dermatol. 2016 Feb;74[2]:348-55).



The situation is especially complex because “the current assessment methods for evaluating acne scars are not addressing clearly the important aspect of pigmentation that is associated with such scars, especially in darker skin, which can make objective assessment for improvement lacking,” Dr. Al Qarqaz noted.

He and a coauthor conducted the new study to determine whether microneedling can safely and effectively improve both acne scars and related hyperpigmentation in patients with darker skin. The study of 39 patients with postacne scarring comprised 31 women and 8 men aged 18-43 years (mean age, 27); their skin colors spanned Fitzpatrick skin types III-V. Most (27) were type IV.

The patients were treated with an electronic microneedling device and evaluated at 2 weeks, with a final evaluation at least 4 weeks after the initial assessment (range, 4-14 weeks). The researchers found statistically significant improvement in two measures: the Postacne Hyperpigmentation Index improved from a mean score of 13 at baseline to a mean of 10 post procedure (P = .0035), and the Goodman-Baron acne scarring scale improved from a mean of 18 at baseline to a mean of 12 post procedure (P = .008).

Side effects were mild and temporary. “This treatment seems to be safe apart from short-lived erythema and occasional small hematoma following procedure,” the researchers concluded.

Nearly 80% of patients said they were satisfied with the procedure; the rest were not satisfied with the results (no reasons were cited). “Additional studies focusing on postacne scarring with hyperpigmentation are needed,” in addition to “assessment tools designed for this particular patient group,” the authors noted. They added that more treatments may be needed in some patients for hyperpigmentation.

Jordan University of Science and Technology funded the study. The study authors reported no relevant disclosures.

SOURCE: Al-Qarqaz F et al. J Cosmet Dermatol. 2018 Mar 15. doi: 10.1111/jocd.12520.

FROM THE JOURNAL OF COSMETIC DERMATOLOGY

Vitals


Key clinical point: Microneedling produced statistically significant improvements without worsening pigmentation in darker-skinned patients with postacne scarring.

Major finding: The Postacne Hyperpigmentation Index improved from a mean score of 13 to 10 (P = .0035), and the Goodman-Baron acne scarring scale improved from a mean of 18 to 12 (P = .008).

Study details: Microneedling was used to treat postacne scarring in 31 women and 8 men with Fitzpatrick skin types III-V.

Disclosures: Jordan University of Science and Technology funded the study. The study authors reported no relevant disclosures.

Source: Al Qarqaz F et al. J Cosmet Dermatol. 2018 Mar 15. doi: 10.1111/jocd.12520.


Hope and hype: Inside the push for wearable diabetes technology

Want to make a billion dollars? Here’s a hot tip: Invent wearable technology that detects diabetes, measures glucose levels, and determines how much insulin is needed – all without the need for a single drop of blood.

If you accept this mission, there’s a catch: You’ll have a whole bunch of company. When it comes to using technology to free patients with diabetes from the dreaded finger stick, “hope springs eternal in the hearts of scientists, entrepreneurs, opportunists, and charlatans alike,” writes electrochemical specialist and consultant John L. Smith, PhD, in his book “The Pursuit of Noninvasive Glucose.”

Google and Apple have been in the hunt, along with countless makers of devices and software. A noninvasive glucose monitoring system is the prime target, but there’s also plenty of interest in software that puts data from such devices as heartbeat sensors to work.

“Patients with diabetes are likely to be the early winners in the rise of digital health, a sector that attracted investment of $4.7 billion in 2017,” said Laura Baers, PhD, a technology adviser with the market research company IDTechEx, in an interview.

For the moment, however, results are elusive, and the name of the game is hype.

Early failure has a lasting impact

In the beginning, there was GlucoWatch. And it was not good.

The GlucoWatch G2 Biographer received approval from the Food and Drug Administration back in 2001 and was touted as a high-tech tool to monitor glucose levels via interstitial fluid concentrations every 10 minutes. The device promised to draw glucose to the skin surface for measurement via a low-level electric current, and alarms were to go off when hypoglycemia or hyperglycemia was detected.

But numerous problems cropped up. There was a time lag, with the device estimating glucose levels that actually occurred 15-20 minutes earlier. Some patients couldn’t tolerate wearing the watch, and some were burned by the electric current.

And perhaps worst of all, the measurements often weren’t accurate, with one study finding that more than half of 240 nighttime alarms incorrectly warned children with diabetes of dangerously high or low glucose. (Diabetes Technol Ther. 2005 June; 7[3]:440-7).

As a result of the GlucoWatch debacle, the FDA “became a little gun-shy about approving anything. It made it even harder to approve something,” said Mark J. Rice, MD, of Vanderbilt University, Nashville, Tenn., who has tried to develop glucose-measuring technology.

GlucoWatch was removed from the market, and no noninvasive glucose-monitoring devices are currently being sold in the United States. That leaves plenty of room for the companies that want a piece of the action.

“If a device were to be commercialized, it would be hugely disruptive to the industry,” according to Dr. Baers, who said that she expects a device eventually will lead to higher levels of diabetes control and fewer side effects. “This would result in billion-dollar savings for the health care industry and reduced complications,” she said.

On tech front, promises and more promises

So far, there have been more promises than actual products.

If you don’t look too closely at the website of a device called GlucoWise, you might assume a noninvasive glucose monitor already exists. Under a photo of a smiling woman, the site promises a “100% pain-free device that makes traditional blood sampling a thing of the past.”

The “simple yet highly reliable” device, which looks a bit like a large clip for a potato chip bag, promises to measure glucose through high-frequency radio waves that penetrate thin body tissue in the earlobe or the area between the thumb and forefinger.

But the GlucoWise device is neither approved nor available, and the company’s predictions that it would take preorders by late 2016 didn’t come true.

Another product, SugarBEAT, missed its planned 2016 release; its maker now hopes to make it available in Britain later this year. It promises to measure glucose levels every 5 minutes via a small disposable patch that draws interstitial fluid from the skin.

Meanwhile, Apple has enlisted biomedical engineers to work on a secret project to measure glucose continuously and noninvasively, CNBC reported last year. And Google announced in 2014 that it was working on a glucose-detecting contact lens that could alert patients via tiny LED lights – yes, apparently in the lenses themselves – if levels go too high or low. But neither of these technologies is ready for prime time.


Sneaky glucose molecules elude scientists

According to Dr. Rice, no truly noninvasive glucose-measuring technique has worked so far.

The challenge, he said, is that it’s difficult to measure tiny glucose molecules, which have no color and share many characteristics with H2O.

“The real problem is trying to measure a colorless molecule in a sea of water,” Dr. Rice said.

Glucose lab tests rely on indicators from reactions with other substances, he said, “but you can’t do that in the body.” Measuring glucose in tears or urine is one possibility, he said, but the scarcity in those liquids poses a challenge: “Your body doesn’t want to spill glucose and lose energy.”

Dr. Rice himself explored a glucose-measuring technique that aimed to correlate glucose levels to the speed of the retina’s reaction to light. The idea was that patients would wear special glasses that would shine a light in the eye at the press of a button. The project ultimately failed.

There are other challenges, said Dr. Baers, the technology adviser. “Glucose concentrations in sweat or tears are not reflective of blood glucose concentrations. To make things even more challenging, glucose levels in these fluids are orders of magnitude smaller than that found in blood.”

And, she said, there’s a time lag between glucose levels in blood and in other body fluids. “This means that a sweat glucose level is really giving information from an hour previous, which can be dangerous if you’re operating machinery or driving.”

 

 

Wearable diabetes tech targets more than glucose

There’s more to wearable, noninvasive diabetes technology than glucose-monitoring. One of the new frontiers is diagnostics.

Earlier this year, researchers from the University of California at San Francisco and the digital startup Cardiogram reported that they were able to use data from digital heart rate sensors (like those found in Apple Watches, Fitbits and other devices) to correctly detect diabetes in patients.

In a study presented at the 2018 meeting of the Association for the Advancement of Artificial Intelligence, the researchers said they detected diabetes in 85% of 462 participants (out of a pool of 14,011) who’d previously been diagnosed with the condition (AAAAI abstract arXiv:1802.02511v1 [cs.LG]).

Brandon Ballinger
Heart rates can offer insight into diabetes because “your pancreas is linked to your heart through both the sympathetic and parasympathetic nervous system,” said Cardiogram cofounder Brandon Ballinger in an interview. He pointed to a 2005 study that linked cardiac autonomic impairment to the development of diabetes. (Diabetes Care 2005 Mar; 28[3]: 668-74)

The next step is to test whether the data analysis can detect undiagnosed diabetes, Mr. Ballinger said.
 

 

As tech advances, questions remain

San Diego’s Scripps Whittier Diabetes Institute is another player in the diabetes/digital health world. It’s currently working on several clinical trials of diabetes technology, including a study into whether older adults with type 1 diabetes will benefit from a continuous glucose monitoring device with a wireless connection.

Dr. Athena Philis-Tsimikas
But Athena Philis-Tsimikas, MD, a corporate vice president with the institute, cautioned that wearable technology in diabetes is no cure-all. “Wearables and apps are never as easy as those who are selling them makes them sound,” she said. “They’re always more complex than the engineers that design them feel they are. And who has enough time to train them [patients] and fix the glitches?”

Devices that measure glucose can also suffer from errors in transmission, she said. And the existing continuous glucose monitors have trouble with accuracy at the very highest and lowest glucose levels, she said, although they are improving.

There are other questions about future wearable technology for diabetes: Will the devices cost more than continuous glucose monitoring systems (CGM), which are already pricey? How will private health information be protected? (As Mr. Ballinger noted, “wearable data itself is out of the scope of HIPAA.”) And will patients actually take action when their devices diagnose diabetes or warn them that their glucose levels are out of whack?
 

 


CGM systems provide insight into the latter issue. Repeated alarms about highs and lows can drive patients crazy, Dr. Philis-Tsimikas said. “You might end up with alarm fatigue and annoyance. They might hit a 250, but they won’t want the alarm to go off, and they don’t want to be reminded of it,” she said. “And they might go down to 60-80 at night, but they don’t want to be woken up because they’re used to that range.”

Even if patients do pay attention to their diabetes devices, they may not take the proper action. Dr. Philis-Tsimikas pointed to a 2016 study that found adding an exercise-tracking device to traditional weight-loss intervention didn’t lead to more weight loss. In fact, those who used the device actually loss less weight. (JAMA. 2016;316[11]:1161-71)

The lesson? “There has to be a combination of some education together with the physiologic information,” she said. For now, the good news is that “we still have other options,” Dr. Philis-Tsimikas said. The newly released CGM system known as the Freestyle Libre, she said, is one alternative.

And she mentioned another technique that’s still around. You could call it Old Faithful: the low-tech, high-hassle but highly accurate finger stick.

Dr. Baers and Dr. Rice report no disclosures. Dr. Philis-Tsimikas has no disclosures but notes that Scripps Whittier Diabetes Institute receives grants and funding in the diabetes field and works with a number of drug makers and device makers. Mr. Ballinger discloses salary and equity from Cardiogram.
Publications
Topics
Sections

 

Want to make a billion dollars? Here’s a hot tip: Invent wearable technology that detects diabetes, measures glucose levels, and determines how much insulin is needed – all without the need for a single drop of blood.

If you accept this mission, there’s a catch: You’ll have a whole bunch of company. When it comes to using technology to free patients with diabetes from the dreaded finger stick, “hope springs eternal in the hearts of scientists, entrepreneurs, opportunists, and charlatans alike,” writes electrochemical specialist and consultant John L. Smith, PhD, in his book “The Pursuit of Noninvasive Glucose.”

Google and Apple have been in the hunt, along with countless makers of devices and software. A noninvasive glucose monitoring system is the prime target, but there’s also plenty of interest in software that puts data from such devices as heartbeat sensors to work.

Dr. Laura Baers
“Patients with diabetes are likely to be the early winners in the rise of digital health, a sector that attracted investment of $4.7 billion in 2017,” said Laura Baers, PhD, a technology adviser with the market research company IDTechEx, in an interview.

For the moment, however, results are elusive, and the name of the game is hype.
 

 

Early failure has a lasting impact

In the beginning, there was GlucoWatch. And it was not good.

The GlucoWatch G2 Biographer received approval from the Food and Drug Administration back in 2001 and was touted as a high-tech tool that monitored glucose levels every 10 minutes via interstitial fluid concentrations. The device drew glucose to the skin surface for measurement via an electric current, and alarms were to go off when hypoglycemia or hyperglycemia was detected.

But numerous problems cropped up. There was a time lag, with the device estimating glucose levels that actually occurred 15-20 minutes earlier. Some patients couldn’t tolerate wearing the watch, and some were burned by the electric current.

And perhaps worst of all, the measurements often weren’t accurate, with one study finding that more than half of 240 nighttime alarms incorrectly warned children with diabetes of dangerously high or low glucose. (Diabetes Technol Ther. 2005 June; 7[3]:440-7).

As a result of the GlucoWatch debacle, the FDA “became a little gun shy about approving anything. It made it even harder to approve something,” said Mark J. Rice, MD, of Vanderbilt University, Nashville, Tenn., who has tried to develop glucose-measuring technology.

Dr. Mark J. Rice
GlucoWatch was removed from the market, and no noninvasive glucose-monitoring devices are currently being sold in the United States. That leaves plenty of room for the companies that want a piece of the action.

“If a device were to be commercialized, it would be hugely disruptive to the industry,” according to Dr. Baers, who said that she expects a device eventually will lead to higher levels of diabetes control and fewer side effects. “This would result in billion-dollar savings for the health care industry and reduced complications,” she said.
 

 

On tech front, promises and more promises

So far, there have been more promises than actual products.

If you don’t look too closely at the website of a device called GlucoWise, you might assume a noninvasive glucose monitor already exists. Under a photo of a smiling woman, the site promises a “100% pain-free device that makes traditional blood sampling a thing of the past.”

The “simple yet highly reliable” device, which looks a bit like a large clip for a potato chip bag, promises to measure glucose through high-frequency radio waves that penetrate thin body tissue in the earlobe or the area between the thumb and forefinger.

But the GlucoWise device is neither approved nor available, and the company’s predictions that it would take preorders by late 2016 didn’t come true.

Another product called SugarBEAT missed its planned 2016 release and now hopes to be available in Britain later this year. It promises to measure glucose levels every 5 minutes via a small disposable patch that draws interstitial fluid from the skin.

Meanwhile, Apple has enlisted biomedical engineers to work on a secret project to measure glucose continuously and noninvasively, CNBC reported last year. And Google announced in 2014 that it was working on a glucose-detecting contact lens that could alert patients via tiny LED lights – yes, apparently in the lenses themselves – if levels go too high or low. But neither of these technologies is ready for prime time.

 

 

Sneaky glucose molecules elude scientists

According to Dr. Rice, no truly noninvasive glucose-measuring technique has worked so far.

The challenge, he said, is that it’s difficult to measure tiny glucose molecules, which have no color and share many characteristics with H2O.

“The real problem is trying to measure a colorless molecule in a sea of water,” Dr. Rice said.

Glucose lab tests rely on indicators from reactions with other substances, he said, “but you can’t do that in the body.” Measuring glucose in tears or urine is one possibility, he said, but the scarcity in those liquids poses a challenge: “Your body doesn’t want to spill glucose and lose energy.”

Dr. Rice himself explored a glucose-measuring technique that aimed to correlate glucose levels to the speed of the retina’s reaction to light. The idea was that patients would wear special glasses that would shine a light in the eye at the press of a button. The project ultimately failed.

There are other challenges, said Dr. Baers, the technology adviser. “Glucose concentrations in sweat or tears are not reflective of blood glucose concentrations. To make things even more challenging, glucose levels in these fluids are orders of magnitude smaller than that found in blood.”

And, she said, there’s a time lag between glucose levels in blood and in other body fluids. “This means that a sweat glucose level is really giving information from an hour previous, which can be dangerous if you’re operating machinery or driving.”

 

 

Wearable diabetes tech targets more than glucose

There’s more to wearable, noninvasive diabetes technology than glucose monitoring. One of the new frontiers is diagnostics.

Earlier this year, researchers from the University of California, San Francisco, and the digital startup Cardiogram reported that they were able to use data from digital heart rate sensors (like those found in Apple Watches, Fitbits, and other devices) to correctly detect diabetes in patients.

In a study presented at the 2018 meeting of the Association for the Advancement of Artificial Intelligence, the researchers said they detected diabetes in 85% of 462 participants (out of a pool of 14,011) who’d previously been diagnosed with the condition (AAAI abstract, arXiv:1802.02511v1 [cs.LG]).
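That 85% figure is a sensitivity (true-positive rate). As a quick illustration, here is a minimal Python sketch; the flagged count of 393 is derived from the reported 85% of 462 participants and is not stated directly in the abstract:

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    """True-positive rate: the share of actual cases a classifier flags."""
    return true_pos / (true_pos + false_neg)

# 462 participants had a prior diabetes diagnosis; about 85% (roughly 393)
# were correctly detected. The 393 is derived, not reported directly.
tp = 393
fn = 462 - tp
print(f"sensitivity = {sensitivity(tp, fn):.3f}")  # prints: sensitivity = 0.851
```

Sensitivity alone says nothing about false alarms among the other participants, which is part of why testing the analysis on undiagnosed patients is the logical next step.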

Brandon Ballinger
Heart rates can offer insight into diabetes because “your pancreas is linked to your heart through both the sympathetic and parasympathetic nervous system,” said Cardiogram cofounder Brandon Ballinger in an interview. He pointed to a 2005 study that linked cardiac autonomic impairment to the development of diabetes. (Diabetes Care 2005 Mar; 28[3]: 668-74)

The next step is to test whether the data analysis can detect undiagnosed diabetes, Mr. Ballinger said.
 

 

As tech advances, questions remain

San Diego’s Scripps Whittier Diabetes Institute is another player in the diabetes/digital health world. It’s currently working on several clinical trials of diabetes technology, including a study into whether older adults with type 1 diabetes will benefit from a continuous glucose monitoring device with a wireless connection.

Dr. Athena Philis-Tsimikas
But Athena Philis-Tsimikas, MD, a corporate vice president with the institute, cautioned that wearable technology in diabetes is no cure-all. “Wearables and apps are never as easy as those who are selling them make them sound,” she said. “They’re always more complex than the engineers who design them feel they are. And who has enough time to train them [patients] and fix the glitches?”

Devices that measure glucose can also suffer from errors in transmission, she said. And the existing continuous glucose monitors have trouble with accuracy at the very highest and lowest glucose levels, she said, although they are improving.

There are other questions about future wearable technology for diabetes: Will the devices cost more than continuous glucose monitoring (CGM) systems, which are already pricey? How will private health information be protected? (As Mr. Ballinger noted, “wearable data itself is out of the scope of HIPAA.”) And will patients actually take action when their devices diagnose diabetes or warn them that their glucose levels are out of whack?
 

 


CGM systems provide insight into the latter issue. Repeated alarms about highs and lows can drive patients crazy, Dr. Philis-Tsimikas said. “You might end up with alarm fatigue and annoyance. They might hit a 250, but they won’t want the alarm to go off, and they don’t want to be reminded of it,” she said. “And they might go down to 60-80 at night, but they don’t want to be woken up because they’re used to that range.”

Even if patients do pay attention to their diabetes devices, they may not take the proper action. Dr. Philis-Tsimikas pointed to a 2016 study that found adding an exercise-tracking device to a traditional weight-loss intervention didn’t lead to more weight loss. In fact, those who used the device actually lost less weight. (JAMA. 2016;316[11]:1161-71)

The lesson? “There has to be a combination of some education together with the physiologic information,” she said. For now, the good news is that “we still have other options,” Dr. Philis-Tsimikas said. The newly released CGM system known as the FreeStyle Libre, she said, is one alternative.

And she mentioned another technique that’s still around. You could call it Old Faithful: the low-tech, high-hassle but highly accurate finger stick.

Dr. Baers and Dr. Rice report no disclosures. Dr. Philis-Tsimikas has no disclosures but notes that Scripps Whittier Diabetes Institute receives grants and funding in the diabetes field and works with a number of drug makers and device makers. Mr. Ballinger discloses salary and equity from Cardiogram.

 


Smokers face higher infection risk after hernia operations


 

Jonah Stulberg, MD, FACS, is a stickler about requiring patients to stop smoking at least 3 months before hernia surgery. He even uses urine tests to confirm whether they actually quit. A study by Dr. Stulberg and his colleagues supports this approach: Current and recent smokers are significantly more likely to suffer serious complications in the 30 days after elective hernia repair procedures.

The finding held up even after the researchers controlled for various factors. “Our findings are in agreement with other findings in higher risk surgeries, and they provide evidence that low-risk surgeries are not exempt from the risks associated with smoking,” said Dr. Stulberg in an interview. “Our data would suggest that there is significant clinical benefit to encouraging smoking cessation before elective hernia repair.”

Dr. Jonah Stulberg
Dr. Stulberg, of Northwestern University, Chicago, is a coauthor of the new study, which was published online in the American Journal of Surgery.

The researchers launched the study to better understand how smoking affects complication rates in light of the fact that “surgeons in the U.S. tend to offer low-risk elective surgical procedures to patients who are actively smoking despite overwhelming evidence that smoking increases surgical risks,” Dr. Stulberg said.

The researchers tracked 220,629 patients in the American College of Surgeons National Surgical Quality Improvement Program (NSQIP) database who underwent several types of elective hernia repair from 2011 to 2014.

Just over 18% of the patients said they’d smoked over the past year; they were more likely to be younger (median age, 50 for smokers vs. 57 for nonsmokers). Smokers also were more likely to be black, to be underweight, and to consume two or more alcoholic beverages per day (P less than .05).

The researchers tracked serious complications in the 30 days after surgery such as death, sepsis, and readmission.
 

 


Complications developed in 6.34% of smokers and 4.72% of nonsmokers (P less than .001). Numerous kinds of complications were more common in the smokers prior to adjustment: death, return to the operating room, readmission, and transfusion, plus wound, pulmonary, thromboembolic, and cardiac complications.

The researchers adjusted their statistics to account for factors such as ethnicity, sex, body mass index, preexisting comorbidities, and type of hernia operation. They found that the risk of any complication was higher in smokers, compared with nonsmokers (odds ratio, 1.30), as were the risks of several specific complications: death (OR, 1.53), return to the operating room (OR, 1.23), readmission (OR, 1.24), wound complications (OR, 1.36), sepsis/septic shock (OR, 1.31), pulmonary complications (OR, 1.77-2.30), and cardiac complications (OR, 1.27-1.43).
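For readers unfamiliar with the metric, an odds ratio compares the odds of an event between two groups. A minimal Python sketch, using hypothetical cohort counts chosen only to mirror the crude complication rates reported above (the study's adjusted ORs come from regression models, not a raw two-by-two table like this):

```python
def odds_ratio(events_a: int, total_a: int, events_b: int, total_b: int) -> float:
    """Unadjusted odds ratio of group A relative to group B."""
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

# Hypothetical cohorts of 1,000 each, with complication counts shaped
# like the reported crude rates (6.34% of smokers, 4.72% of nonsmokers).
print(f"crude OR = {odds_ratio(63, 1000, 47, 1000):.2f}")  # prints: crude OR = 1.36
```

Because complications are relatively uncommon, the crude odds ratio lands close to the simple risk ratio; the study's adjusted OR of 1.30 additionally controls for the covariates listed above.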

Only transfusion (OR, 0.90) and thromboembolic (OR, 0.87) complications were less likely in smokers.

The researchers noted that the statistics don’t allow them to analyze whether it makes any difference if smokers quit shortly before their procedures. Still, Dr. Stulberg stands by his you-must-quit-smoking-before-surgery edict. “I believe that their active smoking habit is a bigger health threat than their asymptomatic hernia, and therefore feel the right thing to do as their physician is support them through their smoking cessation,” he said. “I offer counseling and nicotine replacement if needed. I have very good quit rates and would encourage other surgeons to do the same.”

 

 

SOURCE: DeLancey JO et al. Am J Surg. 2018 Mar 6. doi: 10.1016/j.amjsurg.2018.03.004.

Publications
Topics
Sections

 

Jonah Stulberg, MD, FACS, is stickler about requiring patients to stop smoking at least 3 months before hernia surgery. He even uses urine tests to confirm whether they actually quit. A study by Dr. Stulberg and his colleagues supports this approach: Current and recent smokers are significantly more likely to suffer serious complications over 30 days after elective hernia repair procedures.

The finding held up even after the researchers controlled for various factors. “Our findings are in agreement with other findings in higher risk surgeries, and they provide evidence that low-risk surgeries are not exempt from the risks associated with smoking,” said Dr. Stulberg in an interview. “Our data would suggest that there is significant clinical benefit to encouraging smoking cessation before elective hernia repair.”

Dr. Jonah Stulberg
Dr. Stulberg of Northwestern University in Chicago, is a coauthor of the new study, which was published online in the American Journal of Surgery.

The researchers launched the study to better understand how smoking affects complication rates in light of the fact that “surgeons in the U.S. tend to offer low-risk elective surgical procedures to patients who are actively smoking despite overwhelming evidence that smoking increases surgical risks,” Dr. Stulberg said.

The researchers tracked 220,629 patients in the American College of Surgeons National Surgical Quality Improvement Project (NSQIP) database who underwent several types of elective hernia repair from 2011 to 2014.

Just over 18% of the patients said they’d smoked over the past year; they were more likely to be younger (median age, 50 for smokers vs. 57 for nonsmokers). Smokers also were more likely to be black, to be underweight, and to consume two or more alcoholic beverages per day (P less than .05).

The researchers tracked serious complications in the 30 days after surgery such as death, sepsis, and readmission.
 

 


Complications developed in 6.34% of smokers and 4.72% of nonsmokers (P less than .001). Numerous kinds of complications were more common in the smokers prior to adjustment: death, return to the operating room, readmission, and transfusion plus wound, pulmonary, thromboembolic and cardiac complications.

The researchers adjusted their statistics to account for factors such as ethnicity, sex, body mass index, preexisting comorbidities, and type of hernia operation. After adjustment, the risk of any complication remained higher in smokers than in nonsmokers (odds ratio, 1.30), as did the risks of death (OR, 1.53), return to the operating room (OR, 1.23), readmission (OR, 1.24), wound complication (OR, 1.36), sepsis/septic shock (OR, 1.31), pulmonary complication (OR, 1.77-2.30), and cardiac complication (OR, 1.27-1.43).

Only transfusion (OR, 0.90) and thromboembolic (OR, 0.87) complications were less likely in smokers.

The researchers noted that the statistics don’t allow them to analyze whether it makes any difference if smokers quit shortly before their procedures. Still, Dr. Stulberg stands by his you-must-quit-smoking-before-surgery edict. “I believe that their active smoking habit is a bigger health threat than their asymptomatic hernia, and therefore feel the right thing to do as their physician is support them through their smoking cessation,” he said. “I offer counseling and nicotine replacement if needed. I have very good quit rates and would encourage other surgeons to do the same.”

SOURCE: DeLancey JO et al. Am J Surg. 2018 Mar 6. doi: 10.1016/j.amjsurg.2018.03.004.

 



FROM AMERICAN JOURNAL OF SURGERY

Vitals

 

Key clinical point: Smokers are more likely than are nonsmokers to develop serious complications after elective hernia surgery.

Major finding: The adjusted risk of serious complications after elective hernia surgery is higher in smokers than in nonsmokers (odds ratio, 1.30).

Study details: Retrospective study of ACS NSQIP data on 220,629 patients in the United States (18% smokers) who underwent elective hernia operations during 2011-2014.

Disclosures: Northwestern Memorial Hospital and Northwestern University funded the study. Four of the nine authors reported various disclosures. The other authors report no disclosures.

Source: DeLancey JO et al. Am J Surg. 2018 Mar 6. doi: 10.1016/j.amjsurg.2018.03.004.


Single screening for Lynch syndrome beats sequential tests in CRC


Physicians could more accurately test patients with colon cancer for Lynch syndrome by using a single tumor sequencing test instead of the current protocol of up to six sequential tests, a new study suggests. The process may also be faster in some cases.

“We found that up-front tumor testing is actually more sensitive and more specific for detecting Lynch syndrome than the old, multiple-test model,” study coauthor Rachel Pearlman, MS, a genetic counselor at Ohio State University Wexner Medical Center, said in an interview. “Tumor sequencing resulted in a 10% improvement in Lynch syndrome detection rates while also providing important information about treatment options for the patients.”



According to Ms. Pearlman, screening for Lynch syndrome is recommended for all patients with colon cancer and can require multiple sequential tests. The syndrome affects an estimated 3% of these patients, putting them at higher risk of several kinds of cancers, including endometrial, ovarian, and gastric.

“Identifying the condition at the time of diagnosis can potentially impact treatment options and also help to facilitate intensive surveillance for other types of cancer,” Ms. Pearlman said. “In addition, we’ll know that the patients’ family members are at risk and will benefit from genetic counseling and testing.”

However, “traditional sequential testing is complex and confusing to patients and clinicians and occurs over a prolonged period, incurring risk for loss to follow-up,” the investigators wrote in JAMA Oncology.

For the new study, the researchers sought to determine whether tumor sequencing, a form of genetic testing, would be faster and more accurate than the current sequential testing approach.

In a multicenter study, they prospectively tested tumor DNA from 419 consecutive patients with colon cancer in 2015 and 2016. They also tested samples from another 46 patients who had been previously confirmed to have Lynch syndrome.


The average age of the patients was 60 years; 52% were women and 89% were white, with Hispanic and Asian patients each making up just 1% of the total. Most of the cancers were stage II (26%) or stage III (40%).

Tumor sequencing identified all of the 46 confirmed cases of Lynch syndrome and turned up 12 more in the larger group, the researchers found.

Sensitivity of tumor sequencing was better (100%; 95% confidence interval, 93.8%-100%) than immunohistochemical testing plus BRAF (89.7%; 95% CI, 78.8%-96.1%; P = .04) and microsatellite instability testing plus BRAF (91.4%; 95% CI, 81.0%-97.1%; P = .07), and its specificity was equal to the other approaches, Ms. Pearlman and her associates reported.
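
The reported 100% sensitivity with a lower confidence bound of 93.8% is what an exact (Clopper-Pearson) interval gives when every case is detected. A minimal sketch, assuming the denominator is the 58 Lynch syndrome cases (the 46 previously confirmed plus the 12 newly identified):

```python
# Exact (Clopper-Pearson) lower 95% confidence bound for a proportion
# when all n cases are detected (x == n). In that special case the
# bound has the closed form (alpha/2)**(1/n).
# Assumption: n = 58 Lynch cases (46 confirmed + 12 newly found).
def exact_lower_bound_all_detected(n: int, alpha: float = 0.05) -> float:
    return (alpha / 2) ** (1 / n)

lower = exact_lower_bound_all_detected(58)
print(f"sensitivity 100%, lower 95% bound ~ {lower:.1%}")  # ~93.8%
```

The closed form reproduces the 93.8% lower bound reported in the paper, which supports the assumed denominator of 58.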

Researchers also reported that tumor sequencing identified nearly 300 cases of tumors with genetic mutations that could impact therapy.

Results from tumor sequencing are available in a median of 2 weeks, which may be longer than for some other tests, but “it requires less time overall by eliminating multiple follow-up tests in a subset of cases,” the study authors wrote.

“While this new test is currently more expensive than traditional step-wise testing, it will eliminate many other tests for a subset of patients so that it may be more cost-effective overall. If it is not now, it will certainly be in the future as the costs of tumor sequencing continue to decline,” Ms. Pearlman said. “However, formal cost-analysis studies will be necessary to determine if this is a cost-effective approach.”

The study was funded by a grant from Pelotonia, an annual cycling event that supports cancer research, and the National Cancer Institute. Myriad Genetics donated the sequence testing used for some patients.

SOURCE: Hampel H et al. JAMA Oncol. 2018 Mar 29. doi: 10.1001/jamaoncol.2018.0104.




FROM JAMA ONCOLOGY

Vitals

 

Key clinical point: Tumor sequencing provides more accurate Lynch syndrome testing in colon cancer.

Major finding: Sensitivity of tumor sequencing was better (100%) than immunohistochemical testing plus BRAF (89.7%) and microsatellite instability testing plus BRAF (91.4%). Specificity was the same.

Study details: Prospective testing of 419 consecutive patients with colon cancer plus analysis of samples from 46 patients with confirmed Lynch syndrome.

Disclosures: The study was funded by a grant from Pelotonia, an annual cycling event that supports cancer research, and the National Cancer Institute. Myriad Genetics provided the sequence testing used for some of the patients.

Source: Hampel H et al. JAMA Oncol. 2018 Mar 29. doi: 10.1001/jamaoncol.2018.0104.


Study links mumps outbreaks to vaccine waning


A new study links recent mumps outbreaks to waning of vaccine-induced immunity – which the researchers estimate wears off after an average of 27 years – and not to heterologous virus genotypes.

“These observations indicate the need for either innovative clinical trial designs to measure the benefit of extending vaccine dosing schedules or new vaccines to address the problem of waning vaccine-induced protection,” the study authors wrote. The findings appeared in Science Translational Medicine.

The distribution of a mumps vaccine in the 1960s led to the near eradication of the disease, with U.S. cases falling by more than 99%. However, large outbreaks have made the news over the last few years, mainly on college campuses: There were more than 6,000 cases reported in the United States in 2016 and more than 5,000 in 2017, according to the Centers for Disease Control and Prevention. (Physicians are not required to report mumps cases.)

The totals for the past 2 years are by far the highest since 2000, with the exception of 2006. The CDC attributes the 2016 and 2017 numbers to factors such as “the known effectiveness of the vaccine, waning immunity following vaccination, and the intensity of exposure to the virus in close-contact settings [such as a college campus] coupled with behaviors that increase the risk of transmission.”

This year, as of Feb. 24, the CDC had received reports of 304 cases in 32 states and Washington, D.C.

For the new study, Joseph A. Lewnard, PhD, and Yonatan H. Grad, MD, PhD, of the Harvard School of Public Health, Boston, examined six studies and estimated that mumps vaccinations provide protection for an average of 27 years (95% confidence interval, 16-51 years). They also reported finding no evidence that the effectiveness of vaccines was affected by new heterologous virus genotypes.
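
One way to read an “average of 27 years” of protection: if the duration of vaccine-derived immunity were exponentially distributed with that mean (an illustrative assumption for this sketch, not necessarily the authors' exact model), the fraction of vaccinees still protected t years after vaccination would be exp(-t/27):

```python
import math

# Illustration only: exponentially distributed protection duration with
# the study's point estimate of a 27-year mean (95% CI, 16-51 years).
MEAN_DURATION = 27.0  # years

def fraction_protected(t_years: float) -> float:
    # Survival function of an exponential distribution with mean 27 years.
    return math.exp(-t_years / MEAN_DURATION)

for t in (10, 18, 27):
    print(f"{t:>2} years post-vaccination: {fraction_protected(t):.0%} still protected")
```

Under this assumption only about 37% of vaccinees would still be protected 27 years out, which is one way to see how cohorts vaccinated in childhood can accumulate susceptibility by college age and young adulthood.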

Their analysis also determined that outbreaks in the late 1980s and early 1990s (among teens), and from 2006 to present (in young adults), “aligned with peaks in mumps susceptibility of these age groups predicted to be due to loss of vaccine-derived protection.”


The authors suggested that a third dose of vaccine or an improved mumps vaccine could provide added protection. “Although congregated U.S. military populations resemble high-risk groups based on their age distribution and close-contact environments, no outbreaks have been reported in the military since a policy of administering an MMR dose to incoming recruits, regardless of vaccination history, was adopted in 1991,” the researchers wrote, referring to a 2008 study (Vaccine 2008, 26:494-501).

For now, however, “we expect population susceptibility to mumps to continue increasing as transient vaccine-derived immunity supersedes previous infection as the main determinant of mumps susceptibility in the U.S. population,” the authors wrote.

The study was funded by awards from the National Institute of General Medical Sciences and the Doris Duke Charitable Foundation. The study authors report grant funding (Pfizer) and consulting (Pfizer, GlaxoSmithKline) for work unrelated to the study.

SOURCE: Lewnard JA et al. Sci Transl Med. 2018 Mar 21. doi: 10.1126/scitranslmed.aao5945.




FROM SCIENCE TRANSLATIONAL MEDICINE


Study links RA flares after joint replacement to disease activity, not medications


Patients with the most severe cases of rheumatoid arthritis are more likely to suffer flares after knee or hip replacement surgery, a new study finds, and it doesn’t seem to matter whether they stop taking biologics before their operation.

“We found that the majority of patients had active disease at the time of surgery, contrary to prior statements that RA patients have inactive disease at the time they go for hip or knee replacement. In fact, the majority – 65% of the patients – reported a flare of RA within 6 weeks of surgery,” lead author Susan M. Goodman, MD, of Cornell University and the Hospital for Special Surgery, New York, said in an interview. “Surprisingly, although more of the flaring patients were taking potent biologics that had been withheld preoperatively, the major risk factor for flares was their baseline disease activity.”

Dr. Susan M. Goodman
The study appeared online March 15 in the Journal of Rheumatology.

According to Dr. Goodman, the researchers launched the study to better understand how medical decisions prior to joint replacement surgery affect the progress of RA afterward.

In terms of continuing RA drug treatment, she said, “the decision really hinges on the risk of infection versus the risk of flare, and we didn’t know the usual course of events for these patients.”

In addition, she said, “many doctors incorrectly think that the majority of patients with RA have ‘burnt-out’ or inactive disease at the time of hip or knee replacement surgery.”

For the study, the researchers prospectively followed 120 patients who were scheduled to undergo joint replacement surgery. (The researchers initially approached 354 patients, of whom 169 declined to participate. Another 65 were dropped from the study for various reasons, including 42 who did not complete enough questionnaires and were excluded from the final analysis.)

The researchers tracked the patients before surgery and for 6 weeks after surgery. A majority of the patients were female (83%) and white (81%), with a mean age of 62 years and a median RA symptom duration of 15 years. A total of 44% underwent hip replacement surgery, while the rest underwent knee replacement surgery. Just over half of the patients were taking biologics, which were stopped prior to surgery, while glucocorticoids and methotrexate were usually continued.

Just under two-thirds of the patients flared within the first 6 weeks after surgery. The researchers didn’t find any connection between the flares and stopping biologics or using methotrexate. They did, however, link higher baseline RA activity to postsurgery flaring (odds ratio, 2.11; P = .015).
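
To see what an odds ratio of that size means in absolute terms, odds can be converted back to probabilities. This is a purely illustrative calculation that takes the roughly 65% overall flare rate as the reference probability; it is not the authors' actual regression, which modeled baseline disease activity directly:

```python
# Illustrative arithmetic: translating an odds ratio into a probability.
def apply_odds_ratio(p_baseline: float, odds_ratio: float) -> float:
    odds = p_baseline / (1 - p_baseline)  # probability -> odds
    new_odds = odds * odds_ratio          # scale odds by the OR
    return new_odds / (1 + new_odds)      # odds -> probability

# Reference: ~65% overall 6-week flare rate; OR 2.11 for higher
# baseline disease activity (both reported in the study).
p = apply_odds_ratio(0.65, 2.11)
print(f"implied flare probability: {p:.0%}")  # ~80%
```

Under these assumptions, a patient whose odds of flaring are 2.11 times the reference would have roughly an 80% chance of a postoperative flare, which illustrates why baseline disease activity mattered more than medication decisions in this cohort.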

Dr. Goodman said that she and her colleagues continue to collect data to better understand flares and the link to disease severity. “The long-term implications of this are not yet known. We would like to know the effect on long-term functional outcome and complication rate.”

The National Institutes of Health, the Weill Cornell Clinical Translational Science Center, and the Block Family Foundation supported the study. Dr. Goodman disclosed receiving research funding from Novartis and Roche.

SOURCE: Goodman S et al. J Rheumatol. 2018 Mar 15. doi: 10.3899/jrheum.170366


Patients with the most severe cases of rheumatoid arthritis are more likely to suffer flares after knee or hip replacement surgery, a new study finds, and it doesn’t seem to matter whether they stop taking biologics before their operation.

“We found that the majority of patients had active disease at the time of surgery, contrary to prior statements that RA patients have inactive disease at the time they go for hip or knee replacement. In fact, the majority – 65% of the patients – reported a flare of RA within 6 weeks of surgery,” lead author Susan M. Goodman, MD, of Cornell University and the Hospital for Special Surgery, New York, said in an interview. “Surprisingly, although more of the flaring patients were taking potent biologics that had been withheld preoperatively, the major risk factor for flares was their baseline disease activity.”

The study appeared online March 15 in the Journal of Rheumatology.

According to Dr. Goodman, the researchers launched the study to better understand how medical decisions prior to joint replacement surgery affect the progress of RA afterward.

In terms of continuing RA drug treatment, she said, “the decision really hinges on the risk of infection versus the risk of flare, and we didn’t know the usual course of events for these patients.”

In addition, she said, “many doctors incorrectly think that the majority of patients with RA have ‘burnt-out’ or inactive disease at the time of hip or knee replacement surgery.”

For the study, the researchers prospectively followed 120 patients who were to undergo joint replacement surgery. (The researchers initially approached 354 patients, of whom 169 declined to participate. Another 65 were dropped from the study for various reasons, including 42 who did not sufficiently fill out questionnaires and were excluded from the final analysis.)

The researchers tracked the patients before surgery and for 6 weeks after surgery. A majority of the patients were female (83%) and white (81%), with a mean age of 62 and a median RA symptom duration of 15 years. A total of 44% underwent hip replacement surgery while the rest underwent knee replacement surgery. Just over half of the patients were taking biologics, which were stopped prior to surgery, while glucocorticoids and methotrexate were usually continued.

Just under two-thirds of the patients flared within the first 6 weeks after surgery. The researchers didn’t find any connection between the flares and stopping biologics or using methotrexate. They did, however, link higher baseline RA activity to postsurgery flaring (odds ratio, 2.11; P = .015).

Dr. Goodman said that she and her colleagues continue to collect data to better understand flares and the link to disease severity. “The long-term implications of this are not yet known. We would like to know the effect on long-term functional outcome and complication rate.”

The National Institutes of Health, the Weill Cornell Clinical Translational Science Center, and the Block Family Foundation supported the study. Dr. Goodman disclosed receiving research funding from Novartis and Roche.

SOURCE: Goodman S et al. J Rheumatol. 2018 Mar 15. doi: 10.3899/jrheum.170366.

FROM JOURNAL OF RHEUMATOLOGY


Key clinical point: Worse RA activity prior to knee or hip replacement surgery predicts RA flares afterward, but cessation of biologics or use of methotrexate does not.

Major finding: Sixty-five percent of RA patients developed flares after joint replacement surgery, and flares were more common in those with higher baseline RA activity (odds ratio, 2.11; P = .015).

Study details: Prospective study of 120 patients with RA who underwent hip replacement (44%) or knee replacement (56%).

Disclosures: The National Institutes of Health, the Weill Cornell Clinical Translational Science Center, and the Block Family Foundation supported the study. The lead author disclosed receiving research funding from Novartis and Roche.

Source: Goodman S et al. J Rheumatol. 2018 Mar 15. doi: 10.3899/jrheum.170366.


Alternative oxygen therapy reduces treatment failure in bronchiolitis

High-flow oxygen therapy outside the ICU boosts the likelihood that infants with bronchiolitis will avoid treatment failure and an escalation of treatment, a study finds.

“High flow can be safely used in general emergency wards and general pediatric ward settings in regional and metropolitan hospitals that have no immediate direct access to dedicated pediatric intensive care facilities,” study coauthor Andreas Schibler, MD, of University of Queensland in Australia, said in an interview. The findings were published March 22 in the New England Journal of Medicine.

Bronchiolitis is quite common in children, and a 2002 report found that respiratory syncytial virus (RSV) bronchiolitis was the most common reason for infants under the age of 1 year to be hospitalized in the United States during 1997-1999 (Pediatr Infect Dis J. 2002 Jul;21[7]:629-32).

“The typical treatment for bronchiolitis is supportive therapy, providing nutrition, fluids, and if needed respiratory support including provision of oxygen,” Dr. Schibler said.

The prognosis is generally good thanks to improvements in intensive care, he said, which some infants need because the standard oxygen therapy provided in general pediatric wards is insufficient. The new study examines whether high-flow oxygen therapy through a cannula – which he said has become more common – reduces the risk of treatment failure outside the ICU, compared with standard oxygen treatment.

Dr. Schibler and his colleagues tracked 1,472 patients under 12 months with bronchiolitis and a need for oxygen treatment who were randomly assigned to high-flow or standard oxygen therapy to maintain their oxygen saturation at 92%-98% or 94%-98%, depending on policy at the hospital. The subjects were patients at 17 hospitals in Australia and New Zealand.

A total of 739 infants received high-flow treatment that provided heated and humidified oxygen at a rate of 2 liters per kilogram of body weight per minute. The other 733 infants received standard oxygen therapy up to a maximum of 2 liters per minute.

The treatment failed, requiring an escalation of care, in 87 of 739 patients (12%) in the high-flow group and 167 of 733 (23%) in the standard-therapy group (risk difference, –11 percentage points; 95% confidence interval, –15 to –7; P less than .001).
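As a sanity check, the reported effect size can be reproduced from the raw counts using a standard large-sample (Wald) approximation for a difference of two proportions; this is an illustration of the arithmetic, not necessarily the exact method used in the trial's analysis:

```python
from math import sqrt

# Treatment-failure counts reported in the trial
fail_hf, n_hf = 87, 739     # high-flow group
fail_std, n_std = 167, 733  # standard-therapy group

p_hf = fail_hf / n_hf       # ~0.118 (12%)
p_std = fail_std / n_std    # ~0.228 (23%)

rd = p_hf - p_std           # risk difference, ~ -0.11 (-11 percentage points)

# Wald standard error for the difference of two independent proportions
se = sqrt(p_hf * (1 - p_hf) / n_hf + p_std * (1 - p_std) / n_std)
lo, hi = rd - 1.96 * se, rd + 1.96 * se  # ~ (-0.15, -0.07)

print(f"risk difference = {rd:.3f} (95% CI, {lo:.3f} to {hi:.3f})")
```

Rounded to whole percentage points, this matches the published –11 points (95% CI, –15 to –7).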

“The ease to use and simplicity of high flow made us recognize and think that this level of respiratory care can be provided outside intensive care,” Dr. Schibler said. “This was further supported by the observational fact that most of these infants with bronchiolitis showed a dramatically improved respiratory condition once on high flow.”

Dr. Schibler said there haven’t been any signs of adverse effects from high-flow oxygen therapy. As for the cost of the treatment, he said it is “likely offset by a reduced need for intensive care therapy or costs associated with transferring to a children’s hospital.”

What should physicians and hospitals take from the study findings? “If a hospital explores the option to use high flow in bronchiolitis, then start the therapy early in the disease process or once an oxygen requirement is recognized,” Dr. Schibler said. “Implementation of a solid and structured training program with a clear hospital guideline based on the evidence will ensure the staff who care for these patients will be empowered and comfortable to adjust the oxygen levels given by the high-flow equipment. The greater the confidence and comfort level for the nursing and respiratory technician staff the better for these infants, as they will sooner observe those infants who are not responding well and may require a higher level of care such as intensive care or they will recognize the infant who responds well.”

SOURCE: Franklin D et al. N Engl J Med. 2018;378(12):1112-31.

FROM THE NEW ENGLAND JOURNAL OF MEDICINE


Key clinical point: In non-ICUs, infants under 12 months with bronchiolitis are less likely to fail treatment if they are given high-flow oxygen therapy instead of standard oxygen therapy.

Major finding: Treatment failure occurred in 87 of 739 patients (12%) in the high-flow oxygen therapy group and 167 of 733 (23%) in the standard-therapy group.

Study details: Multicenter, randomized, controlled trial of 1,472 infants.

Disclosures: The National Health and Medical Research Council (Australia) and the Queensland Emergency Medical Research Fund provided funding, and sites received grant funding from various sources. Fisher & Paykel Healthcare, a respiratory care company based in Auckland, New Zealand, donated high-flow equipment/consumables and travel/accommodation support. Study authors reported various grants and other support.

Source: Franklin D et al. N Engl J Med 2018;378(12):1112-31.

Study finds AD accounts for hundreds of thousands of annual ED visits

– A new study finds that primary diagnoses of atopic dermatitis (AD) are made hundreds of thousands of times in United States emergency departments each year.

The numbers appear to be rising along with costs, researchers reported, and there are signs of disparities, with poorer people more likely to have an ED visit with a primary diagnosis of AD. The study was presented in a poster at the annual meeting of the American Academy of Dermatology.

“Access to outpatient dermatologic care needs to be improved,” study investigator Jonathan I. Silverberg, MD, PhD, of the department of dermatology, Northwestern University, Chicago, said in an interview. “Since AD is a chronic disorder that can be managed in the outpatient setting most of the time, it is likely that improved outpatient access and care and tighter control of AD would result in fewer [ED] visits and a considerable costs savings in the long run.”

He and his coauthor, Lauren Kwa, also with the department of dermatology at Northwestern, conducted the analysis to better understand the role of AD in emergency care. “Many AD patients experience severe, unpredictable flares and worsening chronic disease that warrant urgent treatment,” Dr. Silverberg said. “However, patients typically don’t have instant access to outpatient dermatological care and may be forced to turn to the urgent care setting.”
Indeed, he noted, “previous U.S. population–based studies showed that people with AD have higher odds of [ED] utilization than the rest of the population.”

He and Ms. Kwa examined 2006-2012 data from the Nationwide Emergency Department Sample, which includes information on about 20% of all emergency visits in the United States.

During that period, there were 1.86 million ED visits with a primary diagnosis of AD. The annual weighted prevalence of primary diagnoses of AD stayed fairly stable through the period, ranging from 2,589 to 2,769 per 1 million visits. However, the weighted prevalence of secondary AD diagnoses grew steadily from 1,227 per 1 million visits in 2006 to 1,533 per 1 million visits in 2012.

The researchers estimated that the total annual cost of AD-related ED visits grew from $86.9 million in 2006 to $172.8 million in 2012 (P less than .05).
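For context, those two endpoints correspond to costs roughly doubling over six years; the implied compound annual growth rate can be computed directly from the reported totals (a back-of-the-envelope figure, not a statistic from the poster):

```python
# Reported total annual costs of AD-related ED visits
cost_2006 = 86.9e6   # dollars
cost_2012 = 172.8e6
years = 2012 - 2006  # six annual steps between the two estimates

# Compound annual growth rate implied by the two endpoints
cagr = (cost_2012 / cost_2006) ** (1 / years) - 1
print(f"implied compound annual growth: {cagr:.1%}")  # roughly 12% per year
```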

SOURCE: Silverberg J et al. Poster 7021.

REPORTING FROM AAD 18

Key clinical point: ED visits for atopic dermatitis are common, and their numbers are growing.

Major finding: An estimated 1.86 million ED visits in the United States from 2006 to 2012 were linked to a primary diagnosis of AD.

Study details: Analysis of data from the Nationwide Emergency Department Sample for 2006-2012.

Disclosures: No study funding was reported. The authors had no relevant disclosures.

Source: Silverberg J et al. Poster 7021.

VIDEO: The return of Kaposi’s sarcoma

– Dermatologists, who served as crucial sentinels during the early years of the AIDS epidemic, should be alert for dermatologic signs and symptoms of HIV infection, according to Toby Maurer, MD, professor of clinical dermatology at the University of California, San Francisco.


“We’re now seeing a lot of HIV-infected patients presenting once again with skin symptoms,” including new-onset psoriasis, poorly controlled seborrheic dermatitis, and even Kaposi’s sarcoma, she said in a video interview at the annual meeting of the American Academy of Dermatology.

 

The upswing in cases of Kaposi’s sarcoma “comes as a shock to many dermatologists; they thought Kaposi’s sarcoma was a thing of the past,” added Dr. Maurer, who presented on HIV-associated skin conditions at the meeting.

“My whole plea is to remember that HIV has not gone away, that it keeps showing up, and that the skin symptoms absolutely show up,” she said. “It’s not on the radar as much as it should be.”

In her presentation, Dr. Maurer, who is also chief of dermatology at San Francisco General Hospital, said HIV and HIV medications have a variety of impacts on skin. For example, psoriasis gets worse when patients are off medication and better when they’re on it, she said, while molluscum contagiosum and herpes simplex can actually worsen when patients start HIV drugs. And, she said, a late start of AIDS drugs can worsen eczema.

In the interview, she discussed the impact on skin conditions of starting antiretrovirals late in the course of infection, when CD4 counts are low, as well as possible reasons behind the increase in Kaposi’s sarcoma and interactions between systemic dermatologic medications and some antiretrovirals.


Dr. Maurer reports no relevant disclosures.

REPORTING FROM AAD 18
