Monitor Asthma Patients on Biologics for Remission, Potential EGPA Symptoms During Steroid Tapering

Article Type
Changed
Wed, 09/11/2024 - 13:44

 

Physicians are urged to record detailed clinical data for patients with asthma undergoing biologic therapy, both to monitor for clinical remission and to watch for symptoms of eosinophilic granulomatosis with polyangiitis (EGPA) as patients taper off corticosteroids, according to pulmonary experts presenting at the European Respiratory Society (ERS) 2024 International Congress.

Biologics have revolutionized the treatment of severe asthma, significantly improving patient outcomes. However, the focus has recently shifted toward achieving more comprehensive disease control. Remission, already a well-established goal in conditions like rheumatoid arthritis and inflammatory bowel disease, is now being explored in patients with asthma receiving biologics.

Peter Howarth, medical director at Global Medical, Specialty Medicine, GSK, in Brentford, England, said that the new clinical remission criteria in asthma may be overly rigid and therefore of limited practical use. More attainable thresholds are needed, he said, and in the meantime clinicians should document clinical data more thoroughly.

In parallel, studies have also raised questions about the role of biologics in the emergence of EGPA.
 

Defining Clinical Remission in Asthma

Last year, a working group, including members from the American Thoracic Society, the American Academy of Allergy, Asthma & Immunology, and the American College of Allergy, Asthma and Immunology, proposed new guidelines to define clinical remission in asthma. These guidelines extended beyond the typical outcomes of no severe exacerbations, no maintenance oral corticosteroid use, good asthma control, and stable lung function. The additional recommendations included no missed work or school due to asthma, limited use of rescue medication (no more than once a month), and reduced inhaled corticosteroid use to low or medium doses.
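For illustration only, tallying these multi-component criteria in a chart review might look like the sketch below. The field names and the ACT ≥ 20 control cutoff are hypothetical stand-ins, not the working group's actual instrument:

```python
# Hypothetical sketch: tally the seven proposed remission components from a
# chart-review record. All field names and cutoffs are illustrative.
REMISSION_COMPONENTS = {
    "no severe exacerbations":         lambda r: r["exacerbations_12mo"] == 0,
    "no maintenance oral steroids":    lambda r: not r["maintenance_ocs"],
    "good asthma control":             lambda r: r["act_score"] >= 20,
    "stable lung function":            lambda r: r["fev1_decline_pct"] <= 5,
    "no missed work/school":           lambda r: r["missed_days_12mo"] == 0,
    "rescue use <= once a month":      lambda r: r["rescue_uses_per_month"] <= 1,
    "low/medium inhaled steroid dose": lambda r: r["ics_dose"] in ("low", "medium"),
}

def components_met(record):
    """Count satisfied components; an undocumented field counts as not met,
    mirroring the problem of inconsistently recorded charts."""
    met = 0
    for name, check in REMISSION_COMPONENTS.items():
        try:
            met += bool(check(record))
        except KeyError:  # measure never documented in the chart
            pass
    return met

# A chart documenting only steroid use and exacerbations, as in many records:
partial_chart = {"exacerbations_12mo": 0, "maintenance_ocs": False}
print(components_met(partial_chart))  # 2 -- too little data to judge remission
```

The point of the sketch is that missing documentation, not just patient status, caps how many components can ever be scored as met.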

To explore the feasibility of achieving these clinical remission outcomes, GSK partnered with the Mayo Clinic for a retrospective analysis of the medical records of 700 patients with asthma undergoing various biologic therapies. The study revealed that essential data for determining clinical remission, such as asthma control and exacerbation records, were inconsistently documented. While some data were recorded, such as maintenance corticosteroid use in 50%-60% of cases, other key measures, like asthma control, were recorded in less than a quarter of the patients.

GSK researchers analyzed available data and found that around 30% of patients on any biologic therapy met three components of remission. Mepolizumab performed better than other biologics, with over 40% of those receiving the drug meeting these criteria. However, when stricter definitions were applied, such as requiring four or more remission components, fewer patients achieved remission — less than 10% for four components, with no patients meeting the full seven-point criteria proposed by the working group.

An ongoing ERS Task Force is now exploring what clinical remission outcomes are practical to achieve, as the current definitions may be too aspirational, said Mr. Howarth. “It’s a matter of defining what is practical to achieve because if you can’t achieve it, then it won’t be valuable.”

He also pointed out that biologics are often used for the most severe cases of asthma after other treatments have failed. Evidence suggests that introducing biologics earlier in the disease, before chronic damage occurs, may result in better patient outcomes.
 

 

 

Biologics and EGPA

In a retrospective study, researchers analyzed the clinical details of 27 patients with adult-onset asthma from 28 countries, all on biologic therapy. The study, a multicountry collaboration led by the ERS Severe Heterogeneous Asthma Research Collaboration, Patient-centred (SHARP), aimed to understand the role of biologics in the emergence of EGPA.

The most significant finding presented at the ERS 2024 International Congress was that EGPA was not associated with maintenance corticosteroids; instead, it often emerged when corticosteroid doses were reduced or tapered off. “This might suggest that steroid withdrawal may unmask the underlying disease,” said Hitasha Rupani, MD, a consultant respiratory physician at the University Hospital Southampton, in Southampton, England. Importantly, the rate at which steroids were tapered did not influence the onset of EGPA, indicating that the tapering process, rather than its speed, may be the critical factor. However, due to the small sample size, this remains a hypothesis, Dr. Rupani explained.

The study also found that when clinicians had a clinical suspicion of EGPA before starting biologic therapy, the diagnosis was made earlier than in cases without such suspicion. Dr. Rupani concluded that this underscores the importance of clinical vigilance and the need to monitor patients closely for EGPA symptoms, especially during corticosteroid tapering.

The study was funded by GSK. Mr. Howarth is an employee at GSK. Dr. Rupani reports no relevant financial relationships. 

A version of this article appeared on Medscape.com.


Night Owls May Be at Greater Risk for T2D, Beyond Lifestyle

Article Type
Changed
Wed, 09/11/2024 - 10:20

 

Night owls — individuals with late chronotypes — may be at an increased risk for type 2 diabetes (T2D), beyond the risks conferred by an unhealthy lifestyle, research presented at the annual meeting of the European Association for the Study of Diabetes suggested.

In the study, night owls were almost 50% more likely to develop T2D than those who went to sleep earlier.

“The magnitude of this risk was more than I expected, [although] residual confounding may have occurred,” said Jeroen van der Velde, PhD, of Leiden University Medical Center in the Netherlands, who presented the study.

“Late chronotype has previously been associated with unhealthy lifestyle and overweight or obesity and, subsequently, cardiometabolic diseases,” he said in an interview. However, although the current study found that individuals with late chronotypes did indeed have larger waists and more visceral fat, “we (and others) believe that lifestyle cannot fully explain the relation between late chronotype and metabolic disorders.”

“In addition,” he noted, “previous studies that observed that late chronotype is associated with overweight or obesity mainly focused on body mass index [BMI]. However, BMI alone does not provide accurate information regarding fat distribution in the body. People with similar BMI may have different underlying fat distribution, and this may be more relevant than BMI for metabolic risk.”

The researchers examined associations between chronotype and BMI, waist circumference, visceral fat, liver fat, and the risk for T2D in a middle-aged population from the Netherlands Epidemiology of Obesity study. Among the 5026 participants, the mean age was 56 years, 54% were women, and mean BMI was 30.

Using data from the study, the investigators calculated the midpoint of sleep (MPS) and divided participants into three chronotypes: early (MPS before 2:30 AM; 20% of participants), intermediate (MPS 2:30–4:00 AM; the reference category; 60% of participants), and late (MPS at or after 4:00 AM; 20% of participants). BMI and waist circumference were measured in all participants, and visceral fat and liver fat were measured in 1576 participants using MRI scans and MR spectroscopy, respectively.
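As a rough sketch of that calculation (assuming MPS is expressed as clock hours after midnight and using the study's 2:30 and 4:00 cut points; the function names are illustrative, not the investigators'):

```python
from datetime import datetime, timedelta

def midpoint_of_sleep(onset: str, wake: str) -> float:
    """Midpoint of the sleep interval, in hours after midnight."""
    fmt = "%H:%M"
    t0, t1 = datetime.strptime(onset, fmt), datetime.strptime(wake, fmt)
    if t1 <= t0:                      # sleep interval crosses midnight
        t1 += timedelta(days=1)
    mid = t0 + (t1 - t0) / 2
    return mid.hour + mid.minute / 60

def chronotype(mps: float) -> str:
    """Bin the midpoint of sleep into the study's three chronotypes."""
    if mps < 2.5:                     # before 2:30
        return "early"
    if mps < 4.0:                     # 2:30 up to 4:00
        return "intermediate"
    return "late"                     # 4:00 or later

print(chronotype(midpoint_of_sleep("23:30", "07:30")))  # intermediate (MPS 3:30)
print(chronotype(midpoint_of_sleep("01:00", "09:00")))  # late (MPS 5:00)
```

Someone sleeping 11:30 PM to 7:30 AM lands squarely in the reference group; shifting the whole sleep window 90 minutes later tips the midpoint past the late cutoff.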

During a median follow-up of 6.6 years, 225 participants were diagnosed with T2D. After adjustment for age, sex, education, physical activity, smoking, alcohol intake, diet quality, sleep quality and duration, and total body fat, participants with a late chronotype had a 46% increased risk for T2D.

Further, those with a late chronotype had a 0.7-unit higher BMI, a 1.9-cm larger waist circumference, 7 cm² more visceral fat, and 14% more liver fat than those with an intermediate chronotype.
 

Body Clock Out of Sync?

“Late chronotype was associated with increased ectopic body fat and with an increased risk of T2D independent of lifestyle factors and is an emerging risk factor for metabolic diseases,” the researchers concluded.

“A likely explanation is that the circadian rhythm or body clock in late chronotypes is out of sync with the work and social schedules followed by society,” Dr. van der Velde suggested. “This can lead to circadian misalignment, which we know can lead to metabolic disturbances and ultimately type 2 diabetes.”

Might trying to adjust chronotype earlier in life have an effect on risk?

“Chronotype, as measured via midpoint of sleep, does change a lot in the first 30 years or so in life,” he said. “After that it seems to stabilize. I suppose that if you adopt an intermediate or early chronotype around the age of 30 years, this will help to maintain an earlier chronotype later in life, although we cannot answer this from our study.”

Nevertheless, with respect to T2D risk, “chronotype is likely only part of the puzzle,” he noted.

“People with late chronotypes typically eat late in the evening, and this has also been associated with adverse metabolic effects. At this stage, we do not know if, when a person changes his/her chronotype, this will also lead to metabolic improvements. More research is needed before we can make recommendations regarding chronotype and timing of other lifestyle behaviors.”

Commenting on the study, Gianluca Iacobellis, MD, PhD, director of the University of Miami Hospital Diabetes Service, Coral Gables, Florida, said: “Interesting data. Altering the physiological circadian rhythm can affect the complex hormonal system — including cortisol, ghrelin, leptin, and serotonin — that regulates insulin sensitivity, glucose, and blood pressure control. The night owl may become more insulin resistant and therefore at higher risk of developing diabetes.”

Like Dr. van der Velde, he noted that “late sleep may be associated with night binging that can cause weight gain and ultimately obesity, further increasing the risk of diabetes.”

Dr. Iacobellis’s group recently showed that vital exhaustion, which is characterized by fatigue and loss of vigor, is associated with a higher cardiovascular risk and markers of visceral adiposity.

“Abnormal circadian rhythms can be easily associated with vital exhaustion,” he said. Therefore, night owls with more visceral than peripheral fat accumulation might also be at higher cardiometabolic risk through that mechanism.

“However, environmental factors and family history can play an important role too,” he added.

Regardless of the mechanisms involved, “preventive actions should be taken to educate teenagers and individuals at higher risk to have healthy sleep habits,” Dr. Iacobellis concluded.

No information regarding funding was provided; Dr. van der Velde and Dr. Iacobellis reported no conflicts of interest.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

 

Night owls — individuals with late chronotypes — may be at an increased risk for type 2 diabetes (T2D), beyond the risks conferred by an unhealthy lifestyle, research presented at the annual meeting of the European Association for the Study of Diabetes suggested.

In the study, night owls were almost 50% more likely to develop T2D than those who went to sleep earlier.

“The magnitude of this risk was more than I expected, [although] residual confounding may have occurred,” said Jeroen van der Velde, PhD, Leiden University Medical Center in the Netherlands, who presented the study.

“Late chronotype has previously been associated with unhealthy lifestyle and overweight or obesity and, subsequently, cardiometabolic diseases,” he said in an interview. However, although the current study found that individuals with late chronotypes did indeed have larger waists and more visceral fat, “we (and others) believe that lifestyle cannot fully explain the relation between late chronotype and metabolic disorders.”

“In addition,” he noted, “previous studies that observed that late chronotype is associated with overweight or obesity mainly focused on body mass index [BMI]. However, BMI alone does not provide accurate information regarding fat distribution in the body. People with similar BMI may have different underlying fat distribution, and this may be more relevant than BMI for metabolic risk.”

The researchers examined associations between chronotype and BMI, waist circumference, visceral fat, liver fat, and the risk for T2D in a middle-aged population from the Netherlands Epidemiology of Obesity study. Among the 5026 participants, the mean age was 56 years, 54% were women, and mean BMI was 30.

Using data from the study, the study investigators calculated the midpoint of sleep (MPS) and divided participants into three chronotypes: Early MPS < 2:30 PM (20% of participants); intermediate MPS 2:30–4:00 PM (reference category; 60% of participants); and late MPS ≥ 4:00 PM (20% of participants). BMI and waist circumference were measured in all participants, and visceral fat and liver fat were measured in 1576 participants using MRI scans and MR spectroscopy, respectively.

During a median follow-up of 6.6 years, 225 participants were diagnosed with T2D. After adjustment for age, sex, education, physical activity, smoking, alcohol intake, diet quality, sleep quality and duration, and total body fat, participants with a late chronotype had a 46% increased risk for T2D.

Further, those with a late chronotype had 0.7 higher BMI, 1.9-cm larger waist circumference, 7 cm2 more visceral fat, and 14% more liver fat.
 

Body Clock Out of Sync?

“Late chronotype was associated with increased ectopic body fat and with an increased risk of T2D independent of lifestyle factors and is an emerging risk factor for metabolic diseases,” the researchers concluded.

“A likely explanation is that the circadian rhythm or body clock in late chronotypes is out of sync with the work and social schedules followed by society,” Dr. van der Velde suggested. “This can lead to circadian misalignment, which we know can lead to metabolic disturbances and ultimately type 2 diabetes.”

Might trying to adjust chronotype earlier in life have an effect on risk?

“Chronotype, as measured via midpoint of sleep, does change a lot in the first 30 years or so in life,” he said. “After that it seems to stabilize. I suppose that if you adapt an intermediate or early chronotype around the age of 30 years, this will help to maintain an earlier chronotype later in life, although we cannot answer this from our study.”

Nevertheless, with respect to T2D risk, “chronotype is likely only part of the puzzle,” he noted.

“People with late chronotypes typically eat late in the evening, and this has also been associated with adverse metabolic effects. At this stage, we do not know if a person changes his/her chronotype that this will also lead to metabolic improvements. More research is needed before we can make recommendations regarding chronotype and timing of other lifestyle behaviors.”

Commenting on the study, Gianluca Iacobellis, MD, PhD, director of the University of Miami Hospital Diabetes Service, Coral Gables, Florida, said: “Interesting data. Altering the physiological circadian rhythm can affect the complex hormonal system — including cortisol, ghrelin, leptin, and serotonin — that regulates insulin sensitivity, glucose, and blood pressure control. The night owl may become more insulin resistant and therefore at higher risk of developing diabetes.”

Like Dr. van der Velde, he noted that “late sleep may be associated with night binging that can cause weight gain and ultimately obesity, further increasing the risk of diabetes.”

Dr. Iacobellis’s group recently showed that vital exhaustion, which is characterized by fatigue and loss of vigor, is associated with a higher cardiovascular risk for and markers of visceral adiposity.

“Abnormal circadian rhythms can be easily associated with vital exhaustion,” he said. Therefore, night owls with more visceral than peripheral fat accumulation might also be at higher cardiometabolic risk through that mechanism.

“However environmental factors and family history can play an important role too,” he added.

Regardless of the mechanisms involved, “preventive actions should be taken to educate teenagers and individuals at higher risk to have healthy sleep habits,” Dr. Iacobellis concluded.

No information regarding funding was provided; Dr. van der Velde and Dr. Iacobellis reported no conflicts of interest.

A version of this article first appeared on Medscape.com.

 

Night owls — individuals with late chronotypes — may be at an increased risk for type 2 diabetes (T2D), beyond the risks conferred by an unhealthy lifestyle, research presented at the annual meeting of the European Association for the Study of Diabetes suggested.

In the study, night owls were almost 50% more likely to develop T2D than those who went to sleep earlier.

“The magnitude of this risk was more than I expected, [although] residual confounding may have occurred,” said Jeroen van der Velde, PhD, Leiden University Medical Center in the Netherlands, who presented the study.

“Late chronotype has previously been associated with unhealthy lifestyle and overweight or obesity and, subsequently, cardiometabolic diseases,” he said in an interview. However, although the current study found that individuals with late chronotypes did indeed have larger waists and more visceral fat, “we (and others) believe that lifestyle cannot fully explain the relation between late chronotype and metabolic disorders.”

“In addition,” he noted, “previous studies that observed that late chronotype is associated with overweight or obesity mainly focused on body mass index [BMI]. However, BMI alone does not provide accurate information regarding fat distribution in the body. People with similar BMI may have different underlying fat distribution, and this may be more relevant than BMI for metabolic risk.”

The researchers examined associations between chronotype and BMI, waist circumference, visceral fat, liver fat, and the risk for T2D in a middle-aged population from the Netherlands Epidemiology of Obesity study. Among the 5026 participants, the mean age was 56 years, 54% were women, and mean BMI was 30.

Using data from the study, the study investigators calculated the midpoint of sleep (MPS) and divided participants into three chronotypes: Early MPS < 2:30 PM (20% of participants); intermediate MPS 2:30–4:00 PM (reference category; 60% of participants); and late MPS ≥ 4:00 PM (20% of participants). BMI and waist circumference were measured in all participants, and visceral fat and liver fat were measured in 1576 participants using MRI scans and MR spectroscopy, respectively.

During a median follow-up of 6.6 years, 225 participants were diagnosed with T2D. After adjustment for age, sex, education, physical activity, smoking, alcohol intake, diet quality, sleep quality and duration, and total body fat, participants with a late chronotype had a 46% increased risk for T2D.

Further, those with a late chronotype had 0.7 higher BMI, 1.9-cm larger waist circumference, 7 cm2 more visceral fat, and 14% more liver fat.
 

Body Clock Out of Sync?

“Late chronotype was associated with increased ectopic body fat and with an increased risk of T2D independent of lifestyle factors and is an emerging risk factor for metabolic diseases,” the researchers concluded.

“A likely explanation is that the circadian rhythm or body clock in late chronotypes is out of sync with the work and social schedules followed by society,” Dr. van der Velde suggested. “This can lead to circadian misalignment, which we know can lead to metabolic disturbances and ultimately type 2 diabetes.”

Might trying to adjust chronotype earlier in life have an effect on risk?

“Chronotype, as measured via midpoint of sleep, does change a lot in the first 30 years or so in life,” he said. “After that it seems to stabilize. I suppose that if you adapt an intermediate or early chronotype around the age of 30 years, this will help to maintain an earlier chronotype later in life, although we cannot answer this from our study.”

Nevertheless, with respect to T2D risk, “chronotype is likely only part of the puzzle,” he noted.

“People with late chronotypes typically eat late in the evening, and this has also been associated with adverse metabolic effects. At this stage, we do not know if a person changes his/her chronotype that this will also lead to metabolic improvements. More research is needed before we can make recommendations regarding chronotype and timing of other lifestyle behaviors.”

Commenting on the study, Gianluca Iacobellis, MD, PhD, director of the University of Miami Hospital Diabetes Service, Coral Gables, Florida, said: “Interesting data. Altering the physiological circadian rhythm can affect the complex hormonal system — including cortisol, ghrelin, leptin, and serotonin — that regulates insulin sensitivity, glucose, and blood pressure control. The night owl may become more insulin resistant and therefore at higher risk of developing diabetes.”

Like Dr. van der Velde, he noted that “late sleep may be associated with night binging that can cause weight gain and ultimately obesity, further increasing the risk of diabetes.”

Dr. Iacobellis’s group recently showed that vital exhaustion, which is characterized by fatigue and loss of vigor, is associated with a higher cardiovascular risk for and markers of visceral adiposity.

“Abnormal circadian rhythms can be easily associated with vital exhaustion,” he said. Therefore, night owls with more visceral than peripheral fat accumulation might also be at higher cardiometabolic risk through that mechanism.

“However environmental factors and family history can play an important role too,” he added.

Regardless of the mechanisms involved, “preventive actions should be taken to educate teenagers and individuals at higher risk to have healthy sleep habits,” Dr. Iacobellis concluded.

No information regarding funding was provided; Dr. van der Velde and Dr. Iacobellis reported no conflicts of interest.

A version of this article first appeared on Medscape.com.

FROM EASD 2024

Debate: Should Patients With CLL Take Breaks From Targeted Therapies?

Article Type
Changed
Wed, 09/11/2024 - 09:22

 

Bruton’s tyrosine kinase inhibitors such as ibrutinib, acalabrutinib, and zanubrutinib have revolutionized the treatment of chronic lymphocytic leukemia (CLL). But should patients take a break from them?

At the annual meeting of the Society of Hematologic Oncology, two hematologist-oncologists — Inhye Ahn, MD, of Dana-Farber Cancer Institute in Boston, Massachusetts, and Kerry A. Rogers, MD, of Ohio State University in Columbus — faced off in a debate. Dr. Ahn said the drugs can indeed be discontinued, while Dr. Rogers argued against stopping the medications.

“When I talk to my own patients about standard of care options in CLL, I use the analogy of a marathon and a sprint,” Dr. Ahn said. A marathon refers to continuous treatment with Bruton’s kinase inhibitors given daily for years, while the sprint refers to the combination of venetoclax with an anti-CD20 monoclonal antibody given over 12 cycles for the frontline regimen and 2 years for refractory CLL.

“I tell them these are both considered very efficacious regimens and well tolerated; one is by IV [the venetoclax regimen] and the other isn’t [Bruton’s kinase inhibitors], and they have different toxicity profiles. I ask them, ‘What would you do?’ The most common question that I get from my patients is, ‘Why would anyone do a marathon?’ ”

It’s not solely the length of treatment that’s important, Dr. Ahn said, as toxicities from the long-term use of Bruton’s kinase inhibitors build up over time and can lead to hypertension, arrhythmia, and sudden cardiac death.

In addition, she said, infections can occur, as well as hampered vaccine response, an important risk in the era of the COVID-19 pandemic. The cost of the drugs is high and adds up over time, and continuous use can boost resistance.

Is there a way to turn the marathon of Bruton’s kinase inhibitor use into a sprint without hurting patients? The answer is yes, through temporary discontinuation, Dr. Ahn said, although she cautioned that early cessation could lead to disease flare. “We dipped into our own database of 84 patients with CLL treated with ibrutinib, and our conclusion was that temporary dose interruption or dose reduction did not impact progression-free survival.”

Moving forward, she said, “more research is needed to define the optimal regimen that would lead to treatment cessation, the optimal patient population, who would benefit most from the cessation strategy, treatment duration, and how we define success.”

For her part, Dr. Rogers argued that the continuous use of Bruton’s kinase inhibitors is “really the most effective treatment we have in CLL.”

It’s clear that “responses deepen with continued treatment,” Dr. Rogers said, noting that remission times grow over years of treatment. She highlighted a 2022 study of patients with CLL who took ibrutinib, which found that the rate of complete remission or complete remission with incomplete hematologic recovery was 7% at 12 months and 34% at 7 years. When patients quit taking the drugs, “you don’t get to maximize your patient’s response to this treatment.”

Dr. Rogers also noted that the RESONATE-2 trial found that ibrutinib is linked to the longest median progression-free survival of any CLL treatment at 8.9 years. “That really struck me as a very effective initial therapy.”

Indeed, “when you’re offering someone initial therapy with a Bruton’s kinase inhibitor as a continuous treatment strategy, you can tell people that they can expect a normal lifespan with this approach. That’s extremely important when you’re talking to patients about whether they might want to alter their leukemia treatment.”

Finally, she noted that discontinuation of ibrutinib was linked to shorter survival in early research. “This was the first suggestion that discontinuation is not good.”

Dr. Rogers said that discontinuing the drugs is sometimes necessary because of adverse events, but patients can “certainly switch to a more tolerable Bruton’s kinase inhibitor. With the options available today, that should be a strategy that’s considered.”

Audience members at the debate were invited to respond to a live online survey about whether Bruton’s kinase inhibitors can be discontinued. Among 49 respondents, most (52.3%) said no, 42.8% said yes, and the rest were undecided/other.

Dr. Ahn disclosed consulting for BeiGene and AstraZeneca. Dr. Rogers disclosed receiving research funding from Genentech, AbbVie, Janssen, and Novartis; consulting for AstraZeneca, BeiGene, Janssen, Pharmacyclics, AbbVie, Genentech, and LOXO@Lilly; and receiving travel funding from AstraZeneca.

A version of this article appeared on Medscape.com.


FROM SOHO 2024

Five Key Measures to Ensure a Quality Colonoscopy

Article Type
Changed
Wed, 09/18/2024 - 09:44

 

A task force established by the American College of Gastroenterology (ACG) and the American Society for Gastrointestinal Endoscopy (ASGE) issued updated recommendations highlighting what they consider to be the highest priority quality indicators for colonoscopy, a list that, for the first time, includes adequate bowel preparation and sessile serrated lesion detection rate (SSLDR).

“Endoscopy teams now have an updated set of guidelines which can be used to enhance the quality of their colonoscopies and should certainly use these current quality measures to ‘raise the bar’ on behalf of their patients,” task force member Nicholas J. Shaheen, MD, MPH, Division of Gastroenterology and Hepatology, The University of North Carolina at Chapel Hill, said in a statement.

The task force published the recommendations online August 21 in The American Journal of Gastroenterology and in Gastrointestinal Endoscopy. The update represents the third iteration of the ACG/ASGE quality indicators for colonoscopy and incorporates new evidence published since 2015.

“The last set of quality indicators from this group was 9 years ago. Since then, there has been a tremendous amount of new data published in colonoscopy quality,” Ziad F. Gellad, MD, MPH, professor of medicine, Duke University Medical Center, Durham, North Carolina, said in an interview.

“Keeping up with that data is a challenge, and so guidelines such as these are important in helping clinicians synthesize data on quality of care and implement best practices,” said Dr. Gellad, who was not involved with the task force.
 

Two New Priority Indicators 

The task force identified 15 quality indicators, divided into preprocedure, intraprocedure, and postprocedure. It includes five “priority” indicators — two of which are new.

One is the rate of adequate bowel preparation, preferably defined as a Boston Bowel Preparation Scale score ≥ 2 in each of three colon segments or by description of the preparation as excellent, good, or adequate. It has a performance target > 90%.
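As an illustration, the adequacy definition above is mechanical enough to express in a few lines of Python. This is our own sketch, not code from the task force, and the function names are hypothetical.

```python
# Hypothetical sketch of the preferred adequacy definition: a Boston Bowel
# Preparation Scale (BBPS) score of at least 2 in each of the three colon
# segments (right, transverse, left), each scored 0-3.

def prep_is_adequate(right: int, transverse: int, left: int) -> bool:
    """All three segment scores must be >= 2 for the prep to count as adequate."""
    return all(score >= 2 for score in (right, transverse, left))

def adequate_prep_rate(exams: list[tuple[int, int, int]]) -> float:
    """Fraction of exams meeting the adequacy definition (performance target > 0.90)."""
    if not exams:
        return 0.0
    return sum(prep_is_adequate(*scores) for scores in exams) / len(exams)
```

A practice auditing its records would compare the resulting rate against the > 90% performance target.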

“Inadequate bowel preparation substantially increases the cost of colonoscopy delivery and creates risk and inconvenience for patients, thus warranting a ranking as a priority indicator,” the task force wrote.

Dr. Gellad explained that the addition of this priority indicator is “notable because it highlights the importance of bowel prep in high-quality colonoscopy. It also shifts more of the responsibility of bowel prep from the patient to the practice.”

The second new quality indicator is the SSLDR, which was selected due to its ability to contribute to cancer prevention.

Based on available evidence, the task force recommends a current minimum threshold for the SSLDR of 6%. “This is expected to be revised upward as evidence of increasing detection occurs,” they wrote.

Dr. Gellad said the addition of SSLDR is “an important advance in these recommendations. We know that serrated adenomas are a precursor for colorectal cancer and that the detection of these subtle lesions is variable.

“Providing a benchmark encourages practices to measure the detection of serrated adenomas and intervene when rates are below benchmarks. Prior to these benchmarks, it was difficult to know where to peg our expectations,” Dr. Gellad added.

Changes to the Adenoma Detection Rate (ADR)

The ADR remains a priority indicator in the update, albeit with changes.

To keep the ADR measurement consistent with current screening guidelines, the task force now recommends that the ADR be measured starting at age 45 rather than 50 years.

“ADR plays a critical role in evaluating the performance of the colonoscopists,” task force lead Douglas K. Rex, MD, a gastroenterologist at Indiana University School of Medicine in Indianapolis, said in the statement.

“It is recommended that ADR calculations include screening, surveillance, and diagnostic colonoscopy but exclude indications of a positive noncolonoscopy screening test and therapeutic procedures for resection or treatment of known neoplasia, genetic cancer syndromes, and inflammatory bowel disease,” Dr. Rex explained.

The task force recommends a minimum ADR threshold of 35% (40% in men and 30% in women) and that colonoscopists with ADRs below 35% “undertake remedial measures to improve and to achieve acceptable performance.”
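The inclusion and exclusion rules Dr. Rex describes can be sketched as a simple filter-and-count. This is our illustrative reading of the rules, not the task force's code; all field names are hypothetical, and the single `excluded` flag stands in for the listed exclusions (positive noncolonoscopy screening test, therapeutic procedures, genetic cancer syndromes, inflammatory bowel disease).

```python
# Hypothetical sketch of the ADR calculation rules quoted above.
# Real registry schemas will differ.

INCLUDED_INDICATIONS = {"screening", "surveillance", "diagnostic"}

def adenoma_detection_rate(procedures: list[dict]) -> float:
    """ADR = eligible colonoscopies with >= 1 adenoma / all eligible colonoscopies.

    Eligible: screening, surveillance, or diagnostic exams in patients
    aged 45 or older, with none of the named exclusions applying.
    """
    eligible = [
        p for p in procedures
        if p["indication"] in INCLUDED_INDICATIONS
        and p["age"] >= 45
        and not p.get("excluded", False)
    ]
    if not eligible:
        return 0.0
    return sum(p["adenoma_found"] for p in eligible) / len(eligible)
```

A practice would then compare the resulting rate against the 35% overall benchmark (40% in men, 30% in women).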
 

Additional Priorities 

The cecal intubation rate (CIR) — the percentage of patients undergoing colonoscopy with intact colons who have full intubation of the cecum with photo documentation of cecal landmarks — remains a priority quality indicator and has a performance target ≥ 95%.

“A trained colonoscopist should achieve a high CIR with a very high level of safety,” the task force wrote. “Low CIRs have been associated with higher PCCRC [postcolonoscopy colorectal cancer] rates.” 

The final priority indicator is the rate of using recommended screening and surveillance intervals, which carries a performance target ≥ 90%.

“We recommend that quality improvement efforts initially focus on high-priority indicators and then progress to other indicators once it is ascertained that endoscopists are performing above recommended thresholds, either at baseline or after corrective interventions,” the task force wrote.

“The priority indicators are absolutely important for practices to implement,” Dr. Gellad said.

“There is compelling evidence that these measures are correlated with clinically important outcomes, particularly ADR,” he added. “Many practices already capture this data, and the changes in ADR calculation make measurement less burdensome. Hopefully, this will encourage more practices to collect and report these measures.” 

Dr. Rex is a consultant for Olympus, Boston Scientific, Braintree Laboratories, Norgine, GI Supply, Medtronic, and Acacia Pharmaceuticals; receives research support from Olympus, Medivators, Erbe USA, and Braintree Laboratories; and is a shareholder in Satisfai Health. Dr. Shaheen had no relevant disclosures. Dr. Gellad has consulted for Merck & Co. and Novo Nordisk and is a cofounder of Higgs Boson.
 

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

 

A task force established by the American College of Gastroenterology (ACG) and the American Society for Gastrointestinal Endoscopy (ASGE) issued updated recommendations highlighting what they consider to be the highest priority quality indicators for colonoscopy, a list that, for the first time, includes adequate bowel preparation and sessile serrated lesion detection rate (SSLDR).

“Endoscopy teams now have an updated set of guidelines which can be used to enhance the quality of their colonoscopies and should certainly use these current quality measures to ‘raise the bar’ on behalf of their patients,” task force member Nicholas J. Shaheen, MD, MPH, Division of Gastroenterology and Hepatology, The University of North Carolina at Chapel Hill, said in a statement.

Dr. Nicholas J. Shaheen



The task force published the recommendations online August 21 in The American Journal of Gastroenterology and in Gastrointestinal Endoscopy. It represents the third iteration of the ACG/ASGE quality indicators on colonoscopy recommendations and incorporates new evidence published since 2015.

“The last set of quality indicators from this group was 9 years ago. Since then, there has been a tremendous amount of new data published in colonoscopy quality,” Ziad F. Gellad, MD, MPH, professor of medicine, Duke University Medical Center, Durham, North Carolina, said in an interview.

“Keeping up with that data is a challenge, and so guidelines such as these are important in helping clinicians synthesize data on quality of care and implement best practices,” said Dr. Gellad, who was not involved with the task force.
 

Two New Priority Indicators 

The task force identified 15 quality indicators, divided into preprocedure, intraprocedure, and postprocedure. It includes five “priority” indicators — two of which are new.

One is the rate of adequate bowel preparation, preferably defined as a Boston Bowel Preparation Scale score ≥ 2 in each of three colon segments or by description of the preparation as excellent, good, or adequate. It has a performance target > 90%.

“Inadequate bowel preparation substantially increases the cost of colonoscopy delivery and creates risk and inconvenience for patients, thus warranting a ranking as a priority indicator,” the task force wrote.

Dr. Gellad explained that the addition of this priority indicator is “notable because it highlights the importance of bowel prep in high-quality colonoscopy. It also shifts more of the responsibility of bowel prep from the patient to the practice.”

The second new quality indicator is the SSLDR, which was selected due to its ability to contribute to cancer prevention.

Based on available evidence, the task force recommends a current minimum threshold for the SSLDR of 6%. “This is expected to be revised upward as evidence of increasing detection occurs,” they wrote.

Duke University
Dr. Ziad F. Gellad



Dr. Gellad said the addition of SSLDR is “an important advance in these recommendations. We know that serrated adenomas are a precursor for colorectal cancer and that the detection of these subtle lesions is variable.

“Providing a benchmark encourages practices to measure the detection of serrated adenomas and intervene when rates are below benchmarks. Prior to these benchmarks, it was difficult to know where to peg our expectations,” Dr. Gellad added.
 

 

 

Changes to the Adenoma Detection Rate (ADR)

The ADR remains a priority indicator in the update, albeit with changes.

To keep the ADR measurement consistent with current screening guidelines, the task force now recommends that the ADR be measured starting at age 45 rather than 50 years.

“ADR plays a critical role in evaluating the performance of the colonoscopists,” task force lead Douglas K. Rex, MD, a gastroenterologist at Indiana University School of Medicine in Indianapolis, said in the statement.

“It is recommended that ADR calculations include screening, surveillance, and diagnostic colonoscopy but exclude indications of a positive noncolonoscopy screening test and therapeutic procedures for resection or treatment of known neoplasia, genetic cancer syndromes, and inflammatory bowel disease,” Dr. Rex explained.

Dr. Douglas K. Rex



The task force recommends a minimum ADR threshold of 35% (40% in men and 30% in women) and that colonoscopists with ADRs below 35% “undertake remedial measures to improve and to achieve acceptable performance.”
 

Additional Priorities 

The cecal intubation rate (CIR) — the percentage of patients undergoing colonoscopy with intact colons who have full intubation of the cecum with photo documentation of cecal landmarks — remains a priority quality indicator and has a performance target ≥ 95%.

“A trained colonoscopist should achieve a high CIR with a very high level of safety,” the task force wrote. “Low CIRs have been associated with higher PCCRC [postcolonoscopy colorectal cancer] rates.” 

The final priority indicator is the rate of using recommended screening and surveillance intervals, which carries a performance target ≥ 90%.

“We recommend that quality improvement efforts initially focus on high-priority indicators and then progress to other indicators once it is ascertained that endoscopists are performing above recommended thresholds, either at baseline or after corrective interventions,” the task force wrote.

“The priority indicators are absolutely important for practices to implement,” Dr. Gellad said.

“There is compelling evidence that these measures are correlated with clinically important outcomes, particularly ADR,” he added. “Many practices already capture this data, and the changes in ADR calculation make measurement less burdensome. Hopefully, this will encourage more practices to collect and report these measures.” 

Dr. Rex is a consultant for Olympus, Boston Scientific, Braintree Laboratories, Norgine, GI Supply, Medtronic, and Acacia Pharmaceuticals; receives research support from Olympus, Medivators, Erbe USA, and Braintree Laboratories; and is a shareholder in Satisfai Health. Dr. Shaheen had no relevant disclosures. Dr. Gellad has consulted for Merck & Co. and Novo Nordisk and is a cofounder of Higgs Boson.
 

A version of this article first appeared on Medscape.com.

 

A task force established by the American College of Gastroenterology (ACG) and the American Society for Gastrointestinal Endoscopy (ASGE) issued updated recommendations highlighting what they consider to be the highest priority quality indicators for colonoscopy, a list that, for the first time, includes adequate bowel preparation and sessile serrated lesion detection rate (SSLDR).

“Endoscopy teams now have an updated set of guidelines which can be used to enhance the quality of their colonoscopies and should certainly use these current quality measures to ‘raise the bar’ on behalf of their patients,” task force member Nicholas J. Shaheen, MD, MPH, Division of Gastroenterology and Hepatology, The University of North Carolina at Chapel Hill, said in a statement.

Dr. Nicholas J. Shaheen



The task force published the recommendations online August 21 in The American Journal of Gastroenterology and in Gastrointestinal Endoscopy. It represents the third iteration of the ACG/ASGE quality indicators on colonoscopy recommendations and incorporates new evidence published since 2015.

“The last set of quality indicators from this group was 9 years ago. Since then, there has been a tremendous amount of new data published in colonoscopy quality,” Ziad F. Gellad, MD, MPH, professor of medicine, Duke University Medical Center, Durham, North Carolina, said in an interview.

“Keeping up with that data is a challenge, and so guidelines such as these are important in helping clinicians synthesize data on quality of care and implement best practices,” said Dr. Gellad, who was not involved with the task force.
 

Two New Priority Indicators 

The task force identified 15 quality indicators, divided into preprocedure, intraprocedure, and postprocedure. It includes five “priority” indicators — two of which are new.

One is the rate of adequate bowel preparation, preferably defined as a Boston Bowel Preparation Scale score ≥ 2 in each of three colon segments or by description of the preparation as excellent, good, or adequate. It has a performance target > 90%.

“Inadequate bowel preparation substantially increases the cost of colonoscopy delivery and creates risk and inconvenience for patients, thus warranting a ranking as a priority indicator,” the task force wrote.

Dr. Gellad explained that the addition of this priority indicator is “notable because it highlights the importance of bowel prep in high-quality colonoscopy. It also shifts more of the responsibility of bowel prep from the patient to the practice.”

The second new quality indicator is the SSLDR, which was selected due to its ability to contribute to cancer prevention.

Based on available evidence, the task force recommends a current minimum threshold for the SSLDR of 6%. “This is expected to be revised upward as evidence of increasing detection occurs,” they wrote.

Duke University
Dr. Ziad F. Gellad



Dr. Gellad said the addition of SSLDR is “an important advance in these recommendations. We know that serrated adenomas are a precursor for colorectal cancer and that the detection of these subtle lesions is variable.

“Providing a benchmark encourages practices to measure the detection of serrated adenomas and intervene when rates are below benchmarks. Prior to these benchmarks, it was difficult to know where to peg our expectations,” Dr. Gellad added.
 

 

 

Changes to the Adenoma Detection Rate (ADR)

The ADR remains a priority indicator in the update, albeit with changes.

To keep the ADR measurement consistent with current screening guidelines, the task force now recommends that the ADR be measured starting at age 45 rather than 50 years.

“ADR plays a critical role in evaluating the performance of the colonoscopists,” task force lead Douglas K. Rex, MD, a gastroenterologist at Indiana University School of Medicine in Indianapolis, said in the statement.

“It is recommended that ADR calculations include screening, surveillance, and diagnostic colonoscopy but exclude indications of a positive noncolonoscopy screening test and therapeutic procedures for resection or treatment of known neoplasia, genetic cancer syndromes, and inflammatory bowel disease,” Dr. Rex explained.
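As a concrete illustration of those inclusion and exclusion rules, a practice's ADR calculation might look like the sketch below. The record fields and category names are hypothetical; this is not the task force's specification, only one way to apply it.

```python
# Indications counted toward the ADR denominator, per the recommendation above.
INCLUDED = {"screening", "surveillance", "diagnostic"}
# Excluded situations, per the recommendation above (names are illustrative).
EXCLUDED = {"positive_noncolonoscopy_test", "therapeutic_resection",
            "genetic_cancer_syndrome", "inflammatory_bowel_disease"}

def adenoma_detection_rate(procedures, min_age=45):
    """ADR = share of qualifying colonoscopies with >= 1 adenoma detected,
    measured starting at age 45 per the updated guidance."""
    qualifying = [p for p in procedures
                  if p["age"] >= min_age
                  and p["indication"] in INCLUDED
                  and not (set(p.get("exclusions", ())) & EXCLUDED)]
    if not qualifying:
        return None
    with_adenoma = sum(1 for p in qualifying if p["adenomas_found"] >= 1)
    return with_adenoma / len(qualifying)
```

A colonoscopist's result from such a calculation would then be compared against the minimum thresholds discussed below.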

The task force recommends a minimum ADR threshold of 35% overall (40% in men and 30% in women) and advises that colonoscopists with ADRs below 35% “undertake remedial measures to improve and to achieve acceptable performance.”

Additional Priorities

The cecal intubation rate (CIR) — the percentage of patients undergoing colonoscopy with intact colons who have full intubation of the cecum with photo documentation of cecal landmarks — remains a priority quality indicator and has a performance target ≥ 95%.

“A trained colonoscopist should achieve a high CIR with a very high level of safety,” the task force wrote. “Low CIRs have been associated with higher PCCRC [postcolonoscopy colorectal cancer] rates.” 

The final priority indicator is the rate of using recommended screening and surveillance intervals, which carries a performance target ≥ 90%.

“We recommend that quality improvement efforts initially focus on high-priority indicators and then progress to other indicators once it is ascertained that endoscopists are performing above recommended thresholds, either at baseline or after corrective interventions,” the task force wrote.

“The priority indicators are absolutely important for practices to implement,” Dr. Gellad said.

“There is compelling evidence that these measures are correlated with clinically important outcomes, particularly ADR,” he added. “Many practices already capture this data, and the changes in ADR calculation make measurement less burdensome. Hopefully, this will encourage more practices to collect and report these measures.” 

Dr. Rex is a consultant for Olympus, Boston Scientific, Braintree Laboratories, Norgine, GI Supply, Medtronic, and Acacia Pharmaceuticals; receives research support from Olympus, Medivators, Erbe USA, and Braintree Laboratories; and is a shareholder in Satisfai Health. Dr. Shaheen had no relevant disclosures. Dr. Gellad has consulted for Merck & Co. and Novo Nordisk and is a cofounder of Higgs Boson.
 

A version of this article first appeared on Medscape.com.


Hormone Therapy Can Benefit Women into Their 80s


Hormone therapy (HT) can help women manage menopause symptoms into their 80s and the reasons are varied, according to a retrospective analysis being presented at the annual meeting of The Menopause Society.

“It’s important to know that this is a preselected group of women who had no contraindications to continuing their hormone therapy,” senior author Wendy Wolfman, MD, director of the Menopause Clinic and The Premature Ovarian Insufficiency Clinic at Mount Sinai Hospital in Toronto, Ontario, Canada, said in an interview. “They had the initiation of hormone therapy closer to menopause and carried on their hormones. We followed them for a long time and basically saw no real concerns about taking the hormones and the patients did very well. It’s important to emphasize this was not the new initiation of hormone therapy in elderly women.”

She said that, in her large tertiary referral center, “I still see patients who are referred who are told that they have to stop their hormones after 5 years based on a false assumption. Everybody ages at different rates and everybody has different risk factors.”

About 70%-80% of women experience menopause symptoms that restrict quality of life and productivity, the authors noted. HT has consistently been the most effective means for managing many of the side effects, especially hot flashes.

Hot flashes last on average 7-11 years. But they continue in up to 40% of women in their 60s and 10%-15% in their 70s, the authors wrote. 

The analysis included more than 100 women in Canada older than 65 years who continued to use HT and explored their motivations for doing so.

The average age of the women was 71 and nearly 8% were age 80 or older. The mean age for starting HT was 52 years and the women continued HT for an average 18 years, though 42% used it regularly for more than 20 years. Most of the women (nearly 88%) used a transdermal form of estrogen; only 12% used oral estrogen pills. Fewer than 5% of participants used synthetic progestins.

Controlling hot flashes was the No. 1 reason the women continued HT beyond age 65 (55%), followed by a desire for a better quality of life (29%), and to reduce chronic pain and arthritis symptoms (7%).

Some adverse effects were reported – postmenopausal bleeding was the most common – but no strokes, myocardial infarctions, or uterine cancers were documented.

More than one fourth (26.4%) of the women had tried stopping HT once; 87% of them restarted, citing the return of hot flashes as the main reason.

In addition, “many women choose to continue hormone therapy long term for relief of nonvasomotor symptoms, preservation of bone density, and a desire to benefit from potential long-term cardiovascular protection,” said Lauren F. Streicher, MD, Professor of Obstetrics and Gynecology at Feinberg School of Medicine at Northwestern University in Chicago, who was not part of the research.

In 2022, The Menopause Society position statement on hormone therapy acknowledged that, on an individual basis, it is appropriate for women to continue hormone therapy long term with counseling on benefits and risks.

“However, few studies have evaluated the outcomes of using hormone therapy for more than 10 years, and individual motivation for doing so,” Dr. Streicher said. She pointed to a study that analyzed the insurance records of more than 10 million women who continued their HT past the age of 65 and reassuringly found that there were significant risk reductions in all-cause mortality, breast cancer, lung cancer, colorectal cancer, heart failure, venous thromboembolism, atrial fibrillation, acute myocardial infarction, and dementia. In that study, however, the reasons women chose to continue hormone therapy were not specified. 

“In this retrospective Canadian study,” she noted, “the outcomes were again reassuring, with no increase in strokes, myocardial infarctions, or uterine cancers. The reasons cited for continuing hormone therapy were not just to treat ongoing vasomotor symptoms, but also other menopause symptoms such as musculoskeletal aches and pains, and overall quality of life.”

Dr. Streicher said that, while long-term longitudinal studies are needed to make definitive recommendations, “It is reassuring that women who choose to extend hormone therapy can safely do so. It is irresponsible, cruel, and nonsensical to continue to make blanket statements that hormone therapy should be discontinued based on age or years of use and commit women to enduring symptoms and depriving them of possible long-term benefits.”

Dr. Streicher gives lectures for Midi Health and owns Sermonix stock. Dr. Wolfman has been on the advisory boards for many pharmaceutical companies. She is the past president of the Canadian Menopause Society and is on the board of the International Menopause Society.

Article Source

FROM THE MENOPAUSE SOCIETY 2024


Can Antihistamines Trigger Seizures in Young Kids?


TOPLINE:

First-generation antihistamines are linked to a 22% higher risk for seizures in children, new research shows. The risk appears to be most pronounced in children aged 6-24 months.

METHODOLOGY:

  • Researchers in Korea used a self-controlled case-crossover design to assess the risk for seizures associated with prescriptions of first-generation antihistamines.
  • They analyzed data from 11,729 children who had a seizure event (an emergency department visit with a diagnosis of epilepsy, status epilepticus, or convulsion) and had previously received a prescription for a first-generation antihistamine, including chlorpheniramine maleate, mequitazine, oxatomide, piprinhydrinate, or hydroxyzine hydrochloride.
  • Prescriptions during the 15 days before a seizure were considered to have been received during a hazard period, whereas earlier prescriptions were considered to have been received during a control period.
  • The researchers excluded patients with febrile seizures.
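To make the design concrete: in a case-crossover analysis each child serves as his or her own control, and a crude odds ratio can be estimated from discordant observations, that is, exposure in the hazard window only versus the control window only (the matched-pair estimator). The sketch below is a simplified illustration with synthetic data; the published estimate came from an adjusted analysis, not this calculation.

```python
def crossover_odds_ratio(children):
    """children: iterable of (exposed_in_hazard, exposed_in_control) pairs,
    one per case. Concordant pairs carry no information; the crude odds
    ratio is the ratio of the two discordant counts."""
    hazard_only = sum(1 for h, c in children if h and not c)
    control_only = sum(1 for h, c in children if c and not h)
    if control_only == 0:
        return float("inf")  # no discordant control-only exposures
    return hazard_only / control_only
```

With synthetic counts of 61 hazard-only and 50 control-only exposures, for example, this estimator yields 61/50 = 1.22, the same magnitude as the study's adjusted estimate.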

TAKEAWAY:

  • In an adjusted analysis, a prescription for an antihistamine during the hazard period was associated with a 22% higher risk for seizures in children (adjusted odds ratio, 1.22; 95% CI, 1.13-1.31).
  • The seizure risk was significant in children aged 6-24 months, with an adjusted odds ratio of 1.49 (95% CI, 1.31-1.70).
  • For older children, the risk was not statistically significant.

IN PRACTICE:

“The study underscores a substantial increase in seizure risk associated with antihistamine prescription among children aged 6-24 months,” the authors of the study wrote. “We are not aware of any other studies that have pointed out the increased risk of seizures with first-generation antihistamines in this particular age group. ... The benefits and risks of antihistamine use should always be carefully considered, especially when prescribing H1 antihistamines to vulnerable infants.”

The findings raise a host of questions for clinicians, including how a “relatively small risk” should translate into practice, and whether the risk may be attenuated with newer antihistamines, wrote Frank Max Charles Besag, MB, ChB, with East London NHS Foundation Trust in England, in an editorial accompanying the study. “It would be reasonable to inform families that at least one study has suggested a relatively small increase in the risk of seizures with first-generation antihistamines, adding that there are still too few data to draw any firm conclusions and also providing families with the information on what to do if the child were to have a seizure.” 
 

SOURCE:

Seonkyeong Rhie, MD, and Man Yong Han, MD, both with the Department of Pediatrics at CHA University School of Medicine, in Seongnam, South Korea, were the corresponding authors on the study. The research was published online in JAMA Network Open.

LIMITATIONS:

The researchers did not have details about seizure symptoms, did not include children seen in outpatient clinics, and were unable to verify the actual intake of the prescribed antihistamines. Although second-generation antihistamines may be less likely to cross the blood-brain barrier, one newer medication, desloratadine, has been associated with seizures.

DISCLOSURES:

The study was supported by grants from the Korea Health Technology R&D Project through the Korea Health Industry Development Institute, the Ministry of Health and Welfare, Republic of Korea.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.



Genetic Testing and Novel Biomarkers Important in Cystic Fibrosis Diagnosis and Monitoring


— Advances in genetic testing and newly discovered biomarkers can help screen newborns and monitor inflammation and pulmonary exacerbations in patients diagnosed with cystic fibrosis.

At the European Respiratory Society (ERS) 2024 International Congress, clinical researchers presented results from studies conducted in Turkey.

Cystic fibrosis is the most common genetic disorder among Caucasians. In Europe, the average prevalence is about 1 in 5000 at birth and 1 in 9000 in the overall population, and both rates vary significantly by geographic area. In the central Anatolia region of Turkey, one study found an incidence of 1 in 3400 live births.

Çigdem Korkmaz, a researcher at the Department of Pediatric Pulmonology at Istanbul University-Cerrahpasa in Istanbul, Turkey, said that diagnosis in Turkey is especially challenging because of the genetic diversity of cystic fibrosis within the population. She said genetic testing might be necessary to catch cases missed by traditional screening methods.
 

Genetic Testing Picks Up Missed Cases

As of 2022, 30 European countries ran newborn bloodspot screening for cystic fibrosis, 26 of them as national programs. Screening protocols vary between countries but generally involve an initial immunoreactive trypsinogen (IRT) blood test. Follow-up testing may include a second IRT test, DNA analysis for common CFTR mutations, and a sweat chloride test (SCT).

Turkey introduced newborn screening for cystic fibrosis in 2015. Newborns with an elevated IRT and a confirmatory SCT undergo genetic testing. However, in a retrospective study, researchers found that the IRT test returns many false-positive results and that some patients with a normal SCT are nonetheless diagnosed with the disease through genetic testing.

The study included 205 infants referred to a tertiary care center in Istanbul between January 2015 and January 2023 following an elevated IRT result. The researchers analyzed clinical and sociodemographic data, IRT and SCT values, and genetic analysis results.

They found that cystic fibrosis was confirmed in only 30% of the referred newborns, while genetic testing identified nine cases that would otherwise have been missed by SCT. “The high false-positive rate of the current screening strategy suggests that the IRT thresholds used in Turkey may be too low,” said Ms. Korkmaz, who presented the study at the ERS Congress. She added that genetic testing might be important, especially in patients with normal SCT results. “Early diagnosis means these patients avoid missing or delaying treatments.”
 

Biomarkers for Monitoring Cystic Fibrosis Exacerbations

C-reactive protein (CRP) blood testing is typically used in monitoring inflammation and pulmonary exacerbations in patients who have already been diagnosed with cystic fibrosis. CRP is an inflammatory biomarker that increases in patients with cystic fibrosis during pulmonary exacerbations and settles with treatment.

Researchers at Gazi University in Ankara, Turkey, identified additional biomarkers that detect inflammation and pulmonary exacerbations with high sensitivity and specificity in patients with cystic fibrosis.

Over 3 years, from 2021 to 2024, the researchers analyzed blood samples from 54 children aged 1-18 years during exacerbation and non-exacerbation periods. Besides CRP, they tested the CRP/albumin (ALB) ratio, the neutrophil-to-lymphocyte ratio (NLR), the derived NLR (dNLR), and the systemic immune-inflammation index (SII).

All biomarkers increased during exacerbation episodes, each with the following specificity and sensitivity:

  • CRP/ALB had a specificity of 81% and a sensitivity of 90% at a cutoff of 1.7.
  • SII had a specificity of 86% and a sensitivity of 67% at a cutoff of 426.
  • NLR had a specificity of 62% and a sensitivity of 79% at a cutoff of 2.2.
  • dNLR had a specificity of 71% and a sensitivity of 66% at a cutoff of 1.15.
  • In comparison, CRP had a specificity of 85% and a sensitivity of 84% at a cutoff of 6.2 mg/dL.
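As a minimal sketch of how these cutoffs could be applied at the bedside, the rule below flags any biomarker whose measured value exceeds the reported study cutoff. The thresholding rule itself is an illustration, not the authors' statistical model.

```python
# Flag a possible pulmonary exacerbation when a biomarker exceeds its cutoff.
# Cutoff values are the ones reported in the study summary above.

CUTOFFS = {
    "CRP": 6.2,       # mg/dL
    "CRP/ALB": 1.7,
    "NLR": 2.2,
    "dNLR": 1.15,
    "SII": 426.0,
}

def flag_exacerbation(results):
    """Return the names of biomarkers whose value exceeds the reported cutoff."""
    return [name for name, value in results.items()
            if name in CUTOFFS and value > CUTOFFS[name]]
```

Used on a panel such as `{"CRP": 8.0, "NLR": 1.0}`, only CRP would be flagged; in practice, clinicians would weigh these markers alongside sputum cultures rather than rely on any single threshold.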
 

 

Ayse Tana Aslan, a professor at the Department of Pediatric Pulmonology, Faculty of Medicine, at Gazi University in Ankara, Turkey, who presented the results at the ERS Congress, said that these biomarkers can be identified easily and quickly with a blood test while clinicians wait on sputum culture results, which can take days. “It is important to predict inflammation and exacerbation quickly so that patients can start a course of antibiotics as soon as possible,” she said.

Ms. Korkmaz and Ms. Aslan reported no relevant financial relationships.

A version of this article appeared on Medscape.com.


Blood Eosinophil Counts Might Predict Childhood Asthma, Treatment Response

Article Type
Changed
Tue, 09/10/2024 - 14:37

 

— Simply relying on clinical symptoms is insufficient to predict which children with wheezing will develop asthma and respond to treatments. More objective tests like blood eosinophil counts are needed for early diagnosis and to avoid unnecessary medication use in children unlikely to develop asthma.

Sejal Saglani, MD, PhD, a professor of pediatric respiratory medicine at the National Heart and Lung Institute, Imperial College, London, England, said that preschool wheezing has long-term adverse consequences through to adulthood. “We need to prevent that downward trajectory of low lung function,” she said, presenting the latest research in the field at the annual European Respiratory Society International Congress.

Wheezing affects up to one third of all infants and preschool children, with one third developing asthma later in life. “It’s important to identify those kids because then we can treat them with the right medication,” said Mariëlle W.H. Pijnenburg, MD, PhD, a pulmonary specialist at Erasmus University Rotterdam in the Netherlands.

“We cannot just use clinical phenotype to decide what treatment a child should get. We need to run tests to identify the endotype of preschool wheeze and intervene appropriately,” Dr. Saglani added.
 

Eosinophilia as a Biomarker for Predicting Exacerbations and Steroid Responsiveness 

In a cluster analysis, Dr. Saglani and colleagues classified preschool children with wheezing into two main subgroups: those who experience frequent exacerbations and those who experience sporadic attacks. Frequent exacerbators were more likely to develop asthma, use asthma medications, and show signs of reduced lung function and airway inflammation, such as higher fractional exhaled nitric oxide and allergic sensitization. “Severe and frequent exacerbators are the kids that get in trouble,” she said. “They’re the ones we must identify at preschool age and really try to minimize their exacerbations.”

Research has shown that eosinophilia is a valuable biomarker in predicting both asthma exacerbations and responsiveness to inhaled corticosteroids. Children with elevated blood eosinophils are more likely to experience frequent and severe exacerbations. These children often demonstrate an inflammatory profile more responsive to corticosteroids, making eosinophilia a predictor of treatment success. Children with eosinophilia are also more likely to have underlying allergic sensitizations, which further supports the use of corticosteroids as part of their management strategy.

Dr. Saglani said a simple blood test can provide a window into the child’s inflammatory status, allowing physicians to make more targeted and personalized treatment plans.

Traditionally, identifying eosinophilia required venipuncture and laboratory analysis, which can be time consuming and impractical in a busy clinical setting. Dr. Saglani’s research group is developing a point-of-care test designed to quickly and efficiently measure blood eosinophil levels in children with asthma or wheezing symptoms from a finger-prick test. Preliminary data presented at the congress show that children with higher eosinophil counts in the clinic were more likely to experience an asthma attack within 3 months.

“The problem is the majority of the children we see are either not atopic or do not have high blood eosinophils. What are we going to do with those?” Dr. Saglani asked.
 

How to Treat Those Who Don’t Have Eosinophilia

Most children with wheezing are not atopic and do not exhibit eosinophilic inflammation, and these children may not respond as effectively to corticosteroids. How to treat them remains the “1-billion-dollar question,” Dr. Saglani said.

Respiratory syncytial virus and rhinovirus play a crucial role in triggering wheezing episodes in these children. Research has shown that viral-induced wheezing is a common feature in this phenotype, and repeated viral infections can lead to an increased severity and frequency of exacerbations. However, there are currently no effective antiviral therapies or vaccines for rhinovirus, which limits the ability to address the viral component of the disease directly.

Up to 50% of children with severe, recurrent wheezing also have bacterial pathogens like Moraxella catarrhalis and Haemophilus influenzae in their lower airways. For these children, addressing the bacterial infection is the best treatment option to mitigate the wheezing. “We now have something that we can target with antibiotics for those who don’t respond to corticosteroids,” Dr. Saglani said.

Dr. Pijnenburg said that this body of research is helping pulmonary specialists and general pediatricians navigate the complexity of childhood wheezing beyond phenotyping and symptoms. “We need to dive more deeply into those kids with preschool wheezing to see what’s happening in their lungs.”

Dr. Pijnenburg and Dr. Saglani reported no relevant financial relationships.

A version of this article appeared on Medscape.com.


The Link Between Vision Impairment and Dementia in Older Adults

Article Type
Changed
Tue, 09/17/2024 - 10:48

 

TOPLINE:

Addressing vision impairments could help with dementia prevention, as vision impairment is linked to 19% of dementia cases in older adults.
 

METHODOLOGY:

  • Researchers conducted a cross-sectional analysis using data from the National Health and Aging Trends Study (NHATS).
  • The analysis included 2767 US adults aged 71 years or older (54.7% female and 45.3% male).
  • Vision impairments were defined using 2019 World Health Organization criteria. Near and distance vision impairments were defined as greater than 0.30 logMAR, and contrast sensitivity impairment was identified by scores below 1.55 logCS.
  • Dementia was classified using a standardized algorithm developed in NHATS, which incorporated a series of tests measuring cognition, memory and orientation, reports of Alzheimer’s disease, or a dementia diagnosis from the patient or a proxy, and an informant questionnaire (Ascertain Dementia-8 Dementia Screening Interview).
  • The study analyzed data from 2021, with the primary outcome being the population attributable fraction (PAF) of dementia from vision impairment.
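The population attributable fraction used as the primary outcome is conventionally computed with Levin's formula; the study summary does not spell out the exact adjusted estimator used, so the textbook form is shown here for orientation:

```latex
\mathrm{PAF} \;=\; \frac{P_e\,(\mathrm{RR} - 1)}{1 + P_e\,(\mathrm{RR} - 1)}
```

where \(P_e\) is the prevalence of the exposure (here, a given vision impairment) and RR is the relative risk of dementia among the exposed.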

TAKEAWAY:

  • The PAF of dementia associated with at least one vision impairment was 19% (95% CI, 8.2-29.7).
  • Impairment in contrast sensitivity had the highest PAF of the vision measures, at 15% (95% CI, 6.6-23.6). This figure was higher than that for impairment of near acuity, at 9.7% (95% CI, 2.6-17.0), or distance acuity, at 4.9% (95% CI, 0.1-9.9).
  • The highest PAFs for dementia due to vision impairment were among participants aged 71-79 years (24.3%; 95% CI, 6.6-41.8), women (26.8%; 95% CI, 12.2-39.9), and non-Hispanic White participants (22.3%; 95% CI, 9.6-34.5).

IN PRACTICE:

“While not proving a cause-and-effect relationship, these findings support inclusion of multiple objective measures of vision impairments, including contrast sensitivity and visual acuity, to capture the total potential impact of addressing vision impairment on dementia,” study authors wrote.

SOURCE:

This study was led by Jason R. Smith, ScM, of the Department of Epidemiology at the Johns Hopkins Bloomberg School of Public Health in Baltimore. It was published online in JAMA Ophthalmology.

LIMITATIONS:

The limited sample sizes for American Indian, Alaska Native, Asian, and Hispanic groups prevented researchers from calculating PAFs for these populations. The cross-sectional design prevented the researchers from examining the timing of vision impairment in relation to a diagnosis of dementia. The study did not explore links between other measures of vision and dementia. Those with early cognitive impairment may not have updated glasses, affecting visual performance. The findings from the study may not apply to institutionalized older adults.

DISCLOSURES:

Jennifer A. Deal, PhD, MHS, reported receiving personal fees from Frontiers in Epidemiology, Velux Stiftung, and Medical Education Speakers Network outside the submitted work. Nicholas S. Reed, AuD, PhD, reported receiving stock options from Neosensory outside the submitted work. No other disclosures were reported.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.

Publications
Topics
Sections

 



Alcohol’s Effect on Gout Risk Strongest in Men But Present in Both Sexes

Article Type
Changed
Tue, 09/10/2024 - 13:36

 

TOPLINE:

Higher alcohol consumption is associated with an increased risk for gout, more strongly in men than in women. This sex-specific difference may be attributable to the different types of alcohol consumed by men and women rather than to biologic variations.

METHODOLOGY:

  • This prospective cohort study investigated the association between total and specific alcohol consumption and the long-term risk for incident gout in 179,828 men (mean age, 56.0 years) and 221,300 women (mean age, 56.0 years) from the UK Biobank who did not have gout at baseline.
  • Alcohol consumption was assessed using a computer-assisted touch screen system. Among men, 2.9%, 3.6%, and 93.6% were identified as never, former, and current drinkers, respectively. Among women, 5.9%, 3.6%, and 90.5% were identified as never, former, and current drinkers, respectively.
  • Participants were also required to share details about their weekly alcohol intake and the types of alcoholic beverages they consumed (red wine, champagne or white wine, beer or cider, spirits, or fortified wine).
  • The median follow-up duration of this study was 12.7 years.
  • Cases of incident gout during the follow-up period were identified using hospital records and International Classification of Diseases codes.
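The case-identification step described above can be sketched in a few lines. The record layout and the use of the ICD-10 M10 block for gout are simplifying assumptions for illustration; the study does not specify its exact code list:

```python
# Hypothetical sketch: flag incident gout cases from hospital records by
# ICD-10 code, excluding participants who already had gout at baseline.
# Record format (participant_id, icd10_code) is an assumption for this example.
GOUT_ICD10_PREFIX = "M10"  # ICD-10 block for gout

def incident_gout_cases(records, baseline_cases):
    """Return IDs of participants with a gout code recorded after baseline."""
    cases = set()
    for pid, code in records:
        if code.startswith(GOUT_ICD10_PREFIX) and pid not in baseline_cases:
            cases.add(pid)
    return cases

records = [("p1", "M10.0"), ("p2", "I10"), ("p3", "M10.9"), ("p4", "M10.0")]
baseline = {"p4"}  # had gout at baseline, so excluded from incident cases
print(sorted(incident_gout_cases(records, baseline)))  # ['p1', 'p3']
```

In the study itself, exclusion of baseline cases happened at enrollment (participants with gout were not included), so the filter above compresses two steps into one for brevity.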

TAKEAWAY:

  • The risk for gout was 69% higher in men who were current drinkers than in those who were never drinkers (hazard ratio [HR], 1.69; 95% CI, 1.30-2.18), while an inverse association was observed in women who were current drinkers, although it was not statistically significant. A significant interaction was observed between drinking status and sex (P < .001 for interaction).
  • Among current drinkers, more frequent alcohol consumption was associated with a higher risk for gout among both sexes, with the association being stronger in men (HR, 2.05; 95% CI, 1.84-2.30) than in women (HR, 1.34; 95% CI, 1.12-1.61).
  • The consumption of beer or cider was higher in men than in women (4.2 vs 0.4 pints/wk).
  • Among all alcoholic beverages, the consumption of beer or cider (per 1 pint/d) showed the strongest association with the risk for gout in both men (HR, 1.60; 95% CI, 1.53-1.67) and women (HR, 1.62; 95% CI, 1.02-2.57).
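As a quick check on the arithmetic, the "69% higher" figure in the first bullet is simply the hazard ratio re-expressed as a percentage increase:

```python
# A hazard ratio (HR) above 1 can be read as a percentage increase in
# instantaneous risk: (HR - 1) * 100. The same conversion applies to the
# other ratios reported above.
def hr_to_pct_increase(hr: float) -> int:
    return round((hr - 1) * 100)

print(hr_to_pct_increase(1.69))  # 69  -> men, current vs never drinkers
print(hr_to_pct_increase(2.05))  # 105 -> men, frequent alcohol consumption
```

Note that this reads off relative risk only; the confidence intervals quoted alongside each HR indicate how precisely that relative increase is estimated.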

IN PRACTICE:

“The observed sex-specific difference in the association of total alcohol consumption with incident gout may be owing to differences between men and women in the types of alcohol consumed rather than biological differences,” the authors wrote.

SOURCE:

The study was led by Jie-Qiong Lyu, MPH, Department of Nutrition and Food Hygiene, School of Public Health, Suzhou Medical College of Soochow University in China. It was published online in JAMA Network Open.

LIMITATIONS:

The frequency of alcohol consumption was self-reported, leading to potential misclassification. Incident cases of gout were identified from hospital records, which may have caused some undiagnosed cases or those diagnosed only in primary care settings to be missed. Most participants were of European descent and relatively healthier than the general population, limiting generalizability.

DISCLOSURES:

This work was supported by the Gusu Leading Talent Plan for Scientific and Technological Innovation and Entrepreneurship. The authors declared no conflicts of interest.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
