What Is a Blue Zone Certified Clinician?
It is a great day when a patient shows up at a clinical appointment already motivated to make lifestyle behavior changes. Often, they have been inspired by health information they encountered elsewhere, such as a book, movie, documentary, TV show, a friend, or something out in the community.
Currently, one of the more public representations of health and longevity promotion is Blue Zones. The organization, named for specific areas of the world — the so-called blue zones, where people experience less disease and live longer lives — has created considerable public awareness for healthy living. Today, there are more than 75 Blue Zones Project communities across the United States, where community leaders, businesses, organizations, and citizens collaborate to make healthier choices the easier choices. A recent Netflix special, Live to 100: Secrets of the Blue Zones, further propelled blue zones into the public consciousness.
The Blue Zones emphasis on a "plant-slant" diet, natural movement, purpose and contribution, downshifting, and family and community intersects with the lifestyle medicine pillars of whole-food, plant-predominant eating patterns, regular physical activity, stress management, restorative sleep, and positive social connections. Both Blue Zones and lifestyle medicine share a goal of creating healthier and stronger individuals and communities.
For those reasons, it made perfect sense that Blue Zones and the American College of Lifestyle Medicine (ACLM) recently announced a partnership to synergize both organizations’ strengths and resources. Among other things, the collaboration will establish a new certification status of Blue Zones–Certified Physician or Blue Zones–Certified Healthcare Professional, available in 2025 exclusively to clinicians who already are or become certified in lifestyle medicine.
Because of Blue Zones’ considerable consumer awareness, physicians and other health professionals who earn the certification will stand out to potential patients as clinicians with the training and knowledge to help them make sustainable lifestyle behavior changes. A challenging part of any clinician’s job is educating patients about the proven health benefits of lifestyle behavior change, and convincing them of those benefits, within the time constraints of a routine clinical visit. Patients familiar with Blue Zones are more likely to arrive already interested in changing lifestyle behavior, and clinicians should have the skills to help them achieve their goals.
In addition, community infrastructure developed through Blue Zones that supports healthful lifestyle choices is significant for patients. Lack of resources in their home, work, and community environments is a common obstacle that patients cite when discussing lifestyle change with a clinician. Bicycle lanes for commuting, parks with exercise equipment, accessible healthy food options, and community events to facilitate positive social connections enhance lifestyle-medicine prescriptions. Workplaces, restaurants, places of worship, and grocery stores are examples of community stakeholders that collaborate in Blue Zones communities to promote healthy lifestyle decisions. Although lifestyle medicine clinicians can and do identify creative ways to support patients in communities without strong healthy choice infrastructure, the Blue Zones road map is a welcome companion.
The timing is right for this synthesis of Blue Zones and lifestyle medicine. As consumer interest in Blue Zones has risen, so has clinician interest in evidence-based lifestyle medicine. Since certification in lifestyle medicine began in 2017, almost 6700 physicians and other health professionals have become certified worldwide. More than 43,000 health care professionals have registered for ACLM’s complimentary lifestyle and food-as-medicine courses highlighted by the White House Conference on Hunger, Nutrition, and Health.
What if more patients came to us motivated to make lifestyle changes because of awareness cultivated at their workplaces and support built into their surrounding communities? Matching lifestyle medicine certification with Blue Zones communities equips clinicians to help these patients achieve what they really want: to live longer and better.
Dr. Collings is Director of Lifestyle Medicine, Silicon Valley Medical Development, and Past President, American College of Lifestyle Medicine, Mountain View, California. She has disclosed no relevant financial relationships.
A version of this article appeared on Medscape.com.
Celiac Disease: Five Things to Know
Celiac disease is a chronic, immune-mediated, systemic disorder caused by intolerance to gluten — a protein present in rye, barley, and wheat grains — that affects genetically predisposed individuals.
Due to its wide spectrum of clinical manifestations, celiac disease resembles a multisystemic disorder. Its most common gastrointestinal (GI) symptoms include chronic diarrhea, weight loss, and abdominal distention. However, celiac disease can also manifest in myriad extraintestinal symptoms, ranging from headache and fatigue to delayed puberty and psychiatric disorders, with differing presentations in children and adults.
To date, the only treatment is adopting a gluten-free diet (GFD). Although key to preventing persistent villous atrophy, the main cause of complications in celiac disease, lifelong adherence to GFD is challenging and may not resolve all clinical issues. These shortcomings have driven recent efforts to develop novel therapeutic options for patients with this disease.
Here are five things to know about celiac disease.
1. Rising Prevalence of Celiac Disease and Other Autoimmune Disorders Suggests Environmental Factors May Be at Play
Gluten was first identified as the cause of celiac disease in the 1950s. At that time, the condition was thought to be a relatively rare GI disease of childhood that primarily affected people of European descent, but it is now known to be a common disease affecting those of various ages, races, and ethnicities.
A 2018 meta-analysis found the pooled global prevalence of celiac disease was 1.4%. Incidence has increased by as much as 7.5% annually over the past several decades.
Increased awareness among clinicians and improved detection likely play a role in the trend. However, the growth in celiac disease is consistent with that seen for other autoimmune disorders, according to a 2024 update of evidence surrounding celiac disease. Shared environmental factors have been proposed as triggers for celiac disease and other autoimmune diseases and appear to be influencing their rise, the authors noted. These factors include migration and population growth, changing dietary patterns and food processing practices, and altered wheat consumption.
2. No-Biopsy Diagnosis Is Accepted for Children and Shows Promise for Adults
It is estimated that almost 60 million people worldwide have celiac disease, but most remain undiagnosed or misdiagnosed, or they experience significant diagnostic delays.
Prospective data indicate that children with first-degree relatives with celiac disease are at a significantly higher risk of developing the condition, which should prompt screening efforts in this population.
The 2023 updated guidelines from the American College of Gastroenterology (ACG) state that serology plays a central role in screening. This commonly involves testing for serological markers of the disease, including immunoglobulin A (IgA), anti-tissue transglutaminase IgA (tTG-IgA), anti-deamidated gliadin peptide, or endomysial antibodies.
To confirm diagnosis, clinicians have relied on intestinal biopsy since the late 1950s. The ACG still recommends esophagogastroduodenoscopy with multiple duodenal biopsies for confirmation of diagnosis in both children and adults with suspicion of celiac disease. However, recent years have seen a shift toward a no-biopsy approach.
For more than a decade in Europe, a no-biopsy approach has been established practice in pediatric patients, for whom the burden of obtaining a histological confirmation is understandably greater. Most guidelines now permit children to be diagnosed with celiac disease in the absence of a biopsy under specific circumstances (eg, characteristic symptoms of celiac disease and tTG-IgA levels > 10 times the upper limit of normal). The ACG guidelines state that “this approach is a reasonable alternative to the standard approach to a [celiac disease] diagnosis in selected children.”
The ACG does not recommend a no-biopsy approach in adults, noting that, in comparison with children, there is a relative lack of data indicating that serology is predictive in this population. However, it does recognize that physicians may encounter patients for whom a biopsy diagnosis may not be safe or practical. In such cases, an “after-the-fact” diagnosis of likely celiac disease can be given to symptomatic adult patients with a ≥ 10-fold elevation of tTG-IgA and a positive endomysial antibody in a second blood sample.
A 2024 meta-analysis of 18 studies involving 12,103 adult patients from 15 countries concluded that a no-biopsy approach using tTG-IgA antibody levels ≥ 10 times the upper limit of normal was highly specific and predictive of celiac disease.
3. Celiac Disease Is Associated With Several Life-Threatening Conditions
Emerging data indicate that gastroenterologists should be vigilant in screening patients with celiac disease for several other GI conditions.
Inflammatory bowel disease and celiac disease have a strong bidirectional association, suggesting a possible genetic link between the conditions and indicating that physicians should consider the alternative diagnosis when symptoms persist after treatment.
Given the hypervigilance around food and diet inherent to celiac disease, patients are at an increased risk of developing avoidant/restrictive food intake disorder, according to a 2022 retrospective study.
In 2023, Italian investigators showed that children with celiac disease have an elevated prevalence of functional GI disorders even after adopting a GFD for a year, regardless of whether they consumed processed or natural foods. It was unclear whether this was due to a chronic inflammatory process or to nutritional factors.
Complications resulting from celiac disease are not limited to GI disorders. For a variety of underlying pathophysiological reasons, including intestinal permeability, hyposplenism, and malabsorption of nutrients, patients with celiac disease may be at a higher risk for non-GI conditions, such as osteopenia, women’s health disorders (eg, ovarian failure, endometriosis, or pregnancy loss), juvenile idiopathic arthritis in children and rheumatoid arthritis in adults, certain forms of cancer, infectious diseases, and cardiomyopathy.
4. GFD Is the Only Treatment, but It’s Imperfect and Frustrating for Patients
GFD is the only treatment for celiac disease and must be adhered to without deviation throughout a patient’s life.
Maintaining unwavering adherence reaps considerable benefits: improved clinical symptoms, robust mucosal healing, and normalization of serological markers. Yet it also takes a considerable toll on patients, who struggle with a host of negative physical, psychological, and social impacts. They also report a higher treatment burden than those with gastroesophageal reflux disease or hypertension, comparable with that reported in end-stage renal disease.
GFD also poses financial challenges. Although the price of gluten-free products has decreased in recent years, they still cost significantly more than items with gluten.
Adherence to GFD does not always equate to complete mucosal recovery. While mucosal recovery is achieved in 95% of children within 2 years of the diet’s adoption, only 34% and 66% of adults obtain it within 2 and 5 years, respectively.
GFD may lead to nutrient imbalances because gluten-free foods are typically low in dietary fiber, micronutrients (eg, vitamin D, vitamin B12, or folate), and minerals (eg, iron, zinc, magnesium, or calcium). Because gluten-free products also tend to be higher in sugar and fat, GFD may leave patients susceptible to unwanted weight gain.
The pervasiveness of gluten in the food production system makes the risk for cross-contamination high. Gluten is often found in both naturally gluten-free foods and products labeled as such. Gluten-sensing technologies, some of which can be used via smartphone apps, have been developed to help patients identify possible cross-contamination. However, the ACG guidelines recommend against the use of these technologies until there is sufficient evidence supporting their ability to improve adherence and clinical outcomes.
5. Novel Therapies for Celiac Disease Are in the Pipeline
The limitations of GFD as the standard treatment for celiac disease have led to an increased focus on developing novel therapeutic interventions. They can be sorted into five key categories: modulation of the immunostimulatory effects of toxic gluten peptides, elimination of toxic gluten peptides before they reach the intestine, induction of gluten tolerance, modulation of intestinal permeability, and restoration of gut microbiota balance.
Three therapies designed to block antigen presentation by HLA-DQ2/8, the gene alleles that predispose people to celiac disease, show promise: TPM502, an agent that contains three gluten-specific antigenic peptides with overlapping T-cell epitopes for the HLA-DQ2.5 gene; KAN-101, designed to induce gluten tolerance by targeting receptors on the liver; and DONQ52, a multi-specific antibody that targets HLA-DQ2. The KAN-101 therapy received Fast Track designation by the US Food and Drug Administration in 2022.
These and several other agents in clinical and preclinical development are discussed in detail in a 2024 review article. Although no therapies have yet reached phase 3 testing, when one does, it will undoubtedly be welcomed by those with celiac disease.
A version of this article first appeared on Medscape.com.
2. No-Biopsy Diagnosis Is Accepted for Children and Shows Promise for Adults
It is estimated that almost 60 million people worldwide have celiac disease, but most remain undiagnosed or misdiagnosed, or they experience significant diagnostic delays.
Prospective data indicate that children with first-degree relatives with celiac disease are at a significantly higher risk of developing the condition, which should prompt screening efforts in this population.
The 2023 updated guidelines from the American College of Gastroenterology (ACG) state that serology plays a central role in screening. This commonly involves testing for serological markers of the disease, including immunoglobulin A (IgA), anti-tissue transglutaminase IgA (tTG-IgA), anti-deamidated gliadin peptide antibodies, or endomysial antibodies.
To confirm diagnosis, clinicians have relied on intestinal biopsy since the late 1950s. The ACG still recommends esophagogastroduodenoscopy with multiple duodenal biopsies for confirmation of diagnosis in both children and adults with suspicion of celiac disease. However, recent years have seen a shift toward a no-biopsy approach.
For more than a decade in Europe, a no-biopsy approach has been established practice in pediatric patients, for whom the burden of obtaining a histological confirmation is understandably greater. Most guidelines now permit children to be diagnosed with celiac disease in the absence of a biopsy under specific circumstances (eg, characteristic symptoms of celiac disease and tTG-IgA levels > 10 times the upper limit of normal). The ACG guidelines state that “this approach is a reasonable alternative to the standard approach to a [celiac disease] diagnosis in selected children.”
The ACG does not recommend a no-biopsy approach in adults, noting that, in comparison with children, there is a relative lack of data indicating that serology is predictive in this population. However, it does recognize that physicians may encounter patients for whom a biopsy diagnosis may not be safe or practical. In such cases, an “after-the-fact” diagnosis of likely celiac disease can be given to symptomatic adult patients with a ≥ 10-fold elevation of tTG-IgA and a positive endomysial antibody in a second blood sample.
A 2024 meta-analysis of 18 studies involving 12,103 adult patients from 15 countries concluded that a no-biopsy approach using tTG-IgA antibody levels ≥ 10 times the upper limit of normal was highly specific and predictive of celiac disease.
3. Celiac Disease Is Associated With Several Life-Threatening Conditions
Emerging data indicate that gastroenterologists should be vigilant in screening patients with celiac disease for several other GI conditions.
Inflammatory bowel disease and celiac disease have a strong bidirectional association, suggesting a possible genetic link between the conditions and indicating that physicians should consider the alternative diagnosis when symptoms persist despite treatment.
Given the hypervigilance around food and diet inherent to celiac disease, patients are at an increased risk of developing avoidant/restrictive food intake disorder, according to a 2022 retrospective study.
In 2023, Italian investigators showed that children with celiac disease have an elevated prevalence of functional GI disorders even after adopting a GFD for a year, regardless of whether they consumed processed or natural foods. It was unclear whether this was due to a chronic inflammatory process or to nutritional factors.
Complications resulting from celiac disease are not limited to GI disorders. For a variety of underlying pathophysiological reasons, including intestinal permeability, hyposplenism, and malabsorption of nutrients, patients with celiac disease may be at a higher risk for non-GI conditions, such as osteopenia, women’s health disorders (eg, ovarian failure, endometriosis, or pregnancy loss), juvenile idiopathic arthritis in children and rheumatoid arthritis in adults, certain forms of cancer, infectious diseases, and cardiomyopathy.
4. GFD Is the Only Treatment, but It’s Imperfect and Frustrating for Patients
GFD is the only treatment for celiac disease and must be adhered to without deviation throughout a patient’s life.
Maintaining unwavering adherence reaps considerable benefits: improved clinical symptoms, robust mucosal healing, and normalization of serological markers. Yet it also takes a considerable toll on patients. Patients with celiac disease struggle with a host of negative physical, psychological, and social impacts. They also report a higher treatment burden than those with gastroesophageal reflux disease or hypertension, and one comparable with that of end-stage renal disease.
GFD also poses financial challenges. Although the price of gluten-free products has decreased in recent years, they still cost significantly more than items with gluten.
Adherence to GFD does not always equate to complete mucosal recovery. While mucosal recovery is achieved in 95% of children within 2 years of the diet’s adoption, only 34% and 66% of adults obtain it within 2 and 5 years, respectively.
GFD may lead to nutrient imbalances because gluten-free foods are typically low in dietary fiber, micronutrients (eg, vitamin D, vitamin B12, or folate), and minerals (eg, iron, zinc, magnesium, or calcium). With higher sugar and fat content, GFD may leave patients susceptible to unwanted weight gain.
The pervasiveness of gluten in the food production system makes the risk for cross-contamination high. Gluten is often found in both naturally gluten-free foods and products labeled as such. Gluten-sensing technologies, some of which can be used via smartphone apps, have been developed to help patients identify possible cross-contamination. However, the ACG guidelines recommend against the use of these technologies until there is sufficient evidence supporting their ability to improve adherence and clinical outcomes.
5. Novel Therapies for Celiac Disease Are in the Pipeline
The limitations of GFD as the standard treatment for celiac disease have led to an increased focus on developing novel therapeutic interventions. They can be sorted into five key categories: Modulation of the immunostimulatory effects of toxic gluten peptides, elimination of toxic gluten peptides before they reach the intestine, induction of gluten tolerance, modulation of intestinal permeability, and restoration of gut microbiota balance.
Three therapies designed to block antigen presentation by HLA-DQ2/8, the gene alleles that predispose people to celiac disease, show promise: TPM502, an agent that contains three gluten-specific antigenic peptides with overlapping T-cell epitopes for the HLA-DQ2.5 gene; KAN-101, designed to induce gluten tolerance by targeting receptors on the liver; and DONQ52, a multi-specific antibody that targets HLA-DQ2. The KAN-101 therapy received Fast Track designation by the US Food and Drug Administration in 2022.
These and several other agents in clinical and preclinical development are discussed in detail in a 2024 review article. None of these therapies has yet reached phase 3 testing, but when one does, it will undoubtedly be welcomed by those with celiac disease.
A version of this article first appeared on Medscape.com.
Late-Night Eaters May Have Increased Risk for Colorectal Cancer
WASHINGTON — Late-night eaters may have an increased risk for colorectal cancer, according to the results of research presented at the annual Digestive Disease Week® (DDW).
Investigators in a new study questioned 664 people getting a colonoscopy to screen for cancer, and 42% said they were late eaters. This group was 46% more likely than non–late eaters to have an adenoma found during colonoscopy. An estimated 5% to 10% of adenomas become cancerous over time.
“A lot of other studies are about what we eat but not when we eat,” said Edena Khoshaba, lead investigator and a medical student at Rush University Medical College in Chicago. “The common advice includes not eating red meat, eating more fruits and vegetables — which is great, of course — but we wanted to see if the timing affects us at all.”
Ms. Khoshaba and colleagues found it did. Late eaters were 5.5 times more likely to have three or more tubular adenomas compared to non–late eaters, even after adjusting for what people were eating. Tubular adenomas are the most common type of polyp found in the colon.
So, what’s the possible connection between late eating and the risk for colorectal cancer?
Resetting Your Internal Clock
Eating close to bedtime could be throwing off the body’s circadian rhythm. But in this case, it’s not the central circadian center located in the brain — the one that releases melatonin. Instead, late eating could disrupt the peripheral circadian rhythm, part of which is found in the GI tract. For example, if a person is eating late at night, the brain thinks it is nighttime while the gut thinks it is daytime, Ms. Khoshaba said in an interview.
This is an interesting study, said Amy Bragagnini, MS, RD, spokesperson for the Academy of Nutrition and Dietetics, when asked to comment on the research. “It is true that eating later at night can disrupt your circadian rhythm.”
“In addition, many of my patients have told me that when they do eat later at night, they don’t always make the healthiest food choices,” Ms. Bragagnini said. “Their late-night food choices are generally higher in added sugar and fat. This may cause them to consume far more calories than their body needs.” So, eating late at night can also lead to unwanted weight gain.
An unanswered question is whether late eating is connected to the increasing rates of colorectal cancer seen in younger patients.
This was an observational study, and another possible limitation, Ms. Khoshaba said, is that people were asked to recall their diets over 24 hours, which may not always be accurate.
Some of the organisms in the gut have their own internal clocks that follow a daily rhythm, and what someone eats determines how many different kinds of these organisms are active, Ms. Bragagnini said.
“So, if your late-night eating consists of foods high in sugar and fat, you may be negatively impacting your microbiome,” she said.
The next step for Ms. Khoshaba and colleagues is a study examining the peripheral circadian rhythm, changes in the gut microbiome, and the risk for developing metabolic syndrome. Ms. Khoshaba and Ms. Bragagnini had no relevant disclosures.
A version of this article appeared on Medscape.com.
FROM DDW 2024
Helping Patients With Intellectual Disabilities Make Informed Decisions
BOSTON — Primary care clinicians caring for patients with intellectual and developmental disabilities often recommend guardianship, a responsibility with life-altering implications.
But only approximately 30% of primary care residency programs in the United States provide training on how to assess the ability of patients with disabilities to make decisions for themselves, and much of this training is optional, according to a recent study cited during a workshop at the 2024 annual meeting of the Society of General Internal Medicine.
Assessing the capacity of patients with disabilities involves navigating a maze of legal, ethical, and clinical considerations, according to Mary Thomas, MD, MPH, a clinical fellow in geriatrics at Yale University School of Medicine in New Haven, Connecticut, who co-moderated the workshop.
Guardianship, while sometimes necessary, can be overly restrictive and diminish patient autonomy, she said. The legal process — ultimately decided through the courts — gives a guardian permission to manage medical care and make decisions for someone who cannot make or communicate those decisions themselves.
Clinicians can assess patients through an evaluation of functional capacity, which allows them to observe a patient’s demeanor and administer a cognition test. Alternatives such as supported decision-making may be less restrictive and can better serve patients, she said. Supported decision-making allows for a person with disabilities to receive assistance from a supporter who can help a patient process medical conditions and treatment needs. The supporter helps empower capable patients to decide on their own.
Some states have introduced legislation that would legally recognize supported decision-making as a less restrictive alternative to guardianship or conservatorship, in which a court-appointed individual manages all aspects of a person’s life.
Sara Mixter, MD, MPH, an assistant professor of medicine and pediatrics at the Johns Hopkins University School of Medicine in Baltimore and a co-moderator of the workshop, called the use of inclusive language in patient communication the “first step toward fostering an environment where patients feel respected and understood.”
Inclusive conversations can include person-first language and using words such as “caregiver” rather than “caretaker.”
Dr. Thomas and Dr. Mixter also called for the directors of residency programs to provide more training on disabilities. They cited a 2023 survey of directors, many of whom said that educational boards do not require training in disability-specific care and that experts in the care of people with disabilities are few and far between.
“Education and awareness are key to overcoming the challenges we face,” Dr. Thomas said. “Improving our training programs means we can ensure that all patients receive the care and respect they deserve.”
Dr. Thomas and Dr. Mixter report no relevant disclosures.
A version of this article first appeared on Medscape.com.
Ultraprocessed Foods May Be an Independent Risk Factor for Poor Brain Health
Ultraprocessed foods may be an independent risk factor for poor brain health, new research suggests.
Observations from a large cohort of adults followed for more than 10 years suggested that eating more ultraprocessed foods (UPFs) may increase the risk for cognitive decline and stroke, while eating more unprocessed or minimally processed foods may lower the risk.
“The first key takeaway is that the type of food that we eat matters for brain health, but it’s equally important to think about how it’s made and handled when thinking about brain health,” said study investigator W. Taylor Kimberly, MD, PhD, with Massachusetts General Hospital in Boston.
“The second is that it’s not just all a bad news story because while increased consumption of ultra-processed foods is associated with a higher risk of cognitive impairment and stroke, unprocessed foods appear to be protective,” Dr. Kimberly added.
The study was published online on May 22 in Neurology.
Food Processing Matters
UPFs are highly manipulated, low in protein and fiber, and packed with added ingredients, including sugar, fat, and salt. Examples of UPFs are soft drinks, chips, chocolate, candy, ice cream, sweetened breakfast cereals, packaged soups, chicken nuggets, hot dogs, and fries.
Unprocessed or minimally processed foods include meats such as simple cuts of beef, pork, and chicken, and vegetables and fruits.
Research has shown associations between high UPF consumption and increased risk for metabolic and neurologic disorders.
As reported previously, in the ELSA-Brasil study, higher intake of UPFs was significantly associated with a faster rate of decline in executive and global cognitive function.
Yet, it’s unclear whether the extent of food processing contributes to the risk of adverse neurologic outcomes independent of dietary patterns.
Eating more ultraprocessed foods (UPFs) may increase the risk for cognitive decline and stroke, while eating more unprocessed or minimally processed foods may lower the risk, new research suggests. The observations come from a large cohort of adults followed for more than 10 years.
“The first key takeaway is that the type of food that we eat matters for brain health, but it’s equally important to think about how it’s made and handled when thinking about brain health,” said study investigator W. Taylor Kimberly, MD, PhD, with Massachusetts General Hospital in Boston.
“The second is that it’s not just all a bad news story because while increased consumption of ultra-processed foods is associated with a higher risk of cognitive impairment and stroke, unprocessed foods appear to be protective,” Dr. Kimberly added.
The study was published online on May 22 in Neurology.
Food Processing Matters
UPFs are highly manipulated, low in protein and fiber, and packed with added ingredients, including sugar, fat, and salt. Examples of UPFs are soft drinks, chips, chocolate, candy, ice cream, sweetened breakfast cereals, packaged soups, chicken nuggets, hot dogs, and fries.
Unprocessed or minimally processed foods include meats such as simple cuts of beef, pork, and chicken, and vegetables and fruits.
Research has shown associations between high UPF consumption and increased risk for metabolic and neurologic disorders.
As reported previously, in the ELSA-Brasil study, higher intake of UPFs was significantly associated with a faster rate of decline in executive and global cognitive function.
Yet, it’s unclear whether the extent of food processing contributes to the risk of adverse neurologic outcomes independent of dietary patterns.
Dr. Kimberly and colleagues examined the association of food processing levels with the risk for cognitive impairment and stroke in the long-running REGARDS study, a large prospective US cohort of Black and White adults aged 45 years and older.
Food processing levels were defined by the NOVA food classification system, which ranges from unprocessed or minimally processed foods (NOVA1) to UPFs (NOVA4). Dietary patterns were characterized based on food frequency questionnaires.
In the cognitive impairment cohort, of 14,175 adults without evidence of impairment at baseline who underwent follow-up testing, 768 developed cognitive impairment.
Diet an Opportunity to Protect Brain Health
In multivariable Cox proportional hazards models adjusting for age, sex, high blood pressure, and other factors, a 10% increase in relative intake of UPFs was associated with a 16% higher risk for cognitive impairment (hazard ratio [HR], 1.16). Conversely, a higher intake of unprocessed or minimally processed foods correlated with a 12% lower risk for cognitive impairment (HR, 0.88).
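Because Cox model hazard ratios are multiplicative, the per-10% figures above can be extrapolated to larger differences in intake. A minimal sketch of that arithmetic (the function name is illustrative, and the log-linear dose-response it assumes is a modeling convention, not a finding of the study):

```python
# Illustrative only: extrapolates a per-10% hazard ratio to other intake
# differences, assuming the log-linear dose-response of a Cox model.
def compounded_hr(per_10pct_hr: float, intake_diff_pct: float) -> float:
    """Hazard ratio implied for a given difference in relative intake (percentage points)."""
    return per_10pct_hr ** (intake_diff_pct / 10)

# Study figures: HR 1.16 per 10% more UPFs; HR 0.88 per 10% more unprocessed foods.
hr_upf_20 = compounded_hr(1.16, 20)     # ~1.35, i.e., ~35% higher risk for a 20-point difference
hr_unproc_20 = compounded_hr(0.88, 20)  # ~0.77
```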
In the stroke cohort, 1108 of 20,243 adults without a history of stroke had a stroke during the follow-up.
In multivariable Cox models, greater intake of UPFs was associated with an 8% increased risk for stroke (HR, 1.08), while greater intake of unprocessed or minimally processed foods correlated with a 9% lower risk for stroke (HR, 0.91).
The effect of UPFs on stroke risk was greater among Black than among White adults (UPF-by-race interaction HR, 1.15).
The associations between UPFs and both cognitive impairment and stroke were independent of adherence to the Mediterranean diet, the Dietary Approaches to Stop Hypertension (DASH) diet, and the Mediterranean-DASH Intervention for Neurodegenerative Delay diet.
These results “highlight the possibility that we have the capacity to maintain our brain health and prevent poor brain health outcomes by focusing on unprocessed foods in the long term,” Dr. Kimberly said.
He cautioned that this was “an observational study and not an interventional study, so we can’t say with certainty that substituting ultra-processed foods with unprocessed foods will definitively improve brain health,” adding, “That’s a clinical trial question that has not been done but our results certainly are provocative.”
Consider UPFs in National Guidelines?
The coauthors of an accompanying editorial said the “robust” results from Kimberly and colleagues highlight the “significant role of food processing levels and their relationship with adverse neurologic outcomes, independent of conventional dietary patterns.”
Peipei Gao, MS, with Harvard T.H. Chan School of Public Health, and Zhendong Mei, PhD, with Harvard Medical School, both in Boston, noted that the mechanisms underlying the impact of UPFs on adverse neurologic outcomes “can be attributed not only to their nutritional profiles,” including poor nutrient composition and high glycemic load, “but also to the presence of additives including emulsifiers, colorants, sweeteners, and nitrates/nitrites, which have been associated with disruptions in the gut microbial ecosystem and inflammation.
“Understanding how food processing levels are associated with human health offers a fresh take on the saying ‘you are what you eat,’ ” the editorialists wrote.
This new study, they noted, adds to the evidence by highlighting the link between UPFs and brain health, independent of traditional dietary patterns, and “raises questions about whether considerations of UPFs should be included in dietary guidelines, as well as national and global public health policies for improving brain health.”
The editorialists called for large prospective population studies and randomized controlled trials to better understand the link between UPF consumption and brain health. “In addition, mechanistic studies are warranted to identify specific foods, detrimental processes, and additives that play a role in UPFs and their association with neurologic disorders,” they concluded.
Funding for the study was provided by the National Institute of Neurological Disorders and Stroke, the National Institute on Aging, National Institutes of Health, and Department of Health and Human Services. The authors and editorial writers had no relevant disclosures.
A version of this article appeared on Medscape.com.
FROM NEUROLOGY
Low-FODMAP, Low-Carb Diets May Beat Medical Treatment for IBS
A new study found that these dietary interventions were more efficacious than optimized medical treatment at 4 weeks, suggesting their potential as first-line treatments.
“IBS is a disorder that may have different underlying causes, and it can manifest in different ways among patients. It is also likely that the most effective treatment option can differ in patients,” said lead author Sanna Nybacka, RD, PhD, a postdoctoral researcher in molecular and clinical medicine at the University of Gothenburg’s Sahlgrenska Academy, Gothenburg, Sweden.
“Up to 80% of patients with IBS report that their symptoms are exacerbated by dietary factors, and dietary modifications are considered a promising avenue for alleviating IBS symptoms,” she said. “However, as not all patients respond to dietary modifications, we need studies comparing the effectiveness of dietary vs pharmacological treatments in IBS to better understand which patients are more likely to benefit from which treatment.”
The study was published online in The Lancet Gastroenterology and Hepatology.
Treatment Comparison
Dr. Nybacka and colleagues conducted a single-blind randomized controlled trial at a specialized outpatient clinic at Sahlgrenska University Hospital in Gothenburg, Sweden, between January 2017 and September 2021. They included adults with moderate to severe IBS, which was defined as ≥ 175 points on the IBS Severity Scoring System (IBS-SSS), and who had no other serious diseases or food allergies.
The participants were assigned 1:1:1 to receive a low-FODMAP diet plus traditional dietary advice (50% carbohydrates, 33% fat, 17% protein), a fiber-optimized diet with low carbohydrates and high protein and fat (10% carbohydrates, 67% fat, 23% protein), or optimized medical treatment based on predominant IBS symptoms. Participants were masked to the names of the diets, but the pharmacological treatment was open-label.
After 4 weeks, participants were unmasked and encouraged to continue their diets.
During 6 months of follow-up, those in the low-FODMAP group were instructed on how to reintroduce FODMAPs, and those in the pharmacological treatment group were offered personalized dietary counseling and the option to continue their medication.
Among 1104 participants assessed for eligibility, 304 were randomly assigned. However, 10 participants did not receive their intervention after randomization, so only 294 participants were included in the modified intention-to-treat population: 96 in the low-FODMAP group, 97 in the low-carbohydrate group, and 101 in the optimized medical treatment group. Overall, 82% were women, and the mean age was 38 years.
Following the 4-week intervention, 73 of 96 participants (76%) in the low-FODMAP group, 69 of 97 participants (71%) in the low-carbohydrate group, and 59 of 101 participants (58%) in the optimized medical treatment group had a reduction of ≥ 50 points in the IBS-SSS compared with baseline.
A stricter score reduction of ≥ 100 points was observed in 61% of the low-FODMAP group, 58% of the low-carbohydrate group, and 39% of the optimized medical treatment group.
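The ≥ 50-point response rates follow directly from the reported counts; a short, purely illustrative sketch reproduces them:

```python
# Reproducing the reported 4-week response rates (>= 50-point IBS-SSS reduction)
# from the counts above; illustrative arithmetic only.
responders = {
    "low-FODMAP": (73, 96),
    "low-carbohydrate": (69, 97),
    "optimized medical": (59, 101),
}

response_rates = {
    group: round(100 * improved / total)
    for group, (improved, total) in responders.items()
}
# response_rates == {'low-FODMAP': 76, 'low-carbohydrate': 71, 'optimized medical': 58}
```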
In both the low-FODMAP group and the low-carbohydrate group, 95% of participants completed the 4-week intervention compared with 90% among the pharmacological group. Two people in each group said adverse events prompted their discontinuation, and five in the medical treatment group stopped prematurely due to side effects. No serious adverse events or treatment-related deaths occurred.
“We were surprised by the effectiveness of the fiber-optimized low-carbohydrate diet, which demonstrated comparable efficacy to the combined low-FODMAP and traditional IBS diet,” Dr. Nybacka said. “While previous knowledge suggested that high-fat intake could worsen symptoms in some individuals, the synergy with low-carbohydrate intake appeared to render the diet more tolerable for these patients.”
The authors noted that since all three treatment options showed significant and clinically meaningful efficacy, patient preference, ease of implementation, compliance, cost-effectiveness, and long-term effects, including those on nutritional status and gut microbiota, should be considered in personalized plans.
Future Practice Considerations
Dr. Nybacka and colleagues recommended additional trials before implementing the low-carbohydrate diet in clinical practice. “Worse blood lipid levels among some participants in the low-carbohydrate group point to an area for caution,” she said.
The research team also plans to evaluate changes in microbiota composition and metabolomic profiles among participants to further understand factors associated with positive treatment outcomes.
“Approximately two thirds of patients with IBS report that certain foods trigger symptoms of IBS, which is why many patients are interested in exploring dietary interventions for their symptoms,” said Brian Lacy, MD, professor of medicine and program director of the GI fellowship program at the Mayo Clinic in Jacksonville, Florida. “One of the most commonly employed diets for the treatment of IBS is the low-FODMAP diet.”
Dr. Lacy, who wasn’t involved with this study, co-authored the 2021 American College of Gastroenterology clinical guideline for the management of IBS.
He and his colleagues recommended a limited trial of a low-FODMAP diet to improve symptoms, as well as targeted use of medications for IBS subtypes with constipation or diarrhea and gut-directed psychotherapy for overall IBS symptoms.
“However, there are problems with the low-FODMAP diet, as it can be difficult to institute, it can be fairly restrictive, and long-term use has the potential to lead to micronutrient deficiencies,” he said. “Importantly, large studies comparing dietary interventions directly to medical therapies are absent, which led to the study by Nybacka and colleagues.”
Dr. Lacy noted several limitations, including the single-center focus, short-term intervention, and variety of therapies used among the medical arm of the study. In addition, some therapies available in the United States aren’t available in Europe, so the varying approaches to medical management in the former may lead to different results. At the same time, he said, the study is important and will be widely discussed among patients and clinicians.
“I think it will likely stand the test of time,” Dr. Lacy said. “An easy-to-use diet with common sense advice that improves symptoms will likely eventually translate into first-line therapy for IBS patients.”
The study was funded by grants from the Healthcare Board Region Västra Götaland, Swedish Research Council, Swedish Research Council for Health, Working Life and Welfare, and AFA Insurance; the ALF agreement between the Swedish government and county councils; Wilhelm and Martina Lundgren Science Foundation; Skandia; Dietary Science Foundation; and Nanna Swartz Foundation. Several authors declared grants, consulting fees, and advisory board roles with various pharmaceutical companies. Dr. Lacy reported no relevant disclosures.
A version of this article appeared on Medscape.com.
Dr. Lacy noted several limitations, including the single-center focus, short-term intervention, and variety of therapies used among the medical arm of the study. In addition, some therapies available in the United States aren’t available in Europe, so the varying approaches to medical management in the former may lead to different results. At the same time, he said, the study is important and will be widely discussed among patients and clinicians.
“I think it will likely stand the test of time,” Dr. Lacy said. “An easy-to-use diet with common sense advice that improves symptoms will likely eventually translate into first-line therapy for IBS patients.”
The study was funded by grants from the Healthcare Board Region Västra Götaland, Swedish Research Council, Swedish Research Council for Health, Working Life and Welfare, and AFA Insurance; the ALF agreement between the Swedish government and county councils; Wilhelm and Martina Lundgren Science Foundation; Skandia; Dietary Science Foundation; and Nanna Swartz Foundation. Several authors declared grants, consulting fees, and advisory board roles with various pharmaceutical companies. Dr. Lacy reported no relevant disclosures.
A version of this article appeared on Medscape.com.
FROM THE LANCET GASTROENTEROLOGY AND HEPATOLOGY
Online, Self-Help Program May Curb Binge Eating
An online program aimed at helping those with binge-eating disorder (BED), based on completing cognitive-behavioral therapy (CBT) modules, showed positive results in a randomized, controlled trial. The findings were published in JAMA Network Open.
In the study, led by Luise Pruessner, MS, with the Department of Psychology at Heidelberg University in Germany, 154 patients (96% female; average age, 35.9 years) who met the criteria for BED were randomized 1:1 to the intervention or control group.
12-Week CBT Program with 6 Modules
The intervention group had access to a 12-week CBT online program with a core curriculum of six mandatory modules of texts and videos, focused on self-monitoring of binge eating, psychoeducation, and regulating emotion. Each could be accessed only after the previous module was completed. Participants also chose six specialization areas to personalize the experience. Email reminders were sent to participants who delayed starting the program to boost initial and continuing engagement.
The control group had no access to the program and participants were told they were on a 12-week waiting list for it. They could explore other treatments during that time, an option that mimics real-world experiences. The design choice also helped navigate the ethics of withholding a potentially effective treatment.
Significant Improvement in Outcomes
The intervention group had a significant reduction in binge-eating episodes, the primary outcome, compared with the control group. In the intervention group, the average number of episodes decreased from 14.79 at baseline to 6.07 (95% confidence interval, −11.31 to −6.72; P < .001). The reduction surpassed the clinically meaningful threshold of 3.97 episodes. The control group, as expected, had no significant reductions in episodes.
The intervention group also showed improvement in outcomes including well-being, self-esteem, and emotional regulation and reductions in clinical impairment, depression, and anxiety. “However, there were no meaningful between-group differences regarding changes in work capacity,” the authors noted.
In an invited commentary, Andrea Graham, PhD, with the Center for Behavioral Intervention Technologies at the Feinberg School of Medicine, Northwestern University, Chicago, noted that BED “is a prevalent, serious, and impairing psychiatric illness.”
The study authors pointed out that BED is one of the most prevalent eating disorders, affecting “1.0% to 2.8% of the population over their lifetimes.”
Dr. Graham noted that while there are evidence-based, face-to-face psychological treatments, many patients face considerable barriers to accessing those services.
Digital Intervention Advantages
“Digital interventions, such as the one evaluated by Pruessner and colleagues, have the potential to curb the mental health crisis by reaching large numbers of people in need” in the moments they need help most, she wrote.
She added that with BED, eating decisions and signals for dysregulated eating occur frequently throughout the day, highlighting the need for on-demand and immediate access to self-help, like the solution Ms. Pruessner and colleagues describe.
“The importance of Pruessner and colleagues’ findings is strengthened because their digital intervention did not rely on human support for delivery,” she wrote. Relying on human intervention poses financial challenges for achieving scale.
“Therefore, self-help interventions that achieve clinically significant improvements in outcomes present an important opportunity for closing the treatment gap for binge eating. Given its effectiveness, the critical next step is to learn where and how to implement this intervention to broadly reach individuals in need,” Dr. Graham wrote.
Primary care clinicians don’t typically intervene in eating disorders and a self-help intervention might help address that gap, she added.
“However, a first step would require increasing screening for eating disorders in primary care,” Dr. Graham pointed out.
The authors report no relevant financial relationships. Dr. Graham reports grants from the National Institute of Mental Health, the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), and the Agency for Healthcare Research and Quality. She reports receiving a grant from the NIDDK-funded Chicago Center for Diabetes Translation Research, Dean’s Office of the Biological Sciences Division of the University of Chicago and Feinberg School of Medicine at Northwestern University; and being an adviser to Alavida Health.
FROM JAMA NETWORK OPEN
It Would Be Nice if Olive Oil Really Did Prevent Dementia
This transcript has been edited for clarity.
As you all know by now, I’m always looking out for lifestyle changes that are both pleasurable and healthy. They are hard to find, especially when it comes to diet. My kids complain about this all the time: “When you say ‘healthy food,’ you just mean yucky food.” And yes, French fries are amazing, and no, we can’t have them three times a day.
So, when I saw an article claiming that olive oil reduces the risk for dementia, I was interested. I love olive oil; I cook with it all the time. But as is always the case in the world of nutritional epidemiology, we need to be careful. There are a lot of reasons to doubt the results of this study — and one reason to believe it’s true.
The study I’m talking about is “Consumption of Olive Oil and Diet Quality and Risk of Dementia-Related Death,” appearing in JAMA Network Open and following a well-trod formula in the nutritional epidemiology space.
Nearly 100,000 participants, all healthcare workers, filled out a food frequency questionnaire every 4 years with 130 questions touching on all aspects of diet: How often do you eat bananas, bacon, olive oil? Participants were followed for more than 20 years, and if they died, the cause of death was flagged as being dementia-related or not. Over that time frame there were around 38,000 deaths, of which 4751 were due to dementia.
The rest is just statistics. The authors show that those who reported consuming more olive oil were less likely to die from dementia — about 50% less likely, if you compare those who reported eating more than 7 grams of olive oil a day with those who reported eating none.
Is It What You Eat, or What You Don’t Eat?
And we could stop there if we wanted to; I’m sure big olive oil would be happy with that. Is there such a thing as “big olive oil”? But no, we need to dig deeper here because this study has the same problems as all nutritional epidemiology studies. Number one, no one is sitting around drinking small cups of olive oil. They consume it with other foods. And it was clear from the food frequency questionnaire that people who consumed more olive oil also consumed less red meat, more fruits and vegetables, more whole grains, more butter, and less margarine. And those are just the findings reported in the paper. I suspect that people who eat more olive oil also eat more tomatoes, for example, though data this granular aren’t shown. So, it can be really hard, in studies like this, to know for sure that it’s actually the olive oil that is helpful rather than some other constituent in the diet.
The flip side of that coin presents another issue. The food you eat is also a marker of the food you don’t eat. People who ate olive oil consumed less margarine, for example. At the time of this study, margarine was still adulterated with trans-fats, which a pretty solid evidence base suggests are really bad for your vascular system. So perhaps it’s not that olive oil is particularly good for you but that something else is bad for you. In other words, simply adding olive oil to your diet without changing anything else may not do anything.
The other major problem with studies of this sort is that people don’t consume food at random. The type of person who eats a lot of olive oil is simply different from the type of person who doesn’t. For one thing, olive oil is expensive. A 25-ounce bottle of olive oil is on sale at my local supermarket right now for $11.00. A similar-sized bottle of vegetable oil goes for $4.00.
Isn’t it interesting that food that costs more money tends to be associated with better health outcomes? (I’m looking at you, red wine.) Perhaps it’s not the food; perhaps it’s the money. We aren’t provided data on household income in this study, but we can see that the heavy olive oil users were less likely to be current smokers and they got more physical activity.
Now, the authors are aware of these limitations and do their best to account for them. In multivariable models, they adjust for other stuff in the diet, and even for income (sort of; they use census tract as a proxy for income, which is really a broad brush), and still find a significant though weakened association showing a protective effect of olive oil on dementia-related death. But still — adjustment is never perfect, and the small effect size here could definitely be due to residual confounding.
Evidence More Convincing
Now, I did tell you that there is one reason to believe that this study is true, but it’s not really from this study.
It’s from the PREDIMED randomized trial.
This is nutritional epidemiology I can get behind. Published in 2018, investigators in Spain randomized around 7500 participants to receive a liter of olive oil once a week vs mixed nuts, vs small nonfood gifts, the idea here being that if you have olive oil around, you’ll use it more. And people who were randomly assigned to get the olive oil had a 30% lower rate of cardiovascular events. A secondary analysis of that study found that the rate of development of mild cognitive impairment was 65% lower in those who were randomly assigned to olive oil. That’s an impressive result.
So, there might be something to this olive oil thing, but I’m not quite ready to add it to my “pleasurable things that are still good for you” list just yet. Though it does make me wonder: Can we make French fries in the stuff?
Dr. Wilson is associate professor of medicine and public health and director of the Clinical and Translational Research Accelerator at Yale University, New Haven, Conn. He has disclosed no relevant financial relationships.
A version of this article appeared on Medscape.com.
This transcript has been edited for clarity.
As you all know by now, I’m always looking out for lifestyle changes that are both pleasurable and healthy. They are hard to find, especially when it comes to diet. My kids complain about this all the time: “When you say ‘healthy food,’ you just mean yucky food.” And yes, French fries are amazing, and no, we can’t have them three times a day.
So, when I saw an article claiming that olive oil reduces the risk for dementia, I was interested. I love olive oil; I cook with it all the time. But as is always the case in the world of nutritional epidemiology, we need to be careful. There are a lot of reasons to doubt the results of this study — and one reason to believe it’s true.
The study I’m talking about is “Consumption of Olive Oil and Diet Quality and Risk of Dementia-Related Death,” appearing in JAMA Network Open and following a well-trod formula in the nutritional epidemiology space.
Nearly 100,000 participants, all healthcare workers, filled out a food frequency questionnaire every 4 years with 130 questions touching on all aspects of diet: How often do you eat bananas, bacon, olive oil? Participants were followed for more than 20 years, and if they died, the cause of death was flagged as being dementia-related or not. Over that time frame there were around 38,000 deaths, of which 4751 were due to dementia.
The rest is just statistics. The authors show that those who reported consuming more olive oil were less likely to die from dementia — about 50% less likely, if you compare those who reported eating more than 7 grams of olive oil a day with those who reported eating none.
Is It What You Eat, or What You Don’t Eat?
And we could stop there if we wanted to; I’m sure big olive oil would be happy with that. Is there such a thing as “big olive oil”? But no, we need to dig deeper here because this study has the same problems as all nutritional epidemiology studies. Number one, no one is sitting around drinking small cups of olive oil. They consume it with other foods. And it was clear from the food frequency questionnaire that people who consumed more olive oil also consumed less red meat, more fruits and vegetables, more whole grains, more butter, and less margarine. And those are just the findings reported in the paper. I suspect that people who eat more olive oil also eat more tomatoes, for example, though data this granular aren’t shown. So, it can be really hard, in studies like this, to know for sure that it’s actually the olive oil that is helpful rather than some other constituent in the diet.
The flip side of that coin presents another issue. The food you eat is also a marker of the food you don’t eat. People who ate olive oil consumed less margarine, for example. At the time of this study, margarine was still adulterated with trans-fats, which a pretty solid evidence base suggests are really bad for your vascular system. So perhaps it’s not that olive oil is particularly good for you but that something else is bad for you. In other words, simply adding olive oil to your diet without changing anything else may not do anything.
The other major problem with studies of this sort is that people don’t consume food at random. The type of person who eats a lot of olive oil is simply different from the type of person who doesn›t. For one thing, olive oil is expensive. A 25-ounce bottle of olive oil is on sale at my local supermarket right now for $11.00. A similar-sized bottle of vegetable oil goes for $4.00.
Isn’t it interesting that food that costs more money tends to be associated with better health outcomes? (I’m looking at you, red wine.) Perhaps it’s not the food; perhaps it’s the money. We aren’t provided data on household income in this study, but we can see that the heavy olive oil users were less likely to be current smokers and they got more physical activity.
Now, the authors are aware of these limitations and do their best to account for them. In multivariable models, they adjust for other stuff in the diet, and even for income (sort of; they use census tract as a proxy for income, which is really a broad brush), and still find a significant though weakened association showing a protective effect of olive oil on dementia-related death. But still — adjustment is never perfect, and the small effect size here could definitely be due to residual confounding.
Evidence More Convincing
Now, I did tell you that there is one reason to believe that this study is true, but it’s not really from this study.
It’s from the PREDIMED randomized trial.
This is nutritional epidemiology I can get behind. Published in 2018, investigators in Spain randomized around 7500 participants to receive a liter of olive oil per week, mixed nuts, or small nonfood gifts, the idea here being that if you have olive oil around, you’ll use it more. And people who were randomly assigned to get the olive oil had a 30% lower rate of cardiovascular events. A secondary analysis of that study found that the rate of development of mild cognitive impairment was 65% lower in those who were randomly assigned to olive oil. That’s an impressive result.
So, there might be something to this olive oil thing, but I’m not quite ready to add it to my “pleasurable things that are still good for you” list just yet. Though it does make me wonder: Can we make French fries in the stuff?
Dr. Wilson is associate professor of medicine and public health and director of the Clinical and Translational Research Accelerator at Yale University, New Haven, Conn. He has disclosed no relevant financial relationships.
A version of this article appeared on Medscape.com.
High-Quality Diet in Early Life May Ward Off Later IBD
A high-quality diet in early life may ward off later inflammatory bowel disease (IBD), prospective pooled data from two Scandinavian birth cohorts suggested.
It appears important to feed children a quality diet at a very young age, in particular one rich in vegetables and fish, since by age three, only dietary fish intake had any impact on IBD risk.
Although high intakes of these two food categories in very early life correlated with lower IBD risk, exposure to sugar-sweetened beverages (SSBs) was associated with an increased risk. “While non-causal explanations for our results cannot be ruled out, these novel findings are consistent with the hypothesis that early-life diet, possibly mediated through changes in the gut microbiome, may affect the risk of developing IBD,” wrote lead author Annie Guo, a PhD candidate in the Department of Pediatrics, University of Gothenburg, Sweden, and colleagues. The report was published in Gut.
“This is a population-based study investigating the risk for IBD, rather than the specific effect of diet,” Ms. Guo said in an interview. “Therefore, the results are not enough on their own to be translated into individual advice that can be applicable in the clinic. However, the study supports current dietary guidelines for small children, that is, the intake of sugar should be limited and a higher intake of fish and vegetables is beneficial for overall health.”
Two-Cohort Study
The investigators prospectively recorded food-group information on children (just under half were female) from the All Babies in Southeast Sweden and The Norwegian Mother, Father and Child Cohort Study to assess the diet quality using a Healthy Eating Index and intake frequency. Parents answered questions about their offspring’s diet at ages 12-18 months and 30-36 months. Quality of diet was measured by intake of meat, fish, fruit, vegetables, dairy, sweets, snacks, and drinks.
The Swedish cohort included 21,700 children born between October 1997 and October 1999, while the Norwegian analysis included 114,500 children, 95,200 mothers, and 75,200 fathers recruited from across Norway from 1999 to 2008. In 1,304,433 person-years of follow-up, the researchers tracked 81,280 participants from birth to childhood and adolescence, with median follow-ups in the two cohorts ranging from 1 year of age to 21.3 years (Sweden) and to 15.2 years of age (Norway). Of these children, 307 were diagnosed with IBD: Crohn’s disease (CD; n = 131); ulcerative colitis (UC; n = 97); and IBD unclassified (n = 79).
Adjusting for parental IBD history, sex, origin, education, and maternal comorbidities, the study found:
- Compared with low-quality diet, both medium- and high-quality diets at 1 year were associated with a roughly 25% reduced risk for IBD (pooled adjusted hazard ratio [aHR], 0.75 [95% CI, 0.58-0.98] and 0.75 [0.56-1.0], respectively).
- The pooled aHR per one-category increase in diet quality was 0.86 (95% CI, 0.74-0.99). The pooled aHR for IBD in 1-year-olds with high vs low fish intake was 0.70 (95% CI, 0.49-1.0), and this diet showed an association with a reduced risk for UC (pooled aHR, 0.46; 95% CI, 0.21-0.99). Higher vegetable intake at 1 year was also associated with a risk reduction in IBD (HR, 0.72; 95% CI, 0.55-0.95). It has been hypothesized that intake of vegetables and vegetable fibers may have programming effects on the immune system.
- With 72% of children reportedly consuming SSBs at age 1, pooled aHRs showed that some vs no intake of SSBs was associated with an increased risk for later IBD (pooled aHR, 1.42; 95% CI, 1.05-1.90).
- There were no obvious associations between overall IBD or CD/UC risk and meat, dairy, fruit, grains, potatoes, and foods high in sugar and/or fat. Diet at age 3 years was not associated with incident IBD (pooled aHR, 1.02; 95% CI, 0.76-1.37), suggesting that the risk impact of diet is greatest on very young and vulnerable microbiomes.
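For readers who want to sanity-check how the hazard ratios above translate into the quoted risk reductions, here is a minimal Python sketch. The mapping from aHR 0.75 to "roughly 25% reduced risk" is straightforward arithmetic; the pooling function is a generic fixed-effect inverse-variance approach on the log-HR scale, and the cohort-level numbers fed to it are illustrative assumptions, not figures from the paper:

```python
import math

def relative_risk_reduction(hr):
    """Percent risk reduction implied by a hazard ratio below 1."""
    return (1 - hr) * 100

# The study's pooled aHR of 0.75 corresponds to a ~25% lower hazard:
print(f"aHR 0.75 -> {relative_risk_reduction(0.75):.0f}% reduction")

def pool_fixed_effect(estimates):
    """Fixed-effect inverse-variance pooling on the log-HR scale.

    Each input is (hr, ci_low, ci_high). The standard error is
    recovered from the 95% CI width, which spans ±1.96 SE on the
    log scale."""
    num = den = 0.0
    for hr, lo, hi in estimates:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        weight = 1 / se**2
        num += weight * math.log(hr)
        den += weight
    return math.exp(num / den)

# Hypothetical per-cohort estimates (illustrative numbers only):
pooled = pool_fixed_effect([(0.70, 0.48, 1.02), (0.82, 0.60, 1.12)])
print(f"pooled aHR = {pooled:.2f}")
```

The pooled value always lands between the two cohort estimates, weighted toward the cohort with the narrower confidence interval.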
Ms. Guo noted that a Swedish national survey among 4-year-olds found a mean SSB consumption of 187 g/d with a mean frequency of once daily. The most desired changes in food habits are a lower intake of soft drinks, sweets, crisps, cakes, and biscuits and an increase in the intake of fruits and vegetables. A similar Norwegian survey among 2-year-olds showed that SSBs were consumed by 36% of all children with a mean intake of 40 g/d.
The exact mechanism by which sugar affects the intestinal microbiota is not established. “However, what we do know is that an excessive intake of sugar can disrupt the balance of the gut microbiome,” Ms. Guo said. “And if the child has a high intake of foods high in sugar, that also increases the chances that the child’s overall diet has a lower intake of other foods that contribute to a diverse microbiome such as fruits and vegetables.”
An ‘Elegant’ Study
In an accompanying editorial, gastroenterologist Ashwin N. Ananthakrishnan, MBBS, MPH, AGAF, of Mass General Brigham and the Mass General Research Institute, Boston, cautioned that accurately measuring food intake in very young children is difficult, and dietary questionnaires in this study did not address food additives and emulsifiers common in commercial baby food, which may play a role in the pathogenesis of IBD.
Another study limitation is that the dietary questionnaire used has not been qualitatively or quantitatively validated against other more conventional methods, said Dr. Ananthakrishnan, who was not involved in the research.
Nevertheless, he called the study “elegant” and said it expands the data on the importance of this period in IBD development. “Although in the present study there was no association between diet at 3 years and development of IBD (in contrast to the association observed for dietary intake at 1 year), other prospective cohorts of adult-onset IBD have demonstrated an inverse association between vegetable or fish intake and reduced risk for CD while sugar-sweetened beverages have been linked to a higher risk for IBD.”
As to the question of recommending early preventive diet for IBD, “thus far, data on the impact of diet very early in childhood, outside of breastfeeding, on the risk for IBD has been lacking,” Dr. Ananthakrishnan said in an interview. “This important study highlights that diet as early as 1 year can modify subsequent risk for IBD. This raises the intriguing possibility of whether early changes in diet could be used, particularly in those at higher risk, to reduce or even prevent future development of IBD. Of course, more work needs to be done to define modifiability of diet as a risk factor, but this is important supportive data.”
In his editorial, Dr. Ananthakrishnan stated that despite the absence of gold-standard interventional data demonstrating a benefit of dietary interventions, “in my opinion, it may still be reasonable to suggest such interventions to motivate individuals who incorporate several of the dietary patterns associated with lower risk for IBD from this and other studies. This includes ensuring adequate dietary fiber, particularly from fruits and vegetables, intake of fish, minimizing sugar-sweetened beverages and preferring fresh over processed and ultra-processed foods and snacks.” According to the study authors, their novel findings support further research on the role of childhood diet in the prevention of IBD.
The All Babies in Southeast Sweden Study is supported by Barndiabetesfonden (Swedish Child Diabetes Foundation), the Swedish Council for Working Life and Social Research, the Swedish Research Council, the Medical Research Council of Southeast Sweden, the JDRF Wallenberg Foundation, ALF and LFoU grants from Region Östergötland and Linköping University, and the Joanna Cocozza Foundation.
The Norwegian Mother, Father and Child Cohort Study is supported by the Norwegian Ministry of Health and Care Services and the Ministry of Education and Research.
Ms. Guo received grants from the Swedish Society for Medical Research and the Henning and Johan Throne-Holst Foundation to conduct this study. Co-author Karl Mårild has received funding from the Swedish Society for Medical Research, the Swedish Research Council, and ALF, Sweden’s medical research and education co-ordinating body. The authors declared no competing interests. Dr. Ananthakrishnan is supported by the National Institutes of Health, the Leona M. and Harry B. Helmsley Charitable Trust, and the Chleck Family Foundation. He has served on the scientific advisory board for Geneoscopy.
FROM GUT
More Cases of Acute Diverticulitis Treated Outside Hospital
BOSTON — Patients with acute colonic diverticulitis are more likely to be seen by primary care providers than by emergency physicians, representing a shift in the way clinicians detect and treat the condition.
Acute colonic diverticulitis affects roughly 180 per 100,000 people per year in the United States.
CT of the abdomen and pelvis may not be a first-line method to detect diverticulitis in the primary care setting as it has been in emergent care, according to Kaveh Sharzehi, MD, MS, associate professor of medicine in the Division of Gastroenterology and Hepatology at Oregon Health & Science University in Portland.
Indeed, clinical guidelines by multiple physician groups recommend that providers use a more individualized approach to detecting and treating the condition.
“There is still great value in proper and thorough physical history and some adjunct testing,” Dr. Sharzehi told attendees during a presentation on April 20 at the American College of Physicians Internal Medicine Meeting 2024. These two methods can detect the disease up to 65% of the time, Dr. Sharzehi added.
An initial evaluation of a patient with suspected acute diverticulitis should first assess the patient’s history of abdominal pain, fever, and leukocytosis, Dr. Sharzehi said.
A C-reactive protein level > 50 mg/L “almost doubles the odds of having diverticulitis,” Dr. Sharzehi said. Studies also suggest increased levels of procalcitonin and fecal calprotectin can indicate the presence of the condition.
The American Gastroenterological Association (AGA) and the American College of Physicians recommend abdominal CT if clinicians are uncertain of the diagnosis, and to evaluate potential complications in severe cases. Ultrasound and MRI can be useful alternatives, according to guidelines from the American Society of Colon and Rectal Surgeons.
The chances of developing diverticulitis increase with age. More than 60% of Americans aged 60 years or older have diverticulosis, a condition characterized by small pouches in the colon lining that can weaken the colon wall. Less than 5% of people with diverticulosis go on to develop diverticulitis.
“Aspirin and opioid use are also risk factors, likely from their effect on the colonic transit time and causing constipation that might contribute to diverticulitis, but that’s not very well understood,” Dr. Sharzehi said.
Medical management has shifted from predominantly inpatient to predominantly outpatient care, Dr. Sharzehi told attendees.
“Unfortunately, there are not that many supportive guidelines for what diet a patient should have in the acute setting of diverticulitis,” he said.
Patients with a mild case may benefit from a clear liquid diet; for some patients, high-fiber diets, regular physical activity, and statins may protect against recurrence.
Current guidelines recommend against prescribing antibiotics for most cases because evidence suggests that diverticulitis is primarily an inflammatory process that can result in small tears in the diverticulum, rather than the disease being a complication of existing tears.
Patients should also not be treated with probiotics or 5-aminosalicylic acid agents, Dr. Sharzehi said.
“My practice is in the Pacific Northwest, where there’s a lot of belief in naturopathic remedies, so we get a lot of questions about supplements and probiotics in preventing diverticulitis,” he said. “We don’t think it does help, and this is unanimous among all the main [physician] societies.”
The AGA recommends referring patients for a colonoscopy within a year after diverticulitis symptoms have subsided.
Severe or unresolved cases could require inpatient procedures such as percutaneous drainage or surgery. An estimated 15%-30% of patients admitted to hospital with acute diverticulitis require surgery, Dr. Sharzehi said.
Surgery may become an option for patients who have recurrent cases of the disease, even if not severe, Dr. Sharzehi said.
Dr. Sharzehi reported no relevant disclosures.
A version of this article first appeared on Medscape.com.
BOSTON — Patients with acute colonic diverticulitis are more likely to be seen by primary care providers than by emergency physicians, representing a shift in the way clinicians detect and treat the condition.
Acute colonic diverticulitis affects roughly 180 per 100,000 people per year in the United States.
CT of the abdomen and pelvis may not be a first-line method to detect diverticulitis in the primary care setting as it has been in emergent care, according to Kaveh Sharzehi, MD, MS, associate professor of medicine in the Division of Gastroenterology and Hepatology at Oregon Health & Science University in Portland.
Indeed, clinical guidelines by multiple physician groups recommend that providers use a more individualized approach to detecting and treating the condition.
“There is still great value in proper and thorough physical history and some adjunct testing,” Dr. Sharzehi told attendees during a presentation on April 20 at the American College of Physicians Internal Medicine Meeting 2024. These two methods can detect the disease up to 65% of the time, Dr. Sharzehi added.
An initial evaluation of a patient with suspected acute diverticulitis should first assess the patient’s history of abdominal pain, fever, and leukocytosis, Dr. Sharzehi said.
A C-reactive protein level > 50 mg/L “almost doubles the odds of having diverticulitis,” Dr. Sharzehi said. Studies also suggest increased levels of procalcitonin and fecal calprotectin can indicate the presence of the condition.
The American Gastroenterological Association (AGA) and the American College of Physicians recommend abdominal CT if clinicians are uncertain of the diagnosis, and to evaluate potential complications in severe cases. Ultrasound and MRI can be useful alternatives, according to guidelines from the American Society of Colon and Rectal Surgeons.
The chances of developing diverticulitis increase with age. More than 60% of Americans aged 60 years or older have diverticulosis, a condition characterized by small pouches in the colon lining that can weaken the colon wall. Less than 5% of people with diverticulosis go on to develop diverticulitis.
“Aspirin and opioid use are also risk factors, likely from their effect on the colonic transit time and causing constipation that might contribute to diverticulitis, but that's not very well understood,” Dr. Sharzehi said.
Medical management has shifted from predominantly inpatient to predominantly outpatient care, Dr. Sharzehi told attendees.
“Unfortunately, there are not that many supportive guidelines for what diet a patient should have in the acute setting of diverticulitis,” he said.
Patients with a mild case may benefit from a clear liquid diet; for some patients, high-fiber diets, regular physical activity, and statins may protect against recurrence.
Current guidelines recommend against prescribing antibiotics for most cases because evidence suggests that diverticulitis is primarily an inflammatory process that can result in small tears in the diverticulum, rather than the disease being a complication of existing tears.
Patients should also not be treated with probiotics or 5-aminosalicylic acid agents, Dr. Sharzehi said.
“My practice is in the Pacific Northwest, where there’s a lot of belief in naturopathic remedies, so we get a lot of questions about supplements and probiotics in preventing diverticulitis,” he said. “We don’t think it does help, and this is unanimous among all the main [physician] societies.”
The AGA recommends referring patients for a colonoscopy within a year after diverticulitis symptoms have resolved.
Severe or unresolved cases could require inpatient procedures such as percutaneous drainage or surgery. An estimated 15%-30% of patients admitted to the hospital with acute diverticulitis require surgery, Dr. Sharzehi said.
Surgery may become an option for patients who have recurrent cases of the disease, even if not severe, Dr. Sharzehi said.
Dr. Sharzehi reported no relevant disclosures.
A version of this article first appeared on Medscape.com.