Sodium controversy: More fuel for the fire
Three large international studies addressing sodium intake’s effect on blood pressure and on cardiovascular and mortality outcomes are not likely to quell the controversy surrounding this issue. Rather, since the findings of one study directly oppose those of the other two, the results promise to fan the flames a bit higher.
All three studies were reported online August 14 in the New England Journal of Medicine.
Sodium and blood pressure: PURE
The first report concerned a substudy of data from the Prospective Urban Rural Epidemiology (PURE) study involving 102,216 adults aged 35-70 years residing in 667 communities in 18 low-, middle-, and high-income countries worldwide. Urinary sodium and potassium levels were used as surrogates for dietary intake of these elements, and these excretion levels were correlated with the participants’ blood pressure levels, said Andrew Mente, Ph.D., of the Population Health Research Institute, Hamilton (Ont.) Health Sciences and McMaster University, and his associates.
Current guidelines recommend a maximum sodium intake of 1.5-2.4 g/day, depending on the country. Only 0.6% of the study population achieved the lowest level of 1.5 g/day, the level recommended in the United States, and only 10% achieved less than 3 g/day. The largest segment of the study population, 46%, had a sodium excretion of 3-5 g/day, and the next largest segment, 44%, had a sodium excretion of more than 5 g/day.
"This suggests that, at present, human consumption of extremely low amounts of sodium for prolonged periods is rare," the investigators noted.
The investigators found, after multivariate adjustment, that for each 1-g increment in sodium excretion, there was an increment of 2.11 mm Hg in systolic blood pressure and 0.78 mm Hg in diastolic blood pressure (P less than .001 for both) for all areas of the globe.
However, this correlation was nonlinear. The association between sodium and blood pressure was weak in the largest subset of participants who had an excretion of 3-5 g/day, and was nonsignificant in those who had an excretion of less than 3 g/day.
The association between sodium intake and blood pressure was stronger in people who had an excretion of more than 5 g/day and in those who already had hypertension at baseline. It also increased with increasing patient age.
Taken together, these findings indicate that sodium’s effect on blood pressure is nonuniform and depends on the background diet of the population as well as the individual’s age and hypertension status, Dr. Mente and his associates said (N. Engl. J. Med. 2014 Aug. 14;371:601-11 [doi:10.1056/NEJMoa1311989]).
Sodium and cardiovascular mortality: PURE
The second report also was a substudy of the PURE study, this time headed by Dr. Martin O’Donnell of the Population Health Research Institute and McMaster University. The researchers performed a prospective cohort study involving 101,945 PURE participants to assess the association of baseline urinary sodium and potassium excretion, again as a surrogate for intake, with mortality and incident cardiovascular (CV) events during 3.7 years of follow-up.
The primary composite outcome of death or a major CV event occurred in 3,317 participants (3.3%). The mean 24-hour sodium excretion was 4.9 g.
Surprisingly, the lowest risk of death and CV events was seen not in people with the recommended levels of sodium excretion but in those whose sodium excretion was much higher, at 3-6 g/day. Risks actually increased at levels of sodium excretion that were lower than 3 g/day, as is recommended, as well as at levels that were higher than 6 g/day. Moreover, the association between high sodium excretion and high CV and mortality risk was significant only among adults who already had hypertension at baseline.
"The projected benefits of low sodium intake ... are derived from models ... that assume a linear relationship between sodium intake and blood pressure and between blood pressure and cardiovascular events. Implicit in these guidelines is the assumption that there is no unsafe lower limit of sodium intake," Dr. O’Donnell and his associates wrote (N. Engl. J. Med. 2014 Aug. 14;371:612-23 [doi:10.1056/NEJMoa1311889]).
The findings from both of these PURE studies call those assumptions into question.
Sodium and cardiovascular mortality: NUTRICODE
The third report was a review of the literature regarding sodium intake’s effect on CV mortality worldwide; the gathered data then served as the basis for a complex statistical model that estimated how many deaths could be attributed to sodium consumption in excess of a reference level of 2.0 g/day. This study was performed by the Global Burden of Diseases, Nutrition, and Chronic Diseases Expert Group (NUTRICODE) and was headed by Dr. Dariush Mozaffarian, a cardiologist and epidemiologist with Tufts University and the Harvard School of Public Health, both in Boston.
These investigators quantified sodium intake in 66 countries (accounting for 74% of adults throughout the world) by age, sex, and country of residence, and correlated these data first with findings from their meta-analysis of 107 randomized trials of interventions to curb sodium intake and then with the results of two large international trials linking the effects of various blood pressure levels on CV mortality.
They estimated that the mean level of sodium intake worldwide is 3.95 g/day and that those mean levels varied by geographic region from a low of 2.18 g to a high of 5.51 g. "Overall, 181 of 187 countries – 99.2% of the adult population of the world – had estimated mean levels of sodium intake exceeding the World Health Organization recommendation of 2.0 g/day," Dr. Mozaffarian and his associates said.
Contrary to the findings of the two PURE analyses, these data showed "strong evidence of a linear dose-response relationship" between sodium intake and blood pressure, such that each reduction of 2.30 g/day of sodium was significantly linked with a reduction of 3.82 mm Hg in systolic blood pressure, as well as a direct correlation between increasing blood pressure and increasing CV mortality.
Extrapolating from these data, "we found that 1.65 million deaths from CV causes worldwide in 2010 were attributable to sodium consumption above the reference level" of 2 g/day. "Globally, 40.4% of these deaths occurred prematurely, i.e. in persons younger than 70 years of age," Dr. Mozaffarian and his associates said (N. Engl. J. Med. 2014 Aug. 14;371:624-34 [doi:10.1056/NEJMoa1304127]).
"In sum, approximately 1 of every 10 deaths from CV causes worldwide and nearly 1 of every 5 premature deaths from CV causes were attributed to sodium consumption above the reference level," they said.
In an editorial accompanying this report, Dr. Suzanne Oparil said, "The NUTRICODE investigators should be applauded for a herculean effort in synthesizing a large body of data regarding the potential harm of excess salt consumption" (N. Engl. J. Med. 2014 Aug. 14;371:677-9 [doi:10.1056/NEJMe1407695]).
"However, given the numerous assumptions necessitated by the lack of high-quality data [in the literature], caution should be taken in interpreting the findings of this study," said Dr. Oparil of the vascular biology and hypertension program, University of Alabama at Birmingham.
The PURE studies were supported by the Heart and Stroke Foundation of Ontario, the Population Health Research Institute, the Canadian Institutes of Health Research, several pharmaceutical companies, and various national or local organizations in 18 participating countries. These funders played no role in the design or conduct of the studies, in collection or analysis of data, or in preparing the manuscript. Dr. O’Donnell reported ties to Boehringer Ingelheim, Bayer, Bristol-Myers Squibb, and Pfizer, and his associates reported ties to Sanofi-Aventis, AstraZeneca, and Cadila. The NUTRICODE study was funded by the Bill and Melinda Gates Foundation.
The provocative findings from both groups of PURE investigators call into question "the feasibility and usefulness of reducing dietary sodium as a population-based strategy for reducing blood pressure," said Dr. Suzanne Oparil.
The authors’ suggested alternative approach of recommending high-quality diets rich in potassium "might achieve greater health benefits, including blood pressure reduction, than aggressive sodium reduction alone," she noted.
Dr. Suzanne Oparil is in the vascular biology and hypertension program at the University of Alabama at Birmingham. These remarks were taken from her editorial accompanying the three reports on sodium consumption (N. Engl. J. Med. 2014 Aug. 14;371:677-9 [doi:10.1056/NEJMe1407695]).
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Key clinical point: The results from two of three large studies on the sodium intake/blood pressure/cardiovascular death triad contradict each other.
Major finding: Sodium excretion, as an indicator of intake, positively correlated with systolic and diastolic blood pressure across all geographic regions, but the correlation was nonlinear and was weakest in the largest subset of participants, those with an intake of 3-5 g/day. The lowest risk of death and CV events was seen not in people with the recommended levels of sodium excretion but in those whose sodium excretion was much higher, at 3-6 g/day; risks actually increased at excretion levels lower than 3 g/day, as is recommended. An estimated 1.65 million deaths from CV causes worldwide in 2010 were attributable to sodium consumption above the WHO recommended maximum of 2 g/day.
Data source: PURE, a prospective international epidemiologic study of the link between sodium excretion and blood pressure in 102,216 adults, and NUTRICODE, a review of the literature plus statistical modeling of CV deaths tied to sodium consumption worldwide.
Disclosures: The PURE studies were supported by the Heart and Stroke Foundation of Ontario, the Population Health Research Institute, the Canadian Institutes of Health Research, several pharmaceutical companies, and various national or local organizations in 18 participating countries. Dr. O’Donnell reported ties to Boehringer Ingelheim, Bayer, Bristol-Myers Squibb, and Pfizer, and his associates reported ties to Sanofi-Aventis, AstraZeneca, and Cadila. The NUTRICODE study was funded by the Bill and Melinda Gates Foundation.
Flexible sigmoidoscopy screening reduces colorectal cancer, mortality
One-time-only screening for colorectal cancer using flexible sigmoidoscopy reduced colorectal cancer incidence by 20% and colorectal cancer–specific mortality by 27% in a Norwegian study involving more than 20,000 adults followed for approximately 11 years, which was published online Aug. 12 in JAMA.
In addition, younger patients aged 50-54 years appeared to benefit at least as much from screening as older patients aged 55-64 years, in what the investigators described as the first randomized controlled trial to assess the benefit of the procedure in this age group. This finding is particularly important given that some national screening recommendations, including those in the United States, advise that colorectal cancer screening be initiated at age 50 rather than at 55, said Dr. Oyvind Holme of Sorlandet Hospital, Kristiansand, Norway, and his associates in the Norwegian Colorectal Cancer Prevention (NORCCAP) trial.
NORCCAP is a population-based, randomized controlled study in which 100,210 men and women aged 50-64 years residing in two geographic regions of Norway in 1999-2001 were invited by mail to undergo once-only flexible sigmoidoscopy screening or flexible sigmoidoscopy plus fecal occult blood testing (FOBT) for colorectal cancer. Virtually no screening colonoscopies were available outside this trial in Norway at that time, ensuring that members of the control group could not be screened on their own, which would have biased the study results.
The intent-to-treat population comprised 98,792 adults: 10,283 randomly assigned to flexible sigmoidoscopy, 10,289 randomly assigned to flexible sigmoidoscopy plus FOBT, and 78,220 control subjects. Adherence to screening was 63%, with 12,955 of the invited patients attending their screening exam.
After a median follow-up of 11 years, the incidence of colorectal cancer was 112.6 per 100,000 person-years in the screening group, compared with 141.0 in the control group, a 20% difference. Similarly, colorectal cancer–specific mortality was 31.4 per 100,000 person-years in the screening group, compared with 43.1 in the control group, a 27% difference, the investigators reported (JAMA 2014;312:606-15).
Adding FOBT to flexible sigmoidoscopy not only failed to improve colorectal cancer detection, it was actually a deterrent to adherence, and thus may have impeded detection, Dr. Holme and his associates said.
NORCCAP was funded by the Norwegian government, the Norwegian Cancer Society, the Research Council of Norway, The South-East Regional Health Authority, the Fulbright Foundation, Sorlandet Hospital, and the National Institutes of Health. Dr. Holme reported no potential financial conflicts of interest; one of his associates reported ties to Exact Sciences, Olympus, and other companies.
This is the fourth large randomized trial assessing sigmoidoscopy screening in recent years – the others taking place in the United Kingdom, the United States, and Italy – even though the procedure "has all but vanished" in the United States, replaced by screening colonoscopy, Dr. Allan S. Brett said.
And even the benefit of screening colonoscopy may soon become moot, as that invasive procedure is replaced by "a multitarget stool test that identifies several DNA abnormalities associated with colorectal cancer or precancerous adenomas," which is now under investigation. In one large study published earlier this year, this test’s sensitivity was 92% for detecting cancer and 42% for detecting advanced precancerous lesions, and its specificity was 90%, he noted.
Dr. Allan S. Brett is in the department of medicine at the University of South Carolina, Columbia. He reported no financial conflicts of interest. These remarks were taken from his editorial accompanying Dr. Holme’s report (JAMA 2014;312:601-2).
One-time-only screening for colorectal cancer using flexible sigmoidoscopy reduced colorectal cancer incidence by 20% and colorectal cancer–specific mortality by 27% in a Norwegian study in which more than 20,000 screened adults and some 78,000 controls were followed for approximately 11 years, which was published online Aug. 12 in JAMA.
In addition, younger patients aged 50-54 years appeared to benefit at least as much from screening as older patients aged 55-64 years, in what the investigators described as the first randomized controlled trial to assess the benefit of the procedure in this age group. This finding is particularly important given that some national screening recommendations, including those in the United States, advise that colorectal cancer screening be initiated at age 50 rather than at 55, said Dr. Oyvind Holme of Sorlandet Hospital, Kristiansand, Norway, and his associates in the Norwegian Colorectal Cancer Prevention (NORCCAP) trial.
NORCCAP is a population-based, randomized controlled study in which 100,210 men and women aged 50-64 years residing in two geographic regions in Norway in 1999-2001 were invited by mail to undergo once-only flexible sigmoidoscopy screening or flexible sigmoidoscopy plus fecal occult blood testing (FOBT) for colorectal cancer. Virtually no other screening colonoscopies were available outside this trial in Norway at that time, ensuring that members of the control group could not seek screening on their own and bias the study results.
The intent-to-treat population comprised 98,792 adults: 10,283 randomly assigned to flexible sigmoidoscopy, 10,289 randomly assigned to flexible sigmoidoscopy plus FOBT, and 78,220 control subjects. Adherence to screening was 63%, with 12,955 of the invited patients attending their screening exam.
After a median follow-up of 11 years, the incidence of colorectal cancer was 112.6 per 100,000 person-years in the screening group, compared with 141.0 in the control group, a 20% difference. Similarly, colorectal cancer–specific mortality was 31.4 per 100,000 person-years in the screening group, compared with 43.1 in the control group, a 27% difference, the investigators reported (JAMA 2014;312:606-15).
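The 20% and 27% figures follow directly from the quoted person-year rates; a quick arithmetic sketch (the rates are taken from the report, the function name is ours):

```python
# Back-of-envelope check of the reported relative reductions, using the
# per-100,000-person-year rates quoted above.
def relative_reduction(control_rate, screened_rate):
    """Relative reduction of the screened rate vs. the control rate, in percent."""
    return 100 * (control_rate - screened_rate) / control_rate

incidence_rrr = relative_reduction(141.0, 112.6)  # colorectal cancer incidence
mortality_rrr = relative_reduction(43.1, 31.4)    # CRC-specific mortality

print(f"incidence reduction: {incidence_rrr:.0f}%")  # ~20%
print(f"mortality reduction: {mortality_rrr:.0f}%")  # ~27%
```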
Adding FOBT to flexible sigmoidoscopy not only failed to improve colorectal cancer detection, it was actually a deterrent to adherence, and thus may have impeded detection, Dr. Holme and his associates said.
NORCCAP was funded by the Norwegian government, the Norwegian Cancer Society, the Research Council of Norway, The South-East Regional Health Authority, the Fulbright Foundation, Sorlandet Hospital, and the National Institutes of Health. Dr. Holme reported no potential financial conflicts of interest; one of his associates reported ties to Exact Sciences, Olympus, and other companies.
FROM JAMA
Key clinical point: Flexible sigmoidoscopy can significantly reduce the incidence of colorectal cancer.
Major finding: The incidence of colorectal cancer was 112.6 per 100,000 person-years in the screening group, compared with 141.0 in the control group, a 20% difference; and colorectal cancer–specific mortality was 31.4 per 100,000 person-years in the screening group, compared with 43.1 in the control group, a 27% difference.
Data source: A population-based, randomized controlled clinical trial involving 20,572 adults aged 50-64 years who underwent flexible sigmoidoscopy screening for colorectal cancer and 78,220 control subjects who did not, all of whom were followed for approximately 11 years for the development of colorectal cancer.
Disclosures: NORCCAP was funded by the Norwegian government, the Norwegian Cancer Society, the Research Council of Norway, The South-East Regional Health Authority, the Fulbright Foundation, Sorlandet Hospital, and the National Institutes of Health. Dr. Holme reported no potential financial conflicts of interest.
Benefits outweigh harms of aspirin therapy
Prophylactic aspirin therapy of at least 5 years’ duration has a favorable benefit-harm profile, primarily because of its effectiveness in preventing colorectal and other cancers, according to a report in the August issue of Annals of Oncology.
In a review and summary of the current evidence regarding the benefits and harms of aspirin therapy, investigators analyzed the protective and adverse effects of several dosing regimens for both men and women at four different ages: 50, 55, 60, and 65 years. Their analysis assumed (conservatively) that protection against cancer begins 3 years after starting aspirin and continues for 5 years after stopping it, said Jack Cuzick, Ph.D., of the Centre for Cancer Prevention, Wolfson Institute of Preventive Medicine, Queen Mary University of London, and his associates.
The researchers found "overwhelming" evidence that aspirin therapy reduces the incidence of colorectal cancer by approximately 30% and mortality from the disease by approximately 40%. It is effective in both men and women, in every age group, at every dosage, and in patients at high risk for colorectal cancer.
Prophylactic aspirin is also effective in reducing mortality related to esophageal cancer by 27%-58%, depending on the study. It exerts "substantial" protection against stomach cancer as well, but does not appear to protect against pancreatic cancer. Aspirin therapy exerts a smaller protective effect against breast, prostate, lung, and endometrial cancers, Dr. Cuzick and his associates said.
These benefits are seen in both men and women, as well as in all age groups studied. But absolute benefits are greatest among older men.
Prophylactic aspirin also reduces serious cardiovascular events, particularly nonfatal myocardial infarction, although to a much smaller degree.
"Using our ‘best estimates’ for individuals taking aspirin for 10 years, there would be a relative reduction of approximately 9% in the number of men and 7% in the number of women with a cancer, MI, or stroke event over a 15-year period," the investigators said (Ann. Oncol. 2014 Aug. 5 [doi:10.1093/annonc/mdu225]).
"Reductions in cancer incidence would account for 61%-80% of the overall benefit, and reductions in colorectal cancer alone would account for 30%-36% of it," they noted.
It is still uncertain whether there is an upper age limit at which potential harms, such as excess bleeding, outweigh potential benefits, and the optimal dose for cancer prevention hasn’t yet been established, the authors added.
This study was sponsored by the International Society of Cancer Prevention, Cancer Research U.K., the British Heart Foundation, and the American Cancer Society. Dr. Cuzick reported serving on an advisory board for Bayer; several associates reported ties to numerous industry sources.
FROM ANNALS OF ONCOLOGY
Key clinical point: Prophylactic aspirin therapy has a favorable benefit-harm profile.
Major finding: Using "best estimates" for individuals taking aspirin for 10 years, there would be a relative reduction of approximately 9% in the number of men and 7% in the number of women with a cancer, MI, or stroke event over a 15-year period.
Data source: A review and summary of current evidence regarding the benefits and harms of prophylactic aspirin therapy.
Disclosures: This study was sponsored by the International Society of Cancer Prevention, Cancer Research U.K., the British Heart Foundation, and the American Cancer Society. Dr. Cuzick reported serving on an advisory board for Bayer; several associates reported ties to numerous industry sources.
Robot-assisted radical cystectomy doesn’t cut complications
Compared with open radical cystectomy, robot-assisted laparoscopic radical cystectomy did not reduce the number or severity of complications among patients with bladder cancer enrolled in a single-center randomized clinical trial, according to a letter to the editor in the New England Journal of Medicine.
Retrospective studies have suggested that the robot-assisted laparoscopic approach reduces the complication rate and shortens the length of hospitalization in patients with bladder cancer, many of whom are older, are smokers, and have coexisting conditions. This trial was designed to compare that approach against open radical cystectomy, with both procedures using extracorporeal urinary diversion, in 118 patients who had stage Ta-T3, N0-N3, M0 bladder cancer, said Dr. Bernard H. Bochner and his associates at Memorial Sloan Kettering Cancer Center, New York.
The primary outcome – the rate of complications of grade 2-5 within 90 days of surgery – was 62% (37 of 60 patients) with robot-assisted laparoscopic surgery, which was not significantly different from the 66% rate (38 of 58 patients) with open surgery. Similarly, the rates of high-grade complications were not significantly different (22% vs. 21%), and the mean length of hospitalization was identical (8 days) in both groups. The amount of intraoperative blood loss was smaller with the robot-assisted surgery (mean difference, 159 cc), but the duration of surgery was shorter with open surgery (mean difference, 127 minutes).
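To see why 62% vs. 66% is not a significant difference at these sample sizes, a hand-rolled two-proportion z-test can be sketched (this is our own illustration, not the authors' analysis; the counts are from the report):

```python
# Two-sided two-proportion z-test on 37/60 vs. 38/58 grade 2-5 complication
# rates, using only the standard library (normal tail via math.erf).
import math

def two_proportion_p_value(x1, n1, x2, n2):
    """Two-sided p-value for H0: the two underlying proportions are equal."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Standard normal tail probability via the error function
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

p = two_proportion_p_value(37, 60, 38, 58)
print(f"p = {p:.2f}")  # far above 0.05, consistent with "not significant"
```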
"Because the trial was performed by experienced surgeons at a single, high-volume referral center, the results may not be generalizable to all clinical settings. Nonetheless, these results highlight the need for randomized trials to inform the benefits and risks of new surgical technologies before widespread implementation," Dr. Bochner and his associates said (N. Engl. J. Med. 2014;371:389-90).
This work was supported by the Sidney Kimmel Center for Prostate and Urologic Cancers at Memorial Sloan Kettering Cancer Center, Pin Down Bladder Cancer, and the Michael A. and Zena Wiener Research and Therapeutics Program in Bladder Cancer. Dr. Bochner and his associates reported no financial conflicts of interest.
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Key clinical point: Decisions about surgical approach to bladder cancer surgery should not be based on the assumption that robot-assisted laparoscopic radical cystectomy will necessarily result in fewer complications.
Major finding: The primary outcome – the rate of complications of grade 2-5 within 90 days of surgery – was 62% (37 of 60 patients) with robot-assisted laparoscopic surgery, which was not significantly different from the 66% rate (38 of 58 patients) with open surgery.
Data source: A 3-year single-center randomized, controlled clinical trial involving 60 patients who underwent robot-assisted laparoscopic radical cystectomy and 58 who underwent open radical cystectomy to treat bladder cancer.
Disclosures: This work was supported by the Sidney Kimmel Center for Prostate and Urologic Cancers at Memorial Sloan Kettering Cancer Center, Pin Down Bladder Cancer, and the Michael A. and Zena Wiener Research and Therapeutics Program in Bladder Cancer. Dr. Bochner and his associates reported no financial conflicts of interest.
Bisphosphonates don’t cut risk of breast cancer
Three to four years of therapy with the bisphosphonates alendronate and zoledronic acid, taken at doses used to treat osteoporosis, did not decrease the risk of incident breast cancer in postmenopausal women, according to a report published online Aug. 11 in JAMA Internal Medicine.
In a post hoc analysis of data from two large multicenter, randomized, double-blind, controlled clinical trials assessing the effectiveness of alendronate or zoledronic acid for osteoporosis, the development of incident breast cancer was not significantly different between women taking the drugs and women taking placebo, said Trisha F. Hue, Ph.D., of the department of epidemiology and biostatistics at the University of California, San Francisco, and her associates.
Numerous previous observational studies, as well as a meta-analysis pooling the data from several of them, had shown that bisphosphonates taken for osteoporosis significantly lowered the risk of breast cancer by 32%-39%. Those findings, however, may have been confounded by indication, because conditions common in postmenopausal women – notably, low levels of estradiol and high levels of sex hormone–binding globulin (SHBG) – are strongly associated both with low bone density, fractures, and bone loss and with a low risk of estrogen receptor–positive breast cancer, Dr. Hue and her associates said. Thus, they suggested, the postmenopausal women who are most likely to be given bisphosphonates for bone health already have a lower risk of breast cancer.
Such confounding can be averted by using a randomized trial design, so Dr. Hue and her colleagues assessed whether bisphosphonates reduced incident breast cancer by analyzing data from the Fracture Intervention Trial (FIT) and the Health Outcomes and Reduced Incidence With Zoledronic Acid Once Yearly-Pivotal Fracture Trial (HORIZON-PFT).
Among the 6,194 FIT participants in this study, there were 103 cases of invasive breast cancer during a mean of 3.8 years of follow-up. The incidence of breast cancer was 1.8% (57 women) among women taking alendronate and 1.5% (46 women) in those taking placebo, a nonsignificant difference in favor of placebo.
Among the 7,580 HORIZON-PFT participants in this study, there were 62 cases of invasive breast cancer during 3 years of follow-up. The incidence of breast cancer was 0.9% (33 women) among women taking zoledronic acid and 0.8% (29 women) among those taking placebo, which was, again, a nonsignificant difference in favor of placebo, the investigators said (JAMA Intern. Med. 2014 Aug. 11 [doi:10.1001/jamainternmed.2014.3634]).
The total number of breast cancer cases was relatively small, so Dr. Hue and her associates pooled the data from both studies to increase the sample size. The combined incidence of breast cancer was 1.3% (90 women) among those taking bisphosphonates and 1.1% (75 women) among those taking placebo – again, a nonsignificant difference favoring placebo.
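The pooled counts are simply the sums of the per-trial case counts quoted above; a trivial sanity check (the dictionary names are ours, the counts are from the report):

```python
# Pooling the per-trial invasive breast cancer case counts quoted above.
fit_cases = {"alendronate": 57, "placebo": 46}          # FIT trial
horizon_cases = {"zoledronic_acid": 33, "placebo": 29}  # HORIZON-PFT trial

bisphosphonate_total = fit_cases["alendronate"] + horizon_cases["zoledronic_acid"]
placebo_total = fit_cases["placebo"] + horizon_cases["placebo"]

print(bisphosphonate_total, placebo_total)  # 90 75, matching the pooled figures
```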
The discrepancy between these findings from randomized controlled trials and the results of previous observational studies illustrates "the hazard of drawing conclusions about treatment effects from observational studies (even those that are very well done)" and highlights the value of confirming such findings in randomized controlled trials, Dr. Hue and her associates said.
This study received no industry support. FIT was supported by Merck, and HORIZON-PFT was supported by Novartis; both companies were involved in data collection and management. Dr. Hue reported no potential financial conflicts of interest. One of her associates reported serving as a consultant for Merck Sharp & Dohme.
Three to four years of therapy with the bisphosphonates alendronate and zoledronic acid, taken at doses used to treat osteoporosis, did not decrease the risk of incident breast cancer in postmenopausal women, according to a report published online Aug. 11 in JAMA Internal Medicine.
In a post hoc analysis of data from two large multicenter, randomized, double-blind, controlled clinical trials assessing the effectiveness of alendronate or zoledronic acid for osteoporosis, the development of incident breast cancer was not significantly different between women taking the drugs and women taking placebo, said Trisha F. Hue, Ph.D., of the department of epidemiology and biostatistics at the University of California, San Francisco, and her associates.
Numerous previous observational studies, as well as a meta-analysis pooling the data from several observational studies, had shown that bisphosphonates taken for osteoporosis significantly lowered the risk of breast cancer by 32%-39%. Their findings, however, may have been confounded by indication, because other conditions in postmenopausal women – notably, low levels of estradiol and high levels of sex hormone–binding globulin (SHBG) – are strongly associated both with low bone density, fractures, and bone loss and with a low risk of estrogen receptor–positive breast cancer, Dr. Hue and her associates said. Thus, they suggested, the postmenopausal women who are most likely to be given bisphosphonates for bone health already have a lower risk of breast cancer.
Such confounding can be averted by using a randomized trial design, so Dr. Hue and her colleagues assessed whether bisphosphonates reduced incident breast cancer by analyzing data from the Fracture Intervention Trial (FIT) and the Health Outcomes and Reduced Incidence With Zoledronic Acid Once Yearly-Pivotal Fracture Trial (HORIZON-PFT).
Among the 6,194 FIT participants in this study, there were 103 cases of invasive breast cancer during a mean of 3.8 years of follow-up. The incidence of breast cancer was 1.8% (57 women) among women taking alendronate and 1.5% (46 women) in those taking placebo, a nonsignificant difference in favor of placebo.
Among the 7,580 HORIZON-PFT participants in this study, there were 62 cases of invasive breast cancer during 3 years of follow-up. The incidence of breast cancer was 0.9% (33 women) among women taking zoledronic acid and 0.8% (29 women) among those taking placebo, which was, again, a nonsignificant difference in favor of placebo, the investigators said (JAMA Intern. Med. 2014 Aug. 11 [doi:10.1001/jamainternmed.2014.3634]).
The total of breast cancer cases was relatively small, so Dr. Hue and her associates pooled the data from both studies to increase the sample size. The combined incidence of breast cancer was 1.3% (90 women) taking bisphosphonates and 1.1% (75 women) taking placebo – again, a nonsignificant difference favoring placebo.
The discrepancy between these findings from randomized controlled trials and the results of previous observational studies illustrates "the hazard of drawing conclusions about treatment effects from observational studies (even those that are very well done)" and highlights the value of confirming such findings in randomized controlled trials, Dr. Hue and her associates said.
This study received no industry support. FIT was supported by Merck, and HORIZON-PFT was supported by Novartis; both companies were involved in data collection and management. Dr. Hue reported no potential financial conflicts of interest. One of her associates reported serving as a consultant for Merck Sharp & Dohme.
FROM JAMA INTERNAL MEDICINE
Key clinical point: Alendronate and zoledronic acid do not appear to reduce the risk of breast cancer.
Major finding: The incidence of breast cancer was 1.8% in women taking alendronate and 1.5% in those taking placebo, a nonsignificant difference; the incidence of breast cancer was 0.9% in women taking zoledronic acid and 0.8% in those taking placebo, again, a nonsignificant difference.
Data source: A post hoc analysis of data from two large randomized clinical trials involving 6,194 postmenopausal women with osteoporosis who received either alendronate or placebo and 7,580 who received either zoledronic acid or placebo.
Disclosures: This study received no industry support. FIT was supported by Merck, and HORIZON-PFT was supported by Novartis; both companies were involved in data collection and management. Dr. Hue reported no potential financial conflicts of interest. One of her associates reported serving as a consultant for Merck Sharp & Dohme.
Catheter-directed thrombolysis for leg DVTs held risky
Catheter-directed thrombolysis plus anticoagulation is no more effective than anticoagulation alone in preventing in-hospital death among adults who have lower-extremity proximal deep vein thrombosis, according to a nationwide observational study reported online July 21 in JAMA Internal Medicine.
However, catheter-directed thrombolysis carries higher risks than does anticoagulation alone, particularly serious bleeding risks such as intracranial hemorrhage, and it costs nearly three times as much. These findings highlight the need for randomized trials "to evaluate the magnitude of the effect of catheter-directed thrombolysis on ... mortality, postthrombotic syndrome, and recurrence of DVT [deep vein thrombosis]. In the absence of such data, it may be reasonable to restrict this form of therapy to those patients who have a low bleeding risk and a high risk for postthrombotic syndrome, such as patients with iliofemoral DVT," said Dr. Riyaz Bashir of the division of cardiovascular diseases, Temple University, Philadelphia, and his associates.
Conflicting data from several small studies as to the safety and effectiveness of catheter-directed thrombolysis have led professional societies to devise conflicting recommendations for its use: CHEST (the American College of Chest Physicians) advises against using the procedure, while the American Heart Association recommends it as a first-line therapy for certain patients. "We sought to assess real-world comparative-safety outcomes in patients with proximal and caval DVT who underwent catheter-directed thrombolysis plus anticoagulation with a group treated with anticoagulation alone using risk-adjusted propensity-score matching," the investigators said.
They analyzed data from an Agency for Healthcare Research and Quality administrative database of patient discharges from approximately 1,000 nonfederal acute-care hospitals per year for a 6-year period. They identified 90,618 patients with a discharge diagnosis of proximal DVT; propensity-score matching yielded 3,594 well-matched patients in each study group. In-hospital mortality was not significantly different between patients who had catheter-directed thrombolysis plus anticoagulation (1.2%) and those who had anticoagulation alone (0.9%), Dr. Bashir and his associates said (JAMA Intern. Med. 2014 July 21 [doi:10.1001/jamainternmed.2014.3415]).
However, rates of blood transfusion (11.1% vs. 6.5%), pulmonary embolism (17.9% vs. 11.4%), and intracranial hemorrhage (0.9% vs. 0.3%) were significantly higher with the invasive intervention. And patients in the catheter-directed thrombolysis group required significantly longer hospitalizations (7.2 vs. 5.0 days) and incurred significantly higher hospital expenses ($85,094 vs. $28,164). "It is imperative that the magnitude of benefit from catheter-directed therapy be substantial to justify the increased initial resource utilization and bleeding risks of this therapy," the investigators noted.
Dr. Steven Q. Simpson, FCCP, comments: This observational, real-world study provides more practical information, I believe, than we would obtain with a controlled trial for two drugs/techniques that are already FDA–approved for this purpose. We are able to infer how the drug/technique affects short-term mortality outcomes and financial costs beyond the strict selection criteria and adherence to a tight protocol that a trial requires. However, the study leaves us with the question of how catheter-directed thrombolysis compares with anticoagulation in the more immediately life-threatening setting of massive pulmonary embolism. Additionally, proponents of catheter-directed thrombolysis suggest that it reduces the pain and suffering of postphlebitic syndrome, an outcome not addressed by this study.
Key clinical point: Catheter-directed thrombolysis carries higher risks and may not improve outcomes for proximal DVT patients.
Major finding: In-hospital mortality was not significantly different between patients who had catheter-directed thrombolysis plus anticoagulation (1.2%) and those who had anticoagulation alone (0.9%), but rates of blood transfusion (11.1% vs. 6.5%), pulmonary embolism (17.9% vs. 11.4%), and intracranial hemorrhage (0.9% vs. 0.3%) were significantly higher with the invasive intervention.
Data source: A nationwide propensity-matched analysis comparing the effectiveness and safety of catheter-directed thrombolysis plus anticoagulation versus anticoagulation alone in 3,594 matched pairs of adults hospitalized with lower-extremity proximal DVT during a 6-year period.
Disclosures: This study was supported by Temple University Hospital, Philadelphia. Dr. Bashir reported no financial conflicts of interest; his associates reported ties to Covidien, Health Systems Networks, and Insight Telehealth.
New Creutzfeldt-Jakob disease tests have high sensitivities, specificities
Two minimally invasive assays for detecting prions that are diagnostic of Creutzfeldt-Jakob disease in living patients have shown promise in preliminary studies reported by separate research groups Aug. 6 in the New England Journal of Medicine.
One assay tests epithelial samples obtained from nasal brushings and the other tests urine samples; both can be used in patients suspected of having the sporadic, inherited, or acquired forms of Creutzfeldt-Jakob disease (CJD), such as variant CJD and iatrogenic CJD. Both assays achieved sensitivities and specificities of 93%-100% in very small patient populations in these exploratory studies, exceeding the diagnostic accuracy of cerebrospinal fluid (CSF) testing.
If these findings are replicated in larger studies, both assays have the potential for establishing a definitive diagnosis of CJD in clinical settings. The test that uses nasal brushings may do so earlier in the course of the disease than has been possible previously, at least allowing the possibility of intervention for this invariably fatal neurodegenerative disorder.
In addition, the incidental finding that simple brushing of the olfactory mucosa yields an even higher quantity of prion "seeds" than are found in patients’ CSF suggests that infectivity may be present in the nasal cavity, which has important biosafety implications, the researchers noted.
In the first report, investigators applied real-time quaking-induced conversion (RT-QuIC) technology to olfactory epithelium samples from 31 patients who had rapidly progressive dementia and were referred for evaluation of possible or probable CJD from clinicians across Italy. These patients also underwent CSF sampling at the same time. A total of 12 patients with other neurodegenerative disorders (chiefly Alzheimer’s disease or Parkinson’s disease) and 31 patients at an ear, nose, and throat clinic who had no neurologic disorders served as controls, said Christina D. Orrú, Ph.D., of the Laboratory of Persistent Viral Diseases at the National Institute of Allergy and Infectious Diseases’ Rocky Mountain Laboratories in Hamilton, Mont., and her associates.
Obtaining the nasal brushings was described as a gentle procedure in which unsedated patients were first given a local vasoconstrictor applied with a nasal tampon, and then had a fiberoptic rhinoscope with a disposable sheath inserted into the nasal cavity to locate the olfactory mucosal lining of the nasal vault. A sterile, disposable brush was inserted alongside the rhinoscope, gently rolled on the mucosal surface, withdrawn, and immersed in saline solution in a centrifuge tube for further preparation.
The assays using this material yielded positive results for 15 of the 15 patients who had definite sporadic CJD, 13 of the 14 who had probable sporadic CJD, and 2 of the 2 patients who had inherited CJD. In contrast, all 43 control subjects had negative results. This represents a sensitivity of 97% (95% confidence interval [CI], 82-100) and a specificity of 100% (95% CI, 90-100) in this study population. In comparison, testing of CSF samples from the same patients achieved a sensitivity of only 77% (95% CI, 57-89), Dr. Orrú and her associates said (N. Engl. J. Med. 2014 Aug. 6 [doi:10.1056/NEJMoa1315200]).
In addition, the "substantial" prion seeding found in the olfactory mucosa – greater than that in the CSF – raises the possibility that CJD prions could contaminate patients’ nasal discharges. "Nasal and aerosol-borne transmission of prion diseases have been documented in animal models, but there is no epidemiologic evidence for aerosol-borne transmission of sporadic CJD" to date, the investigators wrote.
It also is possible that medical instruments that come into contact with the nasal mucosa may become contaminated with prions, "which poses the question of whether iatrogenic transmission is possible. Therefore, further study of possible biohazards ... is warranted," they added.
In the second report, Fabio Moda, Ph.D., of the Mitchell Center for Research in Alzheimer’s Disease and Related Brain Disorders at the University of Texas, Houston, and his associates assayed urine samples using an amplification technology to detect minute quantities of the misfolded prion protein in 68 patients with sporadic CJD, 14 with variant CJD, and 156 controls. The control group included 4 patients with genetic prion diseases, 50 with other neurodegenerative disorders (Alzheimer’s disease, Parkinson’s disease, frontotemporal dementia, motor neuron disease, and progressive supranuclear palsy), 50 patients with nondegenerative neurologic disorders (chiefly cerebrovascular disease, multiple sclerosis, epilepsy, brain tumors, autoimmune encephalitis, and meningitis), and 52 healthy adults.
This assay achieved a sensitivity of 93% (95% CI, 66.1-99.8) and a specificity of 100% (95% CI, 98.4-100.0) in distinguishing CJD from other brain disorders and from brain health in this patient population, they said (N. Engl. J. Med. 2014 Aug. 6 [doi:10.1056/NEJMoa1404401]).
The quantities of the prion protein excreted in the urine were extremely small, so the potential for infectivity was not addressed in this study.
Dr. Orrú’s study was funded by the National Institute of Allergy and Infectious Diseases; Fondazione Cariverona; the Italian Ministry of Health; the Creutzfeldt-Jakob Disease Foundation; Programma Master and Back-Percorsi di rientro; and by donations in memory of Jeffrey Smith from Mary Hilderman Smith, Zoe Smith Jaye, and Jenny Smith Unruh. Dr. Moda’s study was funded by the National Institutes of Health, PrioNet Canada, Merck Serono, the Italian Ministry of Health, Associazione Italiana Encefalopatie da Prioni, Ministero dell’Universita e della Ricerca, the Charles S. Britton Fund, the U.K. Department of Health, and the Scottish government.
These findings are encouraging because clinicians and researchers have long sought a sensitive and minimally invasive diagnostic tool specifically targeted to the protein that causes all forms of CJD, said Dr. Colin L. Masters.
It will, however, be important for additional studies to determine more precise estimates of the tests’ specificities – given the wide confidence intervals reported – because the techniques used can give rise to "breakthrough" false-positive results. "Creutzfeldt-Jakob disease is extremely uncommon, and a test without near-perfect specificity may also result in many false positive results if it is applied to patients with a low probability of having the disease. In these circumstances, it is important to highlight the preliminary nature of these studies," Dr. Masters wrote.
Moreover, the finding that abnormal prion protein "seeds" are found in the olfactory mucosa "at concentrations equivalent to those in diseased brain, and several logs greater than those in cerebrospinal fluid," has implications for infection control. "Some experts have [already] recommended appropriate decontamination of surgical instruments that come into contact with the olfactory epithelium of patients at high risk for Creutzfeldt-Jakob disease," he said.
Dr. Masters is with the Florey Institute of Neuroscience and Mental Health at the University of Melbourne. These remarks were taken from his editorial accompanying the two reports on CJD assays (N. Engl. J. Med. 2014 Aug. 6 [doi:10.1056/NEJMe1407419]).
In addition, the incidental finding that simple brushing of the olfactory mucosa yields an even higher quantity of prion "seeds" than are found in patients’ CSF suggests that infectivity may be present in the nasal cavity, which has important biosafety implications, the researchers noted.
In the first report, investigators applied real-time quaking-induced conversion (RT-QuIC) technology to olfactory epithelium samples from 31 patients who had rapidly progressive dementia and were referred for evaluation of possible or probable CJD from clinicians across Italy. These patients also underwent CSF sampling at the same time. A total of 12 patients with other neurodegenerative disorders (chiefly Alzheimer’s disease or Parkinson’s disease) and 31 patients at an ear, nose, and throat clinic who had no neurologic disorders served as controls, said Christina D. Orrú, Ph.D., of the Laboratory of Persistent Viral Diseases at the National Institute of Allergy and Infectious Diseases’ Rocky Mountain Laboratories in Hamilton, Mont., and her associates.
Obtaining the nasal brushings was described as a gentle procedure in which unsedated patients were first given a local vasoconstrictor applied with a nasal tampon, and then had a fiberoptic rhinoscope with a disposable sheath inserted into the nasal cavity to locate the olfactory mucosal lining of the nasal vault. A sterile, disposable brush was inserted alongside the rhinoscope, gently rolled on the mucosal surface, withdrawn, and immersed in saline solution in a centrifuge tube for further preparation.
The assays using this material yielded positive results for 15 of the 15 patients who had definite sporadic CJD, 13 of the 14 who had probable sporadic CJD, and 2 of the 2 patients who had inherited CJD. In contrast, all 43 control subjects had negative results. This represents a sensitivity of 97% (95% confidence interval [CI], 82-100) and a specificity of 100% (95% CI, 90-100) in this study population. In comparison, testing of CSF samples from the same patients achieved a sensitivity of only 77% (95% CI, 57-89), Dr. Orrú and her associates said (N. Engl. J. Med. 2014 Aug. 6 [doi:10.1056/NEJMoa1315200]).
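As a quick arithmetic check, the reported figures follow directly from the raw counts: 30 of 31 CJD patients tested positive and 0 of 43 controls did. The sketch below recomputes sensitivity and specificity in Python using a Wilson score interval; the paper's own interval method may differ slightly (an exact binomial interval is common), so the bounds here are approximate.

```python
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Nasal-brushing assay: 30 of 31 CJD patients positive, 0 of 43 controls positive.
tp, fn = 30, 1
tn, fp = 43, 0

sensitivity = tp / (tp + fn)   # 30/31, rounds to 97%
specificity = tn / (tn + fp)   # 43/43 = 100%
print(f"sensitivity {sensitivity:.0%}, 95% CI ~ {wilson_ci(tp, tp + fn)}")
print(f"specificity {specificity:.0%}, 95% CI ~ {wilson_ci(tn, tn + fp)}")
```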
In addition, the "substantial" prion seeding found in the olfactory mucosa – greater than that in the CSF – raises the possibility that CJD prions could contaminate patients’ nasal discharges. "Nasal and aerosol-borne transmission of prion diseases have been documented in animal models, but there is no epidemiologic evidence for aerosol-borne transmission of sporadic CJD" to date, the investigators wrote.
It also is possible that medical instruments that come into contact with the nasal mucosa may become contaminated with prions, "which poses the question of whether iatrogenic transmission is possible. Therefore, further study of possible biohazards ... is warranted," they added.
In the second report, Fabio Moda, Ph.D., of the Mitchell Center for Research in Alzheimer’s Disease and Related Brain Disorders at the University of Texas, Houston, and his associates assayed urine samples using an extensive amplification technology for the presence of minute quantities of the misfolded prion protein in 68 patients with sporadic CJD, 14 with variant CJD, and 156 controls. The control group included 4 patients with genetic prion diseases, 50 with other neurodegenerative disorders (Alzheimer’s disease, Parkinson’s disease, frontotemporal dementia, motor neuron disease, and progressive supranuclear palsy), 50 patients with nondegenerative neurologic disorders (chiefly cerebrovascular disease, multiple sclerosis, epilepsy, brain tumors, autoimmune encephalitis, and meningitis), and 52 healthy adults.
This assay achieved a sensitivity of 93% (95% CI, 66.1-99.8) and a specificity of 100% (95% CI, 98.4-100.0) in distinguishing CJD from other brain disorders and from brain health in this patient population, they said (N. Engl. J. Med. 2014 Aug. 6 [doi:10.1056/NEJMoa1404401]).
The quantities of the prion protein excreted in the urine were extremely small, so the potential for infectivity was not addressed in this study.
Dr. Orrú’s study was funded by the National Institute of Allergy and Infectious Diseases; Fondazione Cariverona; the Italian Ministry of Health; the Creutzfeldt-Jakob Disease Foundation; Programma Master and Back-Percorsi di rientro; and by donations in memory of Jeffrey Smith from Mary Hilderman Smith, Zoe Smith Jaye, and Jenny Smith Unruh. Dr. Moda’s study was funded by the National Institutes of Health, PrioNet Canada, Merck Serono, the Italian Ministry of Health, Associazione Italiana Encefalopatie da Prioni, Ministero dell’Universita e della Ricerca, the Charles S. Britton Fund, the U.K. Department of Health, and the Scottish government.
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Key clinical point: Two minimally invasive approaches to detecting misfolded prion proteins have high sensitivities and specificities for detecting forms of Creutzfeldt-Jakob disease.
Major finding: The assay of nasal brushings achieved a sensitivity of 97% and a specificity of 100% in identifying CJD in living patients, while the urine assay achieved a sensitivity of 93% and a specificity of 100%.
Data source: A case-control study of the diagnostic accuracy of an assay of nasal brushings, involving 31 patients with suspected CJD and 43 control subjects (Dr. Orrú), and a case-control study of the diagnostic accuracy of an assay of urine samples, involving 82 patients with various forms of CJD and 156 controls (Dr. Moda).
Disclosures: Dr. Orrú’s study was funded by foundations, private donations, and governmental institutes, including the National Institute of Allergy and Infectious Diseases. Dr. Moda’s study was funded by Italian, British, and Scottish government entities, private funds, Merck Serono, the National Institutes of Health, and PrioNet Canada.
Combined hormonal contraception raises VTE risk fivefold
Combined hormonal contraception raises the risk for venous thromboembolism fivefold overall, with certain formulations increasing that risk even further and with thrombophilic genotypes raising it further still, according to a report published online Aug. 5 in Obstetrics & Gynecology.
To assess the association between various types of hormonal contraception and venous thromboembolism (VTE) risk, Swedish investigators performed a nationwide case-control study involving 948 women aged 18-54 years who were treated for deep vein thrombosis of the leg or pelvis, pulmonary embolism, or both conditions during a 6-year period and 902 healthy control subjects from the general population. All the women provided a blood sample for genetic analysis and provided detailed information regarding their contraceptive use, body mass index (BMI), smoking status, and recent history of immobilization, said Dr. Annica Bergendal of the Centre for Pharmacoepidemiology, Karolinska Institutet, Stockholm, and her associates.
A total of 32.8% of the case group reported current use of combined hormonal contraception, compared with only 11.9% of the control group. Overall, current use of combined hormonal contraception was associated with a fivefold increased risk of VTE, with an adjusted OR of 5.3. "Combinations with the progestogen desogestrel yielded the highest risk estimate (adjusted OR, 11.4), followed by drospirenone (adjusted OR, 8.4). The adjusted OR could not be calculated for lynestrenol because there were no exposed women in the control group," the investigators wrote (Obstet. Gynecol. 2014;124:600-9).
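The adjusted OR of 5.3 cannot be reproduced from the summary figures alone, since it accounts for covariates, but a crude (unadjusted) odds ratio can be sketched from the reported exposure frequencies. Note that it comes out lower than the adjusted estimate, which is expected once BMI, smoking, and immobilization are controlled for.

```python
# Crude odds ratio from the reported exposure frequencies, as a rough check.
# (The paper's OR of 5.3 is adjusted, so it differs from this figure.)
cases_exposed = 0.328      # 32.8% of 948 cases used combined hormonal contraception
controls_exposed = 0.119   # 11.9% of 902 controls did

crude_or = (cases_exposed / (1 - cases_exposed)) / (
    controls_exposed / (1 - controls_exposed)
)
print(f"crude OR ~ {crude_or:.1f}")   # roughly 3.6, below the adjusted 5.3
```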
In contrast, progestogen-only contraception did not increase the risk of VTE, except at the highest dose level.
Women who used combination contraception and were carriers of factor V Leiden or of the prothrombin gene mutation were at extremely high risk for VTE, nearly 20-fold for either, compared with nonusers and noncarriers. Women who used combination contraception and were carriers of factor XIII had a much lower, but still elevated, risk for VTE (OR, 2.8).
All of these differences in risk appeared to be independent of BMI, smoking status, and recent history of immobilization, Dr. Bergendal and her associates added.
This study was supported by unrestricted grants from Janssen-Cilag, Novartis, Organon, Schering, Wyeth, AFA Insurance, and the Medical Products Agency. Dr. Bergendal and her associates reported no relevant financial conflicts.
FROM OBSTETRICS & GYNECOLOGY
Key clinical point: Combined hormonal contraception increases the risk of VTE, especially in women with certain thrombophilic genotypes.
Major finding: Current use of combined hormonal contraception was associated with a fivefold increased risk of VTE (adjusted OR, 5.3); combinations with the progestogen desogestrel yielded the highest risk estimate (adjusted OR, 11.4), followed by those containing drospirenone (adjusted OR, 8.4).
Data source: A nationwide Swedish case-control study involving 948 women aged 18-54 years treated for deep vein thrombosis or pulmonary embolism over the course of 6 years and 902 control subjects.
Disclosures: This study was supported by unrestricted grants from Janssen-Cilag, Novartis, Organon, Schering, Wyeth, AFA Insurance, and the Medical Products Agency. Dr. Bergendal and her associates reported no financial conflicts.
Mutations identified for phenytoin-related severe skin reactions
Mutations in the CYP2C genes on chromosome 10 appear to predispose carriers to severe adverse cutaneous reactions to the antiepileptic drug phenytoin, according to a report published Aug. 5 in JAMA.
Researchers performed a genome-wide association study of more than 850,000 single-nucleotide polymorphisms (SNPs), followed by direct sequencing of the genes identified as suspicious, to investigate possible genetic factors associated with severe phenytoin-related cutaneous reactions. Phenytoin – the most frequently used first-line antiepileptic agent in hospitalized patients, which is also effective for other neurologic disorders – is known to cause cutaneous reactions ranging from mild rash to life-threatening eosinophilia, Stevens-Johnson syndrome, and toxic epidermal necrolysis, said Dr. Wen-Hung Chung, of the department of dermatology at Chang Gung Memorial Hospital, Taiwan, and his associates.
The study participants were 168 Taiwanese patients taking the drug who developed cutaneous reactions, including 13 who died from those adverse events; 130 Taiwanese patients who were tolerant of phenytoin; and 412 controls from the general Taiwanese population. The genome-wide association study identified a cluster of 16 SNPs on chromosome 10 that showed some association with the adverse cutaneous reactions, including 8 SNPs on CYP2C genes. Direct sequencing of the CYP2C genes found another two variants that were significantly associated with phenytoin-related severe cutaneous adverse reactions. The findings were replicated in an independent set of 30 cases of phenytoin-related severe cutaneous adverse reaction who were recruited from the Taiwan Severe Cutaneous Adverse Reactions Consortium and compared against the 130 phenytoin-tolerant controls (JAMA 2014;312:525-34).
One of the identified mutations, CYP2C9*3, is known from previous studies to impair clearance of phenytoin from the body; in this study it also was linked to extremely slow metabolism, and thus high plasma concentrations, of the drug. The association between CYP2C9*3 and severe cutaneous adverse reactions was then validated in additional population-based samples from Taiwan, Japan, and Malaysia. A meta-analysis of the data from all the study populations showed that, overall, CYP2C9*3 carriers were at markedly increased risk for severe cutaneous adverse reactions, with an odds ratio of 11.
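To illustrate how cohort-level odds ratios are typically combined in such a meta-analysis, here is an inverse-variance fixed-effect sketch on the log-OR scale. The per-cohort ORs and standard errors below are hypothetical placeholders, chosen only so the pooled value lands near the reported overall OR of 11; they are not the paper's data.

```python
import math

# Inverse-variance fixed-effect pooling of log odds ratios.
# All per-cohort values are hypothetical, for illustration only.
cohorts = [
    ("Taiwan",   math.log(12.0), 0.45),   # (label, log-OR, SE of log-OR)
    ("Japan",    math.log(9.5),  0.60),
    ("Malaysia", math.log(11.5), 0.70),
]

weights = [1 / se**2 for _, _, se in cohorts]
pooled_log_or = sum(w * lor for (_, lor, _), w in zip(cohorts, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"pooled OR ~ {math.exp(pooled_log_or):.1f}")
```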
"We propose that delayed clearance and accumulation of reactive metabolites caused by genetic variants of drug-metabolizing enzymes may be the primary factor, and that immunogenicity, such as the presence of risk HLA alleles and specific T-cell receptor clonotypes in susceptible individuals, may facilitate the development and guide the different types of cutaneous adverse reactions," Dr. Chung and his associates wrote.
However, delayed clearance was also noted in severely affected patients who did not carry the CYP2C9*3 mutation, "suggesting that nongenetic factors such as renal insufficiency, hepatic dysfunction, and concurrent use of substances that compete with or inhibit the enzymes may also affect phenytoin metabolism and contribute to severe cutaneous adverse reactions," they said.
If these findings are corroborated in future studies, it is possible that patients might be tested for these genetic mutations before they take phenytoin, to prevent these severe and sometimes fatal reactions, the investigators added.
This study was supported by the National Science Council, Taiwan; the National Core Facility Program for Biotechnology, Taiwan; and Chang Gung Memorial Hospital, also in Taiwan. Dr. Chung and a coauthor reported having a patent pending for risk assessment of phenytoin-induced adverse reactions.
FROM JAMA
Key clinical point: If the findings are corroborated in future studies, patients might be tested for genetic mutations in CYP2C genes before they take phenytoin.
Major finding: Carriers of the CYP2C9*3 variant were at markedly increased risk for developing severe cutaneous adverse reactions when treated with phenytoin, with an OR of 11.
Data source: A genome-wide association study involving 168 cases of phenytoin-induced severe cutaneous reactions, 130 phenytoin-tolerant controls, and 412 controls from the general Taiwanese population, with additional analyses in further samples from patients in Taiwan, Japan, and Malaysia.
Disclosures: This study was supported by the National Science Council, Taiwan; the National Core Facility Program for Biotechnology, Taiwan; and Chang Gung Memorial Hospital, also in Taiwan. Dr. Chung and a coauthor reported having a patent pending for risk assessment of phenytoin-induced adverse reactions.
Simulation model projects HCV to be rare by 2026
A simulation model that incorporated a one-time universal screening of U.S. adults for hepatitis C virus and made conservative assumptions as to the availability and efficacy of various therapies projected that the infection could become rare by 2026, according to a report published online Aug. 4 in the Annals of Internal Medicine.
Treatment for HCV has evolved rapidly during the past 20 years, and new therapies being developed now show the potential to increase response rates even further, reduce treatment duration, and improve safety profiles. In light of these advances, researchers devised a computerized simulation model to project what the burden of HCV disease has been and will be for the years 2001-2050 in the United States, said Mina Kabiri of the University of Pittsburgh Graduate School of Public Health and her associates.
The investigators analyzed information in national databases to form baseline assumptions about infection rates, the prevalences of each disease stage, treatment responses, and the medical system’s treatment capacity, and they attempted to adjust for the introduction of new therapies over time. They made a conservative estimate that over time, rates of sustained virologic response would average 90% in real-world practice, even though rates as high as 97% have already been reported in clinical trials of currently available treatments. They also incorporated a one-time-only screening of all adults for occult HCV infection, on the assumption that most of those identified would seek treatment, which would head off the development of HCV-related diseases such as hepatocellular carcinoma. Finally, Ms. Kabiri and her associates validated their model by applying it in several published study cohorts and finding that their projections accorded with the actual results of those studies.
The model projected that under base-case assumptions, HCV infection would become rare by 2036, affecting only 1 in 1,500 persons, and that under best-case assumptions HCV could become rare by 2026. "The ideal scenario could reduce the total number of cases of [HCV-related] decompensated cirrhosis by 135,800 cases (46%), cases of hepatocellular carcinoma by 96,300 (40%), liver-related deaths by 161,500 (37%), and liver transplantations by 13,900 (37%) during 2014-2050," the researchers wrote (Ann. Intern. Med. 2014 Aug. 4 [doi:10.7326/M14-0095]).
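The flavor of such a projection can be mimicked with a much cruder annual-step calculation. Every parameter below is an illustrative assumption, not an input of Kabiri and associates' model, which tracked disease stages and treatment capacity in far more detail.

```python
# Toy annual-step sketch: screening plus high cure rates drives prevalence
# toward "rare" (1 in 1,500). All parameter values are assumptions.
treat_rate = 0.12        # assumed fraction of prevalent cases treated each year
svr = 0.90               # sustained virologic response; the paper's conservative figure
incidence = 0.00002      # assumed new infections per adult per year
rare_threshold = 1 / 1500

year, prevalence = 2014, 0.010   # ~1% adult prevalence at baseline (assumption)
while prevalence > rare_threshold:
    prevalence = prevalence * (1 - treat_rate * svr) + incidence
    year += 1

print(year)   # lands in the late 2030s to early 2040s under these assumptions
```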
Increasing the medical system’s capacity to treat HCV would further reduce the burden of disease. "With the launch of all-oral drugs that can simplify treatment, primary care physicians or infectious disease specialists also may take on the role of treating patients with HCV infection, thus alleviating the burden on hepatologists," the investigators noted.
This study was supported by the National Center for Advancing Translational Sciences at the National Institutes of Health and by the University of Pittsburgh Graduate School of Public Health.
It is widely understood that we are in the midst of the peak burden of liver disease attributable to hepatitis C virus and that the advent of highly effective direct-acting antiviral therapies will have a tremendous impact on the course of the disease moving forward. This study reports on a comprehensive computer model that demonstrates the impact that birth cohort screening and treatment with new therapies will have on the prevalence of HCV, accounting for conservative real-world cure rates that can be expected from the evolving treatment options as well as the limitations of resources (fiscal and manpower) that affect their immediate widespread use. The modeling predicts that HCV will become a rare disease in about 20 years and could even achieve this status 10 years earlier if resources to diagnose and treat the disease were limitless (their "ideal" projection).
However, one cannot also help but look at this model and wonder where we might be if HCV had received the attention and funding that HIV has had over the past 25 years. Strong advocacy coupled with significant funding dedicated to prevention, detection, and the development of therapies resulted in significant progress in the fight against HIV, and numerous medications are available today for treatment. Funding for HCV has significantly lagged behind HIV for years Nature 2011;474:S18-9) and not surprisingly, so has progress in fighting the disease. The arrival to market of direct-acting antiviral therapies and the profound impact they will have as demonstrated in this model should be celebrated, but we cannot forget the many patients who have regrettably been unable to benefit from these developments and how much sooner HCV might have become a rare disease if the appropriate attention and funding had been provided over the years.
Dr. Sean Koppe is a gastroenterologist at Northwestern University and at the Jesse Brown VA Medical Center, Chicago. He reported no relevant conflicts of interest.
FROM ANNALS OF INTERNAL MEDICINE
Key clinical point: Diagnosing and treating those infected with HCV now could nearly eliminate the disease by 2026.
Major finding: The model projected that under best-case assumptions, HCV could become rare by 2026, and that by 2050, cases of HCV-related decompensated cirrhosis could be cut by 135,800, cases of hepatocellular carcinoma could be reduced by 96,300, liver-related deaths could be decreased by 161,500, and liver transplantations could be cut by 13,900.
Data source: A computerized simulation model projecting the burden of HCV disease in the U.S. population in 2001-2050.
Disclosures: This study was supported by the National Center for Advancing Translational Sciences at the National Institutes of Health and by the University of Pittsburgh Graduate School of Public Health.