Prevalence of Low Bone Mineral Density in Younger Versus Older Women With Distal Radius Fractures

Many organizations and work groups have issued recommendations regarding which patients should undergo bone densitometry. In 2004, the US Surgeon General recommended bone mineral density (BMD) evaluation for all women over age 65 years and for women and men with fragility fractures.1 The Centers for Medicare & Medicaid Services recommended BMD assessment for estrogen-deficient patients, for patients with vertebral abnormalities or hyperparathyroidism, and for patients receiving either steroid therapy or osteoporosis medications approved by the US Food and Drug Administration.2 The US Preventive Services Task Force and the National Osteoporosis Foundation each recommended screening for all women age 65 years or older and for postmenopausal women (age, 60-64 years) at high risk.3,4 The International Society for Clinical Densitometry (ISCD) recommended screening for all women age 65 years or older, all men age 70 years or older, and high-risk women under age 65 years.5

These current recommendations for BMD evaluation focus on women over age 65 years. More recent studies of postmenopausal women with distal radius fractures (DRFs) have found that both younger women (age, 45-65 years) and older women (age, ≥65 years) can have low BMD and increased risk for hip and spine fracture.6,7 The authors of those studies recommended that all postmenopausal women with DRFs be evaluated for low BMD and that fracture prevention treatment be initiated. Earnshaw and colleagues8 and Oyen and colleagues9 found that men and women (age, ≥50 years) with DRFs had low BMD and elevated 10-year fracture rates. They concluded that all DRF patients age 50 years or older should undergo BMD evaluation and, when indicated, treatment. Other studies have shown low BMD in the contralateral distal radius of patients of all ages who presented with Colles fractures.10,11 These 2 studies did not measure spine or hip BMD.

The literature on BMD of younger women with DRFs is limited, relying solely on data collected for the contralateral distal radius.10,11 The ISCD recommended measuring both hip and spine BMD in premenopausal women and stated that z scores, not t scores, should be used in this population.5 The causes of low BMD in women over age 55 years are primarily nutritional deficiency and normal aging.1 In younger women, low BMD results from secondary causes, such as diet, medications, medical conditions, and endocrine disorders. When a secondary cause of low BMD can be identified and treated, bone loss can be halted and even reversed in younger patients.12-14 Low BMD is therefore more amenable to treatment in younger patients than in postmenopausal women. Younger patients with low BMD also carry a higher lifetime fracture risk because they have more years of life with low BMD; early identification and treatment thus have a greater impact on fracture prevention in these patients.

In the present study, we determined the prevalence of osteoporosis and osteopenia in younger women (age, 35-50 years) with DRFs and compared BMD measurements from younger women (age, 35-50 years) and older women (age, >50 years) with DRFs. The main goal was to determine which patients should be referred for bone densitometry and subsequent treatment.

Patients and Methods

This study received institutional review board approval. During the study period (January 2005–August 2010), we prospectively collected dual-energy x-ray absorptiometry (DXA) scans for 128 women (age, ≥35 years) who presented with DRFs to our level I trauma center. Ages ranged from 35 to 86 years. Data on mechanism of injury, treatment, and body mass index (BMI) were collected. The 128 patients were divided into a younger group (47 women; age range, 35-50 years; mean age, 44 years) and an older group (81 women; age, ≥51 years; mean age, 61 years). Mean BMI was 29.3 in the younger group and 28.8 in the older group (P = .88) (Table).

BMD was measured with a General Electric Lunar Prodigy Advance scanner that was tested annually for accuracy and precision. BMD of the hip and lumbar spine was measured with a 76-kV x-ray source. All DXA scans were analyzed by the same physician. BMD measurements were omitted for patients with a history of lumbar spine or hip fracture.

The two-sample Student t test was used to compare data between the 2 groups. Analysis of variance was used when more than 2 groups were compared. The Spearman rank-order test was used to calculate correlation coefficients for the relationships between age and BMD.
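
To make these steps concrete, the minimal Python sketch below mirrors the same analysis pipeline; the data values and variable names are hypothetical, and scipy is assumed to be available. It is an illustration, not the authors' code.

```python
# Illustrative sketch only; BMD values and group sizes are hypothetical.
import numpy as np
from scipy import stats

# Hypothetical femoral neck BMD values (g/cm^2) for two groups.
younger_bmd = np.array([0.95, 0.88, 0.91, 0.86, 0.93])
older_bmd = np.array([0.82, 0.79, 0.85, 0.76, 0.80])

# Two-sample Student t test comparing the younger and older groups.
t_stat, p_two_group = stats.ttest_ind(younger_bmd, older_bmd)

# One-way ANOVA when more than 2 groups (e.g., age groups) are compared.
oldest_bmd = np.array([0.78, 0.74, 0.81])
f_stat, p_anova = stats.f_oneway(younger_bmd, older_bmd, oldest_bmd)

# Spearman rank-order correlation between age and BMD.
ages = np.array([44, 47, 39, 50, 36, 61, 58, 66, 70, 55, 73, 77, 80])
bmd = np.concatenate([younger_bmd, older_bmd, oldest_bmd])
rho, p_corr = stats.spearmanr(ages, bmd)

print(p_two_group, p_anova, rho)
```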

Results

Mean lumbar spine (L1–L4) BMD was 1.12 g/cm² in the younger group and 1.063 g/cm² in the older group (P = .02); mean t scores were –0.63 and –1.132, respectively (P = .02); and mean z scores were –0.69 and –0.61, respectively (P = .81). Mean femoral neck BMD was 0.91 g/cm² in the younger group and 0.80 g/cm² in the older group (P < .05); mean t scores were –0.87 and –1.65, respectively (P < .01); and mean femoral neck z scores were –0.69 and –0.67, respectively (P = .92).

To further analyze BMD by age, we divided patients into 6 age groups: 35-39, 40-49, 50-59, 60-69, 70-79, and 80-89 years. There were no statistically significant differences in hip z scores among these groups (P = .83) (Figure 1). The Spearman rank-order correlation test showed a moderate inverse correlation between age and femoral neck BMD (R = –0.42) and between age and femoral neck t score (R = –0.43). There were weak inverse correlations between age and spine BMD (R = –0.27) and spine t score (R = –0.31), and essentially no correlation between age and spine z score (R = 0.03). There was also no correlation between age and femoral neck z score (R = –0.04).

According to the World Health Organization (WHO) classification system, 11 (23%) of the 47 women in the younger group were osteopenic, and 8 (17%) were osteoporotic, based on spine BMD. Hip BMD values indicated that 20 patients (43%) were osteopenic, and 3 (6%) were osteoporotic. One patient in the younger group had a hip z score of less than –2, and 14 patients (39%) had a hip z score between –2 and –1. Six patients (18%) had a spine z score of less than –2, and 6 patients (18%) had a spine z score between –2 and –1. Of the 81 older patients, 22 (27%) were osteopenic, and 21 (26%) were osteoporotic, according to spine measurements. The femoral neck data indicated that 39 (48%) of the older patients were osteopenic, and 22 (27%) were osteoporotic.

In both groups, mechanisms of injury were identified. Of the 47 younger patients, 26 fell from standing, 7 fell from a height of more than 6 feet, and 14 were injured in motor vehicle collisions (MVCs). Of the 81 older patients, 2 sustained a direct blow, 64 fell from standing, 4 fell from a height of more than 6 feet, and 11 were injured in MVCs. The differences in z scores based on mechanism of injury were not statistically significant (P = .22) (Figure 2).

Discussion

Several studies have shown that older women with DRFs have low BMD in the spine and femoral neck.8,9 These studies focused on older women who sustained low-energy fractures caused by a fall from standing height. Studies of younger women with DRFs focused on BMD of the contralateral distal radius, not the spine or femoral neck, and those study groups also had low BMD.10,11 Findings from multiple studies have established that patients who are older than 50 years when they sustain a distal radius fragility fracture should be referred for bone densitometry, and there is increasing evidence that younger patients with fragility fractures should undergo this evaluation as well.

The present study was designed to expand the range of patients and mechanisms of injury. Women in this study were 35 years or older. In addition to collecting data from patients injured in a fall from standing, we examined the medical records of women injured in MVCs, in falls from heights of more than 6 feet, and by direct trauma to the wrist. We measured BMD of the spine and femoral neck rather than of the contralateral distal radius.

Several key points should be made about BMD evaluation in younger versus older women. Most organizations caution against using spine BMD in older women. The ISCD, however, recommended measuring both hip and spine BMD; whereas spine BMD can be falsely elevated by osteoarthritis in older patients, spine BMD measurements are accurate in younger patients not affected by osteoarthritis. The ISCD also stipulated that z scores should be used in examining BMD in younger patients. The z score expresses how many standard deviations a patient’s BMD lies from the mean of a reference population matched for age, sex, ethnicity, and weight. The t score, which is useful in evaluating older patients, compares a patient’s BMD with that of an average 30-year-old.12
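
Expressed as formulas (a notational sketch consistent with the definitions above, not notation taken from the study; μ and σ denote the mean and standard deviation of the indicated reference population):

```latex
\[
z = \frac{\mathrm{BMD}_{\text{patient}} - \mu_{\text{age-matched}}}{\sigma_{\text{age-matched}}},
\qquad
t = \frac{\mathrm{BMD}_{\text{patient}} - \mu_{\text{young adult}}}{\sigma_{\text{young adult}}}
\]
```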

According to the WHO classification system (intended for older women), osteopenia is indicated by a t score between –1.0 and –2.5, and osteoporosis is indicated by a t score of less than –2.5. In the present study, about 43% of the younger patients (age, 35-50 years) with DRFs were osteopenic, and 6% of these patients were osteoporotic. In concert with previous studies,9 48% of our older women (age, >50 years) with DRFs were osteopenic, and 27% were osteoporotic. The difference in mean spinal z scores between the younger and older groups was not statistically significant (P = .81).
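
As an illustrative sketch of these thresholds only (a hypothetical helper, not code from the study), a DXA t score can be mapped to the WHO categories as follows:

```python
def classify_who(t_score: float) -> str:
    """Classify BMD by WHO t-score thresholds (intended for postmenopausal women).

    Osteoporosis: t score of -2.5 or lower.
    Osteopenia:   t score between -2.5 and -1.0.
    Normal:       t score of -1.0 or higher.
    """
    if t_score <= -2.5:
        return "osteoporosis"
    if t_score < -1.0:
        return "osteopenia"
    return "normal"


# Example: a femoral neck t score of -1.65 (the older group's mean) is osteopenic.
print(classify_who(-1.65))  # -> "osteopenia"
```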

As mentioned, when examining BMD of younger patients, it is imperative to use z scores rather than t scores. About 18% of our younger patients had a spine z score of less than –2, and 18% had a spine z score between –2 and –1. In our comparison of patients across age groups (range, 35-79 years), there was no statistically significant difference in hip z scores (P = .83). In addition, there was no correlation between age and femoral neck z score (R = –0.04).

Secondary causes of osteoporosis have been documented in 30% of premenopausal women and 55% of men with vertebral fractures.13-15 Primary osteoporosis results from the normal aging process; secondary osteoporosis results from reversible causes, including medications, gastrointestinal disorders, renal disease, endocrine disorders, and sedentary lifestyle.15,16 When a secondary cause of osteoporosis is identified, treatment can be initiated to increase BMD. As younger patients can reverse bone loss and even increase BMD, it is important to identify reversible causes of osteopenia and osteoporosis in this age group. It is well documented that both younger and older patients with DRFs are at increased risk for subsequent fractures.6 Preventing further bone loss at a younger age may drastically decrease lifetime fracture risk.12,17

Most previous studies of BMD in women were limited to patients with DRFs caused by a low-energy mechanism, such as a fall from standing. Current recommendations for BMD testing focus on postmenopausal women who have sustained a fragility or low-energy DRF. When an osteoporotic or osteopenic patient’s distal radius is subjected to a high-energy force, a fracture is likely. Therefore, we expanded our study to include high-energy mechanisms of injury. Our analysis of BMD in patients with DRFs sustained in MVCs indicated that 12% of this group were osteoporotic, and 44% were osteopenic. Of our younger patients with a DRF sustained in an MVC, 43% were osteopenic, and 6% were osteoporotic. Among the 4 mechanisms of injury for DRFs, there was no statistically significant difference in z scores (P = .22) (Figure 2). This provides evidence that a significant proportion of patients with DRFs from both high- and low-energy mechanisms are osteoporotic or osteopenic. Patients with DRFs sustained in MVCs or in falls from heights of more than 6 feet should be referred for BMD evaluation.

Conclusion

A significant proportion of younger patients with DRFs are osteopenic or osteoporotic (43% and 6%, respectively), and their z scores are comparable to those of older patients with DRFs. There was no statistically significant difference in BMD z scores between younger and older patients, and no significant difference in z scores across mechanisms of injury. This is evidence that younger patients with DRFs caused by either a high- or a low-energy mechanism of injury should be referred for DXA-based BMD evaluation. If osteoporosis or osteopenia can be diagnosed at an earlier age, and if these patients can be properly treated, subsequent fractures could be prevented. The present study provides evidence supporting a simplification of the current recommendations for BMD evaluation: All women with DRFs should undergo bone densitometry.

References

1.    US Department of Health and Human Services. Bone Health and Osteoporosis: A Report of the Surgeon General. Rockville, MD: US Dept of Health and Human Services, Public Health Service, Office of the Surgeon General; 2004. http://www.ncbi.nlm.nih.gov/books/NBK45513/pdf/Bookshelf_NBK45513.pdf. Accessed November 3, 2015.

2.    Bone mass measurement (bone density). Medicare website. https://www.medicare.gov/coverage/bone-density.html. Accessed November 3, 2015.

3.    Final update summary: osteoporosis: screening. US Preventive Services Task Force website. http://www.uspreventiveservicestaskforce.org/Page/Document/UpdateSummaryFinal/osteoporosis-screening. Updated July 2015. Accessed November 3, 2015.

4.    National Osteoporosis Foundation. Clinician’s Guide to Prevention and Treatment of Osteoporosis. Washington, DC: National Osteoporosis Foundation; 2010. http://nof.org/files/nof/public/content/file/344/upload/159.pdf. Accessed November 3, 2015.

5.    Khan AA, Bachrach L, Brown JP, et al. Canadian Panel of International Society of Clinical Densitometry. Standards and guidelines for performing central dual-energy x-ray absorptiometry in premenopausal women, men, and children. J Clin Densitom. 2004;7(1):51-64.

6.    Barrett-Connor E, Sajjan SG, Siris ES, Miller PD, Chen YT, Markson LE. Wrist fracture as a predictor of future fractures in younger versus older postmenopausal women: results from the National Osteoporosis Risk Assessment (NORA). Osteoporos Int. 2008;19(5):607-613.

7.    Lauritzen JB, Schwarz P, Lund B, McNair P, Transbøl I. Changing incidence and residual lifetime risk of common osteoporosis-related fractures. Osteoporos Int. 1993;3(3):127-132.

8.    Earnshaw SA, Cawte SA, Worley A, Hosking DJ. Colles’ fracture of the wrist as an indicator of underlying osteoporosis in postmenopausal women: a prospective study of bone mineral density and bone turnover rate. Osteoporos Int. 1998;8(1):53-60.

9.    Oyen J, Brudvik C, Gjesdal CG, Tell GS, Lie SA, Hove LM. Osteoporosis as a risk factor for distal radius fractures: a case–control study. J Bone Joint Surg Am. 2011;93(4):348-356.

10. Wigderowitz CA, Cunningham T, Rowley DI, Mole PA, Paterson CR. Peripheral bone mineral density in patients with distal radial fractures. J Bone Joint Surg Br. 2003;85(3):423-425.

11. Wigderowitz CA, Rowley DI, Mole PA, Paterson CR, Abel EW. Bone mineral density of the radius in patients with Colles’ fracture. J Bone Joint Surg Br. 2000;82(1):87-89.

12. Khan A, Syed Z. Bone mineral density assessment in premenopausal women. Womens Health. 2006;2(4):639-645.

13. Fitzpatrick LA. Secondary causes of osteoporosis. Mayo Clin Proc. 2002;77(5):453-468.

14. Hudec SM, Camacho PM. Secondary causes of osteoporosis. Endocr Pract. 2013;19(1):120-128.

15. Scane AC, Sutcliffe AM, Francis RM. Osteoporosis in men. Baillieres Clin Rheumatol. 1993;7(3):589-601.

16. Binkley N, Bilezikian JP, Kendler DL, Leib ES, Lewiecki EM, Petak SM. Summary of the International Society for Clinical Densitometry 2005 Position Development Conference. J Bone Miner Res. 2007;22(5):643-645.

17. Kelepouris N, Harper KD, Gannon F, Kaplan FS, Haddad JG. Severe osteoporosis in men. Ann Intern Med. 1995;123(6):452-460.

Author and Disclosure Information

Patrick A. Massey, MD, Jeremy R. James, MD, Joseph Bonvillain, MD, Bradley G. Nelson, MD, Stacey R. Massey, MD, and Anne Hollister, MD

Authors’ Disclosure Statement: The authors report no actual or potential conflict of interest in relation to this article.

Analysis of Predictors and Outcomes of Allogeneic Blood Transfusion After Shoulder Arthroplasty

In shoulder arthroplasty, it is not uncommon for patients to receive postoperative blood transfusions; rates range from 7% to 43%.1-6 Allogeneic blood transfusions (ABTs) are costly and not entirely free of risks.7 The risk for infection has decreased because of improved screening and risk reduction strategies, but there are still significant risks associated with ABTs, such as clerical errors, acute and delayed hemolytic reactions, graft-versus-host reactions, transfusion-related acute lung injury, and anaphylaxis.8-10 As use of shoulder arthroplasty continues to increase, the importance of minimizing unnecessary transfusions is growing as well.7

Predictive factors for ABT have been explored in other orthopedic settings, yet little has been done in shoulder arthroplasty.1-6,11-15 Previous shoulder arthroplasty studies have shown that a low preoperative hemoglobin (Hb) level is an independent risk factor for postoperative blood transfusion. However, there is debate over the significance of other variables, such as procedure type, age, sex, and medical comorbidities. Further, prior studies were limited by relatively small samples from single institutions; the largest series included fewer than 600 patients.1-6

We conducted a study to determine predictors of ABT in a large cohort of patients admitted to US hospitals for shoulder arthroplasty. We also wanted to evaluate the effect of ABT on postoperative outcomes, including inpatient mortality, adverse events, prolonged hospital stay, and nonroutine discharge. Our null hypothesis was that, after accounting for confounding variables, there would be no difference in risk factors between shoulder arthroplasty patients who required ABT and those who did not.

Materials and Methods

This study was exempt from institutional review board approval, as all data were appropriately deidentified before use in this project. We used the Nationwide Inpatient Sample (NIS) to retrospectively study the period 2002–2011, from which all demographic, clinical, and resource use data were derived.16 The NIS, an annual survey conducted by the Agency for Healthcare Research and Quality (AHRQ) since 1988, is the largest all-payer inpatient care database in the United States. Yearly samples contain discharge data from about 8 million hospital stays at more than 1000 hospitals across 46 states, approximating a 20% random sample of all hospital discharges at participating institutions.17 These data are then weighted to generate statistically valid national estimates.

The NIS database uses International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes to record up to 15 medical diagnoses per discharge through 2008 and up to 25 diagnoses and 15 procedures thereafter. In addition, the database includes information on patient and hospital characteristics as well as inpatient outcomes such as length of stay, total hospitalization charges, and discharge disposition.18,19 Given its large sample size and data volume, the NIS is a powerful tool for analyzing data associated with a multitude of medical diagnoses and procedures.20

We used the NIS database to study a population of 422,371 patients (age, >18 years) who underwent total shoulder arthroplasty (TSA) or hemiarthroplasty (HSA) between 2002 and 2011. ICD-9-CM procedure codes for TSA (81.80, 81.88) and HSA (81.81) were used to identify this population. We also analyzed data for reverse TSA for the year 2011. Then we divided our target population into 2 different cohorts: patients who did not receive any blood transfusion products and patients who received a transfusion of allogeneic packed cells (ICD-9-CM code 99.04 was used to identify the latter cohort).
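
To illustrate how such a cohort might be assembled, the Python sketch below applies the same ICD-9-CM code logic to a hypothetical NIS extract. The file name, column layout ("AGE", "PR1"–"PR15"), and dotted code format are assumptions for illustration, not details reported by the authors.

```python
# Illustrative sketch only; the NIS file name, column names, and code format are assumed.
import pandas as pd

TSA_CODES = {"81.80", "81.88"}   # total and reverse total shoulder arthroplasty
HSA_CODES = {"81.81"}            # hemiarthroplasty
ABT_CODE = "99.04"               # transfusion of allogeneic packed cells

nis = pd.read_csv("nis_2002_2011.csv", dtype=str)            # hypothetical extract
proc_cols = [c for c in nis.columns if c.startswith("PR")]    # procedure code fields

def has_any(row, codes):
    """Return True if any procedure field on this discharge matches one of the codes."""
    return any(row[c] in codes for c in proc_cols)

adults = nis[pd.to_numeric(nis["AGE"], errors="coerce") > 18]
is_shoulder = adults.apply(lambda r: has_any(r, TSA_CODES | HSA_CODES), axis=1)
cohort = adults[is_shoulder].copy()

cohort["procedure"] = cohort.apply(lambda r: "TSA" if has_any(r, TSA_CODES) else "HSA", axis=1)
cohort["transfused"] = cohort.apply(lambda r: has_any(r, {ABT_CODE}), axis=1)

# Proportion of each arthroplasty cohort coded as receiving allogeneic packed cells.
print(cohort.groupby("procedure")["transfused"].mean())
```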

In this study, normal distribution of the dataset was assumed, given the large sample size. The 2 cohorts were evaluated through bivariate analysis using the Pearson χ2 test for categorical data and the independent-samples t test for continuous data. The extent to which diagnosis, age, race, sex, and medical comorbidities were predictive of blood transfusion after TSA or HSA was evaluated through multivariate binary logistic regression analysis. Statistical significance was set at P < .05. All statistical analyses and data modeling were performed with SPSS Version 22.0.
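
The following hedged Python sketch mirrors this analysis plan on simulated toy data; the variable names and effect sizes are invented for illustration, and scipy and statsmodels are assumed to be available in place of SPSS.

```python
# Toy illustration of the statistical plan described above; all data are simulated.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age": rng.normal(70, 10, n).round(),
    "female": rng.integers(0, 2, n),
    "anemia": rng.integers(0, 2, n),
})
# Simulated transfusion risk that rises with age, female sex, and anemia.
logit_p = -6.0 + 0.06 * df["age"] + 0.4 * df["female"] + 1.2 * df["anemia"]
df["transfused"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

# Bivariate analysis: Pearson chi-square for categorical data,
# independent-samples t test for continuous data.
chi2, p_cat, _, _ = stats.chi2_contingency(pd.crosstab(df["anemia"], df["transfused"]))
t_stat, p_cont = stats.ttest_ind(df.loc[df["transfused"] == 1, "age"],
                                 df.loc[df["transfused"] == 0, "age"])

# Multivariate binary logistic regression; exponentiated coefficients are odds ratios.
model = smf.logit("transfused ~ age + female + anemia", data=df).fit(disp=False)
odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())  # 95% confidence intervals for the odds ratios
print(odds_ratios, p_cat, p_cont)
```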

Results

Using the NIS database, we stratified an estimated 422,371 patients who presented for shoulder arthroplasty between January 1, 2002, and December 31, 2011, into a TSA cohort (59.3%) and an HSA cohort (40.7%). Eight percent (33,889) of all patients received an ABT; the proportion of patients who received ABT was higher (P < .001) for the HSA cohort (55.6%) than the TSA cohort (39.4%). Further, the rate of ABT after shoulder arthroplasty trended upward over the study period (Figure).

Demographically, patients who received ABT tended (P < .001) to be older (74 ± 11 years vs 68 ± 11 years) and of a minority race (black or Hispanic) and to fall in either the lowest range of median household income (21.5% vs 20.7%; ≤$38,999) or the highest (27.3% vs 25.4%; ≥$63,000). Shoulder arthroplasty with ABT occurred more often (P < .001) at hospitals that were urban (13.3% vs 11.3%), medium in size (27.3% vs 23.4%), and nonteaching (56.2% vs 54.3%). In addition, ABT was used more often (P < .001) in patients with a primary diagnosis of fracture (43.1% vs 14.3%) or fracture nonunion (4.4% vs 2.1%). Patients who received ABT also had a longer (P < .001) hospital stay (5.0 ± 4.3 days vs 2.5 ± 2.2 days). Table 1 summarizes these findings.

The 2 cohorts were then analyzed for presence of medical comorbidities (Table 2). Patients who required ABT during shoulder arthroplasty had a significantly (P < .001) higher prevalence of congestive heart failure, chronic lung disease, hypertension, uncomplicated and complicated diabetes mellitus, liver disease, renal failure, fluid and electrolyte disorders, pulmonary circulatory disease, weight loss, coagulopathy, and deficiency anemia.

In multivariate regression modeling (Table 3), demographic predictors of ABT (P < .001) included increasing age (odds ratio [OR], 1.03 per year; 95% confidence interval [95% CI], 1.03-1.03), female sex (OR, 1.55; 95% CI, 1.51-1.60), and minority race (black or Hispanic). Odds of requiring ABT were higher for patients with Medicare (OR, 1.25; 95% CI, 1.20-1.30) and patients with Medicaid (OR, 1.63; 95% CI, 1.51-1.77) than for patients with private insurance.

ABT was more likely to be required (P < .001) in patients with a primary diagnosis of fracture (OR, 4.49; 95% CI, 4.34-4.65), avascular necrosis (OR, 2.06; 95% CI, 1.91-2.22), rheumatoid arthritis (OR, 1.91; 95% CI, 1.72-2.12), fracture nonunion (OR, 3.55; 95% CI, 3.33-3.79), or rotator cuff arthropathy (OR, 1.47; 95% CI, 1.41-1.54) than for patients with osteoarthritis. Moreover, compared with patients having HSA, patients having TSA were more likely to require ABT (OR, 1.20; 95% CI, 1.17-1.24). According to the analysis restricted to the year 2011, compared with patients having anatomical TSAs, patients having reverse TSAs were 1.6 times more likely (P < .001) to require ABT (OR, 1.63; 95% CI, 1.50-1.79).

With the exception of obesity, all comorbidities were significant (P < .001) independent predictors of ABT after shoulder arthroplasty, including deficiency anemia (OR, 3.42; 95% CI, 3.32-3.52), coagulopathy (OR, 2.54; 95% CI, 2.36-2.73), fluid and electrolyte disorders (OR, 1.91; 95% CI, 1.84-1.97), and weight loss (OR, 1.78; 95% CI, 1.58-2.00).

Patients who received ABT were more likely to experience adverse events (OR, 1.74; 95% CI, 1.68-1.81), prolonged hospital stay (OR, 3.21; 95% CI, 3.12-3.30), and nonroutine discharge (OR, 1.77; 95% CI, 1.72-1.82) (Table 4). There was no difference in mortality between the 2 cohorts.

Discussion

There is an abundance of literature on blood transfusions in hip and knee arthroplasty, but there are few articles on ABT in shoulder arthroplasty, and all of them report data from single institutions with relatively low caseloads.1,2,11-13,15,21 In the present study, we investigated ABT in shoulder arthroplasty using a multi-institutional database with a caseload of more than 400,000 patients. Given the rapidly increasing rates of shoulder arthroplasty, it is important to further examine this issue to minimize unnecessary blood transfusion and its associated risks and costs.7

We found that 8% of patients who had shoulder arthroplasty received ABT, which is consistent with previously reported transfusion rates (range, 7%-43%).1-6 Rates of ABT after shoulder arthroplasty have continued to rise. The exception, a decrease during 2010, may be explained by increased efforts to follow transfusion indication guidelines more strictly and thereby reduce the number of potentially unnecessary ABTs.21-24 Our study also identified numerous significant independent predictors of ABT in shoulder arthroplasty: age, sex, race, insurance status, procedure type, primary diagnosis, and multiple medical comorbidities.

Demographics

According to our analysis, more than 80% of patients who received ABT were over age 65 years, which aligns with what several other studies have demonstrated: Increasing age is a predictor of ABT, even after accounting for the higher rates of comorbidities and lower preoperative Hb levels in this population.1,2,4,5,25-27 Consistent with previous work, female sex was predictive of ABT.2,5 It has been suggested that women are predisposed to ABT because of lower preoperative Hb levels and smaller blood mass.2,5,28 Interestingly, our study showed a higher likelihood of ABT in both black and Hispanic populations. Further, patients with Medicare or Medicaid were more likely to receive ABT.

Primary Diagnosis

Although patients with a primary diagnosis of osteoarthritis constitute the majority of patients who undergo shoulder arthroplasty, our analysis showed that patients with a diagnosis of proximal humerus fracture were more likely to receive ABT. This finding is reasonable given studies showing the high prevalence of proximal humerus fractures in elderly women.29,30 Similarly, patients with a humerus fracture nonunion were more likely to receive a blood transfusion, which is unsurprising given the increased complexity of arthroplasty in this predominantly elderly population.31 Interestingly, compared with patients with osteoarthritis, patients with any of the other primary diagnoses were more likely to require a transfusion: proximal humerus fracture carried the highest odds, followed by humerus fracture nonunion, avascular necrosis, rheumatoid arthritis, and rotator cuff arthropathy.


Type of Arthroplasty

Bivariate analysis revealed that 55.6% of the patients who received ABT underwent HSA; the other 44.4% underwent TSA. The effect of primary diagnosis on procedure choice likely played a role in this finding. HSA indications include humerus fracture, which has been associated with increased ABT, whereas patients with osteoarthritis requiring TSA are significantly less likely to require ABT, as reflected in this analysis.7,32-34 Previous studies have failed to show a difference in blood transfusion rates between TSA and HSA.2,4-6,35 Conversely, with confounding factors controlled for, our multivariate logistic regression analysis showed that patients undergoing TSA were 1.2 times more likely than patients undergoing HSA to require ABT, which could be explained by the increased operative time, case complexity, and blood loss that may be associated with glenoid exposure.36,37 With analysis restricted to the year 2011, patients with reverse TSAs were 1.6 times more likely than patients with anatomical TSAs to receive a blood transfusion (OR, 1.63; 95% CI, 1.50-1.79). Although this finding differs from what was previously reported, it is plausible given that patients having reverse TSA are often older and may have a more significant comorbidity profile.3 In addition, reverse TSA often carries the added technical demands of "salvage" surgery for challenging indications such as cuff arthropathy and failed previous arthroplasty.38-41

Medical Comorbidities

Patients who received ABT were more likely to present with numerous medical comorbidities. Previous studies have indicated that the presence of multiple medical comorbidities significantly increases blood transfusion rates, possibly through synergistic effects.42 Prior studies of blood transfusion in shoulder arthroplasty have consistently concluded that lower preoperative Hb is an independent predictor.1-6 Schumer and colleagues4 reported a 4-fold increase in likelihood of blood transfusion in patients with a preoperative Hb level less than 12.5 g/dL. In addition, Millett and colleagues6 showed a 20-fold increase in likelihood of transfusion in patients with a preoperative Hb level less than 11.0 g/dL compared with patients with a level higher than 13.0 g/dL; patients with an Hb level between 11.0 and 13.0 g/dL showed a 5-fold increase in likelihood of transfusion.6 Of note, correction of preoperative anemia through various pharmacologic methods (eg, erythropoietin, intravenous iron supplementation) has been shown to decrease postoperative transfusion rates.43,44 Although we could not include preoperative Hb levels in the present study, given the inherent limitations of NIS data, our multivariate analysis showed that preoperative deficiency anemia and coagulopathy were the most significant comorbidity predictors of ABT.
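
To make the cited thresholds concrete, the small sketch below categorizes a preoperative Hb value using the cut points reported by Millett and colleagues6 (11.0 and 13.0 g/dL). The function, its labels, and its use as a screening aid are hypothetical illustrations, not part of this study's methods or a validated tool.

    # Hypothetical illustration: categorizing preoperative Hb by the cut points
    # reported by Millett et al (11.0 and 13.0 g/dL); not a validated instrument.
    def hb_risk_category(hb_g_dl: float) -> str:
        if hb_g_dl < 11.0:
            return "markedly increased transfusion risk (~20-fold reported)"
        if hb_g_dl <= 13.0:
            return "increased transfusion risk (~5-fold reported)"
        return "reference range for transfusion risk"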

In addition, the multivariate logistic regression model showed that both cardiac disease and diabetes were independent predictors of ABT, confirming data reported by Ahmadi and colleagues.1 Although not as well characterized in other studies, multiple other medical comorbidities, including fluid and electrolyte abnormalities, weight loss, liver disease, renal failure, and chronic lung disease, had significant predictive value in the current analysis. By contrast, obesity significantly decreased the odds of ABT, likely because of the higher baseline blood volume of obese patients.

Patient Outcomes

Patients who underwent shoulder arthroplasty with ABT were more likely to experience adverse events or a prolonged hospital stay and were more often discharged to a nursing home or an extended-care facility. In this population, however, deaths did not occur at a significantly higher rate, a finding similar to that reported for patients who underwent hip or knee arthroplasty with blood transfusions.45

Little has been done to investigate the effect of pharmacologic agents on the need for perioperative ABT in orthopedic shoulder procedures. Aprotinin, tranexamic acid, epoetin-α, and aminocaproic acid have all been effective in limiting ABT during the perioperative period in various orthopedic hip, knee, and spine procedures.9,46-53 Given the increased morbidity associated with ABT, it may be beneficial to use similar methods to limit blood loss in high-risk patients undergoing shoulder arthroplasty.

Study Limitations

NIS has intrinsic limitations. Given its massive volume, it is subject to errors in both data entry and clinical coding. Moreover, the database lacks data that would have been useful in our study: preoperative Hb levels, intraoperative course, number of units transfused, total blood loss, use of blood conservation techniques, transfusion protocols, and severity of comorbidities. Reverse TSA was given a unique ICD-9-CM code in October 2010, so 2011 was the only year for which we could examine the relationship between reverse TSA and transfusions. Further, our analysis could not account for medications, such as chronic anticoagulants or postoperative prophylaxis, that have been shown to significantly affect blood transfusion rates.54 Nevertheless, the NIS database offers clear advantages, as has been demonstrated in studies across many areas of medicine.


Conclusion

Our results confirmed previous findings and identified new predictors of ABT in shoulder arthroplasty in a large cohort. We examined demographics and perioperative complications while identifying predictors of ABT use. Patients who received ABT were more likely to be older, female, nonwhite, and covered by Medicare or Medicaid, and many had a primary diagnosis of proximal humerus fracture. The ABT cohort also had numerous medical comorbidities, including deficiency anemia and coagulopathy. Identifying this patient population is a prerequisite to educating patients and minimizing unnecessary risks and costs.

Using NIS data on a population of 422,371 patients who underwent shoulder arthroplasty, we identified the 5 strongest predictors of ABT: fracture, fracture nonunion, deficiency anemia, coagulopathy, and avascular necrosis. Of these variables, deficiency anemia may be the most amenable to treatment; therefore, there may be benefit in delaying elective shoulder arthroplasty in these patients until the anemia has been addressed. Given these findings, it is important to identify at-risk patients before surgery, with the intent to provide education and minimize risk.

References

1.    Ahmadi S, Lawrence TM, Sahota S, et al. The incidence and risk factors for blood transfusion in revision shoulder arthroplasty: our institution’s experience and review of the literature. J Shoulder Elbow Surg. 2014;23(1):43-48.

2.    Sperling JW, Duncan SF, Cofield RH, Schleck CD, Harmsen WS. Incidence and risk factors for blood transfusion in shoulder arthroplasty. J Shoulder Elbow Surg. 2005;14(6):599-601.

3.    Hardy JC, Hung M, Snow BJ, et al. Blood transfusion associated with shoulder arthroplasty. J Shoulder Elbow Surg. 2013;22(2):233-239.

4.    Schumer RA, Chae JS, Markert RJ, Sprott D, Crosby LA. Predicting transfusion in shoulder arthroplasty. J Shoulder Elbow Surg. 2010;19(1):91-96.

5.    Gruson KI, Accousti KJ, Parsons BO, Pillai G, Flatow EL. Transfusion after shoulder arthroplasty: an analysis of rates and risk factors. J Shoulder Elbow Surg. 2009;18(2):225-230.

6.    Millett PJ, Porramatikul M, Chen N, Zurakowski D, Warner JJ. Analysis of transfusion predictors in shoulder arthroplasty. J Bone Joint Surg Am. 2006;88(6):1223-1230.

7.    Kim SH, Wise BL, Zhang Y, Szabo RM. Increasing incidence of shoulder arthroplasty in the United States. J Bone Joint Surg Am. 2011;93(24):2249-2254.

8.    Ceccherini-Nelli L, Filipponi F, Mosca F, Campa M. The risk of contracting an infectious disease from blood transfusion. Transplantation Proc. 2004;36(3):680-682.

9.    Friedman R, Homering M, Holberg G, Berkowitz SD. Allogeneic blood transfusions and postoperative infections after total hip or knee arthroplasty. J Bone Joint Surg Am. 2014;96(4):272-278.

10. Hatzidakis AM, Mendlick RM, McKillip T, Reddy RL, Garvin KL. Preoperative autologous donation for total joint arthroplasty. An analysis of risk factors for allogenic transfusion. J Bone Joint Surg Am. 2000;82(1):89-100.

11. Park JH, Rasouli MR, Mortazavi SM, Tokarski AT, Maltenfort MG, Parvizi J. Predictors of perioperative blood loss in total joint arthroplasty. J Bone Joint Surg Am. 2013;95(19):1777-1783.

12. Aderinto J, Brenkel IJ. Pre-operative predictors of the requirement for blood transfusion following total hip replacement. J Bone Joint Surg Br. 2004;86(7):970-973.

13. Browne JA, Adib F, Brown TE, Novicoff WM. Transfusion rates are increasing following total hip arthroplasty: risk factors and outcomes. J Arthroplasty. 2013;28(8 suppl):34-37.

14. Yoshihara H, Yoneoka D. Predictors of allogeneic blood transfusion in spinal fusion in the United States, 2004–2009. Spine. 2014;39(4):304-310.

15. Noticewala MS, Nyce JD, Wang W, Geller JA, Macaulay W. Predicting need for allogeneic transfusion after total knee arthroplasty. J Arthroplasty. 2012;27(6):961-967.

16. Griffin JW, Novicoff WM, Browne JA, Brockmeier SF. Obstructive sleep apnea as a risk factor after shoulder arthroplasty. J Shoulder Elbow Surg. 2013;22(12):e6-e9.

17. Maynard C, Sales AE. Changes in the use of coronary artery revascularization procedures in the Department of Veterans Affairs, the National Hospital Discharge Survey, and the Nationwide Inpatient Sample, 1991–1999. BMC Health Serv Res. 2003;3(1):12.

18. Pereira BM, Chan PH, Weinstein PR, Fishman RA. Cerebral protection during reperfusion with superoxide dismutase in focal cerebral ischemia. Adv Neurol. 1990;52:97-103.

19. Hambright D, Henderson RA, Cook C, Worrell T, Moorman CT, Bolognesi MP. A comparison of perioperative outcomes in patients with and without rheumatoid arthritis after receiving a total shoulder replacement arthroplasty. J Shoulder Elbow Surg. 2011;20(1):77-85.

20. Ponce BA, Menendez ME, Oladeji LO, Soldado F. Diabetes as a risk factor for poorer early postoperative outcomes after shoulder arthroplasty. J Shoulder Elbow Surg. 2014;23(5):671-678.

21. Pierson JL, Hannon TJ, Earles DR. A blood-conservation algorithm to reduce blood transfusions after total hip and knee arthroplasty. J Bone Joint Surg Am. 2004;86(7):1512-1518.

22. Martinez V, Monsaingeon-Lion A, Cherif K, Judet T, Chauvin M, Fletcher D. Transfusion strategy for primary knee and hip arthroplasty: impact of an algorithm to lower transfusion rates and hospital costs. Br J Anaesth. 2007;99(6):794-800.

23. Helm AT, Karski MT, Parsons SJ, Sampath JS, Bale RS. A strategy for reducing blood-transfusion requirements in elective orthopaedic surgery. Audit of an algorithm for arthroplasty of the lower limb. J Bone Joint Surg Br. 2003;85(4):484-489.

24. Watts CD, Pagnano MW. Minimising blood loss and transfusion in contemporary hip and knee arthroplasty. J Bone Joint Surg Br. 2012;94(11 suppl A):8-10.

25. Guralnik JM, Eisenstaedt RS, Ferrucci L, Klein HG, Woodman RC. Prevalence of anemia in persons 65 years and older in the United States: evidence for a high rate of unexplained anemia. Blood. 2004;104(8):2263-2268.

26. Rogers MA, Blumberg N, Heal JM, Langa KM. Utilization of blood transfusion among older adults in the United States. Transfusion. 2011;51(4):710-718.

27. Cobain TJ, Vamvakas EC, Wells A, Titlestad K. A survey of the demographics of blood use. Transfusion Med. 2007;17(1):1-15.

28. Fosco M, Di Fiore M. Factors predicting blood transfusion in different surgical procedures for degenerative spine disease. Eur Rev Med Pharmacol Sci. 2012;16(13):1853-1858.

29. Handoll HH, Ollivere BJ, Rollins KE. Interventions for treating proximal humeral fractures in adults. Cochrane Database Syst Rev. 2012;12:CD000434.

30. Neuhaus V, Swellengrebel CH, Bossen JK, Ring D. What are the factors influencing outcome among patients admitted to a hospital with a proximal humeral fracture? Clin Orthop Relat Res. 2013;471(5):1698-1706.

31. Volgas DA, Stannard JP, Alonso JE. Nonunions of the humerus. Clin Orthop Relat Res. 2004;(419):46-50.

32. Chambers L, Dines JS, Lorich DG, Dines DM. Hemiarthroplasty for proximal humerus fractures. Curr Rev Musculoskeletal Med. 2013;6(1):57-62.

33. Jain NB, Hocker S, Pietrobon R, Guller U, Bathia N, Higgins LD. Total arthroplasty versus hemiarthroplasty for glenohumeral osteoarthritis: role of provider volume. J Shoulder Elbow Surg. 2005;14(4):361-367.

34. Izquierdo R, Voloshin I, Edwards S, et al. Treatment of glenohumeral osteoarthritis. J Am Acad Orthop Surg. 2010;18(6):375-382.

35. Shields E, Iannuzzi JC, Thorsness R, Noyes K, Voloshin I. Perioperative complications after hemiarthroplasty and total shoulder arthroplasty are equivalent. J Shoulder Elbow Surg. 2014;23(10):1449-1453.

36. Gartsman GM, Roddey TS, Hammerman SM. Shoulder arthroplasty with or without resurfacing of the glenoid in patients who have osteoarthritis. J Bone Joint Surg Am. 2000;82(1):26-34.

37. Singh A, Yian EH, Dillon MT, Takayanagi M, Burke MF, Navarro RA. The effect of surgeon and hospital volume on shoulder arthroplasty perioperative quality metrics. J Shoulder Elbow Surg. 2014;23(8):1187-1194.

38. Groh GI, Groh GM. Complications rates, reoperation rates, and the learning curve in reverse shoulder arthroplasty. J Shoulder Elbow Surg. 2014;23(3):388-394.

39. Boileau P, Gonzalez JF, Chuinard C, Bicknell R, Walch G. Reverse total shoulder arthroplasty after failed rotator cuff surgery. J Shoulder Elbow Surg. 2009;18(4):600-606.

40. Boileau P, Watkinson D, Hatzidakis AM, Hovorka I. Neer Award 2005: the Grammont reverse shoulder prosthesis: results in cuff tear arthritis, fracture sequelae, and revision arthroplasty. J Shoulder Elbow Surg. 2006;15(5):527-540.

41. Boileau P, Watkinson DJ, Hatzidakis AM, Balg F. Grammont reverse prosthesis: design, rationale, and biomechanics. J Shoulder Elbow Surg. 2005;14(1 suppl S):147S-161S.

42. Pola E, Papaleo P, Santoliquido A, Gasparini G, Aulisa L, De Santis E. Clinical factors associated with an increased risk of perioperative blood transfusion in nonanemic patients undergoing total hip arthroplasty. J Bone Joint Surg Am. 2004;86(1):57-61.

43. Lin DM, Lin ES, Tran MH. Efficacy and safety of erythropoietin and intravenous iron in perioperative blood management: a systematic review. Transfusion Med Rev. 2013;27(4):221-234.

44. Muñoz M, Gómez-Ramírez S, Cuenca J, et al. Very-short-term perioperative intravenous iron administration and postoperative outcome in major orthopedic surgery: a pooled analysis of observational data from 2547 patients. Transfusion. 2014;54(2):289-299.

45. Danninger T, Rasul R, Poeran J, et al. Blood transfusions in total hip and knee arthroplasty: an analysis of outcomes. ScientificWorldJournal. 2014;2014:623460.

46. Baldus CR, Bridwell KH, Lenke LG, Okubadejo GO. Can we safely reduce blood loss during lumbar pedicle subtraction osteotomy procedures using tranexamic acid or aprotinin? A comparative study with controls. Spine. 2010;35(2):235-239.

47. Chang CH, Chang Y, Chen DW, Ueng SW, Lee MS. Topical tranexamic acid reduces blood loss and transfusion rates associated with primary total hip arthroplasty. Clin Orthop Relat Res. 2014;472(5):1552-1557.

48. Delasotta LA, Orozco F, Jafari SM, Blair JL, Ong A. Should we use preoperative epoetin-alpha in the mildly anemic patient undergoing simultaneous total knee arthroplasty? Open Orthop J. 2013;7:47-50.

49. Delasotta LA, Rangavajjula A, Frank ML, Blair J, Orozco F, Ong A. The use of preoperative epoetin-alpha in revision hip arthroplasty. Open Orthop J. 2012;6:179-183.

50. Kelley TC, Tucker KK, Adams MJ, Dalury DF. Use of tranexamic acid results in decreased blood loss and decreased transfusions in patients undergoing staged bilateral total knee arthroplasty. Transfusion. 2014;54(1):26-30.

51. Martin JG, Cassatt KB, Kincaid-Cinnamon KA, Westendorf DS, Garton AS, Lemke JH. Topical administration of tranexamic acid in primary total hip and total knee arthroplasty. J Arthroplasty. 2014;29(5):889-894.

52. Tzortzopoulou A, Cepeda MS, Schumann R, Carr DB. Antifibrinolytic agents for reducing blood loss in scoliosis surgery in children. Cochrane Database Syst Rev. 2008(3):CD006883.

53. Zhang H, Chen J, Chen F, Que W. The effect of tranexamic acid on blood loss and use of blood products in total knee arthroplasty: a meta-analysis. Knee Surg Sports Traumatol Arthrosc. 2012;20(9):1742-1752.

54. Bong MR, Patel V, Chang E, Issack PS, Hebert R, Di Cesare PE. Risks associated with blood transfusion after total knee arthroplasty. J Arthroplasty. 2004;19(3):281-287.

Author and Disclosure Information

Brent A. Ponce, MD, Jonathan C. Yu, MD, Mariano E. Menendez, MD, and Lasun O. Oladeji, MS

Authors’ Disclosure Statement: The authors report no actual or potential conflict of interest in relation to this article.


29. Handoll HH, Ollivere BJ, Rollins KE. Interventions for treating proximal humeral fractures in adults. Cochrane Database Syst Rev. 2012;12:CD000434.

30. Neuhaus V, Swellengrebel CH, Bossen JK, Ring D. What are the factors influencing outcome among patients admitted to a hospital with a proximal humeral fracture? Clin Orthop Relat Res. 2013;471(5):1698-1706.

31. Volgas DA, Stannard JP, Alonso JE. Nonunions of the humerus. Clin Orthop Relat Res. 2004;(419):46-50.

32. Chambers L, Dines JS, Lorich DG, Dines DM. Hemiarthroplasty for proximal humerus fractures. Curr Rev Musculoskeletal Med. 2013;6(1):57-62.

33. Jain NB, Hocker S, Pietrobon R, Guller U, Bathia N, Higgins LD. Total arthroplasty versus hemiarthroplasty for glenohumeral osteoarthritis: role of provider volume. J Shoulder Elbow Surg. 2005;14(4):361-367.

34. Izquierdo R, Voloshin I, Edwards S, et al. Treatment of glenohumeral osteoarthritis. J Am Acad Orthop Surg. 2010;18(6):375-382.

35. Shields E, Iannuzzi JC, Thorsness R, Noyes K, Voloshin I. Perioperative complications after hemiarthroplasty and total shoulder arthroplasty are equivalent. J Shoulder Elbow Surg. 2014;23(10):1449-1453.

36. Gartsman GM, Roddey TS, Hammerman SM. Shoulder arthroplasty with or without resurfacing of the glenoid in patients who have osteoarthritis. J Bone Joint Surg Am. 2000;82(1):26-34.

37. Singh A, Yian EH, Dillon MT, Takayanagi M, Burke MF, Navarro RA. The effect of surgeon and hospital volume on shoulder arthroplasty perioperative quality metrics. J Shoulder Elbow Surg. 2014;23(8):1187-1194.

38. Groh GI, Groh GM. Complications rates, reoperation rates, and the learning curve in reverse shoulder arthroplasty. J Shoulder Elbow Surg. 2014;23(3):388-394.

39. Boileau P, Gonzalez JF, Chuinard C, Bicknell R, Walch G. Reverse total shoulder arthroplasty after failed rotator cuff surgery. J Shoulder Elbow Surg. 2009;18(4):600-606.

40. Boileau P, Watkinson D, Hatzidakis AM, Hovorka I. Neer Award 2005: the Grammont reverse shoulder prosthesis: results in cuff tear arthritis, fracture sequelae, and revision arthroplasty. J Shoulder Elbow Surg. 2006;15(5):527-540.

41. Boileau P, Watkinson DJ, Hatzidakis AM, Balg F. Grammont reverse prosthesis: design, rationale, and biomechanics. J Shoulder Elbow Surg. 2005;14(1 suppl S):147S-161S.

42. Pola E, Papaleo P, Santoliquido A, Gasparini G, Aulisa L, De Santis E. Clinical factors associated with an increased risk of perioperative blood transfusion in nonanemic patients undergoing total hip arthroplasty. J Bone Joint Surg Am. 2004;86(1):57-61.

43. Lin DM, Lin ES, Tran MH. Efficacy and safety of erythropoietin and intravenous iron in perioperative blood management: a systematic review. Transfusion Med Rev. 2013;27(4):221-234.

44. Muñoz M, Gómez-Ramírez S, Cuenca J, et al. Very-short-term perioperative intravenous iron administration and postoperative outcome in major orthopedic surgery: a pooled analysis of observational data from 2547 patients. Transfusion. 2014;54(2):289-299.

45. Danninger T, Rasul R, Poeran J, et al. Blood transfusions in total hip and knee arthroplasty: an analysis of outcomes. ScientificWorldJournal. 2014;2014:623460.

46. Baldus CR, Bridwell KH, Lenke LG, Okubadejo GO. Can we safely reduce blood loss during lumbar pedicle subtraction osteotomy procedures using tranexamic acid or aprotinin? A comparative study with controls. Spine. 2010;35(2):235-239.

47. Chang CH, Chang Y, Chen DW, Ueng SW, Lee MS. Topical tranexamic acid reduces blood loss and transfusion rates associated with primary total hip arthroplasty. Clin Orthop Relat Res. 2014;472(5):1552-1557.

48. Delasotta LA, Orozco F, Jafari SM, Blair JL, Ong A. Should we use preoperative epoetin-alpha in the mildly anemic patient undergoing simultaneous total knee arthroplasty? Open Orthop J. 2013;7:47-50.

49. Delasotta LA, Rangavajjula A, Frank ML, Blair J, Orozco F, Ong A. The use of preoperative epoetin-alpha in revision hip arthroplasty. Open Orthop J. 2012;6:179-183.

50. Kelley TC, Tucker KK, Adams MJ, Dalury DF. Use of tranexamic acid results in decreased blood loss and decreased transfusions in patients undergoing staged bilateral total knee arthroplasty. Transfusion. 2014;54(1):26-30.

51. Martin JG, Cassatt KB, Kincaid-Cinnamon KA, Westendorf DS, Garton AS, Lemke JH. Topical administration of tranexamic acid in primary total hip and total knee arthroplasty. J Arthroplasty. 2014;29(5):889-894.

52. Tzortzopoulou A, Cepeda MS, Schumann R, Carr DB. Antifibrinolytic agents for reducing blood loss in scoliosis surgery in children. Cochrane Database Syst Rev. 2008(3):CD006883.

53. Zhang H, Chen J, Chen F, Que W. The effect of tranexamic acid on blood loss and use of blood products in total knee arthroplasty: a meta-analysis. Knee Surg Sports Traumatol Arthrosc. 2012;20(9):1742-1752.

54. Bong MR, Patel V, Chang E, Issack PS, Hebert R, Di Cesare PE. Risks associated with blood transfusion after total knee arthroplasty. J Arthroplasty. 2004;19(3):281-287.

Orthopedic Practice Patterns Relating to Anterior Cruciate Ligament Reconstruction in Elite Athletes


National Hockey League (NHL), Major League Soccer (MLS), and US Olympic/World Cup Ski/Snowboard (Olympic) athletes receive orthopedic care from a select group of surgeons. There are 30 NHL teams, 19 MLS teams, 1 Olympic ski team, and 1 Olympic snowboard team, for a total of 51 teams and approximately 2229 athletes (1500 NHL, 570 MLS, 159 Olympic).1

Studies have shown that MLS athletes and X-Games skiers and snowboarders have performed well on return to sport (RTS) after anterior cruciate ligament (ACL) reconstruction.2,3 However, the techniques, graft choices, and rehabilitation protocols used to return these elite athletes to their preinjury level of performance have not been elucidated. It is unclear whether the treatment given to these elite athletes differs from that given to recreational athletes and nonathletes. Bradley and colleagues4 examined how 32 NFL team orthopedists treated ACL tears, and Erickson and colleagues5 recently surveyed NFL and National Collegiate Athletic Association (NCAA) team physicians to determine practice patterns (eg, surgical techniques, graft choices, postoperative protocols) in treating ACL tears. Until now, however, no one has examined NHL, MLS, or Olympic team orthopedic surgeons’ practice patterns as they relate to ACL reconstruction.

We conducted an online survey of NHL, MLS, and Olympic team orthopedic surgeons to determine practice patterns relating to ACL reconstruction in elite athletes. Given the practice patterns of surgeons in our practice, we hypothesized that the surveyed surgeons treating these elite athletes would most commonly use bone–patellar tendon–bone (BPTB) autograft with a single-bundle technique. We also hypothesized that they would permit RTS without a brace at a minimum of 6 months after surgery, with a normal physical examination, and after successful completion of a structured battery of RTS tests.

Materials and Methods

On the SurveyMonkey website (http://www.surveymonkey.com), we created a 7-question base survey, with other questions added for the NHL and MLS surveys (Figure 1). We sent this survey to 94 team orthopedic surgeons (41 NHL, 26 MLS, 27 Olympic) identified through Internet searches and direct contact with team public relations departments. The survey was approved by MLS and NHL research committees. In 2013, each survey was sent out 5 times. The response rates for each round are shown in Figure 2. All responses remained confidential; we did not learn surgeons’ identities. Data were collected and analyzed through the SurveyMonkey website. Each surgeon was instructed to respond to all relevant questions, and the survey could not be submitted until all questions were answered. Descriptive statistics were calculated for each survey and parameter analyzed. Continuous variable data are reported as means and standard deviations (weighted means where applicable). Categorical data are reported as frequencies with percentages.
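
The statistical summary described here is purely descriptive. As a rough sketch of that kind of tabulation (means with standard deviations for continuous answers, frequencies with percentages for categorical ones), the short Python example below processes a few hypothetical responses; the field names and values are invented for illustration and are not the survey data.

```python
import statistics

# Hypothetical responses (not the survey data): years as team physician and
# preferred graft for elite athletes, one record per responding surgeon.
responses = [
    {"league": "NHL",     "years": 12, "graft": "BPTB autograft"},
    {"league": "NHL",     "years": 4,  "graft": "BPTB autograft"},
    {"league": "MLS",     "years": 7,  "graft": "4-strand semitendinosus autograft"},
    {"league": "Olympic", "years": 1,  "graft": "BPTB autograft"},
]

# Continuous variable: mean (SD) and range, as reported for each group.
years = [r["years"] for r in responses]
print(f"Experience: mean {statistics.mean(years):.2f} years "
      f"(SD {statistics.stdev(years):.2f}), range {min(years)}-{max(years)}")

# Categorical variable: frequency and percentage for each graft choice.
counts = {}
for r in responses:
    counts[r["graft"]] = counts.get(r["graft"], 0) + 1
for graft, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"{graft}: n = {n} ({100 * n / len(responses):.1f}%)")
```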

Results

Of the 94 team orthopedic surgeons surveyed, 47 (50%) responded (NHL, 49%; MLS, 50%; Olympic, 52%). Mean (SD) experience as a team physician was 7.73 (5.33) years (range, 2-20 years) for NHL, 6.77 (6.64) years (range, 2-20 years) for MLS, and 1.14 (0.36) years (range, 1-10 years) for Olympic. Mean (SD) number of ACL reconstructions performed in 2012 was 101 (51) for NHL (range, 50-200), 78 (38) for MLS (range, 20-150), and 110 (105) for Olympic (range, 25-175) (Table 1). Of the 47 surgeons, 42 (89.4%) used autograft in the treatment of elite athletes, and 5 (10.6%) used allograft. Autograft choices were BPTB (n = 33; 70.2%), 4-strand semitendinosus (n = 7; 14.9%), and quadriceps (n = 2; 4.3%); allograft choices were 4-strand semitendinosus (n = 4; 8.5%) and BPTB (n = 1; 2.1%) (Table 2).

Of the 40 surgeons (85.1%) who indicated they would use autograft in 25-year-old recreational athletes, 25 (53.2%) would use BPTB, 13 (27.7%) would use 4-strand semitendinosus, and 2 (4.3%) would use quadriceps; of the 7 who indicated they would use allograft, 4 (8.5%) would use 4-strand semitendinosus, and 3 (6.4%) would use BPTB. In the NHL and MLS surveys, 19 surgeons (57.6%) indicated they would use autograft (6 would use BPTB, 13 would use 4-strand semitendinosus), and 14 (42.4%) would use allograft (7 would use BPTB, 5 would use Achilles, and 2 would use tibialis anterior) in 35-year-old recreational athletes.

Twenty-one surgeons (44.7%) drilled the femoral tunnel through a transtibial portal, 36.2% through an anteromedial portal, and 12.8% with a 2-incision technique. All surgeons indicated they used a single-bundle technique in ACL reconstruction. Thirty-three surgeons (70.2%) did not recommend a brace for their elite athletes on RTS. Olympic team surgeons had the highest rate of brace wear on RTS (50%, for both skiers and snowboarders); NHL and MLS surgeons had substantially lower rates (25% and 15.4%, respectively) (Table 3).

Twenty (60.6%) of the NHL and MLS surgeons recommended waiting at least 6 months before RTS; 2 (6.1%) recommended waiting at least 9 months; no surgeon recommended waiting at least 12 months; and the others did not have a specific time frame for RTS. Twenty-seven surgeons (81.8%) recommended RTS after an athlete passed a series of RTS tests (eg, Vail, single-leg hop). Nineteen surgeons (57.6%) recommended waiting until the athlete had full range of motion, no pain, full strength, and subjective stability in the knee. Physicians could choose more than one answer for the previous question, allowing for a total percentage higher than 100%.

Discussion

The goal of this study was to determine how NHL, MLS, and Olympic team orthopedic surgeons manage ACL tears in elite and recreational athletes. Our study hypotheses were confirmed, as 70.2% of those surveyed used BPTB autograft for elite athletes, 100% used the single-bundle technique, 70.2% did not require a brace on RTS, 81.8% recommended RTS after the athlete passed a series of RTS tests (eg, Vail, single-leg hop), and 60.6% waited at least 6 months after surgery.

As soccer and skiing are the top 2 sports in which participants sustain ACL tears, it is necessary to report how surgeons obtain successful results in these patient populations.6 Using the US and Norwegian ACL reconstruction registries, Granan and colleagues6 found that, over a 7-year period, 5760 ACL tears occurred during soccer, and 2030 occurred during skiing. The scope of ACL injuries is broad, and treatment patterns must be elucidated. Although most surgeons do not treat elite athletes, many high school and college athletes compete at very high levels. Therefore, replicating the methods of the surgeons who treat elite athletes may be warranted.

In our survey, autograft (89.4%), particularly BPTB autograft (70.2%), was the most common graft choice for elite athletes. The rate of allograft use (42.4%) was higher for 35-year-old recreational athletes. As BPTB autograft produces reliable long-term results, this graft type is a reasonable choice.7 However, only 18% of our surveyed orthopedic surgeons indicated they would use BPTB autograft in older, recreational athletes. This stark difference is likely related to reported rates of more than 40% for long-term anterior knee pain and graft harvest site morbidity with BPTB autograft, as opposed to allograft and other autograft types.8,9 Younger patients may be more willing to accept some anterior knee pain to ensure bone-to-bone healing with BPTB autograft. This shift in graft choice may also reflect the desire to minimize skin incisions and their resulting scars, especially in female recreational athletes.

In a meta-analysis of more than 5000 patients, Kraeutler and colleagues7 found that BPTB autograft outperformed allograft according to several knee scores, including Lysholm and Tegner, and had a lower re-rupture rate (4.3% vs 12.7%). However, despite the superior performance of BPTB autograft, graft choice cannot overcome surgeon error in graft placement.10 BPTB autograft appears to remain the gold standard for ACL reconstruction for many reasons, including low failure rates and decreased costs.11 Recently, investigators have tried to challenge the superiority of BPTB autograft. In a retrospective case–control study, Mascarenhas and colleagues12 found that hamstring autograft afforded patients better extension and higher subjective outcome scores. Bourke and colleagues13 found a higher rate of contralateral ACL rupture in patients treated with BPTB autograft compared with hamstring autograft.

According to this survey, 44.7% of surgeons indicated they drilled the femoral tunnel through a transtibial portal, 36.2% used an anteromedial portal, and 12.8% used the 2-incision technique. These methods were recently evaluated to determine if any is superior to the others, but the study results were not definitive.14 Franceschi and colleagues15 found improved rotational and anterior stability of the knee with use of an anteromedial approach, but their findings were not clinically or functionally significant. Wang and colleagues16 found an extension loss in the late-stance phase of gait with the anteromedial approach; the transtibial approach was correlated with inferior anterior-posterior stability during the stance phase of gait. Therefore, our results parallel those in the current literature in that the surveyed population is split on which technique to use and likely bases its practice on comfort level and residency/fellowship training.

Limitations

This study had several limitations. First, it provided level V evidence on the practices of team physicians in 3 major sports. Although some of these physicians also treat athletes in other sports, our survey targeted NHL, MLS, and Olympic athletes. It did not address all ages and both sexes, which is significant given the higher rate of ACL tears in females. All NHL and MLS players are male, and there was a high rate of BPTB graft use in these sports. However, recreational athletes include both males and females, and the fact that some surgeons would choose a hamstring graft for a female patient for cosmetic reasons must not be overlooked. Conversely, there was no difference in BPTB autograft use between NHL and MLS surgeons and Olympic surgeons, whose athletes include females (each group chose BPTB autograft for about 60% of their elite athletes), which argues against this limitation. Our survey response rate was 50%. Other studies of ACL practices have had similar rates,17 especially studies of elite team physicians’ practices,5 and recent literature suggests that lower survey response rates do not necessarily alter results and may even improve them.18,19 This percentage could be falsely low if some of our email addresses were incorrect. The response rate also raises the possibility of selection bias, as surgeons who routinely used allograft in their athlete population may not have wanted to admit this. It is possible that some NHL, MLS, and Olympic athletes were treated by surgeons not included in this survey (in some cases, a non–team surgeon may have performed the athlete’s surgery). Finally, this survey did not address concomitant knee pathology or cover all possible technique variables.

Conclusion

Most of the NHL, MLS, and Olympic team orthopedic surgeons surveyed perform ACL reconstruction with BPTB autograft and a single-bundle technique, most commonly drilling the femoral tunnel through a transtibial portal, and do not require bracing for athletes returning to sport. Most also require their athletes to complete a series of RTS tests before resuming competitive play.

References

1.    Team USA. 2013. US Olympic Committee website. http://www.teamusa.org/athletes?pg=1&seasonId=%7BCF2DC66A-C2B3-44A8-ABB8-A486F3FBFDDF%7D&ngbId=%7BB36167A0-2AC8-4B0F-876F-93D0A44DF60A%7D. Accessed October 23, 2015.

2.    Erickson BJ, Harris JD, Cvetanovich GL, et al. Performance and return to sport after anterior cruciate ligament reconstruction in male major league soccer players. Orthop J Sports Med. 2013;1(2):1-8.

3.    Erickson BJ, Harris JD, Fillingham YA, et al. Performance and return to sport after anterior cruciate ligament reconstruction in X-Games skiers and snowboarders. Orthop J Sports Med. 2013;1(6):1-5.

4.    Bradley JP, Klimkiewicz JJ, Rytel MJ, Powell JW. Anterior cruciate ligament injuries in the National Football League: epidemiology and current treatment trends among team physicians. Arthroscopy. 2002;18(5):502-509.

5.    Erickson BJ, Harris JD, Fillingham YA, et al. Anterior cruciate ligament reconstruction practice patterns by NFL and NCAA football team physicians. Arthroscopy. 2014;30(6):731-738.

6.    Granan LP, Inacio MC, Maletis GB, Funahashi TT, Engebretsen L. Sport-specific injury pattern recorded during anterior cruciate ligament reconstruction. Am J Sports Med. 2013;41(12):2814-2818.

7.    Kraeutler MJ, Bravman JT, McCarty EC. Bone–patellar tendon–bone autograft versus allograft in outcomes of anterior cruciate ligament reconstruction: a meta-analysis of 5182 patients. Am J Sports Med. 2013;41(10):2439-2448.

8.    Poehling GG, Curl WW, Lee CA, et al. Analysis of outcomes of anterior cruciate ligament repair with 5-year follow-up: allograft versus autograft. Arthroscopy. 2005;21(7):774-785.

9.    Kartus J, Magnusson L, Stener S, Brandsson S, Eriksson BI, Karlsson J. Complications following arthroscopic anterior cruciate ligament reconstruction. A 2-5-year follow-up of 604 patients with special emphasis on anterior knee pain. Knee Surg Sports Traumatol Arthrosc. 1999;7(1):2-8.

10.  Boszotta H. Arthroscopic anterior cruciate ligament reconstruction using a patellar tendon graft in press-fit technique: surgical technique and follow-up. Arthroscopy. 1997;13(3):332-339.

11.  Hospodar SJ, Miller MD. Controversies in ACL reconstruction: bone–patellar tendon–bone anterior cruciate ligament reconstruction remains the gold standard. Sports Med Arthrosc Rev. 2009;17(4):242-246.

12.  Mascarenhas R, Tranovich MJ, Kropf EJ, Fu FH, Harner CD. Bone–patellar tendon–bone autograft versus hamstring autograft anterior cruciate ligament reconstruction in the young athlete: a retrospective matched analysis with 2-10 year follow-up. Knee Surg Sports Traumatol Arthrosc. 2012;20(8):1520-1527.

13.  Bourke HE, Salmon LJ, Waller A, Patterson V, Pinczewski LA. Survival of the anterior cruciate ligament graft and the contralateral ACL at a minimum of 15 years. Am J Sports Med. 2012;40(9):1985-1992.

14.  Chalmers PN, Mall NA, Cole BJ, Verma NN, Bush-Joseph CA, Bach BR Jr. Anteromedial versus transtibial tunnel drilling in anterior cruciate ligament reconstructions: a systematic review. Arthroscopy. 2013;29(7):1235-1242.

15.  Franceschi F, Papalia R, Rizzello G, Del Buono A, Maffulli N, Denaro V. Anteromedial portal versus transtibial drilling techniques in anterior cruciate ligament reconstruction: any clinical relevance? A retrospective comparative study. Arthroscopy. 2013;29(8):1330-1337.

16.  Wang H, Fleischli JE, Zheng NN. Transtibial versus anteromedial portal technique in single-bundle anterior cruciate ligament reconstruction: outcomes of knee joint kinematics during walking. Am J Sports Med. 2013;41(8):1847-1856.

17.  Chechik O, Amar E, Khashan M, Lador R, Eyal G, Gold A. An international survey on anterior cruciate ligament reconstruction practices. Int Orthop. 2013;37(2):201-206.

18.  Keeter S, Miller C, Kohut A, Groves RM, Presser S. Consequences of reducing nonresponse in a national telephone survey. Public Opin Q. 2000;64(2):125-148.

19.  Curtin R, Presser S, Singer E. The effects of response rate changes on the index of consumer sentiment. Public Opin Q. 2000;64(4):413-428.

Author and Disclosure Information

Brandon J. Erickson, MD, Joshua D. Harris, MD, Yale A. Fillingham, MD, Gregory L. Cvetanovich, MD, Charles Bush-Joseph, MD, Brian J. Cole, MD, MBA, Bernard R. Bach Jr, MD, and Nikhil N. Verma, MD

Authors’ Disclosure Statement: The authors report no actual or potential conflict of interest in relation to this article.

Issue
The American Journal of Orthopedics - 44(12)
Publications
Topics
Page Number
E480-E485
Legacy Keywords
american journal of orthopedics, AJO, original study, online exclusive, study, practice, anterior cruciate ligament, ACL, reconstruction, athletes, sports medicine, athletic, sports, hockey, soccer, olympic, ski, snowboard, team, NHL, MLS, sport, erickson, harris, fillingham, cvetanovich, bush-joseph, cole, bach, verma
Sections
Author and Disclosure Information

Brandon J. Erickson, MD, Joshua D. Harris, MD, Yale A. Fillingham, MD, Gregory L. Cvetanovich, MD, Charles Bush-Joseph, MD, Brian J. Cole, MD, MBA, Bernard R. Bach Jr, MD, and Nikhil N. Verma, MD

Authors’ Disclosure Statement: The authors report no actual or potential conflict of interest in relation to this article.

Author and Disclosure Information

Brandon J. Erickson, MD, Joshua D. Harris, MD, Yale A. Fillingham, MD, Gregory L. Cvetanovich, MD, Charles Bush-Joseph, MD, Brian J. Cole, MD, MBA, Bernard R. Bach Jr, MD, and Nikhil N. Verma, MD

Authors’ Disclosure Statement: The authors report no actual or potential conflict of interest in relation to this article.

Article PDF
Article PDF

National Hockey League (NHL), Major League Soccer (MLS), and US Olympic/World Cup Ski/Snowboard (Olympic) athletes receive orthopedic care from a select group of surgeons. There are 30 NHL teams, 19 MLS teams, 1 Olympic ski team, and 1 Olympic snowboard team, for a total of 51 teams and a rough total of 2229 athletes (1500 NHL, 570 MLS, 159 Olympic).1

Studies have shown that MLS athletes and X-Game skiers and snowboarders have performed well on return to sport (RTS) after anterior cruciate ligament (ACL) reconstruction.2,3 However, the techniques, graft choices, and rehabilitation protocols used to return these elite athletes to their preinjury level of performance have not been elucidated. It is unclear if the treatment given to these elite athletes differs from that given to recreational athletes and nonathletes. Bradley and colleagues4 examined how 32 NFL team orthopedists treated ACL tears, and Erickson and colleagues5 recently surveyed NFL and National Collegiate Athletic Association (NCAA) team physicians to determine practice patterns (eg, surgical techniques, graft choices, postoperative protocols) in treating ACL tears. Until now, however, no one has examined NHL, MLS, or Olympic team orthopedic surgeons’ practice patterns as they relate to ACL reconstruction.

We conducted an online survey of NHL, MLS, and Olympic team orthopedic surgeons to determine practice patterns relating to ACL reconstruction in elite athletes. Given the practice patterns of surgeons in our practice, we hypothesized that the surveyed surgeons treating these elite athletes would most commonly use bone–patellar tendon–bone (BPTB) autograft with a single-bundle technique. We also hypothesized that they would permit RTS without a brace at a minimum of 6 months after surgery, with a normal physical examination, and after successful completion of a structured battery of RTS tests.

Materials and Methods

On the SurveyMonkey website (http://www.surveymonkey.com), we created a 7-question base survey, with other questions added for the NHL and MLS surveys (Figure 1). We sent this survey to 94 team orthopedic surgeons (41 NHL, 26 MLS, 27 Olympic) identified through Internet searches and direct contact with team public relations departments. The survey was approved by MLS and NHL research committees. In 2013, each survey was sent out 5 times. The response rates for each round are shown in Figure 2. All responses remained confidential; we did not learn surgeons’ identities. Data were collected and analyzed through the SurveyMonkey website. Each surgeon was instructed to respond to all relevant questions in the survey. The survey was designed such that the participant could not submit the survey without answering all the questions. Descriptive statistics were calculated for each study and parameter analyzed. Continuous variable data are reported as means and standard deviations (weighted means where applicable). Categorical data are reported as frequencies with percentages.

Results

Of the 94 team orthopedic surgeons surveyed, 47 (50%) responded (NHL, 49%; MLS, 50%; Olympic, 52%). Mean (SD) experience as a team physician was 7.73 (5.33) years (range, 2-20 years) for NHL, 6.77 (6.64) years (range, 2-20 years) for MLS, and 1.14 (0.36) years (range, 1-10 years) for Olympic. Mean (SD) number of ACL reconstructions performed in 2012 was 101 (51) for NHL (range, 50-200), 78 (38) for MLS (range, 20-150), and 110 (105) for Olympic (range, 25-175) (Table 1). Of the 47 surgeons, 42 (89.4%) used autograft in the treatment of elite athletes, and 5 (10.6%) used allograft. Autograft choices were BPTB (n = 33; 70.2%), 4-strand semitendinosus (n = 7; 14.9%), and quadriceps (n = 2; 4.3%); allograft choices were 4-strand semitendinosus (n = 4; 8.5%) and BPTB (n = 1; 2.1%) (Table 2).

Of the 40 surgeons (85.1%) who indicated they would use autograft in 25-year-old recreational athletes, 25 (53.2%) would use BPTB, 13 (27.7%) would use 4-strand semitendinosus, and 2 (4.3%) would use quadriceps; of the 7 who indicated they would use allograft, 4 (8.5%) would use 4-strand semitendinosus, and 3 (6.4%) would use BPTB. In the NHL and MLS surveys, 19 surgeons (57.6%) indicated they would use autograft (6 would use BPTB, 13 would use 4-strand semitendinosus), and 14 (42.4%) would use allograft (7 would use BPTB, 5 would use Achilles, and 2 would use tibialis anterior) in 35-year-old recreational athletes.

Twenty-one surgeons (44.7%) were drilling the femoral tunnel through a transtibial portal, 36.2% through an anteromedial portal, and 12.8% using a 2-incision technique. All surgeons indicated they were using a single-bundle technique in ACL reconstruction. Thirty-three surgeons (70.2%) did not recommend a brace for their elite athletes on RTS. Olympic team surgeons had the highest rate of brace wear in RTS (50%, both skiers and snowboarders); NHL and MLS surgeons had significantly lower rates (25% and 15.4%, respectively) (Table 3).

 

 

Twenty (60.6%) of the NHL and MLS surgeons recommended waiting at least 6 months before RTS; 2 (6.1%) recommended waiting at least 9 months; no surgeon recommended waiting at least 12 months; and the others did not have a specific time frame for RTS. Twenty-seven surgeons (81.8%) recommended RTS after an athlete passed a series of RTS tests (eg, Vail, single-leg hop). Nineteen surgeons (57.6%) recommended waiting until the athlete had full range of motion, no pain, full strength, and subjective stability in the knee. Physicians could choose more than one answer for the previous question, allowing for a total percentage higher than 100%.

Discussion

The goal of this study was to determine how NHL, MLS, and Olympic team orthopedic surgeons manage ACL tears in elite and recreational athletes. Our study hypotheses were confirmed, as 70.2% of those surveyed used BPTB autograft for elite athletes, 100% used the single-bundle technique, 70.2% did not require a brace on RTS, 81.8% recommended RTS after the athlete passed a series of RTS tests (eg, Vail, single-leg hop), and 60.6% waited at least 6 months after surgery.

As soccer and skiing are the top 2 sports in which participants sustain ACL tears, it is necessary to report how surgeons obtain successful results in these patient populations.6 Using the US and Norwegian ACL reconstruction registries, Granan and colleagues6 found that, over a 7-year period, 5760 ACL tears occurred during soccer, and 2030 occurred during skiing. The scope of ACL injuries is broad, and treatment patterns must be elucidated. Although most surgeons do not treat elite athletes, many high school and college athletes compete at very high levels. Therefore, replicating the methods of the surgeons who treat elite athletes may be warranted.

In our survey, autograft (89.4%), particularly BPTB autograft (70.2%), was the most common graft choice for elite athletes. The rate of allograft use (42.4%) was higher for 35-year-old recreational athletes. As BPTB autograft produces reliable long-term results, this graft type is a reasonable choice.7 However, only 18% of our surveyed orthopedic surgeons indicated they would use BPTB autograft in older, recreational athletes. This stark difference is likely related to the more than 40% long-term side effects of anterior knee pain and graft harvest site morbidity with BPTB autograft as opposed to allograft and other types of autograft.8,9 Younger patients may be more willing to accept some anterior knee pain to ensure bone-to-bone healing with BPTB autograft. This shift in graft choice may also reflect the desire to minimize skin incisions and their resulting scars, especially in female recreational athletes.

In a meta-analysis of more than 5000 patients, Kraeutler and colleagues7 found that BPTB autograft outperformed allograft according to several knee scores, including Lysholm and Tegner, and had a lower re-rupture rate (4.3% vs 12.7%). However, despite the superior performance of BPTB autograft, graft choice cannot overcome surgeon error in graft placement.10 BPTB autograft appears to remain the gold standard for ACL reconstruction for many reasons, including low failure rates and decreased costs.11 Recently, investigators have tried to challenge the superiority of BPTB autograft. In a retrospective case–control study, Mascarenhas and colleagues12 found that hamstring autograft afforded patients better extension and higher subjective outcome scores. Bourke and colleagues13 found a higher rate of contralateral ACL rupture in patients treated with BPTB autograft compared with hamstring autograft.

According to this survey, 44.7% of surgeons indicated they drilled the femoral tunnel through a transtibial portal, 36.2% used an anteromedial portal, and 12.8% used the 2-incision technique. These methods were recently evaluated to determine if any is superior to the others, but the study results were not definitive.14 Franceschi and colleagues15 found improved rotational and anterior stability of the knee with use of an anteromedial approach, but their findings were not clinically or functionally significant. Wang and colleagues16 found an extension loss in the late-stance phase of gait with the anteromedial approach; the transtibial approach was correlated with inferior anterior-posterior stability during the stance phase of gait. Therefore, our results parallel those in the current literature in that the surveyed population is split on which technique to use and likely bases its practice on comfort level and residency/fellowship training.

Limitations

This study had several limitations. First, it provided level V evidence of team physicians in 3 major sports. Although some of these physicians were also treating athletes in other sports, our survey targeted NHL, MLS, and Olympic athletes. It did not address all ages and both sexes—which is significant, given the higher rate of ACL tears in females. All NHL and MLS players are male, and there was a high rate of BPTB graft use in these sports. However, recreational athletes include both males and females, and the fact that some surgeons would choose a hamstring graft for a female for cosmetic reasons must not be overlooked. Conversely, that there was no difference in the number of BPTB autografts chosen between NHL and MLS surgeons versus Olympic surgeons, where females are included (all chose about 60% BPTB autografts for their elite athletes), disputes this limitation. Our survey response rate was 50%. Other studies have had similar rates in relation to ACL practices,17 especially elite team physicians’ practices,5 and recent literature has confirmed that lower response rates in surveys did not alter results and may in fact have improved results.18,19 This percentage could be falsely low if some of our email addresses were incorrect. This rate also raises the possibility of selection bias, as surgeons who routinely used allograft in their athlete population may not have wanted to admit this. It is possible that some NHL, MLS, and Olympic athletes were treated by surgeons not included in this survey (in some cases, a non–team surgeon may have performed the athlete’s surgery). This survey did not address concomitant knee pathology or cover all possible technique variables.

 

 

Conclusion

Most of the NHL, MLS, and Olympic team orthopedic surgeons who were surveyed perform their ACL reconstructions using BPTB autograft, using a single-bundle technique, through a transtibial portal, and do not require bracing for their athletes returning to sport. Most required their athletes to complete a series of RTS tests before resuming competitive play.

National Hockey League (NHL), Major League Soccer (MLS), and US Olympic/World Cup Ski/Snowboard (Olympic) athletes receive orthopedic care from a select group of surgeons. There are 30 NHL teams, 19 MLS teams, 1 Olympic ski team, and 1 Olympic snowboard team, for a total of 51 teams and a rough total of 2229 athletes (1500 NHL, 570 MLS, 159 Olympic).1

Studies have shown that MLS athletes and X-Game skiers and snowboarders have performed well on return to sport (RTS) after anterior cruciate ligament (ACL) reconstruction.2,3 However, the techniques, graft choices, and rehabilitation protocols used to return these elite athletes to their preinjury level of performance have not been elucidated. It is unclear if the treatment given to these elite athletes differs from that given to recreational athletes and nonathletes. Bradley and colleagues4 examined how 32 NFL team orthopedists treated ACL tears, and Erickson and colleagues5 recently surveyed NFL and National Collegiate Athletic Association (NCAA) team physicians to determine practice patterns (eg, surgical techniques, graft choices, postoperative protocols) in treating ACL tears. Until now, however, no one has examined NHL, MLS, or Olympic team orthopedic surgeons’ practice patterns as they relate to ACL reconstruction.

We conducted an online survey of NHL, MLS, and Olympic team orthopedic surgeons to determine practice patterns relating to ACL reconstruction in elite athletes. Given the practice patterns of surgeons in our practice, we hypothesized that the surveyed surgeons treating these elite athletes would most commonly use bone–patellar tendon–bone (BPTB) autograft with a single-bundle technique. We also hypothesized that they would permit RTS without a brace at a minimum of 6 months after surgery, with a normal physical examination, and after successful completion of a structured battery of RTS tests.

Materials and Methods

On the SurveyMonkey website (http://www.surveymonkey.com), we created a 7-question base survey, with other questions added for the NHL and MLS surveys (Figure 1). We sent this survey to 94 team orthopedic surgeons (41 NHL, 26 MLS, 27 Olympic) identified through Internet searches and direct contact with team public relations departments. The survey was approved by MLS and NHL research committees. In 2013, each survey was sent out 5 times. The response rates for each round are shown in Figure 2. All responses remained confidential; we did not learn surgeons’ identities. Data were collected and analyzed through the SurveyMonkey website. Each surgeon was instructed to respond to all relevant questions in the survey. The survey was designed such that the participant could not submit the survey without answering all the questions. Descriptive statistics were calculated for each study and parameter analyzed. Continuous variable data are reported as means and standard deviations (weighted means where applicable). Categorical data are reported as frequencies with percentages.

Results

Of the 94 team orthopedic surgeons surveyed, 47 (50%) responded (NHL, 49%; MLS, 50%; Olympic, 52%). Mean (SD) experience as a team physician was 7.73 (5.33) years (range, 2-20 years) for NHL, 6.77 (6.64) years (range, 2-20 years) for MLS, and 1.14 (0.36) years (range, 1-10 years) for Olympic. Mean (SD) number of ACL reconstructions performed in 2012 was 101 (51) for NHL (range, 50-200), 78 (38) for MLS (range, 20-150), and 110 (105) for Olympic (range, 25-175) (Table 1). Of the 47 surgeons, 42 (89.4%) used autograft in the treatment of elite athletes, and 5 (10.6%) used allograft. Autograft choices were BPTB (n = 33; 70.2%), 4-strand semitendinosus (n = 7; 14.9%), and quadriceps (n = 2; 4.3%); allograft choices were 4-strand semitendinosus (n = 4; 8.5%) and BPTB (n = 1; 2.1%) (Table 2).

Of the 40 surgeons (85.1%) who indicated they would use autograft in 25-year-old recreational athletes, 25 (53.2%) would use BPTB, 13 (27.7%) would use 4-strand semitendinosus, and 2 (4.3%) would use quadriceps; of the 7 who indicated they would use allograft, 4 (8.5%) would use 4-strand semitendinosus, and 3 (6.4%) would use BPTB. In the NHL and MLS surveys, 19 surgeons (57.6%) indicated they would use autograft (6 would use BPTB, 13 would use 4-strand semitendinosus), and 14 (42.4%) would use allograft (7 would use BPTB, 5 would use Achilles, and 2 would use tibialis anterior) in 35-year-old recreational athletes.

Twenty-one surgeons (44.7%) were drilling the femoral tunnel through a transtibial portal, 36.2% through an anteromedial portal, and 12.8% using a 2-incision technique. All surgeons indicated they were using a single-bundle technique in ACL reconstruction. Thirty-three surgeons (70.2%) did not recommend a brace for their elite athletes on RTS. Olympic team surgeons had the highest rate of brace wear in RTS (50%, both skiers and snowboarders); NHL and MLS surgeons had significantly lower rates (25% and 15.4%, respectively) (Table 3).

 

 

Twenty (60.6%) of the NHL and MLS surgeons recommended waiting at least 6 months before RTS; 2 (6.1%) recommended waiting at least 9 months; no surgeon recommended waiting at least 12 months; and the others did not have a specific time frame for RTS. Twenty-seven surgeons (81.8%) recommended RTS after an athlete passed a series of RTS tests (eg, Vail, single-leg hop). Nineteen surgeons (57.6%) recommended waiting until the athlete had full range of motion, no pain, full strength, and subjective stability in the knee. Physicians could choose more than one answer for the previous question, allowing for a total percentage higher than 100%.

Discussion

The goal of this study was to determine how NHL, MLS, and Olympic team orthopedic surgeons manage ACL tears in elite and recreational athletes. Our study hypotheses were confirmed, as 70.2% of those surveyed used BPTB autograft for elite athletes, 100% used the single-bundle technique, 70.2% did not require a brace on RTS, 81.8% recommended RTS after the athlete passed a series of RTS tests (eg, Vail, single-leg hop), and 60.6% waited at least 6 months after surgery.

As soccer and skiing are the top 2 sports in which participants sustain ACL tears, it is necessary to report how surgeons obtain successful results in these patient populations.6 Using the US and Norwegian ACL reconstruction registries, Granan and colleagues6 found that, over a 7-year period, 5760 ACL tears occurred during soccer, and 2030 occurred during skiing. The scope of ACL injuries is broad, and treatment patterns must be elucidated. Although most surgeons do not treat elite athletes, many high school and college athletes compete at very high levels. Therefore, replicating the methods of the surgeons who treat elite athletes may be warranted.

In our survey, autograft (89.4%), particularly BPTB autograft (70.2%), was the most common graft choice for elite athletes. The rate of allograft use (42.4%) was higher for 35-year-old recreational athletes. As BPTB autograft produces reliable long-term results, this graft type is a reasonable choice.7 However, only 18% of our surveyed orthopedic surgeons indicated they would use BPTB autograft in older, recreational athletes. This stark difference is likely related to the more than 40% long-term side effects of anterior knee pain and graft harvest site morbidity with BPTB autograft as opposed to allograft and other types of autograft.8,9 Younger patients may be more willing to accept some anterior knee pain to ensure bone-to-bone healing with BPTB autograft. This shift in graft choice may also reflect the desire to minimize skin incisions and their resulting scars, especially in female recreational athletes.

In a meta-analysis of more than 5000 patients, Kraeutler and colleagues7 found that BPTB autograft outperformed allograft according to several knee scores, including Lysholm and Tegner, and had a lower re-rupture rate (4.3% vs 12.7%). However, despite the superior performance of BPTB autograft, graft choice cannot overcome surgeon error in graft placement.10 BPTB autograft appears to remain the gold standard for ACL reconstruction for many reasons, including low failure rates and decreased costs.11 Recently, investigators have tried to challenge the superiority of BPTB autograft. In a retrospective case–control study, Mascarenhas and colleagues12 found that hamstring autograft afforded patients better extension and higher subjective outcome scores. Bourke and colleagues13 found a higher rate of contralateral ACL rupture in patients treated with BPTB autograft compared with hamstring autograft.

According to this survey, 44.7% of surgeons indicated they drilled the femoral tunnel through a transtibial portal, 36.2% used an anteromedial portal, and 12.8% used the 2-incision technique. These methods were recently evaluated to determine if any is superior to the others, but the study results were not definitive.14 Franceschi and colleagues15 found improved rotational and anterior stability of the knee with use of an anteromedial approach, but their findings were not clinically or functionally significant. Wang and colleagues16 found an extension loss in the late-stance phase of gait with the anteromedial approach; the transtibial approach was correlated with inferior anterior-posterior stability during the stance phase of gait. Therefore, our results parallel those in the current literature in that the surveyed population is split on which technique to use and likely bases its practice on comfort level and residency/fellowship training.

Limitations

This study had several limitations. First, it provided level V evidence from team physicians in 3 major sports. Although some of these physicians also treat athletes in other sports, our survey targeted NHL, MLS, and Olympic athletes. It did not address all ages and both sexes, which is significant given the higher rate of ACL tears in females. All NHL and MLS players are male, and there was a high rate of BPTB graft use in these sports. However, recreational athletes include both males and females, and the fact that some surgeons would choose a hamstring graft for a female for cosmetic reasons must not be overlooked. Conversely, the finding that NHL and MLS surgeons and Olympic team surgeons (whose athletes include females) chose BPTB autograft at similar rates (about 60% each for their elite athletes) argues against this limitation. Our survey response rate was 50%. Other studies of ACL practices have reported similar rates,17 especially among elite team physicians,5 and recent literature suggests that lower survey response rates do not necessarily alter results and may even improve them.18,19 The response rate could be falsely low if some of our email addresses were incorrect. It also raises the possibility of selection bias, as surgeons who routinely used allograft in their athlete population may not have wanted to admit this. It is possible that some NHL, MLS, and Olympic athletes were treated by surgeons not included in this survey (in some cases, a non–team surgeon may have performed the athlete’s surgery). Finally, this survey did not address concomitant knee pathology or cover all possible technique variables.

Conclusion

Most of the surveyed NHL, MLS, and Olympic team orthopedic surgeons perform ACL reconstruction with BPTB autograft and a single-bundle technique, most commonly drill the femoral tunnel transtibially, and do not require bracing for athletes returning to sport. Most also required their athletes to complete a series of RTS tests before resuming competitive play.

References

1.    Team USA. 2013. US Olympic Committee website. http://www.teamusa.org/athletes?pg=1&seasonId=%7BCF2DC66A-C2B3-44A8-ABB8-A486F3FBFDDF%7D&ngbId=%7BB36167A0-2AC8-4B0F-876F-93D0A44DF60A%7D. Accessed October 23, 2015.

2.    Erickson BJ, Harris JD, Cvetanovich GL, et al. Performance and return to sport after anterior cruciate ligament reconstruction in male major league soccer players. Orthop J Sports Med. 2013;1(2):1-8.

3.    Erickson BJ, Harris JD, Fillingham YA, et al. Performance and return to sport after anterior cruciate ligament reconstruction in X-Games skiers and snowboarders. Orthop J Sports Med. 2013;1(6):1-5.

4.    Bradley JP, Klimkiewicz JJ, Rytel MJ, Powell JW. Anterior cruciate ligament injuries in the National Football League: epidemiology and current treatment trends among team physicians. Arthroscopy. 2002;18(5):502-509.

5.    Erickson BJ, Harris JD, Fillingham YA, et al. Anterior cruciate ligament reconstruction practice patterns by NFL and NCAA football team physicians. Arthroscopy. 2014;30(6):731-738.

6.    Granan LP, Inacio MC, Maletis GB, Funahashi TT, Engebretsen L. Sport-specific injury pattern recorded during anterior cruciate ligament reconstruction. Am J Sports Med. 2013;41(12):2814-2818.

7.    Kraeutler MJ, Bravman JT, McCarty EC. Bone–patellar tendon–bone autograft versus allograft in outcomes of anterior cruciate ligament reconstruction: a meta-analysis of 5182 patients. Am J Sports Med. 2013;41(10):2439-2448.

8.    Poehling GG, Curl WW, Lee CA, et al. Analysis of outcomes of anterior cruciate ligament repair with 5-year follow-up: allograft versus autograft. Arthroscopy. 2005;21(7):774-785.

9.    Kartus J, Magnusson L, Stener S, Brandsson S, Eriksson BI, Karlsson J. Complications following arthroscopic anterior cruciate ligament reconstruction. A 2-5-year follow-up of 604 patients with special emphasis on anterior knee pain. Knee Surg Sports Traumatol Arthrosc. 1999;7(1):2-8.

10.  Boszotta H. Arthroscopic anterior cruciate ligament reconstruction using a patellar tendon graft in press-fit technique: surgical technique and follow-up. Arthroscopy. 1997;13(3):332-339.

11.  Hospodar SJ, Miller MD. Controversies in ACL reconstruction: bone–patellar tendon–bone anterior cruciate ligament reconstruction remains the gold standard. Sports Med Arthrosc Rev. 2009;17(4):242-246.

12.  Mascarenhas R, Tranovich MJ, Kropf EJ, Fu FH, Harner CD. Bone–patellar tendon–bone autograft versus hamstring autograft anterior cruciate ligament reconstruction in the young athlete: a retrospective matched analysis with 2-10 year follow-up. Knee Surg Sports Traumatol Arthrosc. 2012;20(8):1520-1527.

13.  Bourke HE, Salmon LJ, Waller A, Patterson V, Pinczewski LA. Survival of the anterior cruciate ligament graft and the contralateral ACL at a minimum of 15 years. Am J Sports Med. 2012;40(9):1985-1992.

14.  Chalmers PN, Mall NA, Cole BJ, Verma NN, Bush-Joseph CA, Bach BR Jr. Anteromedial versus transtibial tunnel drilling in anterior cruciate ligament reconstructions: a systematic review. Arthroscopy. 2013;29(7):1235-1242.

15.  Franceschi F, Papalia R, Rizzello G, Del Buono A, Maffulli N, Denaro V. Anteromedial portal versus transtibial drilling techniques in anterior cruciate ligament reconstruction: any clinical relevance? A retrospective comparative study. Arthroscopy. 2013;29(8):1330-1337.

16.  Wang H, Fleischli JE, Zheng NN. Transtibial versus anteromedial portal technique in single-bundle anterior cruciate ligament reconstruction: outcomes of knee joint kinematics during walking. Am J Sports Med. 2013;41(8):1847-1856.

17.  Chechik O, Amar E, Khashan M, Lador R, Eyal G, Gold A. An international survey on anterior cruciate ligament reconstruction practices. Int Orthop. 2013;37(2):201-206.

18.  Keeter S, Miller C, Kohut A, Groves RM, Presser S. Consequences of reducing nonresponse in a national telephone survey. Public Opin Q. 2000;64(2):125-148.

19.  Curtin R, Presser S, Singer E. The effects of response rate changes on the index of consumer sentiment. Public Opin Q. 2000;64(4):413-428.

Magnetic Resonance Imaging of Complications of Anterior Cruciate Ligament Reconstruction

Etan Dayan, MD, Alex Maderazo, MD, MBA, and Darren Fitzpatrick, MD

Authors’ Disclosure Statement: The authors report no actual or potential conflict of interest in relation to this article.

Magnetic resonance imaging (MRI) is the preferred modality in the evaluation of complications of anterior cruciate ligament reconstruction (ACL-R).1-3 ACL-R complications may be broadly characterized as those resulting in decreased range of motion (ROM), eg, arthrofibrosis and impingement, and those resulting in increased laxity, ie, graft disruption.4 Short tau inversion recovery (STIR) sequences best minimize artifact related to field inhomogeneity in the presence of metal-containing fixation devices. Patients with contraindications to MRI may undergo high-resolution computed tomographic arthrography of the knee for evaluation of postoperative graft abnormalities.1

Arthrofibrosis refers to focal or diffuse synovial scar tissue, which may limit ROM. Preoperative irritation, preoperative limited ROM, and reconstruction within 4 weeks of trauma may all play a role in the development of arthrofibrosis.5,6 The focal form, cyclops lesion, named for its arthroscopic appearance, has been reported in 1% to 10% of patients with ACL-R.1 On MRI, focal arthrofibrosis may be seen as a focal or diffuse intermediate signal lesion in the anterior intercondylar notch extending linearly along the intercondylar roof1 (Figure 1).

MRI can be used to accurately determine the position of the femoral and tibial tunnels. Correct femoral tunnel position results in isometry of the graft during full ROM of the knee. Graft impingement can occur when the tibial tunnel is placed too far anteriorly such that the graft contacts the roof of the intercondylar notch before the knee is able to fully extend.7 A tibial tunnel placed anterior to the intersection of the Blumensaat line and the tibia is at higher risk for impingement.1,4 Impingement may be accompanied by signal change in the graft on intermediate-weighted and fluid-sensitive sequences. The signal abnormality is usually focal and persists longer than the expected signal changes related to revascularization of immature grafts within the first year (Figure 2). If left untreated, impingement may progress to graft rupture.4

Complete graft rupture is diagnosed on the basis of discontinuity of the graft fibers. MRI findings include a fluid-filled defect or absence of intact graft fibers. Other reliable signs include a large joint effusion, anterior tibial translation, a pivot-shift–type marrow edema pattern, and horizontal orientation, laxity, or resorption of the graft fibers.1,8,9 The diagnosis of partial graft rupture may be challenging, as there are several other causes of increased graft signal, including revascularization (within 12 months after the procedure), signal heterogeneity between individual bundles of hamstring grafts, and focal signal changes related to impingement (Figures 3, 4).

Fluid within the tunnels is a normal finding after surgery and typically resolves within the first 18 months.1 Cyst formation within the tibial tunnel is an uncommon complication of ACL-R; it may be an incidental finding or may present with clinical symptoms caused by extension into the pretibial soft tissues or expansion of the tunnel (Figure 5). Determining whether the cyst communicates with the joint space is important, as a noncommunicating cyst requires simple excision without the need for bone grafting.7

Hardware-related complications (eg, loosening of fixation devices) are uncommon but may require revision surgery (Figure 6). Septic arthritis after ACL-R has a cumulative incidence of 0.1% to 0.9% and may be difficult to diagnose clinically because of the lack of classic symptoms of a septic joint.1 Diagnosis requires joint aspiration.

MRI can be used reliably and accurately to assess ACL-R complications. The clinical history helps stratify complications into those that decrease ROM and those that increase laxity.

References

1.    Bencardino JT, Beltran J, Feldman MI, Rose DJ. MR imaging of complications of anterior cruciate ligament graft reconstruction. Radiographics. 2009;29(7):2115-2126.

2.    Recht MP, Kramer J. MR imaging of the postoperative knee: a pictorial essay. Radiographics. 2002;22(4):765-774.

3.    Papakonstantinou O, Chung CB, Chanchairujira K, Resnick DL. Complications of anterior cruciate ligament reconstruction: MR imaging. Eur Radiol. 2003;13(5):1106-1117.

4.    Meyers AB, Haims AH, Menn K, Moukaddam H. Imaging of anterior cruciate ligament repair and its complications. AJR Am J Roentgenol. 2010;194(2):476-484.

5.    Kwok CS, Harrison T, Servant C. The optimal timing for anterior cruciate ligament reconstruction with respect to the risk of postoperative stiffness. Arthroscopy. 2013;29(3):556-565.

6.    Mayr HO, Weig TG, Plitz W. Arthrofibrosis following ACL reconstruction—reasons and outcome. Arch Orthop Trauma Surg. 2004;124(8):518-522.

7.    Ghazikhanian V, Beltran J, Nikac V, Feldman M, Bencardino JT. Tibial tunnel and pretibial cysts following ACL graft reconstruction: MR imaging diagnosis. Skeletal Radiol. 2012;41(11):1375-1379.

8.    Collins MS, Unruh KP, Bond JR, Mandrekar JN. Magnetic resonance imaging of surgically confirmed anterior cruciate ligament graft disruption. Skeletal Radiol. 2008;37(3):233-243.

9.    Saupe N, White LM, Chiavaras MM, et al. Anterior cruciate ligament reconstruction grafts: MR imaging features at long-term follow-up—correlation with functional and clinical evaluation. Radiology. 2008;249(2):581-590.

Orthopedic Implant Waste: Analysis and Quantification

Ashley Payne, MSc, James Slover, MD, MS, Ifeoma Inneh, MPH, Lorraine Hutzler, BA, Richard Iorio, MD, and Joseph A. Bosco III, MD

Authors’ Disclosure Statement: The authors report no actual or potential conflict of interest in relation to this article.

The cost of health care in the United States is increasing at an unsustainable rate.1-3 To decrease or even reverse this trend, we must decrease the cost of care without adversely affecting quality. Porter4 defined value as the quality of care divided by its cost. The economics of total joint arthroplasty (TJA) has received a great deal of attention because of both increasing demand and increasing cost.5-9 About 33% of all orthopedic surgeries and the majority of TJAs are paid for by Medicare.9 In recent years, the rate of reimbursement for orthopedic cases has steadily declined while the cost of implants has increased.3,10,11 Given the significant cost of implants, health care providers in some subspecialties have focused on implant costs as a potential area for cost reduction.12 For example, in TJA this has proved effective in reducing the overall cost, as has decreasing length of stay after surgery.8,10,13-16
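
Expressed as a simple ratio (our restatement of Porter's definition in equation form, not notation taken from his article):

\[ \text{Value} = \frac{\text{Quality of care (health outcomes achieved)}}{\text{Cost of delivering that care}} \]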

With little evidence suggesting that any specific orthopedic implant has outcomes superior to those of others, apart from select poorly performing outliers, the value of care delivered with these devices must be increased primarily by lowering their cost.17,18 In addition, some experts have suggested that intraoperative waste is a significant factor in TJA cost, and it does contribute to the average implant cost for a TJA case.6,19 Using data collected from 72 institutions, Zywiel and colleagues19 estimated the annual cost of wasted hip and knee arthroplasty implants to be more than $36 million in the United States.

However, considering the aging US population, TJA is not the only orthopedic surgery with increased demand. An estimated 600,000 spine surgeries are performed each year in the United States.20 Between 1992 and 2003, Medicare spending for lumbar spinal fusion increased 500%.21 In addition, in a 15-month observational study of incidence of intraoperative waste in spine surgery, Soroceanu and colleagues22 reported waste occurring in 20% of spine procedures.

Although these studies have described implant waste in TJA and spine surgeries, little has been published on the cost of wasted implants in a center performing the full range of orthopedic procedures. In this article, we detail the implant waste costs incurred by surgeons for all orthopedic subspecialties at a single orthopedic specialty hospital over a 1-year period. Our study goals were to identify types of implants wasted, and incidence and cost of implant waste, for all total hip arthroplasties (THAs), total knee arthroplasties (TKAs), and lumbar spinal fusions performed at the hospital and to determine whether case volume or years in surgical practice affect the rate or cost of implants wasted.

Methods

We performed a retrospective economic analysis of 1 year of administrative implant data from our institution. Collected data were quantified and analyzed for factors that might explain any variance in implant waste among surgeons. We were granted exempt institutional review board status, as no patient information was involved in this study.

We reviewed the administrative implant data for the 12-month period beginning June 2012 and ending May 2013. For that period, the number of cases in which an implant was used and the number of cases in which an implant was wasted were recorded. For each instance of waste, the type and cost of the wasted implant were entered into the administrative database. In addition, the overall cost of implants for the year and the cost of wasted implants were determined. Data were available for 81 surgeons across 8 orthopedic divisions (subspecialties). From this information, we determined the percentage of cases in which waste occurred, the percentage of total implant cost wasted, the average cost of waste per case, and the most commonly wasted implants. The first 3 of these measures were also calculated separately for THAs, TKAs, and lumbar spinal fusion procedures.
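
As an illustration of how these summary measures can be derived from case-level records, the sketch below computes them in Python. It is illustrative only; the record layout and field names are hypothetical and do not reflect the structure of our administrative database.

# Hypothetical case-level record; field names are illustrative, not those of our database.
from dataclasses import dataclass

@dataclass
class ImplantCase:
    surgeon: str
    division: str
    implant_cost_used: float    # cost of implants actually implanted
    implant_cost_wasted: float  # cost of implants opened but not implanted (0 if none)

def waste_metrics(cases):
    # Summary measures described in the Methods: percentage of cases with waste,
    # percentage of total implant cost wasted, and average cost of waste per case.
    n_cases = len(cases)
    n_with_waste = sum(1 for c in cases if c.implant_cost_wasted > 0)
    total_cost = sum(c.implant_cost_used + c.implant_cost_wasted for c in cases)
    wasted_cost = sum(c.implant_cost_wasted for c in cases)
    return {
        "pct_cases_with_waste": 100 * n_with_waste / n_cases,
        "pct_implant_cost_wasted": 100 * wasted_cost / total_cost,
        "avg_waste_cost_per_case": wasted_cost / n_cases,
    }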

Statistical Analysis

The data were analyzed to determine if surgeon case volume or years in surgical practice affected implant waste. All analyses were performed at department, division (subspecialty), and surgeon levels. Case volume was analyzed in 3 groups: top 25%, middle 50%, and lower 25%. Number of years in surgical practice was analyzed in 3 groups: fewer than 10 years, 10 to 19 years, and 20 years or more. Normality assumption of variables was tested using the Shapiro-Wilk test (P < .05). For between-group differences, 1-way analysis of variance and the Tukey honestly significant difference post hoc test were performed for variables with a normal distribution, and the Kruskal-Wallis and Mann-Whitney tests were performed for variables without a normal distribution.
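
The sketch below illustrates this test-selection logic (a Shapiro-Wilk screen, then a parametric or nonparametric between-group comparison) using SciPy. It is a simplified illustration under assumed data structures, not the exact procedure run in our statistical software; Tukey post hoc testing (available in statsmodels) is noted only in a comment.

from itertools import combinations
from scipy.stats import shapiro, f_oneway, kruskal, mannwhitneyu

def compare_groups(groups, alpha=0.05):
    # 'groups' is a hypothetical dict mapping a group label (eg, "<10 y", "10-19 y",
    # ">=20 y") to a list of per-surgeon values (eg, percentage of cases with waste).
    samples = list(groups.values())
    normal = True
    for s in samples:
        _, p_norm = shapiro(s)              # Shapiro-Wilk normality screen
        if p_norm < alpha:
            normal = False
    if normal:
        _, p = f_oneway(*samples)           # 1-way ANOVA; if significant, follow up with
        test = "ANOVA"                      # Tukey HSD (statsmodels pairwise_tukeyhsd)
    else:
        _, p = kruskal(*samples)            # Kruskal-Wallis for non-normal data
        test = "Kruskal-Wallis"
    pairwise = {}
    if not normal and p < alpha:
        for a, b in combinations(groups, 2):
            _, p_mw = mannwhitneyu(groups[a], groups[b])   # Mann-Whitney post hoc
            pairwise[(a, b)] = p_mw
    return test, p, pairwise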

For the subspecialty-level analyses, only the Adult Reconstruction, Sports Medicine, and Spine divisions were analyzed for the effect of case volume, and only the Sports Medicine and Spine divisions were analyzed for the effect of surgical experience, as the other divisions had too few surgeons for adequate grouping.

Data are presented as means with corresponding 95% confidence intervals (CIs). Categorical variables are presented as counts with percentages. All statistical analyses were performed with SPSS Version 21.0 (IBM SPSS) statistical software. Statistical significance was set at .05.
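
For reference, a group mean with a 95% CI can be computed as in the sketch below (an illustration assuming a t distribution with n - 1 degrees of freedom, not the exact routine used by SPSS).

from math import sqrt
from statistics import mean, stdev
from scipy.stats import t

def mean_with_ci(sample, conf=0.95):
    # Two-sided confidence interval for the mean of 'sample'.
    n = len(sample)
    m = mean(sample)
    se = stdev(sample) / sqrt(n)            # standard error of the mean
    half_width = se * t.ppf((1 + conf) / 2, df=n - 1)
    return m, (m - half_width, m + half_width)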

Results

During the 1-year period, 8954 department cases involved an implant of any type. Waste occurred in 12% (1072) of these cases. The rate ranged from 8% in the Adult Reconstruction division to 30% in the Trauma division (Table 1), and the rate for individual surgeons ranged from 3% to 100%, though the surgeon with 100% performed only 1 case, and the next highest rate was 50%.

Total implant cost for our hospital during the period was $34,340,607. Of that total cost, 1.8% ($634,668) was lost because of implant waste. Percentage of total implant cost wasted ranged from 1.6% in the Adult Reconstruction division to 4.7% in the Sports Medicine division (Table 1). Percentage of total implant cost wasted for individual surgeons ranged from 0.2% to 16.1%. Tables 2 and 3 list the most commonly wasted implants by count and cost, respectively.

When total cost of wasted implants was averaged over all implant cases performed during the period, the loss resulting from waste amounted to $71 per case for the department and ranged from $21 per case for the Hand division to $105 per case for the Pediatric division (Table 1). For individual surgeons, the loss ranged from $4 to $250 per case.

During the period studied, an implant was wasted in 9% (100) of the 1076 primary THAs performed, 4% (42) of the 1003 primary TKAs, and 14% (30) of the 217 lumbar spinal fusions (Tables 4, 5).

For percentage of cases with waste, there was no significant difference between case-volume groups for the department (P = .46) or for the Adult Reconstruction (P = .83), Spine (P = .10), or Sports Medicine (P = .69) division. Analyzing for variance by years in surgical practice, we found a significant difference for the department (P = .01) but not for the Adult Reconstruction (P = .12) or Spine (P = .14) division. The department-level difference reflected a significant difference (P = .001; 95% CI, 1.112-17.408) between surgeons with fewer than 10 years of surgical practice, who wasted implants in 12.8% of their cases, and surgeons with 20 years or more of surgical practice, who wasted implants in 9% of their cases (Table 4).

For percentage of total implant cost wasted, there was no significant difference between groups for the department (P = .83) or for the Adult Reconstruction (P = .29) or Spine (P = .41) division when analyzed by years in surgical practice. Analyzing by case volume, we found a significant difference for the Sports Medicine division (P = .004): percentage of total implant cost wasted was significantly higher (P = .003; 95% CI, –12.61 to –2.97) for surgeons in the lower 25% of case volume (9.8%) than for surgeons in the middle 50% of case volume (3.5%) (Table 5). No other significant difference was found for the department (P = .52) or for the Adult Reconstruction (P = .69) or Spine (P = .45) division.

For cost of waste per case, analysis by case volume and years in surgical practice showed no significant difference for the department (case volume, P = .76; years in surgical practice, P = .07), the Adult Reconstruction division (case volume, P = .47; years in surgical practice, P = .78), the Spine division (case volume, P = .11; years in surgical practice, P = .15), or the Sports Medicine division (case volume, P = .08).

Selected Procedures

Total Hip Arthroplasty. Regarding variance by case volume and years in surgical practice, we found no significant difference for any variable analyzed: percentage of cases with waste (volume, P = .072; years in practice, P = .076), percentage of total implant cost wasted (volume, P = .074; years in practice, P = .12), cost of waste per case (volume, P = .075; years in practice, P = .32).

Total Knee Arthroplasty. Regarding variance by years in surgical practice, we found no significant difference for any variable analyzed: percentage of cases with waste (P = .38), percentage of total implant cost wasted (P = .50), cost of waste per case (P = .50). Regarding variance by case volume, there was no significant difference for percentage of cases with waste (P = .70) or cost of waste per case (P = .05), but we found a significant difference for percentage of total implant cost wasted (P = .038). That difference was caused by an outlier: one surgeon in the lower 25% of case volume wasted an implant in the only TKA he performed that year. Removing the outlier eliminated the significance.

Posterior Lumbar Spinal Fusion. Regarding variance by case volume and years in surgical practice, we found no significant difference for any variable analyzed: percentage of cases with waste (volume, P = .36; years in surgical practice, P = .22), percentage of total implant cost wasted (volume, P = .33; years in surgical practice, P = .41), cost of waste per case (volume, P = .34; years in practice, P = .15).

Discussion

The steadily increasing demand for orthopedic surgeries and declining rates of reimbursement by Medicare and other insurance providers have led many hospitals to look for ways to control the cost of these surgeries. Reducing operating room costs, lowering implant prices, and shortening hospital stays have all proved successful.6,15,20,23 One area that has not been thoroughly explored is the cost burden of wasted implants. Our findings suggest implant waste contributes significantly to the cost of orthopedic surgeries.

One weakness of this study is that its data, though encompassing all orthopedic subspecialties and procedures, come from a single teaching institution and therefore are less representative of all orthopedic departments across the United States. However, the findings are useful in that the analysis was performed across multiple specialties at a high-volume institution and may be applied to similar institutions. Another weakness is that the data cover only 1 year. Collecting data over a longer period could improve the precision and power of the analysis. Nonetheless, 1 year of data is a good starting point in identifying the issues and guiding the initiation of measures to address them. Last, we did not explore the reason for each instance of waste during the period reviewed. Knowing why implants were wasted would be helpful in developing strategies to reduce waste.

Our study results showed that, in 1 year, implant waste occurred in 12% of procedures that required an implant and accounted for 1.8% of total implant cost, representing a loss of more than $634,000. Other studies have quantified implant waste for selected procedures or single departments, but to our knowledge none has quantified implant waste for an entire orthopedic department or hospital. It is therefore difficult to compare our institutional results with other results. For instance, definitions of waste differ. A study that found waste in 20% of spine surgery cases22 included all intraoperative waste, whereas our 11% of spine cases reflected implant waste only. Similarly, though rates of implant waste in trauma cases differed significantly between a multi-institution study by Zywiel and colleagues24 (0.6%) and our institution (30%), their study excluded arthroplasty cases from the trauma subset and reported implant waste for a single vendor, whereas we included arthroplasty cases and a wide array of implant vendors. In addition, costs cannot be directly compared because, in our study, the implants wasted may have differed. Although the Trauma division had the highest incidence of waste (30%) in our analysis, it did not have the highest waste-related costs. Instead, the Adult Reconstruction division, with waste in 8% of cases, had the highest waste cost, $214,869. The cost difference is certainly the result of the difference in type of implants wasted. The implants most commonly wasted in the Trauma division were screws, which cost between $17 and $150; a single femoral stem, though wasted less often, cost significantly more, $2000 to $6000.

Our results showed a combined implant waste incidence of 6.8% for primary THA and primary TKA cases over the year. In their multi-institution study, Zywiel and colleagues19 reported a combined incidence of implant waste in 2% of THA and TKA cases. The difference is that Zywiel and colleagues19 reported data from a single implant vendor and included revision surgeries, hip hemiarthroplasties, and unicondylar knee arthroplasties. Another study reported implant waste in 5.7% of all TKA cases but did not specify whether revision or unicondylar arthroplasties were included.25 For lumbar spinal fusion, we found an implant waste incidence of 14%. Given the lack of studies in this area, we cannot make a comparison of results.

To our knowledge, there has been no other study of the effects of case volume and years in surgical practice on implant waste. Our analysis showed that waste incidence was not related to surgeon case volume but was related to years in surgical practice. Incidence of waste was significantly lower among surgeons practicing 20 years or more than among surgeons practicing fewer than 10 years. The difference may reflect the fact that case volume during a single year is not fully indicative of a surgeon’s lifetime case volume. For example, several surgeons with many years of experience and a significant lifetime case volume had an annual case volume in the lower 25% of the department because they were approaching retirement or had only recently joined the institution. More rigorous prospective studies are needed to further understand this relationship.

Conclusions

Our study demonstrated significant costs related to implant waste. These costs are important to consider not only for traditional cases, such as total joint and spine procedures, in which implant costs are routinely scrutinized, but for all subspecialties, such as sports medicine, in which the majority of cases are performed on an outpatient basis. Considering the estimated $36 million wasted during THAs and TKAs and $126 million wasted on spine surgeries in the United States annually, and the significant waste we observed in other orthopedic subspecialties, decreasing the rate of intraoperative waste during orthopedic surgeries represents another area that could provide significant cost reduction through implant cost savings.19,22 A few successful programs have been reported. Soroceanu and colleagues22 found an almost 50% decrease in intraoperative waste during spine surgery after an educational program was used to address such waste. Elsewhere, use of a computer-based system (e.Label and Compatibility) led to an estimated cost reduction of $75,000 in implant waste.25 Efforts to develop and implement other programs to reduce implant waste are needed and should be part of any orthopedic operating room cost reduction strategy.

References

1.    Alhassani A, Chandra A, Chernew ME. The sources of the SGR “hole.” N Engl J Med. 2012;366(4):289-291.

2.    Hariri S, Bozic KJ, Lavernia C, Prestipino A, Rubash HE. Medicare physician reimbursement: past, present, and future. J Bone Joint Surg Am. 2007;89(11):2536-2546.

3.    Keehan SP, Sisko AM, Truffer CJ, et al. National health spending projections through 2020: economic recovery and reform drive faster spending growth. Health Aff. 2011;30(8):1594-1605.

4.    Porter ME. What is value in health care? N Engl J Med. 2010;363(26):2477-2481.

5.    Belatti DA, Phisitkul P. Trends in orthopedics: an analysis of Medicare claims, 2000–2010. Orthopedics. 2013;36(3):e366-e372.

6.    Kurtz S, Ong K, Lau E, Mowat F, Halpern M. Projections of primary and revision hip and knee arthroplasty in the United States from 2005 to 2030. J Bone Joint Surg Am. 2007;89(4):780-785.

7.    Lavernia CJ, Hernandez VH, Rossi MD. Payment analysis of total hip replacement. Curr Opin Orthop. 2007;18(5):23-27.

8.    Mendenhall S. 2003 hip and knee implant review. Orthop Network News. 2003;14(3):2.

9.    Mendenhall S. 2008 hip and knee implant review. Orthop Network News. 2008;19(3):20.

10. Healy WL, Rana AJ, Iorio R. Hospital economics of primary total knee arthroplasty at a teaching hospital. Clin Orthop Relat Res. 2011;469(1):87-94.

11. Mendenhall S. 2007 hip and knee implant review. Orthop Network News. 2007;18(3):16.

12. Iorio R, Davis CM 3rd, Healy WL, Fehring TK, O’Connor MI, York S. Impact of the economic downturn on adult reconstruction surgery: a survey of the American Association of Hip and Knee Surgeons. J Arthroplasty. 2010;25(7):1005-1014.

13. Healy WL, Iorio R, Ko J, Appleby D, Lemos DW. Impact of cost reduction programs on short-term patient outcome and hospital cost of total knee arthroplasty. J Bone Joint Surg Am. 2002;84(3):348-353.

14. Iorio R, Robb WJ, Healy WL, et al. Orthopaedic surgeon workforce and volume assessment for total hip and knee replacement in the United States: preparing for an epidemic. J Bone Joint Surg Am. 2008;90(7):1598-1605.

15. Rana AJ, Iorio R, Healy WL. Hospital economics of primary THA decreasing reimbursement and increasing cost, 1990 to 2008. Clin Orthop Relat Res. 2011;469(2):355-361.

16. Robinson JC, Pozen A, Tseng S, Bozic KJ. Variability in costs associated with total hip and knee replacement implants. J Bone Joint Surg Am. 2012;94(18):1693-1698.

17.  de Steiger RN, Miller LN, Davidson DC, Ryan P, Graves SE. Joint registry approach for identification of outlier prostheses. Acta Orthop. 2013;84(4):348-352.

18. Havelin LI, Fenstad AM, Salomonsson R, et al. The Nordic Arthroplasty Register Association: a unique collaboration between 3 national hip arthroplasty registries with 280,201 THRs. Acta Orthop. 2009;80(4):393-401.

19. Zywiel MG, Ulrich SD, Suda AJ, Duncan JL, McGrath MS, Mont MA. Incidence and cost of intraoperative waste of hip and knee arthroplasty implants. J Arthroplasty. 2010;25(4):558-562.

20. Kim P, Kurokawa R, Itoki K. Technical advancements and utilization of spine surgery—international disparities in trend-dynamics between Japan, Korea, and the USA. Neurol Med Chir. 2010;50(9):853-858.

21. Weinstein JN, Lurie JD, Olson PR, Bronner KK, Fisher ES. United States’ trends and regional variations in lumbar spine surgery: 1992–2003. Spine. 2006;31(23):2707-2714.

22. Soroceanu A, Canacari E, Brown E, Robinson A, McGuire KJ. Intraoperative waste in spine surgery: incidence, cost, and effectiveness of an educational program. Spine. 2011;36(19):E1270-E1273.

23. Bosco JA, Alvarado CM, Slover JD, Iorio R, Hutzler LH. Decreasing total joint implant costs and physician specific cost variation through negotiation. J Arthroplasty. 2014;29(4):678-680.

24. Zywiel MG, Delanois RE, McGrath MS, Ulrich SD, Duncan JL, Mont MA. Intraoperative waste of trauma implants: a cost burden to hospitals worth addressing? J Orthop Trauma. 2009;23(10):710-715.

25. Ast MP, Mayman DJ, Su EP, Gonzalez Della Valle AM, Parks ML, Haas SB. The reduction of implant-related errors and waste in total knee arthroplasty using a novel, computer based, e.Label and Compatibility system. J Arthroplasty. 2014;29(1):132-136.

Article PDF
Author and Disclosure Information

Ashley Payne, MSc, James Slover, MD, MS, Ifeoma Inneh, MPH, Lorraine Hutzler, BA, Richard Iorio, MD, and Joseph A. Bosco III, MD

Authors’ Disclosure Statement: The authors report no actual or potential conflict of interest in relation to this article.

Issue
The American Journal of Orthopedics - 44(12)
Publications
Topics
Page Number
554-560
Legacy Keywords
american journal of orthopedics, AJO, original study, study, orthopedic, implant, waste, implant waste, cost, total joint arthroplasty, TJA, arthroplasty, joint, total hip arthroplasty, THA, posterior lumbar spinal fusion, spine, total knee arthroplasty, TKA, hip, knee, practice management, payne, slover, inneh, hutzler, iorio, bosco
Sections
Author and Disclosure Information

Ashley Payne, MSc, James Slover, MD, MS, Ifeoma Inneh, MPH, Lorraine Hutzler, BA, Richard Iorio, MD, and Joseph A. Bosco III, MD

Authors’ Disclosure Statement: The authors report no actual or potential conflict of interest in relation to this article.

Author and Disclosure Information

Ashley Payne, MSc, James Slover, MD, MS, Ifeoma Inneh, MPH, Lorraine Hutzler, BA, Richard Iorio, MD, and Joseph A. Bosco III, MD

Authors’ Disclosure Statement: The authors report no actual or potential conflict of interest in relation to this article.

Article PDF
Article PDF

The cost of health care in the United States is increasing at an unsustainable rate.1-3 To decrease or even reverse this trend, we must decrease the cost of care without adversely affecting quality. Porter4 defined value as the quality of care divided by its cost. The economics of total joint arthroplasty (TJA) has received a great deal of attention because of both increasing demand and increasing cost.5-9 About 33% of all orthopedic surgeries and the majority of TJAs are paid for by Medicare.9 In recent years, the rate of reimbursement for orthopedic cases has steadily declined while the cost of implants has increased.3,10,11 Given the significant cost of implants, health care providers in some subspecialties have focused on implant costs as a potential area for cost reduction.12 For example, in TJA this has proved effective in reducing the overall cost, as has decreasing length of stay after surgery.8,10,13-16

With little evidence suggesting any specific orthopedic implant has outcomes superior to those of others, with the exception of select poorly performing outliers, we must increase value of care by lowering the cost when considering these devices.17,18 In addition, some experts have suggested that intraoperative waste is a significant factor in TJA cost, and it does contribute to the average implant cost for a TJA case.6,19 Using data collected from 72 institutions, Zywiel and colleagues19 estimated the annual cost of wasted hip and knee arthroplasty implants to be more than $36 million in the United States.

However, considering the aging US population, TJA is not the only orthopedic surgery with increased demand. An estimated 600,000 spine surgeries are performed each year in the United States.20 Between 1992 and 2003, Medicare spending for lumbar spinal fusion increased 500%.21 In addition, in a 15-month observational study of incidence of intraoperative waste in spine surgery, Soroceanu and colleagues22 reported waste occurring in 20% of spine procedures.

Although these studies have described implant waste in TJA and spine surgeries, little has been published on the cost of wasted implants in a center performing the full range of orthopedic procedures. In this article, we detail the implant waste costs incurred by surgeons for all orthopedic subspecialties at a single orthopedic specialty hospital over a 1-year period. Our study goals were to identify types of implants wasted, and incidence and cost of implant waste, for all total hip arthroplasties (THAs), total knee arthroplasties (TKAs), and lumbar spinal fusions performed at the hospital and to determine whether case volume or years in surgical practice affect the rate or cost of implants wasted.

Methods

We performed a retrospective economic analysis of 1 year of administrative implant data from our institution. Collected data were quantified and analyzed for factors that might explain any variance in implant waste among surgeons. We were granted exempt institutional review board status, as no patient information was involved in this study.

We reviewed the administrative implant data for the 12-month period beginning June 2012 and ending May 2013. For that period, number of cases in which an implant was used and number of cases in which an implant was wasted were recorded. For each instance of waste, type and cost of the wasted implant were entered into the administrative database. In addition, overall cost of implants for the year and cost of wasted implants were determined. Data were available for 81 surgeons across 8 orthopedic divisions (subspecialties). From this information, we determined percentage of cases in which waste occurred, percentage of total implant cost wasted, average cost of waste per case, and most commonly wasted implants. All 3 variables were also calculated for THAs, TKAs, and lumbar spinal fusion procedures.

Statistical Analysis

The data were analyzed to determine if surgeon case volume or years in surgical practice affected implant waste. All analyses were performed at department, division (subspecialty), and surgeon levels. Case volume was analyzed in 3 groups: top 25%, middle 50%, and lower 25%. Number of years in surgical practice was analyzed in 3 groups: fewer than 10 years, 10 to 19 years, and 20 years or more. Normality assumption of variables was tested using the Shapiro-Wilk test (P < .05). For between-group differences, 1-way analysis of variance and the Tukey honestly significant difference post hoc test were performed for variables with a normal distribution, and the Kruskal-Wallis and Mann-Whitney tests were performed for variables without a normal distribution.

For the subspecialty-level analyses, only the Adult Reconstruction, Sports Medicine, and Spine divisions were analyzed for the effect of case volume, and only the Sports Medicine and Spine divisions were analyzed for the effect of surgical experience, as the other divisions had too few surgeons for adequate grouping.

Data are presented as means with corresponding 95% confidence intervals (CIs). Categorical variables are presented as counts with percentages. All statistical analyses were performed with SPSS Version 21.0 (IBM SPSS) statistical software. Statistical significance was set at .05.

Results

During the 1-year period, 8954 department cases involved an implant of any type. Waste occurred in 12% (1072) of these cases. The rate ranged from 8% in the Adult Reconstruction division to 30% in the Trauma division (Table 1), and the rate for individual surgeons ranged from 3% to 100%, though the surgeon with 100% performed only 1 case, and the next highest rate was 50%.

Total implant cost for our hospital during the period was $34,340,607. Of that total cost, 1.8% ($634,668) was lost because of implant waste. Percentage of total implant cost wasted ranged from 1.6% in the Adult Reconstruction division to 4.7% in the Sports Medicine division (Table 1). Percentage of total implant cost wasted for individual surgeons ranged from 0.2% to 16.1%. Tables 2 and 3 list the most commonly wasted implants by count and cost, respectively.

When total cost of wasted implants was averaged over all implant cases performed during the period, the loss resulting from waste amounted to $71 per case for the department and ranged from $21 per case for the Hand division to $105 per case for the Pediatric division (Table 1). For individual surgeons, the loss ranged from $4 to $250 per case.

During the period studied, an implant was wasted in 9% (100) of the 1076 primary THAs performed, 4% (42) of the 1003 primary TKAs, and 14% (30) of the 217 lumbar spinal fusions (Tables 4, 5).

Analyzed by case volume, percentage of cases with waste showed no significant difference for the department (P = .46) or for the Adult Reconstruction (P = .83), Spine (P = .10), or Sports Medicine (P = .69) division. Analyzed by years in surgical practice, there was a significant difference for the department (P = .01) but not for the Adult Reconstruction (P = .12) or Spine (P = .14) division. The department difference resulted from a significant difference (P = .001; 95% CI, 1.112-17.408) between surgeons with fewer than 10 years of surgical practice, who wasted implants in 12.8% of their cases, and surgeons with 20 years or more of surgical practice, who wasted implants in 9% of their cases (Table 4).

For percentage of total implant cost wasted, there was no significant difference for the department (P = .83) or for the Adult Reconstruction (P = .29) or Spine (P = .41) division when analyzed by years in surgical practice. Analyzed by case volume, there was a significant difference for the Sports Medicine division (P = .004): Percentage of total implant cost wasted was significantly higher (P = .003; 95% CI, –12.61 to –2.97) for surgeons in the lower 25% of case volume (9.8%) than for surgeons in the middle 50% of case volume (3.5%) (Table 5). No other significant difference was found for the department (P = .52) or for the Adult Reconstruction (P = .69) or Spine (P = .45) division.

Analyzing by case volume and years in surgical practice, we found no significant difference for the department (case volume, P = .76; years in surgical practice, P = .07), the Adult Reconstruction division (case volume, P = .47; years in surgical practice, P = .78), the Spine division (case volume, P = .11; years in surgical practice, P = .15), or the Sports Medicine division (case volume, P = .08).

Selected Procedures

Total Hip Arthroplasty. Regarding variance by case volume and years in surgical practice, we found no significant difference for any variable analyzed: percentage of cases with waste (volume, P = .072; years in practice, P = .076), percentage of total implant cost wasted (volume, P = .074; years in practice, P = .12), cost of waste per case (volume, P = .075; years in practice, P = .32).

Total Knee Arthroplasty. Regarding variance by years in surgical practice, we found no significant difference for any variable analyzed: percentage of cases with waste (P = .38), percentage of total implant cost wasted (P = .50), cost of waste per case (P = .50). Regarding variance by volume, there was no significant difference for percentage of cases with waste (P = .70) or cost of waste per case (P = .05), but there was a significant difference for percentage of total implant cost wasted (P = .038). That difference was caused by an outlier: One surgeon in the lower 25% of case volume wasted an implant in the only TKA he performed that year. Removing this outlier eliminated the significance.

Posterior Lumbar Spinal Fusion. Regarding variance by case volume and years in surgical practice, we found no significant difference for any variable analyzed: percentage of cases with waste (volume, P = .36; years in surgical practice, P = .22), percentage of total implant cost wasted (volume, P = .33; years in surgical practice, P = .41), cost of waste per case (volume, P = .34; years in practice, P = .15).

Discussion

The steadily increasing demand for orthopedic surgeries and declining rates of reimbursement by Medicare and other insurance providers have led many hospitals to look for ways to control the cost of these surgeries. Reducing operating room costs, lowering implant prices, and shortening hospital stays have all proved successful.6,15,20,23 One area that has not been thoroughly explored is the cost burden of wasted implants. Our findings suggest implant waste contributes significantly to the cost of orthopedic surgeries.

One weakness of this study is that its data, though encompassing all orthopedic subspecialties and procedures, come from a single teaching institution and therefore may not be representative of all orthopedic departments across the United States. However, the findings are useful in that the analysis was performed across multiple specialties at a high-volume institution and may be applicable to similar institutions. Another weakness is that the data cover only 1 year; collecting data over a longer period would increase the sample size and the power of the analysis. Nonetheless, 1 year of data is a good starting point for identifying the issues and guiding the initiation of measures to address them. Last, we did not explore the reason for each instance of waste during the period reviewed. Knowing these reasons would be helpful in developing strategies to reduce implant waste.

Our study results showed that, in 1 year, implant waste occurred in 12% of procedures that required an implant and consumed 1.8% of total implant spending, a loss of more than $634,000. Other studies have quantified implant waste for selected procedures or single departments, but to our knowledge none has quantified implant waste for an entire orthopedic department or hospital. It is therefore difficult to compare our institutional results with other results. For instance, definitions of waste differ. A study that found waste in 20% of spine surgery cases22 included all intraoperative waste, whereas our 11% figure for spine cases reflects implant waste only. Similarly, though rates of implant waste in trauma cases differed markedly between a multi-institution study by Zywiel and colleagues24 (0.6%) and our institution (30%), their study excluded arthroplasty cases from the trauma subset and reported implant waste for a single vendor, whereas we included arthroplasty cases and a wide array of implant vendors. In addition, costs cannot be directly compared because the implants wasted may have differed. Although the Trauma division had the highest incidence of waste (30%) in our analysis, it did not have the highest waste-related costs. Instead, the Adult Reconstruction division, with waste in 8% of cases, had the highest waste cost, $214,869. The cost difference is most likely the result of the difference in the types of implants wasted. The implants most commonly wasted in the Trauma division were screws, which cost between $17 and $150, whereas a femoral stem, though wasted less often, costs considerably more, $2000 to $6000.
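
To illustrate with hypothetical numbers drawn only from these price ranges, a trauma service that wasted a $100 screw in 30 of 100 cases would lose about $3000, whereas an arthroplasty service that wasted a $4000 femoral stem in only 8 of 100 cases would lose roughly $32,000. A lower incidence of waste can therefore still produce a much larger dollar loss when the implants involved are expensive.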

Our results showed a combined implant waste incidence of 6.8% for primary THA and primary TKA cases over the year (142 of the 2079 combined cases). In their multi-institution study, Zywiel and colleagues19 reported a combined incidence of implant waste in 2% of THA and TKA cases. The difference may be explained in part by the fact that Zywiel and colleagues19 reported data from a single implant vendor and included revision surgeries, hip hemiarthroplasties, and unicondylar knee arthroplasties. Another study reported implant waste in 5.7% of all TKA cases but did not specify whether revision or unicondylar arthroplasties were included.25 For lumbar spinal fusion, we found an implant waste incidence of 14%. Given the lack of studies in this area, we cannot make a comparison of results.

To our knowledge, there has been no other study of the effects of case volume and years in surgical practice on implant waste. Our analysis showed that waste incidence was not related to surgeon case volume but was related to years in surgical practice. Incidence of waste was significantly lower among surgeons practicing 20 years or more than among surgeons practicing fewer than 10 years. The difference may reflect the fact that case volume during a single year is not fully indicative of a surgeon’s lifetime case volume. For example, several surgeons with many years of experience and a significant lifetime case volume had an annual case volume in the lower 25% of the department because they were approaching retirement or had only recently joined the institution. More rigorous prospective studies are needed to further understand this relationship.

Conclusions

Our study demonstrated significant costs related to implant waste. These costs are important to consider not only for traditional cases, such as total joint and spine procedures, in which implant costs are routinely scrutinized, but also for subspecialties, such as sports medicine, in which the majority of cases are performed on an outpatient basis. Considering the estimated $36 million wasted during THAs and TKAs and $126 million wasted on spine surgeries in the United States annually, and the significant waste we observed in other orthopedic subspecialties, decreasing intraoperative waste during orthopedic surgeries represents another opportunity for significant cost reduction through implant cost savings.19,22 A few successful programs have been reported. Soroceanu and colleagues22 found an almost 50% decrease in intraoperative waste during spine surgery after an educational program was used to address such waste. Elsewhere, use of a computer-based system (e.Label and Compatibility) led to an estimated cost reduction of $75,000 in implant waste.25 Efforts to develop and implement other programs to reduce implant waste are needed and should be part of any orthopedic operating room cost reduction strategy.

References

1.    Alhassani A, Chandra A, Chernew ME. The sources of the SGR “hole.” N Engl J Med. 2012;366(4):289-291.

2.    Hariri S, Bozic KJ, Lavernia C, Prestipino A, Rubash HE. Medicare physician reimbursement: past, present, and future. J Bone Joint Surg Am. 2007;89(11):2536-2546.

3.    Keehan SP, Sisko AM, Truffer CJ, et al. National health spending projections through 2020: economic recovery and reform drive faster spending growth. Health Aff. 2011;30(8):1594-1605.

4.    Porter ME. What is value in health care? N Engl J Med. 2010;363(26):2477-2481.

5.    Belatti DA, Phisitkul P. Trends in orthopedics: an analysis of Medicare claims, 2000–2010. Orthopedics. 2013;36(3):e366-e372.

6.    Kurtz S, Ong K, Lau E, Mowat F, Halpern M. Projections of primary and revision hip and knee arthroplasty in the United States from 2005 to 2030. J Bone Joint Surg Am. 2007;89(4):780-785.

7.    Lavernia CJ, Hernandez VH, Rossi MD. Payment analysis of total hip replacement. Curr Opin Orthop. 2007;18(5):23-27.

8.    Mendenhall S. 2003 hip and knee implant review. Orthop Network News. 2003;14(3):2.

9.    Mendenhall S. 2008 hip and knee implant review. Orthop Network News. 2008;19(3):20.

10. Healy WL, Rana AJ, Iorio R. Hospital economics of primary total knee arthroplasty at a teaching hospital. Clin Orthop Relat Res. 2011;469(1):87-94.

11. Mendenhall S. 2007 hip and knee implant review. Orthop Network News. 2007;18(3):16.

12. Iorio R, Davis CM 3rd, Healy WL, Fehring TK, O’Connor MI, York S. Impact of the economic downturn on adult reconstruction surgery: a survey of the American Association of Hip and Knee Surgeons. J Arthroplasty. 2010;25(7):1005-1014.

13. Healy WL, Iorio R, Ko J, Appleby D, Lemos DW. Impact of cost reduction programs on short-term patient outcome and hospital cost of total knee arthroplasty. J Bone Joint Surg Am. 2002;84(3):348-353.

14. Iorio R, Robb WJ, Healy WL, et al. Orthopaedic surgeon workforce and volume assessment for total hip and knee replacement in the United States: preparing for an epidemic. J Bone Joint Surg Am. 2008;90(7):1598-1605.

15. Rana AJ, Iorio R, Healy WL. Hospital economics of primary THA decreasing reimbursement and increasing cost, 1990 to 2008. Clin Orthop Relat Res. 2011;469(2):355-361.

16. Robinson JC, Pozen A, Tseng S, Bozic KJ. Variability in costs associated with total hip and knee replacement implants. J Bone Joint Surg Am. 2012;94(18):1693-1698.

17.  de Steiger RN, Miller LN, Davidson DC, Ryan P, Graves SE. Joint registry approach for identification of outlier prostheses. Acta Orthop. 2013;84(4):348-352.

18. Havelin LI, Fenstad AM, Salomonsson R, et al. The Nordic Arthroplasty Register Association: a unique collaboration between 3 national hip arthroplasty registries with 280,201 THRs. Acta Orthop. 2009;80(4):393-401.

19. Zywiel MG, Ulrich SD, Suda AJ, Duncan JL, McGrath MS, Mont MA. Incidence and cost of intraoperative waste of hip and knee arthroplasty implants. J Arthroplasty. 2010;25(4):558-562.

20. Kim P, Kurokawa R, Itoki K. Technical advancements and utilization of spine surgery—international disparities in trend-dynamics between Japan, Korea, and the USA. Neurol Med Chir. 2010;50(9):853-858.

21. Weinstein JN, Lurie JD, Olson PR, Bronner KK, Fisher ES. United States’ trends and regional variations in lumbar spine surgery: 1992–2003. Spine. 2006;31(23):2707-2714.

22. Soroceanu A, Canacari E, Brown E, Robinson A, McGuire KJ. Intraoperative waste in spine surgery: incidence, cost, and effectiveness of an educational program. Spine. 2011;36(19):E1270-E1273.

23. Bosco JA, Alvarado CM, Slover JD, Iorio R, Hutzler LH. Decreasing total joint implant costs and physician specific cost variation through negotiation. J Arthroplasty. 2014;29(4):678-680.

24. Zywiel MG, Delanois RE, McGrath MS, Ulrich SD, Duncan JL, Mont MA. Intraoperative waste of trauma implants: a cost burden to hospitals worth addressing? J Orthop Trauma. 2009;23(10):710-715.

25. Ast MP, Mayman DJ, Su EP, Gonzalez Della Valle AM, Parks ML, Haas SB. The reduction of implant-related errors and waste in total knee arthroplasty using a novel, computer based, e.Label and Compatibility system. J Arthroplasty. 2014;29(1):132-136.

Technique Using Isoelastic Tension Band for Treatment of Olecranon Fractures

Olecranon fractures are relatively common in adults and constitute 10% of all upper extremity injuries.1,2 An olecranon fracture may be sustained either directly (from blunt trauma or a fall onto the tip of the elbow) or indirectly (as a result of forceful hyperextension of the triceps during a fall onto an outstretched arm). Displaced olecranon fractures with extensor discontinuity require reduction and stabilization. One treatment option is tension band wiring (TBW), which is used to manage noncomminuted fractures.3 TBW, first described by Weber and Vasey4 in 1963, involves transforming the distractive forces of the triceps into dynamic compression forces across the olecranon articular surface using 2 intramedullary Kirschner wires (K-wires) and stainless steel wires looped in figure-of-8 fashion.

Various modifications of the TBW technique of Weber and Vasey4 have been proposed to reduce the frequency of complications. These modifications include substituting screws for K-wires, angling the K-wires into the anterior coronoid cortex, altering the loop configuration of the stainless steel wire, using double knots and twisting procedures to finalize fixation, and using alternative materials for the loop construct.5-8 In the literature and in our experience, patients often complain after surgery about prominent K-wires and the twisted knots used to tension the construct.9-12 Surgeons also must contend with the technical difficulties of positioning the brittle wire without kinking and avoiding slack while tensioning.

In this article, we report on the clinical outcomes of a series of 7 patients with olecranon fracture treated with a US Food and Drug Administration–approved novel isoelastic ultrahigh-molecular-weight polyethylene (UHMWPE) cerclage cable (Iso-Elastic Cerclage System, Kinamed).

Materials and Methods

Surgical Technique

The patient is arranged in a sloppy lateral position to allow access to the posterior elbow. A nonsterile tourniquet is placed on the upper arm, and the limb is sterilely prepared and draped in standard fashion. A posterolateral incision is made around the olecranon and extended proximally 6 cm and distally 6 cm along the subcutaneous border of the ulna. The fracture is visualized and comminution identified.

To provide anchorage for a pointed reduction clamp, the surgeon drills a 2.5-mm hole in the subcutaneous border of the ulnar shaft. The fracture is reduced in extension and the clamp affixed. The elbow is then flexed and the reduction confirmed visually and by imaging. After realignment of the articular surfaces, 2 longitudinal, parallel K-wires (diameter, 1.6-2.0 mm) are passed in an antegrade direction through the proximal olecranon within the medullary canal of the shaft. The proximal ends must not cross the cortex so they may fully capture the figure-of-8 cable during subsequent, final advancement, and the distal ends must not pierce the anterior cortex. A 2.5-mm transverse hole is created distal to the fracture in the dorsal aspect of the ulnar shaft, from medial to lateral, at 2 times the distance from the tip of the olecranon to the fracture site. This hole is expanded with a 3.5-mm drill bit, allowing both strands of the cable to be passed simultaneously from medial to lateral, creating the figure-of-8. The 3.5-mm hole represents about 20% of the overall width of the bone, which we have not found to create a significant stress riser in either laboratory or clinical tests of this construct. Proximally, the cables are placed on the periosteum of the olecranon but deep to the triceps tendon and adjacent to the K-wires. The locking clip is placed on the posterolateral aspect of the elbow joint in a location where it can be covered with local tissue for adequate padding. The cable is then threaded through the clamping bracket and tightened slowly and gradually with a tensioning device to a low torque level (Figure 1). At this stage, tension may be released to make any necessary adjustments. Last, the locking clip is deployed, securing the tension band in the clip, and the excess cable is trimmed with a scalpel. Softening and pliability of the cable during its insertion and tensioning should be noted.

The ends of the K-wires are now curved in a hook configuration. The tines of the hooks should be parallel to accommodate the cable, and then the triceps is sharply incised to bone. If the bone is hard, an awl is used to create a pilot hole so the hook may be impaled into bone while capturing the cable. Next, the triceps is closed over the pins, minimizing the potential for pin migration and backout. The 2 K-wires are left in place to keep the fragments in proper anatomical alignment during healing and to prevent displacement with elbow motion. Figure 2 is a schematic of the final construct, and Figure 3 shows the construct in a patient.

Reduction of the olecranon fracture is assessed by imaging in full extension to check for possible implant impingement. Last, we apply the previously harvested fracture callus to the fracture site. Layered closure is performed, and bulky soft dressings are applied. Postoperative immobilization with a splint is used. Gentle range-of-motion exercises begin in about 2 weeks and progress as pain allows.

A case example with preoperative and postoperative images taken at 3-month follow-up is provided in Figure 4. The entire surgical technique can be viewed in the Video.

Clinical Cases

Between July 2007 and February 2011, 7 patients with displaced olecranon fractures underwent osteosynthesis using the isoelastic tension band (Table 1). According to the Mayo classification system, 5 of these patients had type 2A fractures, 1 had a type 2B fracture with an ipsilateral nondisplaced radial neck fracture, and 1 had a type 3B fracture. There were 4 female and 3 male patients. The injury was on the dominant side in 3 patients. All patients gave informed consent to evaluation at subsequent office visits and completed outcomes questionnaires by mail several years after surgery. Mean follow-up at which the outcomes questionnaires were obtained was 3.3 years (range, 2.1-6.8 years). Exclusion criteria were age under 18 years, inability to provide informed consent, fracture patterns with extensive articular comminution, and open fractures. Permission to conduct this research was granted by the institutional review board.

At each visit, patients completed the Disabilities of the Arm, Shoulder, and Hand (DASH) functional outcome survey and were evaluated according to Broberg and Morrey’s elbow scoring system.13,14 Chart review consisted of evaluation of medical records, including radiographs and orthopedic physician notes in which preoperative examination was documented, mechanism of injury was noted, radiologic fracture pattern was evaluated, and time to bony union was recorded. Elbow motion was documented. Grip strength was measured with a calibrated Jamar dynamometer (Sammons Preston Rolyan) set at level 2, as delineated in Broberg and Morrey’s functional elbow scoring system.
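
For readers unfamiliar with how the 30-item DASH questionnaire is converted to a 0-to-100 disability score, the sketch below applies the instrument's published scoring rule: the mean of the answered items (each scored 1-5) is rescaled to 0-100, and the score is considered invalid when more than 3 items are unanswered. The responses shown are hypothetical and are not patient data.

from typing import Optional, Sequence

def dash_score(responses: Sequence[Optional[int]]) -> Optional[float]:
    # 30 DASH items scored 1-5; None marks an unanswered item.
    answered = [r for r in responses if r is not None]
    if len(responses) != 30 or len(answered) < 27:
        return None  # more than 3 missing items: score cannot be calculated
    mean_item = sum(answered) / len(answered)
    return (mean_item - 1.0) * 25.0  # rescale the 1-5 mean to a 0-100 score

# Hypothetical example: mostly "no difficulty" answers give a low disability score.
example = [1] * 24 + [2] * 4 + [None, 3]
print(round(dash_score(example), 1))  # about 5.2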

Results

The 7 patients were assessed clinically at a mean of 19 months after surgery and received a mean Broberg and Morrey score of good (92.2/100) (Table 2). Restoration of motion and strength was excellent; compared with the contralateral extremity, mean flexion arc was 96%, and mean forearm rotation was 96%. Grip strength was 99% of that of the noninjured side, perhaps the result of increased conditioning from physical therapy. Patients completed outcomes questionnaires at a mean of 3.3 years after surgery. Mean (SD) DASH score at this longest follow-up was 12.6 (17.2) (Table 2). Patients were satisfied (mean, 9.8/10; range, 9.5-10) and had little pain (mean, 0.8/10; range, 0-3). All fractures united, and there were no infections. One patient had a satisfactory union with complete restoration of motion and continued to play sports avocationally but developed pain over the locking clip 5 years after the index procedure and decided to have the implant removed. He had no radiographic evidence of K-wire or implant migration. Another patient had a minor degree of implant irritation at longest follow-up but did not request hardware removal.

Discussion

Stainless steel wire is often used in TBW because of its widespread availability, low cost, lack of immunogenicity, and relative strength.7 However, stainless steel wire has several disadvantages. It is susceptible to low-cycle fatigue failure, and fatigue strength may be seriously reduced secondary to incidental trauma to the wire on implantation.15,16 Other complications are kinking, skin irritation, implant prominence, fixation loss caused by wire loosening, and inadequate initial reduction potentially requiring revision.10,12,17-21

Isoelastic cable is a new type of cerclage cable that consists of UHMWPE strands braided over a nylon core. The particular property profile of the isoelastic tension band gives the cable intrinsic elastic and pliable qualities. In addition, unlike stainless steel, the band maintains a uniform, continuous compression force across a fracture site.22 Multifilament braided cables fatigue and fray, but the isoelastic cerclage cable showed no evidence of fraying or breakage after 1 million loading cycles.22,23 Compared with metal wire or braided metal cable, the band also has higher fatigue strength and higher ultimate tensile strength.7 Furthermore, the cable is less abrasive than stainless steel, so theoretically it is less irritating to surrounding subcutaneous tissue. Last, the pliability of the band allows the surgeon to create multiple loops of cable without the wire-failure side effects related to kinking, which is common with the metal construct.

In 2010, Ting and colleagues24 retrospectively studied implant failure complications associated with use of isoelastic cerclage cables in the treatment of periprosthetic fractures in total hip arthroplasty. They reported a breakage rate of 0% and noted that previously published breakage data for metallic cerclage devices ranged from 0% to 44%. They concluded that isoelastic cables were not associated with material failure, and there were no direct complications related to the cables. Similarly, Edwards and colleagues25 evaluated the same type of cable used in revision shoulder arthroplasty and reported excellent success and no failures. Although these data stem from use in the femur and humerus, we think the noted benefits apply to fractures of the elbow as well, as we observed a similar breakage rate (0%).

Various studies have addressed the clinical complaints and reoperation rates associated with retained metal implants after olecranon fixation. The traditional AO (Arbeitsgemeinschaft für Osteosynthesefragen) technique involves subcutaneous placement of stainless steel wires, which often results in tissue irritation. Reoperation rates as high as 80% have been reported, and a proportion of implant removals may in fact be caused by factors related to the subcutaneous placement of the metallic implants rather than K-wire migration alone.5,12,18 A nonmetallic isoelastic tension band can provide a more comfortable and less irritating implant, which could reduce the need for secondary intervention related to a painful subcutaneous implant. One of our 7 patients had a symptomatic implant removed 5 years after surgery. This patient complained of pain over the area of the tension band device clip, so after fracture healing the entire fixation device was removed in the operating room. If reoperation is necessary, removal of intramedullary K-wires is relatively simple using a minimal incision; removal of stainless steel TBW may require a larger approach if the twisted knots cannot be easily retrieved.

A study of compression forces created by stainless steel wire demonstrated that a “finely tuned mechanical sense” was needed to produce optimal fixation compression with this material.26 It was observed that a submaximal twist created insufficient compressive force, while an ostensibly minimal increase in twisting force above the optimum abruptly caused wire failure through breakage. Cerclage cables using clasping devices, such as the current isoelastic cerclage cable, were superior in ease of application. Furthermore, a clasping device allows cable tension readjustment and objective cable tensioning, neither of which is possible with stainless steel wire, and it spares the surgeon from having to bury a twisted knot. Last, the tensioning device is titratable, allowing the surgeon to set the construct at a predetermined quantitative tension, which is of benefit in patients with osteopenia.

One limitation of this study is that it did not resolve the potential for K-wire migration, and we agree with previous recommendations that careful attention to surgical technique may avoid this complication.10 In addition, the sample was small and the study lacked a control group; a larger sample and a control group would have increased the study’s power. Nevertheless, the physical and functional outcomes associated with use of this technique were excellent. These results support the goal of decreasing secondary surgery rates and provide proof of concept that the isoelastic tension band may be used as an alternative to stainless steel wire in the TBW of displaced olecranon fractures with minimal or no comminution.

Conclusion

This easily reproducible technique for use of an isoelastic tension band in olecranon fracture fixation was associated with excellent physical and functional outcomes in a series of 7 patients. The rate of secondary intervention in these patients compares favorably with rates reported for wire tension band fixation. Although more rigorous study of this device is needed, we think it is a promising alternative to wire tension band techniques.

References

1.    Rommens PM, Küchle R, Schneider RU, Reuter M. Olecranon fractures in adults: factors influencing outcome. Injury. 2004;35(11):1149-1157.

2.    Veillette CJ, Steinmann SP. Olecranon fractures. Orthop Clin North Am. 2008;39(2):229-236.

3.    Newman SD, Mauffrey C, Krikler S. Olecranon fractures. Injury. 2009;40(6):575-581.

4.    Weber BG, Vasey H. Osteosynthesis in olecranon fractures [in German]. Z Unfallmed Berufskr. 1963;56:90-96.

5.    Netz P, Strömberg L. Non-sliding pins in traction absorbing wiring of fractures: a modified technique. Acta Orthop Scand. 1982;53(3):355-360.

6.    Prayson MJ, Williams JL, Marshall MP, Scilaris TA, Lingenfelter EJ. Biomechanical comparison of fixation methods in transverse olecranon fractures: a cadaveric study. J Orthop Trauma. 1997;11(8):565-572.

7.    Rothaug PG, Boston RC, Richardson DW, Nunamaker DM. A comparison of ultra-high-molecular weight polyethylene cable and stainless steel wire using two fixation techniques for repair of equine midbody sesamoid fractures: an in vitro biomechanical study. Vet Surg. 2002;31(5):445-454.

8.    Harrell RM, Tong J, Weinhold PS, Dahners LE. Comparison of the mechanical properties of different tension band materials and suture techniques. J Orthop Trauma. 2003;17(2):119-122.

9.    Nimura A, Nakagawa T, Wakabayashi Y, Sekiya I, Okawa A, Muneta T. Repair of olecranon fractures using FiberWire without metallic implants: report of two cases. J Orthop Surg Res. 2010;5:73.

10.  Macko D, Szabo RM. Complications of tension-band wiring of olecranon fractures. J Bone Joint Surg Am. 1985;67(9):1396-1401.

11.  Helm RH, Hornby R, Miller SW. The complications of surgical treatment of displaced fractures of the olecranon. Injury. 1987;18(1):48-50.

12.  Romero JM, Miran A, Jensen CH. Complications and re-operation rate after tension-band wiring of olecranon fractures. J Orthop Sci. 2000;5(4):318-320.

13.  Beaton DE, Katz JN, Fossel AH, Wright JG, Tarasuk V, Bombardier C. Measuring the whole or the parts? Validity, reliability, and responsiveness of the Disabilities of the Arm, Shoulder and Hand outcome measure in different regions of the upper extremity. J Hand Ther. 2001;14(2):128-146.

14.  Broberg MA, Morrey BF. Results of delayed excision of the radial head after fracture. J Bone Joint Surg Am. 1986;68(5):669-674.

15.  Bostrom MP, Asnis SE, Ernberg JJ, et al. Fatigue testing of cerclage stainless steel wire fixation. J Orthop Trauma. 1994;8(5):422-428.

16.  Oh I, Sander TW, Treharne RW. The fatigue resistance of orthopaedic wire. Clin Orthop Relat Res. 1985;(192):228-236.

17.  Amstutz HC, Maki S. Complications of trochanteric osteotomy in total hip replacement. J Bone Joint Surg Am. 1978;60(2):214-216.

18.  Jensen CM, Olsen BB. Drawbacks of traction-absorbing wiring (TAW) in displaced fractures of the olecranon. Injury. 1986;17(3):174-175.

19.  Kumar G, Mereddy PK, Hakkalamani S, Donnachie NJ. Implant removal following surgical stabilization of patella fracture. Orthopedics. 2010;33(5).

20.  Hume MC, Wiss DA. Olecranon fractures. A clinical and radiographic comparison of tension band wiring and plate fixation. Clin Orthop Relat Res. 1992;(285):229-235.

21.  Wolfgang G, Burke F, Bush D, et al. Surgical treatment of displaced olecranon fractures by tension band wiring technique. Clin Orthop Relat Res. 1987;(224):192-204.

22.  Sarin VK, Mattchen TM, Hack B. A novel iso-elastic cerclage cable for treatment of fractures. Paper presented at: Annual Meeting of the Orthopaedic Research Society; February 20-23, 2005; Washington, DC. Paper 739.

23.  Silverton CD, Jacobs JJ, Rosenberg AG, Kull L, Conley A, Galante JO. Complications of a cable grip system. J Arthroplasty. 1996;11(4):400-404.

24.  Ting NT, Wera GD, Levine BR, Della Valle CJ. Early experience with a novel nonmetallic cable in reconstructive hip surgery. Clin Orthop Relat Res. 2010;468(9):2382-2386.

25.  Edwards TB, Stuart KD, Trappey GJ, O’Connor DP, Sarin VK. Utility of polymer cerclage cables in revision shoulder arthroplasty. Orthopedics. 2011;34(4).

26.  Shaw JA, Daubert HB. Compression capability of cerclage fixation systems. A biomechanical study. Orthopedics. 1988;11(8):1169-1174.

Author and Disclosure Information

Rebecca A. Rajfer, MD, Jonathan R. Danoff, MD, Kiran S. Yemul, MHS, Ioannis Zouzias, MD, and Melvin P. Rosenwasser, MD

Authors’ Disclosure Statement: Kinamed funded production of the technique video.

Olecranon fractures are relatively common in adults and constitute 10% of all upper extremity injuries.1,2 An olecranon fracture may be sustained either directly (from blunt trauma or a fall onto the tip of the elbow) or indirectly (as a result of forceful hyperextension of the triceps during a fall onto an outstretched arm). Displaced olecranon fractures with extensor discontinuity require reduction and stabilization. One treatment option is tension band wiring (TBW), which is used to manage noncomminuted fractures.3 TBW, first described by Weber and Vasey4 in 1963, involves transforming the distractive forces of the triceps into dynamic compression forces across the olecranon articular surface using 2 intramedullary Kirschner wires (K-wires) and stainless steel wires looped in figure-of-8 fashion.

Various modifications of the TBW technique of Weber and Vasey4 have been proposed to reduce the frequency of complications. These modifications include substituting screws for K-wires, angling the K-wires into the anterior coronoid cortex, altering the loop configuration of the stainless steel wire, using double knots and twisting techniques to finalize fixation, and using alternative materials for the loop construct.5-8 In the literature and in our experience, patients often complain after surgery about prominent K-wires and the twisted knots used to tension the construct.9-12 Surgeons must also contend with the technical difficulty of positioning the relatively brittle wire without kinking it and of avoiding slack while tensioning.

In this article, we report the clinical outcomes of a series of 7 patients with olecranon fractures treated with a novel, US Food and Drug Administration–approved isoelastic ultrahigh-molecular-weight polyethylene (UHMWPE) cerclage cable (Iso-Elastic Cerclage System, Kinamed).

Materials and Methods

Surgical Technique

The patient is arranged in a sloppy lateral position to allow access to the posterior elbow. A nonsterile tourniquet is placed on the upper arm, and the limb is sterilely prepared and draped in standard fashion. A posterolateral incision is made around the olecranon and extended proximally 6 cm and distally 6 cm along the subcutaneous border of the ulna. The fracture is visualized and comminution identified.

To provide anchorage for a pointed reduction clamp, the surgeon drills a 2.5-mm hole in the subcutaneous border of the ulnar shaft. The fracture is reduced in extension and the clamp affixed. The elbow is then flexed and the reduction confirmed visually and by imaging. After realignment of the articular surfaces, 2 longitudinal, parallel K-wires (diameter, 1.6-2.0 mm) are passed in an antegrade direction through the proximal olecranon within the medullary canal of the shaft. The proximal ends must not cross the cortex so they may fully capture the figure-of-8 wire during subsequent, final advancement, and the distal ends must not pierce the anterior cortex. A 2.5-mm transverse hole is created distal to the fracture in the dorsal aspect of the ulnar shaft, from medial to lateral, at 2 times the distance from the tip of the olecranon to the fracture site (for example, a fracture line 15 mm from the olecranon tip would call for a transverse hole about 30 mm distal to the tip). This hole is expanded with a 3.5-mm drill bit, allowing both strands of the cable to be passed simultaneously from medial to lateral, making the figure-of-8. The 3.5-mm hole represents about 20% of the overall width of the bone, which we have not found to create a significant stress riser in either laboratory or clinical tests of this construct. Proximally, the cables are placed on the periosteum of the olecranon but deep to the triceps tendon and adjacent to the K-wires. The locking clip is placed on the posterolateral aspect of the elbow joint in a location where it can be covered with local tissue for adequate padding. The cable is then threaded through the clamping bracket and tightened slowly and gradually with a tensioning device to a low torque level (Figure 1). At this stage, tension may be released to make any necessary adjustments. Last, the locking clip is deployed, securing the tension band in the clip, and the excess cable is trimmed with a scalpel. The surgeon should note that the cable softens and becomes more pliable during insertion and tensioning.

The ends of the K-wires are now curved in a hook configuration. The tines of the hooks should be parallel to accommodate the cable, and then the triceps is sharply incised to bone. If the bone is hard, an awl is used to create a pilot hole so the hook may be impaled into bone while capturing the cable. Next, the triceps is closed over the pins, minimizing the potential for pin migration and backout. The 2 K-wires are left in place to keep the fragments in proper anatomical alignment during healing and to prevent displacement with elbow motion. Figure 2 is a schematic of the final construct, and Figure 3 shows the construct in a patient.

Reduction of the olecranon fracture is assessed by imaging in full extension to check for possible implant impingement. Last, we apply the previously harvested fracture callus to the fracture site. Layered closure is performed, and bulky soft dressings are applied. Postoperative immobilization with a splint is used. Gentle range-of-motion exercises begin in about 2 weeks and progress as pain allows.

A case example with preoperative and postoperative images taken at 3-month follow-up is provided in Figure 4. The entire surgical technique can be viewed in the Video.


Clinical Cases

Between July 2007 and February 2011, 7 patients with displaced olecranon fractures underwent osteosynthesis using the isoelastic tension band (Table 1). According to the Mayo classification system, 5 of these patients had type 2A fractures, 1 had a type 2B fracture with an ipsilateral nondisplaced radial neck fracture, and 1 had a type 3B fracture. There were 4 female and 3 male patients. The injury was on the dominant side in 3 patients. All patients gave informed consent to evaluation at subsequent office visits and completed outcomes questionnaires by mail several years after surgery. Mean follow-up at which the outcome questionnaires were obtained was 3.3 years (range, 2.1-6.8 years). Exclusion criteria were age under 18 years, inability to provide informed consent, fracture patterns with extensive articular comminution, and open fractures. Permission to conduct this research was granted by the institutional review board.

At each visit, patients completed the Disabilities of the Arm, Shoulder, and Hand (DASH) functional outcome survey and were evaluated according to Broberg and Morrey’s elbow scoring system.13,14 Chart review consisted of evaluation of medical records, including radiographs and orthopedic physician notes in which preoperative examination was documented, mechanism of injury was noted, radiologic fracture pattern was evaluated, and time to bony union was recorded. Elbow motion was documented. Grip strength was measured with a calibrated Jamar dynamometer (Sammons Preston Rolyan) set at level 2, as delineated in Broberg and Morrey’s functional elbow scoring system.

Results

The 7 patients were assessed at a mean final follow-up of 19 months after surgery and received a mean Broberg and Morrey score of good (92.2/100) (Table 2). Restoration of motion and strength was excellent; compared with the contralateral extremity, mean flexion arc was 96%, and mean forearm rotation was 96%. Grip strength was 99% of that of the noninjured side, perhaps the result of increased conditioning from physical therapy. Patients completed outcomes questionnaires at a mean of 3.3 years after surgery. Mean (SD) DASH score at this longest follow-up was 12.6 (17.2) (Table 2). Patients were satisfied (mean, 9.8/10; range, 9.5-10) and had little pain (mean, 0.8/10; range, 0-3). All fractures united, and there were no infections. One patient had a satisfactory union with complete restoration of motion and continued to play sports avocationally but developed pain over the locking clip 5 years after the index procedure and elected to have the implant removed. He had no radiographic evidence of K-wire or implant migration. Another patient had a minor degree of implant irritation at longest follow-up but did not request hardware removal.

Discussion

Stainless steel wire is often used in TBW because of its widespread availability, low cost, lack of immunogenicity, and relative strength.7 However, stainless steel wire has several disadvantages. It is susceptible to low-cycle fatigue failure, and its fatigue strength may be seriously reduced by incidental trauma to the wire during implantation.15,16 Other complications include kinking, skin irritation, implant prominence, fixation loss caused by wire loosening, and inadequate initial reduction potentially requiring revision.10,12,17-21

Isoelastic cable is a new type of cerclage cable consisting of UHMWPE strands braided over a nylon core. This construction gives the cable intrinsically elastic and pliable qualities. In addition, unlike stainless steel wire, the band maintains a uniform, continuous compression force across the fracture site.22 Braided multifilament metal cables can fatigue and fray, but the isoelastic cerclage cable showed no evidence of fraying or breakage after 1 million loading cycles.22,23 Compared with metal wire or braided metal cable, the band also has higher fatigue strength and higher ultimate tensile strength.7 Furthermore, the cable is less abrasive than stainless steel, so theoretically it is less irritating to surrounding subcutaneous tissue. Last, the pliability of the band allows the surgeon to create multiple loops of cable without the kink-related wire failure common to metal constructs.

In 2010, Ting and colleagues24 retrospectively studied implant failure complications associated with use of isoelastic cerclage cables in the treatment of periprosthetic fractures in total hip arthroplasty. They reported a breakage rate of 0% and noted that previously published breakage data for metallic cerclage devices ranged from 0% to 44%. They concluded that isoelastic cables were not associated with material failure, and there were no direct complications related to the cables. Similarly, Edwards and colleagues25 evaluated the same type of cable used in revision shoulder arthroplasty and reported excellent success and no failures. Although these data stem from use in the femur and humerus, we think the noted benefits apply to fractures of the elbow as well, as we observed a similar breakage rate (0%).


Various studies have addressed the clinical complaints and reoperation rates associated with retained metal implants after olecranon fixation. Traditional AO (Arbeitsgemeinschaft für Osteosynthesefragen) technique involves subcutaneous placement of stainless steel wires, which often results in tissue irritation. Reoperation rates as high as 80% have been reported, and a proportion of implant removals may in fact be caused by factors related to the subcutaneous placement of the metallic implants rather than K-wire migration alone.5,12,18 A nonmetallic isoelastic tension band can provide a more comfortable and less irritating implant, which could reduce the need for secondary intervention related to a painful subcutaneous implant. One of our 7 patients had a symptomatic implant removed 5 years after surgery. This patient complained of pain over the area of the tension band device clip, so after fracture healing the entire fixation device was removed in the operating room. If reoperation is necessary, removal of intramedullary K-wires is relatively simple through a minimal incision; removal of stainless steel TBW may require a larger approach if the twisted knots cannot be easily retrieved.

A study of the compression forces created by stainless steel wire demonstrated that a “finely tuned mechanical sense” was needed to produce optimal fixation compression.26 A submaximal twist created insufficient compressive force, whereas a seemingly minimal increase in twisting force above the optimum abruptly caused wire failure through breakage. Cerclage cables that use clasping devices, such as the current isoelastic cerclage cable, were superior in ease of application. A clasping device also allows cable tension to be readjusted and objectively set, which is not possible with stainless steel wire, and spares the surgeon from having to bury a twisted stainless steel knot. Last, the tensioning device is titratable, allowing the surgeon to set the construct at a predetermined quantitative tension, which is of benefit in patients with osteopenia.

One limitation of this study is that it did not resolve the potential for K-wire migration; we agree with previous recommendations that careful attention to surgical technique may avoid this complication.10 In addition, the sample was small and there was no control group; a larger sample and a control group would have increased the statistical power of the study. Nevertheless, the physical and functional outcomes associated with this technique were excellent. These results represent a promising step toward decreasing secondary surgery rates and provide proof of concept that the isoelastic tension band may be used as an alternative to stainless steel wire in the TBW of displaced olecranon fractures with minimal or no comminution.

Conclusion

This easily reproducible technique for use of an isoelastic tension band in olecranon fracture fixation was associated with excellent physical and functional outcomes in a series of 7 patients. The rate of secondary intervention in these patients compares favorably with rates reported for wire tension band fixation. Although more rigorous study of this device is needed, we think it is a promising alternative to wire tension band techniques.

References

1.    Rommens PM, Küchle R, Schneider RU, Reuter M. Olecranon fractures in adults: factors influencing outcome. Injury. 2004;35(11):1149-1157.

2.    Veillette CJ, Steinmann SP. Olecranon fractures. Orthop Clin North Am. 2008;39(2):229-236.

3.    Newman SD, Mauffrey C, Krikler S. Olecranon fractures. Injury. 2009;40(6):575-581.

4.    Weber BG, Vasey H. Osteosynthesis in olecranon fractures [in German]. Z Unfallmed Berufskr. 1963;56:90-96.

5.    Netz P, Strömberg L. Non-sliding pins in traction absorbing wiring of fractures: a modified technique. Acta Orthop Scand. 1982;53(3):355-360.

6.    Prayson MJ, Williams JL, Marshall MP, Scilaris TA, Lingenfelter EJ. Biomechanical comparison of fixation methods in transverse olecranon fractures: a cadaveric study. J Orthop Trauma. 1997;11(8):565-572.

7.    Rothaug PG, Boston RC, Richardson DW, Nunamaker DM. A comparison of ultra-high-molecular weight polyethylene cable and stainless steel wire using two fixation techniques for repair of equine midbody sesamoid fractures: an in vitro biomechanical study. Vet Surg. 2002;31(5):445-454.

8.    Harrell RM, Tong J, Weinhold PS, Dahners LE. Comparison of the mechanical properties of different tension band materials and suture techniques. J Orthop Trauma. 2003;17(2):119-122.

9.    Nimura A, Nakagawa T, Wakabayashi Y, Sekiya I, Okawa A, Muneta T. Repair of olecranon fractures using FiberWire without metallic implants: report of two cases. J Orthop Surg Res. 2010;5:73.

10.  Macko D, Szabo RM. Complications of tension-band wiring of olecranon fractures. J Bone Joint Surg Am. 1985;67(9):1396-1401.

11.  Helm RH, Hornby R, Miller SW. The complications of surgical treatment of displaced fractures of the olecranon. Injury. 1987;18(1):48-50.

12.  Romero JM, Miran A, Jensen CH. Complications and re-operation rate after tension-band wiring of olecranon fractures. J Orthop Sci. 2000;5(4):318-320.

13.  Beaton DE, Katz JN, Fossel AH, Wright JG, Tarasuk V, Bombardier C. Measuring the whole or the parts? Validity, reliability, and responsiveness of the Disabilities of the Arm, Shoulder and Hand outcome measure in different regions of the upper extremity. J Hand Ther. 2001;14(2):128-146.

14.  Broberg MA, Morrey BF. Results of delayed excision of the radial head after fracture. J Bone Joint Surg Am. 1986;68(5):669-674.

15.  Bostrom MP, Asnis SE, Ernberg JJ, et al. Fatigue testing of cerclage stainless steel wire fixation. J Orthop Trauma. 1994;8(5):422-428.

16.  Oh I, Sander TW, Treharne RW. The fatigue resistance of orthopaedic wire. Clin Orthop Relat Res. 1985;(192):228-236.

17.  Amstutz HC, Maki S. Complications of trochanteric osteotomy in total hip replacement. J Bone Joint Surg Am. 1978;60(2):214-216.

18.  Jensen CM, Olsen BB. Drawbacks of traction-absorbing wiring (TAW) in displaced fractures of the olecranon. Injury. 1986;17(3):174-175.

19.  Kumar G, Mereddy PK, Hakkalamani S, Donnachie NJ. Implant removal following surgical stabilization of patella fracture. Orthopedics. 2010;33(5).

20.  Hume MC, Wiss DA. Olecranon fractures. A clinical and radiographic comparison of tension band wiring and plate fixation. Clin Orthop Relat Res. 1992;(285):229-235.

21.  Wolfgang G, Burke F, Bush D, et al. Surgical treatment of displaced olecranon fractures by tension band wiring technique. Clin Orthop Relat Res. 1987;(224):192-204.

22.  Sarin VK, Mattchen TM, Hack B. A novel iso-elastic cerclage cable for treatment of fractures. Paper presented at: Annual Meeting of the Orthopaedic Research Society; February 20-23, 2005; Washington, DC. Paper 739.

23.  Silverton CD, Jacobs JJ, Rosenberg AG, Kull L, Conley A, Galante JO. Complications of a cable grip system. J Arthroplasty. 1996;11(4):400-404.

24.  Ting NT, Wera GD, Levine BR, Della Valle CJ. Early experience with a novel nonmetallic cable in reconstructive hip surgery. Clin Orthop Relat Res. 2010;468(9):2382-2386.

25.  Edwards TB, Stuart KD, Trappey GJ, O’Connor DP, Sarin VK. Utility of polymer cerclage cables in revision shoulder arthroplasty. Orthopedics. 2011;34(4).

26.  Shaw JA, Daubert HB. Compression capability of cerclage fixation systems. A biomechanical study. Orthopedics. 1988;11(8):1169-1174.

Orthopedics in US Health Care

In the United States, the landscape of health care is changing. Health care reform and fluctuating political and economic climates have affected, and will continue to affect, the practice of orthopedic surgery. Demand for musculoskeletal care and the costs of providing this care are exceeding available resources, which has led to an evolution in how we practice, both as individuals and within the institutions where we provide care. Patient safety, quality, and value have become the outcomes of importance. Orthopedic surgeons, as experts in musculoskeletal care, must be a part of these changes. In this review, we offer perspective on the changing face of orthopedic surgery in the modern US health care system.

1. Meeting the demand

Musculoskeletal conditions represent one of the most common and costly health issues in the United States, affecting individuals medically and economically and compromising their quality of life.1,2 In 2008, more than 110 million US adults (1 in 2) reported having a musculoskeletal condition for more than 3 months, and almost 7% reported that a chronic musculoskeletal condition made routine activities of daily living significantly difficult.1 Some of the most common chronic conditions in the United States, including osteoarthritis and back pain, are musculoskeletal in origin.

Osteoarthritis is the leading cause of chronic pain and disability. Physician-diagnosed arthritis is expected to affect 25% of US adults by 2030,3 and in more than one-third of these patients arthritis limits work or other activity.4 Back pain is another of the most common debilitating conditions in the United States.3,5 St Sauver and colleagues6 found that back pain is the third most common condition (23.9%) that prompts patients to seek health care—following skin-related problems (42.7%) and osteoarthritis/joint pain (33.6%).

As life expectancy increases, so do expectations of enjoying higher levels of activity into the later years. Patients expect to be as active in their geriatric years as they were in middle age, and many are able to do so. Amid the growing obesity epidemic and increased incidence of chronic comorbidities, however, the aging population not only is at substantial risk for developing a chronic musculoskeletal disorder but may face new challenges in accessing care.

Although orthopedic surgeons specialize in treating musculoskeletal conditions, up to 90% of common nonsurgical musculoskeletal complaints are thought to be manageable in the primary care setting.7 With a disproportionate increase in musculoskeletal demand against a relatively constant number of orthopedic providers,8 it is becoming increasingly important for nonorthopedists to adequately manage musculoskeletal conditions. Physiatrists, rheumatologists, internists, family practitioners, and the expanding field of sports medicine specialists provide primary care of musculoskeletal conditions. To meet the growing demand and to ensure that patients receive quality, sustainable, effective, and efficient care, orthopedic surgeons should be actively involved in training these providers. As high as the cost of managing musculoskeletal conditions can be, it is far less than the cost resulting from inadequate or improper management. There is already justification for formal development of a specialization in nonoperative management of musculoskeletal care. Establishing this specialization requires a multidisciplinary approach, with orthopedic surgery taking a lead role.

2. The cost equation

As the prevalence of orthopedic conditions increases, so does the cost of delivering musculoskeletal care. The economic implications of meeting this growing demand are an important area of concern for our health care system. Steadily increasing hospital expenses for personnel and services, rising costs of pharmaceuticals and laboratory tests, constant evolution of costly technology, and insurance/reimbursement rates that do not keep pace with rising costs all contribute to the rapid escalation of the “cost of care.”

Health care expenditures accounted for 17.2% of the US gross domestic product (GDP) in 2012 and are expected to represent 19.3% by 2023.9 For musculoskeletal disease, direct costs alone are expected to approach $510 billion, equaling 5% of GDP and representing almost 30% of all health care expenditures. In Medicare patients, osteoarthritis is the most expensive condition to treat overall, and 3 other musculoskeletal problems rank highly as well: femoral neck fractures (3rd), back pain (10th), and fractures of all types (16th).10 Clearly, musculoskeletal care is one of the most prevalent and expensive health conditions in the United States.

Among the direct costs of care that increase each year are the costs of technology, which is often considered synonymous with orthopedic care. New and more costly implants are commonly promoted in the absence of evidence supporting their use. However, the use of new implants and technology is being scrutinized in an effort to strike the proper cost–benefit balance.


To change the slope of the cost curve, orthopedic surgeons should utilize technological advances that are proven to be clinically significant and economically feasible and should avoid modest improvements with limited clinical benefit and higher price tags. Unfortunately, this approach is not being taken. Minor modifications of implant designs are often marketed as “new and improved” to justify increased costs, and these implants often gain widespread use. A few may prove to be clinically better, but most will be only comparable to older, less expensive designs, and some may end up being clinical failures, discovered at great cost to patients and the health care system.11,12

Orthopedic surgeons have an important role in this decision-making. We should strive for the best, most cost-effective outcomes for our patients and should reject new technology that does not clearly improve outcomes. At the least, new technology should be adopted only within a manufacturer-supported clinical trial designed to determine its superiority. Whether the improvement is in technique, implant design, or workflow efficiency, orthopedic surgeons must be actively involved in researching and developing the latest innovations and must help determine their prospective value by considering not only their potential clinical benefits but also their economic implications.

As the political and economic environment becomes more directed at cost containment and the sustainability of care, there has been a clear shift in focus to quality and value rather than volume, giving rise to the “value-based care” approach. The “value equation,” in which value equals quality divided by cost, requires a clear measure of outcomes and an equally clear understanding of costs. Delivering high-quality care in a cost-conscious environment is an approach that every orthopedic surgeon should adopt. Widespread adoption of the value-based strategy by hospital systems and insurance companies is resulting in a paradigm shift away from more traditional volume-based metrics and in favor of value-based metrics, including quality measures, patient-reported outcomes, Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) scores, and physician-specific outcome measures.

The new paradigm has brought the bundled payment initiative (BPI), a strategy included in the Patient Protection and Affordable Care Act. The philosophy behind the BPI model is for hospital systems and physicians to control costs while maintaining and improving the quality of care. Measured by patient metrics (eg, clinical outcomes, patient satisfaction) and hospital metrics (eg, readmission rates, cost of care), bundled payments reimburse hospitals on the basis of cost of an entire episode of care rather than on the basis of individual procedures and services. This approach provides incentives for both physicians and hospitals to promote value-based care while emphasizing coordination of care among all members of the health care team.

Providing the best possible care for our patients while holding our practice to the highest standards is a central tenet of the practice of orthopedic surgery and should be independent of reimbursement strategies. Thus, to increase the value of care, we must establish practice models and strategies to optimize cost-efficiency while improving outcomes. As explained by Porter and Teisberg,13 it is important to be conscientious about cost, but above all we must not allow quality of health care delivery to be compromised when trying to improve the “value” of care. Through evidence-based management and a clear understanding of costs, we must develop cost-efficient practice models that sustainably deliver the highest value of care.

3. Evolving practice models

As the health care landscape continues to change, physician practice models evolve accordingly. Although the private practice model once dominated the physician workforce, this is no longer true, as there has been a significant shift to employer-based practice models. The multiple factors at work relate to changing patterns of reimbursement, increasing government regulations, and a general change in recent residency graduates’ expectations regarding work–life balance. Other catalysts are the shift from volume- to value-based care and the recognition that cost-effective health care is more easily achieved when physicians and their institutions are in alignment. Ultimately, physician–institution alignment is crucial in improving care and outcomes.

Physician–institution alignment requires further discussion. Ideally, it should strike the proper balance between physician autonomy and institutional priorities to ensure the highest quality care. Physicians and their institutions should align their interests in terms of patient safety, quality, and economics to create a work environment conducive to both patient/physician satisfaction and institutional success.14 As identified by Page and colleagues,15 the primary drivers of physician–institution alignment, specific to orthopedic surgery, are economic, regulatory, and cultural. In economics, implant selection and ancillary services are the important issues; in the regulatory area, cooperative efforts to address expanding state and federal requirements are needed; last, the primary cultural driver is delivery of care to an expanding, diverse patient population.


Physician–institution alignment brings opportunities for “gainsharing,” which can directly benefit individual physicians, physician groups, and departments. Gainsharing is classically defined as “arrangements in which a hospital gives physicians a percentage share of any reduction in the hospital’s costs for patient care attributable in part to the physicians’ efforts.”16 Modern gainsharing programs can be used by institutions to align the economic interests of physicians and hospitals, with the ultimate goal being to achieve a sustainable increase in the value and quality of care delivered to patients.13 Examples include efforts to reduce the cost of orthopedic implants, which is a major cost driver in orthopedic surgery. Our institution realized significant savings when surgeons were directly involved in the implant contracting process with strategic sourcing personnel. These savings were shared with the department to enhance research and education programs. BPI, a risk-sharing program in which Medicare and hospitals participate, incorporates gainsharing opportunities in which each participating physician can receive up to 50% of his or her previous Medicare billings when specific targets are achieved. BPI included 27 musculoskeletal diagnosis–related groups that could be developed into a bundled payment proposal. Our institution participated in a 90-day episode, for primary hip and knee arthroplasty and non–cervical spine fusion, that had very promising results.

Gainsharing offers physicians incentives to meet institution goals of improved outcomes and increased patient satisfaction while increasing oversight and accountability. When physician-specific outcomes do not meet the established goals in key areas (readmissions, thromboembolic complications, infections), it is only logical that steps will be taken to improve outcomes. Although physicians may not be used to this increased scrutiny, the goal of improving outcomes, even if it necessitates a change in an established approach to care, should be welcomed.

Physicians should be rewarded for good outcomes but not for suboptimal outcomes. When outcomes are suboptimal, physicians should take a constructive approach to improving them. On the other hand, not being rewarded for unachieved goals can be perceived as being penalized. Additional monitoring may paradoxically lead physicians to avoid more “complex” cases, such as those of patients at higher risk for complications and poorer outcomes. An example is found in patient selection for surgery, in which issues like obesity, diabetes, and heart disease are known to negatively affect outcomes. In these models, “cherry-picking” is a well-recognized risk17,18 that can compromise our ethical obligation to provide equal access for all patients. To offset this tendency, we should use a risk-stratification model in which patients are not all considered to present equal risk. A risk-adjustment approach benefits both patients and providers by identifying modifiable risk factors that can be addressed to positively affect outcomes. This risk-stratification approach further incentivizes the orthopedist to work closely with other health care providers to address the medical comorbidities that may negatively affect surgical outcomes.

4. Patient and physician expectations

Living in a technology-driven society in the age of information has had a major impact on patients’ attitudes and expectations about their care—and therefore on physicians’ practice methods. It is uncommon to evaluate a patient who has not already consulted the Internet about a problem. Patients now have much more information they can use to make decisions about their treatment, and, though many question the accuracy of Internet information, there is no argument that being more informed is beneficial. In this time of shared decision-making, it is absolutely essential that patients keep themselves informed.

It is crucial to align the expectations of both physicians and patients in order to achieve the best outcomes. Gaining a clear understanding of treatment goals, management, and potential complications consistently leads to improved patient satisfaction, more favorable clinical outcomes, and reduced risk of litigation.19-22 Addressing patient concerns and expectations is significantly enhanced by a strong patient–physician relationship through clinical models focused on patient-centered care.

Now considered a standard of care, the patient-centered model has changed the way we practice. The foundation of the patient-centered approach is to strengthen the patient–physician relationship by empowering patients to become active decision-makers in the management of their own health. The role of orthopedists in this model is to provide patients with information and insight into their conditions in order to facilitate shared decision-making. Our role should be to guide patients to make educated and informed decisions. Doing so enhances communication, thereby strengthening the patient–physician relationship, and places both patient and physician expectations in perspective. Patient-reported outcomes, satisfaction rates, symptomatic burdens, and costs of care are all positively correlated with strong communication and realistic expectations achieved through a patient-centered approach.21,23


The evolution of clinical practice has been influenced by factors ranging from external forces (eg, changing political and economic climates) to social trends (use of social media and the Internet). Technology has been a driving force in our rapidly changing clinical environment, significantly altering the way we practice. Although we must be careful in how we use it, new technology can certainly work to our advantage. We have a plethora of medical information at our fingertips, and, with physician-directed guidance, our patients can become more informed than ever before. This is the principle of patient-centered medicine and shared decision-making, and its utility will only increase in importance.

5. The role of advocacy

The central tenet of orthopedic practice has always been a focus on patients. We continually strive to improve patient outcomes, reduce costs, and work efficiently in our practices and facilities. Although we can focus on our individual practices, we cannot ignore the influence and impact of the political system on our performance. Federal and state regulations have created an uneven playing field between physicians and insurance companies. This imbalance requires that physicians be more active in health care policymaking and advocacy. Although we are more involved than ever before, our influence is far less than what we would like it to be, perhaps partly because of the nature of the political process but perhaps also because of physicians’ resistance to becoming involved.

As experts in the treatment of musculoskeletal conditions, we should be at the forefront of health care policy development—a position we have not been able to attain. Although many factors contribute to our lack of a “seat at the table,” we must recognize our reluctance as a group to support advocacy, either financially or through personal time commitment. The American Academy of Orthopaedic Surgeons (AAOS) Orthopaedic Political Action Committee has never been able to obtain donations from more than 30% of AAOS members. Although this committee historically has been successful, we could be much more so if we had financial support from 90% of members. There are many ways to be actively involved in advocacy. One way is to join local and state orthopedic societies and support their advocacy efforts. State orthopedic societies work closely with the AAOS Office of Government Relations to coordinate advocacy and direct efforts and resources to areas of greatest need. Knowing local congressional representatives and communicating with them about issues we face in our practices make our issues “real.” Some of our colleagues have even successfully run for office in Congress, and they certainly deserve our support. Advocacy will absolutely play an increasingly important role as federal and state governments expand their involvement in health care. Our role should be to get involved, at least to some degree. We need to recognize that our strength is in our numbers, as the few cannot accomplish nearly as much as the many.

Summary

Orthopedic surgeons are practicing in the midst of almost constant change—evolving patient care, shifts in employment models, advances in technology, modern patient expectations, and an increasingly complex regulatory environment. Even in this context, however, our goal remains unchanged: to give our patients the highest-quality care possible. Our core values as orthopedic surgeons and physicians are dedication, commitment, and service to patients and to our profession. As US health care continues to evolve, we must evolve as well, with an emphasis on expanding our role in the health care policy debate.

References

1.    US Bone and Joint Initiative. Burden of Musculoskeletal Diseases in the United States: Prevalence, Societal, and Economic Cost. Rosemont, IL: US Bone and Joint Initiative; 2008. http://www.boneandjointburden.org. Accessed October 26, 2015.

2.    US Bone and Joint Initiative. Burden of Musculoskeletal Diseases in the United States: Prevalence, Societal, and Economic Cost. 2nd ed. Rosemont, IL: US Bone and Joint Initiative; 2011. http://www.boneandjointburden.org. Accessed October 26, 2015.

3.    Ma VY, Chan L, Carruthers KJ. Incidence, prevalence, costs, and impact on disability of common conditions requiring rehabilitation in the United States: stroke, spinal cord injury, traumatic brain injury, multiple sclerosis, osteoarthritis, rheumatoid arthritis, limb loss, and back pain. Arch Phys Med Rehabil. 2014;95(5):986-995.e1.

4.    Hootman JM, Helmick CG. Projections of US prevalence of arthritis and associated activity limitations. Arthritis Rheum. 2006;54(1):226-229.

5.    Freburger JK, Holmes GM, Agans RP, et al. The rising prevalence of chronic low back pain. Arch Intern Med. 2009;169(3):251-258.

6.    St Sauver JL, Warner DO, Yawn BP, et al. Why patients visit their doctors: assessing the most prevalent conditions in a defined American population. Mayo Clin Proc. 2013;88(1):56-67.

7.    Anderson BC. Office Orthopedics for Primary Care: Diagnosis and Treatment. 2nd ed. Philadelphia, PA: Saunders; 1999.

8.    American Academy of Orthopaedic Surgeons, Department of Research and Scientific Affairs. Orthopaedic Practice in the U.S. 2012 [2012 Orthopaedic Surgeon Census Report]. Rosemont, IL: American Academy of Orthopaedic Surgeons; January 2013.

9.    US Department of Health and Human Services, Centers for Medicare & Medicaid Services, Office of the Actuary, National Health Statistics Group. NHE [National Health Expenditure] Fact Sheet, 2014. Centers for Medicare & Medicaid Services website. http://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/NationalHealthExpendData/NHE-Fact-Sheet.html. Updated July 28, 2015. Accessed October 26, 2015.

10.  Cutler DM, Ghosh K. The potential for cost savings through bundled episode payments. N Engl J Med. 2012;366(12):1075-1077.

11.  Langton DJ, Jameson SS, Joyce TJ, Hallab NJ, Natu S, Nargol AV. Early failure of metal-on-metal bearings in hip resurfacing and large-diameter total hip replacement: a consequence of excess wear. J Bone Joint Surg Br. 2010;92(1):38-46.

12.  Dahlstrand H, Stark A, Anissian L, Hailer NP. Elevated serum concentrations of cobalt, chromium, nickel, and manganese after metal-on-metal alloarthroplasty of the hip: a prospective randomized study. J Arthroplasty. 2009;24(6):837-845.

13.    Porter ME, Teisberg EO. Redefining Health Care: Creating Value-Based Competition on Results. Boston, MA: Harvard Business School Press; 2006.

14.  American Association of Orthopaedic Surgeons. Alignment of physician and facility payment and incentives. Position statement 1171. American Association of Orthopaedic Surgeons website. http://www.aaos.org/about/papers/position/1171.asp. Published September 2006. Revised February 2009. Accessed October 26, 2015.

15.  Page AE, Butler CA, Bozic KJ. Factors driving physician–hospital alignment in orthopaedic surgery. Clin Orthop Relat Res. 2013;471(6):1809-1817.

16.  US Department of Health and Human Services, Office of Inspector General. Gainsharing arrangements and CMPs for hospital payments to physicians to reduce or limit services to beneficiaries [special advisory bulletin]. Office of Inspector General website. http://oig.hhs.gov/fraud/docs/alertsandbulletins/gainsh.htm. Published July 1999. Accessed October 26, 2015.

17.  Bronson WH, Fewer M, Godlewski K, et al. The ethics of patient risk modification prior to elective joint replacement surgery. J Bone Joint Surg Am. 2014;96(13):e113.

18.  Bosco J. To cherry pick or not: the unintended ethical consequences of pay for performance. Presented at: New York University Colloquium on Medical Ethics; New York, NY; November 2014.

19.  Hageman MG, Briët JP, Bossen JK, Blok RD, Ring DC, Vranceanu AM. Do previsit expectations correlate with satisfaction of new patients presenting for evaluation with an orthopaedic surgical practice? Clin Orthop Relat Res. 2015;473(2):716-721.

20.  Jourdan C, Poiraudeau S, Descamps S, et al. Comparison of patient and surgeon expectations of total hip arthroplasty. PLoS One. 2012;7(1):e30195.

21.  McMillan S, Kendall E, Sav A, et al. Patient-centered approaches to health care: a systematic review of randomized controlled trials. Med Care Res Rev. 2013;70(6):567-596.

22.  Forster HP, Schwartz J, DeRenzo E. Reducing legal risk by practicing patient-centered medicine. Arch Intern Med. 2002;162(11):1217-1219.

23.  Van Citters AD, Fahlman C, Goldmann DA, et al. Developing a pathway for high-value, patient-centered total joint arthroplasty. Clin Orthop Relat Res. 2014;472(5):1619-1635.

Author and Disclosure Information

Stephen Yu, MD, and Joseph D. Zuckerman, MD

Authors’ Disclosure Statement: The authors report no actual or potential conflict of interest in relation to this article.

Interhospital Transfer Patients

Interhospital transfer patients discharged by academic hospitalists and general internists: Characteristics and outcomes

Interhospital transfers (IHTs) to academic medical centers (AMCs) or their affiliated hospitals may benefit patients who require unique specialty and procedural services. However, IHTs also introduce a potentially risky transition of care for patients with complex or unstable medical problems.[1] Components of this risk include the dangers associated with transportation and the disrupted continuity of care that may lead to delays or errors in care.[2, 3] Furthermore, referring and accepting providers may face barriers to optimal handoffs, including a lack of shared communication standards and difficulty accessing external medical records.[3, 4, 5] Although some authors have recommended formal guidelines for the interhospital transfer process for all patients to mitigate the risks of transfer, the available guidelines governing IHT triage and communication are limited to critically ill patients.[6]

A recent study of a diverse patient and hospital dataset demonstrated that interhospital transfer patients have a higher risk of mortality, increased length of stay (LOS), and increased risk of adverse events as compared with non‐transfer patients.[7] However, it is unknown if these findings persist in the population of patients transferred specifically to AMCs or their affiliated hospitals (the combination is hereafter referred to as academic health systems [AHSs]). AMCs provide a disproportionate share of IHT care for complex patients and have a vested interest in improving the outcomes of these transitions.[8] Prior single‐center studies of acute care adult medical patients accepted to AMCs have shown that IHT is associated with a longer LOS, increased in‐hospital mortality, and higher resource use.[9, 10] However, it is difficult to generalize from single‐center studies due to the variation in referral practices, geography, and network characteristics. Additionally, AMC referral systems, patient mix, and utilization of hospitalists have likely changed substantially in the nearly 2 decades since those reports were published.

Hospitalists and general internists often manage the transfer acceptance processes for internal medicine services at receiving hospitals, helping to triage and coordinate care for IHT patients. As a result, it is important for hospitalists to understand the characteristics and outcomes of the IHT population. In addition to informing the decision making around transfer for a given patient, such an understanding is the foundation for helping providers and institutions begin to systematically identify and mitigate peritransfer risks.

We conducted this large multicenter study to describe the characteristics and outcomes of a current, nationally representative IHT patient population discharged by hospitalists and general internists at AHSs. To identify unique features of the IHT population, we compared patients transferred from another hospital to an AHS to those admitted to the AHS directly from the AHS's emergency department (ED). Based on our anecdotal experiences and the prior single‐center study findings in adult medical populations,[9, 10] we hypothesized that the IHT population would be sicker, stay in the hospital and intensive care unit (ICU) longer, and have higher costs and in‐hospital mortality than ED patients. Although there may be fundamental differences between the 2 groups related to disease and patient condition, we hypothesized that outcome differences would persist even after adjusting for patient factors such as demographics, disease‐specific risk of mortality, and ICU utilization.

PATIENTS AND METHODS

We conducted a retrospective cohort study using data from the University HealthSystem Consortium (UHC) Clinical Database and Resource Manager (CDB/RM). UHC is an alliance of 120 academic medical centers and 300 of their affiliated hospitals for the purposes of collaboration on performance improvement. Each year, a subset of participating hospitals submits data on all of their inpatient discharges to the CDB/RM, which totals approximately 5 million records. The CDB/RM includes information from billing forms including demographics, diagnoses, and procedures as captured by International Classification of Diseases, Ninth Revision (ICD‐9) codes, discharge disposition, and line item charge detail for the type of bed (eg, floor, ICU). Most hospitals also provide detailed charge information including pharmacy, imaging, blood products, lab tests, and supplies. Some hospitals do not provide any charge data. The Beth Israel Deaconess Medical Center and University of Washington institutional review boards reviewed and approved the conduct of this study.

We included all inpatients discharged by hospitalists or general internal medicine physicians from UHC hospitals between April 1, 2011 and March 31, 2012. We excluded minors, pregnant patients, and prisoners. One hundred fifty‐eight adult academic medical centers and affiliated hospitals submitted data throughout this time period. Our primary independent variable, IHT status, was defined by patients whose admission source was another acute care institution. ED admissions were defined as patients admitted from the AHS ED whose source of origination was not another hospital or ambulatory surgery site.
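For illustration, the cohort rule described above reduces to a short classification function. The sketch below is a hypothetical rendering in Python; the field names and category labels are placeholders, not the actual UHC CDB/RM admission-source codes.

```python
# Illustrative sketch of the cohort rule described above. Field names and
# category labels are hypothetical; the UHC CDB/RM uses its own coding.

def classify_admission(admission_source: str, origination: str) -> str:
    """Assign a discharge record to the IHT or ED cohort, or exclude it."""
    if admission_source == "acute_care_hospital":
        return "IHT"                     # transferred in from another acute care institution
    if admission_source == "emergency_department" and origination not in (
        "acute_care_hospital",
        "ambulatory_surgery",
    ):
        return "ED"                      # admitted from the AHS's own emergency department
    return "excluded"                    # all other admission sources fall outside the comparison


# Example: a record admitted through the ED, not originating at another facility.
print(classify_admission("emergency_department", "home"))  # -> "ED"
```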

Admission Characteristics

Admission characteristics of interest included age, gender, insurance status, the most common diagnoses in each cohort based on Medicare Severity Diagnosis‐Related Group (MS‐DRG), the most common Agency for Healthcare Research and Quality (AHRQ) comorbidities,[11] the most common procedures, and the admission 3M All‐Patient Refined Diagnosis‐Related Group (APR‐DRG) risk of mortality (ROM) scores. 3M APR‐DRG ROM scores are proprietary categorical measures specific to the base APR‐DRG to which a patient is assigned, which are calculated using data available at the time of admission, including comorbid condition diagnosis codes, age, procedure codes, and principal diagnosis codes. A patient can fall into 1 of 4 categories with this score: minor, moderate, major, or extreme.[12]

Outcomes

Our primary outcome of interest was in‐hospital mortality. Secondary outcomes included LOS, the cost of care, ICU utilization, and discharge destination. The cost of care is a standardized estimate of the direct costs based on an adjustment of the charges submitted by CDB/RM participants. If an IHT is triaged through a receiving hospital's ED, the cost of care reflects those charges as well as the inpatient charges.
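To make the idea of a standardized, charge-based cost estimate concrete, the following sketch applies simple cost-to-charge ratios to ED and inpatient charges. The ratios, field names, and two-part structure are illustrative assumptions only, not the CDB/RM's proprietary adjustment.

```python
# Minimal sketch of a charge-based direct-cost estimate, assuming simple
# cost-to-charge ratios; the actual CDB/RM adjustment is proprietary.

COST_TO_CHARGE = {"ed": 0.45, "inpatient": 0.35}   # hypothetical ratios

def estimated_direct_cost(ed_charges: float, inpatient_charges: float) -> float:
    """Combine ED and inpatient charges into a single direct-cost estimate."""
    return (ed_charges * COST_TO_CHARGE["ed"]
            + inpatient_charges * COST_TO_CHARGE["inpatient"])

# An IHT triaged through the receiving hospital's ED accrues both components.
print(round(estimated_direct_cost(ed_charges=4_000, inpatient_charges=40_000)))  # -> 15800
```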

Statistical Analysis

We used descriptive statistics to characterize the IHT and ED patient populations. For bivariate comparisons of continuous variables, 2‐sample t tests with unequal variance were used. For categorical variables, chi‐square analysis was performed. We assessed the impact of IHT status on in‐hospital mortality using logistic regression to estimate unadjusted and adjusted odds ratios, 95% confidence intervals (CIs), and P values. We included age, gender, insurance status, race, timing of ICU utilization, and 3M APR‐DRG ROM scores as independent variables. Prior studies have used this type of risk‐adjustment methodology with 3M APR‐DRG ROM scores,[13, 14, 15] including with interhospital transfer patients.[16] For all comparisons, a P value of <0.05 was considered statistically significant. Our sample size was determined by the data available for the 1‐year period.
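As a concrete illustration of this adjustment strategy, the sketch below fits a logistic model of in-hospital mortality with the covariates listed above on simulated data. The column names are hypothetical and the use of Python/statsmodels is ours alone (the study analysis was run in Stata by the UHC), so this is a sketch of the general approach rather than the study's actual code.

```python
# Minimal sketch of the risk-adjustment model described above, run on
# simulated data with hypothetical column names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1_000
df = pd.DataFrame({
    "died":       rng.integers(0, 2, n),                        # 1 = died in hospital
    "iht":        rng.integers(0, 2, n),                        # 1 = interhospital transfer
    "age":        rng.integers(18, 95, n),
    "male":       rng.integers(0, 2, n),
    "medicare":   rng.integers(0, 2, n),
    "black":      rng.integers(0, 2, n),
    "icu_timing": rng.choice(["none", "direct", "delayed"], n),
    "rom":        rng.choice(["minor", "moderate", "major", "extreme"], n),
})

model = smf.logit(
    "died ~ iht + age + male + medicare + black + C(icu_timing) + C(rom)",
    data=df,
).fit(disp=False)

# Exponentiated coefficients are adjusted odds ratios with 95% CIs,
# analogous to the values reported in Table 3.
odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())
print(odds_ratios["iht"], conf_int.loc["iht"].tolist())
```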

Subgroup Analyses

We performed a stratified analysis based on the timing of ICU transfer to allow for additional comparisons of mortality within more homogeneous patient groups, and to control for the possibility that delays in ICU transfer could explain the association between IHT and in‐hospital mortality. We determined whether and when a patient spent time in the ICU based on daily accommodation charges. If a patient was charged for an ICU bed on the day of admission, we coded them as a direct ICU admission, and if the first ICU bed charge was on a subsequent day, they were coded as a delayed ICU admission. Approximately 20% of patients did not have the data necessary to determine the timing of ICU utilization, because the hospitals where they received care did not submit detailed charge data to the UHC.
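The ICU-timing rule lends itself to a direct translation into code. The sketch below assumes each admission carries a simple list of (day, bed type) charge records; this layout is hypothetical and differs from the CDB/RM line-item charge format.

```python
# Sketch of the ICU-timing rule described above, assuming per-day bed
# charges as (days_since_admission, bed_type) pairs.
from typing import Iterable, Optional, Tuple

def icu_timing(bed_charges: Optional[Iterable[Tuple[int, str]]]) -> str:
    """Classify an admission as direct, delayed, or no ICU utilization."""
    if bed_charges is None:
        return "unknown"                 # hospital submitted no detailed charge data
    icu_days = sorted(day for day, bed in bed_charges if bed == "ICU")
    if not icu_days:
        return "no ICU admission"
    return "direct ICU admission" if icu_days[0] == 0 else "delayed ICU admission"

print(icu_timing([(0, "floor"), (2, "ICU"), (3, "ICU")]))  # -> "delayed ICU admission"
print(icu_timing([(0, "ICU")]))                            # -> "direct ICU admission"
```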

Data analysis was performed by the UHC using Stata version 10 (StataCorp, College Station, TX).

RESULTS

Patient Characteristics

We identified 885,392 patients who met study criteria: 75,524 patients admitted as an IHT and 809,868 patients admitted from the ED. The proportion of each hospital's admissions that were IHTs that met our study criteria varied widely (median 9%, 25th percentile 3%, 75th percentile 14%). The age and gender distributions of the IHT and ED populations were similar and reflective of a nationally representative adult inpatient sample (Table 1). The racial composition of the populations was notable for a higher proportion of black patients in the ED admission group than in the IHT group (25.4% vs 13.2%, P < 0.001). A slightly higher proportion of the IHT population was covered by commercial insurance compared with the ED admissions (22.7% vs 19.1%, P < 0.001).

Table 1. Characteristics of 885,392 Patients Discharged by Academic General Internists or Hospitalists, by Source of Admission*

Demographic/Clinical Variable | ED, n (%) [rank] | IHT, n (%) [rank]
No. of patients | 809,868 (91.5) | 75,524 (8.5)
Age, y, mean ± SD | 62.2 ± 19.1 | 60.2 ± 18.2
Male | 381,563 (47.1) | 38,850 (51.4)
Female | 428,303 (52.9) | 36,672 (48.6)
Race
  White | 492,894 (60.9) | 54,780 (72.5)
  Black | 205,309 (25.4) | 9,968 (13.2)
  Other | 66,709 (8.1) | 7,777 (10.3)
  Hispanic | 44,956 (5.6) | 2,999 (4.0)
Primary payer
  Commercial | 154,826 (19.1) | 17,130 (22.7)
  Medicaid | 193,585 (23.9) | 15,924 (21.1)
  Medicare | 445,227 (55.0) | 39,301 (52.0)
  Other | 16,230 (2.0) | 3,169 (4.2)
Most common MS‐DRGs (top 5 for each group)
  Esophagitis, gastroenteritis, and miscellaneous digestive disorders without MCC | 34,116 (4.2) [1st] | 1,517 (2.1) [2nd]
  Septicemia or severe sepsis without MV 96+ hours with MCC | 25,710 (3.2) [2nd] | 2,625 (3.7) [1st]
  Cellulitis without MCC | 21,686 (2.7) [3rd] | 871 (1.2) [8th]
  Kidney and urinary tract infections without MCC | 19,937 (2.5) [4th] | 631 (0.9) [21st]
  Chest pain | 18,056 (2.2) [5th] | 495 (0.7) [34th]
  Renal failure with CC | 15,478 (1.9) [9th] | 1,018 (1.4) [5th]
  GI hemorrhage with CC | 12,855 (1.6) [12th] | 1,234 (1.7) [3rd]
  Respiratory system diagnosis with ventilator support | 4,773 (0.6) [47th] | 1,118 (1.6) [4th]
AHRQ comorbidities (top 5 for each group)
  Hypertension | 468,026 (17.8) [1st] | 39,340 (16.4) [1st]
  Fluid and electrolyte disorders | 251,339 (9.5) [2nd] | 19,825 (8.3) [2nd]
  Deficiency anemia | 208,722 (7.9) [3rd] | 19,663 (8.2) [3rd]
  Diabetes without CCs | 190,140 (7.2) [4th] | 17,131 (7.1) [4th]
  Chronic pulmonary disease | 178,164 (6.8) [5th] | 16,319 (6.8) [5th]
Most common procedures (top 5 for each group)
  Packed cell transfusion | 72,590 (7.0) [1st] | 9,756 (5.0) [2nd]
  (Central) venous catheter insertion | 68,687 (6.7) [2nd] | 13,755 (7.0) [1st]
  Hemodialysis | 41,557 (4.0) [3rd] | 5,351 (2.7) [4th]
  Heart ultrasound (echocardiogram) | 37,762 (3.7) [4th] | 5,441 (2.8) [3rd]
  Insert endotracheal tube | 25,360 (2.5) [5th] | 4,705 (2.4) [6th]
  Continuous invasive mechanical ventilation | 19,221 (1.9) [9th] | 5,280 (2.7) [5th]
3M APR‐DRG admission ROM score
  Minor | 271,702 (33.6) | 18,620 (26.1)
  Moderate | 286,427 (35.4) | 21,775 (30.5)
  Major | 193,652 (23.9) | 20,531 (28.7)
  Extreme | 58,081 (7.2) | 10,527 (14.7)

NOTE: Values are n (%); bracketed ranks give each diagnosis, comorbidity, or procedure's rank within its own column. Abbreviations: AHRQ, Agency for Healthcare Research and Quality; APR‐DRG admission ROM score, All‐Patient Refined Diagnosis‐Related Group Admission Risk of Mortality score; CC, complication or comorbidity (except under the AHRQ comorbidities, where it refers to chronic complications); ED, emergency department (patients admitted from the academic health system's emergency department whose source of origination was not another hospital or ambulatory surgery site); GI, gastrointestinal; IHT, interhospital transfer (patients whose admission source was another acute care institution); MCC, major complication or comorbidity; MS‐DRG, Medicare Severity Diagnosis‐Related Group; MV, mechanical ventilation; SD, standard deviation. *All differences were significant at a level of P < 0.001. Percentages for the number of patients use the total study population as the denominator; all other percentages use the total number of patients in that column. Subgroups may not sum to the total denominator due to incomplete data.

Primary discharge diagnoses (MS‐DRGs) varied widely, with no single diagnosis accounting for more than 4.2% of admissions in either group. The most common primary diagnoses among IHTs included severe sepsis (3.7%), esophagitis and gastroenteritis (2.1%), and gastrointestinal bleeding (1.7%). The top 5 most common AHRQ comorbidities were the same between the IHT and ED populations. A higher proportion of IHTs had at least 1 procedure performed during their hospitalization (68.5% vs 49.8%, P < 0.001). Note that ICD‐9 procedure codes include interventions such as blood transfusions and dialysis (Table 1), which may not be considered procedures in common medical parlance.

As compared with those admitted from the ED, IHTs had a higher proportion of patients categorized with major or extreme admission risk of mortality score (major + extreme, ED 31.1% vs IHT 43.5%, P < 0.001).

Overall Outcomes

IHT patients experienced a 60% longer average LOS, and a higher proportion spent time in the ICU than patients admitted through the ED (Table 2). On average, care for IHT patients cost more per day than for ED patients (Table 2). A lower proportion of IHTs were discharged home (68.6% vs 77.4% of ED patients), and a higher proportion died in the hospital (4.1% vs 1.8%) (P < 0.001 for both). Among the ED and IHT patients who died during their admission, there was no significant difference in the proportion who died within 48 hours of admission (26.4% vs 25.6%, P = 0.3693). After adjusting for age, gender, insurance status, race, ICU utilization, and 3M APR‐DRG admission ROM scores, IHT was independently associated with the risk of in‐hospital death (odds ratio [OR]: 1.36, 95% CI: 1.29-1.43) (Table 3). The C statistic for the in‐hospital mortality model was 0.88.
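The C statistic reported above is the area under the receiver operating characteristic curve for the mortality model. A minimal sketch of how such a value can be computed from observed deaths and model-predicted probabilities, using scikit-learn's roc_auc_score, follows; the numbers are placeholders, not study data.

```python
# Minimal sketch: the C statistic equals the ROC AUC of predicted
# mortality probabilities against observed deaths. Values are placeholders.
from sklearn.metrics import roc_auc_score

observed_death = [0, 0, 1, 0, 1, 1, 0, 0]            # 1 = died in hospital
predicted_prob = [0.02, 0.10, 0.65, 0.05, 0.80, 0.20, 0.30, 0.01]

c_statistic = roc_auc_score(observed_death, predicted_prob)
print(round(c_statistic, 2))  # -> 0.93
```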

Table 2. Outcomes of 885,392 Academic Health System Patients Based on Source of Admission*

Outcome | ED, n = 809,868 | IHT, n = 75,524
LOS, mean ± SD | 5.0 ± 6.9 | 8.0 ± 13.4
ICU days, mean ± SD | 0.6 ± 2.4 | 1.7 ± 5.2
Patients who spent some time in the ICU | 14.3% | 29.8%
% of LOS spent in the ICU (ICU days ÷ LOS) | 11.0% | 21.6%
Average total cost ± SD | $10,731 ± $16,593 | $19,818 ± $34,665
Average cost per day (total cost ÷ LOS) | $2,139 | $2,492
Discharged home | 77.4% | 68.6%
Died as inpatient | 14,869 (1.8%) | 3,051 (4.0%)
Died within 48 hours of admission (% of total deaths) | 3,918 (26.4%) | 780 (25.6%)

NOTE: Abbreviations: ED, emergency department (patients admitted from the academic health system's emergency department whose source of origination was not another hospital or ambulatory surgery site); ICU, intensive care unit; IHT, interhospital transfer (patients whose admission source was another acute care institution); LOS, length of stay; SD, standard deviation. *All differences were significant at a level of P < 0.001 except the proportion of deaths within 48 hours of admission. ICU days data were available for 798,132 patients admitted from the ED and 71,054 IHT patients. Cost data were available for 792,604 patients admitted from the ED and 71,033 IHT patients.
Table 3. Multivariable Model of In‐hospital Mortality (n = 707,248)

Variable | Unadjusted OR (95% CI) | Adjusted OR (95% CI)
Age, y | 1.00 (1.00-1.00) | 1.03 (1.03-1.03)
Gender
  Female | Ref. | Ref.
  Male | 1.13 (1.09-1.70) | 1.05 (1.01-1.09)
Medicare status
  No | Ref. | Ref.
  Yes | 2.14 (2.06-2.22) | 1.39 (1.33-1.47)
Race
  Nonblack | Ref. | Ref.
  Black | 0.57 (0.55-0.60) | 0.77 (0.73-0.81)
ICU utilization
  No ICU admission | Ref. | Ref.
  Direct admission to the ICU | 5.56 (5.29-5.84) | 2.25 (2.13-2.38)
  Delayed ICU admission | 5.48 (5.27-5.69) | 2.46 (2.36-2.57)
3M APR‐DRG admission ROM score
  Minor | Ref. | Ref.
  Moderate | 8.71 (7.55-10.05) | 6.28 (5.43-7.25)
  Major | 43.97 (38.31-50.47) | 25.84 (22.47-29.71)
  Extreme | 238.65 (207.69-273.80) | 107.17 (93.07-123.40)
IHT
  No | Ref. | Ref.
  Yes | 2.36 (2.26-2.48) | 1.36 (1.29-1.43)

NOTE: Abbreviations: APR‐DRG admission ROM score, All‐Patient Refined Diagnosis‐Related Group Admission Risk of Mortality score; CI, confidence interval; ICU, intensive care unit; IHT, interhospital transfer (patients whose admission source was another acute care institution); OR, odds ratio.

Subgroup Analyses

Table 4 demonstrates the unadjusted and adjusted results from our analysis stratified by timing of ICU utilization. IHT remained independently associated with in‐hospital mortality regardless of timing of ICU utilization.

Table 4. Unadjusted and Adjusted Associations Between IHT and In‐hospital Mortality, Stratified by ICU Timing*

Subgroup | In‐hospital Mortality, n (%) | Unadjusted OR [95% CI] | Adjusted OR [95% CI]
No ICU admission, n = 552,171
  ED, n = 519,421 | 4,913 (0.95%) | Ref. | Ref.
  IHT, n = 32,750 | 590 (1.80%) | 1.92 [1.76-2.09] | 1.68 [1.53-1.84]
Direct admission to the ICU, n = 44,537
  ED, n = 35,614 | 1,733 (4.87%) | Ref. | Ref.
  IHT, n = 8,923 | 628 (7.04%) | 1.48 [1.35-1.63] | 1.24 [1.12-1.37]
Delayed ICU admission, n = 110,540
  ED, n = 95,573 | 4,706 (4.92%) | Ref. | Ref.
  IHT, n = 14,967 | 1,068 (7.14%) | 1.48 [1.39-1.59] | 1.25 [1.17-1.35]

NOTE: Abbreviations: CI, confidence interval; ED, emergency department (patients admitted from the academic health system's emergency department whose source of origination was not another hospital or ambulatory surgery site); ICU, intensive care unit; IHT, interhospital transfer (patients whose admission source was another acute care institution); OR, odds ratio. *Timing of ICU utilization data were available for 650,608 of the patients admitted from the ED (80% of all ED admissions) and 56,640 of the IHT patients (75% of all IHTs).

DISCUSSION

Our study of IHT patients ultimately discharged by hospitalists and general internists at US academic referral centers found significantly increased average LOS, costs, and in‐hospital mortality compared with patients admitted from the ED. The increased risk of mortality persisted after adjustment for patient characteristics and variables representing endogenous risk of mortality, and in more homogeneous subgroups after stratification by presence and timing of ICU utilization. These data confirm findings from single‐center studies and suggest that observations about the difference between IHT and ED populations may be generalizable across US academic hospitals.

Our work builds on 2 single‐center studies that examined mixed medical and surgical academic IHT populations from the late 1980s and early 1990s,[9, 10] and 1 studying surgical ICU patients in 2013.[17] These studies demonstrated longer average LOS, higher costs, and higher mortality rates (in both adjusted and unadjusted analyses). Our work confirmed these findings utilizing a more current, large multicenter dataset of IHT patients ultimately discharged by hospitalists and general internists. Our work differs from a larger, more recent study[7] in that it focuses on patients transferred to academic health systems, and therefore has particular relevance to those settings. In addition, we divided patients into subpopulations based on the timing of ICU utilization, and found that in each of these populations, IHT remained independently associated with in‐hospital mortality.

Our analysis does not explain why the outcomes of IHTs are worse, but plausible contributing factors include that (1) patients chosen for IHT are at higher risk of death in ways uncaptured by established mortality risk scores, (2) referring, transferring, or accepting providers and institutions have provided inadequate care, (3) the transfer process itself involves harm, (4) there is socioeconomic bias in selection for IHT,[18] or (5) some combination of the above. Regardless of the causes of the worse outcomes observed in these "outside hospital" transfers, as such patients are colloquially known at accepting hospitals, they present challenges to everyone involved. Referring providers may feel a sense of urgency as these patients' needs exceed their management capabilities. The process is often time-consuming and burdensome for referring and accepting providers because of poorly developed systems.[19] The transfer often takes patients further from their home and may make it more difficult for family to participate in their care. The transfer may delay care if the accepting institution cannot immediately accept the patient or if the time in transport is prolonged, which could result in decompensation at a critical juncture. For providers inheriting such patients, the stress of caring for these patients is compounded by the difficulty of obtaining records about the prior hospitalization.[20] This frustrating experience is often translated into unfounded judgment of the institution that referred the patient and the care provided there.[21] It is important for hospitalists making decisions throughout the transfer process and for hospital leaders who determine staffing levels, measure the quality of care, manage hospital networks, or write hospital policy to appreciate that the transfer process itself may contribute to the challenges and poor outcomes we observe. Furthermore, regardless of the cause for the increased mortality that we observed, our findings imply that IHT patients require careful evaluation, management, and treatment.

Many accepting institutions have transfer centers that facilitate these transitions, utilizing protocols and templates to standardize the process.[22, 23] Future research should focus on the characteristics of these centers to learn which practices are most efficacious. Interventions to mitigate the known challenges of transfer (including patient selection and triage, handoff communication, and information sharing) could be tested by randomized studies at referring and accepting institutions. There may be a role for health information exchange or the development of enhanced pretransfer evaluation processes using telemedicine models; there is evidence that information sharing may reduce redundant imaging.[24] Perhaps targeted review of IHTs admitted to a non‐ICU portion of the hospital and subsequently transferred to the ICU could identify opportunities to improve triaging protocols and thus avert some of the bad outcomes observed in this subpopulation. A related future direction could be to create protected forums, using the patient safety organization framework,[25] to facilitate the discussion of interhospital transfer outcomes among the referring, transporting, and receiving parties. Lastly, future work should investigate the reasons for the different proportions of black patients in the ED versus IHT cohorts. Our finding that black race was associated with lower risk of mortality has been previously reported but may also benefit from more investigation.[26]

There are several limitations to our work. First, despite extensive adjustment for patient characteristics, the observational nature of our study means it is still possible that IHTs differ from ED admissions in ways that were unaccounted for in our analysis and that could be associated with increased mortality independent of the transfer process itself. We were unable to characterize features of the transfer process, such as the reason for transfer, differences in transfer processes among hospitals, or the distance and mode of travel, which may influence outcomes.[27] Because we used administrative data, variations in coding could lead us to incorrectly estimate the complexity or severity of illness on admission, a previously described risk.[28] In addition, although our dataset was very large, incomplete charge data restricted our ability to measure ICU utilization in the full cohort. The hospitals missing ICU charge data are of variable sizes and are distributed around the country, limiting the chance of systematic bias. Finally, in some settings, hospitalists may serve as the discharging physician for patients admitted to other services such as the ICU, introducing heterogeneity and bias to the sample. We attempted to mitigate such bias through our subgroup analysis, which allowed for comparisons within more homogeneous patient groupings.

In conclusion, our large multicenter study of academic health systems confirms the findings of prior single‐center academic studies and a large general population study that interhospital transfer patients have a longer average LOS, higher costs, and higher adjusted in‐hospital mortality than patients admitted from the ED. This difference in mortality persisted even after controlling for several other predictors of mortality. Our findings emphasize the need for future studies designed to clarify the reason for the increased risk and identify targets for interventions to improve outcomes for the interhospital transfer population.

Acknowledgements

The authors gratefully acknowledge Zachary Goldberger and Tom Gallagher for their critical reviews of this article.

Disclosures

Dr. Herzig was funded by grant number K23AG042459 from the National Institute on Aging. The funding organization had no involvement in any aspect of the study, including design and conduct of the study; collection, management, analysis, and interpretation of the data; and preparation, review, or approval of the manuscript. The authors report no conflicts of interest.

References
1. Iwashyna TJ. The incomplete infrastructure for interhospital patient transfer. Crit Care Med. 2012;40(8):2470-2478.
2. Hains I. AHRQ WebM23(1):6875.
3. Hickey EC, Savage AM. Improving the quality of inter‐hospital transfers. J Qual Assur. 1991;13(4):16-20.
4. Vilensky D, MacDonald RD. Communication errors in dispatch of air medical transport. Prehosp Emerg Care. 2011;15(1):39-43.
5. Warren J, Fromm RE, Orr RA, Rotello LC, Horst HM. Guidelines for the inter‐ and intrahospital transport of critically ill patients. Crit Care Med. 2004;32(1):256-262.
6. Hernandez‐Boussard T, Davies S, McDonald K, Wang NE. Interhospital facility transfers in the United States: a nationwide outcomes study [published online November 13, 2014]. J Patient Saf. doi: 10.1097/PTS.0000000000000148.
7. Wyatt SM, Moy E, Levin RJ, et al. Patients transferred to academic medical centers and other hospitals: characteristics, resource use, and outcomes. Acad Med. 1997;72(10):921-930.
8. Bernard AM, Hayward RA, Rosevear J, Chun H, McMahon LF. Comparing the hospitalizations of transfer and non‐transfer patients in an academic medical center. Acad Med. 1996;71(3):262-266.
9. Gordon HS, Rosenthal GE. Impact of interhospital transfers on outcomes in an academic medical center. Implications for profiling hospital quality. Med Care. 1996;34(4):295-309.
10. Elixhauser A, Steiner C, Harris DR, Coffey RM. Comorbidity measures for use with administrative data. Med Care. 1998;36(1):8-27.
11. Hughes J. 3M HIS: APR DRG classification software—overview. Mortality Measurement. Available at: http://archive.ahrq.gov/professionals/quality‐patient‐safety/quality‐resources/tools/mortality/Hughessumm.html. Accessed June 14, 2011.
12. Romano PS, Chan BK. Risk‐adjusting acute myocardial infarction mortality: are APR‐DRGs the right tool? Health Serv Res. 2000;34(7):1469-1489.
13. Singh JA, Kwoh CK, Boudreau RM, Lee G‐C, Ibrahim SA. Hospital volume and surgical outcomes after elective hip/knee arthroplasty: a risk‐adjusted analysis of a large regional database. Arthritis Rheum. 2011;63(8):2531-2539.
14. Carretta HJ, Chukmaitov A, Tang A, Shin J. Examination of hospital characteristics and patient quality outcomes using four inpatient quality indicators and 30‐day all‐cause mortality. Am J Med Qual. 2013;28(1):46-55.
15. Wiggers JK, Guitton TG, Smith RM, Vrahas MS, Ring D. Observed and expected outcomes in transfer and nontransfer patients with a hip fracture. J Orthop Trauma. 2011;25(11):666-669.
16. Arthur KR, Kelz RR, Mills AM, et al. Interhospital transfer: an independent risk factor for mortality in the surgical intensive care unit. Am Surg. 2013;79(9):909-913.
17. Hanmer J, Lu X, Rosenthal GE, Cram P. Insurance status and the transfer of hospitalized patients: an observational study. Ann Intern Med. 2014;160(2):81-90.
18. Bosk EA, Veinot T, Iwashyna TJ. Which patients and where: a qualitative study of patient transfers from community hospitals. Med Care. 2011;49(6):592-598.
19. Ehrmann DE. Overwhelmed and uninspired by lack of coordinated care: a call to action for new physicians. Acad Med. 2013;88(11):1600-1602.
20. Graham JD. The outside hospital. Ann Intern Med. 2013;159(7):500-501.
21. Strickler J, Amor J, McLellan M. Untangling the lines: using a transfer center to assist with interfacility transfers. Nurs Econ. 2003;21(2):94-96.
22. Pesanka DA, Greenhouse PK, Rack LL, et al. Ticket to ride: reducing handoff risk during hospital patient transport. J Nurs Care Qual. 2009;24(2):109-115.
23. Sodickson A, Opraseuth J, Ledbetter S. Outside imaging in emergency department transfer patients: CD import reduces rates of subsequent imaging utilization. Radiology. 2011;260(2):408-413.
24. Agency for Healthcare Research and Quality. Patient Safety Organization (PSO) Program. Available at: http://www.pso.ahrq.gov. Accessed July 7, 2011.
25. Signorello LB, Cohen SS, Williams DR, Munro HM, Hargreaves MK, Blot WJ. Socioeconomic status, race, and mortality: a prospective cohort study. Am J Public Health. 2014;104(12):e98-e107.
26. Durairaj L, Will JG, Torner JC, Doebbeling BN. Prognostic factors for mortality following interhospital transfers to the medical intensive care unit of a tertiary referral center. Crit Care Med. 2003;31(7):1981-1986.
27. Goldman LE, Chu PW, Osmond D, Bindman A. The accuracy of present‐on‐admission reporting in administrative data. Health Serv Res. 2011;46(6 pt 1):1946-1962.
Article PDF
Issue
Journal of Hospital Medicine - 11(4)
Page Number
245-250
Sections
Files
Files
Article PDF
Article PDF

Interhospital transfers (IHTs) to academic medical centers (AMCs) or their affiliated hospitals may benefit patients who require unique specialty and procedural services. However, IHTs also introduce a potentially risky transition of care for patients suffering from complex or unstable medical problems.[1] Components of this risk include the dangers associated with transportation and the disrupted continuity of care that may lead to delays or errors in care.[2, 3] Furthermore, referring and accepting providers may face barriers to optimal handoffs including a lack of shared communication standards and difficulty accessing external medical records.[3, 4, 5] Although some authors have recommended the creation of formal guidelines for interhospital transfer processes for all patients to mitigate the risks of transfer, the available guidelines governing the IHT triage and communication process are limited to critically ill patients.[6]

A recent study of a diverse patient and hospital dataset demonstrated that interhospital transfer patients have a higher risk of mortality, increased length of stay (LOS), and increased risk of adverse events as compared with non‐transfer patients.[7] However, it is unknown if these findings persist in the population of patients transferred specifically to AMCs or their affiliated hospitals (the combination is hereafter referred to as academic health systems [AHSs]). AMCs provide a disproportionate share of IHT care for complex patients and have a vested interest in improving the outcomes of these transitions.[8] Prior single‐center studies of acute care adult medical patients accepted to AMCs have shown that IHT is associated with a longer LOS, increased in‐hospital mortality, and higher resource use.[9, 10] However, it is difficult to generalize from single‐center studies due to the variation in referral practices, geography, and network characteristics. Additionally, AMC referral systems, patient mix, and utilization of hospitalists have likely changed substantially in the nearly 2 decades since those reports were published.

Hospitalists and general internists often manage the transfer acceptance processes for internal medicine services at receiving hospitals, helping to triage and coordinate care for IHT patients. As a result, it is important for hospitalists to understand the characteristics and outcomes of the IHT population. In addition to informing the decision making around transfer for a given patient, such an understanding is the foundation for helping providers and institutions begin to systematically identify and mitigate peritransfer risks.

We conducted this large multicenter study to describe the characteristics and outcomes of a current, nationally representative IHT patient population discharged by hospitalists and general internists at AHSs. To identify unique features of the IHT population, we compared patients transferred from another hospital to an AHS to those admitted to the AHS directly from the AHS's emergency department (ED). Based on our anecdotal experiences and the prior single‐center study findings in adult medical populations,[9, 10] we hypothesized that the IHT population would be sicker, stay in the hospital and intensive care unit (ICU) longer, and have higher costs and in‐hospital mortality than ED patients. Although there may be fundamental differences between the 2 groups related to disease and patient condition, we hypothesized that outcome differences would persist even after adjusting for patient factors such as demographics, disease‐specific risk of mortality, and ICU utilization.

PATIENTS AND METHODS

We conducted a retrospective cohort study using data from the University HealthSystem Consortium (UHC) Clinical Database and Resource Manager (CDB/RM). UHC is an alliance of 120 academic medical centers and 300 of their affiliated hospitals for the purposes of collaboration on performance improvement. Each year, a subset of participating hospitals submits data on all of their inpatient discharges to the CDB/RM, which totals approximately 5 million records. The CDB/RM includes information from billing forms including demographics, diagnoses, and procedures as captured by International Classification of Diseases, Ninth Revision (ICD‐9) codes, discharge disposition, and line item charge detail for the type of bed (eg, floor, ICU). Most hospitals also provide detailed charge information including pharmacy, imaging, blood products, lab tests, and supplies. Some hospitals do not provide any charge data. The Beth Israel Deaconess Medical Center and University of Washington institutional review boards reviewed and approved the conduct of this study.

We included all inpatients discharged by hospitalists or general internal medicine physicians from UHC hospitals between April 1, 2011 and March 31, 2012. We excluded minors, pregnant patients, and prisoners. One hundred fifty‐eight adult academic medical centers and affiliated hospitals submitted data throughout this time period. Our primary independent variable, IHT status, was defined by patients whose admission source was another acute care institution. ED admissions were defined as patients admitted from the AHS ED whose source of origination was not another hospital or ambulatory surgery site.

Admission Characteristics

Admission characteristics of interest included age, gender, insurance status, the most common diagnoses in each cohort based on Medicare Severity Diagnosis‐Related Group (MS‐DRG), the most common Agency for Healthcare Research and Quality (AHRQ) comorbitidies,[11] the most common procedures, and the admission 3M All‐Patient Refined Diagnosis‐Related Group (APR‐DRG) risk of mortality (ROM) scores. 3M APR‐DRG ROM scores are proprietary categorical measures specific to the base APR‐DRG to which a patient is assigned, which are calculated using data available at the time of admission, including comorbid condition diagnosis codes, age, procedure codes, and principal diagnosis codes. A patient can fall into 1 of 4 categories with this score: minor, moderate, major, or extreme.[12]

Outcomes

Our primary outcome of interest was in‐hospital mortality. Secondary outcomes included LOS, the cost of care, ICU utilization, and discharge destination. The cost of care is a standardized estimate of the direct costs based on an adjustment of the charges submitted by CDB/RM participants. If an IHT is triaged through a receiving hospital's ED, the cost of care reflects those charges as well as the inpatient charges.

Statistical Analysis

We used descriptive statistics to characterize the IHT and ED patient populations. For bivariate comparisons of continuous variables, 2‐sample t tests with unequal variance were used. For categorical variables, 2 analysis was performed. We assessed the impact of IHT status on in‐hospital mortality using logistic regression to estimate unadjusted and adjusted relative risks, 95% confidence intervals (CIs), and P values. We included age, gender, insurance status, race, timing of ICU utilization, and 3M APR‐DRG ROM scores as independent variables. Prior studies have used this type of risk‐adjustment methodology with 3M APR‐DRG ROM scores,[13, 14, 15] including with interhospital transfer patients.[16] For all comparisons, a P value of <0.05 was considered statistically significant. Our sample size was determined by the data available for the 1‐year period.

Subgroup Analyses

We performed a stratified analysis based on the timing of ICU transfer to allow for additional comparisons of mortality within more homogeneous patient groups, and to control for the possibility that delays in ICU transfer could explain the association between IHT and in‐hospital mortality. We determined whether and when a patient spent time in the ICU based on daily accommodation charges. If a patient was charged for an ICU bed on the day of admission, we coded them as a direct ICU admission, and if the first ICU bed charge was on a subsequent day, they were coded as a delayed ICU admission. Approximately 20% of patients did not have the data necessary to determine the timing of ICU utilization, because the hospitals where they received care did not submit detailed charge data to the UHC.

Data analysis was performed by the UHC. Analysis was performed using Stata version 10 (StataCorp, College Station, TX). For all comparisons, a P value of <0.05 was considered significant.

RESULTS

Patient Characteristics

We identified 885,392 patients who met study criteria: 75,524 patients admitted as an IHT and 809,868 patients admitted from the ED. The proportion of each hospital's admissions that were IHTs that met our study criteria varied widely (median 9%, 25th percentile 3%, 75th percentile 14%). The average age and gender of the IHT and ED populations were similar and reflective of a nationally representative adult inpatient sample (Table 1). Racial compositions of the populations were notable for a higher portion of black patients in the ED admission group than the IHT group (25.4% vs 13.2%, P < 0.001). A slightly higher portion of the IHT population was covered by commercial insurance compared with the ED admissions (22.7% vs 19.1%, P < 0.001).

Characteristics of 885,392 Patients Discharged by Academic General Internists or Hospitalists by Source of Admission*
Demographic/Clinical VariablesEDIHT 
1st2nd 3rd4thRank
  • NOTE: Abbreviations: AHRQ, Agency for Healthcare Research and Quality; APR‐DRG admission ROM score, All‐Patient Refined Diagnosis‐Related Group Admission Risk of Mortality score; CC, complication or comorbidity (except under the AHRQ comorbidities where it refers to chronic complications); ED, emergency department (patients admitted from the academic health system's emergency department whose source of origination was not another hospital or ambulatory surgery site); GI, gastrointestinal; IHT, interhospital transfer (patients whose admission source was another acute care institution); MCC, major complication or comorbidity; MS‐DRG, Medicare Severity Diagnosis‐Related Group; MV, mechanical ventilation; SD, standard deviation. *All differences were significant at a level of P < 0.001. Denominator is the total number of patients. All other denominators are the total number of patients in that column. Subgroups may not sum to the total denominator due to incomplete data.

No. of patients809,86891.5 75,5248.5 
Age, y62.2 19.1  60.2 18.2  
Male381,56347.1 38,85051.4 
Female428,30352.9 36,67248.6 
Race      
White492,89460.9 54,78072.5 
Black205,30925.4 9,96813.2 
Other66,7098.1 7,77710.3 
Hispanic44,9565.6 2,9994.0 
Primary payer      
Commercial154,82619.1 17,13022.7 
Medicaid193,58523.9 15,92421.1 
Medicare445,22755.0 39,30152.0 
Other16,2302.0 3,1694.2 
Most common MS‐DRGs (top 5 for each group)      
Esophagitis, gastroenteritis, and miscellaneous digest disorders without MCC34,1164.21st1,5172.12nd
Septicemia or severe sepsis without MV 96+ hours with MCC25,7103.22nd2,6253.71st
Cellulitis without MCC21,6862.73rd8711.28th
Kidney and urinary tract infections without MCC19,9372.54th6310.921st
Chest pain18,0562.25th4950.734th
Renal failure with CC15,4781.99th1,0181.45th
GI hemorrhage with CC12,8551.612th1,2341.73rd
Respiratory system diagnosis w ventilator support4,7730.647th1,1181.64th
AHRQ comorbidities (top 5 for each group)      
Hypertension468,02617.81st39,34016.41st
Fluid and electrolyte disorders251,3399.52nd19,8258.32nd
Deficiency anemia208,7227.93rd19,6638.23rd
Diabetes without CCs190,1407.24th17,1317.14th
Chronic pulmonary disease178,1646.85th16,3196.85th
Most common procedures (top 5 for each group)      
Packed cell transfusion72,5907.01st9,7565.02nd
(Central) venous catheter insertion68,6876.72nd13,7557.01st
Hemodialysis41,5574.03rd5,3512.74th
Heart ultrasound (echocardiogram)37,7623.74th5,4412.83rd
Insert endotracheal tube25,3602.55th4,7052.46th
Continuous invasive mechanical ventilation19,2211.99th5,2802.75th
3M APR‐DRG admission ROM score      
Minor271,70233.6 18,62026.1 
Moderate286,42735.4 21,77530.5 
Major193,65223.9 20,53128.7 
Extreme58,0817.2 10,52714.7 

Primary discharge diagnoses (MS‐DRGs) varied widely, with no single diagnosis accounting for more than 4.2% of admissions in either group. The most common primary diagnoses among IHTs included severe sepsis (3.7%), esophagitis and gastroenteritis (2.1%), and gastrointestinal bleeding (1.7%). The top 5 most common AHRQ comorbidities were the same between the IHT and ED populations. A higher proportion of IHTs had at least 1 procedure performed during their hospitalization (68.5% vs 49.8%, P < 0.001). Note that ICD‐9 procedure codes include interventions such as blood transfusions and dialysis (Table 1), which may not be considered procedures in common medical parlance.

As compared with those admitted from the ED, IHTs had a higher proportion of patients categorized with major or extreme admission risk of mortality score (major + extreme, ED 31.1% vs IHT 43.5%, P < 0.001).

Overall Outcomes

IHT patients experienced a 60% longer average LOS, and a higher proportion spent time in the ICU than patients admitted through the ED (Table 2). On average, care for IHT patients cost more per day than for ED patients (Table 2). A lower proportion of IHTs were discharged home (68.6% vs 77.4% of ED patients), and a higher proportion died in the hospital (4.1% vs 1.8%) (P < 0.001 for both). Of the ED or IHT patients who died during their admission, there was no significant difference between the proportion who died within 48 hours of admission (26.4% vs 25.6%, P = 0.3693). After adjusting for age, gender, insurance status, race, ICU utilization and 3M APR‐DRG admission ROM scores, IHT was independently associated with the risk of in‐hospital death (odds ratio [OR]: 1.36, 95% CI: 1.291.43) (Table 3). The C statistic for the in‐hospital mortality model was 0.88.

Outcomes of 885,392 Academic Health System Patients Based on Source of Admission*
 ED, n = 809,868IHT, n = 75,524
  • NOTE: Abbreviations: ED, emergency department (patients admitted from the academic health system's emergency department whose source of origination was not another hospital or ambulatory surgery site); ICU, intensive care unit; IHT, interhospital transfer (patients whose admission source was another acute care institution); LOS, length of stay; SD, standard deviation. *All differences were significant at a level of P < 0.001 except the portion of deaths in 48 hours. ICU days data were available for 798,132 patients admitted from the ED and 71,054 IHT patients. Cost data were available for 792,604 patients admitted from the ED and 71,033 IHT patients.

LOS, mean SD5.0 6.98.0 13.4
ICU days, mean SD0.6 2.41.7 5.2
Patients who spent some time in the ICU14.3%29.8%
% LOS in the ICU (ICU days LOS)11.0%21.6%
Average total cost SD$10,731 $16,593$19,818 $34,665
Average cost per day (total cost LOS)$2,139$2,492
Discharged home77.4%68.6%
Died as inpatient14,869 (1.8%)3,051 (4.0%)
Died within 48 hours of admission (% total deaths)3,918 (26.4%)780 (25.6%)
Multivariable Model of In‐hospital Mortality (n = 707,248)
VariableUnadjusted OR (95% CI)Adjusted OR (95% CI)
  • NOTE: Abbreviations: APR‐DRG admission ROM score, All‐Patient Refined Diagnosis‐Related Group Admission Risk of Mortality score; CI, confidence interval; ICU, intensive care unit; IHT, interhospital transfer (patients whose admission source was another acute care institution); OR, odds ratio.

Age, y1.00 (1.001.00)1.03 (1.031.03)
Gender  
FemaleRef.Ref.
Male1.13 (1.091.70)1.05 (1.011.09)
Medicare status  
NoRef.Ref.
Yes2.14 (2.062.22)1.39 (1.331.47)
Race  
NonblackRef.Ref.
Black0.57 (0.550.60)0.77 (0.730.81)
ICU utilization  
No ICU admissionRef.Ref.
Direct admission to the ICU5.56 (5.295.84)2.25 (2.132.38)
Delayed ICU admission5.48 (5.275.69)2.46 (2.362.57)
3M APR‐DRG admission ROM score  
MinorRef.Ref.
Moderate8.71 (7.5510.05)6.28 (5.437.25)
Major43.97 (38.3150.47)25.84 (22.4729.71)
Extreme238.65 (207.69273.80)107.17 (93.07123.40)
IHT  
NoRef.Ref.
Yes2.36 (2.262.48)1.36 (1.29 1.43)

Subgroup Analyses

Table 4 demonstrates the unadjusted and adjusted results from our analysis stratified by timing of ICU utilization. IHT remained independently associated with in‐hospital mortality regardless of timing of ICU utilization.

Unadjusted and Adjusted Associations Between IHT and In‐hospital Mortality, Stratified by ICU Timing*
SubgroupIn‐hospital Mortality, n (%)Unadjusted OR [95% CI]Adjusted OR [95% CI]
  • NOTE: Abbreviations: CI, confidence interval; ED, emergency department (patients admitted from the academic health system's emergency department whose source of origination was not another hospital or ambulatory surgery site); ICU, intensive care unit; IHT, interhospital transfer (patients whose admission source was another acute care institution); OR, odds ratio. *Timing of ICU utilization data were available for 650,608 of the patients admitted from the ED (80% of all ED admissions) and 56,640 of the IHT patients (75% of all IHTs).

No ICU admission, n = 552,171   
ED, n = 519,4214,913 (0.95%)Ref.Ref.
IHT, n = 32,750590 (1.80%)1.92 [1.762.09]1.68 [1.531.84]
Direct admission to the ICU, n = 44,537   
ED, n = 35,6141,733 (4.87%)Ref.Ref.
IHT, n = 8,923628 (7.04%)1.48 [1.351.63]1.24 [1.121.37]
Delayed ICU admission, n = 110,540   
ED, n = 95,5734,706 (4.92%)Ref.Ref.
IHT, n = 14,9671,068 (7.14%)1.48 [1.391.59]1.25 [1.171.35]

DISCUSSION

Our study of IHT patients ultimately discharged by hospitalists and general internists at US academic referral centers found significantly increased average LOS, costs, and in‐hospital mortality compared with patients admitted from the ED. The increased risk of mortality persisted after adjustment for patient characteristics and variables representing endogenous risk of mortality, and in more homogeneous subgroups after stratification by presence and timing of ICU utilization. These data confirm findings from single‐center studies and suggest that observations about the difference between IHT and ED populations may be generalizable across US academic hospitals.

Our work builds on 2 single‐center studies that examined mixed medical and surgical academic IHT populations from the late 1980s and early 1990s,[9, 10] and 1 studying surgical ICU patients in 2013.[17] These studies demonstrated longer average LOS, higher costs, and higher mortality rates (in both adjusted and unadjusted analyses). Our work confirmed these findings utilizing a more current, multicenter large dataset of IHT patients ultimately discharged by hospitalists and general internists. Our work is unique from a larger, more recent study[7] in that it focuses on patients transferred to academic health systems, and therefore has particular relevance to those settings. In addition, we divided patients into subpopulations based on the timing of ICU utilization, and found that in each of these populations, IHT remained independently associated with in‐hospital mortality.

Our analysis does not explain why the outcomes of IHTs are worse, but plausible contributing factors include that (1) patients chosen for IHT are at higher risk of death in ways uncaptured by established mortality risk scores, (2) referring, transferring, or accepting providers and institutions have provided inadequate care, (3) the transfer process itself involves harm, (4) socioeconomic bias in selection for IHT,[18] or (5) some combination of the above. Regardless of the causes of the worse outcomes observed in these outside‐hospital transfers, as these patients are colloquially known at accepting hospitals, they present challenges to everyone involved. Referring providers may feel a sense of urgency as these patients' needs exceed their management capabilities. The process is often time consuming and burdensome for referring and accepting providers because of poorly developed systems.[19] The transfer often takes patients further from their home and may make it more difficult for family to participate in their care. The transfer may delay care if the accepting institution cannot immediately accept the patient or if the time in transport is prolonged, which could result in decompensation at a critical juncture. For providers inheriting such patients, the stress of caring for these patients is compounded by the difficulty obtaining records about the prior hospitalization.[20] This frustrating experience is often translated into unfounded judgment of the institution that referred the patient and the care provided there.[21] It is important for hospitalists making decisions throughout the transfer process and for hospital leaders who determine staffing levels, measure the quality of care, manage hospital networks, or write hospital policy to appreciate that the transfer process itself may contribute to the challenges and poor outcomes we observe. Furthermore, regardless of the cause for the increased mortality that we observed, our findings imply that IHT patients require careful evaluation, management, and treatment.

Many accepting institutions have transfer centers that facilitate these transitions, utilizing protocols and templates to standardize the process.[22, 23] Future research should focus on the characteristics of these centers to learn which practices are most efficacious. Interventions to mitigate the known challenges of transfer (including patient selection and triage, handoff communication, and information sharing) could be tested by randomized studies at referring and accepting institutions. There may be a role for health information exchange or the development of enhanced pretransfer evaluation processes using telemedicine models; there is evidence that information sharing may reduce redundant imaging.[24] Perhaps targeted review of IHTs admitted to a non‐ICU portion of the hospital and subsequently transferred to the ICU could identify opportunities to improve triaging protocols and thus avert some of the bad outcomes observed in this subpopulation. A related future direction could be to create protected forumsusing the patient safety organization framework[25]to facilitate the discussion of interhospital transfer outcomes among the referring, transporting, and receiving parties. Lastly, future work should investigate the reasons for the different proportions of black patients in the ED versus IHT cohorts. Our finding that black race was associated with lower risk of mortality has been previously reported but may also benefit from more investigation.[26]

There are several limitations of our work. First, despite extensive adjustment for patient characteristics, due to the observational nature of our study it is still possible that IHTs differ from ED admissions in ways that were unaccounted for in our analysis, and which could be associated with increased mortality independent of the transfer process itself. We are unable to characterize features of the transfer process, such as the reason for transfer, differences in transfer processes among hospitals, or the distance and mode of travel, which may influence outcomes.[27] Because we used administrative data, variations in coding could incorrectly estimate the complexity or severity of illness on admission, which is a previously described risk.[28] In addition, although our dataset was very large, it was limited by incomplete charge data, which limited our ability to measure ICU utilization in our full cohort. The hospitals missing ICU charge data are of variable sizes and are distributed around the country, limiting the chance of systematic bias. Finally, in some settings, hospitalists may serve as the discharging physician for patients admitted to other services such as the ICU, introducing heterogeneity and bias to the sample. We attempted to mitigate such bias through our subgroup analysis, which allowed for comparisons within more homogeneous patient groupings.

In conclusion, our large multicenter study of academic health systems confirms the findings of prior single‐center academic studies and a large general population study that interhospital transfer patients have an increased average LOS, costs, and adjusted in‐hospital mortality than patients admitted from the ED. This difference in mortality persisted even after controlling for several other predictors of mortality. Our findings emphasize the need for future studies designed to clarify the reason for the increased risk and identify targets for interventions to improve outcomes for the interhospital transfer population.

Acknowledgements

The authors gratefully acknowledge Zachary Goldberger and Tom Gallagher for their critical reviews of this article.

Disclosures

Dr. Herzig was funded by grant number K23AG042459 from the National Institute on Aging. The funding organization had no involvement in any aspect of the study, including design and conduct of the study; collection, management, analysis, and interpretation of the data; and preparation, review, or approval of the manuscript. The authors report no conflicts of interest.

Interhospital transfers (IHTs) to academic medical centers (AMCs) or their affiliated hospitals may benefit patients who require unique specialty and procedural services. However, IHTs also introduce a potentially risky transition of care for patients suffering from complex or unstable medical problems.[1] Components of this risk include the dangers associated with transportation and the disrupted continuity of care that may lead to delays or errors in care.[2, 3] Furthermore, referring and accepting providers may face barriers to optimal handoffs including a lack of shared communication standards and difficulty accessing external medical records.[3, 4, 5] Although some authors have recommended the creation of formal guidelines for interhospital transfer processes for all patients to mitigate the risks of transfer, the available guidelines governing the IHT triage and communication process are limited to critically ill patients.[6]

A recent study of a diverse patient and hospital dataset demonstrated that interhospital transfer patients have a higher risk of mortality, increased length of stay (LOS), and increased risk of adverse events as compared with non‐transfer patients.[7] However, it is unknown if these findings persist in the population of patients transferred specifically to AMCs or their affiliated hospitals (the combination is hereafter referred to as academic health systems [AHSs]). AMCs provide a disproportionate share of IHT care for complex patients and have a vested interest in improving the outcomes of these transitions.[8] Prior single‐center studies of acute care adult medical patients accepted to AMCs have shown that IHT is associated with a longer LOS, increased in‐hospital mortality, and higher resource use.[9, 10] However, it is difficult to generalize from single‐center studies due to the variation in referral practices, geography, and network characteristics. Additionally, AMC referral systems, patient mix, and utilization of hospitalists have likely changed substantially in the nearly 2 decades since those reports were published.

Hospitalists and general internists often manage the transfer acceptance processes for internal medicine services at receiving hospitals, helping to triage and coordinate care for IHT patients. As a result, it is important for hospitalists to understand the characteristics and outcomes of the IHT population. In addition to informing the decision making around transfer for a given patient, such an understanding is the foundation for helping providers and institutions begin to systematically identify and mitigate peritransfer risks.

We conducted this large multicenter study to describe the characteristics and outcomes of a current, nationally representative IHT patient population discharged by hospitalists and general internists at AHSs. To identify unique features of the IHT population, we compared patients transferred from another hospital to an AHS to those admitted to the AHS directly from the AHS's emergency department (ED). Based on our anecdotal experiences and the prior single‐center study findings in adult medical populations,[9, 10] we hypothesized that the IHT population would be sicker, stay in the hospital and intensive care unit (ICU) longer, and have higher costs and in‐hospital mortality than ED patients. Although there may be fundamental differences between the 2 groups related to disease and patient condition, we hypothesized that outcome differences would persist even after adjusting for patient factors such as demographics, disease‐specific risk of mortality, and ICU utilization.

PATIENTS AND METHODS

We conducted a retrospective cohort study using data from the University HealthSystem Consortium (UHC) Clinical Database and Resource Manager (CDB/RM). UHC is an alliance of 120 academic medical centers and 300 of their affiliated hospitals for the purposes of collaboration on performance improvement. Each year, a subset of participating hospitals submits data on all of their inpatient discharges to the CDB/RM, which totals approximately 5 million records. The CDB/RM includes information from billing forms including demographics, diagnoses, and procedures as captured by International Classification of Diseases, Ninth Revision (ICD‐9) codes, discharge disposition, and line item charge detail for the type of bed (eg, floor, ICU). Most hospitals also provide detailed charge information including pharmacy, imaging, blood products, lab tests, and supplies. Some hospitals do not provide any charge data. The Beth Israel Deaconess Medical Center and University of Washington institutional review boards reviewed and approved the conduct of this study.

We included all inpatients discharged by hospitalists or general internal medicine physicians from UHC hospitals between April 1, 2011 and March 31, 2012. We excluded minors, pregnant patients, and prisoners. One hundred fifty‐eight adult academic medical centers and affiliated hospitals submitted data throughout this time period. Our primary independent variable, IHT status, was defined by patients whose admission source was another acute care institution. ED admissions were defined as patients admitted from the AHS ED whose source of origination was not another hospital or ambulatory surgery site.

Admission Characteristics

Admission characteristics of interest included age, gender, insurance status, the most common diagnoses in each cohort based on Medicare Severity Diagnosis-Related Group (MS-DRG), the most common Agency for Healthcare Research and Quality (AHRQ) comorbidities,[11] the most common procedures, and the admission 3M All-Patient Refined Diagnosis-Related Group (APR-DRG) risk of mortality (ROM) scores. 3M APR-DRG ROM scores are proprietary categorical measures specific to the base APR-DRG to which a patient is assigned, which are calculated using data available at the time of admission, including comorbid condition diagnosis codes, age, procedure codes, and principal diagnosis codes. A patient can fall into 1 of 4 categories with this score: minor, moderate, major, or extreme.[12]

Outcomes

Our primary outcome of interest was in‐hospital mortality. Secondary outcomes included LOS, the cost of care, ICU utilization, and discharge destination. The cost of care is a standardized estimate of the direct costs based on an adjustment of the charges submitted by CDB/RM participants. If an IHT is triaged through a receiving hospital's ED, the cost of care reflects those charges as well as the inpatient charges.
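
The exact CDB/RM cost-standardization algorithm is not described in this article. As a rough, illustrative sketch of the general idea of converting submitted charges to estimated direct costs, the snippet below applies department-level cost-to-charge ratios; all department names, ratios, and the default value are hypothetical assumptions, not the UHC method.

```python
# Illustrative sketch only: the actual UHC CDB/RM cost standardization is not
# described here. This shows the general idea of converting submitted charges
# to estimated direct costs with department-level cost-to-charge ratios.
# All names and values below are hypothetical assumptions.

from typing import Dict, List

# Hypothetical cost-to-charge ratios by charge department
COST_TO_CHARGE: Dict[str, float] = {
    "room_and_board": 0.55,
    "pharmacy": 0.25,
    "imaging": 0.30,
    "laboratory": 0.35,
}

def estimated_direct_cost(line_items: List[Dict]) -> float:
    """Sum charge line items after applying a department-specific ratio."""
    total = 0.0
    for item in line_items:
        ratio = COST_TO_CHARGE.get(item["department"], 0.40)  # assumed default
        total += item["charge"] * ratio
    return total

# Example: an ED-triaged transfer whose ED charges are included as well
items = [
    {"department": "room_and_board", "charge": 12000.0},
    {"department": "pharmacy", "charge": 3000.0},
    {"department": "imaging", "charge": 2500.0},
]
print(round(estimated_direct_cost(items), 2))  # 8100.0
```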

Statistical Analysis

We used descriptive statistics to characterize the IHT and ED patient populations. For bivariate comparisons of continuous variables, 2-sample t tests with unequal variance were used. For categorical variables, chi-square (χ2) analysis was performed. We assessed the impact of IHT status on in-hospital mortality using logistic regression to estimate unadjusted and adjusted relative risks, 95% confidence intervals (CIs), and P values. We included age, gender, insurance status, race, timing of ICU utilization, and 3M APR-DRG ROM scores as independent variables. Prior studies have used this type of risk-adjustment methodology with 3M APR-DRG ROM scores,[13, 14, 15] including with interhospital transfer patients.[16] For all comparisons, a P value of <0.05 was considered statistically significant. Our sample size was determined by the data available for the 1-year period.
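
A minimal sketch of the kind of unadjusted and adjusted logistic regression described above is shown below, using statsmodels. The data frame and column names (iht, died, age, male, medicare, black, icu_timing, rom_score) are hypothetical stand-ins for the CDB/RM-derived variables; the actual UHC modeling pipeline is not reproduced here.

```python
# Hedged sketch of the unadjusted and adjusted logistic regression described
# above. Column names are hypothetical stand-ins for the CDB/RM-derived
# variables; this is illustrative, not the study's actual analysis code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_mortality_models(df: pd.DataFrame):
    # Unadjusted: in-hospital death as a function of transfer status only.
    unadjusted = smf.logit("died ~ iht", data=df).fit(disp=False)

    # Adjusted: add demographics, payer, race, ICU timing, and admission
    # risk-of-mortality category as covariates.
    adjusted = smf.logit(
        "died ~ iht + age + male + medicare + black "
        "+ C(icu_timing) + C(rom_score)",
        data=df,
    ).fit(disp=False)

    # Report odds ratios with 95% confidence intervals for the IHT term.
    for name, model in [("unadjusted", unadjusted), ("adjusted", adjusted)]:
        or_ = np.exp(model.params["iht"])
        lo, hi = np.exp(model.conf_int().loc["iht"])
        print(f"{name}: OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    return unadjusted, adjusted
```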

Subgroup Analyses

We performed a stratified analysis based on the timing of ICU transfer to allow for additional comparisons of mortality within more homogeneous patient groups, and to control for the possibility that delays in ICU transfer could explain the association between IHT and in‐hospital mortality. We determined whether and when a patient spent time in the ICU based on daily accommodation charges. If a patient was charged for an ICU bed on the day of admission, we coded them as a direct ICU admission, and if the first ICU bed charge was on a subsequent day, they were coded as a delayed ICU admission. Approximately 20% of patients did not have the data necessary to determine the timing of ICU utilization, because the hospitals where they received care did not submit detailed charge data to the UHC.
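
A minimal sketch of the ICU-timing rule described above follows, assuming a hypothetical long-format table of daily accommodation charges (one row per patient-day with an admission date, charge date, and bed type); the column names and layout are illustrative, not the CDB/RM schema.

```python
# Minimal sketch of the ICU-timing rule described above. The input layout
# (one row per patient-day with admit_date, charge_date, bed_type) is a
# hypothetical stand-in for the daily accommodation charge detail.
import pandas as pd

def classify_icu_timing(daily_charges: pd.DataFrame) -> pd.Series:
    """Return 'direct', 'delayed', or 'none' ICU timing per patient_id."""
    icu = daily_charges[daily_charges["bed_type"] == "ICU"]
    first_icu = icu.groupby("patient_id")["charge_date"].min()
    admit = daily_charges.groupby("patient_id")["admit_date"].first()

    timing = pd.Series("none", index=admit.index, name="icu_timing")
    direct = first_icu[first_icu == admit.loc[first_icu.index]].index
    delayed = first_icu.index.difference(direct)
    timing.loc[direct] = "direct"    # ICU bed charged on the day of admission
    timing.loc[delayed] = "delayed"  # first ICU bed charge on a later day
    return timing

# Example usage with toy data
df = pd.DataFrame({
    "patient_id": [1, 1, 2, 2, 3],
    "admit_date": pd.to_datetime(["2011-04-01"] * 2 + ["2011-04-02"] * 2 + ["2011-04-03"]),
    "charge_date": pd.to_datetime(["2011-04-01", "2011-04-02",
                                   "2011-04-02", "2011-04-03", "2011-04-03"]),
    "bed_type": ["floor", "ICU", "ICU", "ICU", "floor"],
})
print(classify_icu_timing(df))  # 1 -> delayed, 2 -> direct, 3 -> none
```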

Data analysis was performed by the UHC using Stata version 10 (StataCorp, College Station, TX).

RESULTS

Patient Characteristics

We identified 885,392 patients who met study criteria: 75,524 patients admitted as an IHT and 809,868 patients admitted from the ED. The proportion of each hospital's admissions that were IHTs meeting our study criteria varied widely (median, 9%; 25th percentile, 3%; 75th percentile, 14%). The average age and gender distribution of the IHT and ED populations were similar and reflective of a nationally representative adult inpatient sample (Table 1). The racial composition of the populations was notable for a higher proportion of black patients in the ED admission group than in the IHT group (25.4% vs 13.2%, P < 0.001). A slightly higher proportion of the IHT population was covered by commercial insurance compared with the ED admissions (22.7% vs 19.1%, P < 0.001).

Characteristics of 885,392 Patients Discharged by Academic General Internists or Hospitalists by Source of Admission*

Demographic/Clinical Variables | ED, n (%) | Rank | IHT, n (%) | Rank
No. of patients | 809,868 (91.5) |  | 75,524 (8.5) |
Age, y, mean ± SD | 62.2 ± 19.1 |  | 60.2 ± 18.2 |
Male | 381,563 (47.1) |  | 38,850 (51.4) |
Female | 428,303 (52.9) |  | 36,672 (48.6) |
Race
  White | 492,894 (60.9) |  | 54,780 (72.5) |
  Black | 205,309 (25.4) |  | 9,968 (13.2) |
  Other | 66,709 (8.1) |  | 7,777 (10.3) |
  Hispanic | 44,956 (5.6) |  | 2,999 (4.0) |
Primary payer
  Commercial | 154,826 (19.1) |  | 17,130 (22.7) |
  Medicaid | 193,585 (23.9) |  | 15,924 (21.1) |
  Medicare | 445,227 (55.0) |  | 39,301 (52.0) |
  Other | 16,230 (2.0) |  | 3,169 (4.2) |
Most common MS-DRGs (top 5 for each group)
  Esophagitis, gastroenteritis, and miscellaneous digest disorders without MCC | 34,116 (4.2) | 1st | 1,517 (2.1) | 2nd
  Septicemia or severe sepsis without MV 96+ hours with MCC | 25,710 (3.2) | 2nd | 2,625 (3.7) | 1st
  Cellulitis without MCC | 21,686 (2.7) | 3rd | 871 (1.2) | 8th
  Kidney and urinary tract infections without MCC | 19,937 (2.5) | 4th | 631 (0.9) | 21st
  Chest pain | 18,056 (2.2) | 5th | 495 (0.7) | 34th
  Renal failure with CC | 15,478 (1.9) | 9th | 1,018 (1.4) | 5th
  GI hemorrhage with CC | 12,855 (1.6) | 12th | 1,234 (1.7) | 3rd
  Respiratory system diagnosis with ventilator support | 4,773 (0.6) | 47th | 1,118 (1.6) | 4th
AHRQ comorbidities (top 5 for each group)
  Hypertension | 468,026 (17.8) | 1st | 39,340 (16.4) | 1st
  Fluid and electrolyte disorders | 251,339 (9.5) | 2nd | 19,825 (8.3) | 2nd
  Deficiency anemia | 208,722 (7.9) | 3rd | 19,663 (8.2) | 3rd
  Diabetes without CCs | 190,140 (7.2) | 4th | 17,131 (7.1) | 4th
  Chronic pulmonary disease | 178,164 (6.8) | 5th | 16,319 (6.8) | 5th
Most common procedures (top 5 for each group)
  Packed cell transfusion | 72,590 (7.0) | 1st | 9,756 (5.0) | 2nd
  (Central) venous catheter insertion | 68,687 (6.7) | 2nd | 13,755 (7.0) | 1st
  Hemodialysis | 41,557 (4.0) | 3rd | 5,351 (2.7) | 4th
  Heart ultrasound (echocardiogram) | 37,762 (3.7) | 4th | 5,441 (2.8) | 3rd
  Insert endotracheal tube | 25,360 (2.5) | 5th | 4,705 (2.4) | 6th
  Continuous invasive mechanical ventilation | 19,221 (1.9) | 9th | 5,280 (2.7) | 5th
3M APR-DRG admission ROM score
  Minor | 271,702 (33.6) |  | 18,620 (26.1) |
  Moderate | 286,427 (35.4) |  | 21,775 (30.5) |
  Major | 193,652 (23.9) |  | 20,531 (28.7) |
  Extreme | 58,081 (7.2) |  | 10,527 (14.7) |

NOTE: Abbreviations: AHRQ, Agency for Healthcare Research and Quality; APR-DRG admission ROM score, All-Patient Refined Diagnosis-Related Group Admission Risk of Mortality score; CC, complication or comorbidity (except under the AHRQ comorbidities where it refers to chronic complications); ED, emergency department (patients admitted from the academic health system's emergency department whose source of origination was not another hospital or ambulatory surgery site); GI, gastrointestinal; IHT, interhospital transfer (patients whose admission source was another acute care institution); MCC, major complication or comorbidity; MS-DRG, Medicare Severity Diagnosis-Related Group; MV, mechanical ventilation; SD, standard deviation. *All differences were significant at a level of P < 0.001. Denominator is the total number of patients. All other denominators are the total number of patients in that column. Subgroups may not sum to the total denominator due to incomplete data.

Primary discharge diagnoses (MS‐DRGs) varied widely, with no single diagnosis accounting for more than 4.2% of admissions in either group. The most common primary diagnoses among IHTs included severe sepsis (3.7%), esophagitis and gastroenteritis (2.1%), and gastrointestinal bleeding (1.7%). The top 5 most common AHRQ comorbidities were the same between the IHT and ED populations. A higher proportion of IHTs had at least 1 procedure performed during their hospitalization (68.5% vs 49.8%, P < 0.001). Note that ICD‐9 procedure codes include interventions such as blood transfusions and dialysis (Table 1), which may not be considered procedures in common medical parlance.

As compared with those admitted from the ED, IHTs had a higher proportion of patients categorized with major or extreme admission risk of mortality score (major + extreme, ED 31.1% vs IHT 43.5%, P < 0.001).

Overall Outcomes

IHT patients experienced a 60% longer average LOS, and a higher proportion spent time in the ICU than patients admitted through the ED (Table 2). On average, care for IHT patients cost more per day than for ED patients (Table 2). A lower proportion of IHTs were discharged home (68.6% vs 77.4% of ED patients), and a higher proportion died in the hospital (4.1% vs 1.8%) (P < 0.001 for both). Of the ED or IHT patients who died during their admission, there was no significant difference in the proportion who died within 48 hours of admission (26.4% vs 25.6%, P = 0.3693). After adjusting for age, gender, insurance status, race, ICU utilization, and 3M APR-DRG admission ROM scores, IHT was independently associated with the risk of in-hospital death (odds ratio [OR]: 1.36; 95% CI: 1.29-1.43) (Table 3). The C statistic for the in-hospital mortality model was 0.88.

Outcomes of 885,392 Academic Health System Patients Based on Source of Admission*

Outcome | ED, n = 809,868 | IHT, n = 75,524
LOS, mean ± SD | 5.0 ± 6.9 | 8.0 ± 13.4
ICU days, mean ± SD | 0.6 ± 2.4 | 1.7 ± 5.2
Patients who spent some time in the ICU | 14.3% | 29.8%
% LOS in the ICU (ICU days ÷ LOS) | 11.0% | 21.6%
Average total cost ± SD | $10,731 ± $16,593 | $19,818 ± $34,665
Average cost per day (total cost ÷ LOS) | $2,139 | $2,492
Discharged home | 77.4% | 68.6%
Died as inpatient | 14,869 (1.8%) | 3,051 (4.0%)
Died within 48 hours of admission (% total deaths) | 3,918 (26.4%) | 780 (25.6%)

NOTE: Abbreviations: ED, emergency department (patients admitted from the academic health system's emergency department whose source of origination was not another hospital or ambulatory surgery site); ICU, intensive care unit; IHT, interhospital transfer (patients whose admission source was another acute care institution); LOS, length of stay; SD, standard deviation. *All differences were significant at a level of P < 0.001 except the proportion of deaths within 48 hours. ICU days data were available for 798,132 patients admitted from the ED and 71,054 IHT patients. Cost data were available for 792,604 patients admitted from the ED and 71,033 IHT patients.
Multivariable Model of In-hospital Mortality (n = 707,248)

Variable | Unadjusted OR (95% CI) | Adjusted OR (95% CI)
Age, y | 1.00 (1.00-1.00) | 1.03 (1.03-1.03)
Gender
  Female | Ref. | Ref.
  Male | 1.13 (1.09-1.70) | 1.05 (1.01-1.09)
Medicare status
  No | Ref. | Ref.
  Yes | 2.14 (2.06-2.22) | 1.39 (1.33-1.47)
Race
  Nonblack | Ref. | Ref.
  Black | 0.57 (0.55-0.60) | 0.77 (0.73-0.81)
ICU utilization
  No ICU admission | Ref. | Ref.
  Direct admission to the ICU | 5.56 (5.29-5.84) | 2.25 (2.13-2.38)
  Delayed ICU admission | 5.48 (5.27-5.69) | 2.46 (2.36-2.57)
3M APR-DRG admission ROM score
  Minor | Ref. | Ref.
  Moderate | 8.71 (7.55-10.05) | 6.28 (5.43-7.25)
  Major | 43.97 (38.31-50.47) | 25.84 (22.47-29.71)
  Extreme | 238.65 (207.69-273.80) | 107.17 (93.07-123.40)
IHT
  No | Ref. | Ref.
  Yes | 2.36 (2.26-2.48) | 1.36 (1.29-1.43)

NOTE: Abbreviations: APR-DRG admission ROM score, All-Patient Refined Diagnosis-Related Group Admission Risk of Mortality score; CI, confidence interval; ICU, intensive care unit; IHT, interhospital transfer (patients whose admission source was another acute care institution); OR, odds ratio.

Subgroup Analyses

Table 4 demonstrates the unadjusted and adjusted results from our analysis stratified by timing of ICU utilization. IHT remained independently associated with in‐hospital mortality regardless of timing of ICU utilization.

Unadjusted and Adjusted Associations Between IHT and In-hospital Mortality, Stratified by ICU Timing*

Subgroup | In-hospital Mortality, n (%) | Unadjusted OR [95% CI] | Adjusted OR [95% CI]
No ICU admission, n = 552,171
  ED, n = 519,421 | 4,913 (0.95%) | Ref. | Ref.
  IHT, n = 32,750 | 590 (1.80%) | 1.92 [1.76-2.09] | 1.68 [1.53-1.84]
Direct admission to the ICU, n = 44,537
  ED, n = 35,614 | 1,733 (4.87%) | Ref. | Ref.
  IHT, n = 8,923 | 628 (7.04%) | 1.48 [1.35-1.63] | 1.24 [1.12-1.37]
Delayed ICU admission, n = 110,540
  ED, n = 95,573 | 4,706 (4.92%) | Ref. | Ref.
  IHT, n = 14,967 | 1,068 (7.14%) | 1.48 [1.39-1.59] | 1.25 [1.17-1.35]

NOTE: Abbreviations: CI, confidence interval; ED, emergency department (patients admitted from the academic health system's emergency department whose source of origination was not another hospital or ambulatory surgery site); ICU, intensive care unit; IHT, interhospital transfer (patients whose admission source was another acute care institution); OR, odds ratio. *Timing of ICU utilization data were available for 650,608 of the patients admitted from the ED (80% of all ED admissions) and 56,640 of the IHT patients (75% of all IHTs).

DISCUSSION

Our study of IHT patients ultimately discharged by hospitalists and general internists at US academic referral centers found significantly increased average LOS, costs, and in‐hospital mortality compared with patients admitted from the ED. The increased risk of mortality persisted after adjustment for patient characteristics and variables representing endogenous risk of mortality, and in more homogeneous subgroups after stratification by presence and timing of ICU utilization. These data confirm findings from single‐center studies and suggest that observations about the difference between IHT and ED populations may be generalizable across US academic hospitals.

Our work builds on 2 single-center studies that examined mixed medical and surgical academic IHT populations from the late 1980s and early 1990s,[9, 10] and 1 study of surgical ICU patients in 2013.[17] These studies demonstrated longer average LOS, higher costs, and higher mortality rates (in both adjusted and unadjusted analyses). Our work confirmed these findings using a more current, large multicenter dataset of IHT patients ultimately discharged by hospitalists and general internists. Our work differs from a larger, more recent study[7] in that it focuses on patients transferred to academic health systems, and therefore has particular relevance to those settings. In addition, we divided patients into subpopulations based on the timing of ICU utilization, and found that in each of these populations, IHT remained independently associated with in-hospital mortality.

Our analysis does not explain why the outcomes of IHTs are worse, but plausible contributing factors include the following: (1) patients chosen for IHT are at higher risk of death in ways uncaptured by established mortality risk scores, (2) referring, transferring, or accepting providers and institutions have provided inadequate care, (3) the transfer process itself involves harm, (4) there is socioeconomic bias in selection for IHT,[18] or (5) some combination of the above. Regardless of the causes of the worse outcomes observed in these "outside-hospital transfers," as these patients are colloquially known at accepting hospitals, they present challenges to everyone involved. Referring providers may feel a sense of urgency as these patients' needs exceed their management capabilities. The process is often time consuming and burdensome for referring and accepting providers because of poorly developed systems.[19] The transfer often takes patients further from their home and may make it more difficult for family to participate in their care. The transfer may delay care if the accepting institution cannot immediately accept the patient or if the time in transport is prolonged, which could result in decompensation at a critical juncture. For providers inheriting such patients, the stress of caring for these patients is compounded by the difficulty obtaining records about the prior hospitalization.[20] This frustrating experience is often translated into unfounded judgment of the institution that referred the patient and the care provided there.[21] It is important for hospitalists making decisions throughout the transfer process and for hospital leaders who determine staffing levels, measure the quality of care, manage hospital networks, or write hospital policy to appreciate that the transfer process itself may contribute to the challenges and poor outcomes we observe. Furthermore, regardless of the cause of the increased mortality that we observed, our findings imply that IHT patients require careful evaluation, management, and treatment.

Many accepting institutions have transfer centers that facilitate these transitions, utilizing protocols and templates to standardize the process.[22, 23] Future research should focus on the characteristics of these centers to learn which practices are most efficacious. Interventions to mitigate the known challenges of transfer (including patient selection and triage, handoff communication, and information sharing) could be tested by randomized studies at referring and accepting institutions. There may be a role for health information exchange or the development of enhanced pretransfer evaluation processes using telemedicine models; there is evidence that information sharing may reduce redundant imaging.[24] Perhaps targeted review of IHTs admitted to a non-ICU portion of the hospital and subsequently transferred to the ICU could identify opportunities to improve triaging protocols and thus avert some of the bad outcomes observed in this subpopulation. A related future direction could be to create protected forums, using the patient safety organization framework,[25] to facilitate the discussion of interhospital transfer outcomes among the referring, transporting, and receiving parties. Lastly, future work should investigate the reasons for the different proportions of black patients in the ED versus IHT cohorts. Our finding that black race was associated with lower risk of mortality has been previously reported but may also benefit from more investigation.[26]

There are several limitations of our work. First, despite extensive adjustment for patient characteristics, due to the observational nature of our study it is still possible that IHTs differ from ED admissions in ways that were unaccounted for in our analysis, and which could be associated with increased mortality independent of the transfer process itself. We are unable to characterize features of the transfer process, such as the reason for transfer, differences in transfer processes among hospitals, or the distance and mode of travel, which may influence outcomes.[27] Because we used administrative data, variations in coding could incorrectly estimate the complexity or severity of illness on admission, which is a previously described risk.[28] In addition, although our dataset was very large, it was limited by incomplete charge data, which limited our ability to measure ICU utilization in our full cohort. The hospitals missing ICU charge data are of variable sizes and are distributed around the country, limiting the chance of systematic bias. Finally, in some settings, hospitalists may serve as the discharging physician for patients admitted to other services such as the ICU, introducing heterogeneity and bias to the sample. We attempted to mitigate such bias through our subgroup analysis, which allowed for comparisons within more homogeneous patient groupings.

In conclusion, our large multicenter study of academic health systems confirms the findings of prior single-center academic studies and a large general population study: interhospital transfer patients have a longer average LOS, higher costs, and higher adjusted in-hospital mortality than patients admitted from the ED. This difference in mortality persisted even after controlling for several other predictors of mortality. Our findings emphasize the need for future studies designed to clarify the reasons for the increased risk and to identify targets for interventions to improve outcomes for the interhospital transfer population.

Acknowledgements

The authors gratefully acknowledge Zachary Goldberger and Tom Gallagher for their critical reviews of this article.

Disclosures

Dr. Herzig was funded by grant number K23AG042459 from the National Institute on Aging. The funding organization had no involvement in any aspect of the study, including design and conduct of the study; collection, management, analysis, and interpretation of the data; and preparation, review, or approval of the manuscript. The authors report no conflicts of interest.

References
  1. Iwashyna TJ. The incomplete infrastructure for interhospital patient transfer. Crit Care Med. 2012;40(8):2470-2478.
  2. Hains I. AHRQ WebM. 23(1):68-75.
  3. Hickey EC, Savage AM. Improving the quality of inter-hospital transfers. J Qual Assur. 1991;13(4):16-20.
  4. Vilensky D, MacDonald RD. Communication errors in dispatch of air medical transport. Prehosp Emerg Care. 2011;15(1):39-43.
  5. Warren J, Fromm RE, Orr RA, Rotello LC, Horst HM. Guidelines for the inter- and intrahospital transport of critically ill patients. Crit Care Med. 2004;32(1):256-262.
  6. Hernandez-Boussard T, Davies S, McDonald K, Wang NE. Interhospital facility transfers in the United States: a nationwide outcomes study [published online November 13, 2014]. J Patient Saf. doi: 10.1097/PTS.0000000000000148.
  7. Wyatt SM, Moy E, Levin RJ, et al. Patients transferred to academic medical centers and other hospitals: characteristics, resource use, and outcomes. Acad Med. 1997;72(10):921-930.
  8. Bernard AM, Hayward RA, Rosevear J, Chun H, McMahon LF. Comparing the hospitalizations of transfer and non-transfer patients in an academic medical center. Acad Med. 1996;71(3):262-266.
  9. Gordon HS, Rosenthal GE. Impact of interhospital transfers on outcomes in an academic medical center. Implications for profiling hospital quality. Med Care. 1996;34(4):295-309.
  10. Elixhauser A, Steiner C, Harris DR, Coffey RM. Comorbidity measures for use with administrative data. Med Care. 1998;36(1):8-27.
  11. Hughes J. 3M HIS: APR DRG classification software—overview. Mortality Measurement. Available at: http://archive.ahrq.gov/professionals/quality-patient-safety/quality-resources/tools/mortality/Hughessumm.html. Accessed June 14, 2011.
  12. Romano PS, Chan BK. Risk-adjusting acute myocardial infarction mortality: are APR-DRGs the right tool? Health Serv Res. 2000;34(7):1469-1489.
  13. Singh JA, Kwoh CK, Boudreau RM, Lee G-C, Ibrahim SA. Hospital volume and surgical outcomes after elective hip/knee arthroplasty: a risk-adjusted analysis of a large regional database. Arthritis Rheum. 2011;63(8):2531-2539.
  14. Carretta HJ, Chukmaitov A, Tang A, Shin J. Examination of hospital characteristics and patient quality outcomes using four inpatient quality indicators and 30-day all-cause mortality. Am J Med Qual. 2013;28(1):46-55.
  15. Wiggers JK, Guitton TG, Smith RM, Vrahas MS, Ring D. Observed and expected outcomes in transfer and nontransfer patients with a hip fracture. J Orthop Trauma. 2011;25(11):666-669.
  16. Arthur KR, Kelz RR, Mills AM, et al. Interhospital transfer: an independent risk factor for mortality in the surgical intensive care unit. Am Surg. 2013;79(9):909-913.
  17. Hanmer J, Lu X, Rosenthal GE, Cram P. Insurance status and the transfer of hospitalized patients: an observational study. Ann Intern Med. 2014;160(2):81-90.
  18. Bosk EA, Veinot T, Iwashyna TJ. Which patients and where: a qualitative study of patient transfers from community hospitals. Med Care. 2011;49(6):592-598.
  19. Ehrmann DE. Overwhelmed and uninspired by lack of coordinated care: a call to action for new physicians. Acad Med. 2013;88(11):1600-1602.
  20. Graham JD. The outside hospital. Ann Intern Med. 2013;159(7):500-501.
  21. Strickler J, Amor J, McLellan M. Untangling the lines: using a transfer center to assist with interfacility transfers. Nurs Econ. 2003;21(2):94-96.
  22. Pesanka DA, Greenhouse PK, Rack LL, et al. Ticket to ride: reducing handoff risk during hospital patient transport. J Nurs Care Qual. 2009;24(2):109-115.
  23. Sodickson A, Opraseuth J, Ledbetter S. Outside imaging in emergency department transfer patients: CD import reduces rates of subsequent imaging utilization. Radiology. 2011;260(2):408-413.
  24. Agency for Healthcare Research and Quality. Patient Safety Organization (PSO) Program. Available at: http://www.pso.ahrq.gov. Accessed July 7, 2011.
  25. Signorello LB, Cohen SS, Williams DR, Munro HM, Hargreaves MK, Blot WJ. Socioeconomic status, race, and mortality: a prospective cohort study. Am J Public Health. 2014;104(12):e98-e107.
  26. Durairaj L, Will JG, Torner JC, Doebbeling BN. Prognostic factors for mortality following interhospital transfers to the medical intensive care unit of a tertiary referral center. Crit Care Med. 2003;31(7):1981-1986.
  27. Goldman LE, Chu PW, Osmond D, Bindman A. The accuracy of present-on-admission reporting in administrative data. Health Serv Res. 2011;46(6 pt 1):1946-1962.
Issue
Journal of Hospital Medicine - 11(4)
Page Number
245-250
Display Headline
Interhospital transfer patients discharged by academic hospitalists and general internists: Characteristics and outcomes
Article Source

© 2015 Society of Hospital Medicine

Correspondence Location
Address for correspondence and reprint requests: Lauge Sokol-Hessner, MD, Beth Israel Deaconess Medical Center, Hospital Medicine, W/PBS-2, 330 Brookline Ave., Boston, MA 02215; Telephone: 617-754-4677; Fax: 617-632-0215; E-mail: lhessner@bidmc.harvard.edu

A Novel Cream Formulation Containing Nicotinamide 4%, Arbutin 3%, Bisabolol 1%, and Retinaldehyde 0.05% for Treatment of Epidermal Melasma

Article Type
Changed
Thu, 01/10/2019 - 13:27
Display Headline
A Novel Cream Formulation Containing Nicotinamide 4%, Arbutin 3%, Bisabolol 1%, and Retinaldehyde 0.05% for Treatment of Epidermal Melasma

Epidermal melasma is a common hyperpigmentation disorder that can be challenging to treat. The pathogenesis of melasma is not fully understood but has been associated with increased melanin and melanocyte activity.1,2 Melasma is characterized by jagged, light- to dark-brown patches on areas of the skin most often exposed to the sun—primarily the cheeks, forehead, upper lip, nose, and chin.3 Although it can affect both sexes and all races, melasma is more common in Fitzpatrick skin types II to IV and frequently is seen in Asian or Hispanic women residing in geographic locations with high levels of sun exposure (eg, tropical areas).2 Melasma presents more frequently in adult women of childbearing age, especially during pregnancy, but also can begin postmenopause. Onset may occur as early as menarche but typically is observed between the ages of 30 and 55 years.3,4 Only 10% of melasma cases occur in males,4 and these cases are influenced by factors such as ethnicity, hormones, and level of sun exposure.2

Topical therapies for melasma attempt to inhibit melanocytic activation at each level of melanin formation until the deposited pigment is removed; however, results may vary greatly, as melasma often recurs due to the migration of new melanocytes from hair follicles to the skin’s surface, leading to new development of hyperpigmentation. The current standard of treatment for melasma involves the use of hydroquinone and other bleaching agents, but long-term use of these treatments has been associated with concerns regarding unstable preparations (which may lose their therapeutic properties) and adverse effects (eg, ochronosis, depigmentation).5 Cosmetic agents that recently have been evaluated for melasma treatment include nicotinamide (a form of vitamin B3), which inhibits the transfer of melanosomes from melanocytes to keratinocytes; arbutin, which inhibits melanin synthesis by inhibiting tyrosinase activity6; bisabolol, which exerts anti-inflammatory activity7; and retinaldehyde (RAL), a precursor of retinoic acid (RA) that has powerful bleaching action and low levels of cutaneous irritability.8

This prospective, single-arm, open-label study evaluated the efficacy and safety of a novel cream formulation containing nicotinamide 4%, arbutin 3%, bisabolol 1%, and retinaldehyde 0.05% in the treatment of epidermal melasma.

Study Product Ingredients and Background

Nicotinamide

Nicotinamide is a water-soluble amide of nicotinic acid (niacin) and one of the 2 principal forms of vitamin B3. It is a component of the coenzymes nicotinamide adenine dinucleotide and nicotinamide adenine dinucleotide phosphate. Nicotinamide essentially acts as an antioxidant, with most of its effects exerted through poly(adenosine diphosphate–ribose) polymerase inhibition. Interest has increased in the role of nicotinamide in the prevention and treatment of several skin diseases, such as acne and UV radiation–induced deleterious molecular and immunological events. Nicotinamide also has gained consideration as a potential agent in sunscreen preparations due to its possible skin-lightening effects, stimulation of DNA repair, suppression of UV photocarcinogenesis, and other antiaging effects.9

Arbutin

Arbutin is a molecule that has proven effective in treating melasma.10 It is a botanically derived pigment-lightening agent that is structurally similar to hydroquinone. Arbutin is obtained from the leaves of the bearberry plant but also is found in lesser quantities in cranberry and blueberry leaves. A naturally occurring gluconopyranoside, arbutin reduces tyrosinase activity without affecting messenger RNA expression.11 Arbutin also inhibits melanosome maturation, is nontoxic to melanocytes, and is used in Japan in a variety of pigment-lightening preparations at 3% concentrations.12

Bisabolol

Bisabolol is a natural monocyclic sesquiterpene alcohol found in the oils of chamomile and other plants. Bisabolol often is included in cosmetics due to its favorable anti-inflammatory and depigmentation properties. Its downregulation of inducible nitric oxide synthase and cyclooxygenase-2 suggests that it may have anti-inflammatory effects.7

Retinaldehyde

Retinaldehyde is an RA precursor that forms as an intermediate metabolite in the transformation of retinol to RA in human keratinocytes. Topical RAL is well tolerated by human skin, and several of its biologic effects are identical to those of RA. In C57BL/6 mouse tail models, RAL 0.05% has been found to have more potent depigmenting effects than RA 0.05%, with each showing significant depigmentation compared to vehicle (P<.001 and P<.01, respectively).13

Although combination therapy with RAL and arbutin could potentially cause skin irritation, the addition of bisabolol to the combination cream used in this study is believed to have conferred anti-inflammatory properties because it inhibits the release of histamine and relieves irritation.

Methods

This single-center, single-arm, prospective, open-label study evaluated the efficacy and safety of a novel cream formulation containing nicotinamide 4%, arbutin 3%, bisabolol 1%, and RAL 0.05% in treating epidermal melasma. Clinical evaluation included assessment of Melasma Area and Severity Index (MASI) score, photographic analysis, and in vivo reflectance confocal microscopy (RCM) analysis.
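
This article reports MASI scores without restating how the index is computed. For orientation only, the widely used MASI formula (from the published index, not defined by this study) combines area-of-involvement (A, scored 0-6), darkness (D, 0-4), and homogeneity (H, 0-4) scores for the forehead (f), right malar (rm), left malar (lm), and chin (c) regions:

```latex
% Standard MASI formula, shown for orientation only; this study reports MASI
% scores but does not restate the formula.
% A = area of involvement (0-6), D = darkness (0-4), H = homogeneity (0-4)
% for forehead (f), right malar (rm), left malar (lm), and chin (c).
\[
\mathrm{MASI} = 0.3\,A_f(D_f + H_f) + 0.3\,A_{rm}(D_{rm} + H_{rm})
              + 0.3\,A_{lm}(D_{lm} + H_{lm}) + 0.1\,A_c(D_c + H_c)
\]
% The score ranges from 0 to 48; higher values indicate more severe melasma.
```

On this 0-to-48 scale, eligibility for this study required a score below 10.5, as described in the section that follows.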

The study population included women aged 18 to 50 years with Fitzpatrick skin types I through V who had clinically diagnosed epidermal melasma on the face. Eligibility requirements included confirmation of epidermal pigmentation on Wood lamp examination and RCM analysis and a MASI score of less than 10.5. A total of 35 participants were enrolled in the study (intention to treat [ITT] population). Thirty-three participants were included in the analysis of treatment effectiveness (ITTe population), as 2 were excluded due to lack of follow-up postbaseline. Four participants were prematurely withdrawn from the study—3 due to loss to follow-up and 1 due to treatment discontinuation following an adverse event (AE). The last observation carried forward method was used to impute missing data for these 4 participants, except in the repeated-measures analysis, which used the generalized estimating equation method.
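
A minimal sketch of last-observation-carried-forward imputation is shown below, assuming a hypothetical long-format layout of visit-level MASI scores (participant, visit day, score); it is illustrative only and does not reproduce the study's statistical software or dataset.

```python
# Minimal sketch of last-observation-carried-forward (LOCF) imputation for
# the visit-level scores of participants who withdrew early. The long-format
# table (participant, visit_day, masi) is a hypothetical stand-in.
import pandas as pd

visits = pd.DataFrame({
    "participant": [1, 1, 1, 2, 2, 2],
    "visit_day":   [0, 30, 60, 0, 30, 60],
    "masi":        [8.2, 6.9, 5.8, 9.1, 7.5, None],  # participant 2 missed day 60
})

# Carry each participant's last observed value forward within that participant.
visits["masi_locf"] = visits.groupby("participant")["masi"].ffill()
print(visits)
```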

At baseline, a 25-g tube of the study cream containing nicotinamide 4%, arbutin 3%, bisabolol 1%, and RAL 0.05% was distributed to all participants for once-daily application to the entire face for 30 days. Participants were instructed to apply the product in the evening after using a gentle cleanser, which also was to be used in the morning to remove the product residue. Additionally, participants were given a sunscreen with a sun protection factor of 30 to apply daily on the entire face in the morning, after lunch, and midafternoon. During the 30-day treatment period, treatment interruption of up to 5 consecutive days or 10 nonconsecutive days in total was permitted. At day 30, participants received another 30-day supply of the study product and sunscreen to be applied according to the same regimen for an additional 30-day treatment period.

Clinical Evaluation

At baseline, demographic data and medical history were recorded for all participants, and dermatologic and physical examinations were performed, documenting weight, height, blood pressure, heart rate, and baseline MASI score. Following Wood lamp examination, participants’ faces were photographed and catalogued using medical imaging software that allowed for measurement of the total melasma surface area (Figure 1A). The photographs also were cross-polarized for further analysis of the pigmentation (Figure 1B).


Figure 1. Clinical (A) and cross-polarized (B) photographs of a patient before treatment with the novel compound containing nicotinamide 4%, arbutin 3%, bisabolol 1%, and retinaldehyde 0.05%.

A questionnaire evaluating treatment satisfaction was administered to participants (ITTe population [n=33]) at baseline and days 30 and 60. Questionnaire items pertained to skin blemishes, signs of facial aging, overall appearance, texture, oiliness, brightness, and hydration. Participants were instructed to rate their satisfaction for each item on a scale of 1 to 10 (1=bad, 10=excellent). For investigator analysis, scores of 1 to 4 were classified as “dissatisfied,” scores of 5 to 6 were classified as “satisfied,” and scores of 7 to 10 were classified as “completely satisfied.” A questionnaire evaluating product appreciation was administered at day 60 to participants who completed the study (n=29). Questionnaire items asked participants to rate the study cream’s ease of application, consistency, smell, absorption, and overall satisfaction using ratings of “bad,” “regular,” “good,” “very good,” or “excellent.”
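
As a small illustration of the investigator grouping of the 1-to-10 satisfaction ratings described above, the sketch below maps scores onto the three categories; the function name and toy data are assumptions for illustration, not study data.

```python
# Small sketch of the satisfaction-score grouping described above:
# 1-4 -> "dissatisfied", 5-6 -> "satisfied", 7-10 -> "completely satisfied".
# Function name and example ratings are illustrative assumptions.
from collections import Counter

def satisfaction_category(score: int) -> str:
    if not 1 <= score <= 10:
        raise ValueError("questionnaire scores range from 1 to 10")
    if score <= 4:
        return "dissatisfied"
    if score <= 6:
        return "satisfied"
    return "completely satisfied"

# Example: tally one questionnaire item across a handful of toy ratings
ratings = [3, 8, 5, 10, 2, 7, 6]  # toy data, not study results
print(Counter(satisfaction_category(r) for r in ratings))
```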

Treatment efficacy in all participants was assessed by the investigators at days 30 and 60. Investigators evaluated reductions in pigmentation and total melasma surface area using ratings of “none,” “regular,” “good,” “very good,” or “excellent.” Local tolerance also was evaluated at both time points, and AEs were recorded and analyzed with respect to their duration, intensity, frequency, and severity.

Targeted hyperpigmented skin was selected for in vivo RCM analysis. At each time point, a sequence of block images was acquired at 4 levels of skin: (1) superficial dermis, (2) suprabasal layer/ dermoepidermal junction, (3) spinous layer, and (4) superficial granular layer. Blind evaluation of these images to assess the reduction in melanin quantity was conducted by a dermatopathologist at baseline and days 30 and 60. Melanin quantity present in each layer was graded according to 4 categories (0%–25%, 25.1%–50%, 50.1%–75%, 75.1%–100%). The mean value was used for statistical evaluation.

Results

Efficacy evaluation

The primary efficacy variable was the mean reduction in MASI score from baseline to the end of treatment (day 60), which was 2.25 ± 1.87 (P<.0001). The reduction in mean MASI score was significant from baseline to day 30 (P<.0001) and from day 30 to day 60 (P<.0001). The least-squares mean estimates of the reduction in MASI score at days 30 and 60 were 1.40 and 2.25, respectively.

The mean total melasma surface area (as measured in analysis of clinical photographs using medical imaging software) was significantly reduced from 1398.5 mm2 at baseline to 1116.9 mm2 at day 30 (P<.0001) and 923.4 mm2 at day 60 (P<.0001). From baseline to end of treatment, the overall reduction in mean total melasma surface area was 475.1 mm2 (P<.0001)(Figure 2). Clinical and cross-polarized photographs taken at day 60 demonstrated a visible reduction in melasma surface area (Figure 3), which was confirmed using medical imaging software.

Figure 2. Mean surface area of melasma measured at baseline (1398.5 mm2), day 30 (1116.9 mm2), and day 60 (923.4 mm2), showing a mean total reduction of 475.1 mm2 from baseline to day 60.


Figure 3. Clinical (A) and cross-polarized (B) photographs of a patient after 60 days of treatment with the novel compound containing nicotinamide 4%, arbutin 3%, bisabolol 1%, and retinaldehyde 0.05%.

In vivo RCM analyses at each time point showed reduction in pigmentation in the 4 levels of the skin that were evaluated, but the results were not statistically significant.

Participant satisfaction

There was strong statistical evidence of patient satisfaction with the treatment results at the end of the study period (P<.0001). At baseline, 75.8% (25/33) of participants were dissatisfied with the appearance of their skin as compared with 15.2% (5/33) at day 60. Additionally, 18.1% (6/33) and 6.1% (2/33) of the participants were satisfied and completely satisfied at baseline compared with 33.3% (11/33) and 51.5% (17/33) at day 60, respectively. Participant satisfaction with signs of facial aging also increased over the study period (P=.0104). At baseline, 60.6% (20/33) were dissatisfied, 12.1% (4/33) were satisfied, and 27.3% (9/33) were completely satisfied; at the end of treatment, 30.3% (10/33) were dissatisfied, 36.4% (12/33) were satisfied, and 33.3% (11/33) were completely satisfied with the improvement in signs of facial aging.

The increase in patient satisfaction with facial skin texture from baseline to day 60 also was statistically significant (P=.0157). At baseline, 39.4% (13/33) of the participants were dissatisfied, 30.3% (10/33) were satisfied, and 30.3% (10/33) were completely satisfied with facial texture; at day 60, 15.1% (5/33) were dissatisfied, 30.3% (10/33) were satisfied, and 54.6% (18/33) were completely satisfied. Significant improvement from baseline to day 60 also was observed in participant assessment of skin oiliness (P=.0210), brightness (P=.0003), overall appearance (P<.0001), and hydration (P<.0001).

Product appreciation

At day 60, 89.7% (26/29) of the participants who completed the study rated the product’s ease of application as being at least “good,” with more than half of participants (55.2% [16/29]) rating it as “very good” or “excellent.” Overall satisfaction with the product was rated as “very good” or “excellent” by 48.3% (14/29) of the participants. Similar results were observed in participant assessments of consistency, smell, and absorption (Figure 4).

Figure 4. Participant responses to product appreciation questionnaire.

Safety evaluation

A total of 52 AEs were observed in 23 (69.7%) participants, which were recorded by participants in diary entries throughout treatment and evaluated by investigators at each time point. Among these AEs, 48 (92.3%) were considered possibly, probably, or conditionally related to treatment by the investigators based on clinical observation. The most common presumed treatment-related AE was a burning sensation on the skin, reported by 30.3% (10/33) of the participants at day 30 and 13.8% (4/29) at day 60. Of the reported AEs related to treatment, 91.7% (44/48) were of mild intensity and 93.8% (45/48) required no treatment or other action. There were no reported serious AEs related to the investigational product. Blood pressure, heart rate, and weight remained stable among all participants throughout the study.

The intensity of the AEs was described as “light” in 91.7% (44/48) of cases and “moderate” in 8.3% (4/48) of cases. The frequency of AEs was classified as “unique,” “intermittent,” or “continuous” in 45.8% (22/48), 39.6% (19/48), and 14.6% (7/48) of cases, respectively. Of the 48 AEs, 3 (6.3%) occurred in 1 participant, necessitating interruption of treatment, application of the topical corticosteroid cream mometasone, and removal from the study.

Comment

Following treatment with the study cream containing nicotinamide 4%, arbutin 3%, bisabolol 1%, and RAL 0.05%, the mean reduction in MASI score (P<.0001) and the mean reduction in total melasma surface area from baseline to end of treatment were statistically significant (P<.0001). The study product was associated with strong statistical evidence of patient satisfaction (P<.0001) regarding improvement in facial skin texture, skin oiliness, brightness, overall appearance, and hydration. Participants also responded favorably to the product and considered it safe and effective. In vivo RCM analysis demonstrated a reduction in the amount of melanin in 4 levels of the skin (superficial dermis, suprabasal layer/dermoepidermal junction, spinous layer, superficial granular layer) following treatment with the study cream; however, over the course of the 60-day treatment period, it did not reveal statistically significant reductions. This finding likely is due to the large ranges used to classify the amount of melanin present in each layer of the skin. These limitations suggest that scales used in future in vivo RCM analyses of melasma should be narrower.

Epidermal melasma is one of the most difficult dermatologic diseases to treat and control. Maintenance of clear, undamaged skin remains a treatment target for all dermatologists. This novel cream formulation containing nicotinamide 4%, arbutin 3%, bisabolol 1%, and RAL 0.05% has proven to be an effective, safe, and tolerable treatment option for patients with epidermal melasma.

References

1. Grimes PE, Yamada N, Bhawan J. Light microscopic, immunohistochemical, and ultrastructural alterations in patients with melasma. Am J Dermatopathol. 2005;27:96-101.

2. Kang WH, Yoon KH, Lee ES, et al. Melasma: histopathological characteristics in 56 Korean patients. Br J Dermatol. 2002;146:228-237.

3. Cestari T, Arellano I, Hexsel D, et al. Melasma in Latin America: options for therapy and treatment algorithms. JEADV. 2009;23:760-772.

4. Miot LDB, Miot HA, Silva MG, et al. Fisiopatologia do Melasma. An Bras Dermatol. 2009;84:623-635.

5. Draelos Z. Skin lightening preparations and the hydroquinone controversy. Dermatol Ther. 2007;20:308-313.

6. Parvez S, Kang M, Chung HS, et al. Survey and mechanism of skin depigmenting and lightening agents. Phytother Res. 2006;20:921-934.

7. Kim S, Jung E, Kim JH, et al. Inhibitory effects of (-)-α-bisabolol on LPS-induced inflammatory response in RAW264.7 macrophages. Food Chem Toxicol. 2011;49:2580-2585.

8. Ortonne JP. Retinoid therapy of pigmentary disorders. Dermatol Ther. 2006;19:280-288.

9. Namazi MR. Nicotinamide-containing sunscreens for use in Australasian countries and cancer-provoking conditions. Med Hypotheses. 2003;60:544-545.

10. Ertam I, Mutlu B, Unal I, et al. Efficiency of ellagic acid and arbutin in melasma: a randomized, prospective, open-label study. J Dermatol. 2008;35:570-574.

11. Hori I, Nihei K, Kubo I. Structural criteria for depigmenting mechanism of arbutin. Phytother Res. 2004;18:475-479.

12. Ethnic skin and pigmentation. In: Draelos ZD. Cosmetics and Dermatologic Problems and Solutions. 3rd ed. Boca Raton, FL: CRC Press; 2011:52-55.

13. Kasraee B, Tran C, Sorg O, et al. The depigmenting effect of RALGA in C57BL/6 mice. Dermatology. 2005;210(suppl 1):30-34.

Article PDF
Author and Disclosure Information

Elisete I. Crocco, MD; John Verrinder Veasey, MD; Maria Fernanda Feitosa de Camargo Boin, MD; Rute Facchini Lellis, MD; Renata Oliveira Alves, MD

From Santa Casa de São Paulo Hospital and Medical School, Brazil. Drs. Crocco, Veasey, Boin, and Alves are from the Dermatology Clinic and Dr. Lellis is from the Department of Pathology.

This study was supported by TheraSkin Farmacêutica LTDA. Drs. Crocco, Veasey, Boin, Lellis, and Alves received a research grant from TheraSkin Farmacêutica LTDA for this study.

Correspondence: Elisete Crocco, MD, Avenida Macuco, 726/cj 2001, Moema, 04523-001, São Paulo-SP, Brazil (elisete@elisetecrocco.com.br).

Issue
Cutis - 96(5)
Page Number
337-342
Legacy Keywords
melasma, topical treatment, cosmetic, hydroquinone, skin lightening, pigmentation, melanin, pigmentation disorder

Article PDF
Article PDF
Related Articles

Epidermal melasma is a common hyperpigmentation disorder that can be challenging to treat. The pathogenesis of melasma is not fully understood but has been associated with increased melanin and melanocyte activity.1,2 Melasma is characterized by jagged, light- to dark-brown patches on areas of the skin most often exposed to the sun—primarily the cheeks, forehead, upper lip, nose, and chin.3 Although it can affect both sexes and all races, melasma is more common in Fitzpatrick skin types II to IV and frequently is seen in Asian or Hispanic women residing in geographic locations with high levels of sun exposure (eg, tropical areas).2 Melasma presents more frequently in adult women of childbearing age, especially during pregnancy, but also can begin postmenopause. Onset may occur as early as menarche but typically is observed between the ages of 30 and 55 years.3,4 Only 10% of melasma cases are known to occur in males4 and are influenced by such factors as ethnicity, hormones, and level of sun exposure.2

Topical therapies for melasma attempt to inhibit melanocytic activation at each level of melanin formation until the deposited pigment is removed; however, results may vary greatly, as melasma often recurs due to the migration of new melanocytes from hair follicles to the skin’s surface, leading to new development of hyperpigmentation. The current standard of treatment for melasma involves the use of hydroquinone and other bleaching agents, but long-term use of these treatments has been associated with concerns regarding unstable preparations (which may lose their therapeutic properties) and adverse effects (eg, ochronosis, depigmentation).5 Cosmetic agents that recently have been evaluated for melasma treatment include nicotinamide (a form of vitamin B3), which inhibits the transfer of melanosomes from melanocytes to keratinocytes; arbutin, which inhibits melanin synthesis by inhibiting tyrosinase activity6; bisabolol, which prevents anti-inflammatory activity7; and retinaldehyde (RAL), a precursor of retinoic acid (RA) that has powerful bleaching action and low levels of cutaneous irritability.8

This prospective, single-arm, open-label study, evaluated the efficacy and safety of a novel cream formulation containing nicotinamide 4%, arbutin 3%, bisabolol 1%, and retinaldehyde 0.05% in the treatment of epidermal melasma.

Study Product Ingredients and Background

Nicotinamide

Nicotinamide is a water-soluble amide of nicotinic acid (niacin) and one of the 2 principal forms of vitamin B3. It is a component of the coenzymes nicotinamide adenine dinucleotide and nicotinamide adenine dinucleotide phosphate. Nicotinamide essentially acts as an antioxidant, with most of its effects exerted through poly(adenosine diphosphate–ribose) polymerase inhibition. Interest has increased in the role of nicotinamide in the prevention and treatment of several skin diseases, such as acne and UV radiation–induced deleterious molecular and immunological events. Nicotinamide also has gained consideration as a potential agent in sunscreen preparations due to its possible skin-lightening effects, stimulation of DNA repair, suppression of UV photocarcinogenesis, and other antiaging effects.9

Arbutin

Arbutin is a molecule that has proven effective in treating melasma.10 Its pigment-lightening ingredients include botanicals that are structurally similar to hydroquinone. Arbutin is obtained from the leaves of the bearberry plant but also is found in lesser quantities in cranberry and blueberry leaves. A naturally occurring gluconopyranoside, arbutin reduces tyrosinase activity without affecting messenger RNA expression.11 Arbutin also inhibits melanosome maturation, is nontoxic to melanocytes, and is used in Japan in a variety of pigment-lightening preparations at 3% concentrations.12

Bisabolol

Bisabolol is a natural monocyclic sesquiterpene alcohol found in the oils of chamomile and other plants. Bisabolol often is included in cosmetics due to its favorable anti-inflammatory and depigmentation properties. Its downregulation of inducible nitric oxide synthase and cyclooxygenase-2 suggests that it may have anti-inflammatory effects.7

Retinaldehyde

Retinaldehyde is an RA precursor that forms as an intermediate metabolite in the transformation of retinol to RA in human keratinocytes. Topical RAL is well tolerated by human skin, and several of its biologic effects are identical to those of RA. Using the tails of C57BL/6 mouse models, RAL 0.05% has been found to have significantly more potent depigmenting effects than RA 0.05% (P<.001 vs P<.01, respectively) when compared to vehicle.13

Although combination therapy with RAL and arbutin could potentially cause skin irritation, the addition of bisabolol to the combination cream used in this study is believed to have conferred anti-inflammatory properties because it inhibits the release of histamine and relieves irritation.

Methods

This single-center, single-arm, prospective, open-label study evaluated the efficacy and safety of a novel cream formulation containing nicotinamide 4%, arbutin 3%, bisabolol 1%, and RAL 0.05% in treating epidermal melasma. Clinical evaluation included assessment of Melasma Area and Severity Index (MASI) score, photographic analysis, and in vivo reflectance confocal microscopy (RCM) analysis.

 

 

The study population included women aged 18 to 50 years with Fitzpatrick skin types I through V who had clinically diagnosed epidermal melasma on the face. Eligibility requirements included confirmation of epidermal pigmentation on Wood lamp examination and RCM analysis and a MASI score of less than 10.5. A total of 35 participants were enrolled in the study (intention to treat [ITT] population). Thirty-three participants were included in the analysis of treatment effectiveness (ITTe population), as 2 were excluded due to lack of follow-up postbaseline. Four participants were prematurely withdrawn from the study—3 due to loss to follow-up and 1 due to treatment discontinuation following an adverse event (AE). The last observation carried forward method was used to input missing data from these 4 participants excluding repeated measure analysis that used the generalized estimated equation method.

At baseline, a 25-g tube of the study cream containing nicotinamide 4%, arbutin 3%, bisabolol 1%, and RAL 0.05% was distributed to all participants for once-daily application to the entire face for 30 days. Participants were instructed to apply the product in the evening after using a gentle cleanser, which also was to be used in the morning to remove the product residue. Additionally, participants were given a sunscreen with a sun protection factor of 30 to apply daily on the entire face in the morning, after lunch, and midafternoon. During the 30-day treatment period, treatment interruption of up to 5 consecutive days or 10 nonconsecutive days in total was permitted. At day 30, participants received another 30-day supply of the study product and sunscreen to be applied according to the same regimen for an additional 30-day treatment period.

Clinical Evaluation

At baseline, demographic data and medical history were recorded for all participants, and dermatologic and physical examinations were performed, documenting weight, height, blood pressure, heart rate, and baseline MASI score. Following Wood lamp examination, participants’ faces were photographed and catalogued using medical imaging software that allowed for measurement of the total melasma surface area (Figure 1A). The photographs also were cross-polarized for further analysis of the pigmentation (Figure 1B).


Figure 1. Clinical (A) and cross-polarized (B) photographs of a patient before treatment with the novel compound containing nicotinamide 4%, arbutin 3%, bisabolol 1%, and retinaldehyde 0.05%.

A questionnaire evaluating treatment satisfaction was administered to participants (ITTe population [n=33]) at baseline and days 30 and 60. Questionnaire items pertained to skin blemishes, signs of facial aging, overall appearance, texture, oiliness, brightness, and hydration. Participants were instructed to rate their satisfaction for each item on a scale of 1 to 10 (1=bad, 10=excellent). For investigator analysis, scores of 1 to 4 were classified as “dissatisfied,” scores of 5 to 6 were classified as “satisfied,” and scores of 7 to 10 were classified as “completely satisfied.” A questionnaire evaluating product appreciation was administered at day 60 to participants who completed the study (n=29). Questionnaire items asked participants to rate the study cream’s ease of application, consistency, smell, absorption, and overall satisfaction using ratings of “bad,” “regular,” “good,” “very good,” or “excellent.”
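A minimal sketch of the score banding described above, assuming simple integer ratings; the function names and the example ratings are illustrative only and do not reproduce the study's analysis:

```python
# Illustrative sketch (not the study's analysis code): banding the 1-10
# satisfaction ratings into the three categories used by the investigators
# and summarizing the proportion of participants in each band.
from collections import Counter

def band(score: int) -> str:
    """Map a 1-10 satisfaction rating to the investigators' category."""
    if 1 <= score <= 4:
        return "dissatisfied"
    if 5 <= score <= 6:
        return "satisfied"
    if 7 <= score <= 10:
        return "completely satisfied"
    raise ValueError(f"rating out of range: {score}")

def summarize(ratings):
    """Return the share of participants in each band (hypothetical input data)."""
    counts = Counter(band(r) for r in ratings)
    n = len(ratings)
    return {category: counts.get(category, 0) / n
            for category in ("dissatisfied", "satisfied", "completely satisfied")}

# Example with made-up ratings for 6 participants:
print(summarize([2, 8, 5, 9, 3, 10]))
```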

Treatment efficacy in all participants was assessed by the investigators at days 30 and 60. Investigators evaluated reductions in pigmentation and total melasma surface area using ratings of “none,” “regular,” “good,” “very good,” or “excellent.” Local tolerance also was evaluated at both time points, and AEs were recorded and analyzed with respect to their duration, intensity, frequency, and severity.

Targeted hyperpigmented skin was selected for in vivo RCM analysis. At each time point, a sequence of block images was acquired at 4 levels of the skin: (1) superficial dermis, (2) suprabasal layer/dermoepidermal junction, (3) spinous layer, and (4) superficial granular layer. Blinded evaluation of these images to assess the reduction in melanin quantity was conducted by a dermatopathologist at baseline and days 30 and 60. Melanin quantity present in each layer was graded according to 4 categories (0%–25%, 25.1%–50%, 50.1%–75%, 75.1%–100%). The mean value was used for statistical evaluation.

Results

Efficacy evaluation

The primary efficacy variable was the mean reduction in MASI score from baseline to the end of treatment (day 60), which was 2.25 ± 1.87 (P<.0001). The reduction in mean MASI score was significant from baseline to day 30 (P<.0001) and from day 30 to day 60 (P<.0001). The least root-mean-square error estimates of MASI score variation at days 30 and 60 were 1.40 and 2.25, respectively.

The mean total melasma surface area (as measured in analysis of clinical photographs using medical imaging software) was significantly reduced from 1398.5 mm2 at baseline to 1116.9 mm2 at day 30 (P<.0001) and 923.4 mm2 at day 60 (P<.0001). From baseline to end of treatment, the overall reduction in mean total melasma surface area was 475.1 mm2 (P<.0001)(Figure 2). Clinical and cross-polarized photographs taken at day 60 demonstrated a visible reduction in melasma surface area (Figure 3), which was confirmed using medical imaging software.
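As a back-of-the-envelope check (not a figure reported by the investigators), these values correspond to relative reductions of roughly 20% at day 30 and 34% at day 60:

\[ \frac{1398.5 - 1116.9}{1398.5} \approx 20.1\%, \qquad \frac{1398.5 - 923.4}{1398.5} \approx 34.0\% \]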


Figure 2. Mean surface area of melasma measured at baseline (1398.5 mm2), day 30 (1116.9 mm2), and day 60 (923.4 mm2), showing a mean total reduction of 475.1 mm2 from baseline to day 60.


Figure 3. Clinical (A) and cross-polarized (B) photographs of a patient after 60 days of treatment with the novel compound containing nicotinamide 4%, arbutin 3%, bisabolol 1%, and retinaldehyde 0.05%.

In vivo RCM analyses at each time point showed reduction in pigmentation in the 4 levels of the skin that were evaluated, but the results were not statistically significant.

Participant satisfaction

There was strong statistical evidence of patient satisfaction with the treatment results at the end of the study period (P<.0001). At baseline, 75.8% (25/33) of participants were dissatisfied with the appearance of their skin as compared with 15.2% (5/33) at day 60. Additionally, 18.1% (6/33) and 6.1% (2/33) of the participants were satisfied and completely satisfied at baseline compared with 33.3% (11/33) and 51.5% (17/33) at day 60, respectively. Participant satisfaction with signs of facial aging also increased over the study period (P=.0104). At baseline, 60.6% (20/33) were dissatisfied, 12.1% (4/33) were satisfied, and 27.3% (9/33) were completely satisfied; at the end of treatment, 30.3% (10/33) were dissatisfied, 36.4% (12/33) were satisfied, and 33.3% (11/33) were completely satisfied with the improvement in signs of facial aging.

The increase in patient satisfaction with facial skin texture from baseline to day 60 also was statistically significant (P=.0157). At baseline, 39.4% (13/33) of the participants were dissatisfied, 30.3% (10/33) were satisfied, and 30.3% (10/33) were completely satisfied with facial texture; at day 60, 15.2% (5/33) were dissatisfied, 30.3% (10/33) were satisfied, and 54.5% (18/33) were completely satisfied. Significant improvement from baseline to day 60 also was observed in participant assessment of skin oiliness (P=.0210), brightness (P=.0003), overall appearance (P<.0001), and hydration (P<.0001).

Product appreciation

At day 60, 89.7% (26/29) of the participants who completed the study rated the product’s ease of application as being at least “good,” with more than half of participants (55.2% [16/29]) rating it as “very good” or “excellent.” Overall satisfaction with the product was rated as “very good” or “excellent” by 48.3% (14/29) of the participants. Similar results were observed in participant assessments of consistency, smell, and absorption (Figure 4).

Figure 4. Participant responses to product appreciation questionnaire.

Safety evaluation

A total of 52 AEs, recorded by participants in diary entries throughout treatment and evaluated by investigators at each time point, were observed in 23 (69.7%) participants. Among these AEs, 48 (92.3%) were considered possibly, probably, or conditionally related to treatment by the investigators based on clinical observation. The most common presumed treatment-related AE was a burning sensation on the skin, reported by 30.3% (10/33) of the participants at day 30 and 13.8% (4/29) at day 60. Of the reported AEs related to treatment, 91.7% (44/48) were of mild intensity and 93.8% (45/48) required no treatment or other action. There were no reported serious AEs related to the investigational product. Blood pressure, heart rate, and weight remained stable among all participants throughout the study.

The intensity of the AEs was described as “light” in 91.7% (44/48) of cases and “moderate” in 8.3% (4/48) of cases. The frequency of AEs was classified as “unique,” “intermittent,” or “continuous” in 45.8% (22/48), 39.6% (19/48), and 14.6% (7/48) of cases, respectively. Of the 48 AEs, 3 (6.3%) occurred in 1 participant, necessitating interruption of treatment, application of the topical corticosteroid cream mometasone, and removal from the study.

Comment

Following treatment with the study cream containing nicotinamide 4%, arbutin 3%, bisabolol 1%, and RAL 0.05%, the mean reduction in MASI score (P<.0001) and the mean reduction in total melasma surface area from baseline to end of treatment were statistically significant (P<.0001). The study product was associated with strong statistical evidence of patient satisfaction (P<.0001) regarding improvement in facial skin texture, skin oiliness, brightness, overall appearance, and hydration. Participants also responded favorably to the product and considered it safe and effective. In vivo RCM analysis demonstrated a reduction in the amount of melanin in the 4 levels of the skin evaluated (superficial dermis, suprabasal layer/dermoepidermal junction, spinous layer, superficial granular layer) following treatment with the study cream; however, these reductions did not reach statistical significance over the 60-day treatment period, likely because of the large ranges used to classify the amount of melanin present in each layer. This limitation suggests that narrower grading scales should be used in future in vivo RCM analyses of melasma.


Epidermal melasma is one of the most difficult dermatologic diseases to treat and control. Maintenance of clear, undamaged skin remains a treatment target for all dermatologists. This novel cream formulation containing nicotinamide 4%, arbutin 3%, bisabolol 1%, and RAL 0.05% has proven to be an effective, safe, and tolerable treatment option for patients with epidermal melasma.


References

1. Grimes PE, Yamada N, Bhawan J. Light microscopic, immunohistochemical, and ultrastructural alterations in patients with melasma. Am J Dermatopathol. 2005;27:96-101.

2. Kang WH, Yoon KH, Lee ES, et al. Melasma: histopathological characteristics in 56 Korean patients. Br J Dermatol. 2002;146:228-237.

3. Cestari T, Arellano I, Hexsel D, et al. Melasma in Latin America: options for therapy and treatment algorithms. J Eur Acad Dermatol Venereol. 2009;23:760-772.

4. Miot LDB, Miot HA, Silva MG, et al. Fisiopatologia do Melasma. An Bras Dermatol. 2009;84:623-635.

5. Draelos Z. Skin lightening preparations and the hydroquinone controversy. Dermatol Ther. 2007;20:308-313.

6. Parvez S, Kang M, Chung HS, et al. Survey and mechanism of skin depigmenting and lightening agents. Phytother Res. 2006;20:921-934.

7. Kim S, Jung E, Kim JH, et al. Inhibitory effects of (-)-α-bisabolol on LPS-induced inflammatory response in RAW264.7 macrophages. Food Chem Toxicol. 2011;49:2580-2585.

8. Ortonne JP. Retinoid therapy of pigmentary disorders. Dermatol Ther. 2006;19:280-288.

9. Namazi MR. Nicotinamide-containing sunscreens for use in Australasian countries and cancer-provoking conditions. Med Hypotheses. 2003;60:544-545.

10. Ertam I, Mutlu B, Unal I, et al. Efficiency of ellagic acid and arbutin in melasma: a randomized, prospective, open-label study. J Dermatol. 2008;35:570-574.

11. Hori I, Nihei K, Kubo I. Structural criteria for depigmenting mechanism of arbutin. Phytother Res. 2004;18:475-479.

12. Draelos ZD. Ethnic skin and pigmentation. In: Cosmetics and Dermatologic Problems and Solutions. 3rd ed. Boca Raton, FL: CRC Press; 2011:52-55.

13. Kasraee B, Tran C, Sorg O, et al. The depigmenting effect of RALGA in C57BL/6 mice. Dermatology. 2005;210(suppl 1):30-34.


Issue
Cutis - 96(5)
Page Number
337-342
Display Headline
A Novel Cream Formulation Containing Nicotinamide 4%, Arbutin 3%, Bisabolol 1%, and Retinaldehyde 0.05% for Treatment of Epidermal Melasma
Legacy Keywords
melasma, topical treatment, cosmetic, hydroquinone, skin lightening, pigmentation, melanin, pigmentation disorder

Inside the Article

    Practice Points

  • Epidermal melasma is a common hyperpigmentation disorder characterized by the appearance of abnormal melanin deposits in different layers of the skin.
  • Melasma can be difficult to treat and often recurs due to the migration of new melanocytes from hair follicles to the skin’s surface.
  • A novel cream formulation containing nicotinamide 4%, arbutin 3%, bisabolol 1%, and retinaldehyde 0.05% offers a safe and effective option for treatment of epidermal melasma.

Evaluation of Clonidine and Prazosin for the Treatment of Nighttime Posttraumatic Stress Disorder Symptoms

Article Type
Changed
Fri, 11/10/2017 - 15:46
Display Headline
Evaluation of Clonidine and Prazosin for the Treatment of Nighttime Posttraumatic Stress Disorder Symptoms
Both clonidine and prazosin can be effective treatments for nighttime symptoms of posttraumatic stress disorder, but their long-term use may be limited.

Posttraumatic stress disorder (PTSD) remains a significant health concern in veterans and military personnel. Whereas the lifetime prevalence of PTSD in the U.S. general population is about 7% to 8%, the estimated prevalence of PTSD in deployed U.S. military personnel is higher than the national average, ranging from 11% to 17%.1,2 These numbers may be even higher, depending on the branch of service, responsibilities within the military, and specific conflict in which the veteran served. For example, one study found that 31% of Vietnam veterans have PTSD, and another recent study has reported PTSD in 28.7% of veterans returning from military service in Iraq and Afghanistan.3,4

Posttraumatic stress disorder treatment guidelines from both the American Psychiatric Association and the VA and DoD recommend the use of selective serotonin reuptake inhibitors (SSRIs) or serotonin-norepinephrine reuptake inhibitors (SNRIs) as first-line pharmacotherapy for PTSD.5,6 However, SSRIs and SNRIs seem to be largely ineffective for the management of nighttime PTSD symptoms, such as insomnia and nightmares.7,8


Researchers hypothesize that the sympathetic nervous system plays a significant role in the hyperarousal component of nighttime PTSD. The heightened responsiveness and disruption in restorative sleep seen in PTSD have been attributed to increased activity of norepinephrine in the central nervous system.9 Mechanistically, therapies that attenuate the increased noradrenergic signaling might be effective in the management of nighttime PTSD symptoms.

The body of evidence for the use of adrenergic agents for nighttime PTSD symptoms is growing. Prazosin, a centrally active α1-adrenergic receptor antagonist, has recently been demonstrated to be effective for nighttime PTSD symptoms in veterans in a series of small, randomized controlled trials.10-12 Data to support the use of clonidine, a centrally acting α2-adrenergic receptor agonist, are generally limited, with the most compelling data coming from a population of civilian Cambodian refugees.13,14 A 2007 article by Boehnlein and Kinzie includes a thorough review of the preclinical research, case reports, and early clinical studies that have led to the widespread use of these agents for PTSD despite the lack of FDA approval for this indication.13 A previous retrospective review by Byers and colleagues compared the effectiveness and tolerability of prazosin and quetiapine for nighttime PTSD symptoms in veterans.15 The results of that review suggest that α1-adrenergic agents may be as effective as and better tolerated than alternative medication options (ie, atypical antipsychotics) for this purpose. The present study was adapted from this design to report concurrently on the real-world use of clonidine and prazosin for the treatment of nighttime PTSD symptoms.

Study Objectives

The primary objective of this retrospective chart review was to describe the experience of patients prescribed clonidine or prazosin for the treatment of nighttime PTSD symptoms, including initial effectiveness. The primary endpoint of initial drug effectiveness was documented improvement of nighttime PTSD symptoms in the patient’s chart within 6 months of the date of first prescription. Clonidine or prazosin was categorized as initially effective if a statement such as “frequency of nightmares decreased” or “patient’s nighttime PTSD symptoms have improved” was made within 6 months after initial prescription of the drug.

The secondary objectives of this study were to evaluate the long-term effectiveness and tolerability of clonidine and prazosin. The endpoints used to assess these outcomes were the 2-year continuation rates of clonidine and prazosin (as surrogate markers for long-term effectiveness) and the documented reasons for discontinuation of clonidine and prazosin for the treatment of nighttime PTSD symptoms (in order to assess tolerability).

Methods

An electronic database search was conducted to identify the VA Portland Health Care System (VAPHCS) patients with a diagnosis of PTSD who received a first prescription for clonidine or prazosin for nighttime PTSD symptoms from a VAPHCS mental health provider or primary care provider (PCP) from January 1, 2009, to December 31, 2011. Patients were excluded if they had any history of prior use of the drug being initiated, were co-initiated on both clonidine and prazosin (defined as starting the drugs within 30 days of each other), or had a concomitant diagnosis of schizophrenia, bipolar disorder, psychotic disorder, or cognitive disorder as defined in the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition. Patients with traumatic brain injury (TBI) were excluded only if it could be determined that the event had resulted in lasting cognitive impairment.

Study Population

All patients with a diagnosis of PTSD who received a first prescription for clonidine during the period specified were screened for inclusion; patients with PTSD who were first prescribed prazosin during the same period were randomly sampled to equalize patient populations. This was done to maximize the data set while examining groups of roughly equal size for each drug, as prazosin is used much more commonly than clonidine for nighttime PTSD symptoms at VAPHCS. The patients in each resulting group were screened to determine whether they met inclusion and exclusion criteria. All subjects included were followed for 2 years from the date of the initial prescription.
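A minimal sketch of the random-sampling step described above, assuming patient identifiers are available as simple lists; the variable names, the seed, and the counts (taken from the figures reported later in the Results) are illustrative and do not represent the actual VAPHCS procedure:

```python
import random

# Illustrative only: drawing a simple random sample of prazosin patients so the
# screening pool is roughly the same size as the clonidine pool.
random.seed(2011)  # fixed seed so the sample is reproducible

clonidine_ids = [f"C{i:04d}" for i in range(149)]   # all clonidine first-prescription patients
prazosin_ids = [f"P{i:04d}" for i in range(1116)]   # all prazosin first-prescription patients

# Sample as many prazosin patients as there are clonidine patients (149).
prazosin_sample = random.sample(prazosin_ids, k=len(clonidine_ids))
print(len(prazosin_sample))  # 149 patients selected for screening
```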


Study Design

Initial effectiveness of each agent was determined by reviewing subjects’ progress notes after the initial prescription of clonidine or prazosin for documentation of improvement in symptoms within 6 months of the prescription start date. A decrease in frequency or intensity of nighttime PTSD symptoms, nightmares, or insomnia, as documented in the patient chart, was interpreted as improvement of symptoms.

Long-term continuation was assessed by reviewing subjects’ prescription records to determine whether prescription(s) for clonidine or prazosin continued for 2 years after the date of the initial prescription.

Any gap between medication fills that resulted in an anticipated period without medication of ≥ 6 months (eg, 9 months after receiving a 90-day supply) was considered discontinuation of therapy. Prescription refill history was also reviewed, and medication possession ratio (MPR) was calculated to assess whether patients were adherent to the study drug as prescribed. Adherence was defined as an MPR of ≥ 80%. Patients who left the VAPHCS service area but continued to receive care at another VA were assessed for continuation of therapy, but refill data and/or MPR were not assessed.
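A minimal sketch of one common way an MPR can be computed from dispensing records, assuming a list of (fill date, days' supply) pairs; the data structure is hypothetical, and the report does not specify the exact calculation beyond the ≥ 80% adherence cutoff:

```python
from datetime import date

def medication_possession_ratio(fills, period_start: date, period_end: date) -> float:
    """
    Rough MPR: total days' supply dispensed during the observation window
    divided by the number of days in the window (capped at 1.0).
    `fills` is a list of (fill_date, days_supply) tuples -- hypothetical input format.
    """
    window_days = (period_end - period_start).days
    supplied = sum(days for fill_date, days in fills
                   if period_start <= fill_date < period_end)
    return min(supplied / window_days, 1.0)

# Example: four 90-day fills over a 1-year window gives an MPR of about 0.99.
fills = [(date(2009, 1, 15), 90), (date(2009, 4, 20), 90),
         (date(2009, 7, 28), 90), (date(2009, 11, 2), 90)]
mpr = medication_possession_ratio(fills, date(2009, 1, 15), date(2010, 1, 15))
print(f"MPR = {mpr:.2f}, adherent = {mpr >= 0.80}")
```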

Tolerability was assessed by reviewing subjects’ medical records to determine whether therapy with clonidine or prazosin was discontinued due to documented adverse effects (AEs). The occurrence of AEs was determined by reviewing progress notes and other chart documentation surrounding the date of discontinuation. If the drug was discontinued but the reason was not explicitly documented, or if the prescription expired without a documented reason for nonrenewal, the reason for discontinuation was coded as “not specified.” Discontinuation due to treatment failure, change in symptoms, nonadherence, or other causes was also recorded. If multiple reasons for discontinuation were cited for a single patient, all were included in the data. This project was approved by the institutional review board at the VAPHCS.


Statistical Considerations

Based on clinical experience, it was presumed that many of the patients who were prescribed clonidine would be receiving it as a second-line therapy after failing prazosin. Therefore, statistical analysis of the relative effectiveness and tolerability of clonidine and prazosin could not be performed. Neither power nor sample size needed to demonstrate any difference in effectiveness or tolerability between the groups was calculated. All results are expressed using descriptive statistics.

Results

An initial database search for patients with PTSD who received a first prescription for clonidine between January 1, 2009, and December 31, 2011, from a VAPHCS provider yielded a list of 149 patients. The same search criteria applied for prazosin yielded 1,116 patients, 149 of whom were randomly selected for screening. After screening, 42 patients on clonidine and 60 patients on prazosin were included in this analysis (Figure).

Patient Demographics

The average age of the clonidine patients was 38.5 years (range 21-65 years) (Table 1). The clonidine group was primarily male (90%) and white (83%). Eighteen of the 42 patients in the clonidine group had a baseline PTSD Checklist-Civilian version (PCL-C) score available within the 90 days before the first prescription of clonidine; the average baseline PCL-C score in this subgroup was 62 ± 12.0 (median 65.5, range 31-82). Most of the clonidine patients (71%) had a concomitant diagnosis of a depressive disorder. About one-quarter of the group (24%) had previously tried prazosin per prescription records. In 24 patients (57%), the first prescription for clonidine was written by a psychiatrist or psychiatric nurse practitioner; 18 patients (43%) were started on clonidine by PCPs.

The average age of the prazosin patients was 46.1 years (range 21-74 years). The prazosin group was also primarily male (93%) and white (88%). Twenty of the 60 patients in the prazosin group had a baseline PCL-C score available within the 90 days before the first prescription of prazosin; the average baseline PCL-C score in this subgroup was 55 ± 16.1 (median 64, range 30-72). Most of the prazosin patients (63%) had a concomitant diagnosis of a depressive disorder. Four patients (7%) had previously tried clonidine per prescription records. In 35 patients (58%), the first prescription for prazosin was written by a psychiatrist or psychiatric nurse practitioner; 25 patients (42%) were started on prazosin by PCPs.

Data pertaining to initial and long-term effectiveness, tolerability, and MPR for both clonidine and prazosin are presented in Table 2.

Clonidine

Of the 42 clonidine patients assessed, 24 (57%) had a positive response to the medication for nighttime PTSD symptoms documented in the Computerized Patient Record System (CPRS) within 6 months of starting therapy. Six months after starting clonidine, 23 patients (55%) continued to take clonidine. Two years after starting therapy, 8 of the original 42 patients continued on clonidine for an overall 2-year continuation rate of 19%.


Tolerability

Of the 34 patients who discontinued clonidine within 2 years, 13 patients (38%) cited ineffectiveness of therapy as a reason for discontinuation. Another 13 patients (38%) reported discontinuing therapy due to AEs. Sedation (4 patients, 12%), dizziness/hypotension (3 patients, 9%), and paradoxical worsening of PTSD symptoms (4 patients, 12%) were the most common AEs leading to discontinuation. Other AEs cited as reasons for discontinuation were syncope (2 patients), erectile dysfunction (1 patient), rash (1 patient), myoclonus (1 patient), increased depression (1 patient), and fatigue (1 patient). One patient reported that he had discontinued clonidine due to symptom resolution/lack of need for treatment. In 8 of the 34 patients, no reason for discontinuation was found in chart documentation.

Medication Possession Ratio

Among the 21 evaluable patients who continued to receive clonidine 6 months after initiation, 10 (48%) were determined to be highly adherent to therapy, with an MPR of ≥ 80%. Six of the 21 patients (29%) had an MPR between 50% and 79%, and 5 patients (24%) had an MPR < 50%.

Of the 8 patients who continued on clonidine at the 2-year mark, 3 (38%) were adherent to therapy, with an MPR of ≥ 80%. Three more patients (38%) had a 2-year MPR between 50% and 80%, and 2 patients (25%) had an MPR < 50%.

Prazosin

Of the 60 prazosin patients assessed, 32 (53%) had a positive response to the medication for nighttime PTSD symptoms documented in the CPRS within 6 months of starting therapy. Six months after starting prazosin, 36 patients (60%) continued to take prazosin. Two years after starting therapy, 18 of the original 60 patients continued on prazosin for an overall 2-year continuation rate of 30%.

Tolerability

Of the 42 patients who discontinued prazosin within 2 years, 6 patients (14%) cited ineffectiveness of therapy as a reason for discontinuation. Thirteen patients (31%) reported discontinuing therapy due to AEs. Sedation (3 patients, 7%), dizziness/hypotension (3 patients, 7%), and paradoxical worsening of PTSD symptoms (6 patients, 14%) were the most common AEs leading to discontinuation. Other AEs cited as reasons for discontinuation were headache (2 patients), altered mental status (1 patient), and fatigue (1 patient). Three patients reported that they had discontinued prazosin due to symptom resolution/lack of need for treatment. Other reasons for discontinuation not related to AEs included flight rules (1 patient), changes to antihypertensive regimen (1 patient), refill issues (1 patient), and cost (1 patient). In 15 of the 42 patients, no reason for discontinuation was found in chart documentation.

Medication Possession Ratio

Among the 31 evaluable patients who continued to receive prazosin 6 months after initiation, 20 (65%) were determined to be highly adherent to therapy, with an MPR of ≥ 80%. Five of the 31 patients (16%) had an MPR between 50% and 80%, and 6 patients (19%) had an MPR < 50%.

Of the 15 evaluable patients who continued on prazosin at the 2-year mark, 9 (60%) were adherent to therapy, with an MPR of ≥ 80%. Three patients (20%) had a 2-year MPR between 50% and 80%, and 3 patients (20%) had an MPR < 50%.

Discussion

Although prazosin has been shown to be effective for nighttime PTSD symptoms in both prospective and retrospective evaluations in veterans, this study provides the first evidence to support the use of clonidine in a veteran population.10-12,15

Interestingly, 42% of the patients assessed received their first prescription of an α-adrenergic agent for nighttime PTSD symptoms from a PCP. Even with the recent increased focus on integrating mental health into primary care within the VA, this was a surprising finding. Primary care providers at VAPHCS may have a greater role in the outpatient management of PTSD than previously suspected. The information presented here may prove useful and applicable in both psychiatric and primary care treatment settings.

The study results indicated that a majority of subjects initially reported effectiveness with either clonidine or prazosin (57% and 53%, respectively). The initial effectiveness rate for prazosin is similar to those described in previous studies.10-13,15 The data also support a viable role for clonidine in the treatment of nighttime PTSD symptoms.

Regardless of initial improvement, the study results also suggest that the therapeutic benefit may not persist in the long term, as evidenced by a significant percentage of discontinuations attributed to ineffectiveness (38% for clonidine and 14% for prazosin) and a very low rate of long-term continuation (19% for clonidine and 30% for prazosin at 2 years). This latter observation contrasts with findings from previous studies; Byers and colleagues reported a 2-year prazosin continuation rate of 48.4% in a similar analysis, and Boehnlein and colleagues reported a sustained benefit of clonidine in responders over a 10-year period.14,15 The wide variety of reasons for discontinuation reported here may help providers who are considering clonidine or prazosin for their patients to anticipate barriers to long-term success.


Part of the discrepancy between these results and previously reported successes with clonidine and prazosin may be attributable to the classic issue of efficacy vs effectiveness. Many of the studies that have informed us on the efficacy and tolerability of prazosin for nighttime PTSD symptoms described outcomes of prospective clinical research. Furthermore, these prospective trials were limited to < 6 months in duration. To date, neither clonidine nor prazosin has been evaluated for long-term efficacy and effectiveness in well-designed, prospective trials. This retrospective analysis may help provide a realistic estimate of the long-term effectiveness of these therapies, especially within the veteran population.

Limitations

This was a single-center, retrospective study conducted primarily in white male patients. Although likely applicable to the U.S. veteran population at large, these data may be poorly generalizable to patient populations outside the VA health care system.

Aside from external validity, this study has several significant limitations. The primary limitation of this project is that it was not designed to allow for statistical comparison of clonidine and prazosin. Such an analysis would have better defined the role of clonidine in PTSD treatment, either by establishing similar effectiveness of clonidine and prazosin for nighttime symptoms or by providing evidence of the superiority of one over the other. In designing the project, investigators suspected based on experience that the majority of patients prescribed clonidine would receive the drug after having already failed first-line therapy with prazosin. Had this been the case, a direct comparison may have been biased in favor of prazosin. In retrospect, however, only 24% of the clonidine group had previously been prescribed prazosin, and only 7% of the prazosin group had been prescribed clonidine. This suggests that clonidine may be used first line more often than the investigators anticipated and that a future direct comparison would be worthwhile.

Second, the subjective data collected for this project required investigators to read and interpret chart notes, although the review of all records by a single investigator helped limit variability in interpretation. At times, information in the CPRS was incomplete in terms of determining continuation of therapy or cause for discontinuation.

Third, although it is implied that a significant number of veterans have combat-related PTSD, the nature of the traumatic event(s) leading to PTSD was not recorded in this study, and no subgroup analysis was done to compare the effect of α2-adrenergic agents between combat- and noncombat-related PTSD. Owing to their exclusion by design, it is also difficult to apply these results to veterans who have lasting cognitive impairment as a result of TBI, who are presumably among those most likely to have experienced traumas that could provoke PTSD.

The design of this project also did not include a subgroup analysis based on antidepressant type, and it is unclear whether the potential pharmacodynamic interaction between noradrenergic antidepressants (ie, SNRIs) and these antiadrenergic agents had any impact on clinical outcomes. The use of complementary nonpharmacologic treatment modalities (ie, psychotherapy, eye movement desensitization and reprocessing) also was not evaluated.


Finally, the primary outcome of patient-reported improvement in symptoms does not provide information on the magnitude or specific nature of benefits derived. Given the retrospective nature of the study, data used in prospectively designed studies (eg, rating scales pertinent to PTSD) that might have helped to quantify the benefit of treatment were not consistently available. Even a baseline PCL-C score, collected in order to describe the patient population, was available in only 37% of the patients assessed. Furthermore, nighttime PTSD symptoms vary among individuals, but the primary outcome of this study pools any benefits seen in areas such as nightmares, awakenings, night sweats, or sleep quality into a single outcome of symptom improvement.

Conclusions

This study indicates that both clonidine and prazosin may be effective for the treatment of nighttime PTSD symptoms in the veteran population but that their long-term utility may be limited by waning effectiveness, tolerability, and adherence issues. At this time, it is unclear whether either agent has an advantage over the other in terms of effectiveness or tolerability; further studies are needed to address that question.

Despite its limitations, the authors anticipate that this study will provide information regarding the effectiveness and tolerability of clonidine and prazosin to treat nighttime PTSD symptoms. Findings from this study may help clinicians to anticipate the needs and challenges of patients using α-adrenergic agents for nighttime symptoms of PTSD.

Acknowledgements
The authors wish to acknowledge Brian Wilcox, PharmD, for his assistance in generating patient data reports, and Ronald Brown, RPh, MS, for his guidance regarding data analysis.


Author disclosures
The authors report no actual or potential conflicts of interest with regard to this article.

Disclaimer
The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the U.S. Government, or any of its agencies. This article may discuss unlabeled or investigational use of certain drugs. Please review the complete prescribing information for specific drugs or drug combinations—including indications, contraindications, warnings, and adverse effects—before administering pharmacologic therapy to patients.

References


1. Hoge CW, Castro CA, Messer SC, McGurk D, Cotting DI, Koffman RL. Combat duty in Iraq and Afghanistan, mental health problems, and barriers to care. N Engl J Med. 2004;351(1):13-22.

2. Gates MA, Holowka DW, Vasterling JJ, Keane TM, Marx BP, Rosen RC. Posttraumatic stress disorder in veterans and military personnel: epidemiology, screening, and case recognition. Psychol Serv. 2012;9(4):361-382.

3. Kulka R, Schlenger WE, Fairbanks J, et al. Trauma and the Vietnam War Generation: Report of Findings From the National Vietnam Veterans Readjustment Study. New York, NY: Brunner/Mazel; 1990.

4. Barrera TL, Graham DP, Dunn NJ, Teng EJ. Influence of trauma history on panic and posttraumatic stress disorder in returning veterans. Psychol Serv. 2013;10(2):168-176. 

5. American Psychiatric Association. Practice Guideline for the Treatment of Patients With Acute Stress Disorder and Posttraumatic Stress Disorder. Arlington, VA: American Psychiatric Association; 2004. 

6. U.S. Department of Veterans Affairs, Department of Defense. VA/DoD clinical practice guideline for management of post-traumatic stress. Version 2.0. U.S. Department of Veterans Affairs Website. http://www.healthquality.va.gov/guidelines/MH/ptsd/cpgPTSDFULL201011612c.pdf. Published October 2010. Accessed October 5, 2015.

7. Berger W, Mendlowicz MV, Marques-Portella C, et al. Pharmacologic alternatives to antidepressants in posttraumatic stress disorder: a systematic review. Prog Neuropsychopharmacol Biol Psychiatry. 2009;33(2):169-180.

8. Ravindran LN, Stein MB. Pharmacotherapy of post-traumatic stress disorder. In: Stein MB, Steckler T, eds. Behavioral Neurobiology of Anxiety and Its Treatment. Vol 2. Heidelberg, Germany: Springer; 2010:505-525.

9. Spoormaker VI, Montgomery P. Disturbed sleep in post-traumatic stress disorder: secondary symptom or core feature? Sleep Med Rev. 2008;12(3):169-184.

10. Raskind MA, Peskind ER, Kanter ED, et al. Reduction of nightmares and other PTSD symptoms in combat veterans by prazosin: a placebo controlled study. Am J Psychiatry. 2003;160(2):371-373.

11. Raskind MA, Peskind ER, Hoff DJ, et al. A parallel group placebo controlled study of prazosin for trauma nightmares and sleep disturbance in combat veterans with post-traumatic stress disorder. Biol Psychiatry. 2007;61(8):928-934.

12. Raskind MA, Peterson K, Williams T, et al. A trial of prazosin for combat trauma PTSD with nightmares in active-duty soldiers returned from Iraq and Afghanistan. Am J Psychiatry. 2013;170:1003-1010.

13. Boehnlein JK, Kinzie JD. Pharmacologic reduction of CNS noradrenergic activity in PTSD: the case for clonidine and prazosin. J Psychiatr Pract. 2007;13(2):72-78.

14. Boehnlein JK, Kinzie JD, Sekiya U, Riley C, Pou K, Rosborough B. A ten-year treatment outcome study of traumatized Cambodian refugees. J Nerv Ment Dis. 2004;192(10):658-663.

15. Byers MG, Allison KM, Wendel CS, Lee JK. Prazosin versus quetiapine for nighttime posttraumatic stress disorder symptoms in veterans: an assessment of long-term comparative effectiveness and safety.  J Clin Psychopharmacol. 2010;30(3):225-229.

Author and Disclosure Information

Dr. Wendell is a clinical pharmacy specialist in geriatrics at Providence ElderPlace in Portland, Oregon. Dr. Maxwell is a clinical pharmacy specialist in psychiatry at the VA Portland Health Care System in Oregon.

Issue
Federal Practitioner - 32(11)
Page Number
8-14
Legacy Keywords
Posttraumatic stress disorder, clonidine, nighttime posttraumatic stress disorder symptoms, prazosin, Kristin R. Wendell, Melissa L. Maxwell
Related Articles
Both clonidine and prazosin can be effective treatments for nighttime symptoms 
of posttraumatic stress disorder, but their long-term use may be limited.
Both clonidine and prazosin can be effective treatments for nighttime symptoms 
of posttraumatic stress disorder, but their long-term use may be limited.

Posttraumatic stress disorder (PTSD) remains a significant health concern in veterans and military personnel. Whereas the lifetime prevalence of PTSD in the U.S. general population is about 7% to 8%, the estimated prevalence of PTSD in deployed U.S. military personnel is higher than the national average, ranging from 11% to 17%.1,2 These numbers may be even higher, depending on the branch of service, responsibilities within the military, and specific conflict in which the veteran served. For example, one study found that 31% of Vietnam veterans have PTSD, and another recent study reported PTSD in 28.7% of veterans returning from military service in Iraq and Afghanistan.3,4

Posttraumatic stress disorder treatment guidelines from both the American Psychiatric Association and the VA and DoD recommend the use of selective serotonin reuptake inhibitors (SSRIs) or serotonin-norepinephrine reuptake inhibitors (SNRIs) as first-line pharmacotherapy for PTSD.5,6 However, SSRIs and SNRIs seem to be largely ineffective for the management of nighttime PTSD symptoms, such as insomnia and nightmares.7,8

Researchers hypothesize that the sympathetic nervous system plays a significant role in the hyperarousal component of nighttime PTSD. The heightened responsiveness and disruption in restorative sleep seen in PTSD have been attributed to increased activity of norepinephrine in the central nervous system.9 Mechanistically, therapies that attenuate the increased noradrenergic signaling might be effective in the management of nighttime PTSD symptoms.

The body of evidence for the use of adrenergic agents for nighttime PTSD symptoms is growing. Prazosin, a peripherally acting α1-adrenergic receptor antagonist, has recently been demonstrated to be effective for nighttime PTSD symptoms in veterans in a series of small, randomized controlled trials.10-12 Data to support the use of clonidine, a centrally acting α2-adrenergic receptor agonist, are generally limited, with the most compelling data coming from a population of civilian Cambodian refugees.13,14 A 2007 article by Boehnlein and Kinzie includes a thorough review of the preclinical research, case reports, and early clinical studies that have led to the widespread use of these agents for PTSD despite the lack of FDA approval for this indication.13

A previous retrospective review by Byers and colleagues compared the effectiveness and tolerability of prazosin and quetiapine for nighttime PTSD symptoms in veterans.15 The results of that review suggest that α1-adrenergic agents may be equally effective and better tolerated than alternative medication options (ie, atypical antipsychotics) for this purpose. The present study was adapted from this design to report concurrently on the real-world use of clonidine and prazosin for the treatment of nighttime PTSD symptoms.

Study Objectives

The primary objective of this retrospective chart review was to describe the experience of patients prescribed clonidine or prazosin for the treatment of nighttime PTSD symptoms, including initial effectiveness. The primary endpoint of initial drug effectiveness was documented improvement of nighttime PTSD symptoms in the patient’s chart within 6 months of the date of first prescription. Clonidine or prazosin was categorized as initially effective if a statement such as “frequency of nightmares decreased” or “patient’s nighttime PTSD symptoms have improved” was made within 6 months after initial prescription of the drug.

The secondary objectives of this study were to evaluate the long-term effectiveness and tolerability of clonidine and prazosin. The endpoints used to assess these outcomes were the 2-year continuation rates of clonidine and prazosin (as a surrogate marker for long-term effectiveness) and the documented reasons for discontinuation of clonidine and prazosin for the treatment of nighttime PTSD symptoms (in order to assess tolerability).

Methods

An electronic database search was conducted to identify the VA Portland Health Care System (VAPHCS) patients with a diagnosis of PTSD who received a first prescription for clonidine or prazosin for nighttime PTSD symptoms from a VAPHCS mental health provider or primary care provider (PCP) from January 1, 2009, to December 31, 2011. Patients were excluded if they had any history of prior use of the drug being initiated, were co-initiated on both clonidine and prazosin (defined as starting the drugs within 30 days of each other), or had a concomitant diagnosis of schizophrenia, bipolar disorder, psychotic disorder, or cognitive disorder as defined in the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition. Patients with traumatic brain injury (TBI) were excluded only if it could be determined that the event had resulted in lasting cognitive impairment.

Study Population

All patients with a diagnosis of PTSD who received a first prescription for clonidine during the period specified were screened for inclusion; patients with PTSD who were first prescribed prazosin during the same period were randomly sampled to equalize patient populations. This was done to maximize the data set while examining groups of roughly equal size for each drug, as prazosin is used much more commonly than clonidine for nighttime PTSD symptoms at VAPHCS. The patients in each resulting group were screened to determine whether they met inclusion and exclusion criteria. All subjects included were followed for 2 years from the date of the initial prescription.
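
For readers who want to reproduce the group-balancing step, the following is a minimal sketch, assuming simple random sampling without replacement; the cohort list names, patient identifiers, and random seed are illustrative assumptions, not the authors' actual code, which is not published.

```python
import random

# Illustrative identifiers only; the real cohorts came from a VAPHCS database
# search, and the actual sampling procedure and seed are not reported.
clonidine_initiators = [f"CLON-{i:04d}" for i in range(149)]
prazosin_initiators = [f"PRAZ-{i:04d}" for i in range(1116)]

random.seed(0)  # arbitrary seed so this sketch is reproducible

# Randomly sample as many prazosin initiators as there are clonidine initiators,
# so both groups enter inclusion/exclusion screening at roughly equal size.
prazosin_sample = random.sample(prazosin_initiators, k=len(clonidine_initiators))

print(len(prazosin_sample))  # 149
```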

Study Design

Initial effectiveness of each agent was determined by reviewing subjects’ progress notes after the initial prescription of clonidine or prazosin for documentation of improvement in symptoms within 6 months of the prescription start date. A decrease in frequency or intensity of nighttime PTSD symptoms, nightmares, or insomnia, as documented in the patient chart, was interpreted as improvement of symptoms.

Long-term continuation was assessed by reviewing subjects’ prescription records to determine whether prescription(s) for clonidine or prazosin continued for 2 years after the date of the initial prescription.

Any gap between medication fills that resulted in an anticipated period without medication of ≥ 6 months (eg, 9 months after receiving a 90-day supply) was considered discontinuation of therapy. Prescription refill history was also reviewed, and the medication possession ratio (MPR) was calculated to assess whether patients were adherent to the study drug as prescribed. Adherence was defined as an MPR of ≥ 80%. Patients who left the VAPHCS service area but continued to receive care at another VA were assessed for continuation of therapy, but refill data and/or MPR were not assessed.
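
The article does not spell out the exact MPR formula or gap algorithm, so the sketch below is a minimal illustration under two stated assumptions: the common MPR convention (total days’ supply dispensed divided by days in the evaluation window) and the ≥ 6-month gap rule described above (taken here as 183 days). The fill dates and window are hypothetical.

```python
from datetime import date

# Hypothetical fill history: (fill date, days' supply). The study's actual data
# came from VA prescription records; these values are illustrative only.
fills = [
    (date(2009, 1, 15), 30),
    (date(2009, 2, 20), 30),
    (date(2009, 4, 10), 90),
]

window_start = fills[0][0]
window_end = date(2009, 7, 15)  # assumed end of the evaluation window
days_in_window = (window_end - window_start).days

# MPR: total days' supply dispensed divided by days in the window, capped at 1.0.
mpr = min(sum(supply for _, supply in fills) / days_in_window, 1.0)
adherent = mpr >= 0.80  # adherence threshold used in the study

# Discontinuation rule described above: a fill gap expected to leave the patient
# without medication for >= 6 months (assumed 183 days) ends therapy.
discontinued = False
for (fill_date, supply), (next_fill, _) in zip(fills, fills[1:]):
    days_without_medication = (next_fill - fill_date).days - supply
    if days_without_medication >= 183:
        discontinued = True
        break

print(f"MPR = {mpr:.2f}, adherent = {adherent}, discontinued = {discontinued}")
```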

Tolerability was assessed by reviewing subjects’ medical records to determine whether therapy with clonidine or prazosin was discontinued due to documented adverse effects (AEs). The occurrence of AEs was determined by reviewing progress notes and other chart documentation surrounding the date of discontinuation. If the drug was discontinued but the reason was not explicitly documented, or if the prescription expired without a documented reason for nonrenewal, the reason for discontinuation was coded as “not specified.” Discontinuation due to treatment failure, change in symptoms, nonadherence, or other causes was also recorded. If multiple reasons for discontinuation were cited for a single patient, all were included in the data. This project was approved by the institutional review board at the VAPHCS.
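
The reason coding above was done by manual chart review. As a purely illustrative sketch of how multiple reasons per patient can be tallied without double-excluding anyone, the example below uses hypothetical patients and reason labels; it is not the authors' abstraction tool.

```python
from collections import Counter

# Hypothetical abstraction of chart review results; each patient may cite more
# than one reason, and undocumented reasons are coded as "not specified".
discontinuation_reasons = {
    "patient_a": ["ineffectiveness"],
    "patient_b": ["sedation", "dizziness/hypotension"],  # multiple reasons kept
    "patient_c": ["not specified"],
}

# Tally every cited reason across patients, mirroring the coding scheme above.
reason_counts = Counter(
    reason
    for reasons in discontinuation_reasons.values()
    for reason in reasons
)
print(reason_counts.most_common())
```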

Statistical Considerations

Based on clinical experience, it was presumed that many of the patients who were prescribed clonidine would be receiving it as a second-line therapy after failing prazosin. Therefore, statistical analysis of the relative effectiveness and tolerability of clonidine and prazosin could not be performed. Neither power nor sample size needed to demonstrate any difference in effectiveness or tolerability between the groups was calculated. All results are expressed using descriptive statistics.

Results

An initial database search for patients with PTSD who received a first prescription for clonidine between January 1, 2009, and December 31, 2011, from a VAPHCS provider yielded a list of 149 patients. The same search criteria applied for prazosin yielded 1,116 patients, 149 of whom were randomly selected for screening. After screening, 42 patients on clonidine and 60 patients on prazosin were included in this analysis (Figure).

Patient Demographics

The average age of the clonidine patients was 38.5 years (range 21-65 years) (Table 1). The clonidine group was primarily male (90%) and white (83%). Eighteen of the 42 patients in the clonidine group had a baseline PTSD Checklist-Civilian version (PCL-C) score available within the 90 days before the first prescription of clonidine; the average baseline PCL-C score in this subgroup was 62 ± 12.0 (median 65.5, range 31-82). Most of the clonidine patients (71%) had a concomitant diagnosis of a depressive disorder. About one-quarter of the group (24%) had previously tried prazosin per prescription records. In 24 patients (57%), the first prescription for clonidine was written by a psychiatrist or psychiatric nurse practitioner; 18 patients (43%) were started on clonidine by PCPs.

The average age of the prazosin patients was 46.1 years (range 21-74 years). The prazosin group was also primarily male (93%) and white (88%). Twenty of the 60 patients in the prazosin group had a baseline PCL-C score available within the 90 days before the first prescription of prazosin; the average baseline PCL-C score in this subgroup was 55 ± 16.1 (median 64, range 30-72). Most of the prazosin patients (63%) had a concomitant diagnosis of a depressive disorder. Four patients (7%) had previously tried clonidine per prescription records. In 35 patients (58%), the first prescription for prazosin was written by a psychiatrist or psychiatric nurse practitioner; 25 patients (42%) were started on prazosin by PCPs.

Data pertaining to initial and long-term effectiveness, tolerability, and MPR for both clonidine and prazosin are presented in Table 2.

Clonidine

Of the 42 clonidine patients assessed, 24 (57%) had a positive response to the medication for nighttime PTSD symptoms documented in the Computerized Patient Record System (CPRS) within 6 months of starting therapy. Six months after starting clonidine, 23 patients (55%) continued to take clonidine. Two years after starting therapy, 8 of the original 42 patients continued on clonidine for an overall 2-year continuation rate of 19%.

Tolerability

Of the 34 patients who discontinued clonidine within 2 years, 13 patients (38%) cited ineffectiveness of therapy as a reason for discontinuation. Another 13 patients (38%) reported discontinuing therapy due to AEs. Sedation (4 patients, 12%), dizziness/hypotension (3 patients, 9%), and paradoxical worsening of PTSD symptoms (4 patients, 12%) were the most common AEs leading to discontinuation. Other AEs cited as reasons for discontinuation were syncope (2 patients), erectile dysfunction (1 patient), rash (1 patient), myoclonus (1 patient), increased depression (1 patient), and fatigue (1 patient). One patient reported that he had discontinued clonidine due to symptom resolution/lack of need for treatment. In 8 of the 34 patients, no reason for discontinuation was found in chart documentation.

Medication Possession Ratio

Among the 21 evaluable patients who continued to receive clonidine 6 months after initiation, 10 (48%) were determined to be highly adherent to therapy, with an MPR of ≥ 80%. Six of the 21 patients (29%) had an MPR between 50% and 79%, and 5 patients (24%) had an MPR < 50%.

Of the 8 patients who continued on clonidine at the 2-year mark, 3 (38%) were adherent to therapy, with an MPR of ≥ 80%. Three more patients (38%) had a 2-year MPR between 50% and 79%, and 2 patients (25%) had an MPR < 50%.

Prazosin

Of the 60 prazosin patients assessed, 32 (53%) had a positive response to the medication for nighttime PTSD symptoms documented in the CPRS within 6 months of starting therapy. Six months after starting prazosin, 36 patients (60%) continued to take prazosin. Two years after starting therapy, 18 of the original 60 patients continued on prazosin for an overall 2-year continuation rate of 30%.
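
As a quick arithmetic check, the reported 2-year continuation rates follow directly from the counts above; the short sketch below simply reproduces that calculation from the numbers in the text rather than from patient-level data.

```python
# 2-year continuation rate = patients still on drug at 2 years / patients started
cohorts = {
    "clonidine": {"started": 42, "continuing_at_2_years": 8},
    "prazosin": {"started": 60, "continuing_at_2_years": 18},
}

for drug, n in cohorts.items():
    rate = n["continuing_at_2_years"] / n["started"]
    print(f"{drug}: {rate:.0%} 2-year continuation rate")
# clonidine: 19% 2-year continuation rate
# prazosin: 30% 2-year continuation rate
```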

Tolerability

Of the 42 patients who discontinued prazosin within 2 years, 6 patients (14%) cited ineffectiveness of therapy as a reason for discontinuation. Thirteen patients (31%) reported discontinuing therapy due to AEs. Sedation (3 patients, 7%), dizziness/hypotension (3 patients, 7%), and paradoxical worsening of PTSD symptoms (6 patients, 14%) were the most common AEs leading to discontinuation. Other AEs cited as reasons for discontinuation were headache (2 patients), altered mental status (1 patient), and fatigue (1 patient). Three patients reported that they had discontinued prazosin due to symptom resolution/lack of need for treatment. Other reasons for discontinuation not related to AEs included flight rules (1 patient), changes to antihypertensive regimen (1 patient), refill issues (1 patient), and cost (1 patient). In 15 of the 42 patients, no reason for discontinuation was found in chart documentation.

Medication Possession Ratio

Among the 31 evaluable patients who continued to receive prazosin 6 months after initiation, 20 (65%) were determined to be highly adherent to therapy, with an MPR of ≥ 80%. Five of the 31 patients (16%) had an MPR between 50% and 79%, and 6 patients (19%) had an MPR < 50%.

Of the 15 evaluable patients who continued on prazosin at the 2-year mark, 9 (60%) were adherent to therapy, with an MPR of ≥ 80%. Three patients (20%) had a 2-year MPR between 50% and 79%, and 3 patients (20%) had an MPR < 50%.

Discussion

Although prazosin has been shown to be effective for nighttime PTSD symptoms in both prospective and retrospective evaluations in veterans, this study provides the first evidence to support the use of clonidine in a veteran population.10-12,15

Interestingly, 42% of the patients assessed received their first prescription of clonidine or prazosin for nighttime PTSD symptoms from a PCP. Even with the recent increased focus on integrating mental health into primary care within the VA, this was a surprising finding. Primary care providers at VAPHCS may have a greater role in the outpatient management of PTSD than previously suspected. The information presented here may prove useful and applicable in both psychiatric and primary care treatment settings.

The study results indicated that a majority of subjects initially reported effectiveness with either clonidine or prazosin (57% and 53%, respectively). The initial effectiveness rate for prazosin is similar to those described in previous studies.10-13,15 The data also support a viable role for clonidine in the treatment of nighttime PTSD symptoms.

Regardless of initial improvement, the study results also suggest that the therapeutic benefit may not persist in the long term, as evidenced by a significant percentage of discontinuations attributed to ineffectiveness (38% for clonidine and 14% for prazosin) and a very low rate of long-term continuation (19% for clonidine and 30% for prazosin at 2 years). This latter observation contrasts with findings from previous studies; Byers and colleagues reported a 2-year prazosin continuation rate of 48.4% in a similar analysis, and Boehnlein and colleagues reported a sustained benefit of clonidine in responders over a 10-year period.14,15 The wide variety of reasons for discontinuation reported here may help providers who are considering clonidine or prazosin for their patients to anticipate barriers to long-term success.

Part of the discrepancy between these results and previously reported successes with clonidine and prazosin may be attributable to the classic issue of efficacy vs effectiveness. Many of the studies that have informed our understanding of the efficacy and tolerability of prazosin for nighttime PTSD symptoms described outcomes of prospective clinical research. Furthermore, these prospective trials were limited to < 6 months in duration. To date, neither clonidine nor prazosin has been evaluated for long-term efficacy and effectiveness in well-designed, prospective trials. This retrospective analysis may help provide a realistic estimate of the long-term effectiveness of these therapies, especially within the veteran population.

Limitations

This was a single-center, retrospective study conducted primarily in white male patients. Although likely applicable to the U.S. veteran population at large, these data may be poorly generalizable to patient populations outside the VA health care system.

Aside from external validity, this study has several significant limitations. The primary limitation of this project is that it was not designed to allow for statistical comparison of clonidine and prazosin. Such an analysis would have better defined the role of clonidine in PTSD treatment, either by establishing similar effectiveness of clonidine and prazosin for nighttime symptoms or by providing evidence of the superiority of one over the other. In designing the project, investigators suspected based on experience that the majority of patients prescribed clonidine would receive the drug after having already failed first-line therapy with prazosin. Had this been the case, a direct comparison may have been biased in favor of prazosin. In retrospect, however, only 24% of the clonidine group had previously been prescribed prazosin, and only 7% of the prazosin group had been prescribed clonidine. This suggests that clonidine may be used first line more often than the investigators anticipated and that a future direct comparison would be worthwhile.

Second, the subjective data collected for this project required investigators to read and interpret chart notes, although the review of all records by a single investigator helped limit variability in interpretation. At times, information in the CPRS was incomplete in terms of determining continuation of therapy or cause for discontinuation.

Third, although it is implied that a significant number of veterans have combat-related PTSD, the nature of the traumatic event(s) leading to PTSD was not recorded in this study, and no subgroup analysis was done to compare the effect of α2-adrenergic agents between combat- and noncombat-related PTSD. Owing to their exclusion by design, it is also difficult to apply these results to veterans who have lasting cognitive impairment as a result of TBI, who are presumably among those most likely to have experienced traumas that could provoke PTSD.

The design of this project also did not include a subgroup analysis based on antidepressant type, and it is unclear whether the potential pharmacodynamic interaction between noradrenergic antidepressants (ie, SNRIs) and antiadrenergic agents such as clonidine and prazosin had any impact on clinical outcomes. The use of complementary nonpharmacologic treatment modalities (eg, psychotherapy, eye movement desensitization and reprocessing) was also not evaluated.

Finally, the primary outcome of patient-reported improvement in symptoms does not provide information on the magnitude or specific nature of the benefits derived. Given the retrospective nature of this review, data used in prospectively designed studies (eg, rating scales pertinent to PTSD), which might have helped to quantify the benefit of treatment, were not consistently available. Even a baseline PCL-C score, collected in order to describe the patient population, was available for only 37% of the patients assessed. Furthermore, nighttime PTSD symptoms vary among individuals, but the primary outcome of this study pools any benefits seen in areas such as nightmares, awakenings, night sweats, or sleep quality into a single outcome of symptom improvement.

Conclusions

This study indicates that both clonidine and prazosin may be effective for the treatment of nighttime PTSD symptoms in the veteran population but that their long-term utility may be limited by waning effectiveness, tolerability, and adherence issues. At this time, it is unclear whether either agent has an advantage over the other in terms of effectiveness or tolerability; further studies are needed to address that question.

Despite its limitations, the authors anticipate that this study will provide useful information regarding the effectiveness and tolerability of clonidine and prazosin for the treatment of nighttime PTSD symptoms. Findings from this study may help clinicians to anticipate the needs and challenges of patients using clonidine or prazosin for nighttime symptoms of PTSD.

Acknowledgements
The authors wish to acknowledge Brian Wilcox, PharmD, for his assistance in generating patient data reports, and Ronald Brown, RPh, MS, for his guidance regarding data analysis.

Author disclosures
The authors report no actual or potential conflicts of interest with regard to this article.

Disclaimer
The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the U.S. Government, or any of its agencies. This article may discuss unlabeled or investigational use of certain drugs. Please review the complete prescribing information for specific drugs or drug combinations—including indications, contraindications, warnings, and adverse effects—before administering pharmacologic therapy to patients.

References

1. Hoge CW, Castro CA, Messer SC, McGurk D, Cotting DI, Koffman RL. Combat duty in Iraq and Afghanistan, mental health problems, and barriers to care. N Engl J Med. 2004;351(1):13-22.

2. Gates MA, Holowka DW, Vasterling JJ, Keane TM, Marx BP, Rosen RC. Posttraumatic stress disorder in veterans and military personnel: epidemiology, screening, and case recognition. Psychol Serv. 2012;9(4):361-382.

3. Kulka R, Schlenger WE, Fairbanks J, et al. Trauma and the Vietnam War Generation: Report of Findings From the National Vietnam Veterans Readjustment Study. New York, NY: Brunner/Mazel; 1990.

4. Barrera TL, Graham DP, Dunn NJ, Teng EJ. Influence of trauma history on panic and posttraumatic stress disorder in returning veterans. Psychol Serv. 2013;10(2):168-176. 

5. American Psychiatric Association. Practice Guideline for the Treatment of Patients With Acute Stress Disorder and Posttraumatic Stress Disorder. Arlington, VA: American Psychiatric Association; 2004. 

6. U.S. Department of Veterans Affairs, Department of Defense. VA/DoD clinical practice guideline for management of post-traumatic stress. Version 2.0. U.S. Department of Veterans Affairs Website. http://www.healthquality.va.gov/guidelines/MH/ptsd/cpgPTSDFULL201011612c.pdf. Published October 2010. Accessed October 5, 2015.

7. Berger W, Mendlowicz MV, Marques-Portella C, et al. Pharmacologic alternatives to antidepressants in posttraumatic stress disorder: a systematic review. Prog Neuropsychopharmacol Biol Psychiatry. 2009;33(2):169-180.

8. Ravindran LN, Stein MB. Pharmacotherapy of post-traumatic stress disorder. In: Stein MB, Steckler T, eds. Behavioral Neurobiology of Anxiety and Its Treatment. Vol 2. Heidelberg, Germany: Springer; 2010:505-525.

9. Spoormaker VI, Montgomery P. Disturbed sleep in post-traumatic stress disorder: secondary symptom or core feature? Sleep Med Rev. 2008;12(3):169-184.

10. Raskind MA, Peskind ER, Kanter ED, et al. Reduction of nightmares and other PTSD symptoms in combat veterans by prazosin: a placebo controlled study. Am J Psychiatry. 2003;160(2):371-373.

11. Raskind MA, Peskind ER, Hoff DJ, et al. A parallel group placebo controlled study of prazosin for trauma nightmares and sleep disturbance in combat veterans with post-traumatic stress disorder. Biol Psychiatry. 2007;61(8):928-934.

12. Raskind MA, Peterson K, Williams T, et al. A trial of prazosin for combat trauma PTSD with nightmares in active-duty soldiers returned from Iraq and Afghanistan. Am J Psychiatry. 2013;170:1003-1010.

13. Boehnlein JK, Kinzie JD. Pharmacologic reduction of CNS noradrenergic activity in PTSD: the case for clonidine and prazosin. J Psychiatr Pract. 2007;13(2):72-78.

14. Boehnlein JK, Kinzie JD, Sekiya U, Riley C, Pou K, Rosborough B. A ten-year treatment outcome study of traumatized Cambodian refugees. J Nerv Ment Dis. 2004;192(10):658-663.

15. Byers MG, Allison KM, Wendel CS, Lee JK. Prazosin versus quetiapine for nighttime posttraumatic stress disorder symptoms in veterans: an assessment of long-term comparative effectiveness and safety. J Clin Psychopharmacol. 2010;30(3):225-229.
