Distress and Factors Associated with Suicidal Ideation in Veterans Living with Cancer

In 2019, it was estimated that physicians would diagnose invasive cancer > 1.7 million times and that > 600,000 people would die from the disease, making cancer the second most common cause of death in the US.1 Many individuals with cancer endure distress, which the National Comprehensive Cancer Network (NCCN) defines as a “multifactorial unpleasant experience of a psychological (ie, cognitive, behavioral, emotional), social, spiritual, and/or physical nature that may interfere with the ability to cope effectively with cancer, its physical symptoms, and its treatment.”2,3 Distress in people living with cancer has been attributed to various psychosocial concerns, such as family problems, which include dealing with partners and children; emotional problems, such as depression and anxiety; and physical symptoms, such as pain and fatigue.4-9 Certain factors associated with distress may increase a patient’s risk for suicide.4

Veterans are at particularly high risk for suicide.10 In 2014, veterans accounted for 18% of completed suicides in the US but made up only 8.5% of the population that year.10 Yet little research has examined the relationship between distress and suicide in veterans living with cancer. Aboumrad and colleagues found that 45% of veterans with cancer who completed suicide reported family issues and 41% endorsed chronic pain.11 That study recommended continued efforts to assess and treat distress to lessen suicide risk in veterans living with cancer; however, to date, only 1 study has specifically evaluated distress and the problems endorsed by veterans living with cancer.7

Suicide prevention is of the highest priority to the US Department of Veterans Affairs (VA).12 Consistent with the VA mission to end veteran suicide, the current study aimed to better understand the relationship between distress and suicide within a sample of veterans living with cancer. Findings would additionally be used to tailor clinical assessments and interventions for veterans living with cancer.

This study had 3 primary goals. First, we sought to understand demographic and clinical factors associated with low, moderate, and severe levels of distress in veterans living with cancer who were referred for psychology services. Second, the study investigated the most commonly endorsed problems by veterans living with cancer. Finally, we examined which problems were related to suicidal ideation (SI). It was hypothesized that veterans who reported severe distress would be significantly more likely to endorse SI when compared with veterans who reported mild or moderate distress. Based on existing literature, it was further hypothesized that family, emotional, and physical problems would be significantly associated with SI.7,11

Methods

The current study was conducted at James A. Haley Veterans’ Hospital (JAHVH) in Tampa, Florida. Inclusion criteria included veterans who were diagnosed with cancer, attended an outpatient psychology-oncology evaluation, and completed mental health screening measures provided during their evaluation. Exclusion criteria included veterans who: were seen in response to an inpatient consult, were seen solely for a stem cell transplant evaluation, or did not complete the screening measures.

Measures

A veteran’s demographic (eg, age, sex, ethnicity) and clinical (eg, cancer type, stage of disease, recurrence, cancer treatments received) information was abstracted from their VA medical records. Marital status was assessed during a clinical interview and documented as part of the standardized suicide risk assessment.

 

 

The Distress Thermometer (DT) is a subjective measure developed by the NCCN.2 The DT presents a visual representation of a thermometer and asks patients to rate their level of distress over the past week, with 0 indicating no distress and 10 indicating extreme distress. A distress rating of 4 or higher is considered clinically significant.4,6 Distress may be categorized into 3 levels of severity: mild (< 4), moderate (4-7), or severe (8-10). The DT has been found to have good face validity, sensitivity, and specificity, and is user-friendly.2,6,7,13

The measure additionally lists 39 problems nested within 5 domains: practical, family, emotional, spiritual/religious, and physical. Patients endorse the listed items under each problem domain by indicating yes or no; endorsed items are intended to provide more detailed information about sources of distress. Because the population included in this study was predominantly male and older, the ability-to-have-children item was removed from the family problem domain.
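For illustration, the following is a minimal sketch of the DT scoring rules just described, written in Python; it is not part of the study protocol, and the function name is ours:

```python
def categorize_distress(dt_score: int) -> dict:
    """Bin a 0-10 Distress Thermometer rating using the cutoffs described above."""
    if not 0 <= dt_score <= 10:
        raise ValueError("DT ratings range from 0 (no distress) to 10 (extreme distress)")
    if dt_score < 4:
        severity = "mild"
    elif dt_score <= 7:
        severity = "moderate"
    else:
        severity = "severe"
    # A rating of 4 or higher is considered clinically significant distress
    return {"severity": severity, "clinically_significant": dt_score >= 4}

# Example: categorize_distress(8) -> {'severity': 'severe', 'clinically_significant': True}
```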

SI was assessed in 2 ways. First, patients self-reported SI through item 9 of the Patient Health Questionnaire-9 (PHQ-9).14 Item 9 asks, “over the last 2 weeks, how often have you been bothered by thoughts that you would be better off dead or of hurting yourself in some way?” Responses range from 0 (not at all) to 3 (nearly every day).14 Responses > 0 were considered a positive screen for SI. Administration of PHQ-9 item 9 is part of a national VA directive for standardizing assessment of suicide risk (Steve Young, personal communication, May 23, 2018). The PHQ-9 has been found to have good construct validity when used with both medical samples and the general population, along with good internal consistency and test-retest reliability.14,15 Second, all veterans were asked directly about SI during the clinical interview, the results of which were documented in health records using a standardized format for risk assessment.
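Similarly, a minimal sketch of the item-9 screening rule described above (illustrative only, not the VA's screening software):

```python
def phq9_item9_positive_screen(item9_response: int) -> bool:
    """Return True for a positive SI screen: any item-9 response above 0 (not at all)."""
    if item9_response not in (0, 1, 2, 3):
        raise ValueError("PHQ-9 item 9 responses range from 0 (not at all) to 3 (nearly every day)")
    return item9_response > 0
```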

Procedure

Participants were a sample of veterans who were referred for psychology-oncology services. The NCCN DT and Problems List were administered prior to the start of clinical interviews, which followed a checklist and included standardized assessments of SI and history of a suicide attempt(s). A licensed clinical psychologist or a postdoctoral resident conducted these assessments under the supervision of a licensed psychologist. Data gathered during the clinical interview and from the DT and problems list were documented in health records, which were retrospectively reviewed for relevant information (eg, cancer diagnosis, SI). Therefore, informed consent was waived. This study was approved by the JAHVH Institutional Review Board.

Analysis

Data were analyzed using SPSS Version 25, and analysis proceeded in 3 steps. First, descriptive statistics summarized the demographic and clinical factors present in the sample. Differences between veterans with and without SI were compared using F tests for continuous variables and χ2 analyses for categorical variables. Second, χ2 analyses examined the relationship between each DT problem domain and SI. Third, DT problem domains that had a significant relationship with SI were entered into a logistic regression to determine which items, if any, from that domain predicted SI. History of suicide attempts was entered into the first block of the model, as it is a well-established risk factor for subsequent SI; variables that were significantly related to SI in the second step of analyses were entered into the second block. Before interpreting the results of the logistic regression, model fit was tested using the Hosmer-Lemeshow test. The significance of each predictor variable is reported using the Wald χ2 statistic, compared with a χ2 distribution with 1 degree of freedom (df). The logistic regression models also provide the effect of each predictor in the regression equation (beta weight), the odds that a veteran who endorsed each predictor would also endorse SI (the odds ratio), and an estimate of the amount of variance accounted for (Nagelkerke pseudo R2, which ranges from 0 to 1, with higher values indicating more variance explained). For all analyses, a 2-tailed P value of .05 was used as the threshold for statistical significance.
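The authors conducted these analyses in SPSS; the sketch below re-expresses the second and third steps in Python (pandas, SciPy, statsmodels) purely as an illustration. The data file and column names are hypothetical, and the Hosmer-Lemeshow test and Nagelkerke pseudo R2 are omitted because they are not built into statsmodels.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency
import statsmodels.api as sm

# Hypothetical chart-review extract: one row per veteran, variables coded 0/1
df = pd.read_csv("veteran_distress.csv")

# Step 2: chi-square test of each DT problem domain against SI
for domain in ["practical", "family", "emotional", "spiritual_religious", "physical"]:
    table = pd.crosstab(df[domain], df["si"])
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"{domain}: chi2 = {chi2:.2f}, df = {dof}, P = {p:.3f}")

# Step 3: hierarchical logistic regression predicting SI.
# Block 1: history of suicide attempt(s) only.
block1 = sm.Logit(df["si"], sm.add_constant(df[["prior_attempt"]])).fit()

# Block 2: add items from any domain that was significant in step 2
# (hypothetical family-domain item names).
block2_vars = ["prior_attempt", "partner_problems", "family_health_issues", "dealing_with_children"]
block2 = sm.Logit(df["si"], sm.add_constant(df[block2_vars])).fit()

print(block2.summary())       # per-predictor z statistics (z squared equals the Wald chi-square, df = 1)
print(np.exp(block2.params))  # odds ratios derived from the beta weights
```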

 

 

Results

The sample consisted of 174 veterans (Table 1). The majority (77.6%) were male, with a mean age of nearly 62 years (range, 29-87). Most identified as white (74.1%), and half reported they were either married or living with a partner.

Prostate cancer (19.0%) was the most common type of cancer among study participants followed by head and neck (18.4%), lymphoma/leukemia (11.5%), lung (11.5%), and breast (10.9%); 31.6% had metastatic disease and 14.9% had recurrent disease. Chemotherapy (42.5%) was the most common treatment modality, followed by surgery (38.5%) and radiation (31.6%). The sample was distributed among the 3 distress DT categories: mild (18.4%), moderate (42.5%), and severe (39.1%).

Problems Endorsed

Treatment decisions (44.3%) and insurance/financial concerns (35.1%) were the most frequently endorsed practical problems (Figure 1). Family health issues (33.9%) and dealing with partner (23.0%) were the most frequently endorsed family problems (Figure 2). Worry (73.0%) and depression (69.5%) were the most frequent emotional problems; of note, every emotional problem was endorsed by at least 50% of veterans (Figure 3). Fatigue (71.3%), sleep (70.7%), and pain (69%) were the most frequently endorsed physical problems (Figure 4). Spiritual/religious problems were endorsed by 15% of veterans.

 

Suicidal Ideation

Overall, 25.3% of veterans endorsed SI. About 20% of veterans reported a history of ≥ 1 suicide attempts in their lifetime. A significant relationship between distress category and SI was found (χ2 = 18.36, P < .001). Veterans with severe distress were more likely to endorse SI (42.7%) when compared with veterans with mild (9.4%) or moderate (16.2%) distress.

Similarly, a significant relationship between distress category and a history of a suicide attempt(s) was found (χ2 = 6.08, P = .048). Veterans with severe distress were more likely to have attempted suicide (29.4%) when compared with veterans with mild (12.5%) or moderate (14.9%) distress.



χ2 analyses were conducted to examine the relationships between DT problem domains and SI. A significant relationship was found between family problems and SI (χ2 = 5.54, df = 1, P = .02) (Table 2); 33.0% of veterans who endorsed family problems also reported experiencing SI. In contrast, no significant relationships were found between SI and practical, emotional, spiritual/religious, or physical problems.

Logistic regression analyses determined whether items from the family problems domain were predictive of SI. A history of a suicide attempt(s) was entered in the first step of the model so that family problem items could be evaluated over and above this established risk factor. The assumptions of logistic regression were met.



The Hosmer-Lemeshow test (χ2 = 3.66, df = 5, P = .56) demonstrated good model fit: the group of predictors used in the model differentiated between veterans who were experiencing SI at the time of evaluation and those who were not. A history of a suicide attempt(s) predicted SI, as expected (Wald = 6.821, df = 1, P = .01); the odds that a veteran with a history of a suicide attempt(s) would endorse SI at the time of the evaluation were nearly 3 times greater than those of veterans without such a history. Over and above suicide attempts, problems dealing with partner (Wald = 15.142, df = 1, P < .001) was a second significant predictor of current SI; the odds that a veteran who endorsed problems dealing with partner would also endorse SI were > 5 times higher than those of veterans who did not. This finding represents a significant risk factor for SI over and above a history of a suicide attempt(s). The other items from the family problems domain were not significant (P > .05) (Table 3).
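As a general point of interpretation (a property of logistic regression, not an additional study result), the odds ratios reported above are the exponentiated beta weights from the model:

\[
\log\frac{P(\text{SI})}{1 - P(\text{SI})} = \beta_0 + \sum_j \beta_j x_j, \qquad \mathrm{OR}_j = e^{\beta_j}
\]

Thus an odds ratio near 3 corresponds to a beta weight of roughly ln 3 ≈ 1.1, and an odds ratio greater than 5 corresponds to a beta weight greater than ln 5 ≈ 1.6.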

 

 

Discussion

This study aimed to understand factors associated with low, moderate, and severe levels of distress in veterans living with cancer who were referred for psychology services. As hypothesized, veterans who endorsed severe distress were significantly more likely to endorse SI. They also were more likely to have a history of a suicide attempt(s) when compared with those with mild or moderate distress.

A second aim of this study was to understand the most commonly endorsed problems. Consistent with prior literature, treatment decisions were the most commonly endorsed practical problem; worry and depression were the most common emotional problems; and fatigue, sleep, and pain were the most common physical problems.7

A finding unique to the current study is that family health issues and dealing with partner were identified as the most common family problems; the prior study by Smith and colleagues did not report which problems within this domain were most frequently endorsed.7

The third aim was to understand which problems were related to SI. It was hypothesized that family, emotional, and physical problems would be related to SI. However, results indicated that only family problems (specifically, problems dealing with a partner) were significantly associated with SI among veterans living with cancer.

Contrary to expectations, emotional and physical problems were not found to have a significant relationship with SI. This is likely because veterans endorsed items nested within these problem domains with similar frequency. The lack of significant findings does not suggest that emotional and physical problems are not significant predictors of SI for veterans living with cancer, but that no specific emotional or physical symptom stood out as a predictor of suicidal ideation above the others.

The finding of a significant relationship between family problems (specifically, problems dealing with a partner) and SI in this study is consistent with findings of Aboumrad and colleagues in a study that examined root-cause analyses of completed suicides by veterans living with cancer.11 They found that nearly half the sample endorsed family problems prior to their death, and a small but notable percentage of veterans who completed suicide reported divorce as a stressor prior to their death.

This finding may be explained by Thomas Joiner's interpersonal-psychological theory of suicidal behavior (IPT), which suggests that completed suicide may result from a thwarted sense of belonging, perceived burdensomeness, and acquired capability for suicide.16 Problems dealing with a partner may impact a veteran’s sense of belonging or social connectedness. Problems dealing with a partner also may be attributed to perceived burdens due to limitations imposed by living with cancer and/or undergoing treatment. In both circumstances, the veteran’s social support system may be negatively impacted, and perceived social support is a well-established protective factor against suicide.17

Partner distress is a second consideration. It is likely that veterans’ partners experienced their own distress in response to the veteran’s cancer diagnosis and/or treatment. The cause, severity, and expression of the partner’s distress may contribute to problems for the couple.

Finally, the third component of IPT is acquired capability, or the ability to inflict lethal harm on oneself.18 A military sample was found to have greater acquired capability for suicide when compared with a college undergraduate sample.19 A history of a suicide attempt(s) and male gender have been found to significantly predict acquired capability to complete suicide.18 Furthermore, because veterans living with cancer often are in pain, fear of the pain associated with suicide may be reduced and acquired capability thereby increased. This suggests that male veterans living with cancer who are in pain, have a history of a suicide attempt(s), and have current problems with their partner may be an extremely vulnerable population at risk for suicide. Results from the current study emphasize the importance of veterans having access to mental health and crisis resources for problems dealing with their partner. Partner problems may foreshadow a potentially lethal type of distress.

 

 

Strengths

This study’s aims are consistent with the VA’s mission to end veteran suicide, and it contributes to the literature in several important ways.12 First, veterans living with cancer are an understudied population; the current study addresses this gap by examining the relationship between cancer-related distress and SI in this group. Second, to the best of the authors’ knowledge, this study is the first to find that problems dealing with a partner significantly increase a veteran’s risk for SI above a history of a suicide attempt(s). Risk assessments may now be made more comprehensive by including this distress factor.

It is recommended that future research use IPT to further investigate the relationship between problems dealing with a partner and SI.16 Future research may do so by including specific measures to assess the tenets of the theory, including measures of burdensomeness and belongingness. An expanded knowledge base about what makes problems dealing with a partner a significant suicide risk factor (eg, increased conflict, lack of support) would better enable clinicians to intervene effectively. Effective intervention may lessen suicidal behaviors or deaths from suicide within the veteran population.

Limitations

One limitation is the focus on patients who accepted a mental health referral, which may limit the generalizability of results to veterans who would not accept mental health treatment. The homogeneous sample is a second limitation. Most participants were male and white, with a mean age of 62 years. These demographics are representative of the veterans who most typically use VA services; however, more research is needed on veterans living with cancer who are female and of diverse racial and ethnic backgrounds, as problems endorsed and factors associated with SI likely differ by age, race, sex, and other socioeconomic factors. A third limitation is the cross-sectional, retrospective nature of this study. Future studies are advised to assess distress at multiple time points, consistent with the NCCN Standards of Care for Distress Management.2 Longitudinal data would enable more findings about distress and SI throughout the course of cancer diagnosis and treatment, thereby enhancing clinical implications and informing future research.

Conclusion

This is among the first studies to investigate distress and factors associated with SI in veterans living with cancer who were referred for psychology services. The prevalence of distress caused by psychosocial factors (including treatment decisions, worry, and depression) highlights the importance of including mental health services as part of comprehensive cancer treatment.

Distress due to treatment decisions may be attributed to many factors, such as a veteran’s consideration of adverse effects, the effectiveness of treatments, changes to quality of life or functioning, and the inclusion of alternative or complementary treatments. These decisions often are reported to be difficult conversations to have with family members or loved ones, who are likely experiencing distress of their own. Mental health providers can play an important role in lessening distress by helping veterans explore treatment decisions and their implications.

Early intervention for emotional symptoms would likely benefit veterans’ management of distress and may lessen suicide risk, as depression is known to place veterans at risk for SI.20 This underscores the importance of timely distress assessment to prevent mild emotional distress from progressing to potentially severe or life-threatening emotional distress. For veterans with a psychiatric history, timely assessment and intervention are essential because psychiatric history is an established suicide risk factor that may be exacerbated by cancer-related distress.12

Furthermore, management of intolerable physical symptoms may lessen risk for suicide.4 Under medical guidance, fatigue may be improved with exercise,21 behavioral intervention is commonly used as first-line treatment for sleep problems,22 and pain may be lessened through medication or nonpharmacologic interventions.23

Considering the numerous ways that distress may present itself (eg, practical, emotional, or physical) and increase risk for SI, it is essential that all veterans living with cancer be assessed for distress and SI, regardless of their presentation. A veteran who does not outwardly express distress may nonetheless be distressed and at risk for suicide. For example, a veteran may be distressed due to financial concerns, transportation issues, and the health of a partner or spouse. This veteran may not exhibit visible symptoms of distress, as would be expected when the source of distress is emotional (eg, depression, anxiety), yet is just as vulnerable to impairing distress and SI as someone who exhibits emotional distress. Distress assessments should be further developed to capture both the visible and less apparent sources of distress, while also serving the imperative function of screening for suicide; other researchers have noted the necessity of this development.24 Currently, the NCCN DT and Problems List does not include any assessment of SI or suicidal behavior.

Finally, this study identified a potentially critical factor to include in distress assessment: problems dealing with a partner. Problems dealing with a partner have been noted as a source of distress in existing literature, but this is the first study to find problems dealing with a partner to be a predictor of SI in veterans living with cancer.4-6

Because partners often attend appointments with veterans, it is not surprising that problems dealing with their partner are not disclosed more readily. It is recommended that clinicians ask veterans about potential problems with their partner when they are alone. Directly gathering information about such problems while assessing for distress may assist health care workers in providing the most effective, accurate type of intervention in a timely manner, and potentially mitigate risk for suicide.

As recommended by the NCCN and numerous researchers, findings from the current study underscore the importance of accurate, timely assessment of distress.2,4,8 This study makes several important recommendations about how distress assessment may be strengthened and further developed, specifically for the veteran population. This study also expands the current knowledge base of what is known about veterans living with cancer, and has begun to fill a gap in the existing literature. Consistent with the VA mission to end veteran suicide, results suggest that veterans living with cancer should be regularly screened for distress, asked about distress related to their partner, and assessed for SI. Continued efforts to enhance assessment of and response to distress may lessen suicide risk in veterans with cancer.11

 

Acknowledgements

This study is the result of work supported with resources and the use of facilities at the James A. Haley Veterans’ Hospital.

 

References

1. Siegel RL, Miller KD, Jemal A. Cancer statistics, 2019. CA Cancer J Clin. 2019;69(1):7-34.

2. Riba MB, Donovan KA, Andersen B. National Comprehensive Cancer Network clinical practice guidelines in oncology: distress management (version 3.2019). J Natl Compr Canc Netw. 2019;17(10):1229-1249.

3. Zabora J, BrintzenhofeSzoc K, Curbow B, Hooker C, Piantadosi S. The prevalence of psychological distress by cancer site. Psychooncology. 2001;10(1):19-28.

4. Holland JC, Alici Y. Management of distress in cancer patients. J Support Oncol. 2010;8(1):4-12.

5. Bulli F, Miccinesi G, Maruelli A, Katz M, Paci E. The measure of psychological distress in cancer patients: the use of distress thermometer in the oncological rehabilitation center of Florence. Support Care Cancer. 2009;17(7):771–779.

6. Jacobsen PB, Donovan KA, Trask PC, et al. Screening for psychologic distress in ambulatory cancer patients. Cancer. 2005;103(7):1494-1502.

7. Smith J, Berman S, Dimick J, et al. Distress Screening and Management in an Outpatient VA Cancer Clinic: A Pilot Project Involving Ambulatory Patients Across the Disease Trajectory. Fed Pract. 2017;34(Suppl 1):43S–50S.

8. Carlson LE, Waller A, Groff SL, Bultz BD. Screening for distress, the sixth vital sign, in lung cancer patients: effects on pain, fatigue, and common problems--secondary outcomes of a randomized controlled trial. Psychooncology. 2013;22(8):1880-1888.

9. Cooley ME, Short TH, Moriarty HJ. Symptom prevalence, distress, and change over time in adults receiving treatment for lung cancer. Psychooncology. 2003;12(7):694-708.

10. US Department of Veterans Affairs Office of Suicide Prevention. Suicide among veterans and other Americans 2001-2014. https://www.mentalhealth.va.gov/docs/2016suicidedatareport.pdf. Published August 3, 2016. Accessed April 13, 2020.

11. Aboumrad M, Shiner B, Riblet N, Mills PD, Watts BV. Factors contributing to cancer-related suicide: a study of root-cause-analysis reports. Psychooncology. 2018;27(9):2237-2244.

12. US Department of Veterans Affairs, Office of Mental Health and Suicide Prevention. National Strategy for Preventing Veteran Suicide 2018–2028. https://www.mentalhealth.va.gov/suicide_prevention/docs/Office-of-Mental-Health-and-Suicide-Prevention-National-Strategy-for-Preventing-Veterans-Suicide.pdf. Published 2018. Accessed April 13, 2020.

13. Carlson LE, Waller A, Mitchell AJ. Screening for distress and unmet needs in patients with cancer: review and recommendations. J Clin Oncol. 2012;30(11):1160-1177.

14. Kroenke K, Spitzer RL, Williams JB. The PHQ-9: validity of a brief depression severity measure. J Gen Intern Med. 2001;16(9):606–613.

15. Martin A, Rief W, Klaiberg A, Braehler E. Validity of the brief patient health questionnaire mood scale (PHQ-9) in the general population. Gen Hosp Psychiatry. 2006;28(1):71-77.

16. Joiner TE. Why People Die by Suicide. Cambridge, MA: Harvard University Press, 2005.

17. Kleiman EM, Riskind JH, Schaefer KE. Social support and positive events as suicide resiliency factors: examination of synergistic buffering effects. Arch Suicide Res. 2014;18(2):144-155.

18. Van Orden KA, Witte TK, Gordon KH, Bender TW, Joiner TE Jr. Suicidal desire and the capability for suicide: tests of the interpersonal-psychological theory of suicidal behavior among adults. J Consult Clin Psychol. 2008;76(1):72–83.

19. Bryan CJ, Morrow CE, Anestis MD, Joiner TE. A preliminary test of the interpersonal-psychological theory of suicidal behavior in a military sample. Pers Individ Dif. 2010;48(3):347-350.

20. Miller SN, Monahan CJ, Phillips KM, Agliata D, Gironda RJ. Mental health utilization among veterans at risk for suicide: Data from a post-deployment clinic [published online ahead of print, 2018 Oct 8]. Psychol Serv. 2018;10.1037/ser0000311.

21. Galvão DA, Newton RU. Review of exercise intervention studies in cancer patients. J Clin Oncol. 2005;23(4):899-909.

22. Qaseem A, Kansagara D, Forciea MA, Cooke M, Denberg TD; Clinical Guidelines Committee of the American College of Physicians. Management of chronic insomnia disorder in adults: A clinical practice guideline from the American College of Physicians. Ann Intern Med. 2016;165(2):125-133.

23. Ngamkham S, Holden JE, Smith EL. A systematic review: Mindfulness intervention for cancer-related pain. Asia Pac J Oncol Nurs. 2019;6(2):161-169.

24. Granek L, Nakash O, Ben-David M, Shapira S, Ariad S. Oncologists’, nurses’, and social workers’ strategies and barriers to identifying suicide risk in cancer patients. Psychooncology. 2018;27(1):148-154.

Author and Disclosure Information

Samantha Munson was a Clinical Psychology Resident in Psycho-Oncology, Patricia Cabrera-Sanchez is a Clinical Psychologist in Primary Care Mental Health Integration, Stephanie Miller is a Clinical Psychologist in Suicide Prevention, and Kristin Phillips is a Clinical Psychologist in Psycho-Oncology, all in the Department of Mental Health and Behavioral Science, James A. Haley Veterans’ Hospital, Tampa, Florida.
Correspondence: Kristin M. Phillips (kristin.phillips3@va.gov)

Author disclosures
The authors report no actual or potential conflicts of interest for this article.

Disclaimer
The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies.

It was estimated that physicians would diagnose a form of invasive cancer > 1.7 million times in 2019. As the second most common cause of death in the US, > 600,000 people were projected to die from cancer in 2019.1 Many individuals with cancer endure distress, which the National Comprehensive Cancer Network (NCCN) defines as a “multifactorial unpleasant experience of a psychological (ie, cognitive, behavioral, emotional), social, spiritual, and/or physical nature that may interfere with the ability to cope effectively with cancer, its physical symptoms, and its treatment.”2,3 Distress in people living with cancer has been attributed to various psychosocial concerns, such as family problems, whichinclude dealing with partners and children; emotional problems, such as depression and anxiety; and physical symptoms, such as pain and fatigue.4-9 Certain factors associated with distress may increase a patient’s risk for suicide.4

Veterans are at particularly high risk for suicide.10 In 2014, veterans accounted for 18% of completed suicides in the US but only were 8.5% of the total population that same year.10 Yet, little research has been done on the relationship between distress and suicide in veterans living with cancer. Aboumrad and colleagues found that 45% of veterans with cancer who completed suicide reported family issues and 41% endorsed chronic pain.11 This study recommended continued efforts to assess and treat distress to lessen risk of suicide in veterans living with cancer; however, to date, only 1 study has specifically evaluated distress and problems endorsed among veterans living with cancer.7

Suicide prevention is of the highest priority to the US Department of Veterans Affairs (VA).12 Consistent with the VA mission to end veteran suicide, the current study aimed to better understand the relationship between distress and suicide within a sample of veterans living with cancer. Findings would additionally be used to tailor clinical assessments and interventions for veterans living with cancer.

This study had 3 primary goals. First, we sought to understand demographic and clinical factors associated with low, moderate, and severe levels of distress in veterans living with cancer who were referred for psychology services. Second, the study investigated the most commonly endorsed problems by veterans living with cancer. Finally, we examined which problems were related to suicidal ideation (SI). It was hypothesized that veterans who reported severe distress would be significantly more likely to endorse SI when compared with veterans who reported mild or moderate distress. Based on existing literature, it was further hypothesized that family, emotional, and physical problems would be significantly associated with SI.7,11

Methods

The current study was conducted at James A. Haley Veterans’ Hospital (JAHVH) in Tampa, Florida. Inclusion criteria included veterans who were diagnosed with cancer, attended an outpatient psychology-oncology evaluation, and completed mental health screening measures provided during their evaluation. Exclusion criteria included veterans who: were seen in response to an inpatient consult, were seen solely for a stem cell transplant evaluation, or did not complete the screening measures.

Measures

A veteran’s demographic (eg, age, sex, ethnicity) and clinical (eg, cancer type, stage of disease, recurrence, cancer treatments received) information was abstracted from their VA medical records. Marital status was assessed during a clinical interview and documented as part of the standardized suicide risk assessment.

 

 

The Distress Thermometer (DT) is a subjective measure developed by the NCCN.2 The DT provides a visual representation of a thermometer and asks patients to rate their level of distress over the past week with 0 indicating no distress and 10 indicating extreme distress. A distress rating of 4 or higher is clinically significant.4,6 Distress may be categorized into 3 levels of severity: mild distress (< 4), moderate distress (4-7), or severe distress (8-10). The DT has been found to have good face validity, sensitivity and specificity, and is user-friendly.2,6,7,13

The measurement additionally lists 39 problems nested within 5 domains: practical, family, emotional, spiritual/religious, and physical. Patients may endorse listed items under each problem domain by indicating yes or no. Endorsement of various items are intended to provide more detailed information about sources of distress. Due to the predominantly male and mostly older population included in this study the ability to have children measure was removed from the family problem domain.

SI was assessed in 2 ways. First, by patients’ self-report through item-9 of the Patient Health Questionnaire-9 (PHQ-9).14 Item-9 asks “over the last 2 weeks, how often have you been bothered by thoughts that you would be better off dead or of hurting yourself in some way?” Responses range from 0 (not at all) to 3 (nearly every day).14 Responses > 0 were considered a positive screen for SI. The process of administering the PHQ-9 item-9 is part of a national VA directive for standardizing assessment of suicide risk (Steve Young, personal communication, May 23, 2018). The PHQ-9 has been found to have good construct validity when used with both medical samples and the general population along with good internal and test-retest reliability.14,15 Second, all veterans also were asked directly about SI during clinical interview, the results of which were documented in health records using a standardized format for risk assessment.

Procedure

Participants were a sample of veterans who were referred for psychology-oncology services. The NCCN DT and Problems List were administered prior to the start of clinical interviews, which followed a checklist and included standardized assessments of SI and history of a suicide attempt(s). A licensed clinical psychologist or a postdoctoral resident conducted these assessments under the supervision of a licensed psychologist. Data gathered during the clinical interview and from the DT and problems list were documented in health records, which were retrospectively reviewed for relevant information (eg, cancer diagnosis, SI). Therefore, informed consent was waived. This study was approved by the JAHVH Institutional Review Board.

Analysis

Data were analyzed using SPSS Version 25. Data analysis proceeded in 3 steps. First, descriptive statistics included the demographic and clinical factors present in the current sample. Difference between those with and without suicidal ideation were compared using F-statistic for continuous variables and χ2 analyses for categorical variables. Second, to examine relationships between each DT problem domain and SI, χ2 analyses were conducted. Third, DT problem domains that had a significant relationship with SI were entered in a logistic regression. This analysis determined which items, if any, from a DT problem domain predicted SI. In the logistic regression model, history of suicide attempts was entered into the first block, as history of suicide attempts is a well-established risk factor for subsequent suicidal ideation. In the second block, other variables that were significantly related to suicidal ideation in the second step of analyses were included. Before interpreting the results of the logistic regression, model fit was tested using the Hosmer-Lemeshow test. Significance of each individual predictor variable in the model is reported using the Wald χ2 statistic; each Wald statistic is compared with a χ2 distribution with 1 degree of freedom (df). Results of logistic regression models also provide information about the effect of each predictor variable in the regression equation (beta weight), odds a veteran who endorsed each predictor variable in the model would also endorse SI (as indicated by the odds ratio), and an estimate of the amount of variance accounted for by each predictor variable (using Nagelkerke’s pseudo R2, which ranges in value from 0 to 1 with higher values indicating more variance explained). For all analyses, P value of .05 (2-tailed) was used for statistical significance.

 

 

Results

The sample consisted of 174 veterans (Table 1). The majority (77.6%) were male with a mean age of nearly 62 years (range, 29-87). Most identified as white (74.1%) with half reporting they were either married or living with a partner.

Prostate cancer (19.0%) was the most common type of cancer among study participants followed by head and neck (18.4%), lymphoma/leukemia (11.5%), lung (11.5%), and breast (10.9%); 31.6% had metastatic disease and 14.9% had recurrent disease. Chemotherapy (42.5%) was the most common treatment modality, followed by surgery (38.5%) and radiation (31.6%). The sample was distributed among the 3 distress DT categories: mild (18.4%), moderate (42.5%), and severe (39.1%).

Problems Endorsed

Treatment decisions (44.3%) and insurance/financial concerns (35.1%) were the most frequently endorsed practical problems (Figure 1). Family health issues (33.9%) and dealing with partner (23.0%) were the most frequently endorsed family problems (Figure 2). Worry (73.0%) and depression (69.5%) were the most frequent emotional problems; of note, all emotional problems were endorsed by at least 50% of veterans (Figure 3). Fatigue (71.3%), sleep (70.7%), and pain (69%), were the most frequently endorsed physical problems (Figure 4). Spiritual/religious problems were endorsed by 15% of veterans.

 

Suicidal Ideation

Overall, 25.3% of veterans endorsed SI. About 20% of veterans reported a history of ≥ 1 suicide attempts in their lifetime. A significant relationship among distress categories and SI was found 2 = 18.36, P < .001). Veterans with severe distress were more likely to endorse SI (42.7%) when compared with veterans with mild (9.4%) or moderate (16.2%) distress.

Similarly, a significant relationship among distress categories and a history of a suicide attempt(s) was found (χ2 = 6.08, P = .048). Veterans with severe distress were more likely to have attempted suicide (29.4%) when compared with veterans with mild (12.5%) or moderate (14.9%) distress.



χ2 analyses were conducted to examine the relationships between DT problem domains and SI. A significant relationship was found between family problems and SI (χ2 = 5.54,df = 1, P = .02) (Table 2). Specifically, 33.0% of veterans who endorsed family problems also reported experiencing SI. In comparison, there were no significant differences between groups with regard to practical, emotional, spiritual/religious, or physical problems and SI.

Logistic regression analyses determined whether items representative of the family problems domain were predictive of SI. Suicide attempt(s) were entered in the first step of the model to evaluate risk factors for SI over this already established risk factor. The assumptions of logistic regression were met.



The Hosmer-Lemeshow test (χ2 = 3.66, df = 5, P = .56) demonstrated that the model fit was good. The group of predictors used in the model differentiate between people who were experiencing SI and those who were not experiencing SI at the time of evaluation. A history of a suicide attempt(s) predicted SI, as expected (Wald = 6.821, df = 1, P = .01). The odds that a veteran with a history of a suicide attempt(s) would endorse SI at the time of the evaluation was nearly 3 times greater than that of veterans without a history of a suicide attempt(s). Over and above suicide attempts, problems dealing with partner (Wald = 15.142; df = 1, P < .001) was a second significant predictor of current SI. The odds that a veteran who endorsed problems dealing with partner would also endorse SI was > 5 times higher than that of veterans who did not endorse problems dealing with partner. This finding represents a significant risk factor for SI, over and above a history of a suicide attempt(s). The other items from the family problems domains were not significant (P > .05) (Table 3).

 

 

Discussion

This study aimed to understand factors associated with low, moderate, and severe levels of distress in veterans living with cancer who were referred for psychology services. As hypothesized, veterans who endorsed severe distress were significantly more likely to endorse SI. They also were more likely to have a history of a suicide attempt(s) when compared with those with mild or moderate distress.

A second aim of this study was to understand the most commonly endorsed problems. Consistent with prior literature, treatment decisions were the most commonly endorsed practical problem; worry and depression were the most common emotional problems; and fatigue, sleep, and pain were the most common physical problems.7

A finding unique to the current study is that family health issues and dealing with partner were specified as the most common family problems. However, a study by Smith and colleagues did not provide information about the rank of most frequently reported problems within this domain.7

The third aim was to understand which problems were related to SI. It was hypothesized that family, emotional, and physical problems would be related to SI. However, results indicated that only family problems (specifically, problems dealing with a partner) were significantly associated with SI among veterans living with cancer.

Contrary to expectations, emotional and physical problems were not found to have a significant relationship with SI. This is likely because veterans endorsed items nested within these problem domains with similar frequency. The lack of significant findings does not suggest that emotional and physical problems are not significant predictors of SI for veterans living with cancer, but that no specific emotional or physical symptom stood out as a predictor of suicidal ideation above the others.

The finding of a significant relationship between family problems (specifically, problems dealing with a partner) and SI in this study is consistent with findings of Aboumrad and colleagues in a study that examined root-cause analyses of completed suicides by veterans living with cancer.11 They found that nearly half the sample endorsed family problems prior to their death, and a small but notable percentage of veterans who completed suicide reported divorce as a stressor prior to their death.

This finding may be explained by Thomas Joiner's interpersonal-psychological theory of suicidal behavior (IPT), which suggests that completed suicide may result from a thwarted sense of belonging, perceived burdensomeness, and acquired capability for suicide.16 Problems dealing with a partner may impact a veteran’s sense of belonging or social connectedness. Problems dealing with a partner also may be attributed to perceived burdens due to limitations imposed by living with cancer and/or undergoing treatment. In both circumstances, the veteran’s social support system may be negatively impacted, and perceived social support is a well-established protective factor against suicide.17

Partner distress is a second consideration. It is likely that veterans’ partners experienced their own distress in response to the veteran’s cancer diagnosis and/or treatment. The partner’s cause, severity, and expression of distress may contribute to problems for the couple.

Finally, the latter point of the IPT refers to acquired capability, or the ability to inflict deadly harm to oneself.18 A military sample was found to have more acquired capability for suicide when compared with a college undergraduate sample.19 A history of a suicide attempt(s) and male gender have been found to significantly predict acquired capability to complete suicide.18 Furthermore, because veterans living with cancer often are in pain, fear of pain associated with suicide may be reduced and, therefore, acquired capability increased. This suggests that male veterans living with cancer who are in pain, have a history of a suicide attempt(s), and current problems with their partner may be an extremely vulnerable population at-risk for suicide. Results from the current study emphasize the importance of veterans having access to mental health and crisis resources for problems dealing with their partner. Partner problems may foreshadow a potentially lethal type of distress.

 

 

Strengths

This study’s aims are consistent with the VA’s mission to end veteran suicide and contributes to literature in several important ways.12 First, veterans living with cancer are an understudied population. The current study addresses a gap in existing literature by researching veterans living with cancer and aims to better understand the relationship between cancer-related distress and SI. Second, to the best of the authors’ knowledge, this study is the first to find that problems dealing with a partner significantly increases a veteran’s risk for SI above a history of a suicide attempt(s). Risk assessments now may be more comprehensive through inclusion of this distress factor.

It is recommended that future research use IPT to further investigate the relationship between problems dealing with a partner and SI.16 Future research may do so by including specific measures to assess for the tenants of the theory, including measurements of burdensomeness and belongingness. An expanded knowledge base about what makes problems dealing with a partner a significant suicide risk factor (eg, increased conflict, lack of support, etc.) would better enable clinicians to intervene effectively. Effective intervention may lessen suicidal behaviors or deaths from suicides within the Veteran population.

Limitations

One limitation is the focus on patients who accepted a mental health referral. This study design may limit the generalizability of results to veterans who would not accept mental health treatment. The homogenous sample of veterans is a second limitation. Most participants were male, white, and had a mean age of 62 years. These demographics are representative of the veterans that most typically utilize VA services; however, more research is needed on veterans living with cancer who are female and of diverse racial and ethnic backgrounds. There are likely differences in problems endorsed and factors associated with SI based on age, race, sex, and other socioeconomic factors. A third limitation is the cross-sectional, retrospective nature of this study. Future studies are advised to assess for distress at multiple time points. This is consistent with NCCN Standards of Care for Distress Management.2 Longitudinal data would enable more findings about distress and SI throughout the course of cancer diagnosis and treatment, therefore enhancing clinical implications and informing future research.

Conclusion

This is among the first of studies to investigate distress and factors associated with SI in veterans living with cancer who were referred for psychology services. The prevalence of distress caused by psychosocial factors (including treatment decisions, worry, and depression) highlights the importance of including mental health services as part of comprehensive cancer treatment.

Distress due to treatment decisions may be attributed to a litany of factors such as a veteran’s consideration of adverse effects, effectiveness of treatments, changes to quality of life or functioning, and inclusion of alternative or complimentary treatments. These types of decisions often are reported to be difficult conversations to have with family members or loved ones, who are likely experiencing distress of their own. The role of a mental health provider to assist veterans in exploring their treatment decisions and the implications of such decisions appears important to lessening distress.

Early intervention for emotional symptoms would likely benefit veterans’ management of distress and may lessen suicide risk as depression is known to place veterans at-risk for SI.20 This underscores the importance of timely distress assessment to prevent mild emotional distress from progressing to potentially severe or life-threatening emotional distress. For veterans with a psychiatric history, timely assessment and intervention is essential because psychiatric history is an established suicide risk factor that may be exacerbated by cancer-related distress.12

Furthermore, management of intolerable physical symptoms may lessen risk for suicide.4 Under medical guidance, fatigue may be improved using exercise.21 Behavioral intervention is commonly used as first-line treatment for sleep problems.22 While pain may be lessened through medication or nonpharmacological interventions.23

Considering the numerous ways that distress may present itself (eg, practical, emotional, or physical) and increase risk for SI, it is essential that all veterans living with cancer are assessed for distress and SI, regardless of their presentation. Although veterans may not outwardly express distress, this does not indicate the absence of either distress or risk for suicide. For example, a veteran may be distressed due to financial concerns, transportation issues, and the health of his/her partner or spouse. This veteran may not exhibit visible symptoms of distress, as would be expected when the source of distress is emotional (eg, depression, anxiety). However, this veteran is equally vulnerable to impairing distress and SI as someone who exhibits emotional distress. Distress assessments should be further developed to capture both the visible and less apparent sources of distress, while also serving the imperative function of screening for suicide. Other researchers also have noted the necessity of this development.24 Currently, the NCCN DT and Problems List does not include any assessment of SI or behavior.

Finally, this study identified a potentially critical factor to include in distress assessment: problems dealing with a partner. Problems dealing with a partner have been noted as a source of distress in existing literature, but this is the first study to find problems dealing with a partner to be a predictor of SI in veterans living with cancer.4-6

Because partners often attend appointments with veterans, it is not surprising that problems dealing with their partner are not disclosed more readily. It is recommended that clinicians ask veterans about potential problems with their partner when they are alone. Directly gathering information about such problems while assessing for distress may assist health care workers in providing the most effective, accurate type of intervention in a timely manner, and potentially mitigate risk for suicide.

As recommended by the NCCN and numerous researchers, findings from the current study underscore the importance of accurate, timely assessment of distress.2,4,8 This study makes several important recommendations about how distress assessment may be strengthened and further developed, specifically for the veteran population. This study also expands the current knowledge base of what is known about veterans living with cancer, and has begun to fill a gap in the existing literature. Consistent with the VA mission to end veteran suicide, results suggest that veterans living with cancer should be regularly screened for distress, asked about distress related to their partner, and assessed for SI. Continued efforts to enhance assessment of and response to distress may lessen suicide risk in veterans with cancer.11

 

Acknowledgements

This study is the result of work supported with resources and the use of facilities at the James A. Haley Veterans’ Hospital.

 


References

1. Siegel RL, Miller KD, Jemal A. Cancer statistics, 2019. CA Cancer J Clin. 2019;69(1):7-34.

2. Riba MB, Donovan KA, Andersen B. National Comprehensive Cancer Network clinical practice guidelines in oncology: distress management (version 3.2019). J Natl Compr Canc Netw. 2019;17(10):1229-1249.

3. Zabora J, BrintzenhofeSzoc K, Curbow B, Hooker C, Piantadosi S. The prevalence of psychological distress by cancer site. Psychooncology. 2001;10(1):19-28.

4. Holland JC, Alici Y. Management of distress in cancer patients. J Support Oncol. 2010;8(1):4-12.

5. Bulli F, Miccinesi G, Maruelli A, Katz M, Paci E. The measure of psychological distress in cancer patients: the use of distress thermometer in the oncological rehabilitation center of Florence. Support Care Cancer. 2009;17(7):771–779.

6. Jacobsen PB, Donovan KA, Trask PC, et al. Screening for psychologic distress in ambulatory cancer patients. Cancer. 2005;103(7):1494-1502.

7. Smith J, Berman S, Dimick J, et al. Distress screening and management in an outpatient VA cancer clinic: a pilot project involving ambulatory patients across the disease trajectory. Fed Pract. 2017;34(Suppl 1):43S-50S.

8. Carlson LE, Waller A, Groff SL, Bultz BD. Screening for distress, the sixth vital sign, in lung cancer patients: effects on pain, fatigue, and common problems--secondary outcomes of a randomized controlled trial. Psychooncology. 2013;22(8):1880-1888.

9. Cooley ME, Short TH, Moriarty HJ. Symptom prevalence, distress, and change over time in adults receiving treatment for lung cancer. Psychooncology. 2003;12(7):694-708.

10. US Department of Veterans Affairs Office of Suicide Prevention. Suicide among veterans and other Americans 2001-2014. https://www.mentalhealth.va.gov/docs/2016suicidedatareport.pdf. Published August 3, 2016. Accessed April 13, 2020.

11. Aboumrad M, Shiner B, Riblet N, Mills PD, Watts BV. Factors contributing to cancer-related suicide: a study of root-cause-analysis reports. Psychooncology. 2018;27(9):2237-2244.

12. US Department of Veterans Affairs, Office of Mental Health and Suicide Prevention. National Strategy for Preventing Veteran Suicide 2018-2028. https://www.mentalhealth.va.gov/suicide_prevention/docs/Office-of-Mental-Health-and-Suicide-Prevention-National-Strategy-for-Preventing-Veterans-Suicide.pdf. Published 2018. Accessed April 13, 2020.

13. Carlson LE, Waller A, Mitchell AJ. Screening for distress and unmet needs in patients with cancer: review and recommendations. J Clin Oncol. 2012;30(11):1160-1177.

14. Kroenke K, Spitzer RL, Williams JB. The PHQ-9: validity of a brief depression severity measure. J Gen Intern Med. 2001;16(9):606–613.

15. Martin A, Rief W, Klaiberg A, Braehler E. Validity of the brief patient health questionnaire mood scale (PHQ-9) in the general population. Gen Hosp Psychiatry. 2006;28(1):71-77.

16. Joiner TE. Why People Die by Suicide. Cambridge, MA: Harvard University Press, 2005.

17. Kleiman EM, Riskind JH, Schaefer KE. Social support and positive events as suicide resiliency factors: examination of synergistic buffering effects. Arch Suicide Res. 2014;18(2):144-155.

18. Van Orden KA, Witte TK, Gordon KH, Bender TW, Joiner TE Jr. Suicidal desire and the capability for suicide: tests of the interpersonal-psychological theory of suicidal behavior among adults. J Consult Clin Psychol. 2008;76(1):72–83.

19. Bryan CJ, Morrow CE, Anestis MD, Joiner TE. A preliminary test of the interpersonal-psychological theory of suicidal behavior in a military sample. Pers Individ Dif. 2010;48(3):347-350.

20. Miller SN, Monahan CJ, Phillips KM, Agliata D, Gironda RJ. Mental health utilization among veterans at risk for suicide: Data from a post-deployment clinic [published online ahead of print, 2018 Oct 8]. Psychol Serv. 2018;10.1037/ser0000311.

21. Galvão DA, Newton RU. Review of exercise intervention studies in cancer patients. J Clin Oncol. 2005;23(4):899-909.

22. Qaseem A, Kansagara D, Forciea MA, Cooke M, Denberg TD; Clinical Guidelines Committee of the American College of Physicians. Management of chronic insomnia disorder in adults: A clinical practice guideline from the American College of Physicians. Ann Intern Med. 2016;165(2):125-133.

23. Ngamkham S, Holden JE, Smith EL. A systematic review: Mindfulness intervention for cancer-related pain. Asia Pac J Oncol Nurs. 2019;6(2):161-169.

24. Granek L, Nakash O, Ben-David M, Shapira S, Ariad S. Oncologists’, nurses’, and social workers’ strategies and barriers to identifying suicide risk in cancer patients. Psychooncology. 2018;27(1):148-154.



Atrial Fibrillation and Bleeding in Patients With Chronic Lymphocytic Leukemia Treated with Ibrutinib in the Veterans Health Administration (FULL)


Chronic lymphocytic leukemia (CLL) is the most common leukemia diagnosed in developed countries, with an estimated 21,040 new diagnoses of CLL expected in the US in 2020.1-3 CLL is an indolent cancer characterized by the accumulation of B-lymphocytes in the blood, marrow, and lymphoid tissues.4 It has a heterogeneous clinical course; the majority of patients are observed or receive delayed treatment following diagnosis, while a minority of patients require immediate treatment. After first-line treatment, some patients experience prolonged remissions while others require retreatment within 1 or 2 years. Fortunately, advances in cancer biology and therapeutics in the last decade have increased the number of treatment options available for patients with CLL.

Until recently, most CLL treatments relied on a chemotherapy or a chemoimmunotherapy backbone; however, the last few years have seen novel therapies introduced, such as small molecule inhibitors that target molecular pathways promoting the normal development, expansion, and survival of B-cells.5 One such therapy is ibrutinib, a targeted Bruton tyrosine kinase inhibitor that received accelerated approval from the US Food and Drug Administration (FDA) in February 2014 for patients with CLL who had received at least 1 prior therapy. The FDA later expanded this approval to include use of ibrutinib in patients with relapsed or refractory CLL, with or without chromosome 17p deletion. In 2016, based on data from the RESONATE-17 study, the FDA approved ibrutinib for first-line therapy in patients with CLL.6

Ibrutinib’s efficacy, ease of administration and dosing (all doses are oral and fixed, rather than based on weight or body surface area), and relatively favorable safety profile have resulted in a rapid growth in its adoption.7 Since its adverse event (AE) profile is generally more tolerable than that of a typical chemoimmunotherapy, its use in older patients with CLL and patients with significant comorbidities is particularly appealing.8

However, the results of some clinical trials suggest an association between treatment with ibrutinib and an increased risk of bleeding-related events of any grade (44%) and major bleeding events (4%).7,8 The incidence of major bleeding events was reported to be higher (9%) in one clinical trial and at 5-year follow-up, although this trial did not exclude patients receiving concomitant oral anticoagulation with warfarin.6,9

A systematic review and meta-analysis found that heterogeneity in how clinical trials defined major bleeding made it difficult to estimate bleeding risk in patients treated with ibrutinib and called for more data.10 Additionally, patients with factors that might increase the risk of major bleeding during ibrutinib treatment were likely underrepresented in clinical trials, given how carefully trial participants are selected. These factors include renal or hepatic disease, gastrointestinal disease, and use of concomitant medications such as antiplatelet or anticoagulant agents. Accounting for the latter is particularly important because patients who develop atrial fibrillation (Afib), a recognized AE of ibrutinib treatment, often are treated with anticoagulant medications to decrease the risk of stroke or other thromboembolic complications.

 

 


A single-site observational study of patients treated with ibrutinib reported high utilization rates of antiplatelet medications (70%), anticoagulant medications (17%), or both (13%), with major bleeding occurring in 18% of patients.11 The prevalence of bleeding events appeared to be strongly influenced by concomitant medications: 78% of patients treated with ibrutinib while concurrently receiving both antiplatelet and anticoagulant medications developed a major bleeding event, whereas none of the patients who were not receiving antiplatelets, anticoagulants, or medications that interact with cytochrome P450 (a family of enzymes that metabolizes ibrutinib and many other drugs) experienced a major bleeding event.11

The prevalence of major bleeding events, comorbidities, and use of medications that could increase the risk of major bleeding among patients with CLL receiving ibrutinib in the Veterans Health Administration (VHA), the largest integrated health care system in the US, is not known. To address these knowledge gaps, a retrospective observational study was conducted using VHA data on demographics, comorbidities that could affect bleeding, use of anticoagulant and antiplatelet medications, and bleeding events in patients with CLL who were treated in the first year of ibrutinib availability.

The first year of ibrutinib availability was chosen because many health care providers were anticipated to be unfamiliar with the drug during that time given its novelty and, therefore, more likely to codispense it with medications that could increase the risk of a bleeding event. Because Afib is both an AE associated with ibrutinib treatment and a condition that often is treated with anticoagulants, the occurrence of Afib in this population also was examined. For context, the incidence of bleeding and Afib and the use of anticoagulant and antiplatelet medications during treatment also were reported for a cohort of patients with CLL treated with bendamustine + rituximab (BR).

Methods

The VHA maintains the centralized US Department of Veterans Affairs Cancer Registry System (VACRS), with electronic medical record data and other sources captured in its Corporate Data Warehouse (CDW). The VHA CDW is a national repository comprising data from several VHA clinical and administrative systems. The CDW includes patient identifiers; demographics; vital status; laboratory information; administrative information (such as International Classification of Diseases, Ninth Revision [ICD-9] diagnosis codes); medication dispensation tables (such as outpatient fills); IV package information; and notes from radiology, pathology, outpatient and inpatient admission, discharge, and daily progress.

Registrars abstract all cancer cases within the VHA system (or diagnosed outside the VHA, if patients subsequently receive treatment in the VHA). It is estimated that VACRS captures 3% of cancer cases in the US.12 Like most registries, VACRS captures data such as diagnosis, age, gender, race, and vital status.

 

 


The study received approval from the University of Utah Institutional Review Board and used individual patient-level historical administrative, cancer registry, and electronic health care record data. Patients diagnosed and treated for CLL at the VHA from 2010 to 2014 were identified through the VACRS and CDW; patients with a prior malignancy were excluded. Patients who received ibrutinib or BR based on pharmacy dispensation information were selected. Patients were followed until December 31, 2016 or death; patients with documentation of another cancer or lack of utilization of the VHA hematology or oncology services (defined as absence of any hematology and/or oncology clinic visits for ≥ 18 months) were omitted from the final analysis (Figure).
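
To make the selection logic concrete, the following is a minimal sketch in R (the software later used for the study analysis) of how such a cohort might be assembled. All object and column names (registry, dispensations, prior_malignancy, index_date) are hypothetical placeholders invented for illustration; they do not reflect the actual VACRS or CDW schema or the study's extraction code.

# Hypothetical example data; names and values are illustrative only
registry <- data.frame(
  patient_id = 1:4,
  cll_dx_date = as.Date(c("2011-03-01", "2012-07-15", "2013-01-10", "2014-05-02")),
  prior_malignancy = c(FALSE, FALSE, TRUE, FALSE)
)
dispensations <- data.frame(
  patient_id = c(1, 1, 2, 4),
  drug = c("ibrutinib", "ibrutinib", "bendamustine", "ibrutinib"),
  fill_date = as.Date(c("2014-03-01", "2014-04-01", "2014-02-10", "2015-06-01"))
)

# Step 1: CLL diagnosed 2010-2014 with no prior malignancy
eligible <- subset(registry,
                   format(cll_dx_date, "%Y") %in% as.character(2010:2014) &
                   !prior_malignancy)

# Step 2: the first ibrutinib dispensation defines each patient's index date
ibr <- subset(dispensations, drug == "ibrutinib")
ibr <- ibr[order(ibr$patient_id, ibr$fill_date), ]
first_fill <- ibr[!duplicated(ibr$patient_id), c("patient_id", "fill_date")]
names(first_fill)[2] <- "index_date"

# Step 3: combine; the BR cohort would be built the same way from BR fills
cohort <- merge(eligible, first_fill, by = "patient_id")
cohort

In the actual study, patients with documentation of another cancer during follow-up or without VHA hematology and/or oncology visits for 18 months or more were then removed, as described above.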



Previous and concomitant use of antiplatelet (aspirin, clopidogrel) or anticoagulant (dalteparin, enoxaparin, fondaparinux, heparin, rivaroxaban, and warfarin) medications was identified from pharmacy dispensation records in the 6 months before and after the first dispensation of ibrutinib or BR.
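
A sketch of how this exposure window might be implemented follows, again in R with invented object names (cohort, fills) and a 183-day approximation of 6 months; the article does not state the exact day count, so that window is an assumption.

# Drug classes of interest, as listed above
antiplatelets <- c("aspirin", "clopidogrel")
anticoagulants <- c("dalteparin", "enoxaparin", "fondaparinux",
                    "heparin", "rivaroxaban", "warfarin")

# Flag prior (-183 to -1 days) and concomitant (0 to +183 days) exposure
# relative to each patient's index dispensation date
flag_exposure <- function(cohort, fills, drugs, window = 183) {
  j <- merge(cohort[, c("patient_id", "index_date")], fills, by = "patient_id")
  j$delta <- as.numeric(j$fill_date - j$index_date)
  hit <- j$drug %in% drugs
  prior_ids  <- unique(j$patient_id[hit & j$delta >= -window & j$delta < 0])
  concom_ids <- unique(j$patient_id[hit & j$delta >= 0 & j$delta <= window])
  data.frame(patient_id  = cohort$patient_id,
             prior       = cohort$patient_id %in% prior_ids,
             concomitant = cohort$patient_id %in% concom_ids)
}

# Example use, assuming the cohort from the previous sketch and a fills table
# of outpatient dispensations (patient_id, drug, fill_date):
# anticoag_use <- flag_exposure(cohort, fills, anticoagulants)
# antiplat_use <- flag_exposure(cohort, fills, antiplatelets)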

Study Definitions

Prevalence of comorbidities that could increase bleeding risk was determined using administrative ICD-9-CM codes. Liver disease was identified by presence of cirrhosis, hepatitis C virus, or alcoholic liver disease using administrative codes validated by Kramer and colleagues, who reported positive and negative predictive values of 90% and 87% for cirrhosis, 93% and 92% for hepatitis C virus, and 71% and 98% for alcoholic liver disease.13 Similarly, end-stage liver disease was identified using a validated coding algorithm developed by Goldberg and colleagues, with a positive predictive value of 89.3%.14 The presence of controlled or uncontrolled diabetes mellitus (DM) was identified using the procedure described by Guzman and colleagues.15 Quan’s algorithm was used to calculate Charlson Comorbidity Index (CCI) based on ICD-9-CM codes for inpatient and outpatient visits within a 6-month lookback period prior to treatment initiation.16
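
The lookback logic for comorbidity ascertainment can be sketched as below. The ICD-9-CM code list shown is a deliberately truncated placeholder, not the validated algorithms of Kramer, Goldberg, Guzman, or Quan cited above; the diagnoses table, its column names, and the 183-day window are likewise assumptions made only for illustration.

# Hypothetical diagnosis and cohort tables
diagnoses <- data.frame(
  patient_id = c(1, 1, 2, 4),
  icd9 = c("571.5", "070.54", "250.00", "585.6"),
  dx_date = as.Date(c("2014-01-05", "2013-12-01", "2014-01-20", "2015-03-10"))
)
cohort <- data.frame(
  patient_id = c(1, 2, 4),
  index_date = as.Date(c("2014-03-01", "2014-02-10", "2015-06-01"))
)

# Placeholder code subset standing in for a validated liver-disease algorithm
liver_codes <- c("571.5", "070.54")

# TRUE if any qualifying code appears in the 183 days before treatment initiation
lookback_flag <- function(cohort, diagnoses, codes, days = 183) {
  j <- merge(cohort, diagnoses, by = "patient_id")
  delta <- as.numeric(j$index_date - j$dx_date)
  hit_ids <- unique(j$patient_id[j$icd9 %in% codes & delta > 0 & delta <= days])
  cohort$patient_id %in% hit_ids
}

cohort$liver_disease <- lookback_flag(cohort, diagnoses, liver_codes)
cohort

A full CCI calculation would repeat this lookback across all of the condition groups in Quan's mapping and sum their weights; that bookkeeping is omitted here for brevity.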

A major bleeding event was defined as a hospitalization with an ICD-9-CM code suggestive of major bleeding as the primary reason, as defined by Lane and colleagues in their study of major bleeding related to warfarin in a cohort of patients treated within the VHA.17 Incidence rates of major bleeding events were identified during the first 6 months of treatment. Incidence of Afib—defined as an inpatient or outpatient encounter with the 427.31 ICD-9-CM code—also was examined within the first 6 months after starting treatment. The period of 6 months was chosen because bendamustine must be discontinued after 6 months.
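
The two outcome definitions can be expressed with the same windowing pattern. In the sketch below the bleeding code list is an abbreviated placeholder rather than the full set from Lane and colleagues, the encounters table and its columns are invented, and, for brevity, Afib is checked only in the primary diagnosis field even though the study counted any inpatient or outpatient encounter coded 427.31.

bleeding_codes <- c("430", "431", "578.0", "578.9")  # placeholder subset only

# Hypothetical encounters: one row per encounter with its primary diagnosis
encounters <- data.frame(
  patient_id = c(1, 1, 2),
  primary_dx = c("578.9", "427.31", "427.31"),
  inpatient = c(TRUE, FALSE, FALSE),
  enc_date = as.Date(c("2014-04-15", "2014-05-01", "2013-12-15"))
)
cohort <- data.frame(
  patient_id = c(1, 2),
  index_date = as.Date(c("2014-03-01", "2014-02-10"))
)

j <- merge(cohort, encounters, by = "patient_id")
j$delta <- as.numeric(j$enc_date - j$index_date)

# Major bleeding: hospitalization with a qualifying primary code in the
# first 183 days of treatment
bleed_ids <- unique(j$patient_id[j$inpatient &
                                 j$primary_dx %in% bleeding_codes &
                                 j$delta >= 0 & j$delta <= 183])

# Incident Afib: code 427.31 in the 183 days after treatment start with no
# 427.31 in the 183 days before
afib_after  <- unique(j$patient_id[j$primary_dx == "427.31" &
                                   j$delta >= 0 & j$delta <= 183])
afib_before <- unique(j$patient_id[j$primary_dx == "427.31" &
                                   j$delta >= -183 & j$delta < 0])

cohort$major_bleed   <- cohort$patient_id %in% bleed_ids
cohort$incident_afib <- cohort$patient_id %in% setdiff(afib_after, afib_before)
cohort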

 

 

Study Analysis

Descriptive statistics were used to examine patient demographics, disease characteristics, and treatment history from initial CLL diagnosis through the end of the study observation period. Categorical variables were summarized using frequencies and accompanying proportions, and continuous variables were summarized using means and standard deviations; 95% CIs accompanied the means of continuous variables and the proportions of categorical variables. Proportions and accompanying 95% CIs characterized treatment patterns, including line of therapy, comorbidities, and bleeding events. Treatment duration was described using a mean and accompanying 95% CI. Statistical tests were not conducted for comparisons among treatment groups. Patients were censored at the end of follow-up, defined as the earliest of the following: (1) end of the study observation period (December 31, 2016); (2) development of a secondary cancer; or (3) last day of contact, given absence of care within the VHA for ≥ 18 months (with care defined as a hematology and/or oncology visit with an associated note). Analysis was performed using R 3.4.0.
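
Two pieces of this analysis lend themselves to a short illustration: deriving each patient's censoring date as the earliest of the three events above, and reporting a proportion with a 95% CI. The sketch below uses invented follow-up data, and the exact binomial interval from binom.test is an assumption, since the article does not state which CI method was applied; the event counts (14 of 172 ibrutinib-treated patients with a major bleeding event) are taken from the Results.

study_end <- as.Date("2016-12-31")

# Hypothetical follow-up table; NA means the event did not occur
fu <- data.frame(
  patient_id = 1:3,
  second_cancer_date = as.Date(c(NA, "2015-06-30", NA)),
  last_contact_date = as.Date(c("2016-12-31", "2016-12-31", "2015-01-15"))
)

# Censor at the earliest of study end, second cancer, or last contact
fu$censor_date <- pmin(study_end, fu$second_cancer_date, fu$last_contact_date,
                       na.rm = TRUE)

# Proportion of ibrutinib-treated patients with a major bleeding event,
# with an exact 95% binomial CI (interval method assumed for illustration)
bleed_ci <- binom.test(14, 172)
round(c(proportion = 14 / 172, lower = bleed_ci$conf.int[1],
        upper = bleed_ci$conf.int[2]), 3)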

Results

Between 2010 and 2014, 2,796 patients were diagnosed and received care for CLL within the VHA. All 172 patients treated with ibrutinib during the inclusion period were selected (Table). These patients were treated between January 1, 2014 and December 31, 2016, following ibrutinib’s approval in early 2014. An additional 291 patients who received BR were selected. Reflecting the predominantly male population of the VHA, 282 (97%) BR patients and 167 (97%) ibrutinib patients were male. The median age at diagnosis was 67 years for BR patients and 69 years for ibrutinib patients. About 76% of patients who received ibrutinib and 82% of patients who received BR were non-Hispanic white; 17% and 14% were African American, respectively.

Less than 10% of patients receiving either ibrutinib or BR had liver disease per the criteria used by Kramer and colleagues or end-stage liver disease per the criteria developed by Goldberg and colleagues.13,14 About 5% of patients had a history of previous bleeding in the 6-month period prior to initiating either therapy. Mean CCI (excluding malignancy) score was 1.5 (range, 0-11) for the ibrutinib group and 2.1 (range, 0-9) for the BR group. About 16% of the ibrutinib group had controlled DM and fewer than 10% had uncontrolled DM, while 4% of patients in the BR group met the criteria for controlled DM and another 4% met the criteria for uncontrolled DM.

There was very low utilization of anticoagulant or antiplatelet medication prior to initiation of ibrutinib (2.9% and 2.3%, respectively) or BR (< 1% each). In the first 6 months after treatment initiation, about 8% of patients in both ibrutinib and BR cohorts received anticoagulant medication while antiplatelet utilization was < 5% in either group.

In the BR group, 8 patients (2.7%) experienced a major bleeding event, while 14 patients (8.1%) in the ibrutinib group experienced a major bleeding event (P = .008). While these numbers were too low to permit a formal statistical analysis of the association between clinical covariates and bleeding in either group, there did not seem to be an association between bleeding and liver disease or DM. Of patients who experienced a bleeding event, about 1 in 4 had had a prior bleeding event in both the ibrutinib and BR groups. Interestingly, while none of the patients who experienced a bleeding event while receiving BR were taking concomitant anticoagulant medication, 3 of the 14 patients who experienced a bleeding event in the ibrutinib group showed evidence of anticoagulant utilization. Finally, the incidence of Afib (defined as no evidence of Afib in the 6 months prior to treatment but evidence of Afib in the 6 months following treatment initiation) was 4% in the BR group and about 8% in the ibrutinib group (P = .003).

 

 

Discussion

To the authors’ knowledge, this study is the first to examine the real-world incidence of bleeding and Afib in veterans who received ibrutinib for CLL in the first year of its availability. The study found minimal use of anticoagulants and/or antiplatelet agents prior to first-line ibrutinib or BR, and very low use of these agents in the first 6 months following the initiation of first-line treatment. This finding suggests a high awareness among VA providers of the potential AEs of ibrutinib and chemotherapy, and careful selection of patients who lack risk factors for AEs.

In patients treated with first-line ibrutinib when compared with patients treated with first-line BR, moderate increases in bleeding (2.7% vs 8.1%, P = .008) and Afib (10.5% vs 3%, P = .003) also were observed. These results are concordant with previous findings examining the use of ibrutinib in patients with CLL.18-20

Limitations

The results of this study should be interpreted with caution, as some limitations must be considered. The study was conducted in the early days of ibrutinib adoption. Since then, more patients have been treated with ibrutinib and for longer durations. As clinicians gain more familiarity with ibrutinib, and as additional novel therapeutics emerge, it is possible that the initial awareness of risks for possible AEs may diminish; patients with high comorbidity burdens and concomitant medications would be especially vulnerable in cases of reduced physician vigilance.

Another limitation of this study stems from the potential for dual system use among patients treated in the VHA. Concurrent or alternating use of multiple health care systems (VHA and private-sector facilities) may create gaps in the reconstruction of patient histories, resulting in missing data as patients transition between commercial, Centers for Medicare and Medicaid Services, and VHA care. Consequently, the results presented here do not reflect instances in which a patient experienced a bleeding event that was treated outside the VHA.

Missing data also may result from incomplete extraction from the electronic health record; these issues were addressed by leveraging knowledge of the multiple data marts within the CDW environment to harmonize missing or erroneous information using other data marts when possible. Lastly, this research represents a population-level study of the VHA, so all findings are directly relevant to the VHA; their generalizability outside the VHA depends on the characteristics of the external population.

Conclusion

Real-world evidence from a nationwide cohort of veterans with CLL treated with ibrutinib suggests that, while there is an association with increased bleeding-related events and Afib, the risks are comparable to those reported in previous studies.18-20 These findings suggest that patients in real-world clinical care settings with higher levels of comorbidities may be at a slightly increased risk for bleeding events and Afib.

References

1. Scarfò L, Ferreri AJ, Ghia P. Chronic lymphocytic leukaemia. Crit Rev Oncol Hematol. 2016;104:169-182.

2. Devereux S, Cuthill K. Chronic lymphocytic leukaemia. Medicine (Baltimore). 2017;45(5):292-296.

3. American Cancer Society. Cancer facts & figures 2020. https://www.cancer.org/content/dam/cancer-org/research/cancer-facts-and-statistics/annual-cancer-facts-and-figures/2020/cancer-facts-and-figures-2020.pdf. Accessed April 24, 2020.

4. Kipps TJ, Stevenson FK, Wu CJ, et al. Chronic lymphocytic leukaemia. Nat Rev Dis Primers. 2017;3:16096.

5. Owen C, Assouline S, Kuruvilla J, Uchida C, Bellingham C, Sehn L. Novel therapies for chronic lymphocytic leukemia: a Canadian perspective. Clin Lymphoma Myeloma Leuk. 2015;15(11):627-634.e5.

6. O’Brien S, Jones JA, Coutre SE, et al. Ibrutinib for patients with relapsed or refractory chronic lymphocytic leukaemia with 17p deletion (RESONATE-17): a phase 2, open-label, multicentre study. Lancet Oncol. 2016;17(10):1409–1418.

7. Burger JA, Tedeschi A, Barr PM, et al; RESONATE-2 Investigators. Ibrutinib as initial therapy for patients with chronic lymphocytic leukemia. N Engl J Med. 2015;373(25):2425-2437.

8. Byrd JC, Furman RR, Coutre SE, et al. Targeting BTK with ibrutinib in relapsed chronic lymphocytic leukemia. N Engl J Med. 2013;369(1):32-42.

9. O’Brien S, Furman R, Coutre S, et al. Single-agent ibrutinib in treatment-naive and relapsed/refractory chronic lymphocytic leukemia: a 5-year experience. Blood. 2018;131(17):1910-1919.

10. Caron F, Leong DP, Hillis C, Fraser G, Siegal D. Current understanding of bleeding with ibrutinib use: a systematic review and meta-analysis. Blood Adv. 2017;1(12):772-778.

11. Kunk PR, Mock J, Devitt ME, Palkimas S, et al. Major bleeding with ibrutinib: more than expected. Blood. 2016;128(22):3229.

12. Zullig LL, Jackson GL, Dorn RA, et al. Cancer incidence among patients of the U.S. Veterans Affairs Health Care System. Mil Med. 2012;177(6):693-701.

13. Kramer JR, Davila JA, Miller ED, Richardson P, Giordano TP, El-Serag HB. The validity of viral hepatitis and chronic liver disease diagnoses in Veterans Affairs administrative databases. Aliment Pharmacol Ther. 2008;27(3):274-282.

14. Goldberg D, Lewis JD, Halpern SD, Weiner M, Lo Re V 3rd. Validation of three coding algorithms to identify patients with end-stage liver disease in an administrative database. Pharmacoepidemiol Drug Saf. 2012;21(7):765-769.

15. Guzman JZ, Iatridis JC, Skovrlj B, et al. Outcomes and complications of diabetes mellitus on patients undergoing degenerative lumbar spine surgery. Spine (Phila Pa 1976). 2014;39(19):1596-1604.

16. Quan H, Sundararajan V, Halfon P, et al. Coding algorithms for defining comorbidities in ICD-9-CM and ICD-10 administrative data. Med Care. 2005;43(11):1130-1139.

17. Lane MA, Zeringue A, McDonald JR. Serious bleeding events due to warfarin and antibiotic co-prescription in a cohort of veterans. Am J Med. 2014;127(7):657–663.e2.

18. Leong DP, Caron F, Hillis C, et al. The risk of atrial fibrillation with ibrutinib use: a systematic review and meta-analysis. Blood. 2016;128(1):138-140.

19. Lipsky AH, Farooqui MZ, Tian X, et al. Incidence and risk factors of bleeding-related adverse events in patients with chronic lymphocytic leukemia treated with ibrutinib. Haematologica. 2015;100(12):1571-1578.

20. Brown JR, Moslehi J, O’Brien S, et al. Characterization of atrial fibrillation adverse events reported in ibrutinib randomized controlled registration trials. Haematologica. 2017;102(10):1796-1805.

Author and Disclosure Information

Kelli Rasmussen is a Senior Research Analyst at the University of Utah School of Medicine and the George E. Wahlen Veterans Affairs Medical Center (GEWVAMC) in Salt Lake City, Utah; Vikas Patil is a Senior Research Analyst at the University of Utah School of Medicine and GEWVAMC; Zachary Burningham is a Research Associate at GEWVAMC; Christina Yong is a Medical Writer at the University of Utah School of Medicine and GEWVAMC; Brian Sauer is an Associate Professor at the University of Utah School of Medicine and GEWVAMC; Ahmad Halwani is an Assistant Professor of Medicine at the Huntsman Cancer Institute, University of Utah and GEWVAMC.
Correspondence: Kelli M. Rasmussen (kelli.rasmussen@hsc.utah.edu)

Author disclosures
The authors report no actual or potential conflicts of interest with regard to this article.

Disclaimer
The opinions expressed herein are those of the author and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies. This article may discuss unlabeled or investigational use of certain drugs. Please review the complete prescribing information for specific drugs or drug combinations—including indications, contraindications, warnings, and adverse effects—before administering pharmacologic therapy to patients.


Chronic lymphocytic leukemia (CLL) is the most common leukemia diagnosed in developed countries, with an estimated 21,040 new diagnoses of CLL expected in the US in 2020. 1-3 CLL is an indolent cancer characterized by the accumulation of B-lymphocytes in the blood, marrow, and lymphoid tissues. 4 It has a heterogeneous clinical course; the majority of patients are observed or receive delayed treatment following diagnosis, while a minority of patients require immediate treatment. After first-line treatment, some patients experience prolonged remissions while others require retreatment within 1 or 2 years. Fortunately, advances in cancer biology and therapeutics in the last decade have increased the number of treatment options available for patients with CLL.

Until recently, most CLL treatments relied on a chemotherapy or a chemoimmunotherapy backbone; however, the last few years have seen novel therapies introduced, such as small molecule inhibitors to target molecular pathways that promote the normal development, expansion, and survival of B-cells.5 One such therapy is ibrutinib, a targeted Bruton tyrosine kinase inhibitor that received accelerated approval by the US Food and Drug Administration (FDA) in February 2014 for patients with CLL who received at least 1 prior therapy. The FDA later expanded this approval to include use of ibrutinib in patients with CLL with relapsed or refractory disease, with or without chromosome 17p deletion. In 2016, based on data from the RESONATE-17 study, the FDA approved ibrutinib for first-line therapy in patients with CLL.6

Ibrutinib’s efficacy, ease of administration and dosing (all doses are oral and fixed, rather than based on weight or body surface area), and relatively favorable safety profile have resulted in a rapid growth in its adoption.7 Since its adverse event (AE) profile is generally more tolerable than that of a typical chemoimmunotherapy, its use in older patients with CLL and patients with significant comorbidities is particularly appealing.8

However, the results of some clinical trials suggest an association between treatment with ibrutinib and an increased risk of bleeding-related events of any grade (44%) and major bleeding events (4%).7,8 A higher incidence of major bleeding events (9%) was reported in one clinical trial and its 5-year follow-up, although that trial did not exclude patients receiving concomitant oral anticoagulation with warfarin.6,9

Heterogeneity in clinical trials’ definitions of major bleeding confounded the ability to calculate bleeding risk in patients treated with ibrutinib in a systematic review and meta-analysis that called for more data.10 Additionally, patients with factors that might increase the risk of major bleeding with ibrutinib treatment were likely underrepresented in clinical trials, given the carefully selected nature of clinical trial subjects. These factors include renal or hepatic disease, gastrointestinal disease, and use of concomitant medications such as antiplatelet or anticoagulant agents. Accounting for use of the latter is particularly important because patients who develop atrial fibrillation (Afib), one of the recognized AEs of treatment with ibrutinib, often are treated with anticoagulant medications in order to decrease the risk of stroke or other thromboembolic complications.

A single-site observational study of patients treated with ibrutinib reported a high utilization rate of antiplatelet medications (70%), anticoagulant medications (17%), or both (13%), with a major bleeding rate of 18%.11 Prevalence of bleeding events seemed to be highly affected by the presence of concomitant medications: 78% of patients treated with ibrutinib while concurrently receiving both antiplatelet and anticoagulant medications developed a major bleeding event, while none of the patients who were not receiving antiplatelets, anticoagulants, or medications that interact with cytochrome P450 (a family of enzymes that metabolizes many chemotherapeutic agents) experienced a major bleeding event.11

The prevalence of major bleeding events, comorbidities, and utilization of medications that could increase the risk of major bleeding in patients with CLL receiving ibrutinib in the Veterans Health Administration (VHA) is not known. The VHA is the largest integrated health care system in the US. To address these knowledge gaps, a retrospective observational study was conducted using VHA data on demographics, comorbidities that could affect bleeding, use of anticoagulant and antiplatelet medications, and bleeding events in patients with CLL who were treated during the first year of ibrutinib availability.

The first year of ibrutinib availability was chosen for this study since we anticipated that many health care providers would be unfamiliar with ibrutinib during that time given its novelty, and would therefore be more likely to codispense ibrutinib with medications that could increase the risk of a bleeding event. Since Afib is both an AE associated with ibrutinib treatment and a condition that often is treated with anticoagulants, the prevalence of Afib in this population also was included. For context, the incidence of bleeding and Afib and the use of anticoagulant and antiplatelet medications during treatment in a cohort of patients with CLL treated with bendamustine + rituximab (BR) also were reported.

Methods

The VHA maintains the centralized US Department of Veterans Affairs Cancer Registry System (VACRS), with electronic medical record data and other sources captured in its Corporate Data Warehouse (CDW). The VHA CDW is a national repository comprising data from several VHA clinical and administrative systems. The CDW includes patient identifiers; demographics; vital status; laboratory information; administrative information (such as diagnostic International Statistical Classification of Diseases and Related Health Problems, Ninth Revision [ICD-9] codes); medication dispensation tables (such as outpatient fills); IV package information; and notes from radiology, pathology, outpatient visits, inpatient admission, discharge, and daily progress.

Registrars abstract all cancer cases within the VHA system (or cases diagnosed outside the VHA, if patients subsequently receive treatment in the VHA). It is estimated that VACRS captures 3% of cancer cases in the US.12 Like most registries, VACRS captures data such as diagnosis, age, gender, race, and vital status.

The study received approval from the University of Utah Institutional Review Board and used individual patient-level historical administrative, cancer registry, and electronic health care record data. Patients diagnosed and treated for CLL at the VHA from 2010 to 2014 were identified through the VACRS and CDW; patients with a prior malignancy were excluded. Patients who received ibrutinib or BR based on pharmacy dispensation information were selected. Patients were followed until December 31, 2016 or death; patients with documentation of another cancer or lack of utilization of the VHA hematology or oncology services (defined as absence of any hematology and/or oncology clinic visits for ≥ 18 months) were omitted from the final analysis (Figure).
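For illustration only, the cohort-selection and follow-up rules described above can be sketched as follows. The study analysis itself was performed in R (see Study Analysis); this Python/pandas sketch is not the authors' code, and the table names, column names, and dates are hypothetical stand-ins for the actual VACRS/CDW fields.

```python
import pandas as pd

# Hypothetical VACRS/CDW extract; all column names and values are illustrative only.
patients = pd.DataFrame({
    "patient_id": [1, 2, 3, 4],
    "cll_dx_date": pd.to_datetime(["2011-03-01", "2013-07-15", "2010-05-20", "2014-02-10"]),
    "prior_malignancy": [False, False, True, False],
    "first_drug": ["ibrutinib", "BR", "ibrutinib", "BR"],
    "death_date": pd.to_datetime([None, "2015-11-02", None, None]),
    "second_cancer_date": pd.to_datetime([None, None, None, "2016-06-01"]),
    "last_heme_onc_visit": pd.to_datetime(["2016-12-15", "2015-10-01", "2016-11-30", "2016-05-15"]),
})

STUDY_END = pd.Timestamp("2016-12-31")

# Inclusion: CLL diagnosed 2010-2014, no prior malignancy, received ibrutinib or BR.
cohort = patients[
    patients["cll_dx_date"].between("2010-01-01", "2014-12-31")
    & ~patients["prior_malignancy"]
    & patients["first_drug"].isin(["ibrutinib", "BR"])
].copy()

# Follow-up ends at the earliest of study end, death, secondary cancer, or last
# contact (a simplified proxy for the >= 18-month absence-of-care rule).
end = cohort[["death_date", "second_cancer_date", "last_heme_onc_visit"]].min(axis=1).fillna(STUDY_END)
cohort["end_of_followup"] = end.where(end <= STUDY_END, STUDY_END)

print(cohort[["patient_id", "first_drug", "end_of_followup"]])
```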



Previous and concomitant utilization of antiplatelet (aspirin, clopidogrel) or anticoagulant (dalteparin, enoxaparin, fondaparinux, heparin, rivaroxaban, and warfarin) medications was extracted 6 months before and after the first dispensation of ibrutinib or BR using pharmacy dispensation records.
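A sketch of this 6-month medication-window logic is shown below, again purely for illustration; the dispensation tables, column names, and dates are hypothetical, and the published analysis used R rather than Python.

```python
import pandas as pd

# Hypothetical dispensation records; drug names and columns are illustrative only.
index_rx = pd.DataFrame({
    "patient_id": [1, 2],
    "index_date": pd.to_datetime(["2014-03-01", "2014-05-10"]),  # first ibrutinib or BR fill
})
all_rx = pd.DataFrame({
    "patient_id": [1, 1, 2],
    "drug": ["aspirin", "warfarin", "clopidogrel"],
    "fill_date": pd.to_datetime(["2013-12-15", "2014-06-01", "2015-01-20"]),
})

ANTIPLATELETS = {"aspirin", "clopidogrel"}
ANTICOAGULANTS = {"dalteparin", "enoxaparin", "fondaparinux", "heparin", "rivaroxaban", "warfarin"}

merged = all_rx.merge(index_rx, on="patient_id")
offset = merged["fill_date"] - merged["index_date"]
window = pd.Timedelta(days=182)  # ~6 months

merged["prior_use"] = offset.between(-window, pd.Timedelta(0))
merged["concomitant_use"] = offset.between(pd.Timedelta(0), window)
merged["drug_class"] = merged["drug"].map(
    lambda d: "antiplatelet" if d in ANTIPLATELETS
    else "anticoagulant" if d in ANTICOAGULANTS
    else "other"
)

# Patient-level flags: any antiplatelet/anticoagulant fill in the 6 months
# before or after the index dispensation.
flags = (
    merged[merged["drug_class"] != "other"]
    .groupby(["patient_id", "drug_class"])[["prior_use", "concomitant_use"]]
    .any()
)
print(flags)
```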

Study Definitions

Prevalence of comorbidities that could increase bleeding risk was determined using administrative ICD-9-CM codes. Liver disease was identified by presence of cirrhosis, hepatitis C virus, or alcoholic liver disease using administrative codes validated by Kramer and colleagues, who reported positive and negative predictive values of 90% and 87% for cirrhosis, 93% and 92% for hepatitis C virus, and 71% and 98% for alcoholic liver disease.13 Similarly, end-stage liver disease was identified using a validated coding algorithm developed by Goldberg and colleagues, with a positive predictive value of 89.3%.14 The presence of controlled or uncontrolled diabetes mellitus (DM) was identified using the procedure described by Guzman and colleagues.15 Quan’s algorithm was used to calculate Charlson Comorbidity Index (CCI) based on ICD-9-CM codes for inpatient and outpatient visits within a 6-month lookback period prior to treatment initiation.16
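The comorbidity look-back logic can be illustrated with a toy Charlson-type calculation. The code-to-condition map and weights below are deliberately tiny placeholders; they are not the validated Quan ICD-9-CM algorithm (or the Kramer, Goldberg, or Guzman definitions) cited above, and the encounter data are invented.

```python
import pandas as pd

# Toy Charlson scoring over a 6-month lookback; the code map is a tiny placeholder,
# not the full Quan ICD-9-CM algorithm cited in the text.
CHARLSON_WEIGHTS = {"congestive_heart_failure": 1, "copd": 1, "renal_disease": 2}
CODE_TO_CONDITION = {"428.0": "congestive_heart_failure", "496": "copd", "585.9": "renal_disease"}

encounters = pd.DataFrame({
    "patient_id": [1, 1, 2],
    "icd9": ["428.0", "585.9", "496"],
    "visit_date": pd.to_datetime(["2014-01-10", "2013-01-01", "2014-04-01"]),
})
index_dates = pd.Series(
    pd.to_datetime(["2014-03-01", "2014-05-10"]), index=[1, 2], name="index_date"
)
lookback = pd.Timedelta(days=182)

enc = encounters.join(index_dates, on="patient_id")
in_lookback = enc["visit_date"].between(enc["index_date"] - lookback, enc["index_date"])
enc["condition"] = enc["icd9"].map(CODE_TO_CONDITION)

# Each condition counts once per patient; weights are summed to give the CCI.
cci = (
    enc[in_lookback & enc["condition"].notna()]
    .drop_duplicates(["patient_id", "condition"])
    .assign(weight=lambda d: d["condition"].map(CHARLSON_WEIGHTS))
    .groupby("patient_id")["weight"].sum()
)
print(cci)  # patients with no qualifying codes simply do not appear (CCI 0)
```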

A major bleeding event was defined as a hospitalization with an ICD-9-CM code suggestive of major bleeding as the primary reason, as defined by Lane and colleagues in their study of major bleeding related to warfarin in a cohort of patients treated within the VHA.17 Incidence rates of major bleeding events were identified during the first 6 months of treatment. Incidence of Afib—defined as an inpatient or outpatient encounter with the 427.31 ICD-9-CM code—also was examined within the first 6 months after starting treatment. The period of 6 months was chosen because bendamustine must be discontinued after 6 months.
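These event definitions can likewise be sketched in code. The Afib code (427.31) is taken from the text, but the major-bleeding code list here is only a placeholder for the Lane and colleagues definition, and all patient data are hypothetical.

```python
import pandas as pd

# Hypothetical encounter records. The Afib code (427.31) is from the text; the
# major-bleeding code list is a placeholder, not the Lane et al definition.
encounters = pd.DataFrame({
    "patient_id": [1, 1, 2],
    "icd9": ["427.31", "578.9", "427.31"],
    "primary_dx": [False, True, True],
    "inpatient": [False, True, False],
    "visit_date": pd.to_datetime(["2014-04-15", "2014-05-01", "2013-12-01"]),
})
index_dates = pd.Series(
    pd.to_datetime(["2014-03-01", "2014-05-10"]), index=[1, 2], name="index_date"
)

AFIB_CODE = "427.31"
MAJOR_BLEED_CODES = {"578.9", "431"}  # placeholder codes only
window = pd.Timedelta(days=182)       # ~6 months

enc = encounters.join(index_dates, on="patient_id")
first_6mo = enc["visit_date"].between(enc["index_date"], enc["index_date"] + window)
prior_6mo = (enc["visit_date"] >= enc["index_date"] - window) & (enc["visit_date"] < enc["index_date"])

# Major bleeding: hospitalization with a bleeding code as the primary diagnosis
# during the first 6 months of treatment.
major_bleed = (
    enc["icd9"].isin(MAJOR_BLEED_CODES) & enc["primary_dx"] & enc["inpatient"] & first_6mo
).groupby(enc["patient_id"]).any()

# Incident Afib: a 427.31 encounter in the first 6 months after treatment start,
# with no 427.31 encounter in the 6 months before treatment start.
afib_after = ((enc["icd9"] == AFIB_CODE) & first_6mo).groupby(enc["patient_id"]).any()
afib_before = ((enc["icd9"] == AFIB_CODE) & prior_6mo).groupby(enc["patient_id"]).any()

print(pd.DataFrame({"major_bleed": major_bleed, "incident_afib": afib_after & ~afib_before}))
```
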
Study Analysis

Descriptive statistics were used to examine patient demographics, disease characteristics, and treatment history from initial CLL diagnosis through the end of the study observation period. Categorical variables were summarized using frequencies and accompanying proportions, while continuous variables were summarized using means and standard deviations; 95% CIs were reported for both. Proportions and accompanying 95% CIs characterized treatment patterns, including line of therapy, comorbidities, and bleeding events. Treatment duration was described using the mean and accompanying 95% CI. Statistical tests were not conducted for comparisons among treatment groups. Patients were censored at the end of follow-up, defined as the earliest of the following: (1) end of the study observation period (December 31, 2016); (2) development of a secondary cancer; or (3) last day of contact, given absence of care within the VHA for ≥ 18 months (with care defined as a hematology and/or oncology visit with an associated note). Analysis was performed using R 3.4.0.
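As a concrete illustration of the interval estimates described above, a proportion with a normal-approximation 95% CI can be computed as in the sketch below. The actual analysis was performed in R, and the manuscript does not state which interval method was used, so the Wald interval here is only an assumed example.

```python
import math

def proportion_ci(events: int, n: int, z: float = 1.96):
    """Point estimate and normal-approximation (Wald) 95% CI for a proportion."""
    p = events / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Example: 14 major bleeding events among 172 ibrutinib-treated patients (see Results).
p, lo, hi = proportion_ci(14, 172)
print(f"{p:.1%} (95% CI, {lo:.1%}-{hi:.1%})")
```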

Results

Between 2010 and 2014, 2,796 patients were diagnosed and received care for CLL within the VHA. All 172 patients treated with ibrutinib during the inclusion period were selected; these patients were treated between January 1, 2014, and December 31, 2016, following ibrutinib’s approval in early 2014. An additional 291 patients who received BR were selected (Table). Reflecting the predominantly male population of the VHA, 282 (97%) BR patients and 167 (97%) ibrutinib patients were male. The median age at diagnosis was 67 years for BR patients and 69 years for ibrutinib patients. About 76% of patients who received ibrutinib and 82% of patients who received BR were non-Hispanic white; 17% and 14%, respectively, were African American.

Less than 10% of patients receiving either ibrutinib or BR had liver disease per the criteria used by Kramer and colleagues, or end-stage liver disease per the criteria developed by Goldberg and colleagues.13,14 About 5% of patients had a history of previous bleeding in the 6-month period prior to initiating either therapy. Mean CCI (excluding malignancy) score was 1.5 (range, 0-11) for the ibrutinib group and 2.1 (range, 0-9) for the BR group. About 16% of the ibrutinib group had controlled DM and fewer than 10% had uncontrolled DM, while 4% of patients in the BR group met the criteria for controlled DM and another 4% met the criteria for uncontrolled DM.

There was very low utilization of anticoagulant or antiplatelet medication prior to initiation of ibrutinib (2.9% and 2.3%, respectively) or BR (< 1% each). In the first 6 months after treatment initiation, about 8% of patients in both ibrutinib and BR cohorts received anticoagulant medication while antiplatelet utilization was < 5% in either group.

In the BR group, 8 patients (2.7%) experienced a major bleeding event, while 14 patients (8.1%) in the ibrutinib group experienced a major bleeding event (P = .008). While these numbers were too low to perform a formal statistical analysis of the association between clinical covariates and bleeding in either group, there did not seem to be an association between bleeding and liver disease or DM. In both the ibrutinib and BR groups, about 1 in 4 patients who experienced a bleeding event had had a prior bleeding event. Interestingly, while none of the patients who experienced a bleeding event while receiving BR were taking concomitant anticoagulant medication, 3 of the 14 patients who experienced a bleeding event in the ibrutinib group showed evidence of anticoagulant utilization. Finally, the incidence of Afib (defined as no evidence of Afib in the 6 months prior to treatment but evidence of Afib in the 6 months following treatment initiation) was 4% in the BR group and about 8% in the ibrutinib group (P = .003).

Discussion

To the authors’ knowledge, this study is the first to examine the real-world incidence of bleeding and Afib in veterans who received ibrutinib for CLL in the first year of its availability. The study found minimal use of anticoagulants and/or antiplatelet agents prior to receipt of first-line ibrutinib or BR, and very low use of these agents in the first 6 months following the initiation of first-line treatment. This finding suggests a high awareness among VA providers of the potential AEs of ibrutinib and chemotherapy and a careful selection of patients who lack risk factors for AEs.

Compared with patients treated with first-line BR, patients treated with first-line ibrutinib had moderately higher rates of major bleeding (8.1% vs 2.7%, P = .008) and Afib (10.5% vs 3%, P = .003). These results are concordant with previous findings examining the use of ibrutinib in patients with CLL.18-20

Limitations

The results of this study should be interpreted with caution, as some limitations must be considered. The study was conducted in the early days of ibrutinib adoption; since then, more patients have been treated with ibrutinib and for longer durations. As clinicians gain more familiarity with ibrutinib, and as additional novel therapeutics emerge, it is possible that the initial awareness about risks for possible AEs may diminish; patients with high comorbidity burdens and concomitant medications would be especially vulnerable in cases of reduced physician vigilance.

Another limitation of this study stems from the potential for dual system use among patients treated in the VHA. Concurrent or alternating use of multiple health care systems (VHA and private-sector facilities) may create gaps in the reconstruction of patient histories, resulting in missing data as patients transition among commercial insurance, Centers for Medicare and Medicaid Services coverage, and VHA care. As a result, the findings presented here do not reflect instances in which a patient experienced a bleeding event that was treated outside the VHA.

Problems with missing data also may occur due to incomplete extraction from the electronic health record; these issues were addressed by leveraging an understanding of the multiple data marts within the CDW environment to harmonize missing and/or erroneous information through use of other data marts when possible. Lastly, this research represents a population-level study of the VHA; thus, all findings are directly relevant to the VHA. The generalizability of the findings outside the VHA would depend on the characteristics of the external population.

Conclusion

Real-world evidence from a nationwide cohort of veterans with CLL treated with ibrutinib suggests that, while ibrutinib is associated with increased bleeding-related events and Afib, the risk is comparable to that reported in previous studies.18-20 These findings suggest that patients in real-world clinical care settings with higher comorbidity burdens may be at a slightly increased risk for bleeding events and Afib.

References

1. Scarfò L, Ferreri AJ, Ghia P. Chronic lymphocytic leukaemia. Crit Rev Oncol Hematol. 2016;104:169-182.

2. Devereux S, Cuthill K. Chronic lymphocytic leukaemia. Medicine (Baltimore). 2017;45(5):292-296.

3. American Cancer Society. Cancer facts & figures 2020. https://www.cancer.org/content/dam/cancer-org/research/cancer-facts-and-statistics/annual-cancer-facts-and-figures/2020/cancer-facts-and-figures-2020.pdf. Accessed April 24, 2020.

4. Kipps TJ, Stevenson FK, Wu CJ, et al. Chronic lymphocytic leukaemia. Nat Rev Dis Primers. 2017;3:16096.

5. Owen C, Assouline S, Kuruvilla J, Uchida C, Bellingham C, Sehn L. Novel therapies for chronic lymphocytic leukemia: a Canadian perspective. Clin Lymphoma Myeloma Leuk. 2015;15(11):627-634.e5.

6. O’Brien S, Jones JA, Coutre SE, et al. Ibrutinib for patients with relapsed or refractory chronic lymphocytic leukaemia with 17p deletion (RESONATE-17): a phase 2, open-label, multicentre study. Lancet Oncol. 2016;17(10):1409–1418.

7. Burger JA, Tedeschi A, Barr PM, et al; RESONATE-2 Investigators. Ibrutinib as initial therapy for patients with chronic lymphocytic leukemia. N Engl J Med. 2015;373(25):2425-2437.

8. Byrd JC, Furman RR, Coutre SE, et al. Targeting BTK with ibrutinib in relapsed chronic lymphocytic leukemia. N Engl J Med. 2013;369(1):32-42.

9. O’Brien S, Furman R, Coutre S, et al. Single-agent ibrutinib in treatment-naive and relapsed/refractory chronic lymphocytic leukemia: a 5-year experience. Blood. 2018;131(17):1910-1919.

10. Caron F, Leong DP, Hillis C, Fraser G, Siegal D. Current understanding of bleeding with ibrutinib use: a systematic review and meta-analysis. Blood Adv. 2017;1(12):772-778.

11. Kunk PR, Mock J, Devitt ME, Palkimas S, et al. Major bleeding with ibrutinib: more than expected. Blood. 2016;128(22):3229.

12. Zullig LL, Jackson GL, Dorn RA, et al. Cancer incidence among patients of the U.S. Veterans Affairs Health Care System. Mil Med. 2012;177(6):693-701.

13. Kramer JR, Davila JA, Miller ED, Richardson P, Giordano TP, El-Serag HB. The validity of viral hepatitis and chronic liver disease diagnoses in Veterans Affairs administrative databases. Aliment Pharmacol Ther. 2008;27(3):274-282.

14. Goldberg D, Lewis JD, Halpern SD, Weiner M, Lo Re V 3rd. Validation of three coding algorithms to identify patients with end-stage liver disease in an administrative database. Pharmacoepidemiol Drug Saf. 2012;21(7):765-769.

15. Guzman JZ, Iatridis JC, Skovrlj B, et al. Outcomes and complications of diabetes mellitus on patients undergoing degenerative lumbar spine surgery. Spine (Phila Pa 1976). 2014;39(19):1596-1604.

16. Quan H, Sundararajan V, Halfon P, et al. Coding algorithms for defining comorbidities in ICD-9-CM and ICD-10 administrative data. Med Care. 2005;43(11):1130-1139.

17. Lane MA, Zeringue A, McDonald JR. Serious bleeding events due to warfarin and antibiotic co-prescription in a cohort of veterans. Am J Med. 2014;127(7):657–663.e2.

18. Leong DP, Caron F, Hillis C, et al. The risk of atrial fibrillation with ibrutinib use: a systematic review and meta-analysis. Blood. 2016;128(1):138-140.

19. Lipsky AH, Farooqui MZ, Tian X, et al. Incidence and risk factors of bleeding-related adverse events in patients with chronic lymphocytic leukemia treated with ibrutinib. Haematologica. 2015;100(12):1571-1578.

20. Brown JR, Moslehi J, O’Brien S, et al. Characterization of atrial fibrillation adverse events reported in ibrutinib randomized controlled registration trials. Haematologica. 2017;102(10):1796-1805.



Radiotherapeutic Care of Patients With Stage IV Lung Cancer with Thoracic Symptoms in the Veterans Health Administration (FULL)


Lung cancer is the leading cause of cancer mortality both in the US and worldwide.1 Many patients diagnosed with lung cancer present with advanced disease with thoracic symptoms such as cough, hemoptysis, dyspnea, and chest pain.2-4 Palliative radiotherapy is routinely used in patients with locally advanced and metastatic lung cancer with the goal of relieving these symptoms and improving quality of life. Guidelines published by the American Society for Radiation Oncology (ASTRO) in 2011, and updated in 2018, provide recommendations on palliation of lung cancer with external beam radiotherapy (EBRT) and clarify the roles of concurrent chemotherapy and endobronchial brachytherapy (EBB) for palliation.5,6

After prostate cancer, lung cancer is the second most frequently diagnosed cancer in the Veterans Health Administration (VHA).7 The VHA consists of 172 medical centers and is the largest integrated health care system in the US. At the time of this study, 40 of these centers had onsite radiation facilities. The VHA Palliative Radiation Taskforce has conducted a series of surveys to evaluate use of palliative radiotherapy in the VHA, determine VHA practice concordance with ASTRO and American College of Radiology (ACR) guidelines, and direct educational efforts toward addressing gaps in knowledge. These efforts are directed at ensuring best practices throughout this large and heterogeneous health care system. In 2016, a survey was conducted to evaluate concordance of VHA radiation oncologist (RO) practice with the 2011 ASTRO guidelines on palliative thoracic radiotherapy for non-small cell lung cancer (NSCLC).

Methods

A survey instrument was generated by VHA National Palliative Radiotherapy Taskforce members. It was reviewed and approved for use by the VHA Patient Care Services office. In May of 2016, the online survey was sent to the 88 VHA ROs practicing at the 40 sites with onsite radiation facilities. The survey aimed to determine patterns of practice for palliation of thoracic symptoms secondary to lung cancer.

Demographic information obtained included years in practice, employment status, academic appointment, board certification, and familiarity with ASTRO lung cancer guidelines. Two clinical scenarios were presented to glean opinions on dose/fractionation schemes preferred, use of concurrent chemotherapy, and use of EBB and/or yttrium aluminum garnet (YAG) laser technology. Survey questions also assessed use of EBRT for palliation of hemoptysis, chest wall pain, and/or stridor as well as use of stereotactic body radiotherapy (SBRT) for palliation.

Survey results were assessed for concordance with published ASTRO guidelines. χ2 tests were run to test for associations of demographic factors, such as academic appointment, years of practice, full-time vs part-time employment, and familiarity with ASTRO palliative lung cancer guidelines, with use of EBRT for palliation, dose and fractionation preference, use of concurrent chemotherapy, and strategy for management of endobronchial lesions.
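For readers unfamiliar with this type of analysis, a χ2 test of association can be run on a contingency table of survey responses as in the sketch below; the counts shown are made up for illustration and do not come from the study data, and the statistical software used by the authors is not specified.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table of survey responses (counts are illustrative only):
# rows = familiar vs not familiar with the ASTRO guidelines,
# columns = prefers 20 Gy/5 fractions vs another regimen.
table = np.array([[25, 10],
                  [12, 7]])

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
```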

Results

Of the 88 physicians surveyed, 54 responded for a response rate of 61%. Respondents represented 37 of the 40 (93%) VHA radiation oncology departments (Table 1). Among respondents, most were board certified (96%), held academic appointments (91%), and were full-time employees (85%). Forty-four percent of respondents were in practice for > 20 years, 19% for 11 to 20 years, 20% for 6 to 10 years, and 17% for < 6 years. A majority reported familiarity with the ASTRO guidelines (64%), while just 11% reported no familiarity with the guidelines.

When asked about use of SBRT for palliation of hemoptysis, stridor, and/or chest pain, the majority (87%) preferred conventional EBRT. Of the 13% who reported use of SBRT, most (11%) performed it onsite, with 2% of respondents referring offsite to non-VHA centers for the service. When asked about use of EBB for palliation, only 2% reported use of that procedure at their facilities, while 26% reported referral to non-VHA facilities for EBB. The remaining 72% of respondents favor use of conventional EBRT.

Respondents were presented with a case of a male patient aged 70 years who smoked and had widely metastatic NSCLC, a life expectancy of about 3 months, and 10/10 chest wall pain from direct tumor invasion. All respondents recommended palliative radiotherapy. The preferred fractionation was 20 Gray (Gy) in 5 fractions, which was recommended by 69% of respondents. The remainder recommended 30 Gy in 10 fractions (22%) or a single fraction of 10 Gy (9%). No respondent recommended the longer fractionation options of 60 Gy in 30 fractions, 45 Gy in 15 fractions, or 40 Gy in 20 fractions. The majority (98%) did not recommend concurrent chemotherapy.

When the above case was modified for an endobronchial lesion requiring palliation with associated lung collapse, rather than chest wall invasion, 20 respondents (38%) reported they would refer for EBB, and 20 respondents reported they would refer for YAG laser. As > 1 answer could be selected for this question, there were 12 respondents who selected both EBB and YAG laser; 8 selected only EBB, and 8 selected only YAG laser. Many respondents added comments about treating with EBRT, which had not been presented as an answer choice. Nearly half of respondents (49%) were amenable to referral for the use of EBB or YAG laser for lung reexpansion prior to radiotherapy. Three respondents mentioned referral for an endobronchial stent prior to palliative radiotherapy to address this question.



χ2 tests were used to evaluate for significant associations of demographic factors, such as number of years in practice, academic appointment, full-time vs part-time status, and familiarity with ASTRO guidelines, with clinical management choices (Table 2). The χ2 analysis revealed that these demographic factors were not significantly associated with familiarity with ASTRO guidelines, offering SBRT for palliation, preferred EBRT fractionation scheme, use of concurrent chemotherapy, or use of EBB or YAG laser.

Discussion

This survey was conducted to evaluate concordance of management of metastatic lung cancer in the VHA with ASTRO guidelines. The relationship between respondents’ familiarity with the guidelines and responses also was evaluated to determine the impact such guidelines have on decision-making. The ASTRO guidelines for palliative thoracic radiation make recommendations regarding 3 issues: (1) radiation doses and fractionations for palliation; (2) the role of EBB; and (3) the use of concurrent chemotherapy.5,6

Radiation Dose and Fractionation for Palliation

A variety of dose/fractionation schemes are considered appropriate in the ASTRO guideline statement, including more prolonged courses such as 30 Gy/10 fractions as well as more hypofractionated regimens (ie, 20 Gy/5 fractions, 17 Gy/2 fractions, and a single fraction of 10 Gy). Higher dose regimens, such as 30 Gy/10 fractions, have been associated with prolonged survival, as well as increased toxicities such as radiation esophagitis.8 Therefore, the guidelines support use of 30 Gy/10 fractions for patients with good performance status while encouraging use of more hypofractionated regimens for patients with poor performance status. In considering more hypofractionated regimens, one must consider the possibility of adverse effects that can be associated with higher dose per fraction. For instance, 17 Gy/2 fractions has been associated with myelopathy; therefore it should be used with caution and careful treatment planning.9

For the survey case example (a male aged 70 years with a 3-month life expectancy who required palliation for chest wall pain), all respondents selected hypofractionated regimens; no respondent selected the more prolonged fractionations of 60 Gy/30 fractions, 45 Gy/15 fractions, or 40 Gy/20 fractions. These more prolonged fractionations are not endorsed by the guidelines in general, and particularly not for a patient with poor life expectancy. All regimens selected by survey respondents for this case are considered appropriate per the consensus guideline statement.

Role of Concurrent Chemotherapy

The ASTRO guidelines do not support use of concurrent chemotherapy for palliation of stage IV NSCLC.5,6 The 2018 updated guidelines established a role for concurrent chemotherapy for patients with stage III NSCLC with good performance status and life expectancy of > 3 months. This updated recommendation is based on data from 2 randomized trials demonstrating improvement in overall survival with the addition of chemotherapy for patients with stage III NSCLC undergoing palliative radiotherapy.10-12

These newer studies contrast with an older randomized study by Ball and colleagues, which demonstrated greater toxicity from concurrent chemotherapy with no improvement in outcomes such as palliation of symptoms, overall survival, or progression-free survival.13 Unlike the newer studies, which included only patients with stage III NSCLC, about half of the patients in the study by Ball and colleagues had known metastatic disease.10-13 Of note, staging for metastatic disease was not carried out routinely, so it is possible that a greater proportion of patients had metastatic disease that would have been seen on imaging. In concordance with the guidelines, 98% of the survey respondents did not recommend concurrent chemotherapy for palliation of intrathoracic symptoms; only 1 respondent recommended use of chemotherapy for palliation.

Role of Endobronchial Brachytherapy

EBB involves implantation of radioactive sources for treatment of endobronchial lesions causing obstructive symptoms.14 Given the lack of randomized data that demonstrate a benefit of EBB over EBRT, the ASTRO guidelines do not endorse routine use of EBB for initial palliative management.15,16 The ASTRO guidelines reference a Cochrane Review of 13 trials that concluded that EBRT alone is superior to EBB alone for initial palliation of symptoms from endobronchial NSCLC.17

Only 1 respondent reported that EBB was offered onsite at their facility. The majority of respondents (72%) preferred the use of conventional EBRT techniques, while 26% reported referring to non-VHA centers for EBB. Lack of incorporation of EBB into routine VHA practice likely reflects the unclear role of this technology based on the available literature and ASTRO guidelines. In the setting of a right lower lung collapse, more respondents (49%) would consider use of EBB or YAG laser technology for lung reexpansion prior to EBRT.

The ASTRO guidelines recommend that initial EBB in conjunction with EBRT be considered based on randomized data demonstrating significant improvement in lung reexpansion and in patient reported dyspnea with addition of EBB to EBRT over EBRT alone.18 However, the guidelines do not mandate the use of EBB in this situation. It is possible that targeted education regarding the role of EBB would improve knowledge of the potential benefit in the setting of lung collapse and increase the percentage of VHA ROs who would recommend this procedure.

Limitations

The study is limited by lack of generalizability of these findings to all ROs in the country. It is also possible that physician responses do not represent practice patterns with complete accuracy. The use of EBB varied among practitioners. Further study of this technology is necessary to clarify its role in the management of endobronchial obstructive symptoms and to determine whether efforts should be made to increase access to EBB within the VHA.

Conclusions

Most of the ROs who responded to our survey were cognizant of and compliant with current ASTRO guidelines on management of lung cancer. Furthermore, management choices were not associated with the respondents’ years in practice, academic appointment, full-time vs part-time status, or familiarity with ASTRO guidelines. This study is a nationwide survey of ROs in the VHA system that reflects the radiation-related care received by veterans with metastatic lung cancer. Responses were obtained from 93% of the 40 radiation oncology centers, so it is likely that the survey accurately represents the decision-making process at the majority of centers. It is possible that those who did not respond to the survey do not treat thoracic cases.

References

1. Torre LA, Bray F, Siegel RL, Ferlay J, Lortet-Tieulent J, Jemal A. Global cancer statistics, 2012. CA Cancer J Clin. 2015 65(2):87-108.

2. Kocher F, Hilbe W, Seeber A, et al. Longitudinal analysis of 2293 NSCLC patients: a comprehensive study from the TYROL registry. Lung Cancer. 2015;87(2):193-200.

3. Chute CG, Greenberg ER, Baron J, Korson R, Baker J, Yates J. Presenting conditions of 1539 population-based lung cancer patients by cell type and stage in New Hampshire and Vermont. Cancer. 1985;56(8):2107-2111.

4. Hyde L, Hyde Cl. Clinical manifestations of lung cancer. Chest. 1974;65(3):299-306.

5. Rodrigues G, Videtic GM, Sur R, et al. Palliative thoracic radiotherapy in lung cancer: An American Society for Radiation Oncology evidence-based clinical practice guideline. Pract Radiat Oncol. 2011;1(2):60-71.

6. Moeller B, Balagamwala EH, Chen A, et al. Palliative thoracic radiation therapy for non-small cell lung cancer: 2018 Update of an American Society for Radiation Oncology (ASTRO) Evidence-Based Guideline. Pract Radiat Oncol. 2018;8(4):245-250.

7. Zullig LL, Jackson GL, Dorn RA, et al. Cancer incidence among patients of the United States Veterans Affairs (VA) healthcare system. Mil Med. 2012;177(6):693-701.

8. Fairchild A, Harris K, Barnes E, et al. Palliative thoracic radiotherapy for lung cancer: a systematic review. J Clin Oncol. 2008;26(24):4001-4011.

9. A Medical Research Council (MRC) randomised trial of palliative radiotherapy with two fractions or a single fraction in patients with inoperable non-small-cell lung cancer (NSCLC) and poor performance status. Medical Research Council Lung Cancer Working Party. Br J Cancer. 1992;65(6):934-941.

10. Nawrocki S, Krzakowski M, Wasilewska-Tesluk E, et al. Concurrent chemotherapy and short course radiotherapy in patients with stage IIIA to IIIB non-small cell lung cancer not eligible for radical treatment: results of a randomized phase II study. J Thorac Oncol. 2010;5(8):1255-1262.

11. Strøm HH, Bremnes RM, Sundstrøm SH, Helbekkmo N, Fløtten O, Aasebø U. Concurrent palliative chemoradiation leads to survival and quality of life benefits in poor prognosis stage III non-small-cell lung cancer: a randomised trial by the Norwegian Lung Cancer Study Group. Br J Cancer. 2013;109(6):1467-1475.

12. Strøm HH, Bremnes RM, Sundstrøm SH, Helbekkmo N, Aasebø U. Poor prognosis patients with inoperable locally advanced NSCLC and large tumors benefit from palliative chemoradiotherapy: a subset analysis from a randomized clinical phase III trial. J Thorac Oncol. 2014;9(6):825-833.

13. Ball D, Smith J, Bishop J, et al. A phase III study of radiotherapy with and without continuous-infusion fluorouracil as palliation for non-small-cell lung cancer. Br J Cancer. 1997;75(5):690-697.

14. Stewart A, Parashar B, Patel M, et al. American Brachytherapy Society consensus guidelines for thoracic brachytherapy for lung cancer. Brachytherapy. 2016;15(1):1-11.

15. Sur R, Ahmed SN, Donde B, et al. Brachytherapy boost vs teletherapy boost in palliation of symptomatic, locally advanced non-small cell lung cancer: preliminary analysis of a randomized prospective study. J Brachytherapy Int. 2001;17(4):309-315.

16. Sur R, Donde B, Mohuiddin M, et al. Randomized prospective study on the role of high dose rate intraluminal brachytherapy (HDRILBT) in palliation of symptoms in advanced non-small cell lung cancer (NSCLC) treated with radiation alone. Int J Radiat Oncol Biol Phys. 2004;60(1):S205.

17. Ung YC, Yu E, Falkson C, et al. The role of high-dose-rate brachytherapy in the palliation of symptoms in patients with non-small cell lung cancer: a systematic review. Brachytherapy. 2006;5:189-202.

18. Langendijk H, de Jong J, Tjwa M, et al. External irradiation versus external irradiation plus endobronchial brachytherapy in inoperable non-small cell lung cancer: a prospective randomized study. Radiother Oncol. 2001;58(3):257-268.

Author and Disclosure Information

Ruchika Gutt is a Radiation Oncologist at the Washington DC VA Medical Center (VAMC). Sheetal Malhotra is an Endocrinologist at The Southeast Permanente Medical Group in Jonesboro, Georgia. Drew Moghanaki is a Radiation Oncologist at the Atlanta VAMC in Georgia. Alice Cheuk is a Radiation Oncologist at the James J. Peters VAMC in the Bronx, New York, and an Assistant Professor at Mount Sinai School of Medicine. Lori Hoffman-Hogg is National Program Manager for Prevention Policy at Veterans Health Administration National Center for Health Promotion and Disease Prevention in Durham, North Carolina. Maria Kelly and George Dawson are Radiation Oncologists at the New Jersey VA Health Care System in East Orange. Helen Fosmire is Deputy Chief of Staff at the Richard L. Roudebush VAMC in Indianapolis, Indiana.
Correspondence: Ruchika Gutt (ruchika.gutt@va.gov)

Author disclosures
The authors report no actual or potential conflicts of interest for this article.

Disclaimer
The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies.


Radiotherapeutic Care of Patients With Stage IV Lung Cancer with Thoracic Symptoms in the Veterans Health Administration

Lung cancer is the leading cause of cancer mortality both in the US and worldwide.1 Many patients diagnosed with lung cancer present with advanced disease and thoracic symptoms such as cough, hemoptysis, dyspnea, and chest pain.2-4 Palliative radiotherapy is routinely used in patients with locally advanced and metastatic lung cancer with the goal of relieving these symptoms and improving quality of life. Guidelines published by the American Society for Radiation Oncology (ASTRO) in 2011, and updated in 2018, provide recommendations on palliation of lung cancer with external beam radiotherapy (EBRT) and clarify the roles of concurrent chemotherapy and endobronchial brachytherapy (EBB) for palliation.5,6

After prostate cancer, lung cancer is the second most frequently diagnosed cancer in the Veterans Health Administration (VHA).7 The VHA consists of 172 medical centers and is the largest integrated health care system in the US. At the time of this study, 40 of these centers had onsite radiation facilities. The VHA Palliative Radiation Taskforce has conducted a series of surveys to evaluate use of palliative radiotherapy in the VHA, determine VHA practice concordance with ASTRO and American College of Radiology (ACR) guidelines, and direct educational efforts toward addressing gaps in knowledge. These efforts are directed at ensuring best practices throughout this large and heterogeneous health care system. In 2016, a survey was conducted to evaluate concordance of VHA radiation oncologist (RO) practice with the 2011 ASTRO guidelines on palliative thoracic radiotherapy for non-small cell lung cancer (NSCLC).


Methods

A survey instrument was generated by VHA National Palliative Radiotherapy Taskforce members. It was reviewed and approved for use by the VHA Patient Care Services office. In May of 2016, the online survey was sent to the 88 VHA ROs practicing at the 40 sites with onsite radiation facilities. The survey aimed to determine patterns of practice for palliation of thoracic symptoms secondary to lung cancer.

Demographic information obtained included years in practice, employment status, academic appointment, board certification, and familiarity with ASTRO lung cancer guidelines. Two clinical scenarios were presented to glean opinions on dose/fractionation schemes preferred, use of concurrent chemotherapy, and use of EBB and/or yttrium aluminum garnet (YAG) laser technology. Survey questions also assessed use of EBRT for palliation of hemoptysis, chest wall pain, and/or stridor as well as use of stereotactic body radiotherapy (SBRT) for palliation.

Survey results were assessed for concordance with published ASTRO guidelines. χ2 tests were used to assess associations between demographic factors (academic appointment, years in practice, full-time vs part-time employment, and familiarity with the ASTRO palliative lung cancer guidelines) and clinical practice measures: use of EBRT for palliation, dose and fractionation preference, use of concurrent chemotherapy, and strategy for management of endobronchial lesions.
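As a minimal illustrative sketch (not the authors' analysis code), a χ2 test of independence of this kind can be computed from a cross-tabulation of survey responses; the counts and category labels below are hypothetical.

# Hypothetical cross-tabulation: familiarity with the ASTRO guidelines (rows)
# by preferred palliative fractionation scheme (columns).
import numpy as np
from scipy.stats import chi2_contingency

counts = np.array([
    [25, 8, 2],   # familiar with guidelines: 20 Gy/5 fx, 30 Gy/10 fx, 10 Gy/1 fx
    [12, 4, 3],   # not familiar or unsure
])

chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, df = {dof}, P = {p_value:.3f}")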

Results

Of the 88 physicians surveyed, 54 responded for a response rate of 61%. Respondents represented 37 of the 40 (93%) VHA radiation oncology departments (Table 1). Among respondents, most were board certified (96%), held academic appointments (91%), and were full-time employees (85%). Forty-four percent of respondents were in practice for > 20 years, 19% for 11 to 20 years, 20% for 6 to 10 years, and 17% for < 6 years. A majority reported familiarity with the ASTRO guidelines (64%), while just 11% reported no familiarity with the guidelines.

When asked about use of SBRT for palliation of hemoptysis, stridor, and/or chest pain, the majority (87%) preferred conventional EBRT. Of the 13% who reported use of SBRT, most (11%) performed it onsite, with 2% of respondents referring offsite to non-VHA centers for the service. When asked about use of EBB for palliation, only 2% reported use of that procedure at their facilities, while 26% reported referral to non-VHA facilities for EBB. The remaining 72% of respondents favored use of conventional EBRT.

Respondents were presented with a case of a male patient aged 70 years who smoked and had widely metastatic NSCLC, a life expectancy of about 3 months, and 10/10 chest wall pain from direct tumor invasion. All respondents recommended palliative radiotherapy. The preferred fractionation was 20 Gray (Gy) in 5 fractions, which was recommended by 69% of respondents. The remainder recommended 30 Gy in 10 fractions (22%) or a single fraction of 10 Gy (9%). No respondent recommended the longer fractionation options of 60 Gy in 30 fractions, 45 Gy in 15 fractions, or 40 Gy in 20 fractions. The majority (98%) did not recommend concurrent chemotherapy.

When the above case was modified to describe an endobronchial lesion with associated lung collapse requiring palliation, rather than chest wall invasion, 20 respondents (38%) reported they would refer for EBB, and 20 reported they would refer for YAG laser. As > 1 answer could be selected for this question, 12 respondents selected both EBB and YAG laser, 8 selected only EBB, and 8 selected only YAG laser. Many respondents added comments about treating with EBRT, which had not been presented as an answer choice. Nearly half of respondents (49%) were amenable to referral for EBB or YAG laser for lung reexpansion prior to radiotherapy. In response to this question, 3 respondents mentioned referral for an endobronchial stent prior to palliative radiotherapy.



χ2 tests were used to evaluate for significant associations between demographic factors (number of years in practice, academic appointment, full-time vs part-time status, and familiarity with ASTRO guidelines) and clinical management choices (Table 2). The analysis revealed that years in practice, academic appointment, and employment status were not significantly associated with familiarity with the ASTRO guidelines, and that none of the demographic factors was significantly associated with offering SBRT for palliation, preferred EBRT fractionation scheme, use of concurrent chemotherapy, or use of EBB or YAG laser.


Discussion

This survey was conducted to evaluate concordance of management of metastatic lung cancer in the VHA with ASTRO guidelines. The relationship between respondents’ familiarity with the guidelines and responses also was evaluated to determine the impact such guidelines have on decision-making. The ASTRO guidelines for palliative thoracic radiation make recommendations regarding 3 issues: (1) radiation doses and fractionations for palliation; (2) the role of EBB; and (3) the use of concurrent chemotherapy.5,6

Radiation Dose and Fractionation for Palliation

A variety of dose/fractionation schemes are considered appropriate in the ASTRO guideline statement, including more prolonged courses such as 30 Gy/10 fractions as well as more hypofractionated regimens (ie, 20 Gy/5 fractions, 17 Gy/2 fractions, and a single fraction of 10 Gy). Higher dose regimens, such as 30 Gy/10 fractions, have been associated with prolonged survival as well as increased toxicities such as radiation esophagitis.8 Therefore, the guidelines support use of 30 Gy/10 fractions for patients with good performance status while encouraging more hypofractionated regimens for patients with poor performance status. When selecting a hypofractionated regimen, one must weigh the possibility of adverse effects associated with a higher dose per fraction. For instance, 17 Gy/2 fractions has been associated with myelopathy; therefore, it should be used with caution and careful treatment planning.9
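One way to compare these regimens quantitatively, offered here only as an illustrative aside and not as part of the guideline, is the biologically effective dose (BED) from the linear-quadratic model, BED = n x d x (1 + d / (α/β)), where n is the number of fractions, d the dose per fraction, and α/β a tissue-specific parameter (a value of 10 Gy is commonly assumed for tumor). A minimal sketch under that assumption:

# Illustrative only: BED comparison of the palliative regimens named above,
# assuming the linear-quadratic model with alpha/beta = 10 Gy for tumor.
def bed(n_fractions: int, dose_per_fraction: float, alpha_beta: float = 10.0) -> float:
    total_dose = n_fractions * dose_per_fraction
    return total_dose * (1 + dose_per_fraction / alpha_beta)

regimens = {"30 Gy/10 fx": (10, 3.0), "20 Gy/5 fx": (5, 4.0),
            "17 Gy/2 fx": (2, 8.5), "10 Gy/1 fx": (1, 10.0)}

for name, (n, d) in regimens.items():
    print(f"{name}: BED10 = {bed(n, d):.1f} Gy")  # approx 39, 28, 31, and 20 Gy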

For the survey case example (a male aged 70 years with a 3-month life expectancy who required palliation for chest wall pain), all respondents selected hypofractionated regimens; no respondent selected the more prolonged fractionations of 60 Gy/30 fractions, 45 Gy/15 fractions, or 40 Gy/20 fractions. These more prolonged fractionations are not endorsed by the guidelines in general, and particularly not for a patient with poor life expectancy. All of the regimens selected by survey respondents for this case are considered appropriate per the consensus guideline statement.

Role of Concurrent Chemotherapy

The ASTRO guidelines do not support use of concurrent chemotherapy for palliation of stage IV NSCLC.5,6 The 2018 updated guidelines established a role for concurrent chemotherapy for patients with stage III NSCLC with good performance status and life expectancy of > 3 months. This updated recommendation is based on data from 2 randomized trials demonstrating improvement in overall survival with the addition of chemotherapy for patients with stage III NSCLC undergoing palliative radiotherapy.10-12

These newer studies are in contrast to an older randomized study by Ball and colleagues that demonstrated greater toxicity from concurrent chemotherapy, with no improvement in outcomes such as palliation of symptoms, overall survival, or progression-free survival.13 In contrast to the newer studies that included only patients with stage III NSCLC, about half of the patients in the Ball and colleagues study had known metastatic disease.10-13 Of note, staging for metastatic disease was not carried out routinely, so it is possible that a greater proportion of patients had metastatic disease that would have been detected on imaging. In concordance with the guidelines, 98% of the survey respondents did not recommend concurrent chemotherapy for palliation of intrathoracic symptoms; only 1 respondent recommended use of chemotherapy for palliation.


Role of Endobronchial Brachytherapy

EBB involves implantation of radioactive sources for treatment of endobronchial lesions causing obstructive symptoms.14 Given the lack of randomized data that demonstrate a benefit of EBB over EBRT, the ASTRO guidelines do not endorse routine use of EBB for initial palliative management.15,16 The ASTRO guidelines reference a Cochrane Review of 13 trials that concluded that EBRT alone is superior to EBB alone for initial palliation of symptoms from endobronchial NSCLC.17

Of respondents surveyed, only 1 facility offered onsite EBB. The majority of respondents (72%) preferred the use of conventional EBRT techniques, while 26% referred to non-VHA centers for EBB. Lack of incorporation of EBB into routine VHA practice likely reflects the unclear role of this technology based on the available literature and ASTRO guidelines. In the setting of a right lower lung collapse, however, nearly half of respondents (49%) would consider use of EBB or YAG laser technology for lung reexpansion prior to EBRT.

The ASTRO guidelines recommend that initial EBB in conjunction with EBRT be considered based on randomized data demonstrating significant improvement in lung reexpansion and in patient reported dyspnea with addition of EBB to EBRT over EBRT alone.18 However, the guidelines do not mandate the use of EBB in this situation. It is possible that targeted education regarding the role of EBB would improve knowledge of the potential benefit in the setting of lung collapse and increase the percentage of VHA ROs who would recommend this procedure.

Limitations

This study is limited in that its findings may not be generalizable to all ROs in the country. It is also possible that physician responses do not represent actual practice patterns with complete accuracy. The use of EBB varied among practitioners. Further study of this technology is necessary to clarify its role in the management of endobronchial obstructive symptoms and to determine whether efforts should be made to increase access to EBB within the VHA.

Conclusions

Most of the ROs who responded to our survey were aware of and compliant with current ASTRO guidelines on management of lung cancer. Furthermore, management choices were not associated with the respondents’ years in practice, academic appointment, full-time vs part-time status, or familiarity with ASTRO guidelines. This study is a nationwide survey of ROs in the VHA system that reflects the radiation-related care received by veterans with metastatic lung cancer. Responses were obtained from 93% of the 40 radiation oncology centers, so it is likely that the survey accurately represents the decision-making process at the majority of centers. It is possible that those who did not respond to the survey do not treat thoracic cases.


References

1. Torre LA, Bray F, Siegel RL, Ferlay J, Lortet-Tieulent J, Jemal A. Global cancer statistics, 2012. CA Cancer J Clin. 2015;65(2):87-108.

2. Kocher F, Hilbe W, Seeber A, et al. Longitudinal analysis of 2293 NSCLC patients: a comprehensive study from the TYROL registry. Lung Cancer. 2015;87(2):193-200.

3. Chute CG, Greenberg ER, Baron J, Korson R, Baker J, Yates J. Presenting conditions of 1539 population-based lung cancer patients by cell type and stage in New Hampshire and Vermont. Cancer. 1985;56(8):2107-2111.

4. Hyde L, Hyde Cl. Clinical manifestations of lung cancer. Chest. 1974;65(3):299-306.

5. Rodrigues G, Videtic GM, Sur R, et al. Palliative thoracic radiotherapy in lung cancer: An American Society for Radiation Oncology evidence-based clinical practice guideline. Pract Radiat Oncol. 2011;1(2):60-71.

6. Moeller B, Balagamwala EH, Chen A, et al. Palliative thoracic radiation therapy for non-small cell lung cancer: 2018 Update of an American Society for Radiation Oncology (ASTRO) Evidence-Based Guideline. Pract Radiat Oncol. 2018;8(4):245-250.

7. Zullig LL, Jackson GL, Dorn RA, et al. Cancer incidence among patients of the United States Veterans Affairs (VA) healthcare system. Mil Med. 2012;177(6):693-701.

8. Fairchild A, Harris K, Barnes E, et al. Palliative thoracic radiotherapy for lung cancer: a systematic review. J Clin Oncol. 2008;26(24):4001-4011.

9. A Medical Research Council (MRC) randomised trial of palliative radiotherapy with two fractions or a single fraction in patients with inoperable non-small-cell lung cancer (NSCLC) and poor performance status. Medical Research Council Lung Cancer Working Party. Br J Cancer. 1992;65(6):934-941.

10. Nawrocki S, Krzakowski M, Wasilewska-Tesluk E, et al. Concurrent chemotherapy and short course radiotherapy in patients with stage IIIA to IIIB non-small cell lung cancer not eligible for radical treatment: results of a randomized phase II study. J Thorac Oncol. 2010;5(8):1255-1262.

11. Strøm HH, Bremnes RM, Sundstrøm SH, Helbekkmo N, Fløtten O, Aasebø U. Concurrent palliative chemoradiation leads to survival and quality of life benefits in poor prognosis stage III non-small-cell lung cancer: a randomised trial by the Norwegian Lung Cancer Study Group. Br J Cancer. 2013;109(6):1467-1475.

12. Strøm HH, Bremnes RM, Sundstrøm SH, Helbekkmo N, Aasebø U. Poor prognosis patients with inoperable locally advanced NSCLC and large tumors benefit from palliative chemoradiotherapy: a subset analysis from a randomized clinical phase III trial. J Thorac Oncol. 2014;9(6):825-833.

13. Ball D, Smith J, Bishop J, et al. A phase III study of radiotherapy with and without continuous-infusion fluorouracil as palliation for non-small-cell lung cancer. Br J Cancer. 1997;75(5):690-697.

14. Stewart A, Parashar B, Patel M, et al. American Brachytherapy Society consensus guidelines for thoracic brachytherapy for lung cancer. Brachytherapy. 2016;15(1):1-11.

15. Sur R, Ahmed SN, Donde B, et al. Brachytherapy boost vs teletherapy boost in palliation of symptomatic, locally advanced non-small cell lung cancer: preliminary analysis of a randomized prospective study. J Brachytherapy Int. 2001;17(4):309-315.

16. Sur R, Donde B, Mohuiddin M, et al. Randomized prospective study on the role of high dose rate intraluminal brachytherapy (HDRILBT) in palliation of symptoms in advanced non-small cell lung cancer (NSCLC) treated with radiation alone. Int J Radiat Oncol Biol Phys. 2004;60(1):S205.

17. Ung YC, Yu E, Falkson C, et al. The role of high-dose-rate brachytherapy in the palliation of symptoms in patients with non-small cell lung cancer: a systematic review. Brachytherapy. 2006;5:189-202.

18. Langendijk H, de Jong J, Tjwa M, et al. External irradiation versus external irradiation plus endobronchial brachytherapy in inoperable non-small cell lung cancer: a prospective randomized study. Radiother Oncol. 2001;58(3):257-268.



SARS-CoV-2 Seroprevalence Among Healthcare Workers by Job Function and Work Location in a New York Inner-City Hospital

Article Type
Changed
Tue, 04/27/2021 - 13:28
Display Headline
SARS-CoV-2 Seroprevalence Among Healthcare Workers by Job Function and Work Location in a New York Inner-City Hospital

SARS-CoV-2 has infected 141 million people worldwide and 31 million people in the United States as of April 20, 2021.1,2 The influx of hospital admissions and deaths has severely strained healthcare systems worldwide and placed healthcare workers (HCWs) at increased risk for acquiring COVID-19.3-5

Several studies have described the impact of COVID-19 on this heterogeneous group of HCWs. Shields et al reported a seroprevalence of 24.4% in HCWs at University Hospitals Birmingham (UK), with the highest rate, 34.5%, in housekeeping staff.6 Steensels et al reported a lower prevalence of 6.4% at a tertiary care center in Belgium, and showed no increased risk for HCWs when directly involved in clinical care.7 The authors attributed this to adequate use of personal protective equipment (PPE). Other studies have reported seroprevalences ranging from 1.6% to 18%.8-11 In the New York City (NYC) metro area, Jeremias et al reported a seroprevalence of 9.8% in HCWs and found no difference by job title or work location,12 whereas Moscola et al reported a seroprevalence of 13.7% and demonstrated a 3% increased risk for those working in service or maintenance.13 Antibody tests were conducted between March and April 2020 in all but two of these studies; testing in these two studies was performed between April 13 and June 23, 2020, with one reporting a seroprevalence of 6%11 and the other, 13.7%.13

NYC became the earliest pandemic epicenter in the United States following untracked transmission from ongoing circulation of SARS-CoV-2 in Europe.14 As a result, the COVID-19 surge in NYC commenced in March and largely subsided by the end of May 2020. Most HCW data reported to date do not reflect the situation at the end of the surge, and may underestimate true seroprevalence. We describe SARS-CoV-2 seroprevalence in HCWs in a large inner-city hospital in NYC, with antibody testing conducted from May 18 to June 26, 2020, at the subsidence of the surge. To further our understanding of occupational risk among different groups of HCWs, we examined associations of seroprevalence with HCWs’ job function and work location.

METHODS

This was a cross-sectional seroprevalence study conducted in the BronxCare Health System located in South and Central Bronx, an area that experienced one of the highest incidences of SARS-CoV-2 infections within NYC’s five boroughs.

HCWs were offered voluntary testing for serum antibodies to SARS-CoV-2 between May 18 and June 26, 2020. Testing occurred in the institution’s auditorium, a central and easily accessible location. Weekly emails were sent to all employees and department heads during the testing period, offering antibody testing and providing location and testing time information. The Elecsys Anti-SARS-CoV-2 (Roche) assay, a qualitative test for total antibodies, was used; the assay has a reported sensitivity of 97.1% 14 days after a positive SARS-CoV-2 RNA polymerase chain reaction (PCR) test result and a specificity of 100%.15
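For context, and purely as an illustrative calculation rather than part of the study, the reported sensitivity and specificity can be combined with an assumed prevalence of prior infection to estimate the assay's predictive values; with 100% specificity, false-positive results are expected to be negligible.

# Illustrative only: predictive values of a test with the reported sensitivity and
# specificity, at an assumed 30% prevalence of prior infection (an assumption).
sensitivity = 0.971
specificity = 1.000
prevalence = 0.30

true_pos = sensitivity * prevalence
false_pos = (1 - specificity) * (1 - prevalence)
false_neg = (1 - sensitivity) * prevalence
true_neg = specificity * (1 - prevalence)

ppv = true_pos / (true_pos + false_pos)  # 1.0 when specificity is 1.0
npv = true_neg / (true_neg + false_neg)  # about 0.99 under these assumptions
print(f"PPV = {ppv:.3f}, NPV = {npv:.3f}")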

Demographic and work-related information was abstracted from electronic medical records, including all comorbid conditions that affected 30 or more HCWs. Pulmonary diagnoses, including asthma and chronic obstructive pulmonary disease, were grouped as chronic lung disease, and cardiovascular diseases, including hypertension, as chronic heart disease. Personal identifiers and data were delinked upon completion of data abstraction. The study was approved by the hospital’s institutional review board.

Job Function and Work Location

HCWs were grouped by job function as follows: physicians; nurses (including physician assistants and nurse practitioners); allied HCW I (medical assistants, patient care, and electrocardiogram, radiology, and ear, nose and throat technicians); allied HCW II (social workers, dieticians and nutritionists, registration clerks and unit associates, physical and occupational therapists); nonclinical staff (patient transporters, housekeeping staff, and security staff); pharmacists; engineering; and administrative staff. Respiratory therapists were considered a separate group because their work placed them at high risk for respiratory diseases.

Work locations were as follows: clinics (including dental, outpatient, and satellite clinics), emergency departments (ED), inpatient units (including floors and intensive care units [ICU]), radiology suite, laboratory and pharmacy, and offices.
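A minimal sketch of how job titles and units might be collapsed into these analysis groups (the group labels mirror the text, but the mapping, titles, and helper function below are hypothetical, not the study's actual data dictionary):

# Hypothetical mapping from raw job titles and units to the analysis groups above.
JOB_GROUPS = {
    "attending physician": "physicians",
    "nurse practitioner": "nurses",
    "registered nurse": "nurses",
    "patient care technician": "allied HCW I",
    "social worker": "allied HCW II",
    "patient transporter": "nonclinical staff",
    "security officer": "nonclinical staff",
    "respiratory therapist": "respiratory therapists",
    "administrative assistant": "administrative staff",
}

WORK_LOCATIONS = {"dental clinic": "clinics", "medical icu": "inpatient units",
                  "adult ed": "emergency departments", "billing office": "offices"}

def classify(job_title: str, unit: str) -> tuple:
    """Return the (job function group, work location group) for one HCW record."""
    return (JOB_GROUPS.get(job_title.lower(), "other"),
            WORK_LOCATIONS.get(unit.lower(), "other"))

print(classify("Registered Nurse", "Medical ICU"))  # ('nurses', 'inpatient units')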

Statistical Analysis

Descriptive statistics were calculated using χ2 analyses. All demographic variables were tested against serology status (positive/negative). A binary logistic regression analysis was used to calculate odds ratios (ORs). Eight separate univariate unadjusted ORs were calculated by running each predictor variable against serology status (dependent variable), which included the six categorical variables—race, ethnicity, age, sex, body mass index (BMI), and prior SARS-CoV-2 PCR results—and the two main predictors—job function and work location. To obtain adjusted ORs, two final separate multivariable logistic regression analyses were executed including the six covariates listed. Due to high collinearity between job function and work location (χ2 = 3030.13, df = 35 [6 levels of work location – 1]*[8 levels of job function – 1]; P < .001), we included only one of the main predictors in each model. The regressions were specified such that the reference groups for the work location and job function variables were office work and administration, respectively. This choice was made based on the fact that their nonclinical functions do not confer an exposure risk in excess of that experienced by typical community populations. Sensitivity analyses were performed on the subset of HCWs whose address zip codes indicated residence within NYC to exclude the effect of different community seroprevalences in areas outside of NYC. The 95% CI for seroprevalence of antibodies within tested HCWs was estimated using the Clopper-Pearson binomial method.
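The pipeline described in this section can be sketched as follows; this is a hedged illustration only, and the file name and column names (seropositive, job_function, race, and so on) are assumptions rather than the study's actual variables.

# Illustrative sketch of the descriptive and regression analyses described above,
# assuming a DataFrame with one row per tested HCW and a 0/1 seropositive column.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.proportion import proportion_confint

df = pd.read_csv("hcw_serology.csv")  # hypothetical file

# Clopper-Pearson (exact binomial) 95% CI for overall seroprevalence.
n_pos, n_tested = int(df["seropositive"].sum()), len(df)
ci_low, ci_high = proportion_confint(n_pos, n_tested, alpha=0.05, method="beta")

# Multivariable logistic regression with job function (reference = administrative
# staff), adjusted for the six covariates; work location goes in a separate model
# because of its collinearity with job function.
model = smf.logit(
    "seropositive ~ C(job_function, Treatment('administrative')) "
    "+ C(race) + C(ethnicity) + C(age_group) + C(sex) + C(bmi_group) + C(prior_pcr)",
    data=df,
).fit()

adjusted_or = np.exp(model.params)   # adjusted odds ratios
or_ci = np.exp(model.conf_int())     # 95% CIs on the OR scale
print(f"Seroprevalence 95% CI: {ci_low:.3f}-{ci_high:.3f}")
print(adjusted_or.round(2))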

RESULTS

Among all HCWs in the institution (N = 4,807), 2,749 (57.2%) underwent voluntary testing. Of those who underwent testing, 831 were positive for antibodies to SARS-CoV-2 (Figure 1), a seroprevalence of 30.2% (95% CI, 29%-32%). Among the age groups, the 45-to-64-year group had the highest seropositivity at 33% (400/1,203), and those ≥75 years of age, the lowest at 16.7% (2/12) (P < .009).

Figure 1. Flow Diagram Showing Voluntary Testing Uptake and Results for Qualitative SARS-CoV-2 Antibody Testing
Data on race were available for 38.7% (1,064/2,749) of HCWs (Table); seropositivity was highest for Blacks (259/664, 39%) and lowest for Whites (36/163, 22.1%; P < .001). Certain comorbid conditions were associated with seropositivity (P = .001).

Table. Healthcare Workers’ Demographic, Comorbid, and Work Characteristics by SARS-CoV-2 Antibody Status

Among all tested HCWs, 70.1% (1,928/2,749) resided in NYC. SARS-CoV-2 seroprevalence in this subset was 32% (616/1,928) (Figure 1). Demographic and comorbid conditions in HCWs who lived in NYC were similar to those of the whole group (Appendix Table 1).

HCWs who underwent voluntary antibody testing (Appendix Table 2) had a higher percentage of persons in the 45-to-64-year age group (43.8% vs 40.9%) and a lower percentage of persons in the 65-to-74-year age group (3.3% vs 5.3%) compared with the group of HCWs that did not undergo testing (P < .001). Gender, race, ethnicity, comorbid conditions, SARS-CoV-2 PCR testing, and work locations were not different between groups. The tested group had higher proportions of clinicians (physicians, nurses, allied HCWs I and II) than the untested nonparticipant group (P = .014).

SARS-CoV-2 PCR Tests on HCWs

More than one-third (34.1%; 938/2,749) of HCWs had a documented nasopharyngeal PCR test between March 23 and June 26, 2020 (Table). Of all PCRs performed, 262 were positive, giving an overall PCR positivity rate of 27.9%. Positivity was 51.4% in March and 36.6% in April. The reasons for PCR testing were not available, but likely represent a combination of exposure-related testing among asymptomatic individuals and diagnostic testing of symptomatic HCWs. In contrast, serology testing was indicative of prior infection and yielded a cumulative seroprevalence at the end of the surge. Findings were similar among HCWs residing in NYC (Appendix Table 1).

Work Location and Job Function

Among all HCWs (Table, Figure 2), there were differences in seropositivity by work location (P = .001). The largest number of HCWs worked in inpatient units (1,348/2,749, 49%), and the second largest in offices (554/2,749, 20%). The highest seropositivity rate was in the EDs, at 36.4% (64/176), followed by radiology suites, at 32.7% (17/52); the seropositivity rate in office locations was 25.8% (143/554). Among HCWs residing in NYC (Appendix Table 1, Appendix Figure 1), the rank order according to proportion seropositive by work location was similar to that of the whole group (P = .004), except that the second highest seropositivity rate was in the inpatient units (33.9% [323/953]). In the group of HCWs residing in NYC, office locations had a seropositivity of 27.4% (102/372). The seropositivity rates for both groups working in office locations were slightly higher than the 22% community seroprevalence in NYC reported for the same period.16

Figure 2. Proportions Seropositive for SARS-CoV-2 Among All Tested Healthcare Workers by Job Function and Work Location

Among all HCWs, there were differences in seropositivity by job function (P = .001). The greatest proportion of HCWs were allied HCW II (23% [631/2,749]), followed by nurses (22.2% [611/2,749]) and physicians (21.3% [585/2,749]). Seropositivity was highest for nonclinical staff (44.0% [51/116]), followed by nurses (37.5% [229/611]) and allied clinical HCW I and II (34.5% [143/414] and 32.0% [202/631], respectively). It was lowest for administrative staff (20.9% [42/201]) and pharmacists (11.1% [5/45]). Among HCWs residing in NYC, the rank order according to proportion seropositive by location was similar to that of the whole group. Administrative staff seropositivity was 18.3% (20/109). Administrative staff seropositivity for both groups was marginally lower than the 22% community seroprevalence in NYC for the same period.16

Odds Ratios for SARS-CoV-2 Seropositivity

For all HCWs, in unadjusted models (Appendix Table 3), age 45 to 64 years and Black race were associated with increased odds of being seropositive (1.26; 95% CI, 1.07-1.49 and 2.26; 95% CI, 1.51-3.37, respectively). Increased odds were seen for HCWs working in the ED (1.64; 95% CI, 1.14-2.36) and inpatient units (1.35; 95% CI, 1.08-1.69), and decreased odds were seen for those working in the laboratory and pharmacy (0.47; 95% CI, 0.26-0.86). Increased odds for seropositivity were found for nurses (2.27; 95% CI, 1.56-3.31), allied HCW I (2.00; 95% CI, 1.34-2.97), allied HCW II (1.78; 95% CI, 1.22-2.60), and nonclinical staff (2.97; 95% CI,1.80-4.90).
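As an arithmetic check (not the authors' code), the unadjusted OR for nurses relative to administrative staff, the reference group, can be reproduced directly from the counts reported above (229/611 seropositive nurses vs 42/201 seropositive administrative staff) using the standard 2 x 2 formula with a log-based 95% CI.

# Unadjusted odds ratio and log-based 95% CI from the reported 2 x 2 counts.
import math

nurse_pos, nurse_neg = 229, 611 - 229   # nurses: seropositive, seronegative
admin_pos, admin_neg = 42, 201 - 42     # administrative staff (reference group)

odds_ratio = (nurse_pos * admin_neg) / (nurse_neg * admin_pos)
se_log_or = math.sqrt(1/nurse_pos + 1/nurse_neg + 1/admin_pos + 1/admin_neg)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI, {ci_low:.2f}-{ci_high:.2f})")
# Reproduces the reported unadjusted estimate of 2.27 (95% CI, 1.56-3.31).

The adjusted ORs in the next paragraph come from the multivariable logistic models and therefore cannot be reproduced from a single 2 x 2 table.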

After adjusting for all covariates, HCWs who were Black remained at increased odds for being seropositive in the two final models (adjusted OR, 2.29; 95% CI, 1.38-3.81 and adjusted OR, 2.94; 95% CI, 1.78-4.85), as did those who had a BMI >30 kg/m2, with an adjusted OR of 1.36 (95% CI, 1.05-1.77) in one of the final models (Appendix Table 3). None of the other comorbid conditions had increased ORs. Those who worked in the ED and inpatient units also remained at increased odds after adjusting for covariates (2.27; 95% CI, 1.53-3.37 and 1.48; 95% CI, 1.14-1.92, respectively; Figure 3). Other job functions that had increased odds for seropositivity were nurses (2.54; 95% CI, 1.64-3.94), allied HCW I (1.83; 95% CI, 1.15-2.89) and II (1.70; 95% CI, 1.10-2.63), and nonclinical staff (2.51; 95% CI, 1.42-4.43).

Figure 3. Association of Job Function and Work Location With Seropositivity Among All Tested Healthcare Workers

Having a positive PCR for SARS-CoV-2 on nasopharyngeal swabs was strongly associated with seropositivity (OR, 47.26; 95% CI, 29.30-76.23 and OR, 44.79; 95% CI, 27.87-72.00) in the two multivariate-adjusted models. These findings were confirmed when the analyses were performed on HCWs who resided in NYC (Appendix Table 4 and Appendix Figure 2).

DISCUSSION

In a large inner-city New York hospital, we report a cumulative SARS-CoV-2 seroprevalence of 30.2% in HCWs at the end of the first surge of SARS-CoV-2 infections in NYC. We identified the highest seropositivity rates for nonclinical staff and nurses, followed by allied HCWs, with the odds of being seropositive ranging from 1.7 to 2.5. The work locations with the highest seroprevalences were the ED and inpatient units, with 2.3-fold and 1.5-fold increased odds of seropositivity, respectively.

Serosurveillance studies have reported the trajectory of community prevalence in NYC over the first wave. A 6.3% prevalence was reported in samples collected between March 23 and April 1, 2020.17 In a study by Rosenberg et al18 with testing performed from April 9 through April 28, 2020, prevalence increased to 22.7%. Serosurveillance data from the NYC Department of Health show prevalence ranging from 20.1% to 23.3% (average 22%) during the study period.16 Compared to the estimated seroprevalence of 9.3% in the United States,19 these rates established NYC as an early epicenter for the COVID-19 pandemic, with our institution’s HCW seroprevalence considerably higher than NYC community serosurveillance rates, 2.2 times higher than reported in the earlier HCW study in the greater NYC area,13 and higher than the 27% rate during May 2020 recently reported in another NYC hospital.20

Data from studies of hospital transmission and effects of mitigation measures, such as a universal masking policy for HCWs and patients, clearly demonstrate the high effectiveness of these measures in reducing hospital transmissions.21,22 This suggests HCW seroprevalence in institutions with well-implemented infection control and universal masking policies may not be a consequence of workplace exposures, but rather may be reflective of community rates.23 Our institution’s response commenced February 3, 2020, with implementation of social distancing, a universal masking policy, transmission-based precautions, and use of fitted N95 masks. Mid-March, elective surgeries were canceled, and inpatient visitation suspended. During the surge, these measures were widely and consistently implemented for all categories of HCWs throughout the work environment, based on emerging guidelines from the Centers for Disease Control and Prevention (CDC) and NYC Department of Health. Our overall observed HCW seroprevalence, well above that of the community, with differences in categories of job function and work locations, is therefore an important finding. Our sample of 2,749 HCWs lived in NYC and its surrounding suburbs and nearby states. There is heterogeneity in community seroprevalence between areas outside of NYC and NYC (an epicenter) itself. We therefore analyzed our data in the subset with NYC zip codes, confirming a similar overall prevalence and increased odds of seropositivity in nurses, allied HCWs, and nonclinical staff.

Physicians, administrative staff, and those working in office locations had seropositivity rates of 18.1%, 20.9%, and 25.8%, respectively, consistent with community rates and illustrating the effectiveness of PPE in the hospital setting. Since PPE use was part of a universal policy applied to all HCWs in our institution, other possible reasons may explain the differences we found. We speculate that the close working relationship nurses have with their patients resulted in a longer duration and higher frequency of daily interactions, increasing the risk for transmission and causing breakthrough infections.24,25 This increased risk is reflected in a study in which 28% of hospitalized patients were nurses and 9% were certified nursing assistants.26

The CDC recently redefined close contact with someone with COVID-19 as a cumulative total of >15 minutes over 24 hours.25 Thus, multiple short periods of exposure can increase risk for infection with SARS-CoV-2; such exposure is characteristic of the job functions of nurses, nursing staff, and nonclinical staff. Further, housekeeping staff, patient transporters, and security officers are all nonclinical staff with significant and repeated exposures to COVID-19 patients during the surge and, for security officers, to continuous public traffic in and out of the hospital. SARS-CoV-2 spreads by virus shedding in large droplets and aerosols, with droplet nuclei <5 microns in size efficiently dispersed in air, an important additional mode of transmission.27-30 Airborne transmission and virus shedding by asymptomatic and presymptomatic persons, which has been shown to cause secondary attack rates of up to 32%, are other factors that likely contributed to the increased seroprevalence in this group.31 Our observation is consistent with the Birmingham study, which reported the highest rate in housekeeping staff, with a prevalence of 34.5%, compared to 44% in this study.6 Similar reasons for high seropositivity rates apply to the two groups of allied HCWs (eg, medical assistants and patient care technicians, social workers, nutritionists, and therapists), whose job functions place them in intermittent but significant proximity to inpatients and outpatients.

Consistent with public health data showing that minorities are disproportionately affected by this disease, we found that Black HCWs were three times more likely to be seropositive.32 However, an unexpected observation was the association between obesity and SARS-CoV-2 seropositivity. A possible explanation for this association may be inability to achieve optimal fit testing for N95 masks, thereby increasing the risk of exposure to droplet nuclei. This is important given that obesity is associated with poorer outcomes from COVID-19.

During the height of the first wave in NYC, EDs and inpatient units handled a large volume of COVID-19 patients with high PCR positivity rates (peak of 51% in March in our hospital). It was therefore not unexpected that we observed increased odds of seropositivity in these work locations. As ICUs were at capacity, inpatient units cared for critically ill patients they would not normally have managed. HCWs in these locations coped with an increased workload, increased demand on PPE supplies, and work fatigue, all of which contributed to increased risk for hospital-acquired SARS-CoV-2 infection.

Reporting seroprevalence at a single institution was a limitation of the study. Approximately 57% of the hospital’s total HCW population was tested for antibodies. It is possible their risk profile influenced their decision to volunteer for testing when it became available, introducing selection bias. A comparison between tested and untested HCWs showed similarity in all demographic measures, including nasopharyngeal PCR testing, except for age. We did not have information on symptoms that would prompt PCR testing. HCWs who underwent voluntary testing were younger compared to those who did not undergo testing. Current NYC serosurveillance data showed higher seropositivity in the 45-to-64–year age group (27.8%-28.6%) compared to the 65-to-74–year age group (24.3%), which suggests that the tested group may overestimate seroprevalence among HCWs relative to a randomly selected sample.33 Similarly, there were more nurses, allied HCWs, physicians, and administrative staff in the tested group, with the former two having higher SARS-CoV-2 seropositivity compared to community prevalence, which could also overestimate seroprevalence. Our large sample size provided us with the power to detect differences within several different job functions and work locations, a strength of this study. It was not possible to differentiate community- from hospital-acquired infection in our HCWs, a limitation in many observational HCW seroprevalence studies. However, when we analyzed data restricted only to HCWs in NYC, to reduce the effect of differing community prevalences outside the city, our results were unchanged. Since it is possible that nonclinical HCWs are of a lower socioeconomic status compared to others (nurses and allied HCWs), we cannot exclude the possibility that higher SARS-CoV-2 seroprevalence associated with lower status explains, partly or completely, the increased odds of seropositivity we observed.34 Due to the high proportion of missing data for race (61.3%), we advise caution in interpreting our finding that the odds of seropositivity were three times higher for Black race, even though consistent with prior literature.34 Healthcare organizations have similar job function and work location categories incorporated in their infrastructure, suggesting that our observations may be generalizable to other hospitals in the United States.

CONCLUSION

These findings show that during the first surge in NYC, with its increased burden of disease, hospitalizations, morbidity, and mortality, seroprevalences varied based on job function and work location within this institution. Nurses were at highest risk for SARS-CoV-2 infection, as were those who worked in the ED. In preparation for subsequent waves of SARS-CoV-2 and other highly contagious respiratory infections, major medical centers need to enhance efforts aimed at protecting HCWs, with particular attention to these groups. This study also strongly supports the recent CDC guideline prioritizing HCWs to receive COVID-19 mRNA and adenovirus vector vaccines that have obtained emergency use authorization by the US Food and Drug Administration.35

Acknowledgments

The authors thank all the residents, nurses, and staff of the Department of Family Medicine for their contribution to this work.

References

1. Liu YC, Kuo RL, Shih SR. COVID-19: The first documented coronavirus pandemic in history. Biomed J. 2020;43(4):328-333. https://doi.org/10.1016/j.bj.2020.04.007
2. World Health Organization. WHO coronavirus disease (COVID-19) dashboard. Accessed April 12, 2021. https://covid19.who.int
3. Nguyen LH, Drew DA, Graham MS, et al. Risk of COVID-19 among front-line health-care workers and the general community: a prospective cohort study. Lancet Public Health. 2020;5(9):e475-e483. https://doi.org/10.1016/S2468-2667(20)30164-X
4. Gupta S, Federman DG. Hospital preparedness for COVID-19 pandemic: experience from department of medicine at Veterans Affairs Connecticut Healthcare System. Postgrad Med. 2020:1-6. https://doi.org/10.1080/00325481.2020.1761668
5. Woolley K, Smith R, Arumugam S. Personal protective equipment (PPE) guidelines, adaptations and lessons during the COVID-19 pandemic. Ethics Med Public Health. 2020;14:100546. https://doi.org/10.1016/j.jemep.2020.100546
6. Shields A, Faustini SE, Perez-Toledo M, et al. SARS-CoV-2 seroprevalence and asymptomatic viral carriage in healthcare workers: a cross-sectional study. Thorax. 2020;75(12):1089-1094. https://doi.org/10.1136/thoraxjnl-2020-215414
7. Steensels D, Oris E, Coninx L, et al. Hospital-wide SARS-CoV-2 antibody screening in 3056 staff in a tertiary center in Belgium. JAMA. 2020;324(2):195-197. https://doi.org/10.1001/jama.2020.11160
8. Stubblefield WB, Talbot HK, Feldstein L, et al. Seroprevalence of SARS-CoV-2 among frontline healthcare personnel during the first month of caring for COVID-19 patients - Nashville, Tennessee. Clin Infect Dis. 2020. https://doi.org/10.1093/cid/ciaa936
9. Korth J, Wilde B, Dolff S, et al. SARS-CoV-2-specific antibody detection in healthcare workers in Germany with direct contact to COVID-19 patients. J Clin Virol. 2020;128:104437. https://doi.org/10.1016/j.jcv.2020.104437
10. Keeley AJ, Evans C, Colton H, et al. Roll-out of SARS-CoV-2 testing for healthcare workers at a large NHS Foundation Trust in the United Kingdom, March 2020. Euro Surveill. 2020;25(14). https://doi.org/10.2807/1560-7917.ES.2020.25.14.2000433
11. Self WH, Tenforde MW, Stubblefield WB, et al. Seroprevalence of SARS-CoV-2 among frontline health care personnel in a multistate hospital network - 13 academic medical centers, April-June 2020. MMWR Morb Mortal Wkly Rep. 2020;69(35):1221-1226. https://doi.org/10.15585/mmwr.mm6935e2
12. Jeremias A, Nguyen J, Levine J, et al. Prevalence of SARS-CoV-2 infection among health care workers in a tertiary community hospital. JAMA Intern Med. Published online August 11, 2020:e204214. https://doi.org/10.1001/jamainternmed.2020.4214
13. Moscola J, Sembajwe G, Jarrett M, et al. Prevalence of SARS-CoV-2 antibodies in health care personnel in the New York City area. JAMA. 2020;324(9):893-895. https://doi.org/10.1001/jama.2020.14765
14. Gonzalez-Reiche AS, Hernandez MM, Sullivan MJ, et al. Introductions and early spread of SARS-CoV-2 in the New York City area. Science. 2020;369(6501):297-301. https://doi.org/10.1126/science.abc1917
15. Lau CS, Hoo SF, Yew SF, et al. Evaluation of the Roche Elecsys Anti-SARS-CoV-2 assay. Preprint. Posted online June 29, 2020. Accessed November 8, 2020. https://www.medrxiv.org/content/10.1101/2020.06.28.20142232v1 https://doi.org/10.1101/2020.06.28.20142232
16. New York City Department of Health. COVID-19: Data. Long-term trends. Antibody testing. Accessed March 5, 2021. https://www1.nyc.gov/site/doh/covid/covid-19-data-trends.page#antibody
17. Havers FP, Reed C, Lim T, et al. Seroprevalence of antibodies to SARS-CoV-2 in 10 sites in the United States, March 23-May 12, 2020. JAMA Intern Med. Published online July 21, 2020. https://doi.org/10.1001/jamainternmed.2020.4130
18. Rosenberg ES, Tesoriero JM, Rosenthal EM, et al. Cumulative incidence and diagnosis of SARS-CoV-2 infection in New York. Ann Epidemiol. 2020;48:23-29.e4. https://doi.org/10.1016/j.annepidem.2020.06.004
19. Anand S, Montez-Rath M, Han J, et al. Prevalence of SARS-CoV-2 antibodies in a large nationwide sample of patients on dialysis in the USA: a cross-sectional study. Lancet. 2020;396(10259):1335-1344. https://doi.org/10.1016/S0140-6736(20)32009-2
20. Venugopal U, Jilani N, Rabah S, et al. SARS-CoV-2 seroprevalence among health care workers in a New York City hospital: a cross-sectional analysis during the COVID-19 pandemic. Int J Infect Dis. 2020;102:63-69. https://doi.org/10.1016/j.ijid.2020.10.036
21. Samaranayake LP, Fakhruddin KS, Ngo HC, Chang JWW, Panduwawala C. The effectiveness and efficacy of respiratory protective equipment (RPE) in dentistry and other health care settings: a systematic review. Acta Odontol Scand. 2020;78(8):626-639. https://doi.org/10.1080/00016357.2020.1810769
22. Seidelman JL, Lewis SS, Advani SD, et al. Universal masking is an effective strategy to flatten the severe acute respiratory coronavirus virus 2 (SARS-CoV-2) healthcare worker epidemiologic curve. Infect Control Hosp Epidemiol. 2020;41(12):1466-1467. https://doi.org/10.1017/ice.2020.313
23. Richterman A, Meyerowitz EA, Cevik M. Hospital-acquired SARS-CoV-2 infection: lessons for public health. JAMA. Published online November 13, 2020. https://doi.org/10.1001/jama.2020.21399
24. Degesys NF, Wang RC, Kwan E, Fahimi J, Noble JA, Raven MC. Correlation between n95 extended use and reuse and fit failure in an emergency department. JAMA. 2020;324(1):94-96. https://doi.org/10.1001/jama.2020.9843
25. Pringle JC, Leikauskas J, Ransom-Kelley S, et al. COVID-19 in a correctional facility employee following multiple brief exposures to persons with COVID-19 - Vermont, July-August 2020. MMWR Morb Mortal Wkly Rep. 2020;69(43):1569-1570. https://doi.org/10.15585/mmwr.mm6943e1
26. Kambhampati AK, O’Halloran AC, Whitaker M, et al. COVID-19-associated hospitalizations among health care personnel - COVID-NET, 13 states, March 1-May 31, 2020. MMWR Morb Mortal Wkly Rep. 2020;69(43):1576-1583. https://doi.org/10.15585/mmwr.mm6943e3
27. Zhang R, Li Y, Zhang AL, Wang Y, Molina MJ. Identifying airborne transmission as the dominant route for the spread of COVID-19. Proc Natl Acad Sci U S A. 2020;117(26):14857-14863. https://doi.org/10.1073/pnas.2009637117
28. Setti L, Passarini F, De Gennaro G, et al. Airborne transmission route of COVID-19: why 2 meters/6 feet of inter-personal distance could not be enough. Int J Environ Res Public Health. 2020;17(8):2932. https://doi.org/10.3390/ijerph17082932
29. Klompas M, Baker MA, Rhee C. Airborne transmission of SARS-CoV-2: theoretical considerations and available evidence. JAMA. 2020;324(5):441-442. https://doi.org/10.1001/jama.2020.12458
30. Bourouiba L. Turbulent gas clouds and respiratory pathogen emissions: potential implications for reducing transmission of COVID-19. JAMA. 2020;323(18):1837-1838. https://doi.org/10.1001/jama.2020.4756
31. Qiu X, Nergiz A, Maraolo A, Bogoch I, Low N, Cevik M. The role of asymptomatic and pre-symptomatic infection in SARS-CoV-2 transmission – a living systematic review. Clin Microbiol Infect. 2021;S1198-743X(21)00038-0. Published online January 20, 2021. https://doi.org/10.1016/j.cmi.2021.01.011
32. Price-Haywood EG, Burton J, Fort D, Seoane L. Hospitalization and mortality among black patients and white patients with Covid-19. N Engl J Med. 2020;382(26):2534-2543. https://doi.org/10.1056/NEJMsa2011686
33. New York City Department of Health. COVID-19: Data. Antibody testing by group - age. Accessed March 5, 2021. https://www1.nyc.gov/site/doh/covid/covid-19-data-totals.page#antibody
34. Patel JA, Nielsen FBH, Badiani AA, et al. Poverty, inequality and COVID-19: the forgotten vulnerable. Public Health. 2020;183:110-111. https://doi.org/10.1016/j.puhe.2020.05.006
35. Polack FP, Thomas SJ, Kitchin N, et al. Safety and efficacy of the BNT162b2 mRNA Covid-19 vaccine. N Engl J Med. 2020;383(27):2603-2615. https://doi.org/10.1056/NEJMoa2034577

Author and Disclosure Information

1Division of Pediatric Infectious Disease, Department of Pediatrics, BronxCare Health System, Bronx, NY; 2Department of Family Medicine, BronxCare Health System, Bronx, NY; 3Patient Care Services, Ambulatory Care, BronxCare Health System, Bronx, NY; 4Division of Adult Infectious Disease, Department of Medicine, BronxCare Health System, Bronx, NY; 5Rory Meyers College of Nursing, New York University, New York, NY.

Disclosures
Dr Purswani receives research grant support, unrelated to this work, from the National Institute of Child Health and Human Development as the clinical site principal investigator for the International Maternal Pediatric and Adolescent Clinical Trials Group (IMPAACT) and the Pediatric HIV/AIDS Cohort Study (PHACS). The other authors have nothing to disclose.

References

1. Liu YC, Kuo RL, Shih SR. COVID-19: The first documented coronavirus pandemic in history. Biomed J. 2020;43(4):328-333. https://doi.org/10.1016/j.bj.2020.04.007
2. World Health Organization. WHO coronavirus disease (COVID-19) dashboard. Accessed April 12, 2021. https://covid19.who.int
3. Nguyen LH, Drew DA, Graham MS, et al. Risk of COVID-19 among front-line health-care workers and the general community: a prospective cohort study. Lancet Public Health. 2020;5(9):e475-e483. https://doi.org/10.1016/S2468-2667(20)30164-X
4. Gupta S, Federman DG. Hospital preparedness for COVID-19 pandemic: experience from department of medicine at Veterans Affairs Connecticut Healthcare System. Postgrad Med. 2020:1-6. https://doi.org/10.1080/00325481.2020.1761668
5. Woolley K, Smith R, Arumugam S. Personal protective equipment (PPE) guidelines, adaptations and lessons during the COVID-19 pandemic. Ethics Med Public Health. 2020;14:100546. https://doi.org/10.1016/j.jemep.2020.100546
6. Shields A, Faustini SE, Perez-Toledo M, et al. SARS-CoV-2 seroprevalence and asymptomatic viral carriage in healthcare workers: a cross-sectional study. Thorax. 2020;75(12):1089-1094. https://doi.org/10.1136/thoraxjnl-2020-215414
7. Steensels D, Oris E, Coninx L, et al. Hospital-wide SARS-CoV-2 antibody screening in 3056 staff in a tertiary center in Belgium. JAMA. 2020;324(2):195-197. https://doi.org/10.1001/jama.2020.11160
8. Stubblefield WB, Talbot HK, Feldstein L, et al. Seroprevalence of SARS-CoV-2 Among frontline healthcare personnel during the first month of caring for COVID-19 patients - Nashville, Tennessee. Clin Infect Dis. 2020. https://doi.org/10.1093/cid/ciaa936
9. Korth J, Wilde B, Dolff S, et al. SARS-CoV-2-specific antibody detection in healthcare workers in Germany with direct contact to COVID-19 patients. J Clin Virol. 2020;128:104437. https://doi.org/10.1016/j.jcv.2020.104437
10. Keeley AJ, Evans C, Colton H, et al. Roll-out of SARS-CoV-2 testing for healthcare workers at a large NHS Foundation Trust in the United Kingdom, March 2020. Euro Surveill. 2020;25(14). https://doi.org/10.2807/1560-7917.ES.2020.25.14.2000433
11. Self WH, Tenforde MW, Stubblefield WB, et al. Seroprevalence of SARS-CoV-2 among frontline health care personnel in a multistate hospital network - 13 academic medical centers, April-June 2020. MMWR Morb Mortal Wkly Rep. 2020;69(35):1221-1226. https://doi.org/10.15585/mmwr.mm6935e2
12. Jeremias A, Nguyen J, Levine J, et al. Prevalence of SARS-CoV-2 infection among health care workers in a tertiary community hospital. JAMA Intern Med. 2020 Aug 11:e204214. https://doi.org/10.1001/jamainternmed.2020.4214
13. Moscola J, Sembajwe G, Jarrett M, et al. Prevalence of SARS-CoV-2 antibodies in health care personnel in the New York City area. JAMA. 2020;324(9):893-895. https://doi.org/10.1001/jama.2020.14765
14. Gonzalez-Reiche AS, Hernandez MM, Sullivan MJ, et al. Introductions and early spread of SARS-CoV-2 in the New York City area. Science. 2020;369(6501):297-301. https://doi.org/10.1126/science.abc1917
15. Lau CS, Hoo SF, Yew SF, et al. Evaluation of the Roche Elecsys Anti-SARS-CoV-2 assay. Preprint. Posted online June 29, 2020. Accessed November 8, 2020. https://www.medrxiv.org/content/10.1101/2020.06.28.20142232v1 https://doi.org/10.1101/2020.06.28.20142232
16. New York City Department of Health. Covid-19: data. long-term trends. Antibody testing. Accessed March 5, 2021. https://www1.nyc.gov/site/doh/covid/covid-19-data-trends.page#antibody
17. Havers FP, Reed C, Lim T, et al. Seroprevalence of antibodies to SARS-CoV-2 in 10 sites in the United States, March 23-May 12, 2020. JAMA Intern Med. Published online July 21, 2020. https://doi.org/10.1001/jamainternmed.2020.4130
18. Rosenberg ES, Tesoriero JM, Rosenthal EM, et al. Cumulative incidence and diagnosis of SARS-CoV-2 infection in New York. Ann Epidemiol. 2020;48:23-29.e4. https://doi.org/10.1016/j.annepidem.2020.06.004
19. Anand S, Montez-Rath M, Han J, et al. Prevalence of SARS-CoV-2 antibodies in a large nationwide sample of patients on dialysis in the USA: a cross-sectional study. Lancet. 2020;396(10259):1335-1344. https://doi.org/10.1016/S0140-6736(20)32009-2
20. Venugopal U, Jilani N, Rabah S, et al. SARS-CoV-2 seroprevalence among health care workers in a New York City hospital: a cross-sectional analysis during the COVID-19 pandemic. Int J Infect Dis. 2020;102:63-69. https://doi.org/10.1016/j.ijid.2020.10.036
21. Samaranayake LP, Fakhruddin KS, Ngo HC, Chang JWW, Panduwawala C. The effectiveness and efficacy of respiratory protective equipment (RPE) in dentistry and other health care settings: a systematic review. Acta Odontol Scand. 2020;78(8):626-639. https://doi.org/10.1080/00016357.2020.1810769
22. Seidelman JL, Lewis SS, Advani SD, et al. Universal masking is an effective strategy to flatten the severe acute respiratory coronavirus virus 2 (SARS-CoV-2) healthcare worker epidemiologic curve. Infect Control Hosp Epidemiol. 2020;41(12):1466-1467. https://doi.org/10.1017/ice.2020.313
23. Richterman A, Meyerowitz EA, Cevik M. Hospital-acquired SARS-CoV-2 infection: lessons for public health. JAMA. Published online November 13, 2020. https://doi.org/10.1001/jama.2020.21399
24. Degesys NF, Wang RC, Kwan E, Fahimi J, Noble JA, Raven MC. Correlation between n95 extended use and reuse and fit failure in an emergency department. JAMA. 2020;324(1):94-96. https://doi.org/10.1001/jama.2020.9843
25. Pringle JC, Leikauskas J, Ransom-Kelley S, et al. COVID-19 in a correctional facility employee following multiple brief exposures to persons with COVID-19 - Vermont, July-August 2020. MMWR Morb Mortal Wkly Rep. 2020;69(43):1569-1570. https://doi.org/10.15585/mmwr.mm6943e1
26. Kambhampati AK, O’Halloran AC, Whitaker M, et al. COVID-19-associated hospitalizations among health care personnel - COVID-NET, 13 states, March 1-May 31, 2020. MMWR Morb Mortal Wkly Rep. 2020;69(43):1576-1583. https://doi.org/10.15585/mmwr.mm6943e3
27. Zhang R, Li Y, Zhang AL, Wang Y, Molina MJ. Identifying airborne transmission as the dominant route for the spread of COVID-19. Proc Natl Acad Sci U S A. 2020;117(26):14857-14863. https://doi.org/10.1073/pnas.2009637117
28. Setti L, Passarini F, De Gennaro G, et al. Airborne transmission route of COVID-19: why 2 meters/6 feet of inter-personal distance could not be enough. Int J Environ Res Public Health. 2020;17(8):2932. https://doi.org/10.3390/ijerph17082932
29. Klompas M, Baker MA, Rhee C. Airborne transmission of SARS-CoV-2: theoretical considerations and available evidence. JAMA. 2020;324(5):441-442. https://doi.org/10.1001/jama.2020.12458
30. Bourouiba L. Turbulent gas clouds and respiratory pathogen emissions: potential implications for reducing transmission of COVID-19. JAMA. 2020;323(18):1837-1838. https://doi.org/10.1001/jama.2020.4756
31. Qiu X, Nergiz A, Maraolo A, Bogoch I, Low N, Cevik M. The role of asymptomatic and pre-symptomatic infection in SARS-CoV-2 transmission – a living systematic review. Clin Microbiol Infect. 2021;20:S1198-743X(21)00038-0. Published online January 20, 2021. https://doi.org/10.1016/j.cmi.2021.01.011
32. Price-Haywood EG, Burton J, Fort D, Seoane L. Hospitalization and mortality among black patients and white patients with Covid-19. N Engl J Med. 2020;382(26):2534-2543. https://doi.org/10.1056/NEJMsa2011686
33. New York City Department of Health. Covid-19: Data. Antibody testing by group - age. Accessed March 5, 2021. https://www1.nyc.gov/site/doh/covid/covid-19-data-totals.page#antibody
34. Patel JA, Nielsen FBH, Badiani AA, et al. Poverty, inequality and COVID-19: the forgotten vulnerable. Public Health. 2020;183:110-111. https://doi.org/10.1016/j.puhe.2020.05.006
35. Polack FP, Thomas SJ, Kitchin N, et al. Safety and efficacy of the BNT162b2 mRNA Covid-19 vaccine. N Engl J Med. 2020;383(27):2603-2615. https://doi.org/10.1056/NEJMoa2034577


Issue
Journal of Hospital Medicine 16(5)
Page Number
282-289. Published Online First April 20, 2021
Display Headline
SARS-CoV-2 Seroprevalence Among Healthcare Workers by Job Function and Work Location in a New York Inner-City Hospital
Article Source

© 2021 Society of Hospital Medicine

Correspondence Location
Murli U Purswani, MD; Email: mpurswan@bronxcare.org; Telephone: 718-960-1010. Twitter: @purswani_murli.

SARS-CoV-2 Seroprevalence Among Healthcare Workers by Workplace Exposure Risk in Kashmir, India

Article Type
Changed
Tue, 04/27/2021 - 10:21
Display Headline
SARS-CoV-2 Seroprevalence Among Healthcare Workers by Workplace Exposure Risk in Kashmir, India

India is emerging as one of the world’s largest hotspots for SARS-CoV-2 infection (COVID-19)—second only to the United States—with more than 13,000,000 documented infections since the first case was recorded on January 30, 2020.1,2 Kashmir, a northern territory of India, reported its first case of COVID-19 on March 18, 2020, from the central District Srinagar; this region has accounted for more cases of COVID-19 than any other district throughout the pandemic.3 The large majority of healthcare in District Srinagar is provided by three tertiary care institutions, one district hospital, two subdistrict hospitals, and 70 primary healthcare centers. Potential occupational exposures place healthcare workers (HCWs) at higher risk of acquiring SARS-CoV-2 infection, which in turn may serve as an important source of infection for their families and other community members.4-6 Given the high frequency and geographic variability of asymptomatic infection, growing evidence suggests this hidden reservoir is a source of infection for the general population.7,8

Many countries have started testing for antibodies against SARS-CoV-2, both at the population level and in specific groups, such as HCWs. Seroepidemiological studies are crucial to understanding the dynamics of SARS-CoV-2 infection. Many seroepidemiological studies have been conducted among community populations, but there are insufficient data on HCWs. The World Health Organization also encouraged its member states to conduct seroepidemiological studies to attain a better understanding of COVID-19 infection prevalence and distribution.9-11 Therefore, to quantify the prevalence of SARS-CoV-2 infection among HCWs, we conducted a seroepidemiological study by testing for SARS-CoV-2–specific immunoglobulin (IgG) to gain insight into the extent of infection among specific subgroups of HCWs and to identify risk-factor profiles associated with seropositivity.

METHODS

Study Design and Settings

We conducted this seroepidemiological study to ascertain the presence of IgG antibodies against SARS-CoV-2 among HCWs in the District Srinagar of Kashmir, India. The 2-week period of data collection began on June 15, 2020. As part of healthcare system pandemic preparedness efforts, India’s Ministry of Health provided specific guidelines for health facilities to manage COVID-19. Hospitals were categorized as dedicated COVID and non-COVID hospitals. Dedicated COVID hospitals provided comprehensive care exclusively to patients with COVID-19 and were equipped with fully functional intensive care units, ventilators, and beds with reliable access to oxygen support.12 In addition, infection prevention and control strategies to limit the transmission of SARS-CoV-2 infection were implemented according to guidelines specified by India’s National Center for Disease Control.13 To strengthen service provision, HCWs from other hospitals, including resident physicians, were relocated to these dedicated COVID hospitals. The additional staff were selected by administrative leadership, without input from HCWs.

Study Population and Data Collection

We approached administrative heads of the hospitals in District Srinagar for permission to conduct our study and to invite their HCWs to participate in the study. As Figure 1 shows, we were denied permission by the administrative heads of two tertiary care hospitals. Finally, with a point person serving as a study liaison at each institution, HCWs from three dedicated COVID and seven non-COVID tertiary care hospitals, two subdistrict hospitals, and six primary healthcare centers across the District Srinagar were invited to participate. The sample primary healthcare centers were each selected randomly, after stratification, from six major regions of the district. All frontline HCWs, including physicians, administrative and laboratory personnel, technicians, field workers involved in surveillance activity, and other supporting staff were eligible for the study.

Healthcare Facilities in District Srinagar and the Number of Hospitals and Facilities Selected for the Study

We collected information on an interview form using Epicollect5, a free data-gathering tool widely used in health research.14 Physicians specifically trained in the use of Epicollect5 conducted the face-to-face interview on a prespecified day and recorded the collected information through mobile phones. This information included the participants’ role in providing care to patients with COVID-19 and risk factors for SARS-CoV-2 infection (eg, history of travel since January 1, 2020, symptoms of an influenza-like illness [ILI] in the 4 weeks prior to the interview, close contact with a COVID-19 case). We defined close contact as an unmasked exposure within 6 feet of an infected individual for at least 15 minutes, irrespective of location (ie, community or the hospital).
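
To make the exposure criterion above concrete, the following minimal Python sketch encodes it as a predicate. It is purely illustrative; the parameter names are ours and do not correspond to fields on the study's interview form.

```python
# Toy encoding of the close-contact definition used during the interviews.
# Parameter names are illustrative, not the study's actual Epicollect5 fields.
def is_close_contact(distance_ft: float, minutes: float, masked: bool) -> bool:
    """Unmasked exposure within 6 feet of an infected individual for at least 15 minutes."""
    return (not masked) and distance_ft <= 6 and minutes >= 15

print(is_close_contact(4, 20, masked=False))  # True: unmasked, close, prolonged exposure
print(is_close_contact(4, 20, masked=True))   # False: masked exposure does not qualify
```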

Following the interview, trained phlebotomists collected 3 to 5 mL of venous blood under aseptic conditions. We strictly adhered to standard operating procedures during collection, transportation, and testing of blood samples. Following collection, the blood samples remained undisturbed for at least 30 minutes before centrifugation, which was performed at the collection site (or at the central laboratory for sites lacking the capability). The samples were then transported for further processing and testing through a cold chain supply line, using vaccine carriers with conditioned icepacks. All testing procedures were conducted with strict adherence to the manufacturers’ guidelines.

Laboratory Procedure

In accordance with the manufacturer’s recommendations, we used a chemiluminescent microparticle immunoassay to detect SARS-CoV-2–specific IgG antibodies in serum samples. The assay is an automated two-step immunoassay for the qualitative detection of IgG antibodies against the nucleocapsid of SARS-CoV-2 in human serum and plasma. The sensitivity and specificity of this test are 100% and 99%, respectively. The test result was considered positive for SARS-CoV-2 IgG if the index value was ≥1.4, consistent with guidance provided by the manufacturer.15
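
As a rough illustration of what these operating characteristics imply, the short Python sketch below applies the manufacturer's index cutoff and works through the positive predictive value using Bayes' rule. The 2.5% prevalence is taken from the Results; the function names and the worked number are ours and are not part of the study's analysis.

```python
# Illustrative only: applies the index cutoff described above and computes the
# predictive value implied by the reported sensitivity/specificity at an assumed
# 2.5% prevalence (the overall seroprevalence reported in the Results).

def classify_igg(index_value: float, cutoff: float = 1.4) -> str:
    """Qualitative SARS-CoV-2 IgG call from the assay index value."""
    return "positive" if index_value >= cutoff else "negative"

def positive_predictive_value(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Fraction of positive results that are true positives at a given prevalence."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

print(classify_igg(2.3))                                       # "positive"
print(round(positive_predictive_value(1.00, 0.99, 0.025), 2))  # approximately 0.72
```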

The IgG values were also entered into Epicollect5. Two trained medical interns independently entered the laboratory results in two separate forms. A third medical intern reviewed these forms for discrepancies, in response to which they referenced the source data for adjudication. The information gathered during the interview and the laboratory results were linked with the help of a unique identification number, which was generated at the time of the interview.
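
The double-entry and adjudication workflow described above can be summarized in a few lines of pandas; the sketch below is a hypothetical illustration with made-up identifiers and column names, not the study's actual data pipeline.

```python
# Hypothetical sketch of the double data-entry check: two independent entries are
# merged on the unique identification number, and disagreements are flagged for
# adjudication against the source laboratory record. Column names are illustrative.
import pandas as pd

entry_a = pd.DataFrame({"uid": [101, 102, 103], "igg_index": [0.2, 1.8, 0.5]})
entry_b = pd.DataFrame({"uid": [101, 102, 103], "igg_index": [0.2, 1.9, 0.5]})

merged = entry_a.merge(entry_b, on="uid", suffixes=("_a", "_b"))
discrepancies = merged[merged["igg_index_a"] != merged["igg_index_b"]]
print(discrepancies)  # uid 102 would be referred back to the source data
```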

Statistical Analysis

We estimated the proportion (and logit-transformed 95% CI) of HCWs with a positive SARS-CoV-2–specific IgG antibody level, the primary outcome of interest. We compared seroprevalence rates by gender, age group, specific occupational group, and type of health facility (dedicated COVID hospital vs non-COVID hospital). Seroprevalence was also estimated separately for HCWs who reported symptoms in the past 4 weeks, had a history of exposure to a known case of COVID-19, or had undergone testing by reverse transcriptase-polymerase chain reaction (RT-PCR). In the case of zero seroprevalences, Jeffreys 95% CIs were reported. We used a chi-square test to report two-sided P values for comparison of seroprevalence between groups. When the expected frequency was <5 in more than 20% of the cells, the exact test was used instead of the chi-square test. We additionally performed multivariable logistic regression analysis to evaluate the independent association between place of work (primary independent variable) and seropositivity (dependent variable). We adjusted for the following observable covariates by including them as categorical variables: age, gender, occupational group, and history of close contact with a patient who was COVID-positive. We performed data analysis using Stata, version 15.1 (StataCorp LP). The Institutional Ethics Committee of Government Medical College, Srinagar, approved the study (Reference No. 1003/ETH/GMC dated 13-05-2020). We obtained written, informed consent from all participants.
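
For readers who want to reproduce the interval estimates and group comparisons with open-source tools, the sketch below shows one way to do so in Python rather than Stata. The count of 73 seropositive participants out of 2,905 is an assumption consistent with the reported 2.5% overall seroprevalence (the exact count is not stated in the text), and the 2x2 table in the final step is hypothetical.

```python
# Python stand-in for the Stata analysis described above: a logit-scale 95% CI for a
# proportion, a Jeffreys interval for zero counts, and a chi-square test for a 2x2 table.
import numpy as np
from scipy import stats

def logit_ci(x: int, n: int, alpha: float = 0.05):
    """95% CI for a proportion computed on the logit scale and back-transformed."""
    p = x / n
    logit = np.log(p / (1 - p))
    se = np.sqrt(1 / (n * p * (1 - p)))      # delta-method SE of logit(p)
    z = stats.norm.ppf(1 - alpha / 2)
    expit = lambda t: 1 / (1 + np.exp(-t))
    return expit(logit - z * se), expit(logit + z * se)

def jeffreys_ci(x: int, n: int, alpha: float = 0.05):
    """Jeffreys interval, usable even when the observed count is zero."""
    return (stats.beta.ppf(alpha / 2, x + 0.5, n - x + 0.5),
            stats.beta.ppf(1 - alpha / 2, x + 0.5, n - x + 0.5))

# Assumed 73/2,905 positives, consistent with the reported 2.5% (95% CI, 2.0%-3.1%).
print([round(v, 3) for v in logit_ci(73, 2905)])
print([round(v, 4) for v in jeffreys_ci(0, 150)])   # zero-count example

# Hypothetical 2x2 table of [seropositive, seronegative] counts for two groups.
table = np.array([[6, 494], [66, 2339]])
chi2, p_value, dof, expected = stats.chi2_contingency(table)
print(round(p_value, 3))
```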

RESULTS

Of the 7,346 HCWs we were granted permission to approach, 2,915 (39.7%) agreed to participate in the study. The participation rate was 49% at the dedicated COVID hospitals (57% physicians and 47% nonphysicians) and 39% at the non-COVID hospitals (46% physicians and 36% nonphysicians). We analyzed information gathered from 2,905 HCWs (Epicollect5 interview forms were missing for nine participants, and the laboratory report was missing for one participant).

The mean age of the participants was 38.6 years, and 35.8% of participants identified as female (Table 1). One third (33.7%) of the participants were physicians, nearly half of whom were residents. In our sample, the overall seroprevalence of SARS-CoV-2–specific antibodies was 2.5% (95% CI, 2.0%-3.1%).

Seroprevalence of SARS-CoV-2–specific IgG Antibodies by Baseline Characteristics of Healthcare Workers
The distribution of the IgG index value among the study participants is shown in Figure 2.

Scatter Diagram Displaying Immunoglobulin G (IgG) Index Value of the Study Participants

Of the 2,905 participating HCWs, 123 (4.2%) reported an ILI (ie, fever and cough) in the 4 weeks preceding the interview, and 339 (11.7%) reported close contact with a person with COVID-19 (Table 2). A total of 760 (26.2%) HCWs had undergone RT-PCR testing, 29 (3.8%) of whom had a positive result. When stratified by workplace, a history of nasopharyngeal RT-PCR positivity was reported by 4 of 77 (5.1%) participants from dedicated COVID hospitals compared with 3.7% of participants from non-COVID hospitals (P = .528).

Seroprevalence of SARS-CoV-2–specific IgG Antibodies by Clinical Characteristics and Specific Risk Factors

As Table 2 also demonstrates, we found a significantly higher seropositivity rate among HCWs who had a history of ILI (P < .001), a history of a positive RT-PCR result (P < .001), a history of ever having been quarantined (P = .009), and a self-reported history of close contact with a person with COVID-19 (P = .014). Healthcare workers who had ever worked at a dedicated COVID hospital had a significantly lower seroprevalence of infection (P = .004).

Among HCWs who reported no ILI symptoms in the 4 weeks prior to the interview but who had a positive RT-PCR test, 20.8% were seropositive. Of HCWs who reported both an ILI and a positive RT-PCR test result, 60.0% were seropositive. Compared with HCWs employed at non-COVID hospitals, those working in dedicated COVID hospitals had lower multivariable-adjusted odds of seropositivity (odds ratio, 0.21; 95% CI, 0.06-0.66).
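
A hedged sketch of the kind of multivariable model behind this adjusted estimate is shown below, using the statsmodels formula interface on synthetic data. The variable names are ours, and the fitted odds ratios are meaningless here because the simulated outcome is independent of the covariates; the sketch only illustrates the modeling step described in the Methods.

```python
# Illustrative multivariable logistic regression in the spirit of the Methods section,
# fit on synthetic data. Variable names are ours; the estimates carry no meaning.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 3000
df = pd.DataFrame({
    "seropositive": rng.binomial(1, 0.05, n),
    "covid_hospital": rng.binomial(1, 0.3, n),
    "age_group": rng.choice(["<35", "35-50", ">50"], n),
    "gender": rng.choice(["female", "male"], n),
    "occupation": rng.choice(["physician", "nurse", "support"], n),
    "close_contact": rng.binomial(1, 0.12, n),
})

model = smf.logit(
    "seropositive ~ covid_hospital + C(age_group) + C(gender) + C(occupation) + close_contact",
    data=df,
).fit(disp=False)

# Exponentiated coefficients are adjusted odds ratios; exponentiated confidence
# limits give the corresponding 95% CIs.
summary = pd.DataFrame({
    "OR": np.exp(model.params),
    "ci_low": np.exp(model.conf_int()[0]),
    "ci_high": np.exp(model.conf_int()[1]),
})
print(summary.round(2))
```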

DISCUSSION

We aimed to estimate the seroprevalence of SARS-CoV-2 infection in HCWs in different hospital settings in the District Srinagar of Kashmir, India. In general, seroprevalence was low (2.5%), with little difference across gender or occupational group.

Seroprevalence studies of HCWs across divergent workplace environments have revealed estimates ranging from 1% to 10.2%.16-19 In general, seroprevalence among HCWs has not differed significantly from that of the general population, suggesting that the transmission dynamics of COVID-19 in healthcare settings differ from those of other nosocomial infections. The low seroprevalence observed in our study coincides with the overall low infection rate in the community. During the study period, District Srinagar reported a median of 28 new infections daily (interquartile range, 17-46), consistent with the early phase of the pandemic in this population at the time of the study.20

Among the HCW occupational groups, ambulance drivers and housekeeping staff had the highest seroprevalence rates, followed by nurses and physicians. Possible explanations for higher seropositivity in these groups are improper use or inadequate supply of protective gear and lack of training on the use of personal protective equipment (PPE), resulting in increased exposure risk.21 Concordance of HCW and community infection rates in specific geographic areas suggests that community exposure may be the dominant source of healthcare exposure and infection. Additionally, careful in-hospital behavior of HCWs in dedicated COVID hospitals may have had a spillover effect on their out-of-hospital behavior, which may partially explain our finding that employment at dedicated COVID hospitals was associated with a markedly lower chance of seropositivity. A study of 6,510 HCWs in Chicago, Illinois, showed high seropositivity rates among support service workers, medical assistants, and nurses, with nurses identified as having a markedly higher adjusted odds of seropositivity relative to administrators. The authors of the study concluded that exposure in the community setting plays a crucial role in transmission among HCWs.22 Similarly, higher seroprevalence among housekeeping, nonadministrative staff, and other support service staff has been reported elsewhere.23 Certain underlying factors related to socioeconomic status and lifestyle may also contribute to higher seroprevalence in some occupational groups.24 Nonadherence to masking, social distancing, and proper hand hygiene outside the hospital setting could result in community-acquired infection.

Interestingly, participants who were working in a dedicated COVID hospital or who had ever worked at one had a seroprevalence of 0.6%, much lower than the 2.8% observed among other participants. This difference remained statistically significant after controlling for age, sex, place of work, and occupational group. As these facilities were dedicated to the management and care of patients with COVID-19, the hospital staff strictly adhered to safety precautions, with particular vigilance during patient contact. These hospitals also strictly adhered to infection prevention and control practices based on the latest guidelines released by India’s Ministry of Health and Family Welfare.13

A commitment was made to provide adequate PPE to the dedicated COVID hospitals and their staff, commensurate with expected infected patient volumes and associated exposure risks. Healthcare workers were specifically trained on proper donning and doffing of PPE, self-health monitoring, and protocols for reporting symptoms and PPE breaches during patient encounters. Healthcare workers were regularly tested for COVID-19 using nasopharyngeal RT-PCR. Of critical importance, these hospitals implemented a buddy system wherein a team of two or more staff members was responsible for ensuring each other's safety, proper PPE use, conformance to other protective measures, and reporting of breaches in PPE compliance.25 Universal masking was mandatory for all hospital staff and patients at the COVID-focused facilities, with the additional use of N-95 masks, gloves, and face shields during patient contact. Administrative measures, including visitor restrictions and environmental sanitation, were rigorously enforced. Because these facilities were considered high-risk areas for transmission of infection, staffing was also rationed to reduce each worker's duration of exposure. Finally, HCWs were provided with separate living accommodations during the period in which they were employed at a dedicated COVID hospital.

In contrast, at non-COVID hospitals, masking was required only of HCWs; patients and hospital visitors were not subject to a masking policy. Moreover, an adequate and timely supply of PPE was not prioritized at the non-COVID facilities due to resource constraints. Further, lack of testing of asymptomatic patients at non-COVID hospitals may have resulted in nosocomial transmission from asymptomatic carriers. Though routine infection prevention and control activities were performed at non-COVID hospitals, we did not assess adherence to infection prevention and control guidelines in the two different categories of hospitals. Our results are also supported by evidence from studies conducted in different hospital settings, the findings of which reiterate the importance of fundamental principles of prevention (eg, proper masking, hand hygiene, and distancing) and are of particular importance in resource-limited settings.17,26,27 The only published study quantifying seroprevalence among HCWs in India was performed in a single hospital setting with separate COVID and non-COVID units. The authors of that study reported a higher seroprevalence among HCWs in the COVID unit. However, this difference seems to be confounded by other factors, as revealed by the multivariable analysis.23

We found a two-fold higher seroprevalence (4.4%) in HCWs who reported close contact with a patient with COVID-19. Respiratory infections pose a greater health risk to HCWs in an occupational setting. Substantial evidence has emerged demonstrating that the respiratory system is the dominant route of SARS-CoV-2 transmission, with proximity and ventilation as key predictive factors.28 Globally, among thousands of HCWs infected with SARS-CoV-2, one of the leading risk factors identified was close contact with a patient with COVID-19; other identified risk factors were lack of PPE, poor infection prevention and control practices, work overload, and a preexisting health condition.29

The seroprevalence estimate among participants who reported an ILI in the 4 weeks preceding the interview was only 12.2%, suggesting an alternative etiology of these symptoms. Among those who reported a previously positive RT-PCR for SARS-CoV-2, only 27.6% showed the presence of SARS-CoV-2–specific IgG antibodies. The inability to mount an antibody-mediated immune response or early conversion to seronegative status during the convalescence phase has been suggested as an explanation for such discordant findings.30 Conversely, seropositivity among participants who reported having a negative RT-PCR test was 1.9%. There are a few plausible explanations for such observations. First, several studies have reported false-negative result rates from RT-PCR testing ranging from 2% to 29%.31-33 Second, the sensitivity of the SARS-CoV-2 assay is influenced by the timing of the test after the onset of symptoms or RT-PCR positivity. The sensitivity of the assay we used varies from 53.1% at day 7 to 100% at day 17 postinfection.34 Variable viral load and differences in duration of viral shedding are other possible reasons for false-negative RT-PCR results.35,36

In our study, seroconversion among asymptomatic HCWs who were RT-PCR-positive was 20.8%. Among HCWs who reported an ILI and were RT-PCR-positive, seropositivity was 60%. In one study, 40% of asymptomatic and 13% of symptomatic patients who tested positive for COVID-19 became seronegative after initial seropositivity—that is, 8 weeks after hospital discharge.37

Serological testing offers insight into both the exposure history and residual COVID-19 susceptibility of HCWs. However, current immunological knowledge does not allow us to conclude that seropositivity conveys high-level immunity against reinfection. As the epidemic evolves, HCWs will continue to be exposed to COVID-19 in the community and the workplace. Serial cross-sectional serosurveys can help monitor the progression of the pandemic within the healthcare setting and guide hospital authorities in resource allocation.

Strengths and Limitations

We used the Abbott Architect SARS-CoV-2 IgG assay, which has exhibited a high level of consistency and performance characteristics when tested in different patient populations. The participation rate was acceptable compared to similar studies, and we included all the major hospitals in the District Srinagar. The findings from our study can therefore be considered representative of the HCWs in the district.

The study results should be interpreted in the context of the following limitations. First, information on risk factors for seropositivity was based on participant self-report. Also, we did not collect information on the timing of symptoms or the date on which a participant became RT-PCR-positive. Second, information regarding place of exposure (ie, community or hospital setting) was not recorded, limiting conclusions regarding the effect of workplace exposures. Third, given the voluntary nature of participation in the study, there is a possibility of selection bias that may have limited the generalizability of our findings. For example, some HCWs with a recent exposure to COVID-19 or those who were symptomatic at the time of the study might not have participated based on the absence of an individual benefit from IgG testing in the early phase of infection. Conversely, some HCWs who had symptoms in the distant past might have been more likely to have participated in the study. However, we believe that selection bias does not vitiate the validity of the associations, based on the plausible assumption that infection risk should be similar between respondents and nonrespondents due to comparable work environments. Finally, with a cross-sectional study design, we could not ascertain reversion from an initially positive to a negative IgG status, which would require a cohort study.

CONCLUSION

We conclude that the seroprevalence of SARS-CoV-2 infection was low among HCWs of District Srinagar at the time of the study. Healthcare workers in a dedicated COVID hospital, or HCWs who had ever worked in such a facility, had lower seroprevalence, suggesting both adherence to and effectiveness of standard protective measures during contact with patients who had COVID-19. In addition, the careful in-hospital behavior of the HCWs at the COVID hospitals may have had a spillover effect on their out-of-hospital behaviors, further reducing their risk of community-acquired infection. Conversely, lack of testing of asymptomatic patients at non-COVID hospitals may have resulted in nosocomial transmission from asymptomatic carriers. We believe that our findings highlight the value of implementing infection prevention and control measures in the hospital setting. Moreover, training and retraining of sanitation and other housekeeping staff on standard hygienic practices and appropriate use of protective gear may further help reduce their rates of exposure.

Acknowledgments

The authors thank Principal and Dean of the Government Medical College, Srinagar, Professor Samia Rashid, and District Commissioner, Srinagar, Shahid Iqbal Chowdhary for their support. We also acknowledge the support rendered by the Directorate of Health Services, Kashmir; Chief Medical Officer Srinagar; Block Medical Officers; and Zonal Medical Officers of District Srinagar, Kashmir, and extend our appreciation to the medical interns for their efforts in data collection, and to laboratory in-charge Gulzar Ahmad Wani, PhD scholar, Biochemistry, and his staff, who were involved in this study. Finally, we thank the study participants for their understanding of the importance of this study and for their time and participation.

Data availability statement

Data shall be made available on request through the corresponding author.

References

1. Ministry of Health & Family Welfare. Government of India. Accessed January 11, 2021. https://www.mohfw.gov.in/
2. COVID19 India. Accessed January 11, 2021. https://www.covid19india.org/
3. Government of Jammu & Kashmir. Department of Information & Public Relations. Bulletin on Novel Corona Virus (COVID-19). Accessed January 11, 2021. http://new.jkdirinf.in/NewsDescription.aspx?ID=66598
4. Black JRM, Bailey C, Przewrocka J, Dijkstra KK, Swanton C. COVID-19: the case for health-care worker screening to prevent hospital transmission. Lancet. 2020;395(10234):1418-1420. https://doi.org/10.1016/s0140-6736(20)30917-x
5. Nguyen LH, Drew DA, Graham MS, et al; Coronavirus Pandemic Epidemiology Consortium. Risk of COVID-19 among front-line health-care workers and the general community: a prospective cohort study. Lancet Public Heal. 2020;5(9):e475-e483. https://doi.org/10.1016/s2468-2667(20)30164-x
6. The Lancet. COVID-19: protecting health-care workers. Lancet. 2020;395(10228):922. https://doi.org/10.1016/s0140-6736(20)30644-9
7. Byambasuren O, Cardona M, Bell K, Clark J, McLaws M-L, Glasziou P. Estimating the extent of asymptomatic COVID-19 and its potential for community transmission: systematic review and meta-analysis. Off J Assoc Med Microbiol Infect Dis Canada. 2020;5(4):223-234. https://doi.org/10.3138/jammi-2020-0030
8. Rosenbaum L. Facing Covid-19 in Italy—ethics, logistics, and therapeutics on the epidemic’s front line. N Engl J Med. 2020;382(20):1873-1875. https://doi.org/10.1056/nejmp2005492
9. World Health Organization. The Unity Studies: WHO Sero-epidemiological Investigations Protocols. Accessed January 11, 2021. https://www.who.int/emergencies/diseases/novel-coronavirus-2019/technical-guidance/early-investigations
10. Pollán M, Pérez-Gómez B, Pastor-Barriuso R, et al; ENE-COVID Study Group. Prevalence of SARS-CoV-2 in Spain (ENE-COVID): a nationwide, population-based seroepidemiological study. Lancet. 2020;396(10250):535-544. https://doi.org/10.1016/s0140-6736(20)31483-5
11. Folgueira MD, Muñoz-Ruipérez C, Alonso-López MA, Delgado R. SARS-CoV-2 infection in health care workers in a large public hospital in Madrid, Spain, during March 2020. MedRxiv Web site. Published April 27, 2020. Accessed March 9, 2021. https://doi.org/10.1101/2020.04.07.20055723
12. Ministry of Health & Family Welfare, Directorate General of Health Services, EMR Division. Guidance document on appropriate management of suspect/confirmed cases of COVID-19. Accessed January 11, 2021. https://www.mohfw.gov.in/pdf/FinalGuidanceonMangaementofCovidcasesversion2.pdf
13. Ministry of Health & Family Welfare, Government of India. National guidelines for infection prevention and control in healthcare facilities. Accessed January 11, 2021. https://main.mohfw.gov.in/sites/default/files/National%20Guidelines%20for%20IPC%20in%20HCF%20-%20final%281%29.pdf
14. Epicollect5. Accessed January 11, 2021. https://five.epicollect.net/
15. SARS-CoV-2 Immunoassay. Abbott Core Laboratory. Accessed January 11, 2021. https://www.corelaboratory.abbott/us/en/offerings/segments/infectious-disease/sars-cov-2
16. Bendavid E, Mulaney B, Sood N, et al. COVID-19 antibody seroprevalence in Santa Clara County, California. medRxiv. Published online April 30, 2020. Accessed March 9, 2021. https://doi.org/10.1101/2020.04.14.20062463
17. Korth J, Wilde B, Dolff S, et al. SARS-CoV-2-specific antibody detection in healthcare workers in Germany with direct contact to COVID-19 patients. J Clin Virol. 2020;128:104437. https://doi.org/10.1016/j.jcv.2020.104437
18. Steensels D, Oris E, Coninx L, et al. Hospital-wide SARS-CoV-2 antibody screening in 3056 staff in a tertiary center in Belgium. JAMA. 2020;324(2):195-197. https://doi.org/10.1001/jama.2020.11160
19. Behrens GMN, Cossmann A, Stankov M V., et al. Perceived versus proven SARS-CoV-2-specific immune responses in health-care professionals. Infection. 2020;48(4):631-634. https://doi.org/10.1007/s15010-020-01461-0
20. COVID-19 Kashmir Tracker. Accessed January 11, 2021. https://covidkashmir.org/statistics
21. World Health Organization. Rational use of personal protective equipment for coronavirus disease (COVID-19) and considerations during severe shortages. Published December 23, 2020. Accessed January 11, 2021. https://www.who.int/publications/i/item/rational-use-of-personal-protective-equipment-for-coronavirus-disease-(covid-19)-and-considerations-during-severe-shortages
22. Wilkins JT, Gray EL, Wallia A, et al. Seroprevalence and correlates of SARS-CoV-2 antibodies in health care workers in Chicago. Open Forum Infect Dis. 2020;8(1):ofaa582. https://doi.org/10.1093/ofid/ofaa582
23. Goenka M, Afzalpurkar S, Goenka U, et al. Seroprevalence of COVID-19 amongst health care workers in a tertiary care hospital of a metropolitan city from India. J Assoc Physicians India. 2020;68(11):14-19. https://doi.org/10.2139/ssrn.3689618
24. Mutambudzi M, Niedwiedz C, Macdonald EB, et al. Occupation and risk of severe COVID-19: prospective cohort study of 120 075 UK Biobank participants. Occup Environ Med. 2020;oemed-2020-106731. https://doi.org/10.1136/oemed-2020-106731
25. Ministry of Health & Family Welfare, Directorate General of Health Services, EMR Division. Advisory for managing health care workers working in COVID and Non-COVID areas of the hospital. Accessed January 12, 2021. https://cdnbbsr.s3waas.gov.in/s3850af92f8d9903e7a4e0559a98ecc857/uploads/2020/06/2020061949.pdf
26. Rhee C, Baker M, Vaidya V, et al; CDC Prevention Epicenters Program. Incidence of nosocomial COVID-19 in patients hospitalized at a large US academic medical center. JAMA Netw Open. 2020;3(9):e2020498. https://doi.org/10.1001/jamanetworkopen.2020.20498
27. Seidelman J, Lewis SS, Advani SD, et al. Universal masking is an effective strategy to flatten the severe acute respiratory coronavirus virus 2 (SARS-CoV-2) healthcare worker epidemiologic curve. Infect Control Hosp Epidemiol. 2020;41(12):1466-1467. https://doi.org/10.1017/ice.2020.313
28. Meyerowitz EA, Richterman A, Gandhi RT, Sax PE. Transmission of SARS-CoV-2: a review of viral, host, and environmental factors. Ann Intern Med. 2020;174(1):69-79. https://doi.org/10.7326/m20-5008
29. Mhango M, Dzobo M, Chitungo I, Dzinamarira T. COVID-19 risk factors among health workers: a rapid review. Saf Health Work. 2020;11(3):262-265. https://doi.org/10.1016/j.shaw.2020.06.001
30. European Centre for Disease Prevention and Control. Immune responses and immunity to SARS-CoV-2. Updated June 30, 2020. Accessed January 12, 2021. https://www.ecdc.europa.eu/en/covid-19/latest-evidence/immune-responses
31. Arevalo-Rodriguez I, Buitrago-Garcia D, Simancas-Racines D, et al. False-negative results of initial RT-PCR assays for COVID-19: a systematic review. PLoS One. 2020;15(12):e0242958. https://doi.org/10.1371/journal.pone.0242958
32. Ai T, Yang Z, Hou H, et al. Correlation of chest CT and RT-PCR testing for coronavirus disease 2019 (COVID-19) in China: a report of 1014 cases. Radiology. 2020;296(2):E32-E40. https://doi.org/10.1148/radiol.2020200642
33. Woloshin S, Patel N, Kesselheim AS. False negative tests for SARS-CoV-2 infection — challenges and implications. N Engl J Med. 2020;383(6):e38. https://doi.org/10.1056/nejmp2015897
34. Bryan A, Pepper G, Wener MH, et al. Performance characteristics of the Abbott Architect SARS-CoV-2 IgG assay and seroprevalence in Boise, Idaho. J Clin Microbiol. 2020;58(8):e00941. https://doi.org/10.1128/jcm.00941-20
35. Long Q-X, Liu B-Z, Deng H-J, et al. Antibody responses to SARS-CoV-2 in patients with COVID-19. Nat Med. 2020;26(6):845-848. https://doi.org/10.1038/s41591-020-0897-1
36. Tahamtan A, Ardebili A. Real-time RT-PCR in COVID-19 detection: issues affecting the results. Expert Rev Mol Diagn. 2020;20(5):453-454. https://doi.org/10.1080/14737159.2020.1757437
37. Long Q-X, Tang X-J, Shi Q-L, et al. Clinical and immunological assessment of asymptomatic SARS-CoV-2 infections. Nat Med. 2020;26(8):1200-1204. https://doi.org/10.1038/s41591-020-0965-6

Author and Disclosure Information

1Department of Community Medicine, Government Medical College, Srinagar, Kashmir, India; 2Department of Biochemistry, Government Medical College, Srinagar, Kashmir, India.

Disclosures
The authors declare no conflicts of interest.

Funding
The study received mainly institutional funding from Government Medical College, Srinagar with support from the District Disaster Management Authority Srinagar. The funding bodies had no role in the design, collection, analysis, interpretation, or writing of the manuscript.

Issue
Journal of Hospital Medicine 16(5)
Page Number
274-281. Published Online First April 20, 2021

India is emerging as one of the world’s largest hotspots for SARS-CoV-2 infection (COVID-19)—second only to the United States—with more than 13,000,000 documented infections since the first case was recorded on January 30, 2020.1,2 Kashmir, a northern territory of India, reported its first case of COVID-19 on March 18, 2020, from the central District Srinagar; this region has accounted for more cases of COVID-19 than any other district throughout the pandemic.3 The large majority of healthcare in District Srinagar is provided by three tertiary care institutions, one district hospital, two subdistrict hospitals, and 70 primary healthcare centers. Potential occupational exposures place healthcare workers (HCWs) at higher risk of acquiring SARS-CoV-2 infection, which in turn may serve as an important source of infection for their families and other community members.4-6 Given the high frequency and geographic variability of asymptomatic infection, growing evidence suggests this hidden reservoir is a source of infection for the general population.7,8

Many countries have started testing for antibodies against SARS-CoV-2, both at the population level and in specific groups, such as HCWs. Seroepidemiological studies are crucial to understanding the dynamics of SARS-CoV-2 infection. Many seroepidemiological studies have been conducted among community populations, but there are insufficient data on HCWs. The World Health Organization also encouraged its member states to conduct seroepidemiological studies to attain a better understanding of COVID-19 infection prevalence and distribution.9-11 Therefore, to quantify the prevalence of SARS-CoV-2 infection among HCWs, we conducted a seroepidemiological study by testing for SARS-CoV-2–specific immunoglobulin (IgG) to gain insight into the extent of infection among specific subgroups of HCWs and to identify risk-factor profiles associated with seropositivity.

METHODS

Study Design and Settings

We conducted this seroepidemiological study to ascertain the presence of IgG antibodies against SARS-CoV-2 among HCWs in the District Srinagar of Kashmir, India. The 2-week period of data collection began on June 15, 2020. As part of healthcare system pandemic preparedness efforts, India’s Ministry of Health provided specific guidelines for health facilities to manage COVID-19. Hospitals were categorized as dedicated COVID and non-COVID hospitals. Dedicated COVID hospitals provided comprehensive care exclusively to patients with COVID-19 and were equipped with fully functional intensive care units, ventilators, and beds with reliable access to oxygen support.12 In addition, infection prevention and control strategies to limit the transmission of SARS-CoV-2 infection were implemented according to guidelines specified by India’s National Center for Disease Control.13 To strengthen service provision, HCWs from other hospitals, including resident physicians, were relocated to these dedicated COVID hospitals. The additional staff were selected by administrative leadership, without input from HCWs.

Study Population and Data Collection

We approached administrative heads of the hospitals in District Srinagar for permission to conduct our study and to invite their HCWs to participate in the study. As Figure 1 shows, we were denied permission by the administrative heads of two tertiary care hospitals. Finally, with a point person serving as a study liaison at each institution, HCWs from three dedicated COVID and seven non-COVID tertiary care hospitals, two subdistrict hospitals, and six primary healthcare centers across the District Srinagar were invited to participate. The sample primary healthcare centers were each selected randomly, after stratification, from six major regions of the district. All frontline HCWs, including physicians, administrative and laboratory personnel, technicians, field workers involved in surveillance activity, and other supporting staff were eligible for the study.

Healthcare Facilities in District Srinagar and the Number of Hospitals and Facilities Selected for the Study

We collected information on an interview form using Epicollect5, a free data-gathering tool widely used in health research.14 Physicians specifically trained in the use of Epicollect5 conducted the face-to-face interview on a prespecified day and recorded the collected information through mobile phones. This information included the participants’ role in providing care to patients with COVID-19 and risk factors for SARS-CoV-2 infection (eg, history of travel since January 1, 2020, symptoms of an influenza-like illness [ILI] in the 4 weeks prior to the interview, close contact with a COVID-19 case). We defined close contact as an unmasked exposure within 6 feet of an infected individual for at least 15 minutes, irrespective of location (ie, community or the hospital).

Following the interview, trained phlebotomists collected 3 to 5 mL of venous blood under aseptic conditions. We strictly adhered to standard operating procedures during collection, transportation, and testing of blood samples. Following collection, the blood samples remained undisturbed for at least 30 minutes before centrifugation, which was performed at the collection site (or at the central laboratory for sites lacking the capability). The samples were then transported for further processing and testing through a cold chain supply line, using vaccine carriers with conditioned icepacks. All testing procedures were conducted with strict adherence to the manufacturers’ guidelines.

Laboratory Procedure

In accordance with the manufacturer’s recommendations, we used a chemiluminescent microparticle immunoassay to detect SARS-CoV-2–specific IgG antibodies in serum samples. The assay is an automated two-step immunoassay for the qualitative detection of IgG antibodies against the nucleocapsid of SARS-CoV-2 in human serum and plasma. The sensitivity and specificity of this test are 100% and 99%, respectively. The test result was considered positive for SARS-CoV-2 IgG if the index value was ≥1.4, consistent with guidance provided by the manufacturer.15

The IgG values were also entered into Epicollect5. Two trained medical interns independently entered the laboratory results in two separate forms. A third medical intern reviewed these forms for discrepancies, in response to which they referenced the source data for adjudication. The information gathered during the interview and the laboratory results were linked with the help of a unique identification number, which was generated at the time of the interview.

Statistical Analysis

We estimated the proportion (and logit-transformed 95% CI) of HCWs with a positive SARS-CoV-2–specific IgG antibody level, the primary outcome of interest. We compared seroprevalence rates by gender, age group, specific occupational group, and type of health facility (dedicated COVID hospital vs non-COVID hospital). Seroprevalence was also estimated separately for HCWs who reported symptoms in the past 4 weeks, had a history of exposure to a known case of COVID-19, or had undergone testing by reverse transcriptase-polymerase chain reaction (RT-PCR). In the case of zero seroprevalences, Jeffreys 95% CIs were reported. We used a chi-square test to report two-sided P values for comparison of seroprevalence between groups. When the expected frequency was <5 in more than 20% of the cells, the exact test was used instead of the chi-square test. We additionally performed multivariable logistic regression analysis to evaluate the independent association between place of work (primary independent variable) and seropositivity (dependent variable). We adjusted for the following observable covariates by including them as categorical variables: age, gender, occupational group, and history of close contact with a patient who was COVID-positive. We performed data analysis using Stata, version 15.1 (StataCorp LP). The Institutional Ethics Committee of Government Medical College, Srinagar, approved the study (Reference No. 1003/ETH/GMC dated 13-05-2020). We obtained written, informed consent from all participants.

RESULTS

Of the 7,346 HCWs we were granted permission to approach, 2,915 (39.7%) agreed to participate in the study. The participation rate was 49% at the dedicated COVID hospitals (57% physicians and 47% nonphysicians) and 39% at the non-COVID hospitals (46% physicians and 36% nonphysicians). We analyzed information gathered from 2,905 HCWs (Epicollect5 interview forms were missing for nine participants, and the laboratory report was missing for one participant).

The mean age of the participants was 38.6 years, and 35.8% of participants identified as female (Table 1). One third (33.7%) of the participants were physicians, nearly half of whom were residents. In our sample, the overall seroprevalence of SARS-CoV-2–specific antibodies was 2.5% (95% CI, 2.0%-3.1%).

Seroprevalence of SARS-CoV-2–specific IgG Antibodies by Baseline Characteristics of Healthcare Workers
The distribution of the IgG index value among the study participants is shown in Figure 2.

Scatter Diagram Displaying Immunoglobulin G (IgG) Index Value of the Study Participants

Of the 2,905 participating HCWs, 123 (4.2%) reported an ILI (ie, fever and cough) in the 4 weeks preceding the interview, and 339 (11.7%) reported close contact with a person with COVID-19 (Table 2). A total of 760 (26.2%) HCWs had undergone RT-PCR testing, 29 (3.8%) of whom had a positive result. Stratifying by workplace, history of nasopharyngeal RT-PCR positivity was reported by 4 of 77 (5.1%) participants from dedicated COVID hospitals compared to (3.7%) participants from the non-COVID hospital (P = .528).

Table 2. Seroprevalence of SARS-CoV-2–specific IgG Antibodies by Clinical Characteristics and Specific Risk Factors

As Table 2 also shows, seropositivity was significantly higher among HCWs who had a history of ILI (P < .001), a positive RT-PCR result (P < .001), a history of ever having been quarantined (P = .009), or a self-reported history of close contact with a person with COVID-19 (P = .014). Healthcare workers who had ever worked at a dedicated COVID hospital had a significantly lower seroprevalence of infection (P = .004).

Among HCWs who reported no ILI symptoms in the 4 weeks prior to the interview but who had a positive RT-PCR test, 20.8% were seropositive. Of HCWs who reported both an ILI and a positive RT-PCR test result, 60.0% were seropositive. Compared with employment at a non-COVID hospital, employment at a dedicated COVID hospital was associated with lower multivariable-adjusted odds of seropositivity (odds ratio, 0.21; 95% CI, 0.06-0.66).

DISCUSSION

We aimed to estimate the seroprevalence of SARS-CoV-2 infection in HCWs in different hospital settings in the District Srinagar of Kashmir, India. In general, seroprevalence was low (2.5%), with little difference across gender or occupational group.

Seroprevalence studies of HCWs across divergent workplace environments have reported estimates ranging from 1% to 10.2%.16-19 In general, seroprevalence rates among HCWs have not differed significantly from those of the general population, suggesting that the transmission dynamics of COVID-19 in healthcare settings differ from those of other healthcare-associated infections. The low seroprevalence observed in our study coincides with the overall low infection rate in the community. During the study period, District Srinagar reported a median of 28 new infections daily (interquartile range, 17-46), indicative of the early phase of the pandemic in this population at the time of the study.20

Among the HCW occupational groups, ambulance drivers and housekeeping staff had the highest seroprevalence rates, followed by nurses and physicians. Possible explanations for higher seropositivity in these groups include improper use or inadequate supply of protective gear and lack of training on the use of personal protective equipment (PPE), resulting in increased exposure risk.21 Concordance of HCW and community infection rates within specific geographic areas suggests that community exposure may be the dominant source of HCW infection. Additionally, careful in-hospital behavior of HCWs in dedicated COVID hospitals may have had a spillover effect on their out-of-hospital behavior, which may partially explain our finding that employment at dedicated COVID hospitals was associated with a markedly lower chance of seropositivity. A study of 6,510 HCWs in Chicago, Illinois, showed high seropositivity rates among support service workers, medical assistants, and nurses, with nurses having markedly higher adjusted odds of seropositivity relative to administrators. The authors concluded that exposure in the community setting plays a crucial role in transmission among HCWs.22 Similarly, higher seroprevalence among housekeeping, nonadministrative, and other support service staff has been reported elsewhere.23 Underlying factors related to socioeconomic status and lifestyle may also contribute to higher seroprevalence in some occupational groups.24 Nonadherence to masking, social distancing, and proper hand hygiene outside the hospital setting could result in community-acquired infection.

Interestingly, participants who were working in a dedicated COVID hospital or who had ever worked at one had a seroprevalence of 0.6%, much lower than the 2.8% observed among other participants. This difference remained statistically significant after controlling for age, sex, occupational group, and history of close contact with a patient with COVID-19. Because these facilities were dedicated to the management and care of patients with COVID-19, hospital staff strictly adhered to safety precautions, with particular vigilance during patient contact. These hospitals also strictly adhered to infection prevention and control practices based on the latest guidelines released by India’s Ministry of Health and Family Welfare.13

A commitment was made to provide adequate PPE to the dedicated COVID hospitals and their staff, commensurate with expected infected patient volumes and associated exposure risks. Healthcare workers were specifically trained on proper donning and doffing of PPE, self-health monitoring, and protocols for reporting symptoms and PPE breaches during patient encounters. Healthcare workers were regularly tested for COVID-19 using nasopharyngeal RT-PCR. Of critical importance, these hospitals implemented a buddy system wherein a team of two or more staff members was responsible for ensuring each other’s safety, proper PPE use, conformance with other protective measures, and reporting of PPE breaches.25 Universal masking was mandatory for all hospital staff and patients at the COVID-focused facilities, with the additional use of N-95 masks, gloves, and face shields during patient contact. Administrative measures, including visitor restrictions and environmental sanitation, were rigorously enforced. In addition, because these facilities were potentially high-risk areas for transmission, staff-rationing was implemented to reduce the duration of staff exposure. Finally, HCWs at COVID-dedicated hospitals were provided with separate living accommodations for the period during which they were employed there.

In contrast, at non-COVID hospitals, masking was required only of HCWs; patients and hospital visitors were not subject to a masking policy. Moreover, an adequate and timely supply of PPE was not prioritized at the non-COVID facilities because of resource constraints. Further, lack of testing of asymptomatic patients at non-COVID hospitals may have resulted in nosocomial transmission from asymptomatic carriers. Although routine infection prevention and control activities were performed at non-COVID hospitals, we did not assess adherence to infection prevention and control guidelines in the two categories of hospitals. Our results are also supported by evidence from studies conducted in different hospital settings, the findings of which reiterate the importance of fundamental principles of prevention (eg, proper masking, hand hygiene, and distancing), which are of particular importance in resource-limited settings.17,26,27 The only published study quantifying seroprevalence among HCWs in India was performed in a single hospital with separate COVID and non-COVID units. The authors of that study reported a higher seroprevalence among HCWs in the COVID unit; however, their multivariable analysis suggests this difference was confounded by other factors.23

We found a two-fold higher seroprevalence (4.4%) in HCWs who reported close contact with a patient with COVID-19. Respiratory infections pose a greater health risk to HCWs in an occupational setting. Substantial evidence has emerged demonstrating that the respiratory system is the dominant route of SARS-CoV-2 transmission, with proximity and ventilation as key predictive factors.28 Globally, among thousands of HCWs infected with SARS-CoV-2, one of the leading risk factors identified was close contact with a patient with COVID-19; other identified risk factors were lack of PPE, poor infection prevention and control practices, work overload, and a preexisting health condition.29

The seroprevalence estimate among participants who reported an ILI in the 4 weeks preceding the interview was only 12.2%, suggesting an alternative etiology for these symptoms. Among those who reported a previously positive RT-PCR for SARS-CoV-2, only 27.6% showed the presence of SARS-CoV-2–specific IgG antibodies. The inability to mount an antibody-mediated immune response, or early conversion to seronegative status during the convalescent phase, has been suggested as an explanation for such discordant findings.30 Conversely, seropositivity among participants who reported having a negative RT-PCR test was 1.9%. There are a few plausible explanations for these observations. First, several studies have reported false-negative rates from RT-PCR testing ranging from 2% to 29%.31-33 Second, the sensitivity of the SARS-CoV-2 IgG assay depends on the timing of the test relative to the onset of symptoms or RT-PCR positivity; the sensitivity of the assay we used varies from 53.1% at day 7 to 100% at day 17 postinfection.34 Variable viral load and differences in duration of viral shedding are other possible reasons for false-negative RT-PCR results.35,36

In our study, seroconversion among asymptomatic HCWs who were RT-PCR-positive was 20.8%. Among HCWs who reported an ILI and were RT-PCR-positive, seropositivity was 60%. In one study, 40% of asymptomatic and 13% of symptomatic patients who tested positive for COVID-19 became seronegative after initial seropositivity—that is, 8 weeks after hospital discharge.37

Serological testing offers insight into both the exposure history and residual COVID-19 susceptibility of HCWs. However, current immunological knowledge does not allow us to conclude that seropositivity conveys high-level immunity against reinfection. As the epidemic evolves, HCWs will continue to be exposed to COVID-19 in the community and the workplace. Serial cross-sectional serosurveys can help monitor the progression of the pandemic within the healthcare setting and guide hospital authorities in resource allocation.

Strengths and Limitations

We used the Abbott Architect SARS-CoV-2 IgG assay, which has demonstrated consistently high performance characteristics across different patient populations. The participation rate was acceptable compared with similar studies, and we included all the major hospitals in District Srinagar. The findings from our study can therefore be considered representative of HCWs in the district.

The study results should be interpreted in the context of the following limitations. First, information on risk factors for seropositivity was based on participant self-report, and we did not collect information on the timing of symptoms or the date on which a participant became RT-PCR-positive. Second, information regarding place of exposure (ie, community or hospital setting) was not recorded, limiting conclusions regarding the effect of workplace exposures. Third, given the voluntary nature of participation, there is a possibility of selection bias that may limit the generalizability of our findings. For example, some HCWs with a recent exposure to COVID-19, or those who were symptomatic at the time of the study, might not have participated because IgG testing offers no individual benefit in the early phase of infection; conversely, some HCWs who had symptoms in the distant past might have been more likely to participate. However, we believe that selection bias does not undermine the validity of the observed associations, based on the plausible assumption that infection risk should be similar between respondents and nonrespondents given their comparable work environments. Finally, with a cross-sectional design, we cannot ascertain reconversion from an initially positive to a negative IgG status, which warrants a cohort study.

CONCLUSION

We conclude that the seroprevalence of SARS-CoV-2 infection was low among HCWs of District Srinagar at the time of the study. Healthcare workers in a dedicated COVID hospital, or HCWs who had ever worked in such a facility, had lower seroprevalence, suggesting both adherence to and effectiveness of standard protective measures during contact with patients who had COVID-19. The careful in-hospital behavior of HCWs at the COVID hospitals may also have had a spillover effect on their out-of-hospital behavior, reducing their risk of community-acquired infection. In contrast, lack of testing of asymptomatic patients at non-COVID hospitals may have resulted in nosocomial transmission from asymptomatic carriers. We believe that our findings highlight the value of implementing infection prevention and control measures in the hospital setting. Moreover, training and retraining of sanitation and other housekeeping staff on standard hygienic practices and appropriate use of protective gear may further reduce their rates of exposure.

Acknowledgments

The authors thank Principal and Dean of the Government Medical College, Srinagar, Professor Samia Rashid, and District Commissioner, Srinagar, Shahid Iqbal Chowdhary for their support. We also acknowledge the support rendered by the Directorate of Health Services, Kashmir; Chief Medical Officer Srinagar; Block Medical Officers; and Zonal Medical Officers of District Srinagar, Kashmir, and extend our appreciation to the medical interns for their efforts in data collection, and to laboratory in-charge Gulzar Ahmad Wani, PhD scholar, Biochemistry, and his staff, who were involved in this study. Finally, we thank the study participants for their understanding of the importance of this study and for their time and participation.

Data availability statement

Data shall be made available on request through the corresponding author.

References

1. Ministry of Health & Family Welfare. Government of India. Accessed January 11, 2021. https://www.mohfw.gov.in/
2. COVID19 India. Accessed January 11, 2021. https://www.covid19india.org/
3. Government of Jammu & Kashmir. Department of Information & Public Relations. Bulletin on Novel Corona Virus (COVID-19). Accessed January 11, 2021. http://new.jkdirinf.in/NewsDescription.aspx?ID=66598
4. Black JRM, Bailey C, Przewrocka J, Dijkstra KK, Swanton C. COVID-19: the case for health-care worker screening to prevent hospital transmission. Lancet. 2020;395(10234):1418-1420. https://doi.org/10.1016/s0140-6736(20)30917-x
5. Nguyen LH, Drew DA, Graham MS, et al; Coronavirus Pandemic Epidemiology Consortium. Risk of COVID-19 among front-line health-care workers and the general community: a prospective cohort study. Lancet Public Heal. 2020;5(9):e475-e483. https://doi.org/10.1016/s2468-2667(20)30164-x
6. The Lancet. COVID-19: protecting health-care workers. Lancet. 2020;395(10228):922. https://doi.org/10.1016/s0140-6736(20)30644-9
7. Byambasuren O, Cardona M, Bell K, Clark J, McLaws M-L, Glasziou P. Estimating the extent of asymptomatic COVID-19 and its potential for community transmission: systematic review and meta-analysis. Off J Assoc Med Microbiol Infect Dis Canada. 2020;5(4):223-234. https://doi.org/10.3138/jammi-2020-0030
8. Rosenbaum L. Facing Covid-19 in Italy—ethics, logistics, and therapeutics on the epidemic’s front line. N Engl J Med. 2020;382(20):1873-1875. https://doi.org/10.1056/nejmp2005492
9. World Health Organization. The Unity Studies: WHO Sero-epidemiological Investigations Protocols. Accessed January 11, 2021. https://www.who.int/emergencies/diseases/novel-coronavirus-2019/technical-guidance/early-investigations
10. Pollán M, Pérez-Gómez B, Pastor-Barriuso R, et al; ENE-COVID Study Group. Prevalence of SARS-CoV-2 in Spain (ENE-COVID): a nationwide, population-based seroepidemiological study. Lancet. 2020;396(10250):535-544. https://doi.org/10.1016/s0140-6736(20)31483-5
11. Folgueira MD, Muñoz-Ruipérez C, Alonso-López MA, Delgado R. SARS-CoV-2 infection in health care workers in a large public hospital in Madrid, Spain, during March 2020. MedRxiv Web site. Published April 27, 2020. Accessed March 9, 2021. https://doi.org/10.1101/2020.04.07.20055723
12. Ministry of Health & Family Welfare, Directorate General of Health Services, EMR Division. Guidance document on appropriate management of suspect/confirmed cases of COVID-19. Accessed January 11, 2021. https://www.mohfw.gov.in/pdf/FinalGuidanceonMangaementofCovidcasesversion2.pdf
13. Ministry of Health & Family Welfare, Government of India. National guidelines for infection prevention and control in healthcare facilities. Accessed January 11, 2021. https://main.mohfw.gov.in/sites/default/files/National%20Guidelines%20for%20IPC%20in%20HCF%20-%20final%281%29.pdf
14. Epicollect5. Accessed January 11, 2021. https://five.epicollect.net/
15. SARS-CoV-2 Immunoassay. Abbott Core Laboratory. Accessed January 11, 2021. https://www.corelaboratory.abbott/us/en/offerings/segments/infectious-disease/sars-cov-2
16. Bendavid E, Mulaney B, Sood N, et al. COVID-19 antibody seroprevalence in Santa Clara County, California. medRxiv. Published online April 30, 2020. Accessed March 9, 2021. https://doi.org/10.1101/2020.04.14.20062463
17. Korth J, Wilde B, Dolff S, et al. SARS-CoV-2-specific antibody detection in healthcare workers in Germany with direct contact to COVID-19 patients. J Clin Virol. 2020;128:104437. https://doi.org/10.1016/j.jcv.2020.104437
18. Steensels D, Oris E, Coninx L, et al. Hospital-wide SARS-CoV-2 antibody screening in 3056 staff in a tertiary center in Belgium. JAMA. 2020;324(2):195-197. https://doi.org/10.1001/jama.2020.11160
19. Behrens GMN, Cossmann A, Stankov M V., et al. Perceived versus proven SARS-CoV-2-specific immune responses in health-care professionals. Infection. 2020;48(4):631-634. https://doi.org/10.1007/s15010-020-01461-0
20. COVID-19 Kashmir Tracker. Accessed January 11, 2021. https://covidkashmir.org/statistics
21. World Health Organization. Rational use of personal protective equipment for coronavirus disease (COVID-19) and considerations during severe shortages. Published December 23, 2020. Accessed January 11, 2021. https://www.who.int/publications/i/item/rational-use-of-personal-protective-equipment-for-coronavirus-disease-(covid-19)-and-considerations-during-severe-shortages
22. Wilkins JT, Gray EL, Wallia A, et al. Seroprevalence and correlates of SARS-CoV-2 antibodies in health care workers in Chicago. Open Forum Infect Dis. 2020;8(1):ofaa582. https://doi.org/10.1093/ofid/ofaa582
23. Goenka M, Afzalpurkar S, Goenka U, et al. Seroprevalence of COVID-19 amongst health care workers in a tertiary care hospital of a metropolitan city from India. J Assoc Physicians India. 2020;68(11):14-19. https://doi.org/10.2139/ssrn.3689618
24. Mutambudzi M, Niedwiedz C, Macdonald EB, et al. Occupation and risk of severe COVID-19: prospective cohort study of 120 075 UK Biobank participants. Occup Environ Med. 2020;oemed-2020-106731. https://doi.org/10.1136/oemed-2020-106731
25. Ministry of Health & Family Welfare, Directorate General of Health Services, EMR Division. Advisory for managing health care workers working in COVID and Non-COVID areas of the hospital. Accessed January 12, 2021. https://cdnbbsr.s3waas.gov.in/s3850af92f8d9903e7a4e0559a98ecc857/uploads/2020/06/2020061949.pdf
26. Rhee C, Baker M, Vaidya V, et al; CDC Prevention Epicenters Program. Incidence of nosocomial COVID-19 in patients hospitalized at a large US academic medical center. JAMA Netw Open. 2020;3(9):e2020498. https://doi.org/10.1001/jamanetworkopen.2020.20498
27. Seidelman J, Lewis SS, Advani SD, et al. Universal masking is an effective strategy to flatten the severe acute respiratory coronavirus virus 2 (SARS-CoV-2) healthcare worker epidemiologic curve. Infect Control Hosp Epidemiol. 2020;41(12):1466-1467. https://doi.org/10.1017/ice.2020.313
28. Meyerowitz EA, Richterman A, Gandhi RT, Sax PE. Transmission of SARS-CoV-2: a review of viral, host, and environmental factors. Ann Intern Med. 2020;174(1):69-79. https://doi.org/10.7326/m20-5008
29. Mhango M, Dzobo M, Chitungo I, Dzinamarira T. COVID-19 risk factors among health workers: a rapid review. Saf Health Work. 2020;11(3):262-265. https://doi.org/10.1016/j.shaw.2020.06.001
30. European Centre for Disease Prevention and Control. Immune responses and immunity to SARS-CoV-2. Updated June 30, 2020. Accessed January 12, 2021. https://www.ecdc.europa.eu/en/covid-19/latest-evidence/immune-responses
31. Arevalo-Rodriguez I, Buitrago-Garcia D, Simancas-Racines D, et al. False-negative results of initial RT-PCR assays for COVID-19: a systematic review. PLoS One. 2020;15(12):e0242958. https://doi.org/10.1371/journal.pone.0242958
32. Ai T, Yang Z, Hou H, et al. Correlation of chest CT and RT-PCR testing for coronavirus disease 2019 (COVID-19) in China: a report of 1014 cases. Radiology. 2020;296(2):E32-E40. https://doi.org/10.1148/radiol.2020200642
33. Woloshin S, Patel N, Kesselheim AS. False negative tests for SARS-CoV-2 infection — challenges and implications. N Engl J Med. 2020;383(6):e38. https://doi.org/10.1056/nejmp2015897
34. Bryan A, Pepper G, Wener MH, et al. Performance characteristics of the Abbott Architect SARS-CoV-2 IgG assay and seroprevalence in Boise, Idaho. J Clin Microbiol. 2020;58(8):e00941. https://doi.org/10.1128/jcm.00941-20
35. Long Q-X, Liu B-Z, Deng H-J, et al. Antibody responses to SARS-CoV-2 in patients with COVID-19. Nat Med. 2020;26(6):845-848. https://doi.org/10.1038/s41591-020-0897-1
36. Tahamtan A, Ardebili A. Real-time RT-PCR in COVID-19 detection: issues affecting the results. Expert Rev Mol Diagn. 2020;20(5):453-454. https://doi.org/10.1080/14737159.2020.1757437
37. Long Q-X, Tang X-J, Shi Q-L, et al. Clinical and immunological assessment of asymptomatic SARS-CoV-2 infections. Nat Med. 2020;26(8):1200-1204. https://doi.org/10.1038/s41591-020-0965-6

Issue
Journal of Hospital Medicine 16(5)
Page Number
274-281. Published Online First April 20, 2021
Display Headline
SARS-CoV-2 Seroprevalence Among Healthcare Workers by Workplace Exposure Risk in Kashmir, India
Article Source
©2021 Society of Hospital Medicine
Correspondence Location
Mariya Amin Qurieshi, MD; Email: maryaamin123@gmail.com

Decreasing Hospital Observation Time for Febrile Infants

Article Type
Changed
Tue, 04/27/2021 - 10:16
Display Headline
Decreasing Hospital Observation Time for Febrile Infants

Febrile infants aged 0 to 60 days often undergo diagnostic testing to evaluate for invasive bacterial infections (IBI; ie, bacteremia and meningitis) and are subsequently hospitalized pending culture results. Only 1% to 2% of infants 0 to 60 days old have an IBI,1-3 and most hospitalized infants are discharged once physicians feel confident that pathogens are unlikely to be isolated from blood and cerebrospinal fluid (CSF) cultures. Practice regarding duration of hospitalization while awaiting blood and CSF culture results is not standardized in this population. Longer hospitalizations can lead to increased costs and familial stress, including difficulty with breastfeeding and anxiety in newly postpartum mothers.4,5

In 2010, an institutional evidence-based guideline for the management of febrile infants aged 0 to 60 days recommended discharge after 36 hours of observation if all cultures were negative.6 However, recent studies demonstrate that 85% to 93% of pathogens in blood and CSF cultures grow within 24 hours of incubation.7-9 Assuming a 2% prevalence of IBI, if 15% of pathogens were identified after 24 hours of incubation, only one out of 333 infants would have an IBI identified after 24 hours of hospital observation.7
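
The one-in-333 figure follows directly from multiplying the assumed IBI prevalence by the fraction of pathogens identified after 24 hours; a quick check:

```python
# Quick check of the 1-in-333 estimate cited above.
ibi_prevalence = 0.02        # ~2% of febrile infants 0-60 days old have an IBI
late_growth_fraction = 0.15  # ~15% of pathogens identified after 24 hours of incubation
risk_identified_after_24h = ibi_prevalence * late_growth_fraction   # 0.003
print(round(1 / risk_identified_after_24h))  # ~333 infants observed per late-identified IBI
```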

Furthermore, a review of our institution’s electronic health records (EHR) over the past 5 years revealed that an observation period of 24 hours would have resulted in the discharge of three infants with an IBI. Two infants had bacteremia; both were discharged from the emergency department (ED) without antibiotics, returned to care after cultures were reported positive at 27 hours, and had no adverse outcomes. The third infant had meningitis, but also had an abnormal CSF Gram stain, which led to a longer hospitalization.

In 2019, our institution appraised the emerging literature and institutional data supporting the low absolute risk of missed IBI, and also leveraged local consensus among key stakeholders to update its evidence-based guideline for the evaluation and management of febrile infants aged 60 days and younger. The updated guideline recommends that clinicians consider discharging well-appearing neonates and infants if blood and CSF cultures remain negative at 24 hours.10 The objective of this study was to decrease the average hospital culture observation time (COT; culture incubation to hospital discharge) from 38 to 30 hours over a 12-month period in febrile infants aged 0 to 60 days.
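
For illustration only, the outcome measure could be computed from EHR timestamps as sketched below; the timestamp fields and example values are hypothetical rather than drawn from the study data.

```python
# Illustrative sketch of the outcome measure: culture observation time (COT) in
# hours, from culture incubation to hospital discharge. Values are hypothetical.
import pandas as pd

encounters = pd.DataFrame({
    "culture_incubation_time": pd.to_datetime(["2019-01-05 22:10", "2019-01-08 03:45"]),
    "discharge_time":          pd.to_datetime(["2019-01-07 12:40", "2019-01-09 10:15"]),
})
encounters["cot_hours"] = (
    encounters["discharge_time"] - encounters["culture_incubation_time"]
).dt.total_seconds() / 3600

# The aim was to move the average COT from ~38 hours toward 30 hours.
print(encounters["cot_hours"].tolist(), round(encounters["cot_hours"].mean(), 1))
```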

METHODS

Context

Improvement efforts were conducted at Cincinnati Children’s Hospital Medical Center (CCHMC), a large, urban, academic hospital that admitted more than 8,000 noncritically ill patients to the hospital medicine (HM) service from July 1, 2018, through June 30, 2019. Hospital medicine teams, located at both the main and satellite campuses, are staffed by attending physicians, fellows, residents, medical students, and nurse practitioners. The two campuses, which are about 20 miles apart, share clinician providers but have distinct nursing pools.

Microbiology services for all CCHMC patients are provided at the main campus. Blood and CSF cultures at the satellite campus are transported to the main campus for incubation and monitoring via an urgent courier service. The microbiology laboratory at CCHMC uses a continuous monitoring system for blood cultures (BACT/ALERT Virtuo, BioMérieux). The system automatically alerts laboratory technicians of positive cultures; these results are reported to clinical providers within 30 minutes of detection. Laboratory technicians manually evaluate CSF cultures once daily for 5 days.

Improvement Team

Our improvement team included three HM attending physicians; two HM fellows; a pediatric chief resident; two nurses, who represented nursing pools at the main and satellite campuses; and a clinical pharmacist, who is a co-leader of the antimicrobial stewardship program at CCHMC. Supporting members for the improvement team included the CCHMC laboratory director; the microbiology laboratory director; an infectious disease physician, who is a co-leader of the antimicrobial stewardship program; and nursing directors of the HM units at both campuses.

Evidence-Based Guideline

Our improvement initiative was based on recommendations from the updated CCHMC Evidence-Based Care Guideline for Management of Infants 0 to 60 days with Fever of Unknown Source.10 This guideline, published in May 2019, was developed by a multidisciplinary working group composed of key stakeholders from HM, community pediatrics, emergency medicine, the pediatric residency program, infectious disease, and laboratory medicine. Several improvement team members were participants on the committee that published the evidence-based guideline. The committee first performed a systematic review and critical appraisal of the literature. Care recommendations were formulated via a consensus process directed by best evidence, patient and family preferences, and clinical expertise; the recommendations were subsequently reviewed and approved by clinical experts who were not involved in the development process.

Based on evidence review and multistakeholder consensus, the updated guideline recommends clinicians consider discharging neonates and infants aged 60 days and younger if there is no culture growth after an observation period of 24 hours (as documented in the EHR) and patients are otherwise medically ready for discharge (ie, well appearing with adequate oral intake).10,11 In addition, prior to discharge, there must be a documented working phone number on file for the patient’s parents/guardians, an established outpatient follow-up plan within 24 hours, and communication with the primary pediatrician who is in agreement with discharge at 24 hours.

Study Population

Infants 0 to 60 days old who had a documented or reported fever without an apparent source based on history and physical exam upon presentation to the ED, and who were subsequently admitted to the HM service at CCHMC between October 30, 2018, and July 10, 2020, were eligible for inclusion. We excluded infants who were admitted to other clinical services (eg, intensive care unit); had organisms identified on blood, urine, or CSF culture within 24 hours of incubation; had positive herpes simplex virus testing; had skin/soft tissue infections or another clearly documented source of bacterial infection; or had an alternative indication for hospitalization (eg, need for intravenous fluid or deep suctioning) after cultures had incubated for 24 hours. Infants who had a positive blood, urine, or CSF culture result after 24 hours of incubation were included in the study population. Organisms were classified as pathogen or contaminant based on treatment decisions made by the care team.

Improvement Activities

Key drivers critical to success of the improvement efforts were: (1) clearly defined standard of care for duration of observation in febrile infants 0 to 60 days old; (2) improved understanding of microbiology lab procedures; (3) effective communication of discharge criteria between providers and nurses; and (4) transparency of data with feedback (Figure 1).

Figure 1. Key Driver Diagram Detailing Essential Drivers and Interventions Aimed at Reducing Culture Observation Time in Infants Aged 60 Days and Younger Hospitalized With Fever
The corresponding interventions were executed using Plan-Do-Study-Act (PDSA) cycles as follows:

Education and Structured Dissemination of Evidence-Based Guideline

The CCHMC febrile infant guideline10 was disseminated to HM physicians, residents, and nurses via the following means: (1) in-person announcements at staff meetings and educational conferences, (2) published highlights from the guideline in weekly newsletters, and (3) email announcements. Additionally, members of the study team educated HM attending physicians, nursing staff from the medical units at both campuses, and resident physicians about recent studies demonstrating safety of shorter length of stay (LOS) in febrile infants aged 0 to 60 days. The study team also provided residents, physicians, and nurses with data on the number of positive blood and CSF cultures and outcomes of patients at CCHMC within the past 5 years. In addition, team members led a journal club for residents discussing an article7 describing time-to-positivity of blood and CSF cultures in febrile infants. For ongoing engagement, the evidence-based guideline and a detailed explanation of microbiology procedures were published in the resident handbook, an internal resource that includes vital clinical pearls and practice guidelines across specialties. (Each resident receives an updated hard copy each year, and there is also an online link to the resource in the EHR.) Information about the guideline and COT was also included in the monthly chief resident’s orientation script, which is relayed to all residents on the first day of their HM rotation.

Clear Communication of Microbiology Procedures

Team members created a detailed process map describing the processing protocols for blood and CSF cultures collected at both CCHMC campuses. This information was shared with HM attending physicians and nurses via in-person announcements at staff meetings, flyers in team workrooms, and email communications. Residents received information on microbiology protocols via in-person announcements at educational conferences and dissemination in the weekly residency newsletter. Important information communicated included:

1. Definition of culture start time. We conveyed that there may be a delay of up to 4 hours between culture collection at the satellite campus and culture incubation at the main campus laboratory. As a result, the time of blood or CSF sample arrival to the main campus laboratory was a more accurate reflection of the culture incubation start time than the culture collection time.

2. Explanation of CSF culture processing. We discussed the process by which these cultures are plated upon arrival at the microbiology laboratory and read once per day in the morning. Therefore, a culture incubated at midnight would be evaluated once at 9 hours and not again until 33 hours.
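The once-daily read schedule explains these evaluation times. The sketch below is a minimal illustration, assuming a single manual read at 9 am each day (the actual read time at any laboratory may differ); it returns the incubation age, in hours, at each scheduled read.

```python
from datetime import datetime, timedelta

def csf_read_ages(incubation_start: datetime, read_hour: int = 9, days: int = 5):
    """Hours of incubation at each once-daily manual CSF read.

    Assumes one read per day at `read_hour` (illustrative only; actual
    laboratory read times vary) over `days` days of monitoring.
    """
    first_read = incubation_start.replace(hour=read_hour, minute=0,
                                          second=0, microsecond=0)
    if first_read <= incubation_start:
        first_read += timedelta(days=1)  # next read is the following morning
    return [(first_read + timedelta(days=d) - incubation_start).total_seconds() / 3600
            for d in range(days)]

# A culture incubated at midnight is first read at ~9 hours, then ~33 hours.
print(csf_read_ages(datetime(2021, 1, 1, 0, 0)))  # [9.0, 33.0, 57.0, 81.0, 105.0]
```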

Modification of Febrile Infant Order Set

Enhancements to the febrile infant order set improved communication and cultivated a shared mental model regarding discharge goals among all members of the care team. The EHR order set for febrile infants was updated as follows: (1) mandatory free-text fields that established the culture start time for blood and CSF cultures were added, (2) culture start time was clearly defined (ie, the time culture arrives at the main campus laboratory), and (3) a change was made in the default discharge criteria11 to “culture observation for 24 hours,” with the ability to modify COT (Appendix Figure 1). We embedded hyperlinks to the guideline and microbiology process map within the updated order set, which allowed providers to easily access this information and refresh their knowledge of the recommendations (Appendix Figure 1).

Identification of Failures and Follow-up With Near-Time Feedback

All cases of febrile infants were tracked weekly. For infants hospitalized longer than 24 hours, the study team contacted the discharging clinicians to discuss reasons for prolonged hospitalization, with an emphasis on identifying system-level barriers to earlier discharge.

Study of the Interventions

The institutional microbiology database was queried weekly to identify all infants 0 to 60 days old who had a blood culture obtained and were hospitalized on the HM service. Study team members conducted targeted EHR review to determine whether patients met exclusion criteria and to identify reasons for prolonged COT. Baseline data were collected retrospectively for a 3-month period prior to initiation of improvement activities. During the study period, queries were conducted weekly and reviewed by study team members to evaluate the impact of improvement activities and to inform new interventions.

Measures

Our primary outcome measure was COT, defined as the hours between final culture incubation and hospital discharge. The operational definition for "final culture incubation" was the documented time of arrival of the last collected culture to the microbiology laboratory. Our goal COT was 30 hours to account for a subset of patients whose blood and/or CSF cultures were obtained overnight (ie, after 9 pm), since subsequent discharge times would likely and practically be delayed beyond 24 hours. Our secondary outcome measure was LOS, defined as the time between ED arrival and hospital discharge. Process measures included the proportion of patients for whom the febrile infant EHR order set was used and the proportion of patients for whom medical discharge criteria (ie, blood and CSF culture observed for "xx" hours) and culture incubation start times were entered using the order set. Balancing measures included identification of IBI after hospital discharge, 48-hour ED revisits, and 7-day hospital readmissions.
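For clarity, both outcome measures reduce to simple differences between EHR timestamps. The sketch below illustrates the definitions with hypothetical timestamps; the variable names are ours and do not reflect the actual CCHMC data model.

```python
from datetime import datetime

def hours_between(start: datetime, end: datetime) -> float:
    """Elapsed time between two timestamps, in hours."""
    return (end - start).total_seconds() / 3600

# Hypothetical timestamps for a single infant (illustrative values only)
ed_arrival = datetime(2020, 3, 1, 18, 30)
final_culture_incubation = datetime(2020, 3, 1, 22, 15)  # last culture arrives at the lab
discharge = datetime(2020, 3, 3, 8, 0)

cot = hours_between(final_culture_incubation, discharge)  # culture observation time
los = hours_between(ed_arrival, discharge)                # length of stay
print(f"COT = {cot:.1f} h, LOS = {los:.1f} h")            # COT = 33.8 h, LOS = 37.5 h
```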

Analysis

Measures were evaluated using statistical process control charts and run charts, and Western Electric rules were employed to determine special cause variation.12 Annotated X-bar S control charts tracked the impact of improvement activities on average COT and LOS for all infants. Given that a relatively small number of patients (ie, two to four) met inclusion criteria each week, average COT was calculated per five patients.
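The subgrouping used for the X-bar S charts can be illustrated as follows: consecutive infants are grouped in fives, and the subgroup mean and standard deviation of COT form each plotted point. This is a minimal sketch with made-up values; control limits would then be derived using the standard X-bar S formulas described in the statistical process control literature.12

```python
from statistics import mean, stdev

def xbar_s_subgroups(cot_hours, subgroup_size=5):
    """Group consecutive COT values into fixed-size subgroups and return the
    (mean, standard deviation) pair plotted for each point on the X-bar and
    S charts. Incomplete trailing subgroups are dropped."""
    groups = [cot_hours[i:i + subgroup_size]
              for i in range(0, len(cot_hours) - subgroup_size + 1, subgroup_size)]
    return [(mean(g), stdev(g)) for g in groups]

# Illustrative COT values (hours) for ten consecutive eligible infants
example_cot = [36, 30, 41, 28, 33, 29, 35, 27, 31, 26]
print(xbar_s_subgroups(example_cot))  # each tuple is (subgroup mean, subgroup SD)
```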

This study was considered exempt from review by the CCHMC Institutional Review Board.

RESULTS

Of the 184 infants in this study, 46 were included as part of baseline data collection, and 138 were included during the intervention period. The median age was 26.6 days (range, 3-59 days), and 52% of patients were female. Two-thirds were non-Hispanic White, 22% were Black, and 5% were Hispanic (Appendix Table).

Average COT decreased from 38 hours to 32 hours with improvement activities (Figure 2) and was sustained for a total of 17 months. There were small decreases in COT after initial education was provided to attendings, nurses, and residents.

Figure 2. X-Bar S Control Chart Displaying Average Culture Observation Time per Five Admitted Febrile Infants Aged 60 Days and Younger
However, the greatest sustained decreases in COT occurred after dissemination of the published evidence-based guideline and standardization of the EHR order set. Average LOS decreased from 42 hours to 36 hours (Figure 3). Among the total cohort, 34% of infants were admitted to the satellite campus. At the satellite and main campuses, median COT was 28 hours and 35 hours, respectively (Appendix Figure 2).

Figure 3. X-Bar S Control Chart Displaying Average Length of Stay From Emergency Department Arrival to Hospital Discharge per Five Admitted Febrile Infants Aged 60 Days and Younger

After the launch of the updated order set, median usage of the EHR order set increased from 50% to 80%. Medical discharge criteria were entered for 80 (96%) of the 83 patients for whom the updated order set was applied; culture incubation start times were entered for 78 (94%) of these patients.

No infants in our cohort were found to have IBI after hospital discharge. There were no ED revisits within 48 hours of discharge, and there were no hospital readmissions within 7 days of index discharge. Furthermore, none of the patients included in the study had growth of a pathogenic organism after 24 hours.

Of the 138 infants hospitalized during the intervention period, 77 (56%) had a COT greater than 30 hours. Among these 77 patients, 49 (64%) had their final culture incubated between 9 pm and 4 am. In addition, 11 (14%) had missing, abnormal, pretreated, or uninterpretable CSF studies; 7 (9%) had ongoing fevers; and 4 (5%) remained hospitalized due to family preference or inability to obtain timely outpatient follow-up.

DISCUSSION

Our study aimed to decrease the average COT from 38 hours to 30 hours among hospitalized infants aged 60 days and younger over a period of 12 months. An intervention featuring implementation of an evidence-based guideline through education, laboratory procedure transparency, creation of a standardized EHR order set, and near-time feedback was associated with a shorter average COT of 32 hours, sustained over a 17-month period. No infants with bacteremia or meningitis were inappropriately discharged during this study.

Interpretation

Prior to our improvement efforts, most febrile infants at CCHMC were observed for at least 36 hours based on a prior institutional guideline,6 despite recent evidence suggesting that most pathogens in blood and CSF cultures grow within 24 hours of incubation.7-9 The goal of this improvement initiative was to bridge the gap between emerging evidence and clinical practice by developing and disseminating an updated evidence-based guideline to safely decrease the hospital observation time in febrile infants aged 60 days and younger.

Similar to previous studies aimed at improving diagnosis and management among febrile infants,13-16 generation and structured dissemination of an institutional evidence-based guideline was crucial to safely shortening COT in our population. These prior studies established a goal COT of 36 to 42 hours for hospitalized febrile infants.13,15,16 Our study incorporated emerging evidence and local experience into an updated evidence-based practice guideline to further reduce COT to 32 hours for hospitalized infants. Key factors contributing to our success included multidisciplinary engagement, specifically partnering with nurses and resident physicians in designing and implementing our initiatives. Furthermore, improved transparency of culture monitoring practices allowed clinicians to better understand the recommended observation periods. Finally, we employed a standardized EHR order set as a no-cost, one-time, high-reliability intervention to establish 24 hours of culture monitoring as the default and to enhance transparency around start time for culture incubation.

Average COT remained stable at 32 hours for 17 months after initiation of the intervention. During the intervention period, 64% of patients with a COT greater than 30 hours had their final culture incubated between 9 pm and 4 am. These patients often remained hospitalized for longer than 30 hours to allow for a daytime hospital discharge. Additionally, CSF cultures were monitored manually only once per day, between 8 am and 10 am. As a result, CSF cultures obtained in the evening (eg, 9 pm) would be evaluated once at roughly 12 hours of incubation, and then the following morning at 36 hours of incubation. In cases where CSF studies (eg, cell count, protein, Gram stain) were abnormal, uninterpretable, or could not be obtained, clinicians monitored CSF cultures closer to 36 hours from incubation. While evidence-based guidelines and local data support safe early discharge of febrile infants, clinicians presented with incomplete or uninterpretable data were appropriately more likely to observe infants for longer periods to confirm negative cultures.

Limitations

The study has several limitations. First, this single-center study was conducted at a quaternary care medical center with a robust quality improvement infrastructure. Our interventions took advantage of existing processes that ensure timely discharge of medically ready patients.11 Furthermore, microbiology laboratory practices are unique to our institution. These factors limit the generalizability of this work. Second, due to small numbers of eligible infants, analyses were conducted per five patients. Infrequent hospitalizations limited our ability to learn quickly from PDSA cycles. Finally, we did not measure cost savings attributable to shorter hospital stays. However, in addition to reducing hospital charges and nonmedical costs such as lost earnings and childcare,17 shorter hospitalizations have other benefits, such as promoting bonding and breastfeeding and decreasing exposure to nosocomial infections. Shorter hospitalizations, with clearly communicated discharge times, also serve to optimize patient throughput.

CONCLUSION

Implementation of a clinical practice guideline resulted in reduction of average COT from 38 to 32 hours in febrile infants aged 60 days and younger, with no cases of missed IBI. Engagement of multidisciplinary stakeholders in the generation and structured dissemination of the evidence-based guideline, improved transparency of the microbiological blood and CSF culture process, and standardization of EHR order sets were crucial to the success of this work. Cultures incubated overnight and once-daily CSF culture-monitoring practices were the primary contributors to an average COT of more than 30 hours.

Future work will include collaboration with emergency physicians to improve evaluation efficiency and decrease LOS in the ED for febrile infants. Additionally, creation of an automated data dashboard of COT and LOS will provide clinicians with real-time feedback on hospitalization practices.

Acknowledgments

The authors thank Jeffrey Simmons, MD, MSc, as well as the members of the 2019 Fever of Uncertain Source Evidence-Based Guideline Committee. We also thank the James M Anderson Center for Health System Excellence and the Rapid Cycle Improvement Collaborative for their support with guideline development as well as design and execution of our improvement efforts.

References

1. Cruz AT, Mahajan P, Bonsu BK, et al. Accuracy of complete blood cell counts to identify febrile infants 60 days or younger with invasive bacterial infections. JAMA Pediatr. 2017;171(11):e172927. https://doi.org/10.1001/jamapediatrics.2017.2927
2. Kuppermann N, Dayan PS, Levine DA, et al; Febrile Infant Working Group of the Pediatric Emergency Care Applied Research Network (PECARN). A clinical prediction rule to identify febrile infants 60 days and younger at low risk for serious bacterial infections. JAMA Pediatr. 2019;173(4):342-351. https://doi.org/10.1001/jamapediatrics.2018.5501
3. Nigrovic LE, Mahajan PV, Blumberg SM, et al; Febrile Infant Working Group of the Pediatric Emergency Care Applied Research Network (PECARN). The Yale Observation Scale Score and the risk of serious bacterial infections in febrile infants. Pediatrics. 2017;140(1):e20170695. https://doi.org/10.1542/peds.2017-0695
4. De S, Tong A, Isaacs D, Craig JC. Parental perspectives on evaluation and management of fever in young infants: an interview study. Arch Dis Child. 2014;99(8):717-723. https://doi.org/10.1136/archdischild-2013-305736
5. Paxton RD, Byington CL. An examination of the unintended consequences of the rule-out sepsis evaluation: a parental perspective. Clin Pediatr (Phila). 2001;40(2):71-77. https://doi.org/10.1177/000992280104000202
6. FUS Team. Cincinnati Children’s Hospital Medical Center. Evidence-based clinical care guideline for fever of uncertain source in infants 60 days of age or less. Guideline 2. 2010:1-4.
7. Aronson PL, Wang ME, Nigrovic LE, et al; Febrile Young Infant Research Collaborative. Time to pathogen detection for non-ill versus ill-appearing infants ≤60 days old with bacteremia and meningitis. Hosp Pediatr. 2018;8(7):379-384. https://doi.org/10.1542/hpeds.2018-0002
8. Biondi EA, Mischler M, Jerardi KE, et al; Pediatric Research in Inpatient Settings (PRIS) Network. Blood culture time to positivity in febrile infants with bacteremia. JAMA Pediatr. 2014;168(9):844-849. https://doi.org/10.1001/jamapediatrics.2014.895
9. Lefebvre CE, Renaud C, Chartrand C. Time to positivity of blood cultures in infants 0 to 90 days old presenting to the emergency department: is 36 hours enough? J Pediatric Infect Dis Soc. 2017;6(1):28-32. https://doi.org/10.1093/jpids/piv078
10. Unaka N, Statile A, Bensman R, et al. Cincinnati Children’s Hospital Medical Center. Evidence-based clinical care guideline for management of infants 0 to 60 days seen in emergency department for fever of unknown source. Guideline 10. 2019:1-42. http://www.cincinnatichildrens.org/service/j/anderson-center/evidence-based-care/recommendations/default/
11. White CM, Statile AM, White DL, et al. Using quality improvement to optimise paediatric discharge efficiency. BMJ Qual Saf. 2014;23(5):428-436. https://doi.org/10.1136/bmjqs-2013-002556
12. Benneyan JC, Lloyd RC, Plsek PE. Statistical process control as a tool for research and healthcare improvement. Qual Saf Health Care. 2003;12(6):458-464. https://doi.org/10.1136/qhc.12.6.458
13. Biondi EA, McCulloh R, Staggs VS, et al; American Academy of Pediatrics’ Revise Collaborative. Reducing variability in the infant sepsis evaluation (REVISE): a national quality initiative. Pediatrics. 2019;144(3): e20182201. https://doi.org/10.1542/peds.2018-2201
14. McCulloh RJ, Commers T, Williams DD, Michael J, Mann K, Newland JG. Effect of combined clinical practice guideline and electronic order set implementation on febrile infant evaluation and management. Pediatr Emerg Care. 2021;37(1):e25-e31. https://doi.org/10.1097/pec.0000000000002012
15. Foster LZ, Beiner J, Duh-Leong C, et al. Implementation of febrile infant management guidelines reduces hospitalization. Pediatr Qual Saf. 2020;5(1):e252. https://doi.org/10.1097/pq9.0000000000000252
16. Byington CL, Reynolds CC, Korgenski K, et al. Costs and infant outcomes after implementation of a care process model for febrile infants. Pediatrics. 2012;130(1):e16-e24. https://doi.org/10.1542/peds.2012-0127
17. Chang LV, Shah AN, Hoefgen ER, et al; H2O Study Group. Lost earnings and nonmedical expenses of pediatric hospitalizations. Pediatrics. 2018;142(3):e20180195. https://doi.org/10.1542/peds.2018-0195

Author and Disclosure Information

1Division of Hospital Medicine, Department of Pediatrics, Seattle Children’s Hospital, University of Washington School of Medicine, Seattle, Washington; 2Division of Hospital Medicine, Cincinnati Children’s Hospital Medical Center, Cincinnati, Ohio; 3Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, Ohio; 4Division of Pharmacy, Cincinnati Children’s Hospital Medical Center, Cincinnati, Ohio; 5Cincinnati Children’s Hospital Medical Center, Cincinnati, Ohio; 6Section of Hospital Medicine, Department of Pediatrics, University of Oklahoma Health Science Center, Oklahoma City, Oklahoma; 7Division of Hospital Medicine, Department of Pediatrics, University Hospital Rainbow Babies and Children’s Hospital, Cleveland, Ohio; 8Division of Infectious Diseases, Cincinnati Children’s Hospital Medical Center, Cincinnati, Ohio.

Disclosures
The authors have nothing to disclose.

Febrile infants aged 0 to 60 days often undergo diagnostic testing to evaluate for invasive bacterial infections (IBI; ie, bacteremia and meningitis) and are subsequently hospitalized pending culture results. Only 1% to 2% of infants 0 to 60 days old have an IBI,1-3 and most hospitalized infants are discharged once physicians feel confident that pathogens are unlikely to be isolated from blood and cerebrospinal fluid (CSF) cultures. Practice regarding duration of hospitalization while awaiting blood and CSF culture results is not standardized in this population. Longer hospitalizations can lead to increased costs and familial stress, including difficulty with breastfeeding and anxiety in newly postpartum mothers.4,5

In 2010, an institutional evidence-based guideline for the management of febrile infants aged 0 to 60 days recommended discharge after 36 hours of observation if all cultures were negative.6 However, recent studies demonstrate that 85% to 93% of pathogens in blood and CSF cultures grow within 24 hours of incubation.7-9 Assuming a 2% prevalence of IBI, if 15% of pathogens were identified after 24 hours of incubation, only one out of 333 infants would have an IBI identified after 24 hours of hospital observation.7

Furthermore, a review of our institution’s electronic health records (EHR) over the past 5 years revealed that an observation period of 24 hours would have resulted in the discharge of three infants with an IBI. Two infants had bacteremia; both were discharged from the emergency department (ED) without antibiotics, returned to care after cultures were reported positive at 27 hours, and had no adverse outcomes. The third infant had meningitis, but also had an abnormal CSF Gram stain, which led to a longer hospitalization.

In 2019, our institution appraised the emerging literature and institutional data supporting the low absolute risk of missed IBI, and also leveraged local consensus among key stakeholders to update its evidence-based guideline for the evaluation and management of febrile infants aged 60 days and younger. The updated guideline recommends that clinicians consider discharging well-appearing neonates and infants if blood and CSF cultures remain negative at 24 hours.10 The objective of this study was to decrease the average hospital culture observation time (COT; culture incubation to hospital discharge) from 38 to 30 hours over a 12-month period in febrile infants aged 0 to 60 days.

METHODS

Context

Improvement efforts were conducted at Cincinnati Children’s Hospital Medical Center (CCHMC), a large, urban, academic hospital that admitted more than 8,000 noncritically ill patients to the hospital medicine (HM) service from July 1, 2018, through June 30, 2019. Hospital medicine teams, located at both the main and satellite campuses, are staffed by attending physicians, fellows, residents, medical students, and nurse practitioners. The two campuses, which are about 20 miles apart, share clinician providers but have distinct nursing pools.

Microbiology services for all CCHMC patients are provided at the main campus. Blood and CSF cultures at the satellite campus are transported to the main campus for incubation and monitoring via an urgent courier service. The microbiology laboratory at CCHMC uses a continuous monitoring system for blood cultures (BACT/ALERT Virtuo, BioMérieux). The system automatically alerts laboratory technicians of positive cultures; these results are reported to clinical providers within 30 minutes of detection. Laboratory technicians manually evaluate CSF cultures once daily for 5 days.

Improvement Team

Our improvement team included three HM attending physicians; two HM fellows; a pediatric chief resident; two nurses, who represented nursing pools at the main and satellite campuses; and a clinical pharmacist, who is a co-leader of the antimicrobial stewardship program at CCHMC. Supporting members for the improvement team included the CCHMC laboratory director; the microbiology laboratory director; an infectious disease physician, who is a co-leader of the antimicrobial stewardship program; and nursing directors of the HM units at both campuses.

Evidence-Based Guideline

Our improvement initiative was based on recommendations from the updated CCHMC Evidence-Based Care Guideline for Management of Infants 0 to 60 days with Fever of Unknown Source.10 This guideline, published in May 2019, was developed by a multidisciplinary working group composed of key stakeholders from HM, community pediatrics, emergency medicine, the pediatric residency program, infectious disease, and laboratory medicine. Several improvement team members were participants on the committee that published the evidence-based guideline. The committee first performed a systematic literature review and critical appraisal of the literature. Care recommendations were formulated via a consensus process directed by best evidence, patient and family preferences, and clinical expertise; the recommendations were subsequently reviewed and approved by clinical experts who were not involved in the development process.

Based on evidence review and multistakeholder consensus, the updated guideline recommends clinicians consider discharging neonates and infants aged 60 days and younger if there is no culture growth after an observation period of 24 hours (as documented in the EHR) and patients are otherwise medically ready for discharge (ie, well appearing with adequate oral intake).10,11 In addition, prior to discharge, there must be a documented working phone number on file for the patient’s parents/guardians, an established outpatient follow-up plan within 24 hours, and communication with the primary pediatrician who is in agreement with discharge at 24 hours.

Study Population

Infants 0 to 60 days old who had a documented or reported fever without an apparent source based on history and physical exam upon presentation to the ED, and who were subsequently admitted to the HM service at CCHMC between October 30, 2018, and July 10, 2020, were eligible for inclusion. We excluded infants who were admitted to other clinical services (eg, intensive care unit); had organisms identified on blood, urine, or CSF culture within 24 hours of incubation; had positive herpes simplex virus testing; had skin/soft tissue infections or another clearly documented source of bacterial infection; or had an alternative indication for hospitalization (eg, need for intravenous fluid or deep suctioning) after cultures had incubated for 24 hours. Infants who had a positive blood, urine, or CSF culture result after 24 hours of incubation were included in the study population. Organisms were classified as pathogen or contaminant based on treatment decisions made by the care team.

Improvement Activities

Key drivers critical to success of the improvement efforts were: (1) clearly defined standard of care for duration of observation in febrile infants 0 to 60 days old; (2) improved understanding of microbiology lab procedures; (3) effective communication of discharge criteria between providers and nurses; and (4) transparency of data with feedback (Figure 1).

Key Driver Diagram Detailing Essential Drivers and Interventions Aimed at Reducing Culture Observation Time in Infants Aged 60 Days and Younger Hospitalized With Fever
The corresponding interventions were executed using Plan-Do-Study-Act (PDSA) cycles as follows:

Education and Structured Dissemination of Evidence-Based Guideline

The CCHMC febrile infant guideline10 was disseminated to HM physicians, residents, and nurses via the following means: (1) in-person announcements at staff meetings and educational conferences, (2) published highlights from the guideline in weekly newsletters, and (3) email announcements. Additionally, members of the study team educated HM attending physicians, nursing staff from the medical units at both campuses, and resident physicians about recent studies demonstrating safety of shorter length of stay (LOS) in febrile infants aged 0 to 60 days. The study team also provided residents, physicians, and nurses with data on the number of positive blood and CSF cultures and outcomes of patients at CCHMC within the past 5 years. In addition, team members led a journal club for residents discussing an article7 describing time-to-positivity of blood and CSF cultures in febrile infants. For ongoing engagement, the evidence-based guideline and a detailed explanation of microbiology procedures were published in the resident handbook, an internal resource that includes vital clinical pearls and practice guidelines across specialties. (Each resident receives an updated hard copy each year, and there is also an online link to the resource in the EHR.) Information about the guideline and COT was also included in the monthly chief resident’s orientation script, which is relayed to all residents on the first day of their HM rotation.

Clear Communication of Microbiology Procedures

Team members created a detailed process map describing the processing protocols for blood and CSF cultures collected at both CCHMC campuses. This information was shared with HM attending physicians and nurses via in-person announcements at staff meetings, flyers in team workrooms, and email communications. Residents received information on microbiology protocols via in-person announcements at educational conferences and dissemination in the weekly residency newsletter.Important information communicated included:

1. Definition of culture start time. We conveyed that there may be a delay of up to 4 hours between culture collection at the satellite campus and culture incubation at the main campus laboratory. As a result, the time of blood or CSF sample arrival to the main campus laboratory was a more accurate reflection of the culture incubation start time than the culture collection time.

2. Explanation of CSF culture processing. We discussed the process by which these cultures are plated upon arrival at the microbiology laboratory and read once per day in the morning. Therefore, a culture incubated at midnight would be evaluated once at 9 hours and not again until 33 hours.

Modification of Febrile Infant Order Set

Enhancements to the febrile infant order set improved communication and cultivated a shared mental model regarding discharge goals among all members of the care team. The EHR order set for febrile infants was updated as follows: (1) mandatory free-text fields that established the culture start time for blood and CSF cultures were added, (2) culture start time was clearly defined (ie, the time culture arrives at the main campus laboratory), and (3) a change was made in the default discharge criteria11 to “culture observation for 24 hours,” with the ability to modify COT (Appendix Figure 1). We embedded hyperlinks to the guideline and microbiology process map within the updated order set, which allowed providers to easily access this information and refresh their knowledge of the recommendations (Appendix Figure 1).

Identification of Failures and Follow-up With Near-Time Feedback

All cases of febrile infants were tracked weekly. For infants hospitalized longer than 24 hours, the study team contacted the discharging clinicians to discuss reasons for prolonged hospitalization, with an emphasis on identifying system-level barriers to earlier discharge.

Study of the Interventions

The institutional microbiology database was queried weekly to identify all infants 0 to 60 days old who had a blood culture obtained and were hospitalized on the HM service. Study team members conducted targeted EHR review to determine whether patients met exclusion criteria and to identify reasons for prolonged COT. Baseline data were collected retrospectively for a 3-month period prior to initiation of improvement activities. During the study period, queries were conducted weekly and reviewed by study team members to evaluate the impact of improvement activities and to inform new interventions.

Measures

Our primary outcome measure was COT, defined as the hours between final culture incubation and hospital discharge. The operational definition for “final culture incubation” was the documented time of arrival of the last collected culture to the microbiology laboratory. Our goal COT was 30 hours to account for a subset of patients whose blood and/or CSF culture were obtained overnight (ie, after 9 pm), since subsequent discharge times would likely and practically be delayed beyond 24 hours. Our secondary outcome measure was LOS, defined as the time between ED arrival and hospital discharge. Process measures included the proportion of patients for whom the febrile infant EHR order set was used and the proportion of patients for whom medical discharge criteria (ie, blood and CSF culture observed for ”xx” hours) and culture incubation start times were entered using the order set. Balancing measures included identification of IBI after hospital discharge, 48-hour ED revisits, and 7-day hospital readmissions.

Analysis

Measures were evaluated using statistical process control charts and run charts, and Western Electric rules were employed to determine special cause variation.12 Annotated X-bar S control charts tracked the impact of improvement activities on average COT and LOS for all infants. Given that a relatively small number of patients (ie, two to four) met inclusion criteria each week, average COT was calculated per five patients.

This study was considered exempt from review by the CCHMC Institutional Review Board.

RESULTS

Of the 184 infants in this study, 46 were included as part of baseline data collection, and 138 were included during the intervention period. The median age was 26.6 days (range, 3-59 days); 52% of patients were female; two-thirds were non-Hispanic White; 22% were Black, and 5% were Hispanic (Appendix Table).

Average COT decreased from 38 hours to 32 hours with improvement activities (Figure 2) and was sustained for a total of 17 months. There were small decreases in COT after initial education was provided to attendings, nurses, and residents.

X-Bar S Control Chart Displaying Average Culture Observation Time per Five Admitted Febrile Infants Aged 60 Days and Younger
However, the greatest sustained decreases in COT occurred after dissemination of the published evidence-based guideline and standardization of the EHR order set. Average LOS decreased from 42 hours to 36 hours (Figure 3). Among the total cohort, 34% of infants were admitted to the satellite campus. At the satellite and main campuses, median COT was 28 hours and 35 hours, respectively (Appendix Figure 2).

X-Bar S Control Chart Displaying Average Length of Stay From Emergency Department Arrival to Hospital Discharge per Five Admitted Febrile Infants Aged 60 Days and Younger

After the launch of the updated order set, median usage of the EHR order set increased from 50% to 80%. Medical discharge criteria were entered for 80 (96%) of the 83 patients for whom the updated order set was applied; culture incubation start times were entered for 78 (94%) of these patients.

No infants in our cohort were found to have IBI after hospital discharge. There were no ED revisits within 48 hours of discharge, and there were no hospital readmissions within 7 days of index discharge. Furthermore, none of the patients included in the study had growth of a pathogenic organism after 24 hours.

Of the 138 infants hospitalized during the intervention period, 77 (56%) had a COT greater than 30 hours. Among these 77 patients, 49 (64%) had their final culture incubated between 9 pm and 4 am; Furthermore, 11 (14%) had missing, abnormal, pretreated, or uninterpretable CSF studies, 7 (9%) had ongoing fevers, and 4 (5%) remained hospitalized due to family preference or inability to obtain timely outpatient follow-up.

DISCUSSION

Our study aimed to decrease the average COT from 38 hours to 30 hours among hospitalized infants aged 60 days and younger over a period of 12 months. An intervention featuring implementation of an evidence-based guideline through education, laboratory procedure transparency, creation of a standardized EHR order set, and near-time feedback was associated with a shorter average COT of 32 hours, sustained over a 17-month period. No infants with bacteremia or meningitis were inappropriately discharged during this study.

Interpretation

Prior to our improvement efforts, most febrile infants at CCHMC were observed for at least 36 hours based on a prior institutional guideline,6 despite recent evidence suggesting that most pathogens in blood and CSF cultures grow within 24 hours of incubation.7-9 The goal of this improvement initiative was to bridge the gap between emerging evidence and clinical practice by developing and disseminating an updated evidence-based guideline to safely decrease the hospital observation time in febrile infants aged 60 days and younger.

Similar to previous studies aimed at improving diagnosis and management among febrile infants,13-16 generation and structured dissemination of an institutional evidence-based guideline was crucial to safely shortening COT in our population. These prior studies established a goal COT of 36 to 42 hours for hospitalized febrile infants.13,15,16 Our study incorporated emerging evidence and local experience into an updated evidence-based practice guideline to further reduce COT to 32 hours for hospitalized infants. Key factors contributing to our success included multidisciplinary engagement, specifically partnering with nurses and resident physicians in designing and implementing our initiatives. Furthermore, improved transparency of culture monitoring practices allowed clinicians to better understand the recommended observation periods. Finally, we employed a standardized EHR order set as a no-cost, one-time, high-reliability intervention to establish 24 hours of culture monitoring as the default and to enhance transparency around start time for culture incubation.

Average COT remained stable at 32 hours for 17 months after initiation of the intervention. During the intervention period, 64% patients with hospital stays longer than 30 hours had cultures obtained between 9 pm to 4 am. These patients often remained hospitalized for longer than 30 hours to allow for a daytime hospital discharge. Additionally, CSF cultures were only monitored manually once per day between 8 am and 10 am. As a result, CSF cultures obtained in the evening (eg, 9 pm) would be evaluated once at roughly 12 hours of incubation, and then the following morning at 36 hours of incubation. In cases where CSF studies (eg, cell count, protein, Gram stain) were abnormal, uninterpretable, or could not be obtained, clinicians monitored CSF cultures closer to 36 hours from incubation. While evidence-based guidelines and local data support safe early discharge of febrile infants, clinicians presented with incomplete or uninterpretable data were appropriately more likely to observe infants for longer periods to confirm negative cultures.

Limitations

The study has several limitations. First, this single-center study was conducted at a quaternary care medical center with a robust quality improvement infrastructure. Our interventions took advantage of the existing processes in place that ensure timely discharge of medically ready patients.11 Furthermore, microbiology laboratory practices are unique to our institution. These factors limit the generalizability of this work. Second, due to small numbers of eligible infants, analyses were conducted per five patients. Infrequent hospitalizations limited our ability to learn quickly from PDSA cycles. Finally, we did not measure cost savings attributable to shorter hospital stays. However, in addition to financial savings from charges and decreased nonmedical costs such as lost earnings and childcare,17 shorter hospitalizations have many additional benefits, such as promoting bonding and breastfeeding and decreasing exposure to nosocomial infections. Shorter hospitalizations, with clearly communicated discharge times, also serve to optimize patient throughput.

CONCLUSION

Implementation of a clinical practice guideline resulted in reduction of average COT from 38 to 32 hours in febrile infants aged 60 days and younger, with no cases of missed IBI. Engagement of multidisciplinary stakeholders in the generation and structured dissemination of the evidence-based guideline, improved transparency of the microbiological blood and CSF culture process, and standardization of EHR order sets were crucial to the success of this work. Cultures incubated overnight and daily CSF culture-monitoring practices primarily contributed to an average LOS of more than 30 hours.

Future work will include collaboration with emergency physicians to improve evaluation efficiency and decrease LOS in the ED for febrile infants. Additionally, creation of an automated data dashboard of COT and LOS will provide clinicians with real-time feedback on hospitalization practices.

Acknowledgments

The authors thank Dr Jeffrey Simmons, MD, MSc, as well as the members of the 2019 Fever of Uncertain Source Evidence-Based Guideline Committee. We also thank the James M Anderson Center for Health System Excellence and the Rapid Cycle Improvement Collaborative for their support with guideline development as well as design and execution of our improvement efforts.

Febrile infants aged 0 to 60 days often undergo diagnostic testing to evaluate for invasive bacterial infections (IBI; ie, bacteremia and meningitis) and are subsequently hospitalized pending culture results. Only 1% to 2% of infants 0 to 60 days old have an IBI,1-3 and most hospitalized infants are discharged once physicians feel confident that pathogens are unlikely to be isolated from blood and cerebrospinal fluid (CSF) cultures. Practice regarding duration of hospitalization while awaiting blood and CSF culture results is not standardized in this population. Longer hospitalizations can lead to increased costs and familial stress, including difficulty with breastfeeding and anxiety in newly postpartum mothers.4,5

In 2010, an institutional evidence-based guideline for the management of febrile infants aged 0 to 60 days recommended discharge after 36 hours of observation if all cultures were negative.6 However, recent studies demonstrate that 85% to 93% of pathogens in blood and CSF cultures grow within 24 hours of incubation.7-9 Assuming a 2% prevalence of IBI, if 15% of pathogens were identified after 24 hours of incubation, only one out of 333 infants would have an IBI identified after 24 hours of hospital observation.7

Furthermore, a review of our institution’s electronic health records (EHR) over the past 5 years revealed that an observation period of 24 hours would have resulted in the discharge of three infants with an IBI. Two infants had bacteremia; both were discharged from the emergency department (ED) without antibiotics, returned to care after cultures were reported positive at 27 hours, and had no adverse outcomes. The third infant had meningitis, but also had an abnormal CSF Gram stain, which led to a longer hospitalization.

In 2019, our institution appraised the emerging literature and institutional data supporting the low absolute risk of missed IBI, and also leveraged local consensus among key stakeholders to update its evidence-based guideline for the evaluation and management of febrile infants aged 60 days and younger. The updated guideline recommends that clinicians consider discharging well-appearing neonates and infants if blood and CSF cultures remain negative at 24 hours.10 The objective of this study was to decrease the average hospital culture observation time (COT; culture incubation to hospital discharge) from 38 to 30 hours over a 12-month period in febrile infants aged 0 to 60 days.

METHODS

Context

Improvement efforts were conducted at Cincinnati Children’s Hospital Medical Center (CCHMC), a large, urban, academic hospital that admitted more than 8,000 noncritically ill patients to the hospital medicine (HM) service from July 1, 2018, through June 30, 2019. Hospital medicine teams, located at both the main and satellite campuses, are staffed by attending physicians, fellows, residents, medical students, and nurse practitioners. The two campuses, which are about 20 miles apart, share clinician providers but have distinct nursing pools.

Microbiology services for all CCHMC patients are provided at the main campus. Blood and CSF cultures at the satellite campus are transported to the main campus for incubation and monitoring via an urgent courier service. The microbiology laboratory at CCHMC uses a continuous monitoring system for blood cultures (BACT/ALERT Virtuo, BioMérieux). The system automatically alerts laboratory technicians of positive cultures; these results are reported to clinical providers within 30 minutes of detection. Laboratory technicians manually evaluate CSF cultures once daily for 5 days.

Improvement Team

Our improvement team included three HM attending physicians; two HM fellows; a pediatric chief resident; two nurses, who represented nursing pools at the main and satellite campuses; and a clinical pharmacist, who is a co-leader of the antimicrobial stewardship program at CCHMC. Supporting members for the improvement team included the CCHMC laboratory director; the microbiology laboratory director; an infectious disease physician, who is a co-leader of the antimicrobial stewardship program; and nursing directors of the HM units at both campuses.

Evidence-Based Guideline

Our improvement initiative was based on recommendations from the updated CCHMC Evidence-Based Care Guideline for Management of Infants 0 to 60 days with Fever of Unknown Source.10 This guideline, published in May 2019, was developed by a multidisciplinary working group composed of key stakeholders from HM, community pediatrics, emergency medicine, the pediatric residency program, infectious disease, and laboratory medicine. Several improvement team members were participants on the committee that published the evidence-based guideline. The committee first performed a systematic literature review and critical appraisal of the literature. Care recommendations were formulated via a consensus process directed by best evidence, patient and family preferences, and clinical expertise; the recommendations were subsequently reviewed and approved by clinical experts who were not involved in the development process.

Based on evidence review and multistakeholder consensus, the updated guideline recommends clinicians consider discharging neonates and infants aged 60 days and younger if there is no culture growth after an observation period of 24 hours (as documented in the EHR) and patients are otherwise medically ready for discharge (ie, well appearing with adequate oral intake).10,11 In addition, prior to discharge, there must be a documented working phone number on file for the patient’s parents/guardians, an established outpatient follow-up plan within 24 hours, and communication with the primary pediatrician who is in agreement with discharge at 24 hours.

Study Population

Infants 0 to 60 days old who had a documented or reported fever without an apparent source based on history and physical exam upon presentation to the ED, and who were subsequently admitted to the HM service at CCHMC between October 30, 2018, and July 10, 2020, were eligible for inclusion. We excluded infants who were admitted to other clinical services (eg, intensive care unit); had organisms identified on blood, urine, or CSF culture within 24 hours of incubation; had positive herpes simplex virus testing; had skin/soft tissue infections or another clearly documented source of bacterial infection; or had an alternative indication for hospitalization (eg, need for intravenous fluid or deep suctioning) after cultures had incubated for 24 hours. Infants who had a positive blood, urine, or CSF culture result after 24 hours of incubation were included in the study population. Organisms were classified as pathogen or contaminant based on treatment decisions made by the care team.

Improvement Activities

Key drivers critical to success of the improvement efforts were: (1) clearly defined standard of care for duration of observation in febrile infants 0 to 60 days old; (2) improved understanding of microbiology lab procedures; (3) effective communication of discharge criteria between providers and nurses; and (4) transparency of data with feedback (Figure 1).

Key Driver Diagram Detailing Essential Drivers and Interventions Aimed at Reducing Culture Observation Time in Infants Aged 60 Days and Younger Hospitalized With Fever
The corresponding interventions were executed using Plan-Do-Study-Act (PDSA) cycles as follows:

Education and Structured Dissemination of Evidence-Based Guideline

The CCHMC febrile infant guideline10 was disseminated to HM physicians, residents, and nurses via the following means: (1) in-person announcements at staff meetings and educational conferences, (2) published highlights from the guideline in weekly newsletters, and (3) email announcements. Additionally, members of the study team educated HM attending physicians, nursing staff from the medical units at both campuses, and resident physicians about recent studies demonstrating safety of shorter length of stay (LOS) in febrile infants aged 0 to 60 days. The study team also provided residents, physicians, and nurses with data on the number of positive blood and CSF cultures and outcomes of patients at CCHMC within the past 5 years. In addition, team members led a journal club for residents discussing an article7 describing time-to-positivity of blood and CSF cultures in febrile infants. For ongoing engagement, the evidence-based guideline and a detailed explanation of microbiology procedures were published in the resident handbook, an internal resource that includes vital clinical pearls and practice guidelines across specialties. (Each resident receives an updated hard copy each year, and there is also an online link to the resource in the EHR.) Information about the guideline and COT was also included in the monthly chief resident’s orientation script, which is relayed to all residents on the first day of their HM rotation.

Clear Communication of Microbiology Procedures

Team members created a detailed process map describing the processing protocols for blood and CSF cultures collected at both CCHMC campuses. This information was shared with HM attending physicians and nurses via in-person announcements at staff meetings, flyers in team workrooms, and email communications. Residents received information on microbiology protocols via in-person announcements at educational conferences and dissemination in the weekly residency newsletter.Important information communicated included:

1. Definition of culture start time. We conveyed that there may be a delay of up to 4 hours between culture collection at the satellite campus and culture incubation at the main campus laboratory. As a result, the time of blood or CSF sample arrival to the main campus laboratory was a more accurate reflection of the culture incubation start time than the culture collection time.

2. Explanation of CSF culture processing. We discussed the process by which these cultures are plated upon arrival at the microbiology laboratory and read once per day in the morning. Therefore, a culture incubated at midnight would be evaluated once at 9 hours and not again until 33 hours.

Modification of Febrile Infant Order Set

Enhancements to the febrile infant order set improved communication and cultivated a shared mental model regarding discharge goals among all members of the care team. The EHR order set for febrile infants was updated as follows: (1) mandatory free-text fields that established the culture start time for blood and CSF cultures were added, (2) culture start time was clearly defined (ie, the time culture arrives at the main campus laboratory), and (3) a change was made in the default discharge criteria11 to “culture observation for 24 hours,” with the ability to modify COT (Appendix Figure 1). We embedded hyperlinks to the guideline and microbiology process map within the updated order set, which allowed providers to easily access this information and refresh their knowledge of the recommendations (Appendix Figure 1).

Identification of Failures and Follow-up With Near-Time Feedback

All cases of febrile infants were tracked weekly. For infants hospitalized longer than 24 hours, the study team contacted the discharging clinicians to discuss reasons for prolonged hospitalization, with an emphasis on identifying system-level barriers to earlier discharge.

Study of the Interventions

The institutional microbiology database was queried weekly to identify all infants 0 to 60 days old who had a blood culture obtained and were hospitalized on the HM service. Study team members conducted targeted EHR review to determine whether patients met exclusion criteria and to identify reasons for prolonged COT. Baseline data were collected retrospectively for a 3-month period prior to initiation of improvement activities. During the study period, queries were conducted weekly and reviewed by study team members to evaluate the impact of improvement activities and to inform new interventions.

Measures

Our primary outcome measure was COT, defined as the hours between final culture incubation and hospital discharge. The operational definition of "final culture incubation" was the documented time of arrival of the last collected culture to the microbiology laboratory. Our goal COT was 30 hours to account for the subset of patients whose blood and/or CSF cultures were obtained overnight (ie, after 9 pm), since subsequent discharge times would, as a practical matter, be delayed beyond 24 hours. Our secondary outcome measure was LOS, defined as the time between ED arrival and hospital discharge. Process measures included the proportion of patients for whom the febrile infant EHR order set was used and the proportion of patients for whom medical discharge criteria (ie, blood and CSF cultures observed for "xx" hours) and culture incubation start times were entered using the order set. Balancing measures included identification of IBI after hospital discharge, 48-hour ED revisits, and 7-day hospital readmissions.
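As a minimal sketch of how COT and LOS could be computed from encounter timestamps (the table and column names below are hypothetical, not those of the CCHMC data systems):

```python
import pandas as pd

# Hypothetical encounter timestamps; column names are illustrative only.
encounters = pd.DataFrame({
    "ed_arrival":        pd.to_datetime(["2020-01-01 18:00"]),
    "final_culture_lab": pd.to_datetime(["2020-01-01 23:30"]),  # last culture's arrival at main lab
    "discharge":         pd.to_datetime(["2020-01-03 08:00"]),
})

# COT: hours between final culture incubation (lab arrival) and hospital discharge.
encounters["cot_hours"] = (encounters["discharge"]
                           - encounters["final_culture_lab"]).dt.total_seconds() / 3600
# LOS: hours between ED arrival and hospital discharge.
encounters["los_hours"] = (encounters["discharge"]
                           - encounters["ed_arrival"]).dt.total_seconds() / 3600

print(encounters[["cot_hours", "los_hours"]])  # COT 32.5 h, LOS 38.0 h
```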

Analysis

Measures were evaluated using statistical process control charts and run charts, and Western Electric rules were employed to determine special cause variation.12 Annotated X-bar S control charts tracked the impact of improvement activities on average COT and LOS for all infants. Given that a relatively small number of patients (ie, two to four) met inclusion criteria each week, average COT was calculated per five patients.
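The subgrouping step can be sketched as follows; the constants are the standard X-bar S control-chart factors for subgroups of five, and the simulated COT values are assumptions used only to show the mechanics, not the authors' analysis code.

```python
import numpy as np

def xbar_s_chart(values, subgroup_size=5):
    """Group consecutive observations into subgroups and compute
    X-bar S control-chart centerlines and limits (constants for n=5)."""
    values = np.asarray(values, dtype=float)
    n_sub = len(values) // subgroup_size
    groups = values[: n_sub * subgroup_size].reshape(n_sub, subgroup_size)
    xbars = groups.mean(axis=1)          # subgroup means (X-bar chart points)
    sds = groups.std(axis=1, ddof=1)     # subgroup standard deviations (S chart points)
    xbar_bar, s_bar = xbars.mean(), sds.mean()
    A3, B3, B4 = 1.427, 0.0, 2.089       # standard factors for subgroup size 5
    limits = {
        "xbar_center": xbar_bar,
        "xbar_ucl": xbar_bar + A3 * s_bar,
        "xbar_lcl": xbar_bar - A3 * s_bar,
        "s_center": s_bar,
        "s_ucl": B4 * s_bar,
        "s_lcl": B3 * s_bar,
    }
    return xbars, sds, limits

# Example: simulated COT values (hours) for 20 consecutive patients.
rng = np.random.default_rng(0)
cot = rng.normal(34, 6, size=20)
print(xbar_s_chart(cot)[2])
```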

This study was considered exempt from review by the CCHMC Institutional Review Board.

RESULTS

Of the 184 infants in this study, 46 were included as part of baseline data collection, and 138 were included during the intervention period. The median age was 26.6 days (range, 3-59 days); 52% of patients were female, two-thirds were non-Hispanic White, 22% were Black, and 5% were Hispanic (Appendix Table).

Average COT decreased from 38 hours to 32 hours with improvement activities (Figure 2) and was sustained for a total of 17 months. There were small decreases in COT after initial education was provided to attendings, nurses, and residents.

X-Bar S Control Chart Displaying Average Culture Observation Time per Five Admitted Febrile Infants Aged 60 Days and Younger
However, the greatest sustained decreases in COT occurred after dissemination of the published evidence-based guideline and standardization of the EHR order set. Average LOS decreased from 42 hours to 36 hours (Figure 3). Among the total cohort, 34% of infants were admitted to the satellite campus. At the satellite and main campuses, median COT was 28 hours and 35 hours, respectively (Appendix Figure 2).

X-Bar S Control Chart Displaying Average Length of Stay From Emergency Department Arrival to Hospital Discharge per Five Admitted Febrile Infants Aged 60 Days and Younger

After the launch of the updated order set, median usage of the EHR order set increased from 50% to 80%. Medical discharge criteria were entered for 80 (96%) of the 83 patients for whom the updated order set was applied; culture incubation start times were entered for 78 (94%) of these patients.

No infants in our cohort were found to have IBI after hospital discharge. There were no ED revisits within 48 hours of discharge, and there were no hospital readmissions within 7 days of index discharge. Furthermore, none of the patients included in the study had growth of a pathogenic organism after 24 hours.

Of the 138 infants hospitalized during the intervention period, 77 (56%) had a COT greater than 30 hours. Among these 77 patients, 49 (64%) had their final culture incubated between 9 pm and 4 am; 11 (14%) had missing, abnormal, pretreated, or uninterpretable CSF studies; 7 (9%) had ongoing fevers; and 4 (5%) remained hospitalized because of family preference or inability to obtain timely outpatient follow-up.

DISCUSSION

Our study aimed to decrease the average COT from 38 hours to 30 hours among hospitalized infants aged 60 days and younger over a period of 12 months. An intervention featuring implementation of an evidence-based guideline through education, laboratory procedure transparency, creation of a standardized EHR order set, and near-time feedback was associated with a shorter average COT of 32 hours, sustained over a 17-month period. No infants with bacteremia or meningitis were inappropriately discharged during this study.

Interpretation

Prior to our improvement efforts, most febrile infants at CCHMC were observed for at least 36 hours based on a prior institutional guideline,6 despite recent evidence suggesting that most pathogens in blood and CSF cultures grow within 24 hours of incubation.7-9 The goal of this improvement initiative was to bridge the gap between emerging evidence and clinical practice by developing and disseminating an updated evidence-based guideline to safely decrease the hospital observation time in febrile infants aged 60 days and younger.

Similar to previous studies aimed at improving diagnosis and management among febrile infants,13-16 generation and structured dissemination of an institutional evidence-based guideline was crucial to safely shortening COT in our population. These prior studies established a goal COT of 36 to 42 hours for hospitalized febrile infants.13,15,16 Our study incorporated emerging evidence and local experience into an updated evidence-based practice guideline to further reduce COT to 32 hours for hospitalized infants. Key factors contributing to our success included multidisciplinary engagement, specifically partnering with nurses and resident physicians in designing and implementing our initiatives. Furthermore, improved transparency of culture monitoring practices allowed clinicians to better understand the recommended observation periods. Finally, we employed a standardized EHR order set as a no-cost, one-time, high-reliability intervention to establish 24 hours of culture monitoring as the default and to enhance transparency around start time for culture incubation.

Average COT remained stable at 32 hours for 17 months after initiation of the intervention. During the intervention period, 64% of patients with a COT greater than 30 hours had their final culture incubated between 9 pm and 4 am. These patients often remained hospitalized for longer than 30 hours to allow for a daytime hospital discharge. Additionally, CSF cultures were only monitored manually once per day between 8 am and 10 am. As a result, CSF cultures obtained in the evening (eg, 9 pm) would be evaluated once at roughly 12 hours of incubation, and then the following morning at 36 hours of incubation. In cases where CSF studies (eg, cell count, protein, Gram stain) were abnormal, uninterpretable, or could not be obtained, clinicians monitored CSF cultures closer to 36 hours from incubation. While evidence-based guidelines and local data support safe early discharge of febrile infants, clinicians presented with incomplete or uninterpretable data were appropriately more likely to observe infants for longer periods to confirm negative cultures.

Limitations

The study has several limitations. First, this single-center study was conducted at a quaternary care medical center with a robust quality improvement infrastructure. Our interventions took advantage of the existing processes in place that ensure timely discharge of medically ready patients.11 Furthermore, microbiology laboratory practices are unique to our institution. These factors limit the generalizability of this work. Second, due to small numbers of eligible infants, analyses were conducted per five patients. Infrequent hospitalizations limited our ability to learn quickly from PDSA cycles. Finally, we did not measure cost savings attributable to shorter hospital stays. However, in addition to financial savings from charges and decreased nonmedical costs such as lost earnings and childcare,17 shorter hospitalizations have many additional benefits, such as promoting bonding and breastfeeding and decreasing exposure to nosocomial infections. Shorter hospitalizations, with clearly communicated discharge times, also serve to optimize patient throughput.

CONCLUSION

Implementation of a clinical practice guideline resulted in reduction of average COT from 38 to 32 hours in febrile infants aged 60 days and younger, with no cases of missed IBI. Engagement of multidisciplinary stakeholders in the generation and structured dissemination of the evidence-based guideline, improved transparency of the microbiological blood and CSF culture process, and standardization of EHR order sets were crucial to the success of this work. Cultures incubated overnight and daily CSF culture-monitoring practices were the primary contributors to an average COT of more than 30 hours.

Future work will include collaboration with emergency physicians to improve evaluation efficiency and decrease LOS in the ED for febrile infants. Additionally, creation of an automated data dashboard of COT and LOS will provide clinicians with real-time feedback on hospitalization practices.

Acknowledgments

The authors thank Jeffrey Simmons, MD, MSc, as well as the members of the 2019 Fever of Uncertain Source Evidence-Based Guideline Committee. We also thank the James M Anderson Center for Health System Excellence and the Rapid Cycle Improvement Collaborative for their support with guideline development as well as design and execution of our improvement efforts.

References

1. Cruz AT, Mahajan P, Bonsu BK, et al. Accuracy of complete blood cell counts to identify febrile infants 60 days or younger with invasive bacterial infections. JAMA Pediatr. 2017;171(11):e172927. https://doi.org/10.1001/jamapediatrics.2017.2927
2. Kuppermann N, Dayan PS, Levine DA, et al; Febrile Infant Working Group of the Pediatric Emergency Care Applied Research Network (PECARN). A clinical prediction rule to identify febrile infants 60 days and younger at low risk for serious bacterial infections. JAMA Pediatr. 2019;173(4):342-351. https://doi.org/10.1001/jamapediatrics.2018.5501
3. Nigrovic LE, Mahajan PV, Blumberg SM, et al; Febrile Infant Working Group of the Pediatric Emergency Care Applied Research Network (PECARN). The Yale Observation Scale Score and the risk of serious bacterial infections in febrile infants. Pediatrics. 2017;140(1):e20170695. https://doi.org/10.1542/peds.2017-0695
4. De S, Tong A, Isaacs D, Craig JC. Parental perspectives on evaluation and management of fever in young infants: an interview study. Arch Dis Child. 2014;99(8):717-723. https://doi.org/10.1136/archdischild-2013-305736
5. Paxton RD, Byington CL. An examination of the unintended consequences of the rule-out sepsis evaluation: a parental perspective. Clin Pediatr (Phila). 2001;40(2):71-77. https://doi.org/10.1177/000992280104000202
6. FUS Team. Cincinnati Children’s Hospital Medical Center. Evidence-based clinical care guideline for fever of uncertain source in infants 60 days of age or less. Guideline 2. 2010:1-4.
7. Aronson PL, Wang ME, Nigrovic LE, et al; Febrile Young Infant Research Collaborative. Time to pathogen detection for non-ill versus ill-appearing infants ≤60 days old with bacteremia and meningitis. Hosp Pediatr. 2018;8(7):379-384. https://doi.org/10.1542/hpeds.2018-0002
8. Biondi EA, Mischler M, Jerardi KE, et al; Pediatric Research in Inpatient Settings (PRIS) Network. Blood culture time to positivity in febrile infants with bacteremia. JAMA Pediatr. 2014;168(9):844-849. https://doi.org/10.1001/jamapediatrics.2014.895
9. Lefebvre CE, Renaud C, Chartrand C. Time to positivity of blood cultures in infants 0 to 90 days old presenting to the emergency department: is 36 hours enough? J Pediatric Infect Dis Soc. 2017;6(1):28-32. https://doi.org/10.1093/jpids/piv078
10. Unaka N, Statile A, Bensman R, et al. Cincinnati Children’s Hospital Medical Center. Evidence-based clinical care guideline for management of infants 0 to 60 days seen in emergency department for fever of unknown source. Guideline 10. 2019:1-42. http://www.cincinnatichildrens.org/service/j/anderson-center/evidence-based-care/recommendations/default/
11. White CM, Statile AM, White DL, et al. Using quality improvement to optimise paediatric discharge efficiency. BMJ Qual Saf. 2014;23(5):428-436. https://doi.org/10.1136/bmjqs-2013-002556
12. Benneyan JC, Lloyd RC, Plsek PE. Statistical process control as a tool for research and healthcare improvement. Qual Saf Health Care. 2003;12(6):458-464. https://doi.org/10.1136/qhc.12.6.458
13. Biondi EA, McCulloh R, Staggs VS, et al; American Academy of Pediatrics’ Revise Collaborative. Reducing variability in the infant sepsis evaluation (REVISE): a national quality initiative. Pediatrics. 2019;144(3): e20182201. https://doi.org/10.1542/peds.2018-2201
14. McCulloh RJ, Commers T, Williams DD, Michael J, Mann K, Newland JG. Effect of combined clinical practice guideline and electronic order set implementation on febrile infant evaluation and management. Pediatr Emerg Care. 2021;37(1):e25-e31. https://doi.org/10.1097/pec.0000000000002012
15. Foster LZ, Beiner J, Duh-Leong C, et al. Implementation of febrile infant management guidelines reduces hospitalization. Pediatr Qual Saf. 2020;5(1):e252. https://doi.org/10.1097/pq9.0000000000000252
16. Byington CL, Reynolds CC, Korgenski K, et al. Costs and infant outcomes after implementation of a care process model for febrile infants. Pediatrics. 2012;130(1):e16-e24. https://doi.org/10.1542/peds.2012-0127
17. Chang LV, Shah AN, Hoefgen ER, et al; H2O Study Group. Lost earnings and nonmedical expenses of pediatric hospitalizations. Pediatrics. 2018;142(3):e20180195. https://doi.org/10.1542/peds.2018-0195

Issue
Journal of Hospital Medicine 16(5)
Page Number
267-273. Published Online First April 20, 2021
Display Headline
Decreasing Hospital Observation Time for Febrile Infants
Article Source
© 2021 Society of Hospital Medicine
Correspondence Location
Sanyukta Desai, MD; Email: sanyukta.desai@seattlechildrens.org; Telephone: 206-987-7370.

Nine Seasons of a Bronchiolitis Observation Unit and Home Oxygen Therapy Protocol

Article Type
Changed
Tue, 04/27/2021 - 10:11
Display Headline
Nine Seasons of a Bronchiolitis Observation Unit and Home Oxygen Therapy Protocol

Bronchiolitis is the leading cause of hospitalization in infants aged <1 year in the United States.1-3 Estimates suggest that 1.5% to 2.0% of US infants require hospitalization every year, with a median (interquartile range) length of stay of 2 days (1-4),3 incurring direct medical costs of $555 million annually.1 Evidence suggests that few interventions, aside from supportive care, are effective for bronchiolitis.4-7 Adherence to standardized clinical guidelines could improve outcomes and resource use by streamlining care and limiting ineffective interventions, thereby decreasing hospital length of stay, which is a major medical cost.8-13 For this reason, many hospitals have adopted bronchiolitis guidelines, although institutional practices vary.14,15

Two relatively unexplored methods to reduce the inpatient burden of bronchiolitis are the use of observation units (OU) and home oxygen therapy (HOT). Motivated by research demonstrating the safety and effectiveness of an emergency department (ED)–based HOT protocol,16 where 36 of 37 patients with mild hypoxemia discharged on HOT avoided hospital admission, our institution implemented an observation unit and home oxygen therapy (OU-HOT) protocol designed to return children with bronchiolitis home earlier from the hospital. In the first winter season of implementation (2010 to 2011), the OU-HOT protocol was associated with significant reductions in length of stay and substantial cost savings, without an increase in return visits to the ED or inpatient readmissions.17 The objectives of this study were to determine whether these encouraging initial findings persisted and to measure the long-term impact of the OU-HOT protocol.

METHODS

We conducted a retrospective cohort study of children hospitalized with bronchiolitis at Primary Children’s Hospital, a freestanding children’s hospital in Salt Lake City, Utah. Discharge diagnosis and procedures codes, as well as laboratory, imaging, pharmacy, and supply costs, were obtained from the Intermountain Healthcare enterprise data warehouse. A crosswalk available from the Centers for Medicare and Medicaid Services was used to convert International Classification of Diseases (ICD)-10 discharge diagnosis and procedure codes to ICD-9 equivalents.18 This study was approved by the University of Utah institutional review board (00110419).

Patients

Children aged 3 to 24 months who were discharged with a diagnosis of bronchiolitis (466.xx) during winter seasons from 2007 to 2019 were included. A winter season was defined as November 1 to April 30. Both observation and inpatient encounters were included in the cohort. We excluded patients with discharge diagnosis or procedure codes indicating tracheostomy (519.0-519.09, V44.0, V55.0, 31.1, 31.21, 31.41, 31.74, 97.23), ventilator dependence (V46.1x), chronic lung disease (518.83, 770.7), or pulmonary hypertension (416.xx). Patients with both bronchiolitis and a concurrent diagnosis, such as otitis media or pneumonia, were included unless exclusion criteria were met.

Intervention and Process Measures

Our institution implemented the OU-HOT protocol at the start of the 2010-2011 winter season.17 The aim of the OU-HOT protocol was to discharge children with bronchiolitis home sooner by increasing use of both an OU, with frequent assessment of discharge readiness, and HOT to help children become ready for discharge. Similar to most OUs, admission to our unit was limited to patients who met hospital admission criteria and had a short anticipated length of stay (<48 hours). As a self-contained 20-bed unit providing 24-hour dedicated pediatrician/pediatric emergency medicine physician and nursing coverage, the OU actively monitored patients’ discharge readiness, with a goal of facilitating patient throughput more akin to an ED than to a traditional inpatient unit. Patients who could not be discharged from the OU within 48 hours were transferred to the inpatient unit. Although the OU existed at the time of protocol implementation, its use for patients with bronchiolitis was not actively encouraged until implementation.

Hospitalized patients—in either inpatient or observation units—were eligible for discharge on HOT if they met the following criteria: hypoxemia was the only indication for continued hospitalization, the child’s oxygen requirement was <0.5 L/min for at least 6 hours (0.8 L/min for children aged >1 year), the child’s caregiver(s) were willing to manage oxygen at home, and the child had reliable access to primary care provider follow-up. We used two process measures across winter seasons: (1) the percentage of patients discharged from the OU, and (2) the percentage of patients discharged with HOT. The percentage of patients discharged on HOT was estimated using manual chart review of a random sample of 457 patients from the 2007-2017 seasons and an electronic medical record (EMR) HOT flag that became available with our hospital system’s adoption of a new EMR (2017-2019 seasons). To estimate the reliability of the EMR flag, we calculated its sensitivity, specificity, positive predictive value, and negative predictive value using chart review as the gold standard.
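The validation of the EMR flag against chart review reduces to a 2×2 table; the sketch below uses invented labels solely to illustrate how the four reported accuracy statistics are computed.

```python
def diagnostic_accuracy(flag, truth):
    """Sensitivity, specificity, PPV, and NPV of a binary EMR flag
    against chart review (the reference standard)."""
    tp = sum(f and t for f, t in zip(flag, truth))
    fp = sum(f and not t for f, t in zip(flag, truth))
    fn = sum(not f and t for f, t in zip(flag, truth))
    tn = sum(not f and not t for f, t in zip(flag, truth))
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Illustrative data only: 5 true positives, 4 false negatives, 31 true negatives.
flag  = [True] * 5 + [False] * 4 + [False] * 31
truth = [True] * 5 + [True] * 4 + [False] * 31
print(diagnostic_accuracy(flag, truth))
# sensitivity ~0.56, specificity 1.0, ppv 1.0, npv ~0.89
```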

Outcome Measures

The main outcome measure was mean hospital length of stay. Balancing measures were revisit rates (stratified into ED visits and readmissions) and annual per-population bronchiolitis admission rates. Visits were considered revisits if they occurred within 7 days of initial hospital discharge, and included visits to Primary Children’s Hospital as well as 22 other Intermountain Healthcare hospitals. Population estimates from the Utah Department of Health were used to calculate the annual population-based rate of bronchiolitis admissions to Primary Children’s Hospital.19 Annual admission rates were calculated per 10,000 children aged 3 to 24 months who resided in Utah each year of the study period, and were evaluated to determine if patients were admitted more frequently after OU-HOT implementation. Secondary outcome measures included the percentage of patients discharged within 24 hours and mean inflation-adjusted cost per episode of care (in 2019 dollars). Hospitalization costs were determined using Intermountain Healthcare’s internal cost accounting system, an activity-based method that aggregates costs of individual resources according to date of service.20 Costs were adjusted to 2019 dollars and were defined as the total costs of a patient’s initial hospitalization as well as any 7-day revisit encounters.
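A minimal sketch of how 7-day revisits could be flagged from encounter data (the tables and column names are hypothetical, not Intermountain Healthcare's schema):

```python
import pandas as pd

# Hypothetical tables; column names are illustrative only.
index_stays = pd.DataFrame({
    "patient_id": [1, 2],
    "discharge": pd.to_datetime(["2019-01-02 10:00", "2019-01-05 15:00"]),
})
return_visits = pd.DataFrame({
    "patient_id": [1, 2],
    "arrival": pd.to_datetime(["2019-01-06 20:00", "2019-01-20 09:00"]),
    "visit_type": ["ED", "readmission"],
})

# A return visit counts as a revisit if it occurs within 7 days of index discharge.
merged = return_visits.merge(index_stays, on="patient_id")
delta_days = (merged["arrival"] - merged["discharge"]).dt.total_seconds() / 86400
merged["revisit_7d"] = delta_days.between(0, 7)
print(merged[["patient_id", "visit_type", "revisit_7d"]])  # patient 1: True, patient 2: False
```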

Data Analysis

Demographic data were compared before and after OU-HOT protocol implementation using Pearson chi-square tests. Multivariable linear or logistic regression models were used to compare measures before and after OU-HOT protocol implementation via an interrupted time-series approach. The interrupted time-series analysis measured two types of changes after protocol implementation during the 2010-2011 winter season: (1) any immediate change in the level of an outcome (immediate effect) and (2) any change of an outcome going forward over time (change in slope).21 Covariates in the regression models included patient age, sex, race, ethnicity, and insurance type, as well as presence of an underlying complex chronic condition, mechanical ventilation use, and pediatric intensive care unit (PICU) admission during hospitalization. Data were analyzed in STATA 15 (StataCorp LLC).22
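A minimal sketch of the segmented-regression form of an interrupted time-series model on simulated season-level data follows; the analysis reported here was run in STATA with patient-level covariates, so the statsmodels code below is only an assumption-laden illustration of the "immediate effect" and "change in slope" terms.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
seasons = np.arange(12)                       # e.g., 12 consecutive winter seasons
post = (seasons >= 3).astype(int)             # protocol implemented at season index 3
time_after = np.where(post == 1, seasons - 3, 0)

# Simulated outcome: baseline trend, immediate level drop, and slope change after implementation.
los = 84 - 0.5 * seasons - 25 * post - 0.8 * time_after + rng.normal(0, 2, size=seasons.size)

df = pd.DataFrame({"los": los, "season": seasons, "post": post, "time_after": time_after})

# `post` estimates the immediate effect; `time_after` estimates the change in slope.
model = smf.ols("los ~ season + post + time_after", data=df).fit()
print(model.params)
```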

RESULTS

A total of 7,116 patients met inclusion criteria over the study period (2,061 preimplementation, 5,055 postimplementation). A comparison of patient characteristics before and after OU-HOT protocol implementation is presented in Table 1. Patients were similar in terms of age, sex, and insurance type. Patients in the postimplementation period were more likely to have a complex chronic condition, require admission to the PICU, and need mechanical ventilation (P < .01). Differences between cohorts with regard to race/ethnicity distribution largely were a result of improved capture of these data elements in the postimplementation period. For example, 30% of patients were classified as “race/ethnicity unknown” in the preimplementation cohort, compared with 4% of patients in the postimplementation period.

Patient Demographic and Clinical Characteristics, Preimplementation (2007-2010) and Postimplementation (2010-2019) of the OU-HOT Protocol

Process Measures

Figure 1 shows trends in OU and HOT use by winter season. The percentage of patients discharged from the OU increased immediately after OU-HOT protocol implementation (absolute 26.9% immediate increase; 95% CI, 21.9-42.2). The change in the proportion of OU use per season also increased (change in slope +3.9% per season; 95% CI, 3.4%-4.4%). The percentage of patients discharged with HOT increased immediately after OU-HOT protocol implementation (26.0% immediate change; 95% CI, 18.9%-33.1%); however, the immediate increase in HOT discharges was coupled with a declining rate of HOT discharges per season in the postprotocol period compared with the preprotocol period (change in slope –4.5% per season; 95% CI, –7.5% to –1.5%). Our chart review and EMR flag included 1,354 patients, or 19.0% of our cohort. Our EMR flag for HOT in the last two seasons of the study had a positive predictive value of 100% (5 of 5 identified by EMR flag as receiving HOT were confirmed by chart review) and negative predictive value of 89% (31 of 35 identified by EMR flag as not receiving HOT were confirmed by chart review). The specificity of the EMR flag was 100% (31 of 31 of those confirmed by chart review as not receiving HOT, who were correctly identified by EMR) and the sensitivity was 55% (5 of 9 of those confirmed by chart review as receiving HOT, who were correctly identified by EMR).

Process Measures, 2007-2019

Primary and Secondary Outcomes

Trends in length of stay across winter seasons are presented in Figure 2. The OU-HOT protocol was associated with an immediate reduction of 30.6 hours in mean length of stay (95% CI, –37.1 to –24.2). The rate of change in length of stay postimplementation did not differ significantly from the rate of change preimplementation (change in slope –0.6 hours per season; 95% CI, –2.3 to 1.1 hours). The percentage of patients discharged within 24 hours of admission rose immediately after protocol implementation, by 23.8 absolute percentage points (95% CI, 11.7-28.8). Slopes of the preintervention and postintervention regression lines did not differ significantly (change in slope –0.1% per season; 95% CI, –1.4% to 1.1%). Immediate decreases in length of stay were accompanied by an immediate decrease in mean cost per episode of care (–$4,181; 95% CI, –$4,829 to –$3,533). Protocol implementation also was associated with a decreased slope in cost postimplementation (change in slope –$403 per season; 95% CI, –$543 to –$264). The total cost savings, estimated by the product of the average cost savings per episode of care and the number of bronchiolitis admissions included in the study after OU-HOT implementation, amounted to $21.1 million over the 9-year period, or $2.3 million per winter season.
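As a quick arithmetic check of the savings estimate, using the figures reported above (rounded as in the text):

```python
savings_per_episode = 4181           # immediate reduction in mean cost per episode of care ($)
post_implementation_episodes = 5055  # admissions included after OU-HOT implementation
seasons = 9

total_savings = savings_per_episode * post_implementation_episodes
print(f"total: ${total_savings / 1e6:.1f} million")               # ~$21.1 million
print(f"per season: ${total_savings / seasons / 1e6:.1f} million")  # ~$2.3 million
```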

Primary and Secondary Outcome Measures, 2007-2019

Balancing Measures

We observed an immediate reduction in 7-day hospital revisits (–1.1% immediate change; 95% CI, –1.8% to –0.4%), but an increasing slope in revisits after implementation (change in slope 0.4% per season; 95% CI, 0.1%-0.8%) (Figure 3). Stratifying revisits into ED visits and readmissions revealed that the revisit findings reflected changes in ED return visits, for which there was an immediate reduction at the time of implementation (–1.0% immediate change; 95% CI, –1.6% to –0.4%), but an increasing slope postimplementation (change in slope 0.5% per season; 95% CI, 0.2-0.8). Neither an immediate intervention effect (0.0% immediate change; 95% CI, –0.5% to 0.4%) nor a change in slope (change in slope 0.0% per season; 95% CI, –0.1% to 0.1%) were observed for inpatient readmissions alone. The annual rate of bronchiolitis admissions to Primary Children’s Hospital per 10,000 children who reside in Utah decreased after implementation of the OU-HOT protocol (immediate intervention effect –6.2 admissions; 95% CI, –10.8 to –1.6; change in slope –1.8 admissions per season; 95% CI, –2.8 to –0.69).

Balancing Measures, 2007-2019

DISCUSSION

Our OU-HOT protocol was associated with immediate improvements in care delivered to children hospitalized for bronchiolitis, including decreased length of stay and cost savings. These improvements in outcomes largely have been sustained over a 9-year period. The OU-HOT protocol also appears to be safe as evidenced by a stable rate of readmissions over the study period and only a small increase in revisits to EDs across Intermountain Healthcare facilities, which see most children in the catchment area. Our OU-HOT protocol represents a combination of two interventions: (1) the creation of an OU focused on discharge within 24 to 48 hours of admission and (2) encouragement to discharge children with HOT. We found that use of the OU and a commitment to timely discharges has been sustained in recent years, while the commitment to HOT has appeared to wane.

Earlier investigations have evaluated the efficacy of HOT in the ED setting to prevent hospital admissions, finding high levels of caregiver comfort, estimating $1,300 per patient cost savings, and reporting readmission rates of approximately 5%.16,23-25 Our study is unique in addressing HOT among a population of patients already hospitalized with bronchiolitis. The cost reductions we observed with our OU-HOT protocol were similar to those noted in the ED-based HOT protocols. However, we recorded lower readmission rates, likely because of the additional time allotted to caregivers to better gauge illness trajectory in the inpatient setting vs the ED, as well as additional time for hospitalized patients to reach the plateau or convalescent phase of illness. The small increase in ED revisits that we measured in recent years might be related to the concurrent rise in patient acuity and complexity.

Considering that length of stay has remained low despite less commitment to HOT, our results suggest that the OU might be the more impactful of the two interventions, and these data support the use of such a unit for a subset of patients with bronchiolitis. However, it is important to note that while the EMR HOT flag demonstrated high specificity, positive predictive value, and negative predictive value, the sensitivity was low (56%). As a result, it is possible that we have underestimated HOT use in the 2017-2018 and 2018-2019 seasons, the final two years of the study. Alternatively, the discrepancy between sustained outcomes and lagging use of HOT could be explained by improved identification of patients who would experience the greatest benefit with oxygen in terms of length of stay reductions, with fewer patients discharged on HOT but greater per-patient benefit. Finally, in an era that encourages reduced monitor use and less aggressive response to transient mild desaturations,13,26,27 it is possible that fewer patients are identified with clinically actionable hypoxemia around the time they would be otherwise discharged.

Our OU-HOT model is not unprecedented. Increasingly, other formerly inpatient indications are being successfully managed in the observation, outpatient, and home setting, such as parenteral antibiotic treatment28,29 and chemotherapy administration.30 Considering the inpatient burden of bronchiolitis, similar strategies to expedite discharge are needed. Although outpatient intravenous antibiotic and chemotherapy administration have been widely adopted, we are aware of only one other pediatric health care system in the United States (Children’s Hospital Colorado) that routinely discharges inpatients with bronchiolitis on HOT.

This study has several limitations. First, although the interrupted time-series analysis is designed to account for trends that precede an intervention and covariates that differ before and after the intervention, it is possible that important unmeasured patient factors or changes in practice patterns differed between the pre- and post-intervention cohorts. There were no major changes to the OU-HOT protocol or discharge criteria after implementation, but individual practice management of bronchiolitis during the study period likely has evolved as new evidence emerges. Second, one could postulate that the increase in discharges within 24 hours and accompanying decreases in average length of stay and cost could be achieved by hospitalizing healthier patients over time, which the presence of an OU might incentivize. To the contrary, we found that population-based bronchiolitis admission rates have declined and disease severity appears to be increased since implementation of the OU-HOT protocol. The increase in medically complex children and PICU use in our postimplementation cohort aligns with recently published data suggesting these are national trends.3,31 Third, HOT use was estimated from a sample of the cohort using a chart review and a newly available EMR flag. A low sensitivity and a small sample for the positive predictive value are limitations of the EMR flag.

Additionally, there are almost certainly unmeasured ambulatory burdens of HOT not captured by this study. ED-based protocols have estimated that patients discharged with HOT have a median of two follow-up ambulatory visits before oxygen is discontinued32; however, the ambulatory burden associated with discharge on HOT after a hospitalization and the extent to which demographic factors affect that burden is unknown. Furthermore, one insurance company charged $94 for a month of HOT in 2019; paying even a portion of this charge represents a nontrivial financial burden for many families, even considering inpatient cost savings. Although the decision to discharge on oxygen or remain hospitalized until the child did not need oxygen was left to the parents, their posthospitalization perspectives were not assessed in this study. Although reports indicate that families largely feel positive about HOT after discharge from an ED setting, with 90% of caregivers preferring HOT use to inpatient admission and most reporting no difficulty with home management,23 it is uncertain whether this would also apply after inpatient hospitalization.

CONCLUSION

The OU-HOT bronchiolitis protocol was associated with decreases in inpatient length of stay and cost while appearing safe to implement. The sustained use of the OU combined with declining use of HOT suggests that the OU might be the more impactful intervention. As previously inpatient indications such as parenteral antibiotics and chemotherapy increasingly have been administered in observation and outpatient settings, bronchiolitis appears ideal for a similar strategy that allows patients to spend less time in the hospital. Studies are needed to understand the outpatient burden of HOT and the generalizability of our findings.

References

1. Hasegawa K, Tsugawa Y, Brown DFM, Mansbach JM, Camargo CA. Trends in bronchiolitis hospitalizations in the United States, 2000-2009. Pediatrics. 2013;132(1):28-36. https://doi.org/10.1542/peds.2012-3877
2. Carroll KN, Gebretsadik T, Griffin MR, et al. Increasing burden and risk factors for bronchiolitis-related medical visits in infants enrolled in a state health care insurance plan. Pediatrics. 2008;122(1):58-64. https://doi.org/10.1542/peds.2007-2087
3. Fujiogi M, Goto T, Yasunaga H, et al. Trends in bronchiolitis hospitalizations in the United States: 2000–2016. Pediatrics. 2019;144(6):e20192614. https://doi.org/10.1542/peds.2019-2614
4. Schroeder AR, Mansbach JM. Recent evidence on the management of bronchiolitis. Curr Opin Pediatr. 2014;26(3):328-333. https://doi.org/10.1097/MOP.0000000000000090
5. American Academy of Pediatrics Subcommittee on Diagnosis and Management of Bronchiolitis. Diagnosis and management of bronchiolitis. Pediatrics. 2006;118(4):1774-1793. https://doi.org/10.1542/peds.2006-2223
6. Ralston SL, Lieberthal AS, Meissner HC, et al; American Academy of Pediatrics. Clinical practice guideline: the diagnosis, management, and prevention of bronchiolitis. Pediatrics. 2014;134(5):e1474. https://doi.org/10.1542/peds.2014-2742
7. Riese J, Porter T, Fierce J, Riese A, Richardson T, Alverson BK. Clinical outcomes of bronchiolitis after implementation of a general ward high flow nasal cannula guideline. Hosp Pediatr. 2017;7(4):197-203. https://doi.org/10.1542/hpeds.2016-0195
8. Perlstein PH, Kotagal UR, Bolling C, et al. Evaluation of an evidence-based guideline for bronchiolitis. Pediatrics. 1999;104(6):1334-1341. https://doi.org/10.1542/peds.104.6.1334
9. Perlstein PH, Kotagal UR, Schoettker PJ, et al. Sustaining the implementation of an evidence-based guideline for bronchiolitis. Arch Pediatr Adolesc Med. 2000;154(10):1001-1007. https://doi.org/10.1001/archpedi.154.10.1001
10. Wilson SD, Dahl BB, Wells RD. An evidence-based clinical pathway for bronchiolitis safely reduces antibiotic overuse. Am J Med Qual. 2002;17(5):195-199. https://doi.org/10.1177/106286060201700507
11. Barben J, Kuehni CE, Trachsel D, Hammer J; Swiss Paediatric Respiratory Research Group. Management of acute bronchiolitis: can evidence based guidelines alter clinical practice? Thorax. 2008;63(12):1103-1109. https://doi.org/10.1136/thx.2007.094706
12. Bryan MA, Desai AD, Wilson L, Wright DR, Mangione-Smith R. Association of bronchiolitis clinical pathway adherence with length of stay and costs. Pediatrics. 2017;139(3):e20163432. https://doi.org/10.1542/peds.2016-3432
13. Mittal S, Marlowe L, Blakeslee S, et al. Successful use of quality improvement methodology to reduce inpatient length of stay in bronchiolitis through judicious use of intermittent pulse oximetry. Hosp Pediatr. 2019;9(2):73-78. https://doi.org/10.1542/hpeds.2018-0023
14. Macias CG, Mansbach JM, Fisher ES, et al. Variability in inpatient management of children hospitalized with bronchiolitis. Acad Pediatr. 2015;15(1):69-76. https://doi.org/10.1016/j.acap.2014.07.005
15. Mittal V, Hall M, Morse R, et al. Impact of inpatient bronchiolitis clinical practice guideline implementation on testing and treatment. J Pediatr. 2014;165(3):570-6.e3. https://doi.org/10.1016/j.jpeds.2014.05.021
16. Bajaj L, Turner CG, Bothner J. A randomized trial of home oxygen therapy from the emergency department for acute bronchiolitis. Pediatrics. 2006;117(3):633-640. https://doi.org/10.1542/peds.2005-1322
17. Sandweiss DR, Mundorff MB, Hill T, et al. Decreasing hospital length of stay for bronchiolitis by using an observation unit and home oxygen therapy. JAMA Pediatr. 2013;167(5):422-428. https://doi.org/10.1001/jamapediatrics.2013.1435
18. National Bureau of Economic Research. ICD-9-CM to and from ICD-10-CM and ICD-10-PCS crosswalk or general equivalence mappings. Accessed December 2, 2020. http://www.nber.org/data/icd9-icd-10-cm-and-pcs-crosswalk-general-equivalence-mapping.html
19. Utah Department of Health, Indicator-Based Information System for Public Health. Accessed February 15, 2020. https://ibis.health.utah.gov/ibisph-view
20. James BC, Savitz LA. How Intermountain trimmed health care costs through robust quality improvement efforts. Health Aff (Millwood). 2011;30(6):1185-1191. https://doi.org/10.1377/hlthaff.2011.0358
21. Penfold RB, Zhang F. Use of interrupted time series analysis in evaluating health care quality improvements. Acad Pediatr. 2013;13(6 Suppl):S38-44. https://doi.org/10.1016/j.acap.2013.08.002
22. StataCorp. Stata Statistical Software: Release 15. StataCorp LLC; 2017.
23. Freeman JF, Deakyne S, Bajaj L. Emergency department-initiated home oxygen for bronchiolitis: a prospective study of community follow-up, caregiver satisfaction, and outcomes. Acad Emerg Med. 2017;24(8):920-929. https://doi.org/10.1111/acem.13179
24. Freeman JF, Brou L, Mistry R. Feasibility and capacity for widespread use of emergency department-based home oxygen for bronchiolitis. Am J Emerg Med. 2017;35(9):1379-1381. https://doi.org/10.1016/j.ajem.2017.03.069
25. Halstead S, Roosevelt G, Deakyne S, Bajaj L. Discharged on supplemental oxygen from an emergency department in patients with bronchiolitis. Pediatrics. 2012;129(3):e605-610. https://doi.org/10.1542/peds.2011-0889
26. Quinonez RA, Coon ER, Schroeder AR, Moyer VA. When technology creates uncertainty: pulse oximetry and overdiagnosis of hypoxaemia in bronchiolitis. BMJ. 2017;358:j3850. https://doi.org/10.1136/bmj.j3850
27. Burrows J, Berg K, McCulloh R. Intermittent pulse oximetry use and length of stay in bronchiolitis: bystander or primary Driver? Hosp Pediatr. 2019;9(2):142-143. https://doi.org/10.1542/hpeds.2018-0183
28. Norris AH, Shrestha NK, Allison GM, et al. 2018 Infectious Diseases Society of America clinical practice guideline for the management of outpatient parenteral antimicrobial therapy. Clin Infect Dis. 2019;68(1):e1-e35. https://doi.org/10.1093/cid/ciy745
29. Williams DN, Baker CA, Kind AC, Sannes MR. The history and evolution of outpatient parenteral antibiotic therapy (OPAT). Int J Antimicrob Agents. 2015;46(3):307-312. https://doi.org/10.1016/j.ijantimicag.2015.07.001
30. Beaty RS, Bernhardt MB, Berger AH, Hesselgrave JE, Russell HV, Okcu MF. Inpatient versus outpatient vincristine, dactinomycin, and cyclophosphamide for pediatric cancers: quality and cost implications. Pediatr Blood Cancer. 2015;62(11):1925-1928. https://doi.org/10.1002/pbc.25610
31. Coon ER, Stoddard G, Brady PW. Intensive care unit utilization after adoption of a ward-based high-flow nasal cannula protocol. J Hosp Med. 2020;15(6):325-330. https://doi.org/10.12788/jhm.3417
32. Freeman JF, Weng H-YC, Sandweiss D. Outpatient management of home oxygen for bronchiolitis. Clin Pediatr (Phila). 2015;54(1):62-66. https://doi.org/10.1177/0009922814547564

Author and Disclosure Information

1Department of Pediatrics, Division of Inpatient Medicine, University of Utah, Salt Lake City, Utah; 2University of Utah School of Medicine, Salt Lake City, Utah; 3Department of Pediatrics, Division of Emergency Medicine, University of Utah, Salt Lake City, Utah; 4Department of Pediatrics, Division of General Pediatrics, Salt Lake City, Utah.

Disclosures
Dr. Coon is the recipient of an Intermountain-Stanford Collaboration Grant (NCT03354325), which funded a randomized controlled trial for patients hospitalized with bronchiolitis.

Funding
This investigation was supported by the University of Utah Population Health Research (PHR) Foundation, with funding in part from the National Center for Research Resources and the National Center for Advancing Translational Sciences, National Institutes of Health, through Grant 5UL1TR001067-05 (formerly 8UL1TR000105 and UL1RR025764).

Issue
Journal of Hospital Medicine 16(5)
Page Number
261-266. Published Online First April 20, 2021

Bronchiolitis is the leading cause of hospitalization in infants aged <1 year in the United States.1-3 Estimates suggest that 1.5% to 2.0% of US infants require hospitalization every year, with a median (interquartile range) length of stay of 2 days (1-4),3 incurring direct medical costs of $555 million annually.1 Evidence suggests that few interventions, aside from supportive care, are effective for bronchiolitis.4-7 Adherence to standardized clinical guidelines could improve outcomes and resource use by streamlining care and limiting ineffective interventions, thereby decreasing hospital length of stay, which is a major medical cost.8-13 For this reason, many hospitals have adopted bronchiolitis guidelines, although institutional practices vary.14,15

Two relatively unexplored methods to reduce the inpatient burden of bronchiolitis are the use of observation units (OU) and home oxygen therapy (HOT). Motivated by research demonstrating the safety and effectiveness of an emergency department (ED)–based HOT protocol,16 where 36 of 37 patients with mild hypoxemia discharged on HOT avoided hospital admission, our institution implemented an observation unit and home oxygen therapy (OU-HOT) protocol designed to return children with bronchiolitis home earlier from the hospital. In the first winter season of implementation (2010 to 2011), the OU-HOT protocol was associated with significant reductions in length of stay and substantial cost savings, without an increase in return visits to the ED or inpatient readmissions.17 The objectives of this study were to determine whether these encouraging initial findings persisted and to measure the long-term impact of the OU-HOT protocol.

METHODS

We conducted a retrospective cohort study of children hospitalized with bronchiolitis at Primary Children’s Hospital, a freestanding children’s hospital in Salt Lake City, Utah. Discharge diagnosis and procedures codes, as well as laboratory, imaging, pharmacy, and supply costs, were obtained from the Intermountain Healthcare enterprise data warehouse. A crosswalk available from the Centers for Medicare and Medicaid Services was used to convert International Classification of Diseases (ICD)-10 discharge diagnosis and procedure codes to ICD-9 equivalents.18 This study was approved by the University of Utah institutional review board (00110419).

Patients

Children aged 3 to 24 months who were discharged with a diagnosis of bronchiolitis (466.xx) during winter seasons from 2007 to 2019 were included. A winter season was defined as November 1 to April 30. Both observation and inpatient encounters were included in the cohort. We excluded patients with discharge diagnosis or procedure codes indicating tracheostomy (519.0-519.09, V44.0, V55.0, 31.1, 31.21, 31.41, 31.74, 97.23), ventilator dependence (V46.1x), chronic lung disease (518.83, 770.7), or pulmonary hypertension (416.xx). Patients with both bronchiolitis and a concurrent diagnosis, such as otitis media or pneumonia, were included unless exclusion criteria were met.

Intervention and Process Measures

Our institution implemented the OU-HOT protocol at the start of the 2010-2011 winter season.17 The aim of the OU-HOT protocol was to discharge children with bronchiolitis home sooner by increasing use of both an OU, with frequent assessment of discharge readiness, and HOT to help children become ready for discharge. Similar to most OUs, admission to our unit was limited to patients who met hospital admission criteria, and had a short anticipated length of stay (<48 hours). As a self-contained 20-bed unit providing 24-hour dedicated pediatrician/pediatric emergency medicine physician and nursing coverage, the OU actively monitored patients’ discharge readiness, with a goal to facilitate patient throughput more akin to an ED rather than a traditional inpatient unit. Patients who could not be discharged from the OU within 48 hours were transferred to the inpatient unit. Although the OU existed at the time of protocol implementation, its use for patients with bronchiolitis was not actively encouraged until implementation.

Hospitalized patients—in either inpatient or observation units—were eligible for discharge on HOT if they met the following criteria: hypoxemia was the only indication for continued hospitalization, the child’s oxygen requirement was <0.5 L/min for at least 6 hours (0.8 L/min for children aged >1 year), the child’s caregiver(s) were willing to manage oxygen at home, and the child had reliable access to primary care provider follow up. We used two process measures across winter seasons: (1) the percentage of patients discharged from the OU, and (2) the percentage of patients discharged with HOT. The percentage of patients discharged on HOT was estimated by a manual chart review and an electronic medical record (EMR) HOT flag that came into existence with our hospital system’s adoption of a new EMR (2017-2019). Chart review randomly sampled patients from 2007-2017, totaling 457 patients. To estimate the reliability of this method, we calculated the sensitivity, specificity, positive predictive value, and negative predictive value of the EMR HOT flag using chart review as the gold standard.

Outcome Measures

The main outcome measure was mean hospital length of stay. Balancing measures were revisit rates (stratified into ED visits and readmissions) and annual per-population bronchiolitis admission rates. Visits were considered revisits if they occurred within 7 days of initial hospital discharge, and included visits to Primary Children’s Hospital as well as 22 other Intermountain Healthcare hospitals. Population estimates from the Utah Department of Health were used to calculate the annual population-based rate of bronchiolitis admissions to Primary Children’s Hospital.19 Annual admission rates were calculated per 10,000 children aged 3 to 24 months who resided in Utah each year of the study period, and were evaluated to determine if patients were admitted more frequently after OU-HOT implementation. Secondary outcome measures included the percentage of patients discharged within 24 hours and mean inflation-adjusted cost per episode of care (in 2019 dollars). Hospitalization costs were determined using Intermountain Healthcare’s internal cost accounting system, an activity-based method that aggregates costs of individual resources according to date of service.20 Costs were adjusted to 2019 dollars and were defined as the total costs of a patient’s initial hospitalization as well as any 7-day revisit encounters.

Data Analysis

Demographic data were compared before and after OU-HOT protocol implementation using Pearson chi-square tests. Multivariable linear or logistic regression models were used to compare measures before and after OU-HOT protocol implementation via an interrupted time-series approach. The interrupted time-series analysis measured two types of changes after protocol implementation during the 2010-2011 winter season: (1) any immediate change in the level of an outcome (immediate effect) and (2) any change of an outcome going forward over time (change in slope).21 Covariates in the regression models included patient age, sex, race, ethnicity, and insurance type, as well as presence of an underlying complex chronic condition, mechanical ventilation use, and pediatric intensive care unit (PICU) admission during hospitalization. Data were analyzed in STATA 15 (StataCorp LLC).22

RESULTS

A total of 7,116 patients met inclusion criteria over the study period (2,061 preimplementation, 5,055 postimplementation). A comparison of patient characteristics before and after OU-HOT protocol implementation is presented in Table 1. Patients were similar in terms of age, sex, and insurance type. Patients in the postimplementation period were more likely to have a complex chronic condition, require admission to the PICU, and need mechanical ventilation (P < .01). Differences between cohorts in race/ethnicity distribution largely reflected improved capture of these data elements in the postimplementation period. For example, 30% of patients were classified as “race/ethnicity unknown” in the preimplementation cohort, compared with 4% of patients in the postimplementation period.

Patient Demographic and Clinical Characteristics, Preimplementation (2007-2010) and Postimplementation (2010-2019) of the OU-HOT Protocol

Process Measures

Figure 1 shows trends in OU and HOT use by winter season. The percentage of patients discharged from the OU increased immediately after OU-HOT protocol implementation (absolute 26.9% immediate increase; 95% CI, 21.9%-42.2%). The proportion of OU use also increased over time (change in slope +3.9% per season; 95% CI, 3.4%-4.4%). The percentage of patients discharged with HOT increased immediately after OU-HOT protocol implementation (26.0% immediate change; 95% CI, 18.9%-33.1%); however, this immediate increase was coupled with a declining rate of HOT discharges per season in the postprotocol period compared with the preprotocol period (change in slope –4.5% per season; 95% CI, –7.5% to –1.5%). Our chart review and EMR flag together covered 1,354 patients, or 19.0% of our cohort. For the last two seasons of the study, the EMR HOT flag had a positive predictive value of 100% (all 5 patients flagged as receiving HOT were confirmed by chart review) and a negative predictive value of 89% (31 of 35 patients flagged as not receiving HOT were confirmed by chart review). The specificity of the EMR flag was 100% (all 31 patients confirmed by chart review as not receiving HOT were correctly identified by the flag), and the sensitivity was 56% (5 of 9 patients confirmed by chart review as receiving HOT were correctly identified by the flag).
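For readers who want to reproduce these operating characteristics from the counts above, a minimal sketch (not the study's code) is:

```python
# EMR HOT flag vs chart review (gold standard), using the counts reported above:
# 5 flag-positive patients, all on HOT (tp=5, fp=0); 35 flag-negative patients,
# of whom 31 were confirmed not on HOT (tn=31, fn=4).
def flag_performance(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),   # 5/9   ~ 0.56
        "specificity": tn / (tn + fp),   # 31/31 = 1.00
        "ppv": tp / (tp + fp),           # 5/5   = 1.00
        "npv": tn / (tn + fn),           # 31/35 ~ 0.89
    }

print(flag_performance(tp=5, fp=0, fn=4, tn=31))
```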

Process Measures, 2007-2019

Primary and Secondary Outcomes

Trends in length of stay across winter seasons are presented in Figure 2. The OU-HOT protocol was associated with an immediate reduction of 30.6 hours in mean length of stay (95% CI, –37.1 to –24.2). The rate of change in length of stay postimplementation did not differ significantly from the rate of change preimplementation (change in slope –0.6 hours per season; 95% CI, –2.3 to 1.1 hours). The percentage of patients discharged within 24 hours of admission rose immediately after protocol implementation, by 23.8 absolute percentage points (95% CI, 11.7-28.8). Slopes of the preintervention and postintervention regression lines did not differ significantly (change in slope –0.1% per season; 95% CI, –1.4% to 1.1%). Immediate decreases in length of stay were accompanied by an immediate decrease in mean cost per episode of care (–$4,181; 95% CI, –$4,829 to –$3,533). Protocol implementation also was associated with a decreased slope in cost postimplementation (change in slope –$403 per season; 95% CI, –$543 to –$264). The total cost savings, estimated by the product of the average cost savings per episode of care and the number of bronchiolitis admissions included in the study after OU-HOT implementation, amounted to $21.1 million over the 9-year period, or $2.3 million per winter season.
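As a quick check of this arithmetic (treating the immediate per-episode reduction of $4,181 as the average savings, an assumption consistent with the figures reported here):

```python
# Rough verification of the reported total savings.
per_episode_savings = 4181    # dollars, immediate reduction per episode of care
post_admissions = 5055        # admissions after OU-HOT implementation
seasons = 9

total = per_episode_savings * post_admissions
print(f"${total / 1e6:.1f}M total, ${total / seasons / 1e6:.1f}M per season")
# -> $21.1M total, $2.3M per season
```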

Primary and Secondary Outcome Measures, 2007-2019

Balancing Measures

We observed an immediate reduction in 7-day hospital revisits (–1.1% immediate change; 95% CI, –1.8% to –0.4%), but an increasing slope in revisits after implementation (change in slope 0.4% per season; 95% CI, 0.1%-0.8%) (Figure 3). Stratifying revisits into ED visits and readmissions revealed that the revisit findings reflected changes in ED return visits, for which there was an immediate reduction at the time of implementation (–1.0% immediate change; 95% CI, –1.6% to –0.4%) but an increasing slope postimplementation (change in slope 0.5% per season; 95% CI, 0.2%-0.8%). Neither an immediate intervention effect (0.0% immediate change; 95% CI, –0.5% to 0.4%) nor a change in slope (0.0% per season; 95% CI, –0.1% to 0.1%) was observed for inpatient readmissions alone. The annual rate of bronchiolitis admissions to Primary Children’s Hospital per 10,000 children residing in Utah decreased after implementation of the OU-HOT protocol (immediate intervention effect –6.2 admissions; 95% CI, –10.8 to –1.6; change in slope –1.8 admissions per season; 95% CI, –2.8 to –0.7).

Balancing Measures, 2007-2019

DISCUSSION

Our OU-HOT protocol was associated with immediate improvements in care delivered to children hospitalized for bronchiolitis, including decreased length of stay and cost savings. These improvements largely have been sustained over a 9-year period. The OU-HOT protocol also appears to be safe, as evidenced by a stable rate of readmissions over the study period and only a small increase in revisits to EDs across Intermountain Healthcare facilities, which see most children in the catchment area. Our OU-HOT protocol represents a combination of two interventions: (1) the creation of an OU focused on discharge within 24 to 48 hours of admission and (2) encouragement to discharge children with HOT. We found that use of the OU and a commitment to timely discharges have been sustained in recent years, whereas the commitment to HOT appears to have waned.

Earlier investigations have evaluated the efficacy of HOT in the ED setting to prevent hospital admissions, finding high levels of caregiver comfort, estimating $1,300 per patient cost savings, and reporting readmission rates of approximately 5%.16,23-25 Our study is unique in addressing HOT among a population of patients already hospitalized with bronchiolitis. The cost reductions we observed with our OU-HOT protocol were similar to those noted in the ED-based HOT protocols. However, we recorded lower readmission rates, likely because of the additional time allotted to caregivers to better gauge illness trajectory in the inpatient setting vs the ED, as well as additional time for hospitalized patients to reach the plateau or convalescent phase of illness. The small increase in ED revisits that we measured in recent years might be related to the concurrent rise in patient acuity and complexity.

Considering that length of stay has remained low despite less commitment to HOT, our results suggest that the OU might be the more impactful of the two interventions, and these data support the use of such a unit for a subset of patients with bronchiolitis. However, it is important to note that while the EMR HOT flag demonstrated high specificity, positive predictive value, and negative predictive value, its sensitivity was low (56%). As a result, it is possible that we have underestimated HOT use in the 2017-2018 and 2018-2019 seasons, the final two years of the study. Alternatively, the discrepancy between sustained outcomes and lagging use of HOT could be explained by improved identification of the patients who benefit most from HOT in terms of length of stay reductions, with fewer patients discharged on HOT but greater per-patient benefit. Finally, in an era that encourages reduced monitor use and a less aggressive response to transient mild desaturations,13,26,27 it is possible that fewer patients are identified with clinically actionable hypoxemia around the time they would otherwise be discharged.

Our OU-HOT model is not unprecedented. Increasingly, other formerly inpatient indications, such as parenteral antibiotic treatment28,29 and chemotherapy administration,30 are being successfully managed in observation, outpatient, and home settings. Considering the inpatient burden of bronchiolitis, similar strategies to expedite discharge are needed. Although outpatient intravenous antibiotic and chemotherapy administration have been widely adopted, we are aware of only one other pediatric health care system in the United States (Children’s Hospital Colorado) that routinely discharges inpatients with bronchiolitis on HOT.

This study has several limitations. First, although the interrupted time-series analysis is designed to account for trends that precede an intervention and for covariates that differ before and after the intervention, it is possible that important unmeasured patient factors or practice patterns differed between the pre- and postintervention cohorts. There were no major changes to the OU-HOT protocol or discharge criteria after implementation, but individual management of bronchiolitis likely evolved over the study period as new evidence emerged. Second, one could postulate that the increase in discharges within 24 hours and the accompanying decreases in average length of stay and cost could be achieved by hospitalizing healthier patients over time, which the presence of an OU might incentivize. To the contrary, we found that population-based bronchiolitis admission rates declined and disease severity appears to have increased since implementation of the OU-HOT protocol. The increase in medically complex children and PICU use in our postimplementation cohort aligns with recently published data suggesting these are national trends.3,31 Third, HOT use was estimated from a sample of the cohort using a chart review and a newly available EMR flag; the flag’s low sensitivity and the small sample used to estimate its positive predictive value are limitations.

Additionally, there are almost certainly unmeasured ambulatory burdens of HOT not captured by this study. ED-based protocols have estimated that patients discharged with HOT have a median of two follow-up ambulatory visits before oxygen is discontinued32; however, the ambulatory burden associated with discharge on HOT after a hospitalization, and the extent to which demographic factors affect that burden, are unknown. Furthermore, one insurance company charged $94 for a month of HOT in 2019; paying even a portion of this charge represents a nontrivial financial burden for many families, even considering the inpatient cost savings. Although the decision to discharge on oxygen or to remain hospitalized until the child no longer needed oxygen was left to the parents, their posthospitalization perspectives were not assessed in this study. Reports indicate that families largely feel positive about HOT after discharge from an ED setting, with 90% of caregivers preferring HOT to inpatient admission and most reporting no difficulty with home management,23 but it is uncertain whether this also applies after an inpatient hospitalization.

CONCLUSION

The OU-HOT bronchiolitis protocol was associated with decreases in inpatient length of stay and cost and appeared safe to implement. The sustained use of the OU combined with declining use of HOT suggests that the OU might be the more impactful intervention. As care that previously required inpatient admission, such as parenteral antibiotics and chemotherapy, increasingly is delivered in observation and outpatient settings, bronchiolitis appears well suited to a similar strategy that allows patients to spend less time in the hospital. Studies are needed to understand the outpatient burden of HOT and the generalizability of our findings.

References

1. Hasegawa K, Tsugawa Y, Brown DFM, Mansbach JM, Camargo CA. Trends in bronchiolitis hospitalizations in the United States, 2000-2009. Pediatrics. 2013;132(1):28-36. https://doi.org/10.1542/peds.2012-3877
2. Carroll KN, Gebretsadik T, Griffin MR, et al. Increasing burden and risk factors for bronchiolitis-related medical visits in infants enrolled in a state health care insurance plan. Pediatrics. 2008;122(1):58-64. https://doi.org/10.1542/peds.2007-2087
3. Fujiogi M, Goto T, Yasunaga H, et al. Trends in bronchiolitis hospitalizations in the United States: 2000–2016. Pediatrics. 2019;144(6):e20192614. https://doi.org/10.1542/peds.2019-2614
4. Schroeder AR, Mansbach JM. Recent evidence on the management of bronchiolitis. Curr Opin Pediatr. 2014;26(3):328-333. https://doi.org/10.1097/MOP.0000000000000090
5. American Academy of Pediatrics Subcommittee on Diagnosis and Management of Bronchiolitis. Diagnosis and management of bronchiolitis. Pediatrics. 2006;118(4):1774-1793. https://doi.org/10.1542/peds.2006-2223
6. Ralston SL, Lieberthal AS, Meissner HC, et al; American Academy of Pediatrics. Clinical practice guideline: the diagnosis, management, and prevention of bronchiolitis. Pediatrics. 2014;134(5):e1474. https://doi.org/10.1542/peds.2014-2742
7. Riese J, Porter T, Fierce J, Riese A, Richardson T, Alverson BK. Clinical outcomes of bronchiolitis after implementation of a general ward high flow nasal cannula guideline. Hosp Pediatr. 2017;7(4):197-203. https://doi.org/10.1542/hpeds.2016-0195
8. Perlstein PH, Kotagal UR, Bolling C, et al. Evaluation of an evidence-based guideline for bronchiolitis. Pediatrics. 1999;104(6):1334-1341. https://doi.org/10.1542/peds.104.6.1334
9. Perlstein PH, Kotagal UR, Schoettker PJ, et al. Sustaining the implementation of an evidence-based guideline for bronchiolitis. Arch Pediatr Adolesc Med. 2000;154(10):1001-1007. https://doi.org/10.1001/archpedi.154.10.1001
10. Wilson SD, Dahl BB, Wells RD. An evidence-based clinical pathway for bronchiolitis safely reduces antibiotic overuse. Am J Med Qual. 2002;17(5):195-199. https://doi.org/10.1177/106286060201700507
11. Barben J, Kuehni CE, Trachsel D, Hammer J; Swiss Paediatric Respiratory Research Group. Management of acute bronchiolitis: can evidence based guidelines alter clinical practice? Thorax. 2008;63(12):1103-1109. https://doi.org/10.1136/thx.2007.094706
12. Bryan MA, Desai AD, Wilson L, Wright DR, Mangione-Smith R. Association of bronchiolitis clinical pathway adherence with length of stay and costs. Pediatrics. 2017;139(3):e20163432. https://doi.org/10.1542/peds.2016-3432
13. Mittal S, Marlowe L, Blakeslee S, et al. Successful use of quality improvement methodology to reduce inpatient length of stay in bronchiolitis through judicious use of intermittent pulse oximetry. Hosp Pediatr. 2019;9(2):73-78. https://doi.org/10.1542/hpeds.2018-0023
14. Macias CG, Mansbach JM, Fisher ES, et al. Variability in inpatient management of children hospitalized with bronchiolitis. Acad Pediatr. 2015;15(1):69-76. https://doi.org/10.1016/j.acap.2014.07.005
15. Mittal V, Hall M, Morse R, et al. Impact of inpatient bronchiolitis clinical practice guideline implementation on testing and treatment. J Pediatr. 2014;165(3):570-6.e3. https://doi.org/10.1016/j.jpeds.2014.05.021
16. Bajaj L, Turner CG, Bothner J. A randomized trial of home oxygen therapy from the emergency department for acute bronchiolitis. Pediatrics. 2006;117(3):633-640. https://doi.org/10.1542/peds.2005-1322
17. Sandweiss DR, Mundorff MB, Hill T, et al. Decreasing hospital length of stay for bronchiolitis by using an observation unit and home oxygen therapy. JAMA Pediatr. 2013;167(5):422-428. https://doi.org/10.1001/jamapediatrics.2013.1435
18. National Bureau of Economic Research. ICD-9-CM to and from ICD-10-CM and ICD-10-PCS crosswalk or general equivalence mappings. Accessed December 2, 2020. http://www.nber.org/data/icd9-icd-10-cm-and-pcs-crosswalk-general-equivalence-mapping.html
19. Utah Department of Health, Indicator-Based Information System for Public Health. Accessed February 15, 2020. https://ibis.health.utah.gov/ibisph-view
20. James BC, Savitz LA. How Intermountain trimmed health care costs through robust quality improvement efforts. Health Aff (Millwood). 2011;30(6):1185-1191. https://doi.org/10.1377/hlthaff.2011.0358
21. Penfold RB, Zhang F. Use of interrupted time series analysis in evaluating health care quality improvements. Acad Pediatr. 2013;13(6 Suppl):S38-44. https://doi.org/10.1016/j.acap.2013.08.002
22. StataCorp. Stata Statistical Software: Release 15. StataCorp LLC; 2017.
23. Freeman JF, Deakyne S, Bajaj L. Emergency department-initiated home oxygen for bronchiolitis: a prospective study of community follow-up, caregiver satisfaction, and outcomes. Acad Emerg Med. 2017;24(8):920-929. https://doi.org/10.1111/acem.13179
24. Freeman JF, Brou L, Mistry R. Feasibility and capacity for widespread use of emergency department-based home oxygen for bronchiolitis. Am J Emerg Med. 2017;35(9):1379-1381. https://doi.org/10.1016/j.ajem.2017.03.069
25. Halstead S, Roosevelt G, Deakyne S, Bajaj L. Discharged on supplemental oxygen from an emergency department in patients with bronchiolitis. Pediatrics. 2012;129(3):e605-610. https://doi.org/10.1542/peds.2011-0889
26. Quinonez RA, Coon ER, Schroeder AR, Moyer VA. When technology creates uncertainty: pulse oximetry and overdiagnosis of hypoxaemia in bronchiolitis. BMJ. 2017;358:j3850. https://doi.org/10.1136/bmj.j3850
27. Burrows J, Berg K, McCulloh R. Intermittent pulse oximetry use and length of stay in bronchiolitis: bystander or primary Driver? Hosp Pediatr. 2019;9(2):142-143. https://doi.org/10.1542/hpeds.2018-0183
28. Norris AH, Shrestha NK, Allison GM, et al. 2018 Infectious Diseases Society of America clinical practice guideline for the management of outpatient parenteral antimicrobial therapy. Clin Infect Dis. 2019;68(1):e1-e35. https://doi.org/10.1093/cid/ciy745
29. Williams DN, Baker CA, Kind AC, Sannes MR. The history and evolution of outpatient parenteral antibiotic therapy (OPAT). Int J Antimicrob Agents. 2015;46(3):307-312. https://doi.org/10.1016/j.ijantimicag.2015.07.001
30. Beaty RS, Bernhardt MB, Berger AH, Hesselgrave JE, Russell HV, Okcu MF. Inpatient versus outpatient vincristine, dactinomycin, and cyclophosphamide for pediatric cancers: quality and cost implications. Pediatr Blood Cancer. 2015;62(11):1925-1928. https://doi.org/10.1002/pbc.25610
31. Coon ER, Stoddard G, Brady PW. Intensive care unit utilization after adoption of a ward-based high-flow nasal cannula protocol. J Hosp Med. 2020;15(6):325-330. https://doi.org/10.12788/jhm.3417
32. Freeman JF, Weng H-YC, Sandweiss D. Outpatient management of home oxygen for bronchiolitis. Clin Pediatr (Phila). 2015;54(1):62-66. https://doi.org/10.1177/0009922814547564

Issue
Journal of Hospital Medicine 16(5)
Page Number
261-266. Published Online First April 20, 2021
Display Headline
Nine Seasons of a Bronchiolitis Observation Unit and Home Oxygen Therapy Protocol
Article Source
© 2021 Society of Hospital Medicine
Correspondence Location
Timothy J D Ohlsen, MD; Email: timothy.ohlsen@seattlechildrens.org. Twitter: @TimOhlsenMD.

Automating Measurement of Trainee Work Hours

Article Type
Changed
Thu, 07/01/2021 - 10:48
Display Headline
Automating Measurement of Trainee Work Hours

Across the country, residents are bound to a set of rules from the Accreditation Council for Graduate Medical Education (ACGME) designed to minimize fatigue, maintain quality of life, and reduce fatigue-related patient safety events. Adherence to work hours regulations is required to maintain accreditation. Among other guidelines, residents are required to work fewer than 80 hours per week on average over 4 consecutive weeks.1 When work hour violations occur, programs risk citation, penalties, and harm to their reputation.

Residents self-report their adherence to program regulations in an annual survey conducted by the ACGME.2 To collect more frequent data, most training programs monitor resident work hours through self-report on an electronic tracking platform.3 These data generally are used internally to identify problems and opportunities for improvement. However, self-report approaches are subject to imperfect recall and incomplete reporting, and require time and effort to complete.4

The widespread adoption of electronic health records (EHRs) brings new opportunities to measure and promote adherence to work hours. EHR log data capture when users log in and out of the system, along with their location and specific actions. These data offer a compelling alternative to self-report because they are already being collected and can be analyzed almost immediately. A recent study that used EHR log data to approximate resident work hours in a pediatric hospital successfully reproduced scheduled hours, but the approach was customized to that hospital’s workflows and might not generalize to other settings.5 Furthermore, earlier studies have not captured evening out-of-hospital work, which contributes to total work hours and is associated with physician burnout.6

We developed a computational method that sought to accurately capture work hours, including out-of-hospital work, and that could be used as a screening tool to identify at-risk residents and rotations in near real-time. We estimated daily work hours, including EHR and non-EHR work, from EHR log data and compared these estimates with self-report. We then applied this heuristic to estimate the frequency of exceeding the 80-hour workweek in a large internal medicine residency program.

METHODS

The population included 82 internal medicine interns (PGY-1) and 121 residents (PGY-2 = 60, PGY-3 = 61) who rotated through University of California, San Francisco Medical Center (UCSFMC) between July 1, 2018, and June 30, 2019, on inpatient rotations. In the UCSF internal medicine residency program, interns spend an average of 5 months per year and residents spend an average of 2 months per year on inpatient rotations at UCSFMC. Scheduled inpatient rotations generally are in 1-month blocks and include general medical wards, cardiology, liver transplant, night-float, and a procedures and jeopardy rotation in which interns perform procedures at UCSFMC and serve as backup for their colleagues across sites. Although expected shift duration differs by rotation, shift types include regular-length days, call days that are not overnight but extend into the late evening, 28-hour overnight call (PGY-2 and PGY-3), and night-float.

Data Source

This computational method was developed at UCSFMC. This study was approved by the University of California, San Francisco institutional review board. Using the UCSF Epic Clarity database, EHR access log data were obtained, including all Epic logins/logoffs, times, and access devices. Access devices identified included medical center computers, personal computers, and mobile devices.

Trainees self-report their work hours in MedHub, a widely used electronic platform for tracking resident work hours.7 Data were extracted from this database for interns and residents who met the criteria above. The self-report data were considered the gold standard for comparison because, despite known limitations, they are the best measure available.

We used data collected from UCSF’s physician scheduling platform, AMiON, to identify interns and residents assigned to rotations at UCSF hospitals.8 AMiON also was used to capture half-days of off-site scheduled clinics and teaching, which count toward the workday but would not be associated with on-campus logins.

Developing a Computational Method to Measure Work Hours

We developed a heuristic to accomplish two goals: (1) infer the duration of continuous in-hospital work hours while providing clinical care and (2) measure “out-of-hospital” work. Logins from medical center computers were considered to be “on-campus” work. Logins from personal computers were considered to be “out-of-hospital.” “Out-of-hospital” login sessions were further subdivided into “out-of-hospital work” and “out-of-hospital study” based on activity during the session; if any work activities listed in Appendix Table 1 were performed, the session was attributed to work. If only chart review was performed, the session was attributed to study and did not count towards total hours worked. Logins from mobile devices also did not count towards total hours worked.
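A minimal sketch of this classification logic follows; it is not the study's code, and the activity labels stand in for the list in Appendix Table 1, which is not reproduced here.

```python
# Classify a single EHR login session by device type and recorded activities.
# WORK_ACTIVITIES is a hypothetical stand-in for the list in Appendix Table 1.
WORK_ACTIVITIES = {"note_entry", "order_entry", "in_basket_message", "result_release"}

def classify_session(device_type: str, activities: set) -> str:
    if device_type == "medical_center_computer":
        return "on_campus_work"
    if device_type == "mobile":
        return "excluded"                          # mobile logins not counted as work
    if device_type == "personal_computer":
        if activities & WORK_ACTIVITIES:
            return "out_of_hospital_work"          # any work activity -> work
        return "out_of_hospital_study"             # chart review only -> study, not counted
    return "excluded"

print(classify_session("personal_computer", {"chart_review"}))  # -> out_of_hospital_study
```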

We inferred continuous in-hospital work by linking on-campus EHR sessions from the first on-campus login until the last on-campus logoff (Figure 1).

Approach to Linking EHR Sessions to Measure the Total Workday
Based on our knowledge of workflows, residents generally print their patient lists when they arrive at the hospital and use the EHR to update hand-off information before they leave. To computationally infer a continuous workday, we determined the maximum amount of time between an on-campus logoff and a subsequent on-campus login that could be inferred as continuous work in the hospital. We calculated the probability that an individual would log in again on campus at any given number of hours after they last logged out (Appendix Figure 1). We found that after any given on-campus logoff, there was a 93% chance that the individual would log in again from on campus within the next 5 hours, indicating continuous on-campus work. However, after more than 5 hours had elapsed, there was a 90% chance that at least 10 hours would elapse before the next on-campus login, indicating a break between on-campus workdays. We therefore used 5 hours as the maximum interval between on-campus EHR sessions that would be linked together into a single workday. This window accounts for resident work in direct patient care, rounds, and other activities that do not involve the EHR.
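A minimal sketch of this linking rule (not the study's code) is below; the 5-hour gap is the empirically derived threshold described above, and the simple session data structure is an assumption for illustration.

```python
# Link on-campus EHR sessions into a continuous workday when the gap between
# a logoff and the next login is at most GAP_HOURS.
from datetime import datetime, timedelta

GAP_HOURS = 5  # derived from the distribution of intervals between on-campus logins

def link_workdays(sessions):
    """sessions: (login, logoff) datetime pairs for on-campus EHR use, sorted by login."""
    workdays = []
    for login, logoff in sessions:
        if workdays and login - workdays[-1][1] <= timedelta(hours=GAP_HOURS):
            start, end = workdays[-1]
            workdays[-1] = (start, max(end, logoff))   # extend the current workday
        else:
            workdays.append((login, logoff))           # start a new workday
    return workdays

example = [(datetime(2019, 1, 7, 6, 45), datetime(2019, 1, 7, 7, 30)),
           (datetime(2019, 1, 7, 11, 0), datetime(2019, 1, 7, 12, 15)),
           (datetime(2019, 1, 7, 16, 30), datetime(2019, 1, 7, 18, 5))]
print(link_workdays(example))  # one inferred workday: 06:45 to 18:05
```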

If on-campus work overlapped with personal computer logins (for example, a resident was inferred to be working on campus based on frequent medical center computer logins, but logins from a personal computer occurred during the same period), we inferred that a personal device had been brought on campus; that time was attributed only to on-campus work and was not double counted as out-of-hospital work. Out-of-hospital work that did not overlap with inferred on-campus work time contributed to the total hours worked in a week, consistent with ACGME guidelines.
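A sketch of how such overlap can be excluded when tallying out-of-hospital work (again not the study's code, and assuming the inferred workday spans do not overlap one another):

```python
# Count only the portion of an out-of-hospital session that falls outside any
# inferred on-campus workday span, so the time is not double counted.
from datetime import timedelta

def out_of_hospital_hours(session, workdays):
    """session: (login, logoff); workdays: list of non-overlapping (start, end) spans."""
    login, logoff = session
    remaining = logoff - login
    for start, end in workdays:
        overlap = min(logoff, end) - max(login, start)
        if overlap > timedelta(0):
            remaining -= overlap
    return max(remaining, timedelta(0)).total_seconds() / 3600
```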

Our internal medicine residents work at three hospitals: UCSFMC and two affiliated teaching hospitals. Although this study measured work hours while the residents were on an inpatient rotation at UCSFMC, trainees also might have occasional half-day clinics or teaching activities at other sites not captured by these EHR log data. The allocated time for that scheduled activity (extracted from AMiON) was counted as work hours. If the trainee was assigned to a morning half-day of off-site work (eg, didactics), this was counted the same as an 8 am to noon on-campus EHR session. If a trainee was assigned an afternoon half-day of off-site work (eg, a non-UCSF clinic), this was counted the same as a 1 pm to 5 pm on-campus EHR session. Counting this scheduled time as an on-campus EHR session allowed half-days of off-site work to be linked with inferred in-hospital work.
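This scheduling rule can be expressed as a small helper; the sketch below is illustrative only, with the 8 am-noon and 1-5 pm windows taken from the text.

```python
# Convert a scheduled off-site half-day (from AMiON) into a synthetic on-campus
# session so it can be linked to EHR-derived work with the same 5-hour rule.
from datetime import date, datetime, time

def synthetic_session(day: date, half: str):
    if half == "am":
        return (datetime.combine(day, time(8, 0)), datetime.combine(day, time(12, 0)))
    if half == "pm":
        return (datetime.combine(day, time(13, 0)), datetime.combine(day, time(17, 0)))
    raise ValueError("half must be 'am' or 'pm'")

print(synthetic_session(date(2019, 1, 7), "pm"))  # 13:00 to 17:00 on that date
```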

Comparison of EHR-Derived Work Hours Heuristic to Self-Report

Because resident adherence to daily self-report is imperfect, we compared EHR-derived work hours with self-report on days when both were available. We generated scatter plots of EHR-derived work hours compared with self-report and calculated the mean absolute error of estimation. We fit a linear mixed-effects model for each PGY, modeling self-reported hours as a linear function of estimated hours (fixed effect) with a random intercept (random effect) for each trainee to account for variation among individuals. StatsModels, version 0.11.1, was used for statistical analyses.9
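A minimal statsmodels sketch of this comparison (hypothetical file and column names; not the study's exact code):

```python
# Compare EHR-derived and self-reported daily hours on days with both available.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("daily_hours.csv")   # trainee_id, pgy, estimated_hours, self_reported_hours

mae = (df["estimated_hours"] - df["self_reported_hours"]).abs().mean()
print(f"mean absolute error: {mae:.2f} hours")

# Self-reported hours as a linear function of estimated hours (fixed effect) with
# a random intercept per trainee (random effect), fit separately for each PGY.
for pgy, grp in df.groupby("pgy"):
    fit = smf.mixedlm("self_reported_hours ~ estimated_hours", grp,
                      groups=grp["trainee_id"]).fit()
    print(pgy, fit.params["estimated_hours"])
```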

We reviewed detailed data from outlier clusters to understand situations in which the heuristic might not perform optimally. To assess whether EHR-derived work hours reasonably overlapped with expected shifts, twenty 8-day blocks from separate interns and residents were randomly selected for detailed qualitative review against AMiON schedule data.

Estimating Hours Worked and Work Hours Violations

After validating against self-report on a daily basis, we used our heuristic to infer the average rate at which the 80-hour workweek was exceeded across all inpatient rotations at UCSFMC. This was determined both with and without “out-of-hospital” work derived from personal computer logins. Using the estimated daily hours worked, we built a near real-time dashboard to assist program leadership with identifying at-risk trainees and trends across the program.
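A sketch of this screening step (not the production dashboard code; the file and column names are hypothetical, and the weekly average here is taken over each rotation):

```python
# Flag trainee rotation-months whose mean weekly inferred hours exceed 80.
import pandas as pd

daily = pd.read_csv("inferred_daily_hours.csv")  # trainee_id, rotation_month, date, hours

summary = daily.groupby(["trainee_id", "rotation_month"]).agg(
    total_hours=("hours", "sum"),
    days=("date", "nunique"),
)
summary["weekly_avg"] = summary["total_hours"] / (summary["days"] / 7)
summary["exceeds_80"] = summary["weekly_avg"] > 80
print(summary.loc[summary["exceeds_80"]].sort_values("weekly_avg", ascending=False))
```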

RESULTS

Data from 82 interns (PGY-1) and 121 internal medicine residents (PGY-2 and PGY-3) who rotated at UCSFMC between July 1, 2018, and June 30, 2019, were included in the study. Table 1 shows the number of days and rotations worked at UCSFMC as well as the frequency of self-report of work hours according to program year.

Total Days Worked at UCSFMC, Number of Rotations Worked at UCSFMC, Total Days With Self-Reported Hours, and Proportion of Days for Which There Was Self-Reporting
Figure 2 shows scatter plots of self-reported work hours compared with work hours estimated by our computational method. The mean absolute error of the heuristic relative to self-report was 1.38 hours. Explanations for outlier groups also are provided in Figure 2. Appendix Figure 2 shows the distribution of differences between estimated and self-reported daily work hours.

Daily Work Hours Estimated With the Computational Heuristic in Comparison to Self-Report

Qualitative review of EHR-derived data compared with schedule data showed that, although residents often reported homogenous daily work hours, EHR-derived work hours often varied as expected on a day-to-day basis according to the schedule (Appendix Table 2).

Because out-of-hospital EHR use does not count as work if done for educational purposes, we evaluated the proportion of out-of-hospital EHR use that is considered work and found that 67% of PGY-1, 50% of PGY-2, and 53% of PGY-3 out-of-hospital sessions included at least one work activity, as denoted in Appendix Table 1. Out-of-hospital work therefore represented 85% of PGY-1, 66% of PGY-2, and 73% of PGY-3 time spent in the EHR out of hospital. These sessions were counted toward work hours in accordance with ACGME rules and occurred on 29% of PGY-1 workdays and 21% of PGY-2 and PGY-3 workdays. This amounted to a median of 1.0 hours per day (95% CI, 0.1-4.6 hours) of out-of-hospital work for PGY-1, 0.9 hours per day (95% CI, 0.1-4.1 hours) for PGY-2, and 0.8 hours per day (95% CI, 0.1-4.7 hours) for PGY-3 residents. Out-of-hospital logins that did not include work activities, as denoted in Appendix Table 1, were labeled out-of-hospital study and did not count toward work hours; this amounted to a median of 0.3 hours per day (95% CI, 0.02-1.6 hours) for PGY-1, 0.5 hours per day (95% CI, 0.04-0.25 hours) for PGY-2, and 0.3 hours per day (95% CI, 0.03-1.7 hours) for PGY-3. Mobile device logins also were not counted toward total work hours, with a median of 3 minutes per day for PGY-1, 6 minutes per day for PGY-2, and 5 minutes per day for PGY-3.

The percentage of rotation months in which average hours worked exceeded 80 hours weekly is shown in Table 2. Inclusion of out-of-hospital work hours substantially increased the frequency at which the 80-hour workweek was exceeded. The frequency of individual residents working more than 80 hours weekly on average is shown in Appendix Figure 3. A narrow majority of PGY-1 and PGY-2 trainees and a larger majority of PGY-3 trainees never exceeded 80 hours per week when averaged over a rotation, but some trainees did so on multiple occasions.

Table 2. Impact of Out-of-Hospital Work on the Percentage of Rotation Months That Exceed the 80-Hour Workweek

Estimates from the computational method were built into a dashboard for use as a screening tool by residency program directors (Appendix Figure 4).

DISCUSSION

EHR log data can be used to automate measurement of trainee work hours, providing timely data to program directors for identifying residents at risk of exceeding work-hour limits. We demonstrated this by developing a data-driven approach to link on-campus logins that can be replicated in other training programs. We further demonstrated that out-of-hospital work substantially contributed to resident work hours and to the frequency with which residents exceeded the 80-hour workweek, making it a critical component of any work-hour estimation approach. Inclusive of out-of-hospital work, our computational method found that residents exceeded the 80-hour workweek 10% to 21% of the time, depending on their year in residency, with a narrow majority of residents never exceeding the 80-hour workweek.

Historically, most ACGME residency programs have relied on resident self-report to determine work hours.3 The validity of this method has been extensively studied and results remain mixed; in some surveys, residents admit to underreporting their hours while other validation studies, including the use of clock-in and clock-out or time-stamped parking data, align with self-report relatively well.10-12 Regardless of the reliability of self-report, it is a cumbersome task that residents have difficulty adhering to, as shown in our study, where only slightly more than one-half of the days worked had associated self-report. By relying on resident self-report, we are adding to the burden of clerical work, which is associated with physician burnout.13 Furthermore, because self-report typically does not happen in real-time, it limits a program’s ability to intervene on recent or impending work-hour violations. Our computational method enabled us to build a dashboard that is updated daily and provides critical insight into resident work hours at any time, without waiting for retrospective self-report.

Our study builds on previous work by Dziorny et al using EHR log data to algorithmically measure in-hospital work.5 In their study, the authors isolated shifts with a login gap of 4 hours and then combined shifts according to a set of heuristics. However, their logic incorporated an extensive workflow analysis of trainee shifts, which might limit generalizability.5 Our approach computationally derives the temporal threshold for linking EHR sessions, which in our data was 5 hours but might differ at other sites. Automated derivation of this threshold will support generalizability to other programs and sites, although programs will still need to manually account for off-site work such as didactics. In a subsequent study evaluating the 80-hour workweek, Dziorny et al evaluated shift duration and appropriate time off between shifts and found systematic underreporting of work.14 In our study, we prioritized evaluation of the 80-hour workweek and found general alignment between self-report and EHR-derived work-hour estimates, with a tendency to underestimate at lower reported work hours and overestimate at higher reported work hours (potentially because of underreporting, as illustrated by Dziorny et al). We included out-of-hospital logins as discrete work events because out-of-hospital work contributes to the total hours worked and to the number of workweeks that exceed the 80-hour workweek, and might contribute to burnout.15 The incidence of exceeding the 80-hour workweek increased by 7% to 8% across all residents when out-of-hospital work was included, demonstrating that tools such as ResQ (ResQ Medical) that rely primarily on geolocation data might not sufficiently capture the ways in which residents spend their time working.16

Our approach has limitations. We determined on-campus vs out-of-hospital location based on whether the login device belonged to the medical center or was a personal computer. Consequently, if trainees exclusively used a personal computer while on campus and never used a medical center computer, we would have captured the EHR-based work but would not have inferred on-campus work. Although nearly all trainees in our organization use medical center computers throughout the day, this might limit generalizability for programs whose trainees use personal computers exclusively in the hospital. Our approach also assumes trainees will use the EHR at the beginning and end of their workdays, which could lead to underestimation of work hours for trainees who do not. With regard to work done on personal computers, our heuristic required that at least one work activity (as denoted in Appendix Table 1) be included in the session for it to count as work. Although this approach allows us to exclude sessions where trainees might be reviewing charts exclusively for educational purposes, it is difficult to infer the true intent of chart review.

There might be periods when residents are doing in-hospital work but more than 5 hours elapse between EHR sessions. As we have started adapting this computational method for other residency programs, we have added logic that allows long periods in the operating room to be considered part of a continuous workday. There also are limitations to assigning blocks of time to off-site clinics; clinics that generate after-hours work in a different EHR would not be captured in total out-of-hospital work.

Although correlation with self-report was good, we identified clusters of inaccuracy. This likely resulted from our residency program covering three medical centers, two of which were not included in the data set. For example, if a resident had an off-site clinic that was not accounted for in AMiON, EHR-derived work hours might have been underestimated relative to self-report. Operationally leveraging an automated system for measuring work hours in the form of dashboards and other tools could provide the impetus to ensure accurate documentation of schedule anomalies.

CONCLUSION

Implementation of our EHR-derived work-hour model will allow ACGME residency programs to understand and act upon trainee work-hour violations closer to real time, as the data extraction is daily and automated. Automation will save busy residents a cumbersome task, provide more complete data than self-report, and empower residency programs to intervene quickly to support overworked trainees.

Acknowledgments

The authors thank Drs Bradley Monash, Larissa Thomas, and Rebecca Berman for providing residency program input.

References

1. Accreditation Council for Graduate Medical Education. Common program requirements. Accessed August 12, 2020. https://www.acgme.org/What-We-Do/Accreditation/Common-Program-Requirements
2. Accreditation Council for Graduate Medical Education. Resident/fellow and faculty surveys. Accessed August 12, 2020. https://www.acgme.org/Data-Collection-Systems/Resident-Fellow-and-Faculty-Surveys
3. Petre M, Geana R, Cipparrone N, et al. Comparing electronic and manual tracking systems for monitoring resident duty hours. Ochsner J. 2016;16(1):16-21.
4. Gonzalo JD, Yang JJ, Ngo L, Clark A, Reynolds EE, Herzig SJ. Accuracy of residents’ retrospective perceptions of 16-hour call admitting shift compliance and characteristics. J Grad Med Educ. 2013;5(4):630-633. https://doi.org/10.4300/jgme-d-12-00311.1
5. Dziorny AC, Orenstein EW, Lindell RB, Hames NA, Washington N, Desai B. Automatic detection of front-line clinician hospital shifts: a novel use of electronic health record timestamp data. Appl Clin Inform. 2019;10(1):28-37. https://doi.org/10.1055/s-0038-1676819
6. Gardner RL, Cooper E, Haskell J, et al. Physician stress and burnout: the impact of health information technology. J Am Med Inform Assoc. 2019;26(2):106-114. https://doi.org/10.1093/jamia/ocy145
7. MedHub. Accessed April 7, 2021. https://www.medhub.com
8. AMiON. Accessed April 7, 2021. https://www.amion.com
9. Seabold S, Perktold J. Statsmodels: econometric and statistical modeling with Python. Proceedings of the 9th Python in Science Conference; 2010. https://conference.scipy.org/proceedings/scipy2010/pdfs/seabold.pdf
10. Todd SR, Fahy BN, Paukert JL, Mersinger D, Johnson ML, Bass BL. How accurate are self-reported resident duty hours? J Surg Educ. 2010;67(2):103-107. https://doi.org/10.1016/j.jsurg.2009.08.004
11. Chadaga SR, Keniston A, Casey D, Albert RK. Correlation between self-reported resident duty hours and time-stamped parking data. J Grad Med Educ. 2012;4(2):254-256. https://doi.org/10.4300/JGME-D-11-00142.1
12. Drolet BC, Schwede M, Bishop KD, Fischer SA. Compliance and falsification of duty hours: reports from residents and program directors. J Grad Med Educ. 2013;5(3):368-373. https://doi.org/10.4300/JGME-D-12-00375.1
13. Shanafelt TD, Dyrbye LN, West CP. Addressing physician burnout: the way forward. JAMA. 2017;317(9):901. https://doi.org/10.1001/jama.2017.0076
14. Dziorny AC, Orenstein EW, Lindell RB, Hames NA, Washington N, Desai B. Pediatric trainees systematically under-report duty hour violations compared to electronic health record defined shifts. PLOS ONE. 2019;14(12):e0226493. https://doi.org/10.1371/journal.pone.0226493
15. Saag HS, Shah K, Jones SA, Testa PA, Horwitz LI. Pajama time: working after work in the electronic health record. J Gen Intern Med. 2019;34(9):1695-1696. https://doi.org/10.1007/s11606-019-05055-x
16. ResQ Medical. Accessed April 7, 2021. https://resqmedical.com

Author and Disclosure Information

1Health Informatics, University of California, San Francisco, San Francisco, California; 2Center for Clinical Informatics and Improvement Research, University of California, San Francisco, San Francisco, California; 3Department of Medicine, University of California, San Francisco, San Francisco, California.

Disclosures
The authors have nothing to disclose.

Enhancing Diabetes Self-Management Education and Psychological Services for Veterans With Comorbid Chronic Health and Mental Health Conditions

Article Type
Changed
Tue, 05/03/2022 - 15:06

Veterans have a higher prevalence of type 2 diabetes mellitus (T2DM) than their civilian counterparts, with an overall prevalence of 25%.1 This higher prevalence parallels that of other major chronic health conditions, including heart disease and arthritis, and carries additional costs for disease self-management.2 Psychological and behavior-change strategies are a principal means of limiting disease severity and even restoring function once T2DM is diagnosed.3 More broadly, there is mounting evidence that addressing distress and supporting behavior change are important across many conditions, particularly T2DM.4 Therefore, the US Department of Veterans Affairs (VA) has established patient education and multidisciplinary interventions to optimize engagement in T2DM self-management and health behavior change.5

Traditional T2DM education programs aim to meet the American Diabetes Association (ADA) standards of medical care and include a T2DM educator and other allied health professionals. ADA Standard 1.2 emphasizes “productive interactions between a prepared, proactive care team and an informed, activated patient.”6 Thus, to attain ADA accreditation, educational programs require instructors to teach about T2DM while engaging patients to help them set and achieve recommended changes. The requirements emphasize setting specific goals (eg, eating wisely, being physically active, monitoring blood glucose, or taking medications). The care team also helps to identify barriers, and at a required follow-up class, patients evaluate how well they met their goals and make modifications if needed. The ability of traditional patient education programs to improve glycemic levels is well established.7 Importantly, veterans with comorbid mental health conditions may not experience the same beneficial outcomes if or when they participate in traditional diabetes or self-management programs.8,9 Veterans with T2DM may be particularly vulnerable to chronic stress and the effects of comorbid mental health diagnoses.10 Furthermore, when individuals experience T2DM-related distress, associations with poor health outcomes, including elevated hemoglobin A1c (HbA1c), are observed independent of depression.11

Health psychology services integrate into medical settings and strive to reach veterans who may not engage in traditional mental health clinical offerings.12 These collaborative interventions focus less on diagnostic or screening procedures and more on a patient’s understanding of their illness and their ability and willingness to carry out treatment regimens. Given the significant roles of distress and co-occurring conditions, health psychology services also provide psychoeducation about stress management and work to explore and enhance motivation for a wide range of health behavior changes.

The purpose of this study was to evaluate baseline and follow-up HbA1c, weight, and psychosocial measures, namely health-related self-efficacy and T2DM-related distress, among a small sample of veterans who engaged in integrated health psychology services. The focus of this evidence-based psychotherapy service was to improve T2DM self-care and physical health. Participants were offered cognitive and behavioral strategies for setting and meeting personalized T2DM self-management goals. Importantly, motivational interviewing was used throughout to adapt to participants’ preferences and needs and to maintain engagement.

Methods

Primary care providers referred veterans with T2DM to the Health Psychology service at the VA Ann Arbor Healthcare System (VAAAHS). A T2DM diagnosis was verified through electronic health record review. The most common referral reasons were coping with chronic illness and improving glycemic control. Veterans were invited to participate in a program evaluation project to monitor health-related changes. All participants provided written informed consent and received no incentive or payment for participating. The VAAAHS Institutional Review Board reviewed and approved this study.

Intervention

Veterans met individually with a health psychologist or health psychology trainee to create personalized health and behavioral goals for improving T2DM self-management, overall health, and psychological well-being. The intervention included motivational interviewing, SMART (specific, measurable, action-oriented, realistic, timely) goal setting, behavioral activation, acceptance of T2DM-related physical changes, problem-solving therapy, challenging maladaptive disease-related cognitions, and incorporating values to help find motivation for change. Interventionists took a flexible approach, meeting in person in primary or specialty care clinics, by phone, or through telehealth 1 to 4 times a month according to participant preferences, with sessions typically lasting 45 to 60 minutes. The goal of the study was to disseminate and implement evidence-based behavior-change strategies within a multidisciplinary team format without excluding veterans who would benefit from receiving active treatment. Because of this translational approach, a control group was not included for comparison.

Data Collection

Participants completed study measures at the beginning and end of the T2DM-focused intervention sessions. Demographic and clinical variables collected included age, sex, race/ethnicity, highest educational attainment, insulin prescription, service connection for T2DM, concurrent enrollment in other educational programs, and time since T2DM diagnosis. Measures were selected based on their relevance to T2DM psychosocial care and diabetes health outcomes.13


Body mass index (BMI), low-density lipoprotein cholesterol, blood pressure (BP), and HbA1c within 3 months of the pre- and postmeasures were collected by reviewing medical records. T2DM complications were collected by self-report, and comorbid physical and mental health conditions were collected by review of the most recent primary care note. The Diabetes Empowerment Scale-Short Form (DES-SF), a well-validated measure, was used to assess T2DM-related psychosocial self-efficacy.14 Scores range from 8 to 40, with higher scores indicating greater T2DM empowerment. The Patient Health Questionnaire 9-item (PHQ-9) was used to assess the frequency of somatic (fatigue, appetite, psychomotor) and cognitive (anhedonia, low mood) symptoms of depression over the past 2 weeks.15 The Generalized Anxiety Disorder 7-item (GAD-7) was used to assess the frequency of common anxiety symptoms, including feelings of worry, difficulty controlling worry, and trouble relaxing.16 Veterans also were asked to rate their general health on a 5-point Likert scale. Self-rated health is a well-established indicator of disability and of the risk of future T2DM complications in older adults.17,18 The Diabetes Distress Scale (DDS) was used to measure emotional burden, physician-related distress, regimen-related distress, and T2DM-related interpersonal distress.19 Subscale scores > 2.0 suggest clinically significant diabetes distress.20 Medication adherence questionnaires were adapted from Wilson and colleagues.21
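To make the subscale-based scoring concrete, the following is a minimal illustrative sketch in Python of how DDS-style subscale means and the > 2.0 clinical cutoff can be computed. The item-to-subscale mapping and the example responses are placeholders, not the published scale content or study data; only the general pattern (subscale means on 1-6 Likert items, with values > 2.0 flagged as clinically significant) follows the description above.

```python
"""Illustrative scoring sketch for DDS-style subscale means.

This is NOT the official scoring program; the item-to-subscale mapping
below is a placeholder and should be replaced with the published
assignments before any real use.
"""
from statistics import mean

# Hypothetical mapping of item numbers (1-6 Likert responses) to subscales.
SUBSCALES = {
    "emotional_burden": [1, 3, 8],     # placeholder item numbers
    "physician_related": [2, 4, 9],
    "regimen_related": [5, 6, 10],
    "interpersonal": [7, 11, 12],
}

CLINICAL_CUTOFF = 2.0  # scores > 2.0 treated as clinically significant distress


def score_dds(responses: dict) -> dict:
    """Return subscale means and a flag for clinically significant distress."""
    results = {}
    for name, items in SUBSCALES.items():
        subscale_mean = mean(responses[i] for i in items)
        results[name] = {
            "mean": round(subscale_mean, 2),
            "clinically_significant": subscale_mean > CLINICAL_CUTOFF,
        }
    # Total distress is the mean across all administered items.
    all_items = [i for items in SUBSCALES.values() for i in items]
    total = mean(responses[i] for i in all_items)
    results["total"] = {
        "mean": round(total, 2),
        "clinically_significant": total > CLINICAL_CUTOFF,
    }
    return results


if __name__ == "__main__":
    # Example: a respondent answering every item with "3" (moderate distress).
    example = {i: 3 for i in range(1, 13)}
    print(score_dds(example))
```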

Statistical Analyses

Descriptive statistics, including mean and standard deviation (SD) or frequency distributions, as appropriate, were used to characterize the sample. For pre- and postintervention within-group comparisons, paired-samples t tests were used to evaluate baseline and follow-up measures for statistically significant differences in continuous variables; scores also were evaluated for clinically meaningful change.
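As a concrete illustration of the analytic approach described above, the sketch below runs a paired-samples t test on hypothetical baseline and follow-up scores using SciPy. All numbers are invented for demonstration and are not study data.

```python
"""Minimal sketch of the pre/post comparison described above: a
paired-samples t test on baseline vs follow-up scores."""
import numpy as np
from scipy import stats

# Hypothetical baseline and follow-up distress scores for 12 completers.
baseline = np.array([3.1, 2.8, 2.4, 3.5, 2.0, 2.9, 3.2, 2.6, 2.2, 3.0, 2.7, 2.5])
followup = np.array([2.2, 2.0, 1.9, 2.6, 1.8, 2.1, 2.4, 2.0, 1.7, 2.3, 2.1, 1.9])

# Paired (within-group) t test, two-sided.
t_stat, p_value = stats.ttest_rel(baseline, followup)

# Descriptive change summary: mean difference and its standard error,
# matching the SEM-style reporting used in the Results section.
diff = baseline - followup
mean_change = diff.mean()
sem_change = stats.sem(diff)

print(f"t({len(diff) - 1}) = {t_stat:.2f}, p = {p_value:.3f}")
print(f"mean change = {mean_change:.2f} (SEM {sem_change:.2f})")
```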

Results

This sample (N = 13) of older adults was predominantly male and white, with HbA1c > 7.0% and prescribed insulin (Table). On average, participants were at higher risk for future complications due to high BP, hyperlipidemia, and BMI > 30.0. Veterans were seen for an average of 7.8 sessions (range, 4-13), and 46% were service connected for T2DM. Of note, 4 veterans received other T2DM-specific self-management support within the same year as their participation with health psychology, such as attending a T2DM education class or a T2DM shared medical appointment.22 Internal consistency in the current sample was high for the DES-SF (Cronbach α = 0.90), good for the PHQ-9 (Cronbach α = 0.81), and very good for the GAD-7 (Cronbach α = 0.86).

Table. Descriptive Statistics for the Demographic and Health Characteristics at Baseline

Among the 13 older adults, the most common T2DM-related complications were peripheral neuropathy (n = 7), heart pain or heart attack (n = 5), and retinopathy (n = 4). Recent primary care notes showed a mean (SD) of 7 (2.2) comorbid chronic medical conditions, with a high prevalence of cardiometabolic illnesses, including hypertension, hyperlipidemia, obstructive sleep apnea, and chronic pain diagnoses. Eleven veterans had a diagnosed mental health condition, including bipolar disorder, depression, anxiety, trauma-related disorders, and sleep disorders. Veterans reported high T2DM emotional distress (mean [SD] 3.1 [1.2]), moderate regimen-related distress (mean [SD] 2.9 [1.1]), and moderate total T2DM distress (mean [SD] 2.4 [0.7]). The physician-related distress (mean [SD] 1.3 [0.55]) and interpersonal T2DM distress (mean [SD] 1.6 [0.9]) subscales indicated little to no distress. The sample reported mild symptoms of depression (PHQ-9 mean [SD] 8.8 [4.6]), mild symptoms of anxiety (GAD-7 mean [SD] 7.1 [4.4]), and relatively high diabetes empowerment (DES-SF mean [SD] 31.2 [6.0]). Participants reported missing an average of 2.4 days of their oral T2DM medications within the past 30 days.
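The internal-consistency (Cronbach α) coefficients reported above for the DES-SF, PHQ-9, and GAD-7 follow the standard formula, illustrated in the short sketch below with simulated (not study) data; the item counts and respondent numbers are placeholders chosen only to mimic the scale structure.

```python
"""Sketch of the Cronbach alpha calculation behind the reliability
values reported above. Data are simulated, not study data."""
import numpy as np


def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = scale items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulate 13 respondents on a 9-item questionnaire with a shared factor
    # so the items correlate (roughly mimicking PHQ-9-style data).
    true_score = rng.normal(size=(13, 1))
    simulated = true_score + rng.normal(scale=0.7, size=(13, 9))
    print(f"alpha = {cronbach_alpha(simulated):.2f}")
```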

Twelve veterans (92.3%) completed the follow-up questionnaires. The Figure illustrates statistically significant changes in patient-reported outcomes between baseline and follow-up. Clinically meaningful reductions were shown in total T2DM distress (t11 = 5.03, P < .01), T2DM emotional burden (t11 = 4.83, P = .01), and T2DM regimen-related distress (t11 = 5.14, P < .01). There also was a significant increase in T2DM self-efficacy (t11 = 0.32, P = .008). A statistically significant reduction was seen in depressive symptoms (t11 = 2.22, P = .048). HbA1c fell by 0.56 percentage points (standard error of the mean [SEM], 0.31; P = .10), but this change was not statistically significant. Follow-up analyses also showed a clinically, though not statistically, significant weight loss of 6.9 lb (SEM, 3.8; P = .20) and a reduction in generalized anxiety of 1.2 points (SEM, 1.4; P = .42). Pre- and postanalyses did not show differences in self-rated health, physician-related burden, interpersonal-related burden, or indicators of medication-taking behavior.
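As a rough arithmetic check, a reported mean change and its SEM imply a paired t statistic (mean change divided by SEM) on n − 1 degrees of freedom. The sketch below, assuming the 12 completers and the HbA1c and GAD-7 change values reported above, reproduces the approximate two-sided P values; it is an illustration of the arithmetic, not a reanalysis of study data.

```python
"""Back-of-the-envelope check: mean change / SEM gives the paired t
statistic; the two-sided P value follows from the t distribution."""
from scipy import stats

n = 12          # follow-up completers
df = n - 1

# (mean change, SEM of change) pairs taken from the Results text.
reported = {
    "HbA1c (percentage points)": (0.56, 0.31),
    "GAD-7 (points)": (1.2, 1.4),
}

for label, (mean_change, sem) in reported.items():
    t = mean_change / sem                 # paired t on the difference scores
    p = 2 * stats.t.sf(abs(t), df)        # two-sided P value
    print(f"{label}: t({df}) = {t:.2f}, P = {p:.2f}")
```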

Discussion

This observational study evaluated change in patient-reported T2DM-specific and general distress measures and health outcomes among a small sample of veterans at VAAAHS who engaged in an episode of individual care with health psychology. Statistically significant decreases were observed in T2DM-related distress. Notably, these decreases were observed for the emotional burden and regimen subscales, and each was clinically meaningful, falling below a score of 2.0 on the T2DM-specific scale. This is important given that T2DM distress may interfere with the ability to understand and find motivation for engaging in health behavior change. Incorporating stress management interventions into interdisciplinary health programs has been shown to improve not only levels of distress, but also other health outcomes, such as health-related quality of life and cardiac events in heart disease.23 Thus, behavioral health interventions that incorporate cognitive-behavioral strategies to enhance distress-specific coping may prove important for individuals with T2DM.

Reductions in T2DM-related distress also converged with increases observed on the T2DM empowerment scale. These significant improvements in perceived ability suggest increased self-efficacy and willingness to follow a daily T2DM regimen. This finding aligns with the social support literature demonstrating that instrumental and other aspects of autonomous social support mediate improvements in health-related outcomes and reductions in T2DM distress.24,25 Health psychology interventions strive both to provide social support and to enhance participants’ perceptions and use of existing support as a cognitive-behavioral strategy. Adding assessments of social support could shed light on such mediating factors.


The ADA standards of care encourage health care providers to engage patients in conversations in order to better understand the barriers to T2DM self-care.13 How best to support patients within a primary care multidisciplinary team remains unclear.26 T2DM distress and negative reactions to T2DM, including symptoms of anxiety and depression, are common and may require specific referral to a mental health provider if repeated attempts at T2DM education do not improve self-management and illness biomarkers.27 Thus, integrating these providers and services within the medical setting aims to reach more veterans and potentially meet these standards of care. With our integrated health psychology services, clinically significant decreases in anxiety and statistically significant decreases in depressive symptoms were observed, with scores approaching the “mild to no” symptom range. Although this was not measured formally, the veterans were not engaging in specialty mental health care either historically or during the year of the health psychology intervention. This suggests that health psychology services helped bridge the gap and address these psychosocial needs within the small sample.

For clinical measures, modest decreases were observed for HbA1c and weight. The authors recognize that these changes may not be optimal in terms of health status. A review of the specific patient-centered goals may illuminate this finding. For example, 1 participant had a goal to consume fewer sugary beverages and achieved this behavior change; yet this change alone may not translate into weight loss or a lower HbA1c. Furthermore, in the context of T2DM-related distress, maintaining current weight and/or blood glucose levels may be a more realistic goal. An evaluation of specific patient-oriented action goals and observed progress may be an important outcome to include in larger studies. Moreover, while not statistically significant, the average HbA1c decrease of roughly half a percentage point is comparable with traditional T2DM education and should be considered in light of the sample’s significant mental health comorbidities. While landmark intensive glucose control trials illustrate significant benefits in reducing hyperglycemia and nonfatal cardiovascular disease, these reductions are associated with an approximately 2-fold risk of hypoglycemia.28-30 Thus, the focus on improved glycemic control has been criticized as lacking meaning to patients in contrast to preventing T2DM complications and preserving quality of life.31

Limitations and Future Directions

Noted limitations include the small sample size and the variable treatment duration and number of sessions, given that the intervention was tailored to each veteran. Conclusions drawn from a small sample may be influenced by individual outliers. Given their co-occurring conditions and moderate levels of distress, all participants may benefit from additional support resources.

In addition to these considerations, adding a comparison group could further strengthen the study as part of an observational database. A between-group comparison could help clinicians better understand what the interventions offer as well as the individual factors that relate to participation and success with behavior change. Future studies with a priori hypotheses also could examine trajectories of weight and blood glucose levels over extended periods, for example, 6 months before the intervention and 6 months after.32 Given the complexity of comorbid mental health and chronic medical conditions in this sample, it also may be important to assess chronic physical symptoms as an additional barrier to veterans making health behavior changes.

Conclusions

The authors believe that the health psychology interventions offered important support and motivation for engagement in health behavior change that led to reduced distress in this patient group. Engaging veterans with psychiatric conditions in mental health care remains a challenge, as does reducing the costs and complications associated with chronic illness management for health care systems.33 Aligned with these broader health care goals, the ADA aims to reduce complications and costs and improve outcomes for T2DM with guidelines requiring mental and behavioral health interventions. The authors believe that health psychology interventions are a personalized and feasible bridge to address engagement and illness-related distress while improving patient satisfaction and T2DM self-management.

Acknowledgments

The authors thank the veterans who participated in the observational study. We thank the VA Ann Arbor Healthcare System Institutional Review Board. For instrumental support for health psychology integrated services, we acknowledge Adam Tremblay, MD, Primary Care Chief, and R.J. Schildhouse, MD, Acting Associate Chief of Staff, Ambulatory Care. The work was supported by the Ambulatory Care Service at the VA Ann Arbor Healthcare System and the VA Office of Academic Affiliations.

References

1. Liu Y, Sayam S, Shao X, et al. Prevalence of and trends in diabetes among veterans, United States, 2005-2014. Prev Chronic Dis. 2017;14(12):E135, 1-5. doi:10.5888/pcd14.170230

2. Yu W, Ravelo A, Wagner TH, et al. Prevalence and costs of chronic conditions in the VA health care system. Med Care Res Rev. 2003;60(3)(suppl):146S-167S. doi:10.1177/1077558703257000

3. American Psychological Association. Psychology and Health in Action. Updated 2016. Accessed February 10, 2021. https://www.apa.org/health/fall-2016-updates.pdf

4. The US Burden of Disease Collaborators. The state of US health, 1990-2016. JAMA. 2018;319(14):1444-1472. doi:10.1001/jama.2018.0158

5. Piette JD, Kerr E, Richardson C, Heisler M. Veterans Affairs research on health information technologies for diabetes self-management support. J Diabetes Sci Technol. 2008;2(1):15-23. doi:10.1177/193229680800200104

6. American Diabetes Association. 1. Improving care and promoting health in populations: Standards of Medical Care in Diabetes—2019. Diabetes Care. 2019;42(suppl 1):S7-S12. doi:10.2337/dc19-S001

7. Norris SL, Lau J, Smith SJ, Schmid CH, Engelgau MM. Self-management education for adults with type 2 diabetes. A meta-analysis of the effect on glycemic control. Diabetes Care. 2002;25(7):1159-1171. doi:10.2337/diacare.25.7.1159

8. Janney CA, Owen R, Bowersox NW, Ratz D, Kilbourne EA. Bipolar disorder influences weight loss in the nationally implemented MOVE! program for veterans. Bipolar Disord. 2015;17:87.

9. Piette JD, Kerr EA. The impact of comorbid chronic conditions on diabetes care. Diabetes Care. 2006;29(3):725-731. doi:10.2337/diacare.29.03.06.dc05-2078

10. Trief PM, Ouimette P, Wade M, Shanahan P, Weinstock RS. Post-traumatic stress disorder and diabetes: Co-morbidity and outcomes in a male veterans sample. J Behav Med. 2006;29(5):411-418. doi:10.1007/s10865-006-9067-2

11. Fisher L, Mullan JT, Arean P, Glasgow RE, Hessler D, Masharani U. Diabetes distress but not clinical depression or depressive symptoms is associated with glycemic control in both cross-sectional and longitudinal analyses. Diabetes Care. 2010;33(1):23-28. doi:10.2337/dc09-1238

12. Bohnert KM, Pfeiffer PN, Szymanski BR, McCarthy JF. Continuation of care following an initial primary care visit with a mental health diagnosis: differences by receipt of VHA Primary Care-Mental Health Integration services. Gen Hosp Psychiatry. 2013;35(1):66-70. doi:10.1016/j.genhosppsych.2012.09.002

13. Young-Hyman D, De Groot M, Hill-Briggs F, Gonzalez JS, Hood K, Peyrot M. Psychosocial care for people with diabetes: a position statement of the American Diabetes Association. Diabetes Care. 2016;39(12):2126-2140. doi:10.2337/dc16-2053

14. Anderson R, Fitzgerald J, Gruppen L, Funnell M, Oh M. The diabetes empowerment scale-short form (DES-SF). Diabetes Care. 2003;26(5):1641-1642. doi:10.2337/diacare.26.5.1641-a

15. Kroenke K, Spitzer RL, Williams JBW. The PHQ-9: validity of a brief depression severity measure. J Gen Intern Med. 2001;16(9):606-613. doi:10.1046/j.1525-1497.2001.016009606.x

16. Spitzer RL, Kroenke K, Williams JBW, Löwe B. A brief measure for assessing generalized anxiety disorder: the GAD-7. Arch Intern Med. 2006;166(10):1092-1097. doi:10.1001/archinte.166.10.1092

17. Pinquart M. Correlates of subjective health in older adults: a meta-analysis. Psychol Aging. 2001;16(3):414. doi:10.1037/0882-7974.16.3.414

18. Hayes AJ, Clarke PM, Glasziou PG, Simes RJ, Drury PL, Keech AC. Can self-rated health scores be used for risk prediction in patients with type 2 diabetes? Diabetes Care. 2008;31(4):795-797. doi:10.2337/dc07-1391

19. Polonsky WH, Fisher L, Earles J, et al. Assessing psychosocial distress in diabetes: development of the diabetes distress scale. Diabetes Care. 2005;28(3):626-631. doi:10.2337/diacare.28.3.626

20. Fisher L, Hessler DDM, Polonsky WH, Mullan J. When is diabetes distress meaningful?: Establishing cut points for the Diabetes Distress Scale. Diabetes Care. 2012;35(2):259-264. doi:10.2337/dc11-1572

21. Wilson IB, Fowler FJ Jr, Cosenza CA, et al. Cognitive and field testing of a new set of medication adherence self-report items for HIV care. AIDS Behav. 2013;18(12):2349-2358. doi:10.1007/s10461-013-0610-1

22. Heisler M, Burgess J, Cass J, et al. The Shared Health Appointments and Reciprocal Enhanced Support (SHARES) study: study protocol for a randomized trial. Trials. 2017;18(1):239. doi:10.1186/s13063-017-1959-7

23. Blumenthal JA, Babyak MA, Carney RM, et al. Exercise, depression, and mortality after myocardial infarction in the ENRICHD Trial. Med Sci Sports Exerc. 2004;36(5):746-755. doi:10.1249/01.MSS.0000125997.63493.13

24. Lee AA, Piette JD, Heisler M, Rosland AM. Diabetes distress and glycemic control: the buffering effect of autonomy support from important family members and friends. Diabetes Care. 2018;41(6):1157-1163. doi:10.2337/dc17-2396

25. Baek RN, Tanenbaum ML, Gonzalez JS. Diabetes burden and diabetes distress: the buffering effect of social support. Ann Behav Med. 2014;48(2):1-11. doi:10.1007/s12160-013-9585-4

26. Jortberg BT, Miller BF, Gabbay RA, Sparling K, Dickinson WP. Patient-centered medical home: how it affects psychosocial outcomes for diabetes. Curr Diab Rep. 2012;12(6):721-728. doi:10.1007/s11892-012-0316-1

27. American Diabetes Association. Lifestyle management: standards of medical care in diabetes-2019. Diabetes Care. 2019;41(suppl 1):S38-S50. doi:10.2337/dc19-S005

28. UK Prospective Diabetes Study Group. Effect of intensive blood-glucose control with metformin on complications in overweight patients with type 2 diabetes. Lancet. 1998;352(9131):854-865.

29. The Diabetes Control and Complications Trial Research Group. The effect of intensive treatment of diabetes on the development and progression of long-term complications in insulin-dependent diabetes mellitus. N Engl J Med. 1993;329(14):977-986. doi:10.1056/NEJM199309303291401

30. Kelly TN, Bazzano LA, Fonseca VA, Thethi TK, Reynolds K, He J. Systematic review: glucose control and cardiovascular disease in type 2 diabetes. Ann Intern Med. 2009;151(6):394-403.

31. Yudkin JS, Lipska KJ, Montori VM. The idolatry of the surrogate. BMJ. 2012;344(7839):8-10. doi:10.1136/bmj.d7995

32. Lutes LD, Damschroder LJ, Masheb R, et al. Behavioral treatment for veterans with obesity: 24-month weight outcomes from the ASPIRE-VA Small Changes Randomized Trial. J Gen Intern Med. 2017;32(1):40-47. doi:10.1007/s11606-017-3987-0

33. Krejci LP, Carter K, Gaudet T. The vision and implementation of personalized, proactive, patient-driven health care for veterans. Med Care. 2014;52(12)(suppl 5):S5-S8. doi:10.1097/MLR.0000000000000226

Author and Disclosure Information

Naomi Kane is a Clinical Psychology Postdoctoral Fellow in behavioral medicine and postdeployment health at the New Jersey VA War Related Illness and Injury Study Center in East Orange. Naomi Kane was previously a Psychology Intern; Lindsey Bloor is a Clinical Health Psychologist and the Health Behavior Coordinator; Jamie Michaels is a Registered Dietician and Certified Diabetes Educator; all at the VA Ann Arbor Healthcare System in Michigan. Lindsey Bloor is a Clinical Assistant Professor in Psychiatry at the University of Michigan Medical School in Ann Arbor.
Correspondence: Naomi Kane (naomikanephd@gmail.com)

Author disclosures
The authors report no actual or potential conflicts of interest with regard to this article.

Disclaimer
The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies.

Issue
Federal Practitioner - 38(4)s
Publications
Topics
Page Number
e22-e28
Sections
Author and Disclosure Information

Naomi Kane is a Clinical Psychology Postdoctoral Fellow in behavioral medicine and postdeployment health at the New Jersey VA War Related Illness and Injury Study Center in East Orange. Naomi Kane was previously a Psychology Intern; Lindsey Bloor is a Clinical Health Psychologist and the Health Behavior Coordinator; Jamie Michaels is a Registered Dietician and Certified Diabetes Educator; all at the VA Ann Arbor Healthcare System in Michigan. Lindsey Bloor is a Clinical Assistant Professor in Psychiatry at the University of Michigan Medical School in Ann Arbor.
Correspondence: Naomi Kane (naomikanephd@gmail.com)

Author disclosures
The authors report no actual or potential conflicts of interest with regard to this article.

Disclaimer
The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies.

Author and Disclosure Information

Naomi Kane is a Clinical Psychology Postdoctoral Fellow in behavioral medicine and postdeployment health at the New Jersey VA War Related Illness and Injury Study Center in East Orange. Naomi Kane was previously a Psychology Intern; Lindsey Bloor is a Clinical Health Psychologist and the Health Behavior Coordinator; Jamie Michaels is a Registered Dietician and Certified Diabetes Educator; all at the VA Ann Arbor Healthcare System in Michigan. Lindsey Bloor is a Clinical Assistant Professor in Psychiatry at the University of Michigan Medical School in Ann Arbor.
Correspondence: Naomi Kane (naomikanephd@gmail.com)

Author disclosures
The authors report no actual or potential conflicts of interest with regard to this article.

Disclaimer
The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies.

Article PDF
Article PDF
Related Articles

Veterans have a higher prevalence of type 2 diabetes mellitus (T2DM) when compared with their civilian counterparts with an overall prevalence rate of 25%.1 This higher prevalence is similar to other major chronic health conditions, including heart disease and arthritis, with additional costs for disease self-management.2 Psychological and behavioral change strategies are a principal means of limiting the severity and even restoring function once T2DM is diagnosed.3 More broadly, there is mounting evidence that addressing distress and behavior change are important across many conditions, particularly T2DM.4 Therefore, the US Department of Veterans Affairs (VA) has established patient education and multidisciplinary interventions to optimize engagement in T2DM self-management and health behavior change.5

Traditional T2DM education programs aim to meet the American Diabetes Association (ADA) standards of medical care and include a T2DM educator and other allied health professionals. ADA Standard 1.2 emphasizes “productive interactions between a prepared, proactive care team and an informed, activated patient.”6 Thus, to attain ADA accreditation, educational programs require instructors to teach about T2DM while engaging patients to help them set and achieve recommended changes. The requirements emphasize setting specific goals, (ie, eating wisely, being physically active, monitoring blood sugars or taking medications). The care team also helps to identify barriers, and at a required follow-up class, patients evaluate how well they met goals and make modifications if needed. The impact of traditional patient education programs to improve glycemic levels is well established.7 Importantly, veterans with comorbid mental health conditions may not experience the same beneficial outcomes if or when they participate in traditional diabetes or self-management programs.8,9 Veterans with T2DM may be particularly vulnerable to chronic stress and effects of comorbid mental health diagnoses.10 Furthermore, when individuals experience T2DM-related distress, associations with poor health outcomes, including elevated hemoglobin A1c (HbA1c), are observed independent of depression.11

Health psychology services integrate into medical settings and strive to reach veterans who may not engage in traditional mental health clinical offerings.12 These collaborative interventions focus less on diagnostic or screening procedures and more on a patient’s understanding of illness and ability and willingness to carry out treatment regimens. Given the significant roles of distress and co-occurring conditions, health psychology services further aim to provide psychoeducation about stress management in order to explore and enhance motivation for making a wide range of health behavior changes.

The purpose of this study was to evaluate baseline and follow-up HbA1c, weight, and psychosocial measures, namely, health-related self-efficacy and T2DM-related distress among a small sample that engaged in integrated health psychology services. The focus of this evidence-based psychotherapy service was to improve T2DM self-care and physical health. The participants were offered cognitive and behavioral strategies for setting and meeting personalized T2DM self-management goals. Importantly, motivational interviewing was used throughout to adapt to the participants’ preferences and needs as well as to maintain engagement.

Methods

Primary care providers referred veterans with T2DM to the Health Psychology service at VA Ann Arbor Healthcare System (VAAAHS). A T2DM diagnosis was verified through electronic health record review. Most common referrals included addressing coping with chronic illness and improving glycemic levels. Veterans were invited to participate in a program evaluation project to monitor health-related changes. All participants provided written informed consent and did not receive incentive or payment for participating. The VAAAHS Institutional Review Board reviewed and approved this study.

Intervention

Veterans met individually with a health psychologist or health psychology trainee to create personalized health and behavioral goals for improving T2DM self-management, overall health, and psychological well-being. This intervention included motivational interviewing, SMART (specific, measurable, action-oriented, realistic, timely) goal setting, behavioral activation, acceptance of T2DM-related physical changes, problem-solving therapy, challenging maladaptive disease-related cognitions, and incorporating values to help find motivation for change. Interventionists took a flexible approach and met in-person in primary or specialty care clinics, over the phone, and through telehealth 1 to 4 times a month, meeting participant preferences, with sessions typically ranging from 45 to 60 minutes. The goal of the study was to disseminate and implement evidence-based behavioral change strategies into a multidisciplinary team format without excluding veterans who would benefit from receiving active treatment. Due to this translational approach, a control group was not included for comparison.

Data Collection

Participants completed study measures at the beginning and end of the T2DM-focused intervention sessions. Demographic variables collected included age, sex, race/ethnicity, highest educational attainment, and whether a veteran was prescribed insulin, service connected for T2DM, concurrent enrollment in other educational programs, and time since T2DM diagnosis. Measures were selected based on their relevance to T2DM psychosocial care and diabetes health outcomes.13

 

 

Body mass index, low-density lipoprotein cholesterol, blood pressure (BP), HbA1c within 3 months of the pre- and postmeasures were collected by reviewing medical records. T2DM complications were collected by self-report, and comorbid physical and mental health conditions were collected by review of the most recent primary care note. The Diabetes Empowerment Scale-Short Form (DES-SF) is a well-validated measure that was used to measure T2DM-related psychosocial self-efficacy.14 Scores ranged from 8 to 40 with higher scores indicating higher diabetes T2DM empowerment. The Patient Health Questionnaire 9-item (PHQ-9) was used to assess the frequency of somatic (fatigue, appetite, psychomotor) and cognitive symptoms (anhedonia, low mood) of depression over the past 2 weeks.15 The Generalized Anxiety Disorder 7-item (GAD-7) was used to assess the frequency of common anxiety symptoms, including feelings of worry, difficulty controlling worry, and trouble relaxing.16 Veterans were also asked to rate their general health on a 5-point Likert scale. Self-rated health is a well-established indicator of disability and risk of future T2DM complications in older adults.17,18 The Diabetes Distress Scale (DDS) was used to measure emotional burden, physician-related distress, regimen-related distress, and T2DM-related interpersonal distress.19 Scores > 2.0 suggest clinical significant diabetes distress.20 Medication questionnaires were adapted from Wilson and colleagues, 2013.21

Statistical Analyses

Descriptive statistics, including mean and standard deviation (SD) or frequency distributions, as appropriate, were used to characterize the sample. For pre- and postintervention within-group comparisons, a paired samples Student t test analysis was used to evaluate baseline and follow-up measures for statistically significant differences between continuous variables; scores also were evaluated for clinically meaningful change.

Results

This sample (N = 13) of older adults was predominately male, white, with HbA1c > 7.0, and prescribed insulin (Table). On average, participants were at higher risk for future complications due to high BP, hyperlipidemia, and BMI > 30.0. Regarding participation, veterans were seen for an average of 7.8 sessions (range, 4-13) with 46% service connected for T2DM. Of note, 4 veterans received other T2DM-specific self-management support within the same year of their participation with health psychology, such as attending a T2DM education class or T2DM shared medical appointment.22 Reliability in the current sample for the DES-SF was high (Cronbach α = 0.90), PHQ-9 was good (Cronbach α = 0.81), and GAD-7 was very good (Cronbach α = 0.86).

Descriptive Statistics for the Demographic and Health Characteristics at Baseline table

Among the 13 older adults, the most common T2DM-related complications included peripheral neuropathy (n = 7), heart pain or heart attack (n = 5), and retinopathy (n = 4). Recent primary care notes showed a mean (SD) 7 (2.2) comorbid chronic medical conditions with a high prevalence of cardiometabolic illnesses including hypertension, hyperlipidemia, obstructive sleep apnea, and a diagnosis of chronic pain. Eleven veterans were diagnosed with a mental health condition, including bipolar disorder, depression, anxiety, trauma-related disorder, and sleep disorders. Veterans reported high T2DM emotional distress (mean [SD] 3.1 [1.2]), moderate regimen-related distress (mean [SD] 2.9 [1.1]), and moderate total T2DM distress (mean [SD] 2.4 [0.7]). Physician distress (mean [SD] 1.3 [0.55]) and interpersonal T2DM distress (mean [SD] 1.6 [0.9]) subscales indicated little to no distress. The sample reported mild symptoms of depression (PHQ-9 mean [SD] 8.8 [4.6]); mild symptoms of anxiety (GAD-7 mean, 7.1; SD, 4.4), and Diabetes Empowerment (mean, 31.2; SD, 6.0). Participants described missing an average of 2.4 days within the past 30 days of their T2DM oral medications.

Twelve veterans (92.7%) completed the Follow-up questionnaires. The Figure illustrates statistically significant changes in patient-reported outcomes between baseline and follow-up. Clinically meaningful reductions were shown in total T2DM distress (t11 = 5.03, P < .01), T2DM emotional burden (t11 = 4.83, P = .01), and T2DM regimen-related distress (t11 = 5.14, P < .01). There was a significant increase in T2DM self-efficacy (t11 = 0.32, P = .008) as well. A statistically significant reduction was seen in depressive symptoms (t11 = 2.22, P = .048). While HbA1c fell by .56 percentage points (standard error of the mean [SEM], 31; P = .10), this change was not statistically significant. Follow-up analyses also showed a clinically, though not statistically, significant reduction in weight loss by 6.9 lb. (SEM, 3.8; P = .20), and reductions of generalized anxiety by 1.2 points (SEM, 1.4; P = .42). Pre- and postanalyses did not show differences among self-rated health, physician-related burden, interpersonal-related burden, and indicators of medication taking behavior.

Discussion

This observational study evaluated change among patient-reported T2DM-specific and general distress measures and health outcomes among a small sample of veterans at VAAAHS medical center that engaged in an episode of individual care with health psychology. Statistically significant decreases were observed in T2DM-related distress. Noteworthy, these decreases were observed for the emotional burden and regimen subscales, and each of these was clinically meaningful, falling below a score of 2.0 on the T2DM-specific scale. This is important given that T2DM distress may interfere with the ability to understand and find motivation for engaging in health behavior change. Incorporating stress management interventions into interdisciplinary health programs has been demonstrated to improve not only levels of distress, but also other health outcomes, such as health related quality of life and cardiac events in heart disease.23 Thus, behavioral health interventions that incorporate cognitive-behavioral strategies to enhance distress-specific coping may prove important to include among individuals with T2DM.

Reductions in T2DM-related distress also converged with increases observed in the T2DM empowerment scale. These significant improvements in perceived ability suggest increased self-efficacy and willingness to follow a daily T2DM regimen. This finding aligns with the social support literature that demonstrates how instrumental and other aspects of autonomous social support mediate improvements in health-related outcomes and reduced T2DM distress.24,25 Health psychology interventions strive to both provide social support as well as enhance participants’ perceptions and use of existing support as a cognitive-behavioral strategy. Adding in assessments of social support could shed light on such mediating factors.

 

 

The ADA standards of care encourage heath care providers to engage patients in conversations in order to better understand the barriers of T2DM self-care.13 How to best support patients within a primary care multidisciplinary team remains unclear.26 T2DM distress and negative reactions to T2DM, including symptoms of anxiety and depression, are common and may require specific referral to a mental health provider if repeated attempts at T2DM education do not improve self-management and illness biomarkers.27 Thus, integrating these providers and services within the medical setting aims to reach more veterans and potentially meet these standards of care. With our health psychology integrated services, clinically significant decreases in anxiety and statistically significant decreases in depressive symptoms were observed that approached “mild to no” symptoms. Although this was not measured formally, the veterans were not engaging in mental health specialty care historically or during the year of the health psychology intervention. This suggests that health psychology services helped bridge the gap and address these psychosocial needs within the small sample.

For clinical measures, modest decreases were observed for HbA1c and weight. The authors recognize that these changes may not be optimal in terms of health status. A review of the specific patient-centered goals may illuminate this finding. For example, 1 participant had a goal to consume fewer sugary beverages and achieved this behavior change. Yet this change alone may not equate to actual weight loss or a lower HbA1c. Furthermore, in the context of T2DM-related distress, maintaining current weight and/or blood sugar levels may be a more realistic goal. An evaluation of the specific patient-oriented action goals and observed progress may be important outcomes to include in larger studies. Moreover, while not significant, the average HbA1c decrease of about 1% is comparable with traditional T2DM education and should be considered in light of the sample’s significant mental health comorbidities. While landmark intensive glucose control trials illustrate significant benefits in reductions of hyperglycemia and nonfatal cardiovascular disease, these reductions are associated with an approximate 2-fold risk of hypoglycemia.28-30 Thus, the focus on improved glycemic control has been criticized as lacking meaning to patients in contrast to preventing T2DM complications and persevering quality of life.31

Limitations and Future Directions

Noted limitations include small sample size, the range of time, and a broad number of sessions given that the intervention was tailored to each veteran. Conclusions drawn from a small sample may be influenced by individual outliers. Given co-occurring conditions and moderate levels of distress, all participants may benefit from additional support resources.

In addition to these considerations, having a comparison group could further strengthen the study as part of an observational database. A between-group comparison could help clinicians better understand what the interventions offer as well as some individual factors that relate to participation and success with behavior change. In the future, studies with a priori hypotheses could also consider the trajectories of weight and blood sugar levels for extended periods; for example, 6 months before the intervention and 6 months following.32 Given the complexity of comorbid mental health and chronic medical conditions in this sample, it also may be important to measure the relationships between chronic physical symptoms as an additional barrier for veterans to make health behavior changes.

Conclusions

The authors believe that the health psychology interventions offered important support and motivation for engagement in health behavior change that led to reduced distress in this patient group. It remains a challenge to engage veterans with psychiatric conditions in mental health care, and simultaneously for health care systems that strive to reduce costs and complications associated with chronic illness management.33 Aligned with these broader health care goals, the ADA aims to reduce complications and cost and improve outcomes for T2DM with guidelines requiring mental and behavioral health interventions. The authors believe that health psychology interventions are a personalized and feasible bridge to address engagement, illness-related distress while improving patient-satisfaction and T2DM self-management.

Acknowledgments

The authors thank the veterans who participated in the observational study. We thank the VA Ann Arbor Healthcare System Institutional Review Board. For instrumental support for health psychology integrated services, we acknowledge Adam Tremblay, MD, Primary Care Chief, and R.J. Schildhouse, MD, Acting Associate Chief of Staff, Ambulatory Care. The work was supported by the Ambulatory Care Service at the VA Ann Arbor Healthcare System and the VA Office of Academic Affiliations.

Veterans have a higher prevalence of type 2 diabetes mellitus (T2DM) when compared with their civilian counterparts with an overall prevalence rate of 25%.1 This higher prevalence is similar to other major chronic health conditions, including heart disease and arthritis, with additional costs for disease self-management.2 Psychological and behavioral change strategies are a principal means of limiting the severity and even restoring function once T2DM is diagnosed.3 More broadly, there is mounting evidence that addressing distress and behavior change are important across many conditions, particularly T2DM.4 Therefore, the US Department of Veterans Affairs (VA) has established patient education and multidisciplinary interventions to optimize engagement in T2DM self-management and health behavior change.5

Traditional T2DM education programs aim to meet the American Diabetes Association (ADA) standards of medical care and include a T2DM educator and other allied health professionals. ADA Standard 1.2 emphasizes “productive interactions between a prepared, proactive care team and an informed, activated patient.”6 Thus, to attain ADA accreditation, educational programs require instructors to teach about T2DM while engaging patients to help them set and achieve recommended changes. The requirements emphasize setting specific goals, (ie, eating wisely, being physically active, monitoring blood sugars or taking medications). The care team also helps to identify barriers, and at a required follow-up class, patients evaluate how well they met goals and make modifications if needed. The impact of traditional patient education programs to improve glycemic levels is well established.7 Importantly, veterans with comorbid mental health conditions may not experience the same beneficial outcomes if or when they participate in traditional diabetes or self-management programs.8,9 Veterans with T2DM may be particularly vulnerable to chronic stress and effects of comorbid mental health diagnoses.10 Furthermore, when individuals experience T2DM-related distress, associations with poor health outcomes, including elevated hemoglobin A1c (HbA1c), are observed independent of depression.11

Health psychology services integrate into medical settings and strive to reach veterans who may not engage in traditional mental health clinical offerings.12 These collaborative interventions focus less on diagnostic or screening procedures and more on a patient’s understanding of illness and ability and willingness to carry out treatment regimens. Given the significant roles of distress and co-occurring conditions, health psychology services further aim to provide psychoeducation about stress management in order to explore and enhance motivation for making a wide range of health behavior changes.

The purpose of this study was to evaluate baseline and follow-up HbA1c, weight, and psychosocial measures, namely, health-related self-efficacy and T2DM-related distress among a small sample that engaged in integrated health psychology services. The focus of this evidence-based psychotherapy service was to improve T2DM self-care and physical health. The participants were offered cognitive and behavioral strategies for setting and meeting personalized T2DM self-management goals. Importantly, motivational interviewing was used throughout to adapt to the participants’ preferences and needs as well as to maintain engagement.

Methods

Primary care providers referred veterans with T2DM to the Health Psychology service at VA Ann Arbor Healthcare System (VAAAHS). A T2DM diagnosis was verified through electronic health record review. Most common referrals included addressing coping with chronic illness and improving glycemic levels. Veterans were invited to participate in a program evaluation project to monitor health-related changes. All participants provided written informed consent and did not receive incentive or payment for participating. The VAAAHS Institutional Review Board reviewed and approved this study.

Intervention

Veterans met individually with a health psychologist or health psychology trainee to create personalized health and behavioral goals for improving T2DM self-management, overall health, and psychological well-being. This intervention included motivational interviewing, SMART (specific, measurable, action-oriented, realistic, timely) goal setting, behavioral activation, acceptance of T2DM-related physical changes, problem-solving therapy, challenging maladaptive disease-related cognitions, and incorporating values to help find motivation for change. Interventionists took a flexible approach and met in-person in primary or specialty care clinics, over the phone, and through telehealth 1 to 4 times a month, meeting participant preferences, with sessions typically ranging from 45 to 60 minutes. The goal of the study was to disseminate and implement evidence-based behavioral change strategies into a multidisciplinary team format without excluding veterans who would benefit from receiving active treatment. Due to this translational approach, a control group was not included for comparison.

Data Collection

Participants completed study measures at the beginning and end of the T2DM-focused intervention sessions. Demographic variables collected included age, sex, race/ethnicity, highest educational attainment, and whether a veteran was prescribed insulin, service connected for T2DM, concurrent enrollment in other educational programs, and time since T2DM diagnosis. Measures were selected based on their relevance to T2DM psychosocial care and diabetes health outcomes.13

 

 

Body mass index, low-density lipoprotein cholesterol, blood pressure (BP), HbA1c within 3 months of the pre- and postmeasures were collected by reviewing medical records. T2DM complications were collected by self-report, and comorbid physical and mental health conditions were collected by review of the most recent primary care note. The Diabetes Empowerment Scale-Short Form (DES-SF) is a well-validated measure that was used to measure T2DM-related psychosocial self-efficacy.14 Scores ranged from 8 to 40 with higher scores indicating higher diabetes T2DM empowerment. The Patient Health Questionnaire 9-item (PHQ-9) was used to assess the frequency of somatic (fatigue, appetite, psychomotor) and cognitive symptoms (anhedonia, low mood) of depression over the past 2 weeks.15 The Generalized Anxiety Disorder 7-item (GAD-7) was used to assess the frequency of common anxiety symptoms, including feelings of worry, difficulty controlling worry, and trouble relaxing.16 Veterans were also asked to rate their general health on a 5-point Likert scale. Self-rated health is a well-established indicator of disability and risk of future T2DM complications in older adults.17,18 The Diabetes Distress Scale (DDS) was used to measure emotional burden, physician-related distress, regimen-related distress, and T2DM-related interpersonal distress.19 Scores > 2.0 suggest clinical significant diabetes distress.20 Medication questionnaires were adapted from Wilson and colleagues, 2013.21

Statistical Analyses

Descriptive statistics, including mean and standard deviation (SD) or frequency distributions, as appropriate, were used to characterize the sample. For pre- and postintervention within-group comparisons, a paired samples Student t test analysis was used to evaluate baseline and follow-up measures for statistically significant differences between continuous variables; scores also were evaluated for clinically meaningful change.

Results

This sample (N = 13) of older adults was predominately male, white, with HbA1c > 7.0, and prescribed insulin (Table). On average, participants were at higher risk for future complications due to high BP, hyperlipidemia, and BMI > 30.0. Regarding participation, veterans were seen for an average of 7.8 sessions (range, 4-13) with 46% service connected for T2DM. Of note, 4 veterans received other T2DM-specific self-management support within the same year of their participation with health psychology, such as attending a T2DM education class or T2DM shared medical appointment.22 Reliability in the current sample for the DES-SF was high (Cronbach α = 0.90), PHQ-9 was good (Cronbach α = 0.81), and GAD-7 was very good (Cronbach α = 0.86).

Descriptive Statistics for the Demographic and Health Characteristics at Baseline table

Among the 13 older adults, the most common T2DM-related complications included peripheral neuropathy (n = 7), heart pain or heart attack (n = 5), and retinopathy (n = 4). Recent primary care notes showed a mean (SD) 7 (2.2) comorbid chronic medical conditions with a high prevalence of cardiometabolic illnesses including hypertension, hyperlipidemia, obstructive sleep apnea, and a diagnosis of chronic pain. Eleven veterans were diagnosed with a mental health condition, including bipolar disorder, depression, anxiety, trauma-related disorder, and sleep disorders. Veterans reported high T2DM emotional distress (mean [SD] 3.1 [1.2]), moderate regimen-related distress (mean [SD] 2.9 [1.1]), and moderate total T2DM distress (mean [SD] 2.4 [0.7]). Physician distress (mean [SD] 1.3 [0.55]) and interpersonal T2DM distress (mean [SD] 1.6 [0.9]) subscales indicated little to no distress. The sample reported mild symptoms of depression (PHQ-9 mean [SD] 8.8 [4.6]); mild symptoms of anxiety (GAD-7 mean, 7.1; SD, 4.4), and Diabetes Empowerment (mean, 31.2; SD, 6.0). Participants described missing an average of 2.4 days within the past 30 days of their T2DM oral medications.

Twelve veterans (92.7%) completed the Follow-up questionnaires. The Figure illustrates statistically significant changes in patient-reported outcomes between baseline and follow-up. Clinically meaningful reductions were shown in total T2DM distress (t11 = 5.03, P < .01), T2DM emotional burden (t11 = 4.83, P = .01), and T2DM regimen-related distress (t11 = 5.14, P < .01). There was a significant increase in T2DM self-efficacy (t11 = 0.32, P = .008) as well. A statistically significant reduction was seen in depressive symptoms (t11 = 2.22, P = .048). While HbA1c fell by .56 percentage points (standard error of the mean [SEM], 31; P = .10), this change was not statistically significant. Follow-up analyses also showed a clinically, though not statistically, significant reduction in weight loss by 6.9 lb. (SEM, 3.8; P = .20), and reductions of generalized anxiety by 1.2 points (SEM, 1.4; P = .42). Pre- and postanalyses did not show differences among self-rated health, physician-related burden, interpersonal-related burden, and indicators of medication taking behavior.

Discussion

This observational study evaluated change among patient-reported T2DM-specific and general distress measures and health outcomes among a small sample of veterans at VAAAHS medical center that engaged in an episode of individual care with health psychology. Statistically significant decreases were observed in T2DM-related distress. Noteworthy, these decreases were observed for the emotional burden and regimen subscales, and each of these was clinically meaningful, falling below a score of 2.0 on the T2DM-specific scale. This is important given that T2DM distress may interfere with the ability to understand and find motivation for engaging in health behavior change. Incorporating stress management interventions into interdisciplinary health programs has been demonstrated to improve not only levels of distress, but also other health outcomes, such as health related quality of life and cardiac events in heart disease.23 Thus, behavioral health interventions that incorporate cognitive-behavioral strategies to enhance distress-specific coping may prove important to include among individuals with T2DM.

Reductions in T2DM-related distress also converged with increases on the T2DM empowerment scale. These significant improvements in perceived ability suggest increased self-efficacy and willingness to follow a daily T2DM regimen. This finding aligns with the social support literature demonstrating that instrumental support and other aspects of autonomy support mediate improvements in health-related outcomes and reductions in T2DM distress.24,25 Health psychology interventions strive both to provide social support and to enhance participants' perceptions and use of existing support as a cognitive-behavioral strategy. Adding assessments of social support could shed light on such mediating factors.

The ADA standards of care encourage health care providers to engage patients in conversations to better understand barriers to T2DM self-care.13 How best to support patients within a primary care multidisciplinary team remains unclear.26 T2DM distress and negative reactions to T2DM, including symptoms of anxiety and depression, are common and may require specific referral to a mental health provider if repeated attempts at T2DM education do not improve self-management and illness biomarkers.27 Integrating these providers and services within the medical setting therefore aims to reach more veterans and potentially meet these standards of care. With our integrated health psychology services, clinically significant decreases in anxiety and statistically significant decreases in depressive symptoms were observed, with scores approaching the "mild to no" symptom range. Although not measured formally, these veterans had not engaged in specialty mental health care historically or during the year of the health psychology intervention. This suggests that health psychology services helped bridge the gap and address these psychosocial needs within this small sample.

For clinical measures, modest decreases were observed for HbA1c and weight. The authors recognize that these changes may not be optimal in terms of health status. A review of the specific patient-centered goals may illuminate this finding. For example, 1 participant had a goal to consume fewer sugary beverages and achieved this behavior change; yet this change alone may not translate into weight loss or a lower HbA1c. Furthermore, in the context of T2DM-related distress, maintaining current weight and/or blood glucose levels may be a more realistic goal. An evaluation of specific patient-oriented action goals and observed progress may be an important outcome to include in larger studies. Moreover, while not statistically significant, the average HbA1c decrease of about 0.6 percentage points is comparable with that reported for traditional T2DM education and should be considered in light of the sample's significant mental health comorbidities. While landmark intensive glucose control trials demonstrated significant reductions in hyperglycemia and nonfatal cardiovascular disease, these reductions were associated with an approximately 2-fold risk of hypoglycemia.28-30 Thus, the focus on improved glycemic control has been criticized as lacking meaning to patients in contrast with preventing T2DM complications and preserving quality of life.31

Limitations and Future Directions

Noted limitations include the small sample size, the variable length of treatment, and the broad range in number of sessions, given that the intervention was tailored to each veteran. Conclusions drawn from a small sample may be influenced by individual outliers. Given the co-occurring conditions and moderate levels of distress, all participants may benefit from additional support resources.

In addition to these considerations, having a comparison group could further strengthen the study as part of an observational database. A between-group comparison could help clinicians better understand what the interventions offer as well as the individual factors that relate to participation and success with behavior change. Future studies with a priori hypotheses also could consider the trajectories of weight and blood glucose levels over extended periods; for example, 6 months before the intervention and 6 months following.32 Given the complexity of comorbid mental health and chronic medical conditions in this sample, it also may be important to examine chronic physical symptoms as an additional barrier to veterans making health behavior changes.

Conclusions

The authors believe that the health psychology interventions offered important support and motivation for engagement in health behavior change that led to reduced distress in this patient group. Engaging veterans with psychiatric conditions in mental health care remains a challenge, as does reducing the costs and complications associated with chronic illness management for health care systems.33 Aligned with these broader health care goals, the ADA aims to reduce complications and costs and improve outcomes for T2DM through guidelines that call for mental and behavioral health interventions. The authors believe that health psychology interventions are a personalized and feasible bridge that addresses engagement and illness-related distress while improving patient satisfaction and T2DM self-management.

Acknowledgments

The authors thank the veterans who participated in the observational study. We thank the VA Ann Arbor Healthcare System Institutional Review Board. For instrumental support for health psychology integrated services, we acknowledge Adam Tremblay, MD, Primary Care Chief, and R.J. Schildhouse, MD, Acting Associate Chief of Staff, Ambulatory Care. The work was supported by the Ambulatory Care Service at the VA Ann Arbor Healthcare System and the VA Office of Academic Affiliations.

References

1. Liu Y, Sayam S, Shao X, et al. Prevalence of and trends in diabetes among veterans, United States, 2005-2014. Prev Chronic Dis. 2017;14(12):E135, 1-5. doi:10.5888/pcd14.170230

2. Yu W, Ravelo A, Wagner TH, et al. Prevalence and costs of chronic conditions in the VA health care system. Med Care Res Rev. 2003;60(3)(suppl):146S-167S. doi:10.1177/1077558703257000

3. American Psychological Association. Psychology and Health in Action. Updated 2016. Accessed February 10, 2021. https://www.apa.org/health/fall-2016-updates.pdf

4. The US Burden of Disease Collaborators. The state of US health, 1990-2016. JAMA. 2018;319(14):1444-1472. doi:10.1001/jama.2018.0158

5. Piette JD, Kerr E, Richardson C, Heisler M. Veterans Affairs research on health information technologies for diabetes self-management support. J Diabetes Sci Technol. 2008;2(1):15-23. doi:10.1177/193229680800200104

6. American Diabetes Association. 1. Improving care and promoting health in populations: Standards of Medical Care in Diabetes—2019. Diabetes Care. 2019;42(suppl 1):S7-S12. doi:10.2337/dc19-S001

7. Norris SL, Lau J, Smith SJ, Schmid CH, Engelgau MM. Self-management education for adults with type 2 diabetes. A meta-analysis of the effect on glycemic control. Diabetes Care. 2002;25(7):1159-1171. doi:10.2337/diacare.25.7.1159

8. Janney CA, Owen R, Bowersox NW, Ratz D, Kilbourne EA. Bipolar disorder influences weight loss in the nationally implemented MOVE! program for veterans. Bipolar Disord. 2015;17:87.

9. Piette JD, Kerr EA. The impact of comorbid chronic conditions on diabetes care. Diabetes Care. 2006;29(3):725-731. doi:10.2337/diacare.29.03.06.dc05-2078

10. Trief PM, Ouimette P, Wade M, Shanahan P, Weinstock RS. Post-traumatic stress disorder and diabetes: Co-morbidity and outcomes in a male veterans sample. J Behav Med. 2006;29(5):411-418. doi:10.1007/s10865-006-9067-2

11. Fisher L, Mullan JT, Arean P, Glasgow RE, Hessler D, Masharani U. Diabetes distress but not clinical depression or depressive symptoms is associated with glycemic control in both cross-sectional and longitudinal analyses. Diabetes Care. 2010;33(1):23-28. doi:10.2337/dc09-1238

12. Bohnert KM, Pfeiffer PN, Szymanski BR, McCarthy JF. Continuation of care following an initial primary care visit with a mental health diagnosis: differences by receipt of VHA Primary Care-Mental Health Integration services. Gen Hosp Psychiatry. 2013;35(1):66-70. doi:10.1016/j.genhosppsych.2012.09.002

13. Young-Hyman D, De Groot M, Hill-Briggs F, Gonzalez JS, Hood K, Peyrot M. Psychosocial care for people with diabetes: a position statement of the American Diabetes Association. Diabetes Care. 2016;39(12):2126-2140. doi:10.2337/dc16-2053

14. Anderson R, Fitzgerald J, Gruppen L, Funnell M, Oh M. The diabetes empowerment scale-short form (DES-SF). Diabetes Care. 2003;26(5):1641-1642. doi:10.2337/diacare.26.5.1641-a

15. Kroenke K, Spitzer RL, Williams JBW. The PHQ-9: validity of a brief depression severity measure. J Gen Intern Med. 2001;16(9):606-613. doi:10.1046/j.1525-1497.2001.016009606.x

16. Spitzer RL, Kroenke K, Williams JBW, Löwe B. A brief measure for assessing generalized anxiety disorder: the GAD-7. Arch Intern Med. 2006;166(10):1092-1097. doi:10.1001/archinte.166.10.1092

17. Pinquart M. Correlates of subjective health in older adults: a meta-analysis. Psychol Aging. 2001;16(3):414. doi:10.1037/0882-7974.16.3.414

18. Hayes AJ, Clarke PM, Glasziou PG, Simes RJ, Drury PL, Keech AC. Can self-rated health scores be used for risk prediction in patients with type 2 diabetes? Diabetes Care. 2008;31(4):795-797. doi:10.2337/dc07-1391

19. Polonsky WH, Fisher L, Earles J, et al. Assessing psychosocial distress in diabetes: development of the diabetes distress scale. Diabetes Care. 2005;28(3):626-631. doi:10.2337/diacare.28.3.626

20. Fisher L, Hessler DDM, Polonsky WH, Mullan J. When is diabetes distress meaningful?: Establishing cut points for the Diabetes Distress Scale. Diabetes Care. 2012;35(2):259-264. doi:10.2337/dc11-1572

21. Wilson IB, Fowler FJ Jr, Cosenza CA, et al. Cognitive and field testing of a new set of medication adherence self-report items for HIV care. AIDS Behav. 2013;18(12):2349-2358. doi:10.1007/s10461-013-0610-1

22. Heisler M, Burgess J, Cass J, et al. The Shared Health Appointments and Reciprocal Enhanced Support (SHARES) study: study protocol for a randomized trial. Trials. 2017;18(1):239. doi:10.1186/s13063-017-1959-7

23. Blumenthal JA, Babyak MA, Carney RM, et al. Exercise, depression, and mortality after myocardial infarction in the ENRICHD Trial. Med Sci Sports Exerc. 2004;36(5):746-755. doi:10.1249/01.MSS.0000125997.63493.13

24. Lee AA, Piette JD, Heisler M, Rosland AM. Diabetes distress and glycemic control: the buffering effect of autonomy support from important family members and friends. Diabetes Care. 2018;41(6):1157-1163. doi:10.2337/dc17-2396

25. Baek RN, Tanenbaum ML, Gonzalez JS. Diabetes burden and diabetes distress: the buffering effect of social support. Ann Behav Med. 2014;48(2):1-11. doi:10.1007/s12160-013-9585-4

26. Jortberg BT, Miller BF, Gabbay RA, Sparling K, Dickinson WP. Patient-centered medical home: how it affects psychosocial outcomes for diabetes. Curr Diab Rep. 2012;12(6):721-728. doi:10.1007/s11892-012-0316-1

27. American Diabetes Association. Lifestyle management: standards of medical care in diabetes-2019. Diabetes Care. 2019;41(suppl 1):S38-S50. doi:10.2337/dc19-S005

28. UK Prospective Diabetes Study Group. Effect of intensive blood-glucose control with metformin on complications in overweight patients with type 2 diabetes. Lancet. 1998;352(9131):854-865.

29. The Diabetes Control and Complications Trial Research Group. The effect of intensive treatment of diabetes on the development and progression of long-term complications in insulin-dependent diabetes mellitus. N Engl J Med. 1993;329(14):977-986. doi:10.1056/NEJM199309303291401

30. Kelly TN, Bazzano LA, Fonseca VA, Thethi TK, Reynolds K, He J. Systematic review: glucose control and cardiovascular disease in type 2 diabetes. Ann Intern Med. 2009;151(6):394-403. doi:10.1037/1072-5245.13.1.64

31. Yudkin JS, Lipska KJ, Montori VM. The idolatry of the surrogate. BMJ. 2012;344(7839):8-10. doi:10.1136/bmj.d7995

32. Lutes LD, Damschroder LJ, Masheb R, et al. Behavioral treatment for veterans with obesity: 24-month weight outcomes from the ASPIRE-VA Small Changes Randomized Trial. J Gen Intern Med. 2017;32(1):40-47. doi:10.1007/s11606-017-3987-0

33. Krejci LP, Carter K, Gaudet T. The vision and implementation of personalized, proactive, patient-driven health care for veterans. Med Care. 2014;52(12)(suppl 5):S5-S8. doi:10.1097/MLR.0000000000000226


Issue
Federal Practitioner - 38(4)s
Page Number
e22-e28