
RESEARCH ARTICLE Open Access

Partnering with healthcare facilities to understand psychosocial distress screening practices among cancer survivors: pilot study implications for study design, recruitment, and data collection

Diane Ng 1, M. Shayne Gallaway 2, Grace C. Huang 1, Theresa Famolaro 1, Jennifer Boehm 2, Karen Stachon 3 and Elizabeth A. Rohan 2*

Abstract

Background: We sought to understand barriers and facilitators to implementing distress screening (DS) of cancer patients to inform and promote uptake in cancer treatment facilities. We describe the recruitment and data collection challenges and recommendations for assessing DS in oncology treatment facilities.

Methods: We recruited CoC-accredited facilities and collected data from each facility’s electronic health record (EHR). Collected data included cancer diagnosis and demographics, details on DS, and other relevant patient health data. Data were collected by external study staff who were given access to the facility’s EHR system, or by facility staff working locally within their own EHR system. Analyses are based on a pilot study of 9 facilities.

Results: Challenges stemmed from being a multi-facility-based study and local institutional review board (IRB) approval, facility review and approval processes, and issues associated with EHR systems and the lack of DS data standards. Facilities that provided study staff remote access took longer for recruitment; facilities that performed their own extraction/abstraction took longer to complete data collection.

Conclusion: Examining DS practices and follow-up among cancer survivors necessitated recruiting and working directly with multiple healthcare systems and facilities. There were a number of lessons learned related to recruitment, enrollment, and data collection. Using the facilitators described in this manuscript offers increased potential for working successfully with various cancer centers and insight into partnering with facilities collecting non-standardized DS clinical data.

Keywords: Cancer, Electronic health records, Methods, Oncology, Psychological distress, Research design

© The Author(s). 2021 Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

* Correspondence: [email protected]
2 Centers for Disease Control and Prevention, Division of Cancer Prevention and Control, Atlanta, GA, USA
Full list of author information is available at the end of the article

Ng et al. BMC Health Services Research (2021) 21:238 https://doi.org/10.1186/s12913-021-06250-5


Background

Distress in a cancer patient is defined as an unpleasant psychological, social, emotional, and/or spiritual experience that interferes with the ability to effectively cope with a cancer diagnosis, symptoms, or subsequent treatment side-effects [1]. Since 2015, the American College of Surgeons’ Commission on Cancer (CoC) has required distress screening (DS) of cancer patients seen in their accredited facilities [2]. Cancer patients must be screened for distress a minimum of one time at a pivotal medical visit as determined by the program (e.g., diagnosis, beginning and ending treatments, recurrence or progression), and preference should be given to visits at times of greatest risk for distress [2]. Research has demonstrated the downstream beneficial impact on patients, families, cancer outcomes, and the medical system when patient distress is addressed [3, 4].

Successful implementation of routine DS in oncology settings requires thoughtful planning, the feasible and appropriate use of a scientifically valid tool [5, 6], and a host of practical variables within each practice setting (e.g., staffing availability, resource availability, monetary cost of implementation) [6–8]. Despite evidence of the effectiveness of psychosocial services in alleviating distress, many cancer patients who could benefit from these services do not receive them. Barriers exist at multiple levels (i.e., patient, provider, setting) [9, 10].

Prior multi-facility-based healthcare studies have shown that the process for facility recruitment can be lengthy and that many facilities may decline to participate for any number of reasons, such as a lack of staff time, management support, or institutional review board (IRB) issues [11]. While patient recruitment strategies have been well described [12], best practices in studying institutional policies and practices are far less well documented. Whicher et al. [13] and Anderson et al. [14] describe the importance of identifying and engaging with key leaders within organizations. Successful recruitment of a sufficient number and representative mix of healthcare facilities requires considerable preparation, planning, and flexibility. It is also important to consider that the processes may take longer and be more complex than anticipated from the outset [15]. Among interested healthcare facilities, barriers to participation may still exist, including a lack of financial resources or insufficient staff to carry out the requirements necessary to participate in the study [15].

We sought to understand barriers and facilitators to implementing DS to inform and promote uptake in all facilities treating cancer patients. We designed a mixed-method study that included a quantitative review of existing electronic health records (EHRs) and qualitative interviews with health care practitioners. Based on recent recommendations to facilitate the integration of DS and management research into practice [7, 16], here we describe the challenges and facilitators associated with the study design, recruitment, and quantitative data collection from multiple oncology treatment facilities.

Methods

Design

The Centers for Disease Control and Prevention (CDC) sponsored this study and contracted with Westat, a research company, to assist with the design and implementation. The study design included specific data items that were collected from EHRs. We solicited feedback on the study design and study materials from social workers who represented medical facilities that were part of a similar DS study [17]. Through a partnership with the CoC, we offered potential facilities credit towards CoC’s Standard 4.7 Study of Quality (2016 edition) [2] or for the number of patients accrued for Standard 9.1 Clinical Research Accrual Study (2020 edition) [18] as a direct benefit for their participation in the study.

Study recruitment began in October 2017, with data collection occurring subsequently over 2 years. This study was reviewed and approved by the CDC Human Subjects Review Board (protocol #7225.0), Westat’s Human Subjects Review Board (protocol #6282.07), and the U.S. Office of Management and Budget (OMB Control #0920–1270). All participating facilities also completed their own institutional review board (IRB) review and approval. The requirement for patient consent was waived in accordance with federal regulations per IRB approvals at each facility, and all methods were carried out in accordance with relevant guidelines and regulations.

Study sample and recruitment

We focused our study on lung and ovarian cancer patients because of the high potential for these cancers to be associated with distress; lung cancer is the deadliest cancer in the United States [19], and ovarian cancer frequently recurs within a short period of time and has moderate survival [20].

We used data from the American Hospital Association (AHA) Annual Survey 2015 Database [21] to identify facilities of interest. Initially, we had targeted states with high incidence rates of lung and/or ovarian cancer to ensure we had sufficient numbers of cancer survivors represented in the study, particularly for ovarian cancer, which has a lower incidence [19]. We included both CoC and non-CoC-accredited facilities, because we were interested in investigating whether there were differences in DS by accreditation status. Only facilities that reported offering cancer services and having fully implemented EHRs were eligible for inclusion. The initial sampling frame was then stratified into four facility types based on CoC-accreditation status (CoC and non-CoC-accredited) and geographic urbanicity (urban and rural) to achieve a mix of facilities for recruitment. Among the CoC-accredited facilities, we considered cancer program category type as assigned by the CoC (e.g., Comprehensive Community Cancer Program, NCI-Designated Network Cancer Program) during recruitment, with the intention of obtaining facilities across different categories.

We obtained contact information for Cancer Program Managers and Hospital Registrars for a sample of 100 facilities directly from the CoC. We chose to contact these facility staff at CoC’s suggestion, as the staff who would understand the current study and who are most often involved in directing the flow of information to the cancer committee and decision makers pertaining to DS. Invitation emails were sent to the 100 facilities in three batches on a rolling basis.

For non-accredited facilities, we reached out to Directors of Oncology Services, Directors of Quality Improvement, Directors of Social Services, and Directors of Patient Care/Nursing using contact information obtained from SK&A (https://www.skainfo.com/). SK&A, now OneKey™ by IQVIA, holds a national database of healthcare providers, which is updated on a continuous basis through government and non-government industry sources. We selected these facility staff based on the available options from the OneKey™ database who would likely understand and be interested in the current study. We sent information about the current study and an invitation to participate to the identified healthcare facility contacts via email. We offered potential study sites facility-specific feedback reports based on the results of the study as a benefit for participation, and offered CoC-accredited healthcare facilities the additional benefit of receiving credit towards one of two pre-approved CoC standards. Due to a lack of response from non-CoC-accredited facilities, we focused the study scope to CoC-accredited facilities with a minimum number of lung and ovarian cancer cases (< 130 combined) and expanded recruitment to all states. We also promoted the study through professional networks and organizations that would have an interest in DS.

Study enrollment

Facilities interested in participating in the study underwent a multi-part enrollment process. We held orientation calls with healthcare facility contacts from each facility that expressed interest. The facility staff we engaged with typically included oncology/cancer registry data managers, oncology social workers, and clinical nurses. We provided facilities with an introduction to the study, including information about data collection methods, data security, requirements for IRB submission at the facility, requirements for business associates/data use agreements, a proposed timeline for the study, and the benefits of study participation. During this orientation call, we also collected facility-level data on lung and ovarian cancer caseload, each facility’s DS protocol, and information about their EHR system. To aid facility staff in preparing their IRB applications for review at their respective facilities, we provided a template pre-populated with study-specific information.

We tracked communication and the status of facilities and followed up periodically, often by phone, to encourage their progression through the enrollment process. After a facility obtained IRB approval and received a Business Associates Agreement (BAA) or Data Use Agreement (DUA) as necessary, we initiated the process of data collection with that facility (see Fig. 1 for a flow diagram of the full recruitment and enrollment process).

During the first enrollment call, we offered two options for facilities to supply their EHR data: 1) study staff (external to the facilities) would remotely access the facility’s EHR system and directly abstract relevant data from patient records, or 2) facility staff would conduct the extraction and abstraction of relevant data from their EHR and submit the data to the study team through a secure file transfer protocol. For healthcare facilities using remote access (option 1) for data collection, the study team signed the healthcare facility BAA or DUA to protect the privacy and confidentiality of patient information viewed during the abstraction process.

Data collection

Data were requested in two phases. Phase 1: an extraction of standardized data from the facility’s cancer registry database, and Phase 2: a manual abstraction of non-standardized data fields (Fig. 2).

In Phase 1, we requested a case listing of the standardized patient cancer diagnosis and demographic data for all cancer cases that met the following criteria:

• Primary site (International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) [22]): Lung (C34) or Ovarian (C56)
• Diagnosed in 2016 or 2017
• Diagnosed or treated at the participating facility

Based on a power analysis, we aimed to collect data on a minimum of 2000 patients across all participating facilities. We requested 130 cases from each facility to obtain enough patient data to represent each facility’s DS practices. For facilities with more than 130 cases that met the selection criteria, we statistically sampled 130 cases by cancer type and race/ethnicity strata, using data from the cancer diagnosis and demographics case listing.
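The stratified sampling step described above can be sketched in code. This is an illustrative example only: the field names, the proportional-allocation rule, and the tie-breaking behavior are assumptions, since the paper does not specify the sampling algorithm beyond stratification by cancer type and race/ethnicity.

```python
import random
from collections import defaultdict

def sample_cases(cases, target=130, seed=42):
    """Sample `target` cases, stratified by cancer type and
    race/ethnicity (illustrative field names, not the study's schema)."""
    rng = random.Random(seed)

    # Small facilities: all cases are taken, no sampling needed.
    if len(cases) <= target:
        return list(cases)

    # Group cases into strata by (cancer type, race/ethnicity).
    strata = defaultdict(list)
    for case in cases:
        strata[(case["cancer_type"], case["race_ethnicity"])].append(case)

    sampled = []
    for _, members in sorted(strata.items()):
        # Allocate proportionally to stratum size, at least 1 per stratum.
        n = max(1, round(target * len(members) / len(cases)))
        sampled.extend(rng.sample(members, min(n, len(members))))
    # Rounding can overshoot slightly; truncate back to the target.
    return sampled[:target]

# Example: a hypothetical facility with 200 lung (C34) and 60 ovarian (C56) cases
cases = [{"cancer_type": "C34", "race_ethnicity": "NH White"}] * 140 \
      + [{"cancer_type": "C34", "race_ethnicity": "NH Black"}] * 60 \
      + [{"cancer_type": "C56", "race_ethnicity": "NH White"}] * 60
print(len(sample_cases(cases)))  # 130
```

With 260 eligible cases, the sketch draws 70, 30, and 30 cases from the three strata, preserving each stratum's share of the facility's caseload.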



Phase 2 of data collection involved manual abstraction of information about DS and other relevant patient health information, such as healthcare utilization, cigarette smoking, and mental health status. We collaborated with facilities to abstract the DS and these other patient-level data elements for the sample of cases from each facility. During this stage of data collection, we presented facilities with multiple options because of anticipated variation in resources and capacity across facilities. As noted, the two data collection options were for study staff to remotely access and abstract relevant data from patient records through the facility’s EHR system, or for facility staff to extract/abstract data locally and securely transfer a dataset for study staff to conduct a second round of abstraction to maintain consistency in methods across facilities. For facilities that provided remote access to their EHR system, we held at least one training call with facility staff to learn how best to navigate the EHR in order to locate each patient’s DS and the other patient-level data. For facilities that performed their own extraction/abstraction, we held at least one training call with facility staff to review the data elements of interest, the data dictionary, and the study specifications, and to provide guidance to the facility on how best to extract/abstract the data. After receiving these data, we abstracted and coded the data directly into the study database. For both data abstraction options (remote access and data transfer), we used a customized relational data abstraction tool to perform the abstraction and coding. This tool enabled the capture and compilation of a relational database that held data at the patient level, information on multiple DS events per patient, and the subsequent interventions/services received per DS encounter.
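The one-to-many relational structure described above (patient → DS events → interventions/services) might look like the following minimal SQLite sketch. The table and column names are hypothetical; the paper does not describe the actual tool's schema.

```python
import sqlite3

# One-to-many chain: patient -> DS events -> interventions/services,
# mirroring the relational capture described above (columns are illustrative).
schema = """
CREATE TABLE patient (
    patient_id   INTEGER PRIMARY KEY,
    primary_site TEXT,            -- e.g. ICD-10-CM C34 (lung) or C56 (ovarian)
    dx_year      INTEGER
);
CREATE TABLE ds_event (
    event_id   INTEGER PRIMARY KEY,
    patient_id INTEGER REFERENCES patient(patient_id),
    ds_tool    TEXT,              -- screening instrument used
    ds_score   TEXT,              -- kept as text: scales differ by tool
    event_date TEXT
);
CREATE TABLE intervention (
    intervention_id INTEGER PRIMARY KEY,
    event_id        INTEGER REFERENCES ds_event(event_id),
    service         TEXT          -- referral/service following the DS event
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(schema)
conn.execute("INSERT INTO patient VALUES (1, 'C34', 2016)")
conn.execute("INSERT INTO ds_event VALUES (1, 1, 'distress thermometer', '6', '2016-05-02')")
conn.execute("INSERT INTO intervention VALUES (1, 1, 'social work referral')")

# A patient can have multiple DS events, each with its own follow-up services.
rows = conn.execute("""
    SELECT p.patient_id, d.ds_score, i.service
    FROM patient p
    JOIN ds_event d ON d.patient_id = p.patient_id
    LEFT JOIN intervention i ON i.event_id = d.event_id
""").fetchall()
print(rows)  # [(1, '6', 'social work referral')]
```

The key design point is that DS events and follow-up services live in their own tables, so repeated screenings and multiple referrals per screening can be recorded without flattening them into a single patient row.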

Analyses

Analyses are based on a pilot study of 9 facilities; data were collected through March 2020 for the pilot study. Six of the 9 programs were recruited via our larger study invitation emails. Three additional facilities were recruited outside of the invitation emails, via professional networks. Data from these 9 facilities were included in the final sample. Additional facilities interested in participating in the study were excluded due to ineligibility (e.g., low case counts, no DS protocol or DS data from the period of interest, data not accessible due to transitions in EHR system).

Fig. 1 Recruitment and enrollment process



Results

Of the facilities enrolled into the pilot study, the majority were Comprehensive Community Cancer Programs (56%), urban (89%), and located in the Northeast region of the United States (33%) (Table 1).

After successful study recruitment, facilities took an average of 33 days to provide the data extract for Phase 1 of data collection and an average of 136 days to provide the data for Phase 2 (Table 2). Facilities that opted to provide the study staff remote access (EHR access option 1; n = 6) took longer for recruitment, at an average of 104 days, than facilities that performed their own extraction/abstraction (EHR access option 2; n = 3), at an average of 77 days, but completed data collection more quickly (127 days versus 253 days, respectively). In terms of documentation of DS in the EHR system, all participating facilities reported having specific and fully integrated fields for DS data. However, DS-related data (particularly follow-up data related to DS assessment and referrals following DS) were generally found in open-ended text fields (e.g., physician/staff notes), and

Fig. 2 Phases of quantitative data collection processes

Table 1 Characteristics of healthcare facilities recruited for the pilot study (n = 9)

Characteristics (n = 9)                          n    %
Commission on Cancer (CoC) Program Category
  Academic Comprehensive Cancer Program          1    11%
  Community Cancer Program                       1    11%
  Comprehensive Community Cancer Program         5    56%
  Integrated Network Cancer Program              2    22%
Urbanicity (a)
  Rural                                          1    11%
  Urban                                          8    89%
Geographic region
  Midwest                                        2    22%
  Northeast                                      3    33%
  South                                          2    22%
  West                                           2    22%

(a) Geographic regions were defined using U.S. Census Regions designations. Urbanicity is defined by Core Based Statistical Area (CBSA) type, where healthcare facilities with a CBSA type of “Rural” or “Micropolitan” were classified as Rural and healthcare facilities with a CBSA type of “Metropolitan” were classified as Urban.



four of the nine facilities reported these data in scanned notes or forms.

Discussion

The current study provided several lessons learned related to recruitment, enrollment, and data collection. Some particular challenges associated with this study involved the need to engage healthcare facilities with multiple review and approval processes, identifying the most appropriate partners within each facility, meeting all data collection restrictions, and maximizing flexibility to account for the non-standardization between facilities. A thorough description of the implications associated with study design, recruitment, enrollment, and data collection from our study may help others executing studies in similar settings to further expand and integrate similar study methods, designs, and outcomes in implementation science.

Recruitment and enrollment

Motivate facilities to participate

Facilities require resources to participate in studies and often encounter time, money, and staffing limitations [11]. We felt it was important to provide some direct benefits for facility staff efforts, and the use of non-monetary benefits helped us achieve the desired study sample. The current study offered direct benefits for participating, including credit towards one of two pre-approved CoC standards and an individualized summary report at the conclusion of the study. The CoC credit was an important factor in motivating CoC-accredited facilities to participate. However, non-CoC-accredited healthcare facilities did not respond to requests to participate in the study, and thus, ultimately, we decided to halt recruitment targeting this group of facilities. It is not entirely clear why there was a lack of response or capability to participate among non-CoC facilities; however, CoC-accredited hospitals are typically larger teaching hospitals with higher volume and additional services and specialists compared with non-CoC-accredited hospitals [23]. In addition, despite targeting rural facilities in the sampling frame, recruitment of rural facilities was less successful. This may be related to factors associated with the availability of resources within rural facilities [24].

Identify the right facility gatekeeper and project champion

Recruitment required sending a study invitation to a specific staff person, or gatekeeper, as the first point of entry into each facility. We obtained contact information from CoC cancer committee lists, which included cancer program managers or hospital registrars who were familiar with accreditation processes and internal evaluation requirements. Collaboration with the CoC was essential in obtaining the contact information for the appropriate staff, because contact information for hospital staff is often not widely and publicly available. Once we described the study interests and methodology at the outset, the gatekeeper was instrumental in directing us to the right project champion for the duration of the study. Characteristics of an ideal project champion included a sufficient understanding of DS procedures and/or oncology data at the facility, working closely with the target population (i.e., cancer patients), and recognizing the benefits of improved DS and follow-up care for patients. Often, these champions were oncology social workers, who provide assessments and follow-up services to patients identified as distressed and were most knowledgeable about sources of distress and the resources available to patients. Project champions who saw participation in this study as an opportunity to improve patient care were more invested in their site’s participation. When social workers were not available, gatekeepers helped to identify another staff person who had a similar

Table 2 Timing of recruitment and data collection by data collection type

                                                    Remote Access (n = 6)     Facility Extraction/       Overall (n = 9)
                                                                              Abstraction (n = 3)
Phase of Study                                      Mean  Median  Range       Mean  Median  Range        Mean  Median  Range
                                                    (days)                    (days)                     (days)
Recruitment: recruitment email to
  IRB/BAA received                                   104    81    51–247       77     77    73–81         95     77    51–247
Data collection: IRB/BAA received to
  quantitative data collection complete              127   146    42–188      253    197    183–378      169    170    42–378
  IRB/BAA received to Phase 1 (a)
  data collection complete                            28    17     6–89        43     38    23–67         33     23     6–89
  Phase 2 (b) and all quantitative
  data collection complete                           100   101    36–167      210    160    159–311      136    145    36–311
Both recruitment and data collection                 231   223   129–401      330    278    260–451      264    260   129–451

(a) Phase 1: extraction from cancer registry (cancer diagnosis and demographics data)
(b) Phase 2: medical record abstraction (DS and other relevant patient health data)
IRB Institutional Review Board; BAA Business Associate Agreement



investment or interest in DS. It was greatly beneficial to learn about the best person to contact at a facility prior to initiating contact. The champion from each facility served a key role in moving the facility forward in the study, and was a staff member who was engaged, invested, and committed to the project.

Allow time for enrollment

Each facility had multiple approval processes for study participation. Multiple levels of facility leadership needed to approve the study, either formally or informally, and submission of a local IRB application was required to access the patient-level data being requested. Similar to other multicenter studies, there was substantial variability in local IRB requirements, which added time to the process and delayed implementation of our study [15, 25, 26]. Progress also depended on staff availability to participate on calls and complete tasks for the various local approval processes. It was important to keep the contacts engaged and moving forward in the process through regular outreach, while also being cognizant of unpredictable facility staff schedules. This often required multiple attempts to make a connection, and it was essential to develop a tracking system and thorough documentation to stay organized and aware of the progress of each facility throughout the process.

Understand the variation in facility capabilities

All facilities were unique. While it may have been easier to limit participation (based on certain characteristics or criteria) for a more focused and streamlined design, variation on key variables of interest was critical to detecting differences and disparities across facilities [27]. Some facilities were more experienced with participating in research studies than others, and our study design made intentional accommodations for the diversity across facilities to make the process as smooth as possible. For instance, to facilitate the IRB submission process, a common barrier to research participation [11, 28], we found that providing an IRB information template for facility staff to adapt helped them complete their applications in a more timely fashion. Maximizing shared guidance, summary materials, and template documents lessened the burden on facility staff and helped streamline facility documentation requirements.

Data collection

Data collection challenges stemmed from the use of EHRs and the specific types of DS measures. Large variation in EHR products, differences in how these data are collected, and a lack of data interoperability often lead to challenges in harmonizing clinical data for the purpose of research [29, 30]. This is compounded in newer fields of study (e.g., in our study, DS processes and outcomes), because the data are not yet standardized. Furthermore, EHR data that are collected locally for healthcare provider use can be extremely heterogeneous across healthcare facilities [30].

Understand what, where, and how data are availableFacilities are unique and differ in how data are organizedand collected. In this study, some facilities used a com-bination of data collection methods (e.g., paper chartsand EHR systems), or even used multiple EHR systemswithin their own organization across different depart-ments. Data in EHRs are generally available in differentforms as structured, unstructured, or semi-structureddata [31]. In the current study, while some DS data (e.g.,DS score, reasons for distress) were available in struc-tured data fields, information about follow-up and ser-vices as a result of DS were rarely structured and wereoften only available in unstructured notes. For a studyinvolving multiple sites, it is beneficial to identify com-monalities across facilities to assist with standardizingdata collection as much as possible. In this study, wewere interested in cancer diagnosis and demographic in-formation that we knew CoC-accredited facility cancerregistries standardly collected [32]. We were able to usethis common framework to guide facilities in providingthese data. However, because DS data standards werenot available, it was difficult to anticipate how to requestthese data. While published literature [1, 6, 16, 17, 33]provided some guidance on data standards and availabledata and data sources, the variability across facilities ren-dered it insufficient, necessitating a longer than antici-pated protocol development phase. The lack ofstandardized or structured data on follow-up to DSmakes it difficult not only for researchers, but also forfacility staff and clinicians to track the needs identifiedby an assessment following a DS and services used tomanage distress. Though individual facilities wouldbenefit from structured follow-up data for internal use,there is a need for standardized EHR data collectionacross facilities to allow for a more global view of theimpact of DS on patient outcomes. 
Ideally, this would involve a collaborative effort between DS and technology experts to develop DS data standards that can be broadly implemented and integrated within EHR systems. This would be in line with the objectives set through the Health Information Technology for Economic and Clinical Health Act (HITECH), resulting in the “meaningful use” of EHRs to make improvements in care [34].
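The split described above, with DS scores in structured fields but follow-up buried in free-text notes, can be illustrated with a minimal abstraction sketch. This is not the study's actual abstraction tool; the field names (`ds_score`, `note_text`) and the follow-up keywords are assumptions for illustration only.

```python
import re

# Hypothetical sketch of one abstraction step: read the structured DS score
# directly, and flag free-text notes that may describe follow-up services.
# Keyword terms below are illustrative, not a validated lexicon.
FOLLOWUP_PATTERN = re.compile(
    r"referr(?:ed|al)|social work|counsel(?:ing|or)|psych", re.IGNORECASE
)

def abstract_encounter(record: dict) -> dict:
    """Pull the structured DS score and flag possible follow-up in notes."""
    score_raw = record.get("ds_score")
    score = int(score_raw) if score_raw not in (None, "") else None
    note = record.get("note_text", "")
    return {
        "ds_score": score,
        # Flagged notes still require manual review; keyword matching only
        # narrows the abstractor's workload, it does not replace them.
        "possible_followup": bool(FOLLOWUP_PATTERN.search(note)),
    }

example = {"ds_score": "7", "note_text": "Pt referred to social work for distress."}
print(abstract_encounter(example))
```

The asymmetry in the sketch mirrors the asymmetry in effort the study observed: the structured score is one field lookup, while the unstructured follow-up information can only be narrowed, never fully resolved, by automation.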

Account for differences in data collection and protocols across facilities
Due to the lack of data and protocol standards for DS, it was necessary to account for and reduce data element differences for analysis and interpretation. Facility DS measures differed in a number of aspects, including, but not limited to, the DS tool used; the pivotal medical visit where DS was administered; and the location, frequency, and documentation of DS encounters. Our study design considered the importance of capturing these differences. We were interested in understanding the current landscape of DS and so did not limit participation based on DS tools or protocols. Therefore, we designed the data dictionary and data abstraction tool to account for potential differences across healthcare facilities, within healthcare facilities, and even among individual patients (i.e., some patients received more than one DS).
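A data dictionary that tolerates these differences can be sketched as a one-to-many structure: each patient row owns any number of DS encounters, each recording its own tool, timing, and visit type. The class and field names below are illustrative assumptions, not the study's actual dictionary.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical data-dictionary sketch: facilities that screen with different
# tools, at different visits, and at different frequencies all map into the
# same patient/encounter structure.

@dataclass
class DSEncounter:
    tool: str                 # e.g., "NCCN Distress Thermometer" (assumed example)
    score: Optional[int]      # None when a screening was documented but not scored
    visit_type: str           # the pivotal medical visit where DS was administered
    date: str                 # ISO date of the screening encounter

@dataclass
class PatientRecord:
    patient_id: str
    diagnosis_code: str       # ICD-10-CM, standardized via the cancer registry
    encounters: list[DSEncounter] = field(default_factory=list)

p = PatientRecord("P001", "C50.911")
p.encounters.append(DSEncounter("Distress Thermometer", 6, "initial consult", "2019-03-01"))
p.encounters.append(DSEncounter("Distress Thermometer", 3, "follow-up", "2019-06-12"))
print(len(p.encounters))  # a patient may be screened more than once
```

Keeping the encounter list open-ended, rather than reserving a fixed number of DS columns per patient, is what lets a single abstraction tool absorb both single-screen and repeat-screen protocols.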

Determine the best way to collect data for the study
It was our experience that data collection involved a significant use of facility staff and/or study staff time, particularly for data abstraction, as has been the case in other studies [11, 17]. Facility staff involvement was necessary for data collection; however, the time and resources required to participate varied depending on EHR abstraction method (i.e., remote abstraction or abstraction by facility staff). When facilities could not allow or lacked the capability to grant remote access to their EHR, facility staff were integral to data collection because staff would often manually abstract data from unstructured fields of the EHR. Training was required to ensure valid and reliable data. However, a benefit to having facility staff abstract data was that they were the most familiar with how DS data were entered into the EHR and would know how to locate these data. Conversely, for facilities that could allow remote access to their EHR, significant study staff time was required for records review and abstraction. Furthermore, the facility staff had to train and support study staff who were accessing, navigating, and abstracting data directly from the facility EHR. It may be preferable and more efficient to limit facility participation to those that can provide the data in a specific format, and important to weigh time, resources, target population, and study outcomes against recruitment challenges.

Conclusion
Our effort to examine DS practices and follow-up among cancer survivors necessitated recruiting and working directly with multiple healthcare systems and facilities. Challenges stemmed from coordinating data collection from diverse facilities, local IRB and approval processes, and issues associated with the lack of DS data standards in EHR-based data collection. Our pilot study helped to identify potential gaps in DS protocols and disparities among patients to directly inform improvements to DS process administration. We also highlighted the lack of DS data standards that are needed to enable clinicians to more easily utilize DS data for evaluation of successful implementation and measurement of the impact of DS on patients. Data standards could be implemented into EHRs to streamline DS data collection across facilities in a way that is not currently available. Working with oncology facilities poses unique challenges; however, using the facilitators described here offers increased potential for enrolling and working successfully with various cancer centers and insight into partnering with diverse facilities when collecting non-standardized DS clinical data.

Acknowledgements
The authors wish to acknowledge all participating facilities, the Patient-Centered Research Collaborative of Psychosocial Oncology (PCRC), Dr. Brad Zebrack, and the Commission on Cancer (CoC).

Authors’ contributions
DN participated in study design and protocol development, recruitment of participating facilities, and data collection and abstraction, and was a major contributor in writing the manuscript. MSG participated in study design and protocol development and was a major contributor in writing the manuscript. GH participated in and oversaw study design and protocol development and acted as study director. TF participated in study design and protocol development and led recruitment of facilities into the study. JB participated in study design and protocol development and contributed in writing the manuscript. KS participated in study design and protocol development. EAR was the Principal Investigator for the study, participated in study design and protocol development, and contributed in writing the manuscript. All authors read and approved the final manuscript.

Authors’ information
Author Disclaimer Statement: The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention. Use of trade names is for identification only and does not imply endorsement by the US Department of Health and Human Services.

Funding
This work was completed in part by Westat through a contract by the Centers for Disease Control and Prevention (HCUCCG-2017-07697; contract #200–2014-61258).

Availability of data and materials
The datasets generated and analyzed during the current study are not publicly available because identifiers cannot be removed in a way that adequately protects subjects’ privacy while still retaining useful variables, but they are available from the corresponding author on reasonable request.

Declarations

Ethics approval and consent to participate
This study was reviewed and approved by the CDC Human Subjects Review Board (HSRB) (protocol #7225.0), Westat’s IRB (protocol #6282.07), and the U.S. Office of Management and Budget (OMB Control# 0920–1270). All participating facilities also completed their own IRB review and approval. The requirement for patient consent was waived by Legacy IRB, Penn Medicine Chester County Hospital IRB, Marion General Hospital IRB, White Plains Hospital IRB, Kettering Health Network IRB, Valley Health/Winchester Medical Center IRB, Providence Health and Services (Oregon) IRB, Duke University Health System IRB, and Penn Medicine Princeton Health IRB in accordance with federal regulations per IRB approvals at each facility. All methods were carried out in accordance with relevant guidelines and regulations.

Consent for publication
Not applicable.



Competing interests
The authors declare that they have no competing interests.

Author details
1Westat, Rockville, MD, USA. 2Centers for Disease Control and Prevention, Division of Cancer Prevention and Control, Atlanta, GA, USA. 3American College of Surgeons, Commission on Cancer, Chicago, IL, USA.

Received: 17 December 2020 Accepted: 4 March 2021

References
1. NCCN Clinical Practice Guidelines in Oncology: Distress Management, version 2.2018 [https://www.nccn.org/patients/resources/life_with_cancer/distress.aspx].

2. Cancer Program Standards: Ensuring Patient-Centered Care [https://www.facs.org/quality-programs/cancer/coc/standards].

3. Faller H, Schuler M, Richard M, Heckl U, Weis J, Kuffner R. Effects of psycho-oncologic interventions on emotional distress and quality of life in adult patients with cancer: systematic review and meta-analysis. J Clin Oncol. 2013;31(6):782–93. https://doi.org/10.1200/JCO.2011.40.8922.

4. Lutgendorf SK, Andersen BL. Biobehavioral approaches to cancer progression and survival: mechanisms and interventions. Am Psychol. 2015;70(2):186–97. https://doi.org/10.1037/a0035730.

5. Andersen BL, DeRubeis RJ, Berman BS, Gruman J, Champion VL, Massie MJ, Holland JC, Partridge AH, Bak K, Somerfield MR, Rowland JH, American Society of Clinical Oncology. Screening, assessment, and care of anxiety and depressive symptoms in adults with cancer: an American Society of Clinical Oncology guideline adaptation. J Clin Oncol. 2014;32(15):1605–19. https://doi.org/10.1200/JCO.2013.52.4611.

6. Rohan EA. Removing the stress from selecting instruments: arming social workers to take leadership in routine distress screening implementation. J Psychosoc Oncol. 2012;30(6):667–78. https://doi.org/10.1080/07347332.2012.721487.

7. Ehlers SL, Davis K, Bluethmann SM, Quintiliani LM, Kendall J, Ratwani RM, Diefenbach MA, Graves KD. Screening for psychosocial distress among patients with cancer: implications for clinical practice, healthcare policy, and dissemination to enhance cancer survivorship. Transl Behav Med. 2019;9(2):282–91. https://doi.org/10.1093/tbm/iby123.

8. Kayser K, Acquati C, Tran TV. No patients left behind: a systematic review of the cultural equivalence of distress screening instruments. J Psychosoc Oncol. 2012;30(6):679–93. https://doi.org/10.1080/07347332.2012.721489.

9. Mitchell AJ, Vahabzadeh A, Magruder K. Screening for distress and depression in cancer settings: 10 lessons from 40 years of primary-care research. Psychooncology. 2011;20(6):572–84. https://doi.org/10.1002/pon.1943.

10. Pincus HA, Patel SR. Barriers to the delivery of psychosocial care for cancer patients: bridging mind and body. J Clin Oncol. 2009;27(5):661–2. https://doi.org/10.1200/JCO.2008.19.7780.

11. Rosen AK, Gaba DM, Meterko M, Shokeen P, Singer S, Zhao S, Labonte A, Falwell A. Recruitment of hospitals for a safety climate study: facilitators and barriers. Jt Comm J Qual Patient Saf. 2008;34(5):275–84. https://doi.org/10.1016/S1553-7250(08)34034-3.

12. Points to Consider about Recruitment and Retention While Preparing a Clinical Research Study [https://www.nimh.nih.gov/funding/grant-writing-and-application-process/points-to-consider-about-recruitment-and-retention-while-preparing-a-clinical-research-study.shtml].

13. Whicher DM, Miller JE, Dunham KM, Joffe S. Gatekeepers for pragmatic clinical trials. Clin Trials. 2015;12(5):442–8. https://doi.org/10.1177/1740774515597699.

14. Anderson ML, Califf RM, Sugarman J, participants in the NIH HCSRCCRTW. Ethical and regulatory issues of pragmatic cluster randomized trials in contemporary health systems. Clin Trials. 2015;12(3):276–86. https://doi.org/10.1177/1740774515571140.

15. Johnson AM, Jones SB, Duncan PW, Bushnell CD, Coleman SW, Mettam LH, Kucharska-Newton AM, Sissine ME, Rosamond WD. Hospital recruitment for a pragmatic cluster-randomized clinical trial: lessons learned from the COMPASS study. Trials. 2018;19(1):74. https://doi.org/10.1186/s13063-017-2434-1.

16. Jacobsen PB, Norton WE. The role of implementation science in improving distress assessment and management in oncology: a commentary on “screening for psychosocial distress among patients with cancer: implications for clinical practice, healthcare policy, and dissemination to enhance cancer survivorship”. Transl Behav Med. 2019;9(2):292–5. https://doi.org/10.1093/tbm/ibz022.

17. Zebrack B, Kayser K, Bybee D, Padgett L, Sundstrom L, Jobin C, Oktay J. A practice-based evaluation of distress screening protocol adherence and medical service utilization. J Natl Compr Cancer Netw. 2017;15(7):903–12. https://doi.org/10.6004/jnccn.2017.0120.

18. Optimal Resources for Cancer Care [facs.org/-/media/files/quality-programs/cancer/coc/optimal_resources_for_cancer_care_2020_standards.ashx].

19. United States Cancer Statistics Data Visualizations Tool, based on November 2018 submission data (1999–2016): U.S. Department of Health and Human Services, Centers for Disease Control and Prevention and National Cancer Institute. 2019; www.cdc.gov/cancer/dataviz. Accessed June 2019.

20. Stewart SL, Harewood R, Matz M, Rim SH, Sabatino SA, Ward KC, Weir HK. Disparities in ovarian cancer survival in the United States (2001–2009): findings from the CONCORD-2 study. Cancer. 2017;123(Suppl 24):5138–59.

21. American Hospital Association annual survey database [https://www.ahadataviewer.com/about/data/].

22. International Classification of Diseases, Tenth Revision, Clinical Modification(ICD-10-CM) [cdc.gov/nchs/icd/icd10cm.htm].

23. Merkow RP, Chung JW, Paruch JL, Bentrem DJ, Bilimoria KY. Relationship between cancer center accreditation and performance on publicly reported quality measures. Ann Surg. 2014;259(6):1091–7. https://doi.org/10.1097/SLA.0000000000000542.

24. Rural Hospital Closures: Number and Characteristics of Affected Hospitals and Contributing Factors [https://www.gao.gov/assets/700/694125.pdf].

25. Greene SM, Geiger AM. A review finds that multicenter studies face substantial challenges but strategies exist to achieve institutional review board approval. J Clin Epidemiol. 2006;59(8):784–90. https://doi.org/10.1016/j.jclinepi.2005.11.018.

26. Greene SM, Geiger AM, Harris EL, Altschuler A, Nekhlyudov L, Barton MB, Rolnick SJ, Elmore JG, Fletcher S. Impact of IRB requirements on a multicenter survey of prophylactic mastectomy outcomes. Ann Epidemiol. 2006;16(4):275–8. https://doi.org/10.1016/j.annepidem.2005.02.016.

27. Creswell JW, Creswell JD. Research design: qualitative, quantitative, and mixed methods approaches. 5th ed. Los Angeles: SAGE; 2018.

28. Newgard CD, Hui SH, Stamps-White P, Lewis RJ. Institutional variability in a minimal risk, population-based study: recognizing policy barriers to health services research. Health Serv Res. 2005;40(4):1247–58. https://doi.org/10.1111/j.1475-6773.2005.00408.x.

29. Bowles KH, Potashnik S, Ratcliffe SJ, Rosenberg M, Shih NW, Topaz M, Holmes JH, Naylor MD. Conducting research using the electronic health record across multi-hospital systems: semantic harmonization implications for administrators. J Nurs Adm. 2013;43(6):355–60. https://doi.org/10.1097/NNA.0b013e3182942c3c.

30. Hughes RG, Beene MS, Dykes PC. The significance of data harmonization for credentialing research. Washington, DC: Discussion Paper, Institute of Medicine; 2014.

31. Raghupathi W, Raghupathi V. Big data analytics in healthcare: promise and potential. Health Inf Sci Syst. 2014;2(1):3. https://doi.org/10.1186/2047-2501-2-3.

32. Standards for Cancer Registries Volume I: Data Exchange Standards and Record Descriptions [https://www.naaccr.org/wp-content/uploads/2018/05/Standards-Volume-I-Version-18.pdf].

33. Pirl WF, Muriel A, Hwang V, Kornblith A, Greer J, Donelan K, Greenberg DB, Temel J, Schapira L. Screening for psychosocial distress: a national survey of oncologists. J Support Oncol. 2007;5(10):499–504.

34. Blumenthal D, Tavenner M. The “meaningful use” regulation for electronic health records. N Engl J Med. 2010;363(6):501–4. https://doi.org/10.1056/NEJMp1006114.

Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
