

e640 • www.ajmc.com • DECEMBER 2015

CLINICAL

© Managed Care & Healthcare Communications, LLC

In 2010, the Secretary for the Department of Veterans Affairs (VA) identified improving access to care as a top priority.1 The Veterans Health Administration (VHA) had been collecting and analyzing data on wait times for more than a decade, and observational studies found associations between wait times and poorer short- and long-term quality indicators.2 Research also highlighted challenges faced by veterans in rural communities and by female veterans, with travel demands and transportation difficulties sometimes exacerbated by veterans' functional status, resulting in delayed or forgone care.3,4

Technology was seen as part of the solution by offering alternate ways to access care.5 Research suggested telehealth interventions could improve access, including speeding time to treatment while achieving results similar to in-person visits in terms of patient satisfaction and experience of care.6 Simultaneously, there were concerns about implementation of new technologies introducing problems such as privacy and confidentiality vulnerabilities and disruption to clinic work flow.7

In 2011, the VHA implemented specialty care electronic consults (e-consults) at 15 pilot sites. E-consults offer primary care providers (PCPs) the option to obtain specialty care expertise by submitting patient consults via the VHA's electronic health record (EHR)8,9; e-consults have been implemented in other healthcare systems as well.10-13 Specialists then respond with advice and/or recommendations on whether veterans should be seen in person. If implemented effectively, e-consults should improve specialty care access and reduce travel for veterans.

The VHA's Office of Specialty Care Transformation (OSCT), which was responsible for overseeing the dissemination of e-consults, requested assistance in identifying the challenges associated with implementation to facilitate further dissemination. Thus, the Specialty Care Evaluation Center was created to evaluate e-consult implementation. We used the Consolidated Framework for Implementation

E-Consult Implementation: Lessons Learned Using Consolidated Framework for Implementation Research

Leah M. Haverhals, MA; George Sayre, PsyD; Christian D. Helfrich, PhD, MPH; Catherine Battaglia, PhD, RN; David Aron, MD, MS; Lauren D. Stevenson, PhD; Susan Kirsh, MD, MPH; P. Michael Ho, MD, MPH; and Julie Lowery, PhD

ABSTRACT

Objectives: In 2011, the Veterans Health Administration (VHA) implemented electronic consults (e-consults) as an alternative to in-person specialty visits to improve access and reduce travel for veterans. We conducted an evaluation to understand variation in the use of the new e-consult mechanism and the causes of variable implementation, guided by the Consolidated Framework for Implementation Research (CFIR).

Study Design: Qualitative case studies of 3 high- and 5 low-implementation e-consult pilot sites. Participants included e-consult site leaders, primary care providers, specialists, and support staff identified using a modified snowball sample.

Methods: We used a 3-step approach, with a structured survey of e-consult site leaders to identify key constructs, based on the CFIR. We then conducted open-ended interviews, focused on key constructs, with all participants. Finally, we produced structured, site-level ratings of CFIR constructs and compared them between high- and low-implementation sites.

Results: Site leaders identified 14 initial constructs. We conducted 37 interviews, from which 4 CFIR constructs distinguished high-implementation e-consult sites: compatibility, networks and communications, training, and access to knowledge and information. For example, illustrating compatibility, a specialist at a high-implementation site reported that the site changed the order of consult options so that all specialties listed e-consults first to maintain consistency. High-implementation sites also exhibited greater agreement on constructs.

Conclusions: By using the CFIR to analyze results, we facilitate future synthesis with other findings, and we better identify patterns of implementation determinants common across settings.

Am J Manag Care. 2015;21(12):e640-e647

VOL. 21, NO. 12 • THE AMERICAN JOURNAL OF MANAGED CARE • e641

Facilitators and Barriers to Implementing E-Consults

Research (CFIR) to identify those factors that facilitated or hindered e-consult implementation among pilot sites. The CFIR consolidates and standardizes definitions of implementation factors, thereby providing a pragmatic structure for identifying potential influences on implementation and comparing findings across sites and studies.14,15 The CFIR is composed of 5 domains: intervention characteristics, outer setting, inner setting, characteristics of individuals involved in implementation, and the process of implementation.14 Thirty-seven constructs characterize these domains. The objective of this study is to use the CFIR for identification and comparison of implementation factors across sites in an effort to learn from their experiences.

METHODS

A post-implementation interpretive evaluation16 was conducted using semi-structured, key informant interviews with structured ratings of CFIR constructs. The unit of analysis was the site and included 8 of 15 pilot sites (geographic-site/specialty combinations), selected for variation on overall e-consult implementation rates, measured as a ratio of e-consults to all consults for specialties of interest. Three e-consult sites were randomly selected from the 7 sites in the top half of e-consult implementation rates, and 5 were selected from the 8 sites in the bottom half (53% of sites interviewed). E-consult volume data were assessed from the beginning of the pilot period to initial site selection, May 2011 to February 2012.
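The site-selection logic just described (an implementation rate computed as e-consults over all consults, a split into top and bottom halves, and random draws from each half) can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the site names and consult counts are invented placeholders, since the study does not report per-site volumes at this granularity.

```python
import random

rng = random.Random(0)

# 15 hypothetical pilot sites with placeholder consult counts; the study's
# actual per-site volumes (May 2011 to February 2012) are not reproduced here.
sites = {
    f"site_{i:02d}": {"e_consults": rng.randint(10, 200), "all_consults": 500}
    for i in range(1, 16)
}

def implementation_rate(counts):
    """E-consult implementation rate: ratio of e-consults to all consults."""
    return counts["e_consults"] / counts["all_consults"]

# Rank sites by rate and split into a top half (7 sites) and bottom half (8).
ranked = sorted(sites, key=lambda s: implementation_rate(sites[s]), reverse=True)
top_half, bottom_half = ranked[:7], ranked[7:]

# Randomly select 3 high- and 5 low-implementation sites (8 of 15, ~53%).
selected = rng.sample(top_half, 3) + rng.sample(bottom_half, 5)
```

The split at the midpoint mirrors the paper's 7-site top half and 8-site bottom half; any real analysis would substitute the observed consult counts.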

A modified snowball sample was used to recruit participants, beginning with local site leaders and directors from both primary care and specialty care; e-consult programs straddle multiple clinical divisions, so some sites had multiple leaders. Interview participants were asked to identify specialists, PCPs, and support staff (nurse practitioners, pharmacists, and medical support assistants) engaged in initiatives. The rationale for conducting interviews at a small, but purposefully selected, sample of sites was to focus on obtaining an in-depth understanding of the differences in context in which implementation occurred, and how these differences might be related to implementation success.17-19

Data and Analysis

To identify a subset of high-probability CFIR constructs, a Web-based survey (available in eAppendix A [eAppendices available at www.ajmc.com]) was first conducted of e-consult pilot site leaders to rate the relevance of CFIR constructs to e-consult implementation. The initial CFIR survey was returned by all 21 e-consult site leaders. Of the 37 CFIR constructs, 14 were rated as important or very important by at least 90% of participants (Table 1). An interview guide (eAppendix B) was developed around those constructs and updated iteratively, a standard accepted practice in qualitative evaluations.20,21 Prior to conducting the interviews, analysts participated in 2 in-person, 2-day CFIR qualitative analysis training meetings, which included conducting CFIR ratings, group debriefings, and discussions of ratings. Interviews were conducted by telephone by an interviewer and note-taker and were digitally recorded. Interview pairs reviewed and clarified interview notes post interview, referring to recordings as needed. The pairs then independently coded interview notes from each participant according to CFIR constructs, ensuring that notes were consistent with the definitions of the CFIR constructs.
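The screening step above amounts to a simple threshold filter: keep a construct when at least 90% of the 21 responding site leaders rated it important or very important. A minimal sketch follows; the vote counts are invented for illustration (the paper reports only which 14 constructs passed the screen), though the construct names are real CFIR constructs.

```python
N_LEADERS = 21    # all 21 site leaders returned the survey
THRESHOLD = 0.90  # rated "important" or "very important" by at least 90%

# Hypothetical vote counts per CFIR construct: how many of the 21 leaders
# rated each construct important or very important. Numbers are placeholders.
important_votes = {
    "compatibility": 21,
    "networks and communications": 20,
    "available resources": 19,
    "patient needs and resources": 14,  # hypothetical: fails the 90% screen
    "cosmopolitanism": 10,              # hypothetical: fails the 90% screen
}

# Keep the constructs that clear the threshold, preserving survey order.
key_constructs = [
    name for name, votes in important_votes.items()
    if votes / N_LEADERS >= THRESHOLD
]
```

Note that 19 of 21 (90.5%) clears the 90% bar, so the threshold comparison must use the exact ratio rather than a rounded percentage.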

Following coding of the interview responses, the pairs rated the influence of each construct in the organization (positive or negative) and the magnitude or strength of its influence22 (–2, –1, 0, +1, +2) using established criteria (Table 2). Pairs distinguished constructs that were not specifically mentioned (missing) from those with ambiguous or neutral effects (rated 0). Following independent coding, pairs convened via phone or in person to resolve discrepancies and reach consensus, based on consensual qualitative research methods.20,21 Using ratings across participants and participants' roles (some participants' responses were weighted more heavily than others), pairs derived an overall rating for each construct for each site, and noted if there was significant variability for constructs (a difference of at least 2 points across 2 or more participants). Assigning ratings to the qualitative interview data in this way allows for a systematic, rapid comparison of findings across sites.23 A matrix of ratings for all constructs across sites was developed and used to examine the extent to which constructs were more likely to be rated as negative or zero/mixed among sites with low volume and more likely to be rated as positive at sites with a high volume of e-consults.14

Take-Away Points

Our research identified implementation factors that distinguished between medical centers that were less versus more successful at implementing a health information technology initiative: electronic consults (e-consults). These factors and their implications for implementing new health information technology programs include:

• Compatibility: design the initiative to fit in with existing work processes.

• Networks and communications: assess the degree of communication among participants; attend to indications of poor communication.

• Training resources: expend effort on training.

• Access to knowledge and information: establish key contacts easily accessible to program participants.
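The matrix comparison described above can be expressed as a small predicate over site-level ratings: a construct distinguishes the high-volume sites when it is rated positive at every high-volume site and negative, zero, or mixed at every low-volume site. The ratings below are fabricated examples, not the study's data; only the construct names, the –2..+2 scale, and the site numbering (sites 6 to 8 high-volume, per Table 3) come from the article.

```python
HIGH_SITES = {6, 7, 8}  # sites 6-8 are the high-volume sites (per Table 3)

# Hypothetical site-level CFIR ratings on the -2..+2 scale of Table 2.
ratings = {
    "compatibility":         {1: -1, 2: -2, 3: 0, 4: -1, 5: 0, 6: 2, 7: 1, 8: 2},
    "leadership engagement": {1: -1, 2: -1, 3: -2, 4: -1, 5: -1, 6: -1, 7: -1, 8: 0},
}

def distinguishes(construct):
    """True if the construct is rated positive at all high-volume sites and
    non-positive (negative, zero, or mixed) at all low-volume sites."""
    high = [v for site, v in ratings[construct].items() if site in HIGH_SITES]
    low = [v for site, v in ratings[construct].items() if site not in HIGH_SITES]
    return all(v > 0 for v in high) and all(v <= 0 for v in low)

# Constructs whose ratings separate high- from low-volume sites.
distinguishing = [c for c in ratings if distinguishes(c)]
```

Under these invented ratings, compatibility separates the two groups while leadership engagement (negative at both, as the Results note) does not.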

RESULTS

Thirty-seven interviews were completed with participants across 8 sites (Table 3). At all sites, a minimum of 3 people were interviewed, including e-consult site leader(s). In site-level CFIR ratings, 3 CFIR constructs had negative ratings in both low- and high-volume sites: design quality and packaging (perceptions of how the intervention is bundled and presented), leadership engagement, and goals and feedback, suggesting that these might be areas of concern for VHA. Nevertheless, the high-volume sites were able to overcome these challenges. Specifically, 4 CFIR constructs had more positive ratings at high-volume sites and more negative, neutral, or mixed ratings at low-volume sites, suggesting they might be critical implementation determinants: 1) compatibility, 2) networks and communications, 3) available resources (specifically training), and 4) access to knowledge and information. Differences between the low- and high-volume sites for each of these constructs are described below, and more examples are provided in Table 4.

Compatibility. Compatibility refers to the degree of tangible fit between meaning and values attached to the intervention, as well as those of the individuals' own norms, values, perceived risks, and needs in the context of how the intervention fits with existing work flows and systems.14 Participants' opinions on compatibility varied at low-volume sites. Some PCPs perceived e-consults as adding to their workload and were not happy with the transfer of responsibility for certain tasks: "I feel like they've tried to transfer a lot of the work and basically [are] making the PCP a clerical person to collect and collate and put all this data together." Others at low-volume sites saw the potential for e-consults to make a difference, but were frustrated with the need to account for numbers of e-consults: "Here's what drives me nuts. We have always done e-consults here. We just didn't call them e-consults…Then suddenly someone gave them a name—e-consults—and someone decided we could measure them. Had to change the process to deliver advice so they could get counted…[the] number of e-consults isn't the be-all, end-all. [We] do [e-consults] to decrease visits. Appropriate to measure prompt access, not number [of e-consults]."

For high-volume sites, e-consults were more consistently described as good for work flow by streamlining existing consult processes. One e-consult site leader from a

Table 1. Consolidated Framework for Implementation Research Constructs Identified From Web-Based Survey With E-Consult Site Leaders Prior to Qualitative Interviews

Compatibility: The degree of tangible fit between meaning and values attached to the intervention by involved individuals; how those align with individuals' own norms, values, and perceived risks and needs; and how the intervention fits with existing work flows and systems.

Relative priority: Individuals' shared perception of the importance of the implementation within the organization.

Goals and feedback: The degree to which goals are clearly communicated, acted on, and fed back to staff, and alignment of that feedback with goals.

Leadership engagement: Commitment to, involvement with, and accountability of leaders and managers for the implementation.

Available resources: The level of resources dedicated for implementation and ongoing operations, including money, training, education, physical space, and time.

Champions: Individuals who dedicate themselves to supporting, marketing, and "driving through" an implementation, overcoming indifference or resistance that the intervention may provoke in an organization.

Executing: Carrying out or accomplishing the implementation according to plan.

Reflecting and evaluating: Quantitative and qualitative feedback about the progress and quality of implementation, accompanied by regular personal and team debriefing about progress and experience.

Design quality and packaging: Perceived excellence in how the intervention is bundled, presented, and assembled.

Readiness for implementation: Tangible and immediate indicators that the organization is committed to implementing the innovation.

Knowledge and beliefs: Individuals' attitudes toward and value placed on the intervention, as well as familiarity with facts, truths, and principles related to the intervention.

Adaptability: The degree to which an intervention can be adapted, tailored, refined, or reinvented to meet local needs.

Networks and communications: The nature and quality of webs of social networks, and the nature and quality of formal and informal communications within an organization.

Engaging: Involvement of and relationships with leaders and all key stakeholders in the change process.


high-volume site thought the process was very efficient: “I love it; I think it’s fantastic. There are many times things come up and I would like opinions on and get notes in [the] chart but I don’t think the provider needs to see the patient. I can do it when I have time to organize my time and thoughts…Most of the time this is faster than [a] face-to-face appointment.”

Examining further the differences in context between the low- and high-volume sites that might account for differences in perception of the compatibility of e-consults with existing processes, the high-volume sites incorporated e-consults in ways that improved the efficiency of operations, whereas the low-volume sites did not. Specifically, high-volume sites spent considerable time and effort tailoring the EHR templates so they could be completed easily and quickly. One site hired a pharmacist to handle the additional workload needed to generate and follow up on e-consults. In contrast, low-volume sites did not take extra steps to facilitate implementation.

Networks and communications. The networks and communications construct refers to the nature and quality of webs of social networks and of formal and informal communications within an organization.14 With e-consults, specialists must reach out to PCPs to engage them, so it is important that good networks and communications exist to facilitate this engagement. Most low-volume sites noted that there was little to no communication around implementation of e-consults. One specialist at a low-volume site said variability in communication within their Patient-Aligned Care Team (PACT; the VHA's version of the patient-centered medical home) created a barrier to implementation of e-consults: "…meaning, there's many, many different ways these PACT teams communicate with each other, and because they're ultimately responsible for implementing the e-consults, I would say the greatest barrier is the way they communicate. By sticky notes, phone calls, CPRS [EHR], I would say the variability [among] those methods is our largest barrier to successfully implement

Table 2. Criteria Used to Assign Ratings to Constructs

–2: The construct is a negative influence in the organization, an impeding influence in work processes, and/or an impeding influence in implementation efforts. The majority of interviewees (at least 2) describe explicit examples of how the key aspect (or all aspects, or the absence) of a construct manifests itself in a negative way.

–1: The construct is a negative influence in the organization, an impeding influence in work processes, and/or an impeding influence in implementation efforts. Interviewees make general statements about the construct manifesting in a negative way but without concrete examples:
• The construct is mentioned only in passing or at a high level without examples or evidence of actual, concrete descriptions of how that construct manifests;
• There is a mixed effect of different aspects of the construct but with a general overall negative effect;
• There is sufficient information to make an indirect inference about the generally negative influence; and/or
• Judged as weakly negative by the absence of the construct.

0: A construct has neutral influence if:
• It appears to have neutral effect (purely descriptive) or is only mentioned generically without valence;
• There is no evidence of positive or negative influence;
• Credible or reliable interviewees contradict each other;
• There are positive and negative influences at different levels in the organization that balance each other out; and/or
• Different aspects of the construct have positive influence while others have negative influence, and overall, the effect is neutral.

+1: The construct is a positive influence in the organization, a facilitating influence in work processes, and/or a facilitating influence in implementation efforts. Interviewees make general statements about the construct manifesting in a positive way but without concrete examples:
• The construct is mentioned only in passing or at a high level without examples or evidence of actual, concrete descriptions of how that construct manifests;
• There is a mixed effect of different aspects of the construct but with a general overall positive effect; and/or
• There is sufficient information to make an indirect inference about the generally positive influence.

+2: The construct is a positive influence in the organization, a facilitating influence in work processes, and/or a facilitating influence in implementation efforts. The majority of interviewees (at least 2) describe explicit examples of how the key or all aspects of a construct manifest themselves in a positive way.

Responses were coded as missing when interviewee(s) were not asked about the presence or influence of the construct, or when, if asked about a construct, their responses did not correspond to the intended construct and were instead coded to another construct. A lack of knowledge on the part of interviewee(s) about a construct does not necessarily indicate missing data and may instead indicate the absence of the construct.


these, because [the] system needs to take into account that variability and there's a significant amount."

In contrast, PCPs at high-volume sites noted the existence of good communication and relationships between PCPs and specialists: "There's been a lot of consultation between the e-consult team and us, so we are happy with the product we have… I think there's a better spirit of collegiality from e-consults too." The same PCP added, "…specialists are accommodating, easily approachable, stop by and talk to us… We are at an advantage, even though we are [a] large [medical center, because] there is a good relationship between PCP and specialist… Everyone is all on the same floor, so [there is] definitely good communication between PCPs and specialists." Another high-volume site participant noted that communication with e-consults was vital to successful implementation: "Everyone's been very helpful especially in neurosurgery, especially in communicating, and I think that's the biggest key."

Available resources: training. Training refers to a subconstruct of available resources, focusing on whether training has been helpful. Results showed training must include one-on-one, hands-on demonstrations to reach greatest effectiveness. At low-volume sites, training was not as common. One participant wished there was "an education component, educating providers about how to follow this new pathway," and noted, "I didn't get any training whatsoever. Should have been initial standard training, not just [an] implementation guide; that would have been very beneficial." In contrast, a high-volume site participant noted that training was crucial: "the key thing to getting this [e-consults] implemented." Another high-volume site participant said it was instrumental that a specialist was hired specifically to implement e-consults, and under the specialist's leadership, several trainings were conducted with PCPs to familiarize them with the program.

Access to knowledge and information. Access to knowledge and information refers to the ease of access to knowledge and information about e-consults in relation to work tasks.14 At low-volume sites, concerns were expressed about acquiring access to information at local and national levels. Some participants felt they were provided with very little assistance with implementation, and that processes were confusing. One PCP said, "[I'm] not sure who [the] e-consult coordinator is now. [I] felt like I was left on my own." Another noted, "It took 6 months from wanting to start [e-consults] to launching because of misinformation we were getting from both what Central Office wants on forms and what we could link and how to do this and getting time and attention from CAC [Information Technology Services]." At high-volume sites, participants felt there was a clear point of contact for obtaining information. One specialist noted their point of contact was "a really good source of information, and constructive in putting me in contact with other diabetes specialists in the country, and supporting the efforts to learn from our colleagues…[I've] been able to grow in unique ways because of his guidance."

DISCUSSION

Interviews and structured site-level ratings were used to identify a subset of 4 CFIR constructs that may be critical factors for implementing an e-consult initiative. A closer review of the interview responses offers suggestions for why the low- and high-volume sites implemented the initiative differently, which helps to explain why they differed with respect to their perceptions of compatibility, networks and communications, training, and access to knowledge and information. Essentially, the high-volume sites invested more time, energy, and resources in implementing the program than did the low-volume sites. This is not a surprising or very informative conclusion by itself, but the benefit of our approach lies in the identification of the specific areas (specific CFIR constructs) in which time, energy, and resources should be expended to achieve the best results. Specifically, the high-volume sites devoted greater effort to developing mechanisms to make e-consults more efficient for staff (ie, developing easy-to-use templates and designating staff to help with the additional workload generated by

Table 3. Number of Semi-Structured Interviews by Site and Role

Site(a) | E-Consult Site Leader(b) | PCP | Other Support Staff | Specialist | Other Provider | Total
1 | 2 | – | 1 | – | – | 3
2 | 1 | 1 | 2 | – | 2 | 6
3 | 2 | – | – | 1 | – | 3
4 | 2 | 1 | – | 1 | – | 4
5 | 5 | 1 | – | – | – | 6
6 | 3 | – | 1 | – | 1 | 5
7 | 2 | 1 | 1 | 1 | – | 5
8 | 2 | 2 | – | 1 | – | 5
Total | 19 | 6 | 5 | 4 | 3 | 37

a Sites 1 to 5 are low-volume e-consult sites; sites 6 to 8 are high-volume e-consult sites.
b Site leaders breakdown: associate chiefs of staff, chiefs of staff, or directors of primary or specialty care (n = 12); primary care providers (PCPs; n = 3); specialists/pharmacists (n = 4).


the consults); investing in training; and designating a point person to answer questions and provide information about the program. These efforts helped achieve positive results despite challenges with leadership engagement, limited materials received for implementation, and poor feedback on the status of goals or implementation.

Although the quality of networks and communications also differed between the low- and high-volume sites, it is more difficult to understand the reason for this difference and, in turn, to make specific recommendations for how sites interested in implementing e-consults might address it. Rather, sites with good networks and communications between PCPs and specialists have a better chance of succeeding in implementing a program, such as e-consults, that requires coordination across departments. Sites at which this is a problem should consider it a red flag before implementing similar programs.

While we cannot conclude that implementation success was determined by these factors, or that these same factors would be important in other sites or initiatives, the findings about the potential role of specific contextual factors were helpful to the VHA OSCT and were used to generate recommendations in subsequent implementation guides for e-consults and for other initiatives focused on improving access to specialty care.

Qualitative data collection and analysis is a common approach in implementation studies, because of the wealth of information that can be obtained from in-depth interviews, compared with closed-ended survey questions. However, there is a need to use a common terminology and to code qualitative data in a way that can facilitate analysis of data not only across cases within a single study, but also across studies. We hope this paper serves to illustrate the utility of the study's methods for these purposes, encompassing both quality improvement (QI) and research. Although attention was paid to ensuring qualitative rigor by means of a clear and transparent data collection/analysis protocol, reflexivity, and peer debriefing,

Table 4. Illustrative Quotes by Construct and by High- and Low-Volume E-Consult Sites

Compatibility
High-volume: "[E-consults are] very compatible. Before, I'd walk down and talk to the cardiologist, but it's better for the flow of the day and location…it used to take 5 minutes to walk down there [to the specialist's office], but if you're with a patient and need an answer that day, it takes patient care time away. I think [e-consults are] very good for the flow."
Low-volume: "If this is going to help the patients get taken care of, they are going to have to show it improving utilization and access. On the other hand, if they start rejecting more consults and just [are] seeing them [patients] electronically, they are not necessarily going to make the patients happier. We should define the criteria and expectations and see how this is going to impact the work flow."

Networks and communications
High-volume: "I think there's a better spirit of collegiality from e-consults, too. There could be a little bit of friction if there were too many consults, overworked, don't have access. I find myself in a situation [now with e-consults] where the doc is actually looking for consults…I just find it better [for] communication and collegiality, I guess."
Low-volume: "Sometimes when we send an e-consult it seems it goes out into space and no one looks at it; not sure how the process is on the other end. We have had to call and check sometimes because the specialist might have been gone. Would be nice to have some feedback if people are gone."

Training
High-volume: "[We] provide training with CBOCs, other facilities. [At a] regular scheduled time, [and] explain [the e-consult] process, and answer questions."
Low-volume: "There should be more training in terms of informational conference calls. There has to be a rollout process if you want to do it widely. This hasn't happened so far. Secure messaging has had a formal rollout and we have a coordinator to manage, same thing for telehealth. Those are the ones funded that we have really been pushing this year, which hasn't really happened with e-consult so far."

Access to knowledge and information
High-volume: "I think a new person in the front office is a point person and was hired to keep it [e-consults] in the formal process" and "I'm the point person for provider education, and we received approval for CME [continuing medical education] credits and a once-monthly educational program that will run through the year…[it's] a phone-based conference providers can call into."

“Not sure who e-consult coordinator is now…Week or two ago, found out surgeons had no idea how to fill out e-consult to get credit. They are doing more than anyone…and getting no credit.”

CBOCs indicates VA community-based outpatient clinics.aHigh-volume signifies Consolidated Framework for Implementation Research ratings closer to +2; low-volume, closer to –2.

e646 n www.ajmc.com n DECEMBER 2015

CLINICAL

this protocol could be further improved with the use of a true consensual approach and/or with triangulation via multiple methods to validate the analyses.24,25 Regardless of the specific methods used for increasing reliability of the coding process, by applying the widely accepted terminol-ogy of the CFIR to analyze differences between low- and high-implementation sites, findings from other small-sam-ple, qualitative QI studies can begin to be combined. Meta-analyses of these data can then be conducted to improve generalizability and add to our understanding of the most important factors affecting implementation success.
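The site-level comparison underlying Table 4 can be sketched in code. The sketch below is purely illustrative: the ratings, the number of sites, and the 2-point "distinguishing" threshold are invented assumptions, not the study's data or its actual decision rule, which rested on qualitative judgment.

```python
# Hypothetical site-level CFIR ratings on the study's ordinal scale (-2 to +2).
# All values and site counts are invented for illustration only.
ratings = {
    "compatibility":               {"high": [2, 1, 2], "low": [-1, -2, 0, -1, -2]},
    "networks_and_communications": {"high": [1, 2, 1], "low": [-2, -1, -1, 0, -2]},
    "training":                    {"high": [2, 2, 1], "low": [-1, -1, -2, -1, 0]},
    "access_to_knowledge":         {"high": [1, 2, 2], "low": [-2, -2, -1, -1, -1]},
}

def mean(xs):
    """Arithmetic mean of a list of ordinal ratings."""
    return sum(xs) / len(xs)

# A construct is flagged as "distinguishing" when the group means diverge
# by at least 2 points (an arbitrary threshold chosen for this sketch).
for construct, groups in ratings.items():
    gap = mean(groups["high"]) - mean(groups["low"])
    flag = "distinguishing" if gap >= 2 else "not distinguishing"
    print(f"{construct}: high={mean(groups['high']):+.1f} "
          f"low={mean(groups['low']):+.1f} gap={gap:.1f} ({flag})")
```

Tabulating ratings this way makes cross-site and, eventually, cross-study comparison mechanical, which is the point of using a shared vocabulary such as the CFIR.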

Limitations

One of this study's limitations is limited generalizability of the findings beyond the participating sites and beyond e-consults. However, the purpose of this study was not to contribute to general, context-independent conclusions. Instead, the purpose was to obtain context-dependent knowledge, to help program leaders better understand the important role of context in implementation. In addition, while the research team took many steps to ensure rigorous qualitative analysis when applying the CFIR, these steps cannot eliminate the risk of researcher subjectivity, which is inherent to all qualitative analysis. However, the steps applied in the analysis give us confidence in the reliability of the ordinal CFIR ratings and that the rating process could be replicated and generate similar results. Nevertheless, because of the rapid analytic approach used, we did not have the time to follow a true consensual research approach,15,26 which we recommend be used whenever possible, to help reduce bias.

Another limitation is that not all CFIR constructs were considered in the coding process, as analysts focused on the shortened list of constructs, informed by the survey completed by clinical leads prior to the interviews. In addition, not all sites had sufficient data to assign a code or rating to each construct from the shortened list; thus, data on some constructs were missing for some sites.

CONCLUSIONS

The veteran population the VA serves may benefit from VHA healthcare providers utilizing e-consults to improve access to, and quality of, patient care. Our study identified 4 critical implementation factors: compatibility, networks and communications, available resources (specifically training), and access to knowledge and information, on which future VHA medical centers can focus to successfully implement e-consults and similar telehealth initiatives. Furthermore, we observed that sites that devoted effort to making e-consults more efficient for staff, invested in training, and designated a point person to answer questions and provide information about the program were more successful in implementing e-consults than were other sites. These results have important policy implications: successfully implemented initiatives like e-consults may lead to improved patient care and shorter patient wait times, and may spare patients the time and travel of specialty care visits that can instead be addressed through e-consults. Further, our study demonstrates the importance of rigorous evaluation frameworks such as the CFIR for fully understanding the implementation processes of such initiatives, both within and outside of the VA.

Acknowledgments

The authors represent the VHA Specialty Care Transformation Initiative Evaluation Center, and greatly appreciate and thank the many other evaluation center members who contributed to data collection and analysis. The authors also wish to thank Tabitha Metreger for scheduling and coordination; Jeffrey Todd-Stemberg for obtaining data from the Corporate Data Warehouse; Omar Cardenas at VA Central Office for providing a variety of information on the e-consult initiative; and Rachel Orlando for assistance in submitting the manuscript. The authors are extremely grateful to all of the VHA clinicians and staff at the e-consult sites who generously shared their experiences and insights.

Author Affiliations: VA Eastern Colorado Health Care System (LMH, CB, PMH), Denver, CO; VA Puget Sound Health Care System (GS, CDH), Seattle, WA; Louis Stokes Cleveland VA Medical Center (DA, LDS, SK), Cleveland, OH; Office of Specialty Care and Specialty Care Transformation (SK), Washington, DC; VA Ann Arbor Health Care System (JL), Ann Arbor, MI.

Source of Funding: This work was supported by the US Department of Veterans Affairs, Office of Specialty Care Transformation and Office of Health Services Research, and undertaken by the Specialty Care Transformation Initiative Evaluation Center. The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs.

Author Disclosures: Drs Helfrich, Stevenson, Kirsh, and Ho, and Ms Haverhals are employees of VA (the evaluation in this study was of VA specialty-care initiatives). Dr Helfrich also has received VA grants. Dr Battaglia works for VHA in nursing/research. The remaining authors report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article. Findings from this paper were presented in a poster session at the Society for General Internal Medicine Conference in Denver, Colorado, on April 25, 2013.

Authorship Information: Concept and design (LMH, CDH, CB, DA, SK, JL); acquisition of data (LMH, GS, CDH, LDS, SK); analysis and interpretation of data (LMH, GS, CDH, DA, SK, PMH, JL); drafting of the manuscript (LMH, GS, CDH, CB, LDS, SK, JL); critical revision of the manuscript for important intellectual content (LMH, CDH, CB, DA, LDS, SK, PMH, JL); statistical analysis (LMH); provision of patients or study materials (LMH); obtaining funding (CDH, DA, SK, PMH); administrative, technical, or logistic support (LMH); and supervision (JL).

Address correspondence to: Leah M. Haverhals, MA, Health Research Specialist, VA Eastern Colorado Health Care System, 1055 Clermont St, Research A151, Denver, CO 80220. E-mail: [email protected].

REFERENCES

1. Fortney J, Kaboli P, Eisen S. Improving access to VA care. J Gen Intern Med. 2011;26(suppl 2):621-622.
2. Pizer SD, Prentice JC. What are the consequences of waiting for health care in the veteran population? J Gen Intern Med. 2011;26(suppl 2):676-682.
3. Buzza C, Ono SS, Turvey C, et al. Distance is relative: unpacking a principal barrier in rural healthcare. J Gen Intern Med. 2011;26(suppl 2):648-654.
4. Washington DL, Bean-Mayberry B, Hamilton AB, Cordasco KM, Yano EM. Women veterans' healthcare delivery preferences and use by military era: findings from the National Survey of Women Veterans. J Gen Intern Med. 2013;28(2):571-576.
5. Fortney JC, Burgess JF Jr, Bosworth HB, Booth BM, Kaboli PJ. A re-conceptualization of access for 21st century healthcare. J Gen Intern Med. 2011;26(suppl 2):639-647.
6. Kehle S, Greer N, Rutks I, Wilt T. Interventions to improve veterans' access to care: a systematic review of the literature. J Gen Intern Med. 2011;26(suppl 2):689-696.
7. Kvedar JC, Nesbitt T, Kvedar JG, Darkins A. E-patient connectivity and the near term future. J Gen Intern Med. 2011;26(suppl 2):636-638.
8. Rosland AM, Nelson K, Sun H, et al. The patient-centered medical home in the Veterans Health Administration. Am J Manag Care. 2013;19(7):e263-e272.
9. Wild K, Tanner C, Kaye J, et al. Electronic consults to facilitate specialty dementia assessment and care. Alzheimers Dement. 2012;8(suppl 4):231.
10. Pap SA, Lach E, Upton J. Telemedicine in plastic surgery: e-consult the attending surgeon. Plast Reconstr Surg. 2002;110(2):452-456.
11. Salvo M, Nigro SC, Ward D. Pharmacist-generated electronic consults to improve hypertension management in a multisite health centre: pilot study. Inform Prim Care. 2012;20(3):181-184.
12. Angstman KB, Rohrer JE, Adamson SC, Chaudhry R. Impact of e-consults on return visits of primary care patients. Health Care Manag (Frederick). 2009;28(3):253-257.
13. Ackerman S, Intinarelli G, Gleason N, et al. "Have you thought about sending that as an e-consult?": primary care providers' experiences with electronic consultations at an academic medical center. J Gen Intern Med. 2014;29(suppl 1):15.
14. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
15. Damschroder LJ, Lowery JC. Evaluation of a large-scale weight management program using the consolidated framework for implementation research (CFIR). Implement Sci. 2013;8:51.
16. Stetler CB, Legro MW, Wallace CM, et al. The role of formative evaluation in implementation research and the QUERI experience. J Gen Intern Med. 2006;21(suppl 2):S1-S8.
17. Flyvbjerg B. Five misunderstandings about case-study research. Qualitative Inquiry. 2006;12(2):219-245.
18. Yin RK. Applications of Case Study Research, Volume 34. 2nd ed. Thousand Oaks, CA: Sage; 1993.
19. Yin RK. Case Study Research: Design and Methods (Applied Social Research Methods Series, Volume 5). 2nd ed. Thousand Oaks, CA: Sage; 1994.
20. Hill CE, Knox S, Thompson BJ, Williams EN, Hess SA, Ladany N. Consensual qualitative research: an update. J Couns Psychol. 2005;52(2):196-205.
21. Hill CE, Thompson BJ, Williams EN. A guide to conducting consensual qualitative research. The Counseling Psychologist. 1997;25(4):517-572.
22. Averill JB. Matrix analysis as a complementary analytic strategy in qualitative inquiry. Qual Health Res. 2002;12(6):855-866.
23. Rihoux B, Ragin CC. Why compare? why configurational comparative methods? In: Rihoux B, Ragin CC, eds. Configurational Comparative Methods. Thousand Oaks, CA: Sage; 2009.
24. Gilson L, Hanson K, Sheikh K, Agyepong IA, Ssengooba F, Bennett S. Building the field of health policy and systems research: social science matters. PLoS Med. 2011;8(8):e1001079.
25. Mays N, Pope C. Qualitative research in health care: assessing quality in qualitative research. BMJ. 2000;320(7226):50-52.
26. Damschroder LJ, Goodrich DE, Robinson CH, Fletcher CE, Lowery JC. A systematic exploration of differences in contextual factors related to implementing the MOVE! weight management program in VA: a mixed methods study. BMC Health Serv Res. 2011;11:248.


eAppendix A

The following e-mail was sent to e-consult pilot site leaders to invite them to complete the Web-based survey, which rated the relevance of CFIR constructs to e-consult implementation and informed the qualitative interview guide.

Dear Colleague:

We have been asked by the Office of Specialty Care Services to evaluate the implementation and effectiveness of the e-consults initiative. In brief, this initiative is designed to provide more patient-centric and timely care to veterans by establishing electronic and phone consults that allow the primary care provider to order a non–face-to-face consult with a specialist, thereby eliminating needless travel and wait times. It involves the establishment of pilot centers around the country that will provide e-consultation for a variety of specialties, including diabetes/endocrinology, liver transplants, cardiology, neurosurgery, pain management, and others. The pilot sites include Providence, East Orange, Pittsburgh, Baltimore, Durham, Orlando, Dayton, Detroit, Indianapolis, Hines, Houston, Dallas, Denver, Portland, Las Vegas, Minneapolis, and Jackson.

We ask your assistance in creating a guide for interviewing stakeholders that focuses on those organizational factors you believe will be most influential in leading to successful or failed implementation of e-consults. Stakeholders include those people who are key to the successful implementation of e-consults (eg, participating primary care providers and consulting specialists).

Below is a list of the factors documented in the Consolidated Framework for Implementation Research.1 Please rate the importance of these factors in implementing e-consults at the participating pilot centers. Please circle your response, on a scale from 1 (very unimportant) through 3 (neither unimportant nor important) to 5 (very important).

INTERVENTION CHARACTERISTICS

Intervention source

Perception of key stakeholders about whether the e-consult policy and application are externally or internally developed.

1 2 3 4 5

Evidence strength & quality

Stakeholders’ perceptions of the quality and validity of evidence (in the published literature, based on clinical experience, or other local evidence or experience) supporting the belief that e-consults will have the desired outcomes.

1 2 3 4 5

Relative advantage Stakeholders’ perception of the advantage of implementing e-consults versus an alternative solution for improving access to specialty care.

1 2 3 4 5

Adaptability The degree to which e-consults can be adapted, tailored, refined, or reinvented to meet local needs.

1 2 3 4 5

Trialability

The ability to test e-consults on a small scale and to be able to reverse course (undo implementation) if warranted.

1 2 3 4 5

Complexity Perceived difficulty of implementation, reflected by duration, scope, radicalness, disruptiveness, centrality, intricacy, and number of steps required to implement e-consults.

1 2 3 4 5

Design quality and packaging

Perceived excellence in how e-consults are designed and disseminated as a program.

1 2 3 4 5

Cost Costs of providing e-consults, as well as the costs associated with implementation (including time and effort).

1 2 3 4 5

OUTER SETTING

Patient needs & resources

The extent to which patient needs, as well as barriers and facilitators to meeting those needs, are accurately known and prioritized by the participating facilities.

1 2 3 4 5

Cosmopolitanism

The degree to which participating facilities are networked with other external organizations.

1 2 3 4 5

Peer pressure Mimetic or competitive pressure to implement e-consults, typically because most or other key peer or competing organizations have already implemented them or are in a bid for a competitive edge.

1 2 3 4 5

External policy and incentives

External strategies to spread e-consults including policy and regulations, external mandates, recommendations and guidelines, pay-for-performance/performance measures, collaboratives, and public or benchmark reporting.

1 2 3 4 5

Structural characteristics

Age, size, and complexity of participating facilities.

1 2 3 4 5

INNER SETTING

Networks & communications

The nature and quality of webs of social networks, and the nature and quality of formal and informal communications, within the participating facilities.

1 2 3 4 5

Culture Norms, values, and basic assumptions of participating facilities.

1 2 3 4 5

Implementation climate

The absorptive capacity for change, shared receptivity of involved individuals to e-consults, and the extent to which use of e-consults will be supported and expected within the participating facilities.

1 2 3 4 5

Tension for change

The degree to which stakeholders perceive the current situation regarding access to specialty care as intolerable or needing change.

1 2 3 4 5

Compatibility

The degree of tangible fit between meaning and values attached to e-consults by involved individuals; how those align with individuals’ own norms, values, and perceived risks and needs; and how e-consults fit with existing workflows and systems.

1 2 3 4 5

Relative priority Individuals’ shared perception of the importance of e-consults within the participating facilities.

1 2 3 4 5

Organizational incentives & rewards

Extrinsic incentives such as goal-sharing awards, performance reviews, promotions, and raises in salary, and less tangible incentives such as increased stature or respect, for promoting implementation of e-consults.

1 2 3 4 5

Goals and feedback

The degree to which goals of e-consults are clearly communicated, acted upon, and fed back to staff, and alignment of that feedback with goals.

1 2 3 4 5

Learning climate A climate in which: (a) leaders express their own fallibility and need for team members’ assistance and input; (b) team members feel that they are essential, valued, and knowledgeable partners in the change process; (c) individuals feel psychologically safe to try new methods; and (d) there is sufficient time and space for reflective thinking and evaluation.

1 2 3 4 5

Readiness for implementation

Tangible and immediate indicators of participating facilities’ commitment to their decision to implement e-consults.

1 2 3 4 5

Leadership engagement

Commitment, involvement, and accountability of leaders and managers with the implementation of e-consults.

1 2 3 4 5

Available resources

The level of resources dedicated to implementation and ongoing operations, including money, training, education, physical space, and time.

1 2 3 4 5

Access to knowledge and information

Ease of access to digestible information and knowledge about e-consults and how to incorporate them into work tasks.

1 2 3 4 5

CHARACTERISTICS OF INDIVIDUALS

Knowledge & beliefs about the intervention

Individuals’ attitudes towards and value placed on e-consults, as well as familiarity with facts, truths, and principles related to the intervention.

1 2 3 4 5

Self-efficacy Individuals’ beliefs in their own capabilities to execute courses of action to achieve implementation of e-consults.

1 2 3 4 5

Individual stage of change

Characterization of the phase (eg, pre-contemplation, contemplation) an individual is in, as he or she progresses towards skilled, enthusiastic, and sustained use of e-consults.

1 2 3 4 5

Individual identification with the organization

How individuals perceive their VAMC [Veterans Affairs Medical Center] or CBOC [community-based outpatient clinic] and their relationship and degree of commitment to VA.

1 2 3 4 5

Other personal attributes

Other personal traits of stakeholders, such as tolerance of ambiguity, intellectual ability, motivation, values, competence, capacity, and learning style.

1 2 3 4 5

PROCESS

Planning The degree to which planning for implementing e-consults is developed in advance, and the quality of planning.

1 2 3 4 5

Engaging Attracting and involving appropriate individuals in the implementation and use of e-consults through a combined strategy of social marketing, education, role modeling, training, and other similar activities.

1 2 3 4 5

Opinion leaders People in the participating facilities who have formal or informal influence on the attitudes and beliefs of their colleagues with respect to implementing e-consults.

1 2 3 4 5

Formally appointed implementation leaders

People in the participating facilities who have been formally appointed with responsibility for implementing e-consults as coordinator, project manager, team leader, or other similar role.

1 2 3 4 5

Champions

People who dedicate themselves to supporting, marketing, and “driving through” e-consults, overcoming indifference or resistance that e-consults may provoke in an organization.

1 2 3 4 5

External change agents

People who are affiliated with an outside entity who formally influence or facilitate the implementation of e-consults.

1 2 3 4 5

Executing Carrying out or accomplishing the implementation according to plan.

1 2 3 4 5

Reflecting & evaluating

Quantitative and qualitative feedback about the progress and quality of implementation, accompanied by regular personal and team debriefing about progress and experience.

1 2 3 4 5

What 3 factors do you think will most determine the success of e-consults? Other comments?

eAPPENDIX REFERENCES

1. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.

eAppendix B

E-Consult Interview Guide

Interviewer Name:
Interviewee: [Ask if it is okay to add their name, explaining that their responses will be kept confidential. If they don’t want their name on this document, then devise a code, but note that the code will be linked to their identity in a separate document. Code could be (site number).(date).(interviewer’s initials).(interview sequence number for this date).]
Site:
Date:
Time Start:
Time End:

Hello [Dr/Mr/Ms] [interview participant name], my name is [interviewer name] and joining me is my colleague [note-taker name], who will be taking notes. We are with 1 of 2 evaluation teams tasked with understanding how e-consults for [condition] have been implemented in [site]. These findings will be summarized in a quarterly report to the Office of Specialty Care Transformation. We won’t identify you as a participant, nor will we identify your site in any of our reports. The call will take approximately 45-60 minutes, and we’d like to talk with you again in approximately 1 year. Your participation in this interview is voluntary. You can stop the interview at any time, and let us know if you’d rather not answer a particular question. Do you have any questions?

In order to make sure we capture all of the information you give us, we would like to record this call. The audio file for the recording will be uploaded to a restricted-access file on the VA intranet immediately after we complete this interview. Is this okay with you?

[Generic prompts: If responses are limited or require clarification, probes may be used to elicit more detailed responses. Probes should use words or phrases presented by the participant, using one of the following formats:
1. What do you mean by ____________?
2. Can you tell me more about ____________?
3. Can you give me an example of ____________?
4. Can you tell me about a time when ____________?]

1. What is your job title? What are your main responsibilities?

2. What is your role in e-consults? When did you start this role?

Fidelity*
3. How have e-consults changed methods and patterns of referral?

[Probes:]
a. In the past month, approximately how many e-consults did you refer/receive? How does this compare with the number of consults you referred/received by the usual method?

b. What are the criteria that are used for referring patients to e-consults versus criteria for referring patients for an in-person consultation?

c. What forms of electronic consultation were used before e-consults (if any)? (Examples include specialists responding to consults in CPRS [computerized patient record system] but without a formal procedure or workload credit; e-mailing consult requests and information; or calling the specialist by phone for advice.)

Fidelity*
4. How are patients referred to specialists as part of e-consults? Please describe the entire process.
[Probes:]
a. How does this differ from how patients were referred to specialists prior to e-consults? Is this method still used in certain cases? Under what circumstances? [This may have been covered in #3, in which case you can skip this question.]

[The following are the key components of e-consults. After the interviewee has described the process in general, please review the following list to be sure that all of these components have been covered. If not, please ask for additional detail on specific components.]
[Probes:]
b. What are some examples of consult questions that a PCP [primary care provider] might ask?
c. What background information does the PCP provide to the specialist?
d. How is an e-consult requested?
• E-consult is presented as a CPRS menu choice;
• Consult is ordered and it is automatically an e-consult unless otherwise indicated by the PCP; or
• Specialist decides whether they want to do e-consult or in-person for that case.
e. How does the specialist generate a response to the PCP?
f. How are the patients involved?
g. Does the PCP provide the specialist with information on the actions taken?
h. Is there a template that is used for any of these steps? If so, please explain how it is used and what data are collected.

Compatibility*
5. How compatible are e-consults with the way that you practice? Please explain.

[Probes:]
a. How difficult was it to make changes in the way you request/provide consultations? Have these changes been easy or difficult to incorporate into your daily clinical practice? Please explain.

Networks & Communication*
6. What kinds of communications take place (in addition to the consult itself) among the providers participating in e-consults that are important for making the program work?
[Probes:]
a. Are they face-to-face or phone calls?
b. Why are they important?
c. Do you think some methods of communication are more effective than others? (Probe for reasons if not offered.)
d. Is there any communication that’s lacking? For what kinds of issues?

Adaptability*
7. Did you feel like you had enough flexibility (the ability to change aspects of the program to make it work) to implement the program in a way that would work best in your facility? Why or why not?
[Probes:]
a. What about the program was flexible or adaptable?
b. What about the program was inflexible or not adaptable?
c. What aspects of the program did you change/adjust?
d. What aspects of the program would you have liked to change, but couldn’t? What barriers exist to making these changes?

Engagement* and Planning*
8. Can you describe the planning that was done by facility leadership to get the program implemented? [If the interviewee doesn’t know, because he/she wasn’t involved, ask if they can identify someone who might know about this process.]

Leadership*
9. What level of involvement does top management (eg, medical center director or chief of staff) at your facility have with the program? Were there any other leaders who played an important role?

Engaging* & Champions*
10. Were there people in your facility who were especially instrumental in helping to get the program implemented? Who were these people? What roles did they play?

Available Resources*
11. Do you have sufficient resources to support the program? Can you tell me more about that?
[Probes:]
a. Do you know how much funding you received and were able to allocate for the initiative?
b. Did you hire new staff specifically for the initiative? If so, how many people and for what positions?
c. Do participating specialists have sufficient time to provide consults?
d. Do PCPs have sufficient time to initiate the care recommended by the specialists?
e. What other staff assistance do you have (eg, for setting up the templates, pulling data, coding, etc)?
f. Has any training been provided?
g. Are there any other resources that you received, or would have liked to receive?

Relative Priority* & Available Resources*
12. Do you feel that you personally have enough time to dedicate to fulfilling your responsibilities for e-consults? If not, what aspects of your responsibilities for the program does this affect?

Executing*
13. Was there anything that prohibited you from implementing the program according to plan?
[Probes:]
a. How did you deal with this? Do you feel it was resolved?

Reflecting & Evaluating*
14. Do you receive any regular information/data on the program?
[Probes, follow up if yes:]
a. What kind of data do you receive, and how frequently?
b. How are they used? Are they discussed with others?

Unintended Consequences*
15. Have there been any problems caused by implementation of e-consults (eg, a patient, provider, or other specialist who has been unhappy with the program)? Can you provide an example (tell stories)?
[Probes:]
a. Any safety issues/adverse events?

Part of Memorandum of Understanding to assess how e-consults impacted quality of care*
16. How do you think e-consults have affected the quality of specialty care provided to patients?

17. Do you think e-consults have had any impact on the PACT [Patient-Aligned Care Team]? Please explain.
[Probes:]
a. Have they increased the workload for the team (eg, they have to implement the recommendations made by the specialist)?
b. Have they improved communication and coordination of care among PCPs and specialists?

Design Quality & Packaging*
18. What do you think about the information and tools that you received for implementing e-consults?
[Probes:]
a. What materials/instructions were available for supporting the implementation of the program?
b. How were these materials helpful?
c. How would you describe the quality of the materials?
d. What other materials would have been helpful to support the implementation of the program?

Knowledge & Beliefs about the Intervention*
19. How well do you think the program is meeting its goals? Please explain with specific examples. [Let them talk before probing, to get at knowledge and beliefs.]
[Probes: ask about how well the program is meeting the following specific goals:]
a. The primary goal is to ensure that the integrity of the PACT teams is maintained by:
o giving skills to primary care providers, and
o empowering the teams with help that’s needed to manage their patients.
b. A secondary goal is to decrease travel.
c. A third goal is to increase access.

20. Is there anything else you would like us to know about implementation of e-consults at your site?
[Probes, if the following haven’t already been covered:]
a. What do you like about e-consults?
b. What has been most helpful in getting e-consults implemented?
c. What have been the greatest barriers?
d. How could the program be improved?

21. Can you please give us the names of any other people at your site who are familiar with the e-consults program, who might be willing to participate in an interview? [If you don’t yet have the names of some specialists or primary care providers, please be sure to ask for these.]

22. Do you have any questions for us?

Thank you for participating in this interview.