
Improving Interprofessional Education and Collaborative Practice Through Evaluation: An Exploration of Current Trends

Emmanuelle Careau, Center for interdisciplinary research in rehabilitation and social integration (CIRRIS), and College of Health Disciplines, University of British Columbia

Lesley Bainbridge, College of Health Disciplines, University of British Columbia

Marla Steinberg, School of Population and Public Health, University of British Columbia

Chris Lovato, School of Population and Public Health, University of British Columbia

Abstract: Research on evaluation (RoE) is essential to increase knowledge, develop robust approaches, and help evaluators conduct better evaluations. In this article, we used the concept of meta-evaluation in the field of interprofessional education and collaborative practice to identify current evaluation trends and efforts in RoE reflective of capacity building. The results help to identify weaknesses in current evaluation methods and highlight the negative consequences of poor RoE for knowledge development. Specific recommendations are drawn out to increase the quality of evaluation studies, to build evidence in RoE, and to strengthen its connection to evaluation practice in the field.

Keywords: collaborative practice, interprofessional education, research on evaluation

Résumé : La recherche sur l'évaluation est essentielle pour améliorer les connaissances, développer des approches robustes et aider les évaluateurs à mener leurs évaluations. Nous avons ainsi réalisé une méta-évaluation dans le domaine de la formation interprofessionnelle et des pratiques de collaboration pour identifier les tendances actuelles en évaluation et les efforts de recherche sur l'amélioration de la capacité d'évaluation. Les résultats ont contribué à identifier les faiblesses dans le domaine et à souligner les conséquences négatives du manque de recherche sur l'évaluation. Des recommandations spécifiques ont été faites pour pallier ces faiblesses et pour mieux appliquer les connaissances dans la pratique.

Mots clés : pratique collaborative, formation interprofessionnelle, recherche sur l'évaluation

Corresponding author: Emmanuelle Careau, OT, PhD, Center for interdisciplinary research in rehabilitation and social integration (CIRRIS), 525 boul. Wilfrid-Hamel, Québec, Canada, G1M 2S8; phone: 418-529-9141, fax: 418-529-3548; e-mail: [email protected]

© 2016 Canadian Journal of Program Evaluation / La Revue canadienne d'évaluation de programme 31.1 (Spring / printemps), 1–17 doi: 10.3138/cjpe.225


INTRODUCTION

Collaborative practice occurs when health care providers, patients/clients/families, and communities develop and maintain interprofessional working relationships that enable optimal health outcomes (Canadian Interprofessional Health Collaborative [CIHC], 2010). Early evidence shows that interprofessional collaborative practice (ICP) has been linked to improvements in patient safety and case management, the optimal use of each healthcare team member's skills, and the provision of better health services (Berridge, Mackintosh, & Freeth, 2010; Reeves, Lewin, Espin, & Zwarenstein, 2010; Suter et al., 2012; Zwarenstein, Goldman, & Reeves, 2009). Theoretical assumptions in this field suggest that interprofessional education (IPE) in training and clinical settings is needed to ensure that health care providers acquire knowledge and develop the skills needed to work in a collaborative manner (Bainbridge, Nasmith, Orchard, & Wood, 2010; CIHC, 2010; World Health Organization [WHO], 2010). While there has been a call for more evaluation in the field of IPE and ICP, little is known about the characteristics of current program evaluations. Relatedly, we know of no resources that review the state of "research on evaluation" (RoE) in this area. In this study we apply the concept of meta-evaluation to address this gap by reviewing evaluation abstracts from papers and posters presented at recent conferences. We also reviewed a selection of relevant journals and websites to identify current efforts in RoE reflective of capacity building. Results will be of interest to those involved in developing and evaluating interprofessional education and collaborative practice (IPECP) interventions and engaging in research on evaluation designed to build evaluation capacity in this area.

Interprofessional education (IPE) occurs when two or more members of more than one health and/or social care profession learn interactively about, with, and from each other for the explicit purpose of improving collaborative practice to enhance the health and well-being of patients (Bainbridge & Wood, 2013; Centre for the Advancement of Interprofessional Education, 2014; Reeves, Perrier, Goldman, Freeth, & Zwarenstein, 2013). This learning requires active engagement, co-location, communication, and sharing among learners; trustful interaction to learn about people from other disciplines; and respect for and confidence in others' knowledge (Bainbridge & Wood, 2012). Since the early 1990s, many strategies have been developed and implemented to improve IPECP. Unfortunately, the evaluation of these initiatives has not kept pace with their development, and many authors have identified the need for evaluation that facilitates knowledge building and informs implementation of practice-based interprofessional collaboration (IPC) interventions (Zwarenstein et al., 2009) and IPE (Hammick, Freeth, Koppel, Reeves, & Barr, 2007; Reeves et al., 2013).


A systematic literature review on IPE in clinical settings that included papers up to 2006 (Davidson, Smith, Dodd, & O'Loughlan, 2007) reported several concerns with the evaluations, including limited descriptions of project objectives, inconsistency in outcome measurement, bias induced by the evaluation process, and methods that were not detailed or theoretically grounded.

In a more recent review of the titles of IPECP papers identified through PubMed, Paradis and Reeves (2013) noted an increased focus on a "psychometric paradigm of inquiry," as reflected by words such as "evaluate/evaluation, assess/assessment, outcome, intervention, and evidence"; however, little is known about the specific focus and characteristics of these evaluation studies. Two Cochrane reviews published in 2009 (Zwarenstein et al.) and 2013 (Reeves et al.) raise concerns related to evaluation capacity. Unfortunately, there appears to be a limited number of studies that provide the building blocks to optimize the quality of interventions and evaluation (Reeves et al., 2013). For example, Zwarenstein and colleagues (2009) identify the need for research on the conceptualization and measurement of collaboration, a key concept underlying IPECP. Although instrument development is an important element of building evaluation capacity, broader RoE is essential to increase knowledge, connect theory to practice, develop robust approaches, and help evaluators select the best methods, practices, and techniques for conducting evaluations (Szanyi, Azzam, & Galen, 2013). While program evaluation focuses on a specific evaluation to improve it or determine its outcomes and impacts, RoE aims at increasing knowledge to enhance capacity for conducting sound evaluation. Szanyi and colleagues (2013) identified 10 categories of RoE: impact (e.g., when and how evaluations are successfully used); methods (e.g., which methods are most cost-efficient); context (e.g., what contextual factors alter evaluation); ethics (e.g., how evaluators weigh the needs and concerns of different stakeholders); culture (e.g., how evaluators become culturally competent); technology (e.g., what technology is available for evaluators to use); professional development (e.g., current training characteristics of practicing evaluators); policy issues (e.g., how evaluation policies vary across organizations); conceptual research (e.g., the predominant evaluation theories driving practices today); and background research (e.g., what sectors evaluators work in). Addressing these kinds of questions would contribute to building evaluation capacity in IPECP.

METHOD

Although searching peer-reviewed literature is a valuable way to conduct a meta-evaluation to explore the current state of knowledge on a specific topic and identify what researchers perceive to be of interest, legitimate, and valuable, it is not necessarily the best method to uncover how evaluation is conducted in a field (Davidson et al., 2007). This is because many interventions, including IPECP initiatives, are neither developed nor evaluated for publication purposes and often do not appear in the published literature, yet the knowledge they produce is used by practicing professionals and represents scholarly work that is both produced by someone with expertise in the field and reviewed by peers or other professionals working in the area.


To identify current methods of program evaluation used in IPECP, we conducted a retrospective analysis of oral and poster presentation abstracts from two recent international IPECP conferences (All Together Better Health [ATBH] VI, Kobe, Japan, 2012, and Collaborating Across Borders [CAB] IV, Vancouver, Canada, 2013). These conferences, attended by a wide variety of professionals working in this area (e.g., educators, researchers, evaluators, policy makers, students), were chosen because of their scope, reputation, and popularity, as well as the rigour of their peer-reviewed selection processes, which focus on selection of high-quality and current IPECP initiatives. We believe that analyzing abstracts from these conferences provides a more realistic representation of current practices in the field, compared to a traditional literature review.

We began by screening abstracts from the 2012 ATBH VI and 2013 CAB IV programs to identify those that included at least one term related to evaluation (e.g., implementation, evaluation, outcome, or assessment). The abstracts identified were then read to determine if they should be included in the review. To be selected, abstracts had to present an IPE initiative or a practice-based ICP intervention that included an evaluation component. For each selected abstract, the level of outcomes measured was determined using the 6-point Joint Evaluation Team (JET) classification of IPE outcomes (Barr, Koppel, Reeves, Hammick, & Freeth, 2005). This classification categorizes outcomes into six levels: "reaction" (Level 1), "modification of attitudes/perceptions" (Level 2a), "acquisition of knowledge/skills" (Level 2b), "behavioural change" (Level 3), "change in organizational practice" (Level 4a), and "benefits to patients/clients" (Level 4b). Then, to detail the evaluation methods, each abstract was analyzed using an iterative qualitative content analysis with deductive themes identified from an evaluation glossary (Hutchinson, Dunkley, & Lovato, 2013). Table 1 presents the definition for each theme used in this analysis. Descriptive statistics were used to complete the analysis.
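To make the screening and tallying steps concrete, the sketch below shows one way the keyword screen and the JET-level counts could be implemented. It is a minimal illustration of our own, not the tooling used in the study: the assignment of JET levels to each abstract was a human coding judgement, and names such as EVAL_TERMS and tally_jet are hypothetical.

```python
from collections import Counter

# Evaluation-related screening terms and JET outcome levels described above.
EVAL_TERMS = ("implementation", "evaluation", "outcome", "assessment")
JET_LEVELS = ("1", "2a", "2b", "3", "4a", "4b")

def screen(abstracts):
    """Keep abstracts that mention at least one evaluation-related term."""
    return [a for a in abstracts if any(t in a["text"].lower() for t in EVAL_TERMS)]

def tally_jet(abstracts):
    """Count how often each JET level was coded. Levels are not mutually
    exclusive: one abstract may measure outcomes at several levels."""
    counts = Counter()
    for a in abstracts:
        counts.update(a["jet_levels"])  # levels assigned by human reviewers
    return {level: counts[level] for level in JET_LEVELS}

# Hypothetical hand-coded sample:
sample = [
    {"text": "Evaluation of an IPE workshop ...", "jet_levels": ["1", "2a"]},
    {"text": "Assessment of ward team behaviour ...", "jet_levels": ["3"]},
]
print(tally_jet(screen(sample)))
# -> {'1': 1, '2a': 1, '2b': 0, '3': 1, '4a': 0, '4b': 0}
```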

To identify RoE research we searched the peer-reviewed and grey literature. The search strategy for peer-reviewed literature combined keywords proposed by Paradis and Reeves (2013) on IPECP (e.g., inter-, multi-, trans-; -professional*, -disciplinary*) with the keywords "research on evaluation," "evaluation model," "evaluation approach," "evaluation method," or "evaluation framework." This strategy was applied using healthcare/education databases (Medline, CINAHL, and ERIC¹) and journal-specific databases (Journal of Interprofessional Care and Journal of Research in Interprofessional Practice and Education). In addition, the IPECP keywords combined with the keyword or filter "health*" were searched using the databases of the Canadian Journal of Program Evaluation, American Journal of Evaluation, Research Evaluation, and Evaluation. Only papers published from 2007 to September 2013 were targeted.
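For illustration, a Boolean query combining these keyword families might look like the following. This is our own reconstruction of the strategy described above, not the exact syntax submitted to any of the databases:

```
(inter* OR multi* OR trans*) AND (professional* OR disciplinary*)
AND ("research on evaluation" OR "evaluation model" OR "evaluation approach"
     OR "evaluation method" OR "evaluation framework")
```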


Table 1. Themes Used to Analyze Abstracts

Research on evaluation (yes/no): Research aiming at accumulating evidence about evaluation itself (knowledge about evaluation) to provide an empirical basis for improving practice and enhancing our understanding of the types of evaluation most appropriate and effective within a context (e.g., studies on evaluation outcomes, comparative studies of evaluation practice, meta-evaluation, experimental studies on evaluation, practice component studies, and evaluation of technical assistance and training) (Henry & Mark, 2003).

Psychometric research (yes/no): Research on the theory and technique of psychological measurement, which includes the measurement of knowledge, abilities, attitudes, personality traits, behaviour change, and educational measurement. The studies are primarily concerned with the construction and validation of measurement instruments such as questionnaires, tests, and personality assessments.

Focus of evaluation: The areas of inquiry (Patton, 2012).

Purpose
- Formative: Evaluation intended to improve performance, most often conducted during the implementation phase of projects or programs. Formative evaluations may also be conducted for other reasons such as compliance, legal requirements, or as part of a larger evaluation initiative (Organisation for Economic Co-operation and Development, 2002).
- Summative: Evaluation conducted at the end of an intervention (or a phase of that intervention) to determine the extent to which anticipated outcomes were produced. Summative evaluation is intended to provide information about the worth of the program (Organisation for Economic Co-operation and Development, 2002).
- Developmental: Developmental evaluation applies to an ongoing process of innovation in which both the path and the destination are evolving. While a focus on outcome measurement is maintained, developmental evaluation is intended to support innovation and identification of specific outcomes within a context of uncertainty, in comparison to more traditional evaluation approaches that focus on a priori specification and measurement of outcomes (Patton, 2012).

Data analysis
- Quantitative: Study that examines phenomena through the numerical representation of observations and statistical analysis.
- Qualitative: Study that involves detailed verbal descriptions of characteristics, cases, and settings, typically through use of observation, interviewing, and document review to collect data.
- Mixed: Study that triangulates both quantitative and qualitative data in one or many phases of the research process.

Design
- Experimental/quasi-experimental: A study that uses controlled experimentation with comparison groups (with or without randomization).
- Naturalistic: A study that evaluates a program without comparison groups.

Setting
- Educational: Study that takes place in an educational setting and involves prelicensed or graduate health students.
- Clinical: Study that takes place in a clinical or continuing educational setting and involves healthcare practitioners.
- Both: Study that takes place in a clinical setting and involves prelicensed or graduate health students.

Measurement
- Self-measurement: Data in which participants record their own recollections of events, feelings, judgements, perceptions, and attitudes.
- Measurement through observation: Data in which raters judge participants' attitudes, skills, behaviour, and so on through observation.

Measurement tool
- Validated: Measurement tool that has gone through a formal validation process (reliability, content validation, etc.).
- Not validated/not specified: Measurement tool that was developed for the project and has not gone through a formal validation process.

Evaluation framework/approach/method
- Explicitly identified: Study that explicitly names the evaluation framework, approach, or method used (e.g., realistic evaluation, Kirkpatrick, RE-AIM).
- Not identified

Note. In the table, primary sources for definitions are cited if applicable. The complete references can be found in the Evaluation Glossary developed by Hutchinson, Dunkley, and Lovato (2013).

Titles and abstracts were reviewed, and those addressing one of the 10 RoE categories (Szanyi et al., 2013) were included in the review. We manually searched the grey literature using similar search terms in Google and examined eight websites from key organizations working in the area of IPECP (see the list in Table 2).


Table 2. Websites and Descriptions of Organizations Working in IPECP

European Interprofessional Practice and Education Network (EIPEN): Aims to develop and share effective interprofessional training programs, methods, and materials for improving collaborative practice in health and social care in Europe. International organization (Europe). www.eipen.eu

Nordic Interprofessional Network (NIPNET): A learning network to foster interprofessional collaboration in education, practice, and research. It is primarily for Nordic educators, practitioners, and researchers in the fields of health. International organization (Scandinavian countries). https://nipnet.org/

National Center for Interprofessional Practice and Education (NCIPE): Leads, coordinates, and studies the advancement of collaborative, team-based health profession education and patient care as an efficient model for improving quality, outcomes, and cost. National organization (United States). http://nexusipe.org/

Centre for the Advancement of Interprofessional Education (CAIPE): Aims to promote and develop interprofessional education (IPE) with and through its individual, corporate, and student members, in collaboration with like-minded organizations in the UK and overseas, for the benefit of patients and clients. National organization (UK). http://caipe.org.uk/

Canadian Interprofessional Health Collaborative (CIHC): Works at the edges and interfaces of health, education, and the professions to discover and share promising practices to promote interprofessional education and collaboration in areas that will enhance patient care. CIHC is the hub for Canadian interprofessional activity. National organization (Canada). www.cihc.ca

Réseau de collaboration sur les pratiques interprofessionnelles (RCPI): Supports the development and valorization of knowledge, skills, and attitudes associated with person- and community-centred interprofessional collaboration. Provincial organization (Quebec, Canada). www.rcpi.ulaval.ca

Centre for Interprofessional Education: Provides IPE opportunities to pre-entry-to-practice students and practice-based health professionals at its affiliated hospitals and community clinical placements, and aims to lead the advancement of IPE through education, practice, and research initiatives. Provincial organization (Ontario, Canada). www.ipe.utoronto.ca

College of Health Disciplines: Develops curriculum, conducts research, and disseminates findings on IPECP, participates in global interprofessional programs, and expands community partnerships to make health care safer and more effective for patients. Provincial organization (British Columbia, Canada). www.chd.ubc.ca


RESULTS

Characteristics of Program Evaluations in IPECP

Of the 614 abstracts included in the conference programs (CAB IV = 295, ATBH VI = 319), a total of 16 papers concerned the development/validation of a measurement tool (2.6%), 3 papers proposed general recommendations and guidelines to conduct evaluation within the field (0.5%), and only 1 paper described an innovative evaluation framework/approach/method (0.2%). Most of the abstracts presented an initiative/program to improve collaborative practice through IPE or practice-based interventions, but only 277 (45%) of them described an explicit evaluation approach in their abstract.

Exploring the outcomes measured with the JET classification of outcomes indicated that the majority of program evaluations focused on measuring initial outcomes such as reactions toward the initiative/program (42%), modification of attitudes and perceptions about IPECP (31%), and acquisition of collaborative knowledge and skills (47%). This was followed by behavioural change (12%), changes in organizational practice (6%), and benefits to patients/clients (5%). Table 3 details the outcomes measured according to JET levels of classification.

Table 3. Outcomes Measured According to JET Classification

Outcomes                                            ATBH VI %ᵃ (n)   CAB IV %ᵇ (n)   Total %ᶜ (n)
Level 1: Reaction
  Learners' satisfaction                            39% (57)         20% (26)        30% (83)
  Educators/faculty satisfaction                    8% (12)          5% (6)          7% (19)
  Patient/community satisfaction                    6% (9)           4% (5)          5% (14)
  Total Level 1                                     53% (78)         29% (37)        42% (116)
Level 2a: Modification of attitudes/perceptions
  Learners' attitudes and beliefs toward IPECP      21% (31)         24% (32)        23% (63)
  Learners' self-efficacy beliefs                   –                9% (12)         4% (12)
  Team perceived cohesion                           –                2% (2)          2% (2)
  Learners' identity                                3% (4)           2% (2)          2% (6)
  Total Level 2a                                    24% (35)         37% (48)        31% (83)
Level 2b: Acquisition of knowledge/skills
  Learners' collaborative skills                    18% (26)         42% (56)        30% (82)
  Learners' knowledge about collaborative practice  14% (21)         19% (25)        17% (46)
  Total Level 2b                                    32% (47)         61% (81)        47% (128)
Level 3: Behavioural change
  Learners' collaborative behaviour                 3% (5)           17% (22)        10% (27)
  Team performance                                  –                5% (6)          2% (6)
  Total Level 3                                     3% (5)           22% (28)        12% (33)
Level 4a: Change in organizational practice
  Organizational collaborative practice             3% (5)           5% (7)          4% (12)
  Organization culture shift toward IPECP           –                4% (5)          2% (5)
  Total Level 4a                                    3% (5)           9% (12)         6% (17)
Level 4b: Benefits to patients/clients
  Patient health                                    6% (9)           5% (6)          5% (15)
  Total Level 4b                                    6% (9)           5% (6)          5% (15)

ᵃ Percentages are calculated from a total of 145 abstracts.
ᵇ Percentages are calculated from a total of 132 abstracts.
ᶜ Percentages are calculated from a total of 277 abstracts.

Overall, learners' satisfaction and acquisition of collaborative skills, such as communication, patient-centred practice, teamwork, problem resolution, role clarification, and collaborative leadership, represented the most frequently measured outcomes (30% each). In comparing the two conferences, a trend toward medium-term outcomes was seen from 2012 to 2013. For example, a drastic decrease in the measurement of satisfaction was noted (53% to 29%), while an increase was noted for all the higher levels. The largest increases concerned acquisition of collaborative skills (18% in 2012 and 42% in 2013), followed by learners' collaborative behaviour (3% in 2012 and 17% in 2013).

Based on the information in the abstracts, several characteristics of the evaluations were extracted (Table 4). There were no major differences in the characteristics of evaluations between the conferences held in 2012 and 2013. The purpose of evaluation was evenly distributed between formative (32%), summative (29%), and developmental (36%). The majority of authors (88%) did not explicitly identify the evaluation framework or method used in their research. Most of the studies were conducted within an educational setting (49%), with almost all studies taking place in real-world settings (99%). Self-reported data were most commonly used (66%), with exclusive use of quantitative and qualitative methods 54% and 51% of the time, respectively. Triangulation of data through the adoption of a mixed method was uncommon (7%). Of the 149 studies using a quantitative approach, only 92 (62%) of the authors indicated that they used a validated instrument. The most commonly used instruments were the Readiness for Interprofessional Learning Scale (Parsell & Bligh, 1999), the Collaborative Practice Assessment Tool (Schroder et al., 2011), the Interdisciplinary Education Perception Scale (McFadyen, Maclaren, & Webster, 2007), and the Attitudes Toward Health Care Teams Scale (Heinemann, Schmitt, Farrell, & Brallier, 1999).


Table 4. Characteristics of Conference Abstracts

Characteristics                        ATBH VI %ᵃ (n)   CAB IV %ᵇ (n)   Total %ᶜ (n)
Purpose of evaluation
  Formative                            33% (48)         31% (41)        32% (89)
  Summative                            26% (38)         33% (43)        29% (81)
  Developmental                        41% (59)         30% (40)        36% (99)
  Not specified                        –                6% (8)          2% (8)
Data analysisᵈ
  Quantitative                         59% (85)         48% (64)        54% (149)
  Qualitative                          57% (82)         45% (60)        51% (142)
  Mixed                                5% (7)           9% (12)         7% (19)
  Not specified                        8% (11)          22% (29)        14% (40)
Design
  Experimental/quasi-experimental      1% (2)           2% (2)          1% (4)
  Naturalistic                         99% (143)        98% (130)       99% (273)
Setting
  Educational                          33% (48)         67% (89)        50% (137)
  Clinical                             26% (38)         16% (21)        21% (59)
  Both                                 41% (59)         17% (22)        29% (81)
Measurementᵈ
  Self-measurement                     72% (105)        59% (78)        66% (183)
  Measurement through observation      52% (76)         38% (50)        45% (126)
  Not specified                        1% (2)           19% (25)        10% (27)
Evaluation framework/approach/method
  Explicitly identified                12% (18)         11% (14)        12% (32)
  Not identified                       88% (127)        89% (118)       88% (245)

ᵃ Percentages are calculated from a total of 145 abstracts.
ᵇ Percentages are calculated from a total of 132 abstracts.
ᶜ Percentages are calculated from a total of 277 abstracts.
ᵈ Categories not mutually exclusive. For example, one abstract can present both quantitative and qualitative data answering different research questions.

Current Status of Research on Evaluation

After eliminating duplicate citations, the peer-reviewed literature search yielded 235 papers retrieved through health/education databases (Medline: n = 81; CINAHL: n = 47; ERIC: n = 75) and the hand search of the Journal of Interprofessional Care (n = 18) and Journal of Research in Interprofessional Practice and Education (n = 14). No articles were found in evaluation-specific journals. After applying the selection criteria by reading the titles and abstracts, this number was reduced to 24 articles.


After reading the full text of these articles, 20 were excluded because their objectives did not fit within any of the RoE categories.² The four remaining articles discussed issues related to two RoE domains: methods and context.

We classified these articles as RoE because their primary purpose was not to evaluate a specific program but rather to identify methods and approaches, challenges faced, and contextual factors that altered the evaluation process. They also articulated how the evaluation framework used aligned with the theoretical assumptions and underlying constructs of IPECP. Greaves and colleagues (2013) describe the challenges faced in evaluating a complex intervention in a changing environment and propose a generic four-component evaluation model, located within the theory of complex intervention evaluation, for similar initiatives. The model includes quantitative and qualitative data to measure activity, impact on cost, and changes to health outcomes, as well as patient, clinician, and manager experiences. Similarly, Trojan, Suter, Arthur, and Taylor (2009) describe and critique a multisite evaluation, focusing on the challenges and benefits of various aspects of the evaluation, and recommend systems concepts for future studies. Payler, Meyer, and Humphris (2007) proposed a conceptual evaluation matrix for interprofessional pedagogic evaluation, derived from the Kirkpatrick classification, to guide theoretically informed evaluation that allows exploration of the complexity involved. Finally, Tolson, McIntosh, Loftus, and Cormie (2007) focus on the methodological strengths and challenges associated with realistic evaluation in the context of interprofessional management of cancer patients in palliative care. While advocating for the necessity of considering complexity in the evaluation of IPE and practice-based IPC interventions, the authors of all four papers noted that the success of the evaluation process is closely related to flexibility of timeframe and activities, as well as the strong involvement of stakeholders at all levels.

Resources relevant to evaluation that were retrieved from organizational websites included reviews of measurement instruments and descriptions of evaluation strategies. For example, NCIPE provides a comprehensive collection of IPECP instruments categorized by the construct measured (attitudes: n = 14; behaviour: n = 7; knowledge, skills, and abilities: n = 3; organizational practice: n = 2; patient or provider satisfaction: n = 2; other: n = 3). CAIPE published a report proposing a three-dimensional frame of reference for the evaluation of interprofessional education (Barr, Freeth, Hammick, Koppel, & Reeves, 2000), and CIHC produced a report that maps the evaluation strategies of 20 IPECP projects funded by Health Canada (CIHC, 2008). Overall, the websites mainly provide information on several instruments and propose evaluation frameworks specific to the IPECP field; however, none of them disseminates findings from evaluation studies that focus on providing recommendations for evaluation.

DISCUSSION

In this article, an analysis of IPECP conference abstracts combined with a review of peer-reviewed and grey literature provides insight into how evaluation is currently conducted and into the current state of RoE within the IPECP field.


The results from our review of conference abstracts suggest that the same weaknesses identified in previous reviews of published evaluations (Hammick et al., 2007; Reeves et al., 2013; Zwarenstein et al., 2009) are reflected in current work, suggesting that there is progress yet to be made. We also found that very little RoE is available to support evaluators in improving their practice.

Most of the evaluation conducted in this area seems to be limited to presenting a new initiative or program to improve collaborative practice through IPE or practice-based interventions. Few conference abstracts explicitly identified the evaluation approach or method used to document impact. Among those that provided information on methods, the primary focus was on measuring lower-level (i.e., short-term) outcomes as categorized by the JET classification of IPE outcomes. This included changes in attitudes and acquisition of knowledge or specific skills (e.g., communication, problem-solving, role understanding). Very few program evaluations focused on the higher-level (i.e., long-term) IPE outcomes, such as organizational changes or benefits for the patient, that will be necessary to determine the value of IPECP in improving health services.

As noted earlier, some studies and reviews have already provided recommendations for evaluators and researchers that will help to increase the quality of studies and accelerate progress in this area. These include the development and use of reliable and valid instruments, evaluation designs that move beyond short-term outcomes to measure programs' long-term outcomes and impacts, and the application of mixed methods and systems theory to address complex questions. A newly published guide from Reeves, Boet, Zierler, and Kitto (2015) provides guidance for those planning and undertaking an evaluation study of IPE. For example, the guide recommends that stakeholders think about evaluation as early as possible, clearly articulate the purpose of the evaluation, consider the theoretical perspective, and use evaluation models to guide studies. It appears that the numerous calls to increase evaluation capacity through RoE remain to be addressed. More research is needed to address questions that will contribute to better evaluation: for example, questions related to methods, contextual factors that alter evaluation, when and how evaluations are successfully used, and ethics specific to the area. Such efforts will equip researchers and evaluators to better plan, conduct, and interpret evaluations that link education and practice-based interventions with the optimization of clinical practice and, ultimately, with the improvement of healthcare processes and patient outcomes.

Conducting evaluation and research in this field is highly challenging because it often involves multiple heterogeneous interventions; nonlinear dynamic relationships between knowledge/skills acquisition and behavioural change; system-level variables such as trust, social capital, and team cohesion; and many interdependencies between the actors and organizations implicated. Although the evaluation approach was not explicitly identified in the abstracts analyzed, the descriptions, outcomes measured, and limited use of mixed methods suggest that complexity and systems theories are not considered as often as they could be. Our results indicate that methods for evaluating complex interventions are not well known or widely used in the field.


With this article we help to identify weaknesses in current evaluation methods in the IPECP field and highlight the negative consequences of poor RoE for knowledge development. The results will help to draw out specific recommendations to increase the quality of studies in the field through more robust evaluations. This article also confirms the concerns shared by Szanyi, Azzam, and Galen (2013) that there is still much more work to be done to provide evidence in RoE and to increase its connection to evaluation practice. Moreover, the methodology used provides an example of how gaps can be identified both in the literature and in the practice field. Finally, this article helps to identify opportunities for RoE researchers. Indeed, the results show that few attempts have been made to increase evidence on the methods and context of evaluation in the IPECP field, and many questions remain unanswered, especially in the other domains of RoE.

Several limitations to this work should be considered. First, only two IPECP conferences were considered for the abstract analysis. Although this may not represent the full range of evaluation taking place in the field, the ATBH and CAB conferences are the two most significant gatherings of professionals in this field, and we reasoned that they would provide a reasonably accurate picture of current practices. A second limitation is that abstracts provide limited information compared to full conference papers (if available) and full reports. Ideally, future meta-evaluations will focus on accessing full documents that provide more detailed information. Third, the two-year time frame between conferences is very brief and may not reflect the kind of growth that would have been visible with a longer interval. Future work should focus on an analysis of annual IPECP conferences to monitor growth in the field. Finally, it is possible that the approach we used in reviewing the RoE literature resulted in the exclusion of some relevant papers. It was challenging to identify the best approach to effectively search the literature. Use of the general term "evaluation" yielded too many articles, as almost every peer-reviewed publication in the field refers to evaluation in some way. The keywords chosen allowed us to reduce the search to a feasible number of papers. Moreover, we did not systematically search specific journals on education evaluation; however, we believe that relevant articles from these journals would have been identified through the ERIC database search.

In conclusion, based on the results of this work, we reinforce the recommendations of other authors who suggest that evaluators and researchers focus on issues that will enhance the internal validity of studies. However, we temper this with the recommendation that researchers and evaluators should not lose sight of external validity. To accomplish this, we recommend more emphasis on communication and collaboration between evaluators, researchers, practitioners, and policy makers, which would, no doubt, result in better evaluations that focus on questions that matter. More projects need to blend questions related to both research and program evaluation perspectives.


Glasgow, Green, Taylor, and Stange (2012) have suggested an "evidence integration triangle" (EIT) for aligning science with policy and practice that is relevant in this context. The triangle emphasizes interactions among three components: (a) an effective program collaboratively selected and adapted; (b) pragmatic, longitudinal measures for rapid feedback; and (c) true partnership approaches to implementation that pay attention to context. We recommend that evaluators and researchers alike place more emphasis on RoE questions that will continue to build evaluation capacity in IPECP. These include very practical questions related to methods and approaches that take into consideration the complexities of this area, including contextual factors, evaluation use, and issues of propriety. Such questions would not only help to build capacity in this field but could also be useful to professionals working in other health-related areas. Finally, we view IPECP settings as ideal for testing the thinking that is emerging from different evaluation theories and methods involving complexity theory and systems thinking in the evaluation field.

NOTES

1. Filter subjects "evaluation methods," "program evaluation," and "program effectiveness" were used in the ERIC database.

2. A total of nine different frameworks/approaches were identified in these articles: Barr et al.'s 6-point Joint Evaluation Team (JET) classification of IPE outcomes; Barr et al.'s interprofessional education evaluation framework; the interdisciplinary educational model evaluation; Kirkpatrick's classification of education outcomes; the Common Assessment Framework; the reflective practice model; the realistic evaluation approach; theory-based evaluation; and the Demand-Driven Learning Model.

REFERENCES

All Together Better Health (ATBH) VI. (2012, October 5–8). Programme. Kobe, Japan.

Bainbridge, L., Nasmith, L., Orchard, L., & Wood, V. (2010). Competencies for interprofessional collaboration. Journal of Physical Therapy Education, 24(1), 6–11. Retrieved from http://www.ipe.uwo.ca/Administration/teachings/Competencies%20for%20Interprofessional%20Collaboration.pdf

Bainbridge, L., & Wood, V. I. (2012). The power of prepositions: Learning with, from and about others in the context of interprofessional education. Journal of Interprofessional Care, 26(6), 452–458. http://dx.doi.org/10.3109/13561820.2012.715605

Bainbridge, L., & Wood, V. I. (2013). The power of prepositions: A taxonomy for interprofessional education. Journal of Interprofessional Care, 27(2), 131–136. http://dx.doi.org/10.3109/13561820.2012.725231

Barr, H., Freeth, D., Hammick, M., Koppel, I., & Reeves, S. (2000). Evaluations of interprofessional education: A United Kingdom review for health and social care. Retrieved from United Kingdom Centre for the Advancement of Interprofessional Education website: http://caipe.org.uk/silo/files/evaluations-of-interprofessional-education.pdf

Barr, H., Koppel, I., Reeves, S., Hammick, M., & Freeth, D. (2005). Effective interprofessional education: Argument, assumption and evidence. Oxford, UK: Blackwell.

Berridge, E. J., Mackintosh, N., & Freeth, D. (2010). Supporting patient safety: Examining communication within delivery suite teams through contrasting approaches to research observation. Midwifery, 26(5), 512–519. http://dx.doi.org/10.1016/j.midw.2010.04.009

Canadian Interprofessional Health Collaborative (CIHC). (2008). Program evaluation for interprofessional education: A mapping of evaluation strategies of the 20 IPECP projects. Retrieved from http://www.cihc.ca/files/CIHC_EvalMethods_Final.pdf

Canadian Interprofessional Health Collaborative (CIHC). (2010). A national interprofessional competency framework. Retrieved from http://www.cihc.ca/files/CIHC_IPCompetencies_Feb1210.pdf

Centre for the Advancement of Interprofessional Education. (2014). Defining IPE. Retrieved from http://caipe.org.uk/resources/defining-ipe

Collaborating Across Borders (CAB) IV. (2013, June 12–14). Peer-reviewed abstract booklet. Vancouver, Canada.

Davidson, M., Smith, R. A., Dodd, K. J., & O'Loughlan, M. J. (2007). Interprofessional pre-qualification clinical education: A systematic review. Health Profession Education, 32(1), 111–120. http://dx.doi.org/10.1071/AH080111

Glasgow, R. E., Green, L. W., Taylor, M. V., & Stange, K. C. (2012). An evidence integration triangle for aligning science with policy and practice. American Journal of Preventive Medicine, 42(6), 646–654. http://dx.doi.org/10.1016/j.amepre.2012.02.016

Greaves, F., Pappas, Y., Bardsley, M., Harris, M., Curry, N., Holder, H., ... Car, J. (2013). Evaluation of complex integrated care programmes: The approach in North West London. International Journal of Integrated Care, 13(e006), 1–12. Retrieved from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3653284/

Hammick, M., Freeth, D., Koppel, I., Reeves, S., & Barr, H. (2007). A best evidence systematic review of interprofessional education: BEME Guide no. 9. Medical Teacher, 29(8), 735–751. http://dx.doi.org/10.1080/01421590701682576

Heinemann, G. D., Schmitt, M. H., Farrell, M. P., & Brallier, S. A. (1999). Development of an Attitudes Toward Health Care Teams Scale. Evaluation & the Health Professions, 22(1), 123–142. Retrieved from http://www.ncbi.nlm.nih.gov/pubmed/10350960

Henry, G. T., & Mark, M. M. (2003). Toward an agenda for research on evaluation. New Directions for Program Evaluation, 97, 69–80.

Hutchinson, K., Dunkley, J., & Lovato, C. (2013). Evaluation glossary [Mobile app]. Retrieved from http://communitysolutions.ca/web/resources-public/

McFadyen, A. K., Maclaren, W. M., & Webster, V. S. (2007). The Interdisciplinary Education Perception Scale (IEPS): An alternative remodelled sub-scale structure and its reliability. Journal of Interprofessional Care, 21(4), 433–443. http://dx.doi.org/10.1080/13561820701352531

Organisation for Economic Co-operation and Development (OECD). (2002). Glossary of key terms in evaluation and results based management. Retrieved from http://www.oecd.org/development/peer-reviews/2754804.pdf

Paradis, E., & Reeves, S. (2013). Key trends in interprofessional research: A macrosociological analysis from 1970 to 2010. Journal of Interprofessional Care, 27(2), 113–122. http://dx.doi.org/10.3109/13561820.2012.719943

Parsell, G., & Bligh, J. (1999). The development of a questionnaire to assess the readiness of health care students for interprofessional learning (RIPLS). Medical Education, 33(2), 95–100. http://dx.doi.org/10.1046/j.1365-2923.1999.00298.x

Patton, M. Q. (2012). Essentials of utilization-focused evaluation. Thousand Oaks, CA: Sage.

Payler, J., Meyer, E., & Humphris, D. (2007). Theorizing interprofessional pedagogic evaluation: Framework for evaluating the impact of interprofessional continuing professional development on practice change. Learning in Health and Social Care, 6(3), 156–169. http://dx.doi.org/10.1111/j.1473-6861.2007.00156.x

Reeves, S., Boet, S., Zierler, B., & Kitto, S. (2015). Interprofessional education and practice guide no. 3: Evaluating interprofessional education. Journal of Interprofessional Care, 29(4), 305–312. http://dx.doi.org/10.3109/13561820.2014.1003637

Reeves, S., Lewin, S., Espin, S., & Zwarenstein, M. (2010). Interprofessional teamwork for health and social care. London, UK: Blackwell-Wiley. http://dx.doi.org/10.1002/9781444325027

Reeves, S., Perrier, L., Goldman, J., Freeth, D., & Zwarenstein, M. (2013). Interprofessional education: Effects on professional practice and healthcare outcomes (update). Cochrane Library, (3), 1–47. http://dx.doi.org/10.1002/14651858.CD002213.pub3

Schroder, C., Medves, J., Paterson, M., Byrnes, V., Chapman, C., O'Riordan, A., ... Kelly, C. (2011). Development and pilot testing of the collaborative practice assessment tool. Journal of Interprofessional Care, 25(3), 189–195. http://dx.doi.org/10.3109/13561820.2010.532620

Suter, E., Deutschlander, S., Mickelson, G., Nurani, Z., Lait, J., Harrison, L., ... Grymonpre, R. (2012). Can interprofessional collaboration provide health human resources solutions? A knowledge synthesis. Journal of Interprofessional Care, 26(4), 261–268. http://dx.doi.org/10.3109/13561820.2012.663014

Szanyi, M., Azzam, T., & Galen, M. (2013). Research on evaluation: A needs assessment. Canadian Journal of Program Evaluation, 27(1), 39–64. Retrieved from http://cjpe.journalhosting.ucalgary.ca/cjpe/index.php/cjpe/article/view/37

Tolson, D., McIntosh, J., Loftus, L., & Cormie, P. (2007). Developing a managed clinical network in palliative care: A realistic evaluation. International Journal of Nursing Studies, 44(2), 183–195. http://dx.doi.org/10.1016/j.ijnurstu.2005.11.027

Trojan, L., Suter, E., Arthur, N., & Taylor, E. (2009). Evaluation framework for a multi-site practice-based interprofessional education intervention. Journal of Interprofessional Care, 23(4), 380–389. http://dx.doi.org/10.1080/13561820902744106

World Health Organization (WHO). (2010). Framework for action on interprofessional education & collaborative practice. Retrieved from http://www.who.int/hrh/resources/framework_action/en/

Zwarenstein, M., Goldman, J., & Reeves, S. (2009). Interprofessional collaboration: Effects of practice-based interventions on professional practice and healthcare outcomes [Review]. Cochrane Library, (3), 1–29. http://dx.doi.org/10.1002/14651858.CD000072.pub2


AUTHOR INFORMATION

Emmanuelle Careau, OT, PhD, is an assistant professor in the rehabilitation department of the Faculty of Medicine at Université Laval (Québec, Canada). Dr. Careau conducted this study during her post-doctoral training at the University of British Columbia. She is currently the scientific director of the Réseau de collaboration sur les pratiques interprofessionnelles en santé (RCPI) and pursues research activities in the IPECP field.

Lesley Bainbridge, BSR(PT), MEd, PhD, holds a master's degree in education and an interdisciplinary doctoral degree with a focus on interprofessional health education. At the University of British Columbia, she served as Associate Principal, College of Health Disciplines, from 2005 to 2015 and as Director, Interprofessional Education, Faculty of Medicine, from 2005 to 2014. She was also an Associate Professor in the Department of Physical Therapy at UBC. Dr. Bainbridge has been principal or co-investigator on several Teaching and Learning Enhancement Fund grants, two major Health Canada grants focusing on interprofessional education and collaborative practice, and other funded research related to interprofessional education and collaborative practice. She has published in peer-reviewed journals and presented on IPE-related topics at national and international conferences. She retired from UBC in 2015 and, as Associate Professor Emeritus, remains involved with graduate students, short-term projects, and interprofessional developments locally, nationally, and internationally.

Marla Steinberg has been a professional evaluator for over 25 years. She has conducted evaluations in a wide range of areas, including ECD, health promotion, research capacity building, organizational development, leadership, partnerships, health services, research impact, primary care, and KT. She has extensive experience in teaching and professional development that spans developing and delivering graduate-level courses, coaching, in-person workshops, and e-learning courses.

Chris Lovato is a professor in the School of Population & Public Health, Faculty of Medicine, University of British Columbia. Her research focuses on evaluating the impact of health programs and policies, particularly in the areas of cancer prevention and health services. She is also involved in conducting a longitudinal study to evaluate the impact of medical school initiatives implemented in response to health care professional shortages in rural, remote, and northern regions of Canada.