
  • Improving the Evaluation of Interprofessional Education: Moving beyond Attitudinal Measures
    Virginia Commonwealth University · University of California San Francisco · Oregon Health & Science University

  • Panel Agenda

    Virginia Commonwealth University: Use of multisource data to evaluate classroom-based IPE for early learners

    University of California, San Francisco: Multisource feedback in a longitudinal clinical skills course where medical students are embedded in an interprofessional clinical team; multisource feedback for students and residents who are trained in a patient-centered medical home

    Oregon Health & Science University: Multisource data for evaluating a large, national study focused on implementing IPE in primary care physician residency continuity clinics; use of Q methodology to evaluate rural IPE

  • Use of Multisource Data to Evaluate Classroom-based IPE for Early Learners

    Kelly Lockeman, PhD Virginia Commonwealth University

  • Background
    • Two large-scale required IPE courses
      – Fall: Foundations of Interprofessional Practice
      – Spring: Interprofessional Quality Improvement and Patient Safety
    • Classroom-based
    • 1 credit each, pass/fail grading
    • ~500 students each semester
    • 88 interprofessional teams
    • 12-16 faculty facilitators

  • Assessment Challenges

    Assessing individual students is resource intensive.

    Many students, few faculty, limited time.

  • Our Approach to Assessment
    • Use student learning objectives as a guide.
    • Measure outcomes of individual students and teams.
    • Build measures into course activities.
    • Use validated measures when feasible. Assess validity for course-specific measures.
    • Utilize measures that are easily scored.
    • Provide formative feedback for students.
    • Iterate!

  • Typical Assessment Measures (by Learner and Team)

    Knowledge
    – Learner: quizzes
    – Team: work products

    Skills/Performance
    – Learner: self-assessment; peer assessment
    – Team: work products; (aggregate) self-report using validated tools (e.g., TDM)

    Attitudes
    – Learner: self-report using validated tools (e.g., SPICE-R2); narrative reflections

  • Sample Findings: Individual Learners

    • KNOWLEDGE: Average quiz score 85% (passing = 70%)
    • SKILLS/PERFORMANCE:
      – Peer assessment: scores depend on rating method
        • Rubric with rating criteria: scores highly skewed
        • Budget-based approach: normal distribution that clearly identifies low and high performers
      – Self-assessment: correlates with budget-based peer assessment (r = .61, p < .001)
    • ATTITUDES: Significant increase on SPICE-R2 (pre/post)
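
    The self/peer correlation above is a standard Pearson test; as a minimal sketch, it can be reproduced in Python with invented scores (the course's actual data and analysis pipeline are not shown in these slides):

    ```python
    # Minimal sketch of the self/peer correlation check; the scores are
    # hypothetical stand-ins, not the course's actual data (which showed
    # r = .61, p < .001 between self- and budget-based peer assessment).
    from scipy.stats import pearsonr

    self_scores = [72, 85, 90, 64, 78, 88, 70, 95]  # hypothetical self-assessments
    peer_scores = [68, 80, 94, 60, 75, 91, 66, 89]  # hypothetical budget-based peer scores

    r, p = pearsonr(self_scores, peer_scores)
    print(f"r = {r:.2f}, p = {p:.3f}")
    ```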

  • Sample Findings: Teams

    • KNOWLEDGE: Depiction of collaborative care in team work products suggests poor understanding of IPC
    • SKILLS/PERFORMANCE: High scores suggest teams can use teamwork to complete tasks effectively
    • ATTITUDES:
      – Students acknowledge the need for, and value of, IPE.
      – Comments suggest interprofessional socialization as a primary outcome.

  • Considerations

    • Correlation between individual measures is low: Knowledge ≠ Skills/Performance ≠ Attitudes
    • Students (and faculty) don't know they are lousy assessors.
    • Peer assessment requires practice.
    • Interrater reliability between faculty is challenging.
    • Qualitative analysis of narrative reflections is resource intensive.

  • Conclusions

    Data from multiple sources provide useful information for program evaluation, but…
    • Keep seeking evidence for validity
    • Review data annually and revise measures as needed
    • Draw from the experience of other institutions
    • Focus on faculty development

  • Thank you! For more information… Visit our website: http://ipe.vcu.edu/ Follow us: • Twitter: @VCUCIPE • Facebook: https://www.facebook.com/VCUIPE

    Contact me directly: Kelly Lockeman, PhD Director of Evaluation and Assessment VCU Center for Interprofessional Education and Collaborative Care [email protected]


  • An Interprofessional Multi-Source Feedback Tool for Early

    Learners: Lessons Learned from a Pilot Study

    Josette Rivera, MD Associate Professor of Medicine

    Department of Medicine University of California, San Francisco

  • The UCSF Bridges Curriculum

    To prepare the 21st century physician to work collaboratively in promoting health and reducing suffering while continually improving our health care system.

    Guiding Principles
    • Integration into Interprofessional Collaborative Care
    • Immersion in Authentic Workplace Learning
    • Engagement with Health Care Delivery Systems

    Campus-wide 5-quarter IPE curriculum; SOM additional IPE coursework


  • The Clinical Microsystem Clerkship (CMC)

    Who
    • Medical students (150): first 18 months
    • Faculty coaches (25): each assigned 6 students; teach direct patient care skills, shepherd students through their microsystems projects
    • Staff: members of the healthcare team and other staff in the microsystem

    What
    • Students embedded in a clinical microsystem one day/week throughout SF
    • Quality improvement, direct patient care skills, interprofessional collaboration
    • Systems-based project in the microsystem

  • An interprofessional collaboration (IPC) assessment tool for CMC and beyond…

    Opportunity: assess relevant student behaviors in clinical settings

    Goal: formative assessment at multiple time points; emphasize importance of IPC to students and coaches

    Considerations: # of items, narrative vs. rating scales, #/timing of assessments, # of raters, training of raters

    Tensions: raters in diverse, hectic clinical environments; infeasible to train; rating scales cause reluctance to "judge", so avoid them

    Gap: a brief, narrative, formative assessment that is not setting- or specialty-specific

  • The Multi-source Feedback Tool

    Tool
    • Consider both observations & direct interactions with the student
    • Identify their profession
    • Questions:
      ‒ What does X do well to collaborate with other health professionals?
      ‒ What could X improve to collaborate more effectively with other health professionals?
      ‒ Did you work with this student: on their systems improvement project? To provide patient care? Both?
      ‒ Please estimate the number of days you interacted with this student

    Process
    • Students identify and request feedback from one non-MD in their microsystem
    • Request feedback in fall and spring; not required to have same raters
    • Coaches review feedback with individual students, develop individualized plan/goals

  • Compliance Fall 2016: 152 students

    • 111 completed (73%)

    • 41 missing

    Spring 2017: 152 students

    • 106 completed (70%)

    • 46 missing

    9/152 (6%) never received feedback → written reflection on why:

    • Projects only involved physicians
    • Setting "chaotic" with rotating team members, non-overlapping schedules, only brief interactions

  • Fall 2016 Results • 98/111 (88%) raters had 1-8 days of interaction with the student

    • Majority positive; only 18 students received constructive feedback

    "…desire to understand the job descriptions/responsibilities of each health care professional and is open to seeing how every member of the health care team can contribute…never hesitates to reach out to different health care providers for their perspectives or recommendations"

    "Good at sharing his ideas and asking for feedback…explains his thought processes yet accepts constructive criticism and uses it to improve"

  • Spring 2017 Results
    • 86/106 (81%) raters had 1-8 days of interaction with the student; 11/106 (10%) had >13 days
    • Many more students (35) received constructive feedback

    "…become more vocal and express your ideas and thoughts"

    "…improve upon finding your own unique professional voice and speaking up more in larger interdisciplinary meetings"

    "Don't be afraid to make a mistake. You don't need to be perfect and you don't need to know everything. Keep asking questions and continue to put yourself out there."

  • Lessons Learned

    Feasible: optimistic that compliance can increase, given lessons learned

    Useful: thoughtful & fairly specific feedback without rater training

    Messaging as important as the tool itself
    • Emails to students, coaches, and staff explaining purpose of tool
    • Emphasize its formative purpose for student development

    First iteration in fall was too early: insufficient contact with others

    This year → earlier messaging to new coaches, push timeline back

  • Implications for Program Evaluation

    Although it is an explicit course expectation, some students are not interacting with other professionals in their microsystem

    Qualitative study with student interviews:
    • Outside of projects and assignments, IP interactions did not happen spontaneously or 'organically'
    • Students reluctant to 'disturb' busy staff

    Coach discussions reveal buy-in and guidance needed

    Interview microsystem stakeholders to determine what/how students contributed to the microsystem

  • Next Steps

    Include more structured activities early on to facilitate IP interactions (more team member interviews/shadowing opportunities)

    Faculty development for current and future coaches
    • Tips for coaches, by coaches (e.g., select projects that facilitate IP interactions)
    • Workshops on IPE and the hidden curriculum
    • Setting expectations: assist students with compliance, review feedback

    Expand use of multi-source feedback tool to select clerkships in 3rd/4th years
    • Demonstrate trajectory
    • Consistent assessment, continued emphasis on importance of IPC in all 4 years

  • Questions? [email protected]

  • Multi-level Evaluation of IPE in the Workplace Individuals, Teams, Systems

    Bridget O’Brien, PhD Center for Faculty Educators & Dept of Medicine, University of California, San Francisco Center of Excellence in Primary Care Education, San Francisco VAMC

  • San Francisco VA EdPACT Program The Workplace • Primary Care clinics for Veterans • IP Team-based model of care • 3 clinic sites • ~18,000 patients

    The Educational Program • Integrated with the clinics • Learners part of clinic teams • Curriculum emphasizes communication, relationships, & systems

    [Figure: PACT team diagram with the patient at the center, surrounded by the IM resident, NP student, RN, LVN, MSA, Pharm, Social Work, and Mental Health]

  • IPE: An Individual and Collective Enterprise

    EdPACT evaluation must address the goals and objectives of…
    • Each professional program: individual competence in communication, interpersonal, and teamwork skills
    • VA primary care clinic: collective competence in team-based care

  • Individual Performance in a Team

    Evaluation Question
    • Does EdPACT help learners achieve competence in team-based care?
      ‒ Communication, collaboration, empowering team members, situational awareness

    Multi-source Feedback
    • Formative assessment to individual learners
    • Aggregate data for program evaluation

    Instrument
    • Modified version of the American Board of Internal Medicine multi-source feedback tool (TEAM)¹
    • Covers domains such as giving & receiving feedback; respect; empowering others

    Process
    • All team members rate each learner on the team
    • Learners self-assess using the same survey
    • Learners receive a report of their results and review with a faculty member

    ¹Chesluk BJ et al, 2012 Health Affairs; Chesluk BJ et al, 2015 J Cont Ed Health Prof

  • Individual Performance in a Team: Example

  • Program Evaluation: Individual Performance in a Team

    Does EdPACT help learners achieve competence in team-based care?
    • In 2017, 36 learners were rated by 63 team members (~80% response rate)
    • Learners rate their own performance highest in empowering others and relational domains (respect, acknowledging others' contributions), lowest in seeking and giving feedback
    • Team members generally rate learners' performance higher than learners rate themselves; similar patterns of highest and lowest performance – seeking and giving feedback lowest

    Implications
    • Broadly focus on feedback
    • Customized learning goals for each learner
      ‒ Communication with team
      ‒ Self-awareness
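
    As a minimal sketch of the self-vs-team comparison described above, the aggregation can be expressed in a few lines of Python. The domain names and 1-5 scale below are assumptions for illustration; the TEAM instrument's actual items and scale may differ.

    ```python
    # Hypothetical sketch: compare mean team rating vs. self-rating per domain
    # for one learner. Domains and the 1-5 scale are invented for illustration.
    team_ratings = {
        "respect":           [5, 4, 5, 4],
        "empowering others": [4, 4, 5, 5],
        "giving feedback":   [3, 4, 3, 4],
        "seeking feedback":  [3, 3, 4, 3],
    }
    self_ratings = {"respect": 4, "empowering others": 4,
                    "giving feedback": 2, "seeking feedback": 2}

    for domain, ratings in team_ratings.items():
        team_mean = sum(ratings) / len(ratings)
        gap = team_mean - self_ratings[domain]
        print(f"{domain:18s} team {team_mean:.1f} | self {self_ratings[domain]} | gap {gap:+.1f}")
    ```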

  • Interprofessional Teamwork

    Evaluation Question
    • To what extent are teams achieving cohesiveness, effective communication, role clarity, clear goals?

    Team Development
    • Formative feedback for teams to review and use for improvement
    • Program-level feedback to address culture, coaching, etc.

    Instrument
    • Team Development Measure¹

    Process
    • Administered twice per year
    • Teams review reports and have opportunities to discuss

    ¹Stock et al, 2013; survey available at: http://www.peacehealth.org/about-peacehealth/medical-professionals/eugene-springfield-cottage-grove/team-measure/Pages/measure

  • Interprofessional Teamwork: Example

  • Program Evaluation: Interprofessional Teamwork

    To what extent are teams achieving cohesiveness, effective communication, role clarity, clear goals?
    • 57 teams over the past 6 years, 8-10 per year
    • 194 out of 247 (79%) individual responses; response rate within team: 50% to 100%
    • TDM scores range from 55 to 89
      ‒ Generally increase from fall to spring
      ‒ Spring scores were lowest in 2012 (64.9), peaked in 2013 (70.8), gradually fell 2014-17 (67.8)
    • Scores consistently highest in communication, then cohesiveness, role clarity, goal clarity
      ‒ Huddle coaches & trainees rated goal clarity lower, on average, than other team members
    • Teams with higher TDM scores tend to have a more dispersed distribution of responses

    Implications
    • Consider workplace factors – staff turnover, shortages
    • Additional team-building activities
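
    The program-level numbers above come from aggregating individual TDM responses by team. A minimal sketch of that aggregation, with invented team rosters and scores (the program's real data handling is not shown in the slides):

    ```python
    # Sketch of per-team aggregation behind the TDM program-evaluation numbers:
    # mean TDM score and within-team response rate. All data here are invented.
    from statistics import mean

    # team name -> (members surveyed, TDM scores actually returned)
    teams = {
        "Team A": (6, [62, 70, 68, 75]),
        "Team B": (5, [55, 60, 58, 61, 57]),
        "Team C": (8, [80, 85, 89, 78]),
    }

    for name, (n_members, scores) in teams.items():
        rate = len(scores) / n_members
        print(f"{name}: mean TDM = {mean(scores):.1f}, response rate = {rate:.0%}")
    ```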

  • A Systems-Based Assessment Framework¹ for IPE?

    Level 3: System readiness for future change
    ‒ What changes can we anticipate, and do our teams and metrics align with these changes?

    Level 2: Program performance
    ‒ Is the program & clinic meeting current expectations?

    Level 1: Individual curriculum components & stakeholders
    ‒ Are learners & teams meeting expectations?

    ¹Bowe C, Armstrong E. Assessment for systems learning. Acad Med. 2017; 92: 585-592

  • Acknowledgements

    Support for the SFVA Center of Excellence for Primary Care Education is provided by the Veterans Affairs Office of Academic Affiliations.

    Many thanks to the team members who complete the survey each year and to our data manager, Gillian Earnest, who collects and compiles these data.

    THANK YOU!


  • Improving the Evaluation of Interprofessional Education: Moving Beyond Attitudinal Measures

    Patricia (Patty) Carney, PhD Professor of Family Medicine Oregon Health & Science University

  • PACER (Professionals Accelerating Clinical & Educational Redesign)

    3-Year Quasi-Experimental Study Designed to Improve IPE and Co-Learning Among the 3 Primary Care Disciplines: Family Medicine, General Internal Medicine, General Pediatrics

    Funded by the Foundations of the ABFM, ABIM, ABP, Josiah Macy Jr. Foundation, ACGME

  • PACER MAP of Sites – 27 continuity clinics at 9 institutions across the U.S.

  • Learners included in PACER - FM, IM, Peds Residents & Faculty - PA Students - NP Students - MA Students - Behavioral Health Students - Pharmacy Students

  • Evaluation Design

    • Mixed methods approach (qualitative/quantitative)
    • Design is pre-post test repeated measures (quasi-experimental)

    No Recreational Data Collection!!!

  • Core Data Collection 'Instruments'

    Quantitative
    • PACER Training Post Program Survey
    • Attendance logs
    • PCMH Monitor
    • Continuity Clinic Survey
    • Educational Program Survey
    • Faculty Skills Self-Assessment Survey

    Qualitative
    • Key informant interviews (telephone)
    • Focus groups (at site visits)
    • Direct observations (collected via field notes at site visits and all joint activities)

  • Instrument Domains and # of Variables

    Post Program Survey (43 variables)
    • Session quality • Overall usefulness • Intention to make changes at institution

    Patient-Centered Medical Home Monitor – validated, 7 domains (39 variables)
    • Leadership and engagement • QI team process • Data capacity • Patient self-management support • Team-based care • Cost containment and care management • Access and continuity

    Continuity Clinic Survey (15 variables)
    • Practice characteristics • Patient population

    Educational Program Survey (40 variables)
    • Learning community activities • Coaching activities/assessment • Practice re-design efforts • Training re-design efforts • Sustainability activities • Sustainability and dissemination

    Faculty Skills Self-Assessment Survey (25 variables)
    • Interprofessional care and education • Leadership for change • Patient-centered care • Stewardship of resources • Competency assessment skills • Individual characteristics

  • 7. TEAM-BASED CARE (each item rated 0-10, from "No, not at all" to "Yes, completely")

    a. Care teams designated with regular team meetings
    b. Team members have defined roles that make optimal use of their training and skill sets
    c. Protocols and standing orders implemented to better distribute workload throughout the team
    d. Training is provided for team members taking on new roles and responsibilities
    e. Cross training developed and role barriers removed to improve response to patient needs
    f. Team huddles used to discuss patient load for the day and to plan for patient visits
    g. Practice teams use proactive communication for planned, between-visit patient interactions
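
    The slides do not specify how the Monitor's 0-10 items roll up into a domain score; here is a plausible sketch assuming a simple mean of the seven items, with invented responses:

    ```python
    # Hypothetical scoring sketch for the Team-Based Care domain: average of
    # its seven 0-10 items. The PCMH Monitor's actual scoring rule is not
    # given in the slides, so this mean is an assumption for illustration.
    from statistics import mean

    team_based_care = {                    # invented responses on the
        "a_team_meetings": 7,              # 0 ("No, not at all") to
        "b_defined_roles": 6,              # 10 ("Yes, completely") scale
        "c_protocols_standing_orders": 5,
        "d_training_for_new_roles": 4,
        "e_cross_training": 3,
        "f_team_huddles": 8,
        "g_proactive_communication": 6,
    }

    print(f"Team-Based Care domain score: {mean(team_based_care.values()):.1f} / 10")
    ```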

  • Longitudinal Coaching

    • Key component of this project
    • Coaches will attend the collaborative site visits
    • Coaching calls planned quarterly
    • Collecting data as part of these calls and during the collaborative site visits

  • Collaborative Site Visits

    • Provide a "check-in" opportunity
    • Designed as a "Learning Community", NOT an "Audit"
    • Formal progress report produced using all contacts and used for evaluation

  • 4/16-

    5/16

    6/16-

    7/16

    8/16-

    9/16

    10/16-

    11/16

    12/16-

    1/17

    2/17-

    3/17

    4/17-

    5/17

    6/17-

    7/17

    8/17-

    9/17

    10/17-

    11/18

    12/17-

    1/18

    2/18-

    3/18

    4/18-

    5/18

    PACER Team Training

    April

    May

    PCMH Monitor

    April

    April

    Continuity Clinic Survey

    April

    April

    Process Evaluations:

    · Attendance Log

    · Training Meeting Evaluations:

    · Quality of Instruction

    · Small Group Interactive Sessions

    April

    May

    Longitudinal Coaching Calls (Quarterly)

    May

    Collaborative Site Visits

    Focus Groups

    Key Informant Interviews (during Collaborative Site Visit and every six months afterward)

    Educational Program Survey

    May

    April

    Faculty Skills Self-Assessment Survey

    April

    PACER Data Collection Schedule

    Legend:

    In person

    Web-based

    Paper-based

    Phone

  • First paper fully drafted: The Educational Value of Interprofessional Learners Applying their Developing Team-based Care Skills in Primary Care Ambulatory Training Settings

    Benefits of Co-learning:
    • Development of personal relationships
    • Improved education
    • Improved patient care
    • Improved job satisfaction

    Enablers of Perceived Benefits:
    • Clinic culture: open leadership empowers TBC
    • Systemic factors: clinic layout that fostered TBC (e.g., places to huddle and co-work)

    Barriers to Realizing Perceived Benefits:
    • Clinic culture: top-down leadership stymies TBC
    • Systemic factors: clinic layout that inhibited TBC (e.g., no co-learning space)

  • We are having more fun than we’ve ever had in our lives…

  • Use of Q methodology to evaluate rural IPE

    Curt Stilp, EdD, PA-C Director, Oregon Area Health Education Center (AHEC)

    Assistant Professor, Physician Assistant Program Oregon Health & Science University

    Portland, Oregon

  • Rural IPE Study Purpose

    • Examine how rural IPE influences student perspectives
    • Search for shared perspectives on:
      ‒ Rural life
      ‒ Rural team-based care
    • What factors form the perspective?

  • Q Method Research Question

    What factors do students participating in a rural IPE experience consider most important and least important in making a decision to practice team-based care in a rural setting?

  • Q Methodology (Stephenson, 1953; Brown, 1993)

    Concourse (rural, team-based care, rural IPE), drawn from:
    • Literature
    • Previous students' rural IPE journals

    Q set: a set of statements representing thoughts, ideas, and opinions on the topic

  • Q Set: 35 total statements

    • 17 from student journals
    • 18 from the literature
    • 4-5 from each category

    Example statements:

    Statement                                                                         Category    Source
    Time and sustained presence in a community helped build trust and familiarity.   Social      Henry & Hooker, 2007
    Working together in the clinic serves as great "peer" support that is needed.    Team        Student reflection journal
    Rural communities have limited funds which restrict what care can be provided.   Community   Student reflection journal
    IPE leads to a greater understanding of my own role on the health care team.     Team        Ponzer et al., 2004
    The availability of outdoor activities attracts me to the rural setting.         Personal    Student reflection journal
    The most effective rural IPE allows for engagement in the community.             Education   Deutchman et al., 2012

  • Q Sort

    • Post-experience sorting of statements using FlashQ® software
    • Sorting grid: most agree to most disagree
    • Subjective ordering of statements in relation to each other and the experience
    • Reveals perspectives

  • Q Sort

  • Data Collection

    • Rural IPE students
    • Two rural locations
    • Four health care professions: MD, PA, Dental, Pharmacy
    • 15 from each location
    • 45 participants

  • Q Sort Demographics

    • Late twenties (mean 28.5)
    • Majority PA students (24)
    • Near-even split of female and male (22 and 23)
    • Majority married/partnered without children (24)
    • Majority not from a rural background (27)
    • Majority were in Coos Bay (28)

  • Q Sort Interpretation – Factor Analysis

    Four-step process:
    1. Correlation
    2. Rotation
    3. Factor scores
    4. Factor array

    The result is a representative Q sort of participants who sorted similarly – a shared perspective
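
    To make the four steps concrete, here is a minimal Python sketch of the pipeline on invented data. It is not the study's analysis (the study used FlashQ output and rotated factors); as a simplification it uses random sorts and unrotated principal components, just to illustrate the correlate, extract, score, and array steps.

    ```python
    # Sketch of the four-step Q analysis under simplified assumptions: random
    # Q sorts stand in for real data, and unrotated principal components stand
    # in for the rotated factors a real Q study would use (e.g., varimax).
    import numpy as np

    rng = np.random.default_rng(0)
    n_statements, n_participants = 35, 45
    sorts = rng.integers(-4, 5, size=(n_statements, n_participants))  # invented Q sorts

    # 1. Correlation: Q correlates people, not variables, so correlate columns.
    corr = np.corrcoef(sorts.T)

    # 2. Extraction (rotation omitted here): top-3 principal components of the
    #    person-by-person correlation matrix give factor loadings per participant.
    eigvals, eigvecs = np.linalg.eigh(corr)        # ascending order
    loadings = eigvecs[:, -3:][:, ::-1] * np.sqrt(eigvals[-3:][::-1])

    # 3. Factor scores: z-scored weighted averages of the sorts on each factor.
    weights = loadings / (1 - loadings**2 + 1e-9)
    scores = sorts @ weights
    scores = (scores - scores.mean(axis=0)) / scores.std(axis=0)

    # 4. Factor array: map each factor's z-scores back onto the -4..+4 grid,
    #    yielding one representative "idealized" Q sort per shared perspective.
    factor_array = np.round(scores / np.abs(scores).max(axis=0) * 4).astype(int)
    print(factor_array[:5])  # grid positions of the first five statements
    ```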

  • Q Sort Interpretation – Factor Analysis

    Three Perspectives

    1. The Team-Oriented Rural Optimist
       + Team, age, profession, length of time
       − Living together, resources, project, community connection

    2. The Independent Rural Impartial
       + Isolation from family/friends, natural resources, length of time
       − Project, community engagement, team-based approach

    3. The Team-Willing Rural Skeptic
       + First-hand experience, no children, location, length of time
       − Recreation, profession, clinical environment, community

  • Study Conclusions

    General
    • Rural IPE is useful for understanding rural life and making decisions about where to live and work after graduation
    • Rural IPE needs to have a sustained presence
    • Connection between community involvement and decision to return

    Specific
    • Rural IPE motivates students to return to practice team-based care
    • Feelings of isolation/remoteness deter a student from returning
    • Rural heritage leads to increased likelihood a student will return

  • Limitations

    • Only 4 health care professions
    • No correlation between qualitative and quantitative data
    • No educational sequence data
    • Bias in Q statement development

  • Thank you!

  • Questions & Discussion

    Alan Dow, MD, MSHA
    Assistant Vice President of Health Sciences for Interprofessional Education & Collaborative Care
    Virginia Commonwealth University
