
Testing the NIRN Implementation Model and the

CFIR Implementation Constructs: Lessons Learned from a Multi-Site Implementation of

Motivational Interviewing

Melanie Barwick, Ph.D., C.Psych Associate Scientist, Learning Institute

Scientific Director Knowledge Translation, Research Institute The Hospital for Sick Children

Associate Professor, Psychiatry & Dalla Lana School of Public Health University of Toronto, Canada

Raluca Barac, PhD

Melissa Kimber, PhD(c)

Sabine Johnson, MA Clinical Research Project Managers

The Hospital for Sick Children and

The CIHR Emerging Team in Knowledge Translation for Child and Youth Mental Health

26th Annual Children’s Mental Health Research and Policy Conference, Tampa FL, March 4th, 2013

© Melanie Barwick, 2013

Research Team
Melanie Barwick, SickKids / U Toronto
Charles E. Cunningham, McMaster
Rosemary Tannock, SickKids / OISE
Rhonda Martinussen, OISE
Peter Chaban, SickKids
Kathryn Bennett, McMaster
Don Buchanan, Hamilton Wentworth District School Board
Bruce Ferguson, SickKids
Dean Fergusson, Ottawa Health Research Institute

Institutional Partners
Children’s Mental Health Ontario
Ontario Ministry of Children and Youth Services
Ontario Ministry of Education
Ontario Centre of Excellence for Child and Youth Mental Health

Community-Based Research Partners
Associated Youth Services of Peel
Lynwood Hall
Child Development Institute
Craigwood Youth Services

Funded by an Emerging Team Grant from

Canadian Institutes of Health Research


Collaborators


The problem

Schools and Child and Youth Mental Health (CYMH) provider organizations often fail to adopt evidence-based approaches.

Implementation of evidence-based practices is of growing concern to communities, policy makers, and researchers.

How can we do this more effectively and efficiently?


Research objectives

1) To test the National Implementation Research Network (NIRN) model and the factors identified in the Consolidated Framework for Implementation Research (CFIR) in the Canadian mental health context in order to inform the knowledge base on successful implementation of evidence-based practices.

2) To examine the experience of clinicians and supervisors involved in the process of implementing Motivational Interviewing in four child and youth mental health provider organizations.

3) To inform change at the practice and system levels and refine existing theoretical and meta-theoretical approaches to implementation.


Abstract

Implementation of evidence-based practices in mental health care is essential for improving health outcomes.

Together, the NIRN1 model and CFIR2 constructs provide a comprehensive approach to guide the implementation process, but both frameworks require further empirical investigation.

To this end, we implemented Motivational Interviewing in four child mental health organizations in Canada, using NIRN as a guide and measuring key CFIR constructs.

These findings have significant implications for implementation, theory, research and practice.

1 Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. NIRN Monograph.

2 Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4:50.

BACKGROUND


Implementation Science – Learning how to bring evidence to practice

'Implementation Research is the scientific study of methods to promote the systematic uptake of [clinical] research findings and other evidence-based practices into routine practice, and hence to improve the quality (effectiveness, reliability, safety, appropriateness, equity, efficiency) of health care or well-being. It includes the study of influences on healthcare professional and organizational behaviour.'

Source: Adapted from Eccles, M. P., et al. (2009). An implementation research agenda. Implementation Science, 4:18.


Implementation Science

                 IMPLEMENTATION
                 Effective          Not Effective
INTERVENTION
Effective        ACTUAL BENEFITS    Inconsistent; non-sustainable; poor outcomes
Not Effective    Poor outcomes      Poor outcomes; sometimes harmful

Sources: Fixsen, Blase, Timbers & Wolf (2007). In search of program implementation: 792 replications of the Teaching-Family Model. The Behavior Analyst Today, 8(1), 96-110; Balas, E. A., & Boren, S. A. (2000). In: Yearbook of Medical Informatics 2000: Patient-Centered Systems. Stuttgart: Schattauer, 65-70; Dean Fixsen [Institute of Medicine 2000, 2001, 2009; New Freedom Commission on Mental Health, 2003; National Commission on Excellence in Education, 1983; Dept. of Health and Human Services, 1999].

Intervention effectiveness:

o With an Implementation Team (application of implementation science and practice): 80% within 3 years (Fixsen et al., 2001)
o With no Implementation Team (diffusion and dissemination): 14% after 17 years (Balas & Boren, 2000)


Why implementation is important


NIRN Model

Source: http://nirn.fpg.unc.edu/learn-implementation/implementation-drivers


Staff Development OUTCOMES (% of participants who DEMONSTRATE KNOWLEDGE, DEMONSTRATE NEW SKILLS in a practice setting, and USE NEW SKILLS in the classroom)

Training Components                      Knowledge   Skill Demonstration   Use in the Classroom
Theory & Discussion                        10%            0%                    0%
+ Demonstration in Training                30%           20%                    0%
+ Practice & Feedback in Training          60%           60%                    5%
+ Coaching in the Classroom                95%           95%                   95%

Joyce, B., & Showers, B. (2002). Student achievement through staff development. Alexandria, VA: Association for Supervision and Curriculum Development.


Key CFIR Factors for Implementation

Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Implement Sci. 2009 Aug 7;4:50.

[Figure: the five CFIR domains (Intervention Characteristics, Outer Setting, Inner Setting, Characteristics of the Individual, and Process) surrounding the implementation effort.]


Key CFIR Factors for Implementation

Process
o Planning
o Engaging
o Executing
o Reflecting & Evaluating

Intervention Characteristics
o Intervention Source
o Evidence Strength & Quality
o Relative Advantage
o Adaptability
o Trialability
o Complexity
o Design Quality & Packaging
o Cost

Outer Setting
o Patient Needs & Resources
o Cosmopolitanism
o Peer Pressure
o External Policies & Incentives

Inner Setting
o Structural Characteristics
o Networks and Communications
o Culture
o Implementation Climate
o Readiness for Implementation

Characteristics of Individuals
o Knowledge & Beliefs about the Intervention
o Self-Efficacy
o Individual Stage of Change
o Individual Identification with Organization
o Other Personal Attributes

Source: Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4:50.


Our Research

Question: How to implement EBPs?

Sector       Organizations              EBP
CYMH         Org1, Org2, Org3, Org4     Motivational Interviewing
Education    Sch1                       TeachADHD


Our Studies (and outputs to date)

Establishing the evidence base for MI training – systematic review

Focus Groups with educators and CYMH practitioners

Discrete conjoint experiments & consumer preference modeling with educators & practitioners

EBP Implementation case studies (mixed methods)

Barwick, M. A., Bennett, L. M., Johnson, S. N., McGowan, J., & Moore, J. E. (2012). Training health and mental health professionals in motivational interviewing: A systematic review. Children and Youth Services Review, 34, 1786-1795.

Barwick, M. A., Bennett, L. M., Johnson, S., Chaban, P., Barac, R., & Hawke, L. Bringing evidence to the classroom: Exploring educator preferences for practice change.

Barwick, M. A., Johnson, S., Bennett-Akrong, L., Hawke, L., Barac, R., & Kimber, M. Bringing evidence to children’s mental health care: Exploring practitioner preferences for practice change.

Cunningham, C. E., Barwick, M. A., Short, K., Chen, Y., Ratcliffe, J., Rimas, H., & Mielko, S. Modeling the mental health practice change preferences of educators: A discrete-choice conjoint experiment. Submitted to Journal of School Psychology, January 2, 2013.


Motivational interviewing

• Motivational interviewing (MI) is a counseling approach that focuses on helping clients explore and resolve ambivalence and centers on motivational processes within the individual that facilitate change.

• Most recently, MI has been defined as a collaborative, person-centered form of guiding to elicit and strengthen motivation for change (Miller & Rollnick, 2009).

• A large body of work studying the effectiveness of MI has led to several systematic reviews supporting its effectiveness for a range of client outcomes (Rubak et al., 2005; Heckman, Egleston, & Hofmann, 2010).

METHODS


Methods

We implemented Motivational Interviewing in four child and youth mental health provider organizations in Ontario, Canada.

The implementation approach was guided by the NIRN implementation model, with key CFIR constructs measured throughout.

Across the four organizations, 24 clinicians (18 submitted tapes; full data for 12) and 10 supervisors received a 2-day training in Motivational Interviewing and monthly coaching sessions for 7 months.

Participating clinicians (n = 24)

Role: child and youth worker (n = 13); social worker (n = 4); youth counsellor (n = 2); other (youth outreach worker, child and family clinician, etc.) (n = 5)

Education: college diploma (n = 14); undergraduate degree (n = 4); Master’s degree (n = 5); child study diploma (n = 1)

Age: 18-24 years (n = 4); 25-34 years (n = 5); 35-44 years (n = 9); 45-54 years (n = 3); 55+ years (n = 3)

Work experience: 2-8 years (n = 7); 10-16 years (n = 11); 25-35 years (n = 4)


Procedures / Timeline

Tools/Measures (each administered pre-implementation, post-implementation, and/or repeated monthly):

o Checklist to Assess Readiness for Implementation (CARI)
o PACE Curriculum
o Evidence-Based Practice Attitude Scale (EBPAS)
o Brief Individual Readiness for Change Scale (BIRCS)
o Personal Efficacy Scale (PES)
o Organizational Readiness for Change (ORC)
o MI Fidelity Measures: BECCI
o Notes from coaching calls & implementation team calls
o Exit Focus Groups


Our innovation: Implementation Curriculum

We developed an e-learning curriculum for the Ontario Centre of Excellence for Child and Youth Mental Health to support capacity building among child and youth mental health providers in the implementation of evidence-based practices.

Available for public access as of March 30, 2013

Contact: Mark MacAulay, Manager, Implementation Support Program | Ontario Centre of Excellence for Child and Youth Mental Health | T: (613) 737-2297 x2895 | F: (613) 738-4894 | E: [email protected] | www.excellenceforchildandyouth.ca

RESULTS


First – a caveat:

What we wish we could tell you, but we can’t:

Too few subjects (clinicians) to conduct predictive modeling linking CFIR constructs to implementation outcomes (IO: fidelity, BECCI scores)


Quantitative Data

a) pre- and post-implementation questionnaires on readiness for change (organizational, individual), self-efficacy, attitudes toward EBPs, and organizational culture

b) monthly checks of fidelity to Motivational Interviewing based on coding of audiotaped therapy sessions = measure of implementation outcome (fidelity)

Clinicians taped monthly sessions three months before the training (n = 3 tapes), throughout the period when they received training and coaching (n = 9 tapes) and three months post-coaching (n = 3 tapes).

All therapy sessions were scored for clinicians’ fidelity to Motivational Interviewing using the Behaviour Change Counselling Index (BECCI; Lane et al., 2005); 20% were also scored with the Motivational Interviewing Treatment Integrity scale (MITI; Moyers et al.).


Qualitative Data

a) focus groups with clinicians, supervisors, and members of the provider organizations’ implementation teams, conducted following the completion of coaching sessions in Motivational Interviewing

b) process notes from the monthly coaching calls with supervisors and clinicians as well as the monthly calls with implementation teams from the four organizations.

Qualitative analyses were coded both inductively (interpretive description based on transcripts) and deductively (based on the CFIR and NIRN frameworks).


Quantitative Results

1) Repeated measures analyses showed a shift in clinicians’ practice and attitudes towards EBP.

2) Results showed increased clinician Motivational Interviewing adherence and competence over time = evidence of a successful implementation outcome (fidelity). Of the 12 clinicians with pre- and post-implementation data, 10 (83%) showed increased fidelity.

These findings suggest that the present implementation approach (training + coaching + fidelity monitoring + feedback) is successful in producing practice change.


Qualitative Findings

In addition, qualitative analyses revealed important aspects of the implementation process such as:

the role and format of the implementation team

the nature of the research facilitation

the necessity of the pre-implementation phase

the characteristics of the selected EBP (i.e., whether it is a manualized approach or a stand-alone therapeutic element)

the existence & feasibility of practice fidelity measures for use in the field

impact on the clinicians’ practice and organizational culture

These findings bring important refinements to the NIRN and CFIR frameworks and contribute to the knowledge base informing successful implementation of EBPs in practice.


Analyses we haven’t gotten to yet…

Questions we will be able to answer with the data we have, despite limitations:

1) How do practitioners’ characteristics (readiness, self-efficacy) relate to implementation outcomes (MI fidelity measured with BECCI)?

2) How did the clinicians and supervisors experience coaching in MI and the overall process of MI implementation?

3) What CFIR constructs distinguish between high-fidelity clinicians and low-fidelity clinicians?

4) How did organizations differ with respect to NIRN implementation drivers?

Single Most Important Things (SMITs), AKA what we learned


NIRN Model

Source: http://nirn.fpg.unc.edu/learn-implementation/implementation-drivers


SMITS - NIRN Implementation Drivers: 1) Selection:

o Have to contend with many HR shifts; people move around, and this has implications for implementation and for implementation research

o Try to train people who work closely together; they will support each other

o Select people who are ‘ready’ to change

o Select people who are relatively ‘stable’ on the job

o Plan for sustainability & build in practices along the way (e.g., reflection, supervision)

2) Training:

o Build in 4-6 months of preparation time BEFORE training; you need time to get the practitioners and implementation teams oriented to implementation

o Interactive training with role play is best

o Plan for sustainability

3) Coaching:

o Build continuity between your trainers and coaches

o Schedule coaching sessions ahead of time (monthly, over 7 months)

o Share tapes (fidelity measures) with coaches, if possible

o Work with coaches (ahead of the implementation) to devise a plan for responding to clinicians who demonstrate inappropriate or problematic practice


SMITS - NIRN Implementation Drivers: 4) Managerial Support:

o Training should include supervisors

o Make sure the implementation team understands its role & function

5) Facilitative Administration:

o Clinicians will need time to absorb new learning; this has implications for caseload

o Consider how many EBPs you’re bringing in at one time; implications for individual learning and absorptive capacity

o Ensure fidelity measure is practical and feasible for the setting

o Pay attention to new clinical roles, e.g., Peer Coach, Practice Lead

o Re-examine models of supervision (including how fidelity is reviewed)

6) Data Support Systems:

o Data systems are necessary but not sufficient; ensure there is a business practice for:

o Reviewing fidelity (adherence)

o Reviewing aggregate data for new EBP

o Communicating what you learn from data

7) Leadership:

o Take care to ensure you’re not dealing only with managers and that senior leadership knows what’s going on


SMITS - CFIR Factors for Implementation

Process
o Planning
o Engaging
o Executing
o Reflecting & Evaluating

Intervention Characteristics
o Intervention Source
o Evidence Strength & Quality
o Relative Advantage
o Adaptability
o Trialability
o Complexity
o Design Quality & Packaging
o Cost
o Type of EBP: Element or Manualized

Outer Setting
o Patient Needs & Resources
o Cosmopolitanism
o Peer Pressure
o External Policies & Incentives
o Sector (health, MH, education, global health)

Inner Setting
o Structural Characteristics
o Networks and Communications
o Culture
o Implementation Climate
o Readiness for Implementation
o Model of supervision

Characteristics of Individuals
o Knowledge & Beliefs about the Intervention
o Self-Efficacy
o Individual Stage of Change
o Individual Identification with Organization
o Other Personal Attributes
o Educational Training (CYW vs SW)

Considerations for conducting implementation research


What we wish we knew going in…

1) Prediction (CFIR constructs → IO) requires a decent sample size. This can be difficult to achieve in IS case studies because of the nature of case studies, attrition due to real-world events, and cost

2) Fidelity measures can pose problems if they don’t lend themselves to the real world (not practical for independent use in practice)

3) Measuring fidelity to EBP practice is difficult due to (a) ‘barriers’ in audio and visual recording, and (b) feasibility – not all therapy happens in a quiet office

4) Pre-implementation likely takes 4-6 months (longer than we anticipated); this has implications for research funding time frames

IO = implementation outcome

Melanie Barwick, PhD, CPsych, Associate Scientist

Scientific Director Knowledge Translation The Hospital for Sick Children

Associate Professor, Department of Psychiatry University of Toronto

Email: [email protected]

Web: www.melaniebarwick.com twitter.com/MelanieBarwick

Scientist Knowledge Translation Training course (SKTT) http://tinyurl.com/3uaqob7

Knowledge Translation Professional Certificate (KTPC) http://tinyurl.com/7zrvbq4