Jennifer Coffey, OSEP

Transcript: Regional Meetings – Evidence Based Professional Development

Page 1

Jennifer Coffey, OSEP

Page 2

Regional Meetings – Evidence Based Professional Development
February 3 - Washington, DC - Speaker: Michelle Duda, SISEP
February 8 - New Orleans, LA - Speaker: Melissa VanDyke, SISEP
February 15 - Portland, OR - Speaker: Chris Borgmeier, Oregon PBIS Leadership Network

Innovation Fluency
Date: March 24, 3:00-4:30pm ET
Speaker: Karen Blasé, SISEP

Professional Development for Administrators
Date: April 19, 3:00-4:30pm ET
Speakers: Elaine Mulligan, NIUSI Leadscape & Rich Barbacane, Nat’l Association of Elementary School Principals

Steve Goodman, Michigan SPDG

Using Technology for Professional Development
Date: May 18, 2:00-3:30pm ET
Speaker: Chris Dede, Ph.D., Learning Technologies, Harvard


Page 3

According to the thesaurus of the Educational Resources Information Center (ERIC) database, professional development refers to "activities to enhance professional career growth." Such activities may include individual development, continuing education, and inservice education, as well as curriculum writing, peer collaboration, study groups, and peer coaching or mentoring.

Fullan (1991) expands the definition to include "the sum total of formal and informal learning experiences throughout one's career from preservice teacher education to retirement" (p. 326).

North Central Regional Educational Laboratory (NCREL)

Page 4

"Professional development ... goes beyond the term 'training' with its implications of learning skills, and encompasses a definition that includes formal and informal means of helping teachers not only learn new skills but also develop new insights into pedagogy and their own practice, and explore new or advanced understandings of content and resources. [This] definition of professional development includes support for teachers as they encounter the challenges that come with putting into practice their evolving understandings about the use of technology to support inquiry-based learning.... Current technologies offer resources to meet these challenges and provide teachers with a cluster of supports that help them continue to grow in their professional skills, understandings, and interests.“ – Grant (nd)

Page 5

Evidence-based/best-practice models

• Carol Trivette and Carl Dunst - PALS
• NIRN – Implementation Drivers
• Guskey

Preview of the presentation at the Regional Meeting and the PD Series

SPDG/OSEP Program Measures


Page 6

Julie Morrison, Ohio SPDG Evaluator
• Models of Professional Development

Li Walter and Alan Wood, California SPDG Evaluators

Page 7

“No intervention practice, no matter what its evidence base, is likely to be learned and adopted if the methods and strategies used to teach or train students, practitioners, parents, or others are not themselves effective.”

"Let's Be Pals: An Evidence-based Approach to Professional Development." Dunst & Trivette, 2009

Page 8

• Evidence-Based Intervention Practices – insert your SPDG initiative here

• Evidence-Based Implementation Practices – Professional Development:
Staff Competence: Selection, Training, Coaching, and Performance Assessment Drivers
Adult learning methods/principles
Evaluation


Two Types of Evidence-Based Practices

Page 9

“Adult learning refers to a collection of theories, methods, and approaches for describing the characteristics of and conditions under which the process of learning is optimized.”

Page 10

Research synthesis of 79 studies of accelerated learning, coaching, guided design, and just-in-time training

58 randomized control design studies and 21 comparison group studies

3,152 experimental group participants and 2,988 control or comparison group participants

Combination of studies in college and noncollege settings

Learner outcomes included learner knowledge, skills, attitudes, and self-efficacy beliefs

Weighted average Cohen’s d effect sizes for the posttest differences between the intervention and nonintervention or comparison groups were used for assessing the impact of the adult learning methods.

Trivette, C.M. et al. (2009). Characteristics and consequences of adult learning methods and strategies. Winterberry Research Syntheses, Vol. 2, Number 1.
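As a rough illustration of how a weighted average effect size of this kind can be computed (the slide does not spell out the synthesis' weighting scheme, so the inverse-variance weights and all study-level numbers below are assumptions for illustration only), a minimal Python sketch:

import math

def cohens_d(mean_tx, mean_ctrl, sd_tx, sd_ctrl, n_tx, n_ctrl):
    # Standardized mean difference between intervention and comparison posttest scores
    pooled_sd = math.sqrt(((n_tx - 1) * sd_tx ** 2 + (n_ctrl - 1) * sd_ctrl ** 2)
                          / (n_tx + n_ctrl - 2))
    return (mean_tx - mean_ctrl) / pooled_sd

def weighted_mean_d(d_values, n_tx_list, n_ctrl_list):
    # Inverse-variance weighted average of per-study d values, with a 95% CI
    weights, weighted = [], []
    for d, n1, n2 in zip(d_values, n_tx_list, n_ctrl_list):
        var_d = (n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2))  # approximate variance of d
        w = 1.0 / var_d
        weights.append(w)
        weighted.append(w * d)
    mean_d = sum(weighted) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return mean_d, (mean_d - 1.96 * se, mean_d + 1.96 * se)

# Hypothetical posttest summary statistics for one study (illustration only)
d1 = cohens_d(mean_tx=78.0, mean_ctrl=70.0, sd_tx=12.0, sd_ctrl=11.0, n_tx=25, n_ctrl=25)

# Combine it with two more hypothetical study-level d values
print(weighted_mean_d([d1, 1.1, 0.7], [25, 40, 30], [25, 38, 32]))

Inverse-variance weighting simply gives larger, more precise studies more influence on the average.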


Page 11

To be most effective, adult learning methods need to actively involve the learners in judging the consequences of their learning experiences (evaluation, reflection, & mastery):
• Need learner participation in learning new knowledge or practice
• Need learner engagement in judging his or her experience in learning and using new material

Page 12

Planning
• Introduce – Engage the learner in a preview of the material, knowledge or practice that is the focus of instruction or training
• Illustrate – Demonstrate or illustrate the use or applicability of the material, knowledge or practice for the learner

Application
• Practice – Engage the learner in the use of the material, knowledge or practice
• Evaluate – Engage the learner in a process of evaluating the consequence or outcome of the application of the material, knowledge or practice

Deep Understanding
• Reflection – Engage the learner in self-assessment of his or her acquisition of knowledge and skills as a basis for identifying “next steps” in the learning process
• Mastery – Engage the learner in a process of assessing his or her experience in the context of some conceptual or practical model or framework, or some external set of standards or criteria

Donovan, M. et al. (Eds.) (1999). How people learn. Washington, DC: National Academy Press.


Page 13

Effect Sizes for Introducing Information to Learners

Pre-class exercises: 9 studies, 9 effect sizes, d = 1.02 (95% CI .63 to 1.41)
Out of class activities/self-instruction: 12 studies, 20 effect sizes, d = .76 (95% CI .44 to 1.09)
Classroom/workshop lectures: 26 studies, 108 effect sizes, d = .68 (95% CI .47 to .89)
Dramatic readings: 18 studies, 40 effect sizes, d = .35 (95% CI .13 to .57)
Imagery: 7 studies, 18 effect sizes, d = .34 (95% CI .08 to .59)
Dramatic readings/imagery: 4 studies, 11 effect sizes, d = .15 (95% CI -.33 to .62)

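As an interpretation aid (not part of the original slides): assuming roughly normal outcomes with equal variances, Cohen's U3 converts a d value into the approximate share of trained learners who score above the average comparison-group learner. A minimal Python sketch applying that conversion to a few values from the table above (the same reading applies to the effect-size tables on the following pages):

import math

def u3(d):
    # Cohen's U3: proportion of the intervention group scoring above the
    # comparison-group mean, assuming normal outcomes with equal variances
    return 0.5 * (1.0 + math.erf(d / math.sqrt(2.0)))

# d values taken from the table above (Introducing Information to Learners)
for practice, d in [("Pre-class exercises", 1.02),
                    ("Classroom/workshop lectures", 0.68),
                    ("Dramatic readings", 0.35)]:
    print(f"{practice}: d = {d:.2f}, U3 ≈ {u3(d):.0%}")

For example, d = 1.02 corresponds to roughly 85% of learners in the intervention group scoring above the comparison-group mean.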

Page 14

Effect Sizes for Illustrating/Demonstrating Learning Topic

Using learner input for illustration: 6 studies, 6 effect sizes, d = .89 (95% CI .28 to 1.51)
Role playing/simulations: 20 studies, 64 effect sizes, d = .87 (95% CI .58 to 1.17)
Real life example/real life + role playing: 6 studies, 10 effect sizes, d = .67 (95% CI .27 to 1.07)
Instructional video: 5 studies, 49 effect sizes, d = .33 (95% CI .09 to .59)


Page 15

Effect Sizes for Learner Application

Real life application + role playing: 5 studies, 20 effect sizes, d = 1.10 (95% CI .48 to 1.72)
Problem solving tasks: 16 studies, 29 effect sizes, d = .67 (95% CI .39 to .95)
Real life application: 17 studies, 83 effect sizes, d = .58 (95% CI .35 to .81)
Learning games/writing exercises: 9 studies, 11 effect sizes, d = .55 (95% CI .11 to .99)
Role playing (skits, plays): 11 studies, 35 effect sizes, d = .41 (95% CI .21 to .62)


Page 16

Effect Sizes for Learner Evaluation

Assess strengths/weaknesses: 14 studies, 48 effect sizes, d = .96 (95% CI .67 to 1.26)
Review experience/make changes: 19 studies, 35 effect sizes, d = .60 (95% CI .36 to .83)


Page 17

Effect Sizes for Learner Reflection

Performance improvement: 9 studies, 34 effect sizes, d = 1.07 (95% CI .69 to 1.45)
Journaling/behavior suggestion: 8 studies, 17 effect sizes, d = .75 (95% CI .49 to 1.00)
Group discussion about feedback: 16 studies, 29 effect sizes, d = .67 (95% CI .39 to .95)


Page 18

Effect Sizes for Self-Assessment of Learner Mastery

Standards-based assessment: 13 studies, 44 effect sizes, d = .76 (95% CI .42 to 1.10)
Self-assessment: 16 studies, 29 effect sizes, d = .67 (95% CI .39 to .95)


Page 19

Engaging learners in a process of self-assessment of their performance, using some type of conceptual or operational framework, proved to be the practice that resulted in the largest effects.

“Learners are not likely to become experts without instructors engaging them in a process of evaluating their experiences in the context of some framework, model, or operationally defined performance standards or expectations.”

Page 20

“The more opportunities a learner has to acquire and use new knowledge or practice, the more frequently those opportunities occur, and the more the learner is engaged in reflection on those opportunities using some external set of standards, the greater the likelihood of optimal benefits.”

Page 21

[Figure: chart of mean effect sizes; y-axis: Mean Effect Size (d), scaled 0 to 2; x-axis scaled 0 to 5.]
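The original chart cannot be fully recovered from this residue; as a rough reconstruction sketch (values taken from the effect-size tables on the preceding pages, chart layout assumed), one way to plot the reported mean effect sizes with their 95% confidence intervals in Python:

import matplotlib.pyplot as plt

# Mean effect sizes (d) and 95% CIs drawn from the tables above
practices = ["Pre-class exercises", "Role playing/simulations",
             "Real life application + role playing", "Assess strengths/weaknesses",
             "Performance improvement", "Standards-based assessment"]
means = [1.02, 0.87, 1.10, 0.96, 1.07, 0.76]
ci_low = [0.63, 0.58, 0.48, 0.67, 0.69, 0.42]
ci_high = [1.41, 1.17, 1.72, 1.26, 1.45, 1.10]

# Asymmetric error bars: distance from the mean to each CI bound
yerr = [[m - lo for m, lo in zip(means, ci_low)],
        [hi - m for m, hi in zip(means, ci_high)]]

plt.errorbar(range(len(practices)), means, yerr=yerr, fmt="o", capsize=4)
plt.xticks(range(len(practices)), practices, rotation=30, ha="right")
plt.ylabel("Mean Effect Size (d)")
plt.ylim(0, 2)
plt.tight_layout()
plt.show()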


Page 22

The smaller the number of persons participating in a training (<20), the larger the effect sizes for the study outcomes.

The more hours of training over an extended number of sessions, the better the study outcomes.

The practices are similarly effective when used in different settings with different types of learners.


Page 23

Trainers neither direct learning nor encourage only self-directed learning; rather, they guide learning based on observation of learners’ experiences, evaluation of their use of the practice, and learner self-assessment against standards.

Page 24

[Diagram: the PALS cycle of active learner involvement – PLAN (Introduce and Illustrate) → APPLICATION (Practice and Evaluate) → INFORMED UNDERSTANDING (Reflection and Mastery) → RECYCLE (Identify Next Steps in the Learning Process).]


Page 25

“The use of PALS practices has been found to be associated with improved learner knowledge, use, and mastery of different types of intervention practices.”

Page 26

Trainer and Trainee Roles in the Different Phases of PALS

Introduction
Trainer roles: Preview learning topic; Describe key elements; Provide examples; Include trainee input; Illustrate application; Demonstrate application
Trainee roles: Complete pretraining preview; Pre-class/workshop exercises; Provide input on the learning topic; In-class/workshop warm-up exercises

Application
Trainer roles: Facilitate application; Observe trainee application; Provide in vivo feedback/guidance; Facilitate learner assessment of options
Trainee roles: Provide examples of application; Trainee role playing, games, etc.; Implement/practice use of the subject matter; Evaluate use of the knowledge or practice

Informed Understanding
Trainer roles: Establish learning standards; Engage learners in self-assessment; Provide guidance to learners; Provide behavioral suggestions
Trainee roles: Standards-based evaluation; Conduct self-assessment; Trainer-guided learner reflection; Journaling; Group discussions of understanding

Repeat Learning Process
Trainer roles: Joint planning; Trainer guidance; Trainer/trainee mentoring
Trainee roles: Joint planning; Identify needed information/experiences; Trainer/trainee mentoring
