
Leveraging Competing and Complementary Roles for Success in R & D

Mid-continent Research for Education and Learning

Sheila A. Arens, Helen Apthorp, Zoe Barley, LeAnn M. Gamache

© 2005

2

Questions to Ponder…

What are the bridges evaluators must cross in the R & D world?

What are the roles at play in the R & D continuum?

How can evaluators respond to different (sometimes conflicting) expectations around issues of the validity of evidence?


Creating Coherence with the Language of an R&D Continuum

LeAnn M. Gamache, PhD, McREL


4

The Players in Program Evaluation

Sponsors
Program Developers
Implementers
Evaluators
Participants and Constituents
Researchers
Others


5

Purposes for a Continuum

Enable common language
Help to highlight project priorities
Reveal assumptions
Focus planning discussions within context of the total endeavor


6

Overview of an R&D Continuum

Four Phases for the R&D Endeavor:
Need and Approach
Model and Instrumentation
Development and Pilot-Testing
Broad Dissemination and Implementation


7

Stages within the Four Phases: Need and Approach

A. Need and Approach

Stage 1: Identifying Critical and Enduring Educational Issues, Challenges, and Priorities

Stage 2: Conducting Basic and Applied Research / Identification of Relevant Theories

[Diagram: for each stage, the slide charts Outcomes (identify critical issues, challenges, and priorities; conduct research yielding theories, research findings, and a research base), iterative R, E, and A activities (monitoring, literature review and synthesis, dialog, identifying gaps, anecdotes, targeted studies), and Constituency roles (networking, sharing out, receiving and reacting, communicating the "so what," reflecting on situations).]


8

Stages within the Four Phases: Model and Instrumentation

B. Model and Instrumentation

Stage 3: Translating Theory and Research into Practice(s) and Interventions

Stage 4: Designing, Developing, and Testing Evaluation Products and Tools

[Diagram: outcomes include developed models, programs, and practices, plus evaluation instruments and tools; iterative activities include crafting the model, translating the model into practice, researching/testing practices, designing the evaluation, developing and field-testing instruments, and finalizing instruments; constituencies brainstorm and communicate implications, connections, and ramifications, and participate in field testing and data gathering.]


9

Stages within the Four Phases: Development and Pilot-Testing

C. Development and Pilot-Testing

Stage 5: Building and Evaluating Explicit Procedural Knowledge

Stage 6a: Increasing Knowledge, Products, and Tools Utilization

Stage 6b: Conducting Small-Scale Implementation and Efficacy Research

Stage 6c: Creating Scale-Up and Large-Scale Generalization Research

Stage 7: Establishing and Sustaining High-Performing Schools

[Diagram: in Stages 5 and 6a, components are built and evaluated for knowledge and use in sites (identify sites, secure resources, try out in site, evaluate components singly and in combination; iterate); in Stages 6b, 6c, and 7, evidence of results for the full model is gathered in sites (identify sites, secure resources, implement, monitor implementation, conduct formative and summative evaluation, replicate in different environments); constituencies participate in piloting or implementation and in evaluation.]


10

Stages within the Four Phases: Dissemination and Implementation

D. Broad Dissemination and Implementation

Stage 8: Disseminating Products, Tools, and Procedural Knowledge

Stage 9: Increasing and Sustaining Use of Tools, Products, and Procedural Knowledge

Stage 10: Transforming a Growing Mass of Schools

[Diagram: Stage 8 builds awareness through dissemination of products, tools, and knowledge (briefs/presentations, reports, books, training, toolkits) and monitors dissemination (repeat); Stage 9 sustains use through direct and indirect use/citation in routine work until a critical mass (tipping point) is reached (repeat, monitor); Stage 10 looks for evidence of success and transformation in indicators (monitor, report); constituencies share experiences, suggestions, and evidence of successes and transformation, benefit from peers, maintain commitment, and incorporate the work into ongoing dialog and focus.]


11

Roles and Discussions at Critical Junctures within Stages

During Development
At Implementation
At Evaluation


12

The Context for Evaluation in an R & D World

Zoe A. Barley, McREL


13

Three Phases

Development
Implementation
Production


14

Four Roles:

Conceptualizer
Implementer/Practitioner
Funder
Evaluator


15

Interactions: Roles and Phases

Roles \ Phases    Development   Implementation   Production
Conceptualizer         X              O               O
Implementer            O              X               O
Funder                 O              O               X


16

Variables of Interest

Level of Investment (LOI)
Level of Astuteness (LOA)
Evidentiary Requirements (ER)


17

The Development Phase

The Conceptualizer – LOI: High; LOA: High; ER: Is it true to theory?

The Practitioner – Is it doable in the real world?

The Funder – Is it marketable/affordable?


18

The Implementation Phase

The Practitioner/Implementer – LOI: High; LOA: High; ER: Will it work in context?

The Conceptualizer – Can it stand the adaptations?

The Funder – What is the market niche?


19

The Production Phase

The Funder – LOI: High; LOA: High; ER: Will sales support it?

The Conceptualizer – Is it still theory based?

The Implementer – Will it make a difference?


20

What happens when the Conceptualizer is in charge?

The Reluctant Genius


21

What happens when the Implementer is in charge?

The Passionate Reformer


22

What happens when the Funder is in charge?

The Bottom Liner


23

Validity Concerns Reconciled?

Client versus Evaluator Evidentiary Expectations

Sheila A. Arens, McREL

24

Overview

Concerns about the quality of evidence and claims underlie all social science

Such concerns have been punctuated by the increased interest in evidence-based inquiry and evidence-based practice…


25

Emerging Needs, Differing Perspectives

Increased pressure on practitioners to select and engage in only those practices that are evidence-based elevates considerations of what constitutes "evidence" and "evidence-based"

Increased pressure on R&D organizations to collect evidence for their products and services to satisfy practitioner requirements


26

Varied Perspectives on Validity

Varied perspectives exist regarding how external readers approach and engage with evaluation documents, and the evaluation community has offered varied responses regarding how to deal with this appropriately


27

House 1985: Decisions about the data to collect are intertwined with prospectively considering the rhetorical power of the statements one wishes to issue relative to the audience...

…regardless of the veracity of the claim(s) being made, evaluators must attend to the audience: if an evaluation fails to provide the audience with an acceptable explanation, or fails to enhance understanding of some phenomenon, its findings may not be considered adequate

Thus, persuasion plays a role in evaluative claims and in the perceived validity of the inferences; the extent to which the evaluator herself is able to craft a compelling rhetorical argument is partially a product of the audience. Validity is therefore not merely about reaching "true" assertions.


28

Patton 2002: The goal of ensuring evaluative validity should not be to meet technical standards but rather to determine whether appropriate methods and measures have been used for the particular evaluation purpose(s) and relative to the intended users of the evaluation findings


29

Lincoln 2003: Validity is not simply a matter of determining which data collection efforts lead to better information

"…but rather, which kinds of evidence will best address certain questions, and, at a foundational level, which kinds of literary-rhetorical devices are being employed, and which kinds of symbolic-interpretive processes are being brought to bear in the mounting of a persuasive argument?" (italics in original).


30

Cases

Several illustrative cases highlight differences in evidentiary expectations

These differences emerged both among various stakeholders and between stakeholders and evaluators


31

Up the Ladder

Context: A state department of education

State interested in documenting accountability for the state funding of teacher professional development

Participants in the experience expressed interest in "telling their stories" and resisted the state department's data collection efforts


32

Ready, Set… Ready, Set…

Context: Proprietor of online professional development courses

While the proprietor was interested in collecting evidence of the product's success, timing issues (evaluability) precluded the collection of meaningful data

In the rush to advance evidence that the program "works," the organization began to inappropriately use student data to make claims about program efficacy


33

Hurry, hurry!

Context: Textbook publisher interested in examining curricular materials with an eye on textbook adoption

While product development pressed to "rush to market," the organization continued to stress the need to collect "rigorous evidence"


34

All That Glitters

Context: State department and university interested in the outcomes of a systemic school reform model

State department interested in "bottom-line accountability": student achievement

Participants interested in having their stories heard and having individual school successes and obstacles documented through case studies


35

Evaluator Responsibilities

What are the responsibilities of the evaluator regarding evidence?

How does the evaluator navigate between competing demands for evidence?

At what point does the evaluator need to intervene with client(s) to ensure that claims being made are adequately supported?


Geniuses, Bottom Liners & Chameleons:

Complementary and Varying Roles in Education R&D

Helen S. Apthorp, PhD, McREL


37

Three Stories across the Phases of Education R&D

Phases: Development, Implementation, Production

The Reluctant Genius (what happens when the conceptualizer is in charge)

The Passionate Reformer (what happens when the implementer is in charge)

The Bottom Liner (what happens when the funder is in charge)

Cycling Back into the Future
Juggling Multiple New Roles
Crossing into the Future


38

The Passionate Reformer

What happens when the implementer is in charge?

"We know it works."

Evidence is not necessary


39

The Reluctant Genius

What happens when the conceptualizer is in charge?

Intervention is often ill-defined

Moves ahead; can't wait for feedback


40

The Bottom Liner

What happens when the funder is in charge?

Marketability is the priority

Being savvy reigns

Agreements and obligations become real


41

Cycling Back and into the Future

Find the bridges between what clients want to know and what they ought to know

Study design, method, and audience


42

Juggling Multiple New Roles

How not to be fickle
Reject the chameleon
Use professional authority


43

Crossing Boundaries into the Future

Serve the needs of a broad base of stakeholders to protect against bias

Anticipate informational needs

Preserve credibility while remaining flexible


44

Contact Information

Sheila A. Arens, [email protected]
Zoe Barley, [email protected]
LeAnn M. Gamache, [email protected]
Helen S. Apthorp, [email protected]
