Brian Zuckerman, American Evaluation Association, November 2nd 2011


Page 1: It's an Evolution: Changing Roles and Approaches in the Evaluation of the Pittsburgh Science of Learning Center

Brian Zuckerman American Evaluation Association

November 2nd 2011

Page 2: Pittsburgh Science of Learning Center

• Purpose: Leverage cognitive theory and computational modeling to identify the conditions that cause robust student learning.

• Goals: Fundamentally transform:
– Translational research in education
– Generation of learning science theory

Ed technology + Wide dissemination = “Basic research at scale”


PI: Ken Koedinger (CMU)
Co-PIs: Chuck Perfetti (UPitt), David Klahr (CMU), Lauren Resnick (UPitt)

Page 3: PSLC: Transforming Translational Research

• LearnLab = social & technical infrastructure to support field-based basic research
– Controlled experiments in real courses
– Educational technologies => Data!

• Practice-relevant discovery
– What lab-based theory survives translation?
– Field-based data drives discovery

[Diagram: LearnLab links Researchers and Schools through courses such as the English Reading Tutor, Chemistry Virtual Lab, and Algebra Intelligent Tutor]

Page 4: PSLC: Transforming Theory Generation

Emerging “Computational Learning Science”
• Data mining & computational modeling techniques
• New data sources: Brain imaging, classroom video, student interactions with ed tech

pslcdatashop.web.cmu.edu/KDDCup/

PSLC Capacity Building
• Vast student data repository
• New field: Educational Data Mining
• 2010 KDD Cup
– Annual competition of the Knowledge Discovery and Data Mining (KDD) conference
– Task: Predict step-by-step performance of 10,000 algebra students across a school year
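To make the prediction task concrete, here is a deliberately minimal baseline in the spirit of the competition: estimate each knowledge component's historical correct rate and use it as the predicted probability that a student answers a step correctly. The data and function names are invented for illustration; actual KDD Cup entries used far richer student and skill models.

```python
from collections import defaultdict

# Invented toy interaction log: (student, knowledge_component, correct_first_try)
history = [
    ("s1", "slope", 1), ("s1", "slope", 0), ("s2", "slope", 1),
    ("s1", "intercept", 0), ("s2", "intercept", 0), ("s2", "intercept", 1),
]

def fit_skill_rates(records):
    """Estimate P(correct) for each knowledge component from past steps."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for _student, kc, correct in records:
        hits[kc] += correct
        totals[kc] += 1
    return {kc: hits[kc] / totals[kc] for kc in totals}

def predict(rates, kc, default=0.5):
    """Predicted probability that a step tagged with `kc` is answered correctly."""
    return rates.get(kc, default)

rates = fit_skill_rates(history)
print(predict(rates, "slope"))  # 2 of 3 past "slope" steps were correct
```

A per-skill rate ignores individual students entirely; the point is only that DataShop-style step-level logs are what make such modeling, simple or sophisticated, possible at scale.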

Page 5: PSLC: Research to Practice

• Translation is built in
– LearnLab embeds experiments & scientific data collection within running courses
– Many outcomes of 200+ learning studies incorporated into courses

• Ed tech dissemination partners
– Carnegie Learning, Inc.: >600,000 K-12 math students a year
– Open Learning Initiative: 1000s of college student users a semester

Cognitive Tutor 2010/11 release uses PSLC results (Butcher & Aleven, 2008)

In use in all 50 states

Page 6: PSLC Activities Relative to Program Goals

• Conducts large-scale research
– Large-scale theory development
– Formation/nucleation of new fields or subdisciplines

• Develops and maintains infrastructure useful to community
– Large-scale infrastructure development (LearnLab, DataShop, tools)

• Educates diverse, highly competent, and globally-engaged workforce
– Education of graduate students and postdoctoral researchers
– Broadening participation (e.g., PSLC summer internships)
– Conducts center-mass-requiring ancillary education efforts for the broader learning sciences community (e.g., PSLC summer school)

• Forges valuable partnerships
– Among PSLC researchers in interdisciplinary collaborations
– With industry/external stakeholders

Page 7: Changes in PSLC Organization

• Reorganization around renewal
– Four clusters become three thrusts
– Change in co-PI on Executive Committee

Page 8: Evaluation Context

• Evaluator context
– STPI funded by the center
– STPI came on board around first site visit in 2005
– Center passed through five-year review, now in year 7

• Shifts in PSLC activities and logic model
– Change in organization
– Shifting emphasis on goal of theoretical framework development
– Other smaller changes (e.g., shift in diversity goals toward long-term expansion of field)

Page 9: Evolution of Evaluation Effort Matches Changes in PSLC Lifecycle

• Years 1-2: Predominantly focused on growth of “Centerness,” with data collection internal to the Center
– Management processes (interviews)
– Collaboration formation (interviews, collaboration survey)
– Development of Center-wide language and culture (interviews)

• Years 3-4: In preparation for site review, focus shifted
– External investigators’ knowledge of Center research and predictions of future value
– Theoretical framework development/wiki analysis
– Bibliometric analysis of publications to date

• Years 5-6: Center reorganization led to refocus on “centerness”
– Return to the interview approach of Years 1-2
– Analysis of changes in thrust plans over time

• Present: Evaluation effort largely dormant until plans for SLC-wide evaluation become evident
– Some continuing activities around sustainability

Page 10: Data Collection Changes Over Time (partial list of measures and approaches)

| Topic | Data Collection Strategy | Years 1-2 | Years 3-4 | Years 5-6 |
|---|---|---|---|---|
| Theoretical framework | Internal interviews / growth of common language | X | | X |
| Theoretical framework | Analysis of PSLC theory wiki | | X | |
| Theoretical framework | External interviews / value of PSLC theory development to date | | X | |
| Theoretical framework | Analysis of thrust plans | | | X |
| Theoretical framework | Bibliometrics | | X | X |
| New fields | External interviews / role of PSLC in educational data mining community | | | X |
| Value of infrastructure | Internal interviews / use internally of PSLC DataShop, tools | | X | X |
| Value of infrastructure | External interviews / knowledge of PSLC DataShop, tools | | X | |

Page 11: Data Collection Changes Over Time (cont.)

| Topic | Data Collection Strategy | Years 1-2 | Years 3-4 | Years 5-6 |
|---|---|---|---|---|
| Education of students | Internal interviews / perception of value of PSLC participation | X | | X |
| Education of students | Tracking next steps of PSLC graduates | X | X | X |
| Broadening participation | Participant observation and interviews with interns | | X | X |
| Broadening participation | Tracking next steps of interns | | X | X |
| Other ancillary educational efforts | Follow-up with participants in summer school | | | X |

Page 12: Data Collection Changes Over Time (cont.)

| Topic | Data Collection Strategy | Years 1-2 | Years 3-4 | Years 5-6 |
|---|---|---|---|---|
| Collaboration formation | Internal interviews / value and success of center-wide collaboration formation approaches | X | | X |
| Collaboration formation | Internal interviews / collaborativeness in Center | X | | X |
| Collaboration formation | Collaboration survey | X | | |
| Collaboration formation | Bibliometrics | | X | X |
| External collaborations | External interviews / value of collaboration | | X | |
| Management | Value and effectiveness of center structures and processes | X | | X |
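The bibliometric measures listed above can be sketched in a few lines: counting co-authorships and the share of papers whose author list spans more than one group is one simple way to quantify collaboration formation. The publication records, author names, and group labels below are entirely invented for illustration.

```python
from collections import Counter
from itertools import combinations

# Invented publication records: each paper is the list of its authors.
papers = [
    ["Ada", "Ben"],
    ["Ada", "Cy", "Dee"],
    ["Ben", "Eve"],
    ["Ada", "Ben", "Eve"],
]

# Hypothetical home discipline of each researcher.
groups = {"Ada": "CS", "Ben": "HCI", "Cy": "Psych", "Dee": "Psych", "Eve": "HCI"}

def coauthor_counts(pubs):
    """Count how often each pair of researchers co-authored a paper."""
    edges = Counter()
    for authors in pubs:
        for pair in combinations(sorted(authors), 2):
            edges[pair] += 1
    return edges

def cross_group_share(pubs, group_of):
    """Fraction of papers whose authors span more than one group,
    a rough indicator of interdisciplinary collaboration."""
    cross = sum(
        1 for authors in pubs
        if len({group_of[a] for a in authors if a in group_of}) > 1
    )
    return cross / len(pubs)

edges = coauthor_counts(papers)
print(edges[("Ada", "Ben")])              # co-authored papers 1 and 4 -> 2
print(cross_group_share(papers, groups))  # 3 of 4 papers cross groups -> 0.75
```

Tracked over time, measures like these can show whether center-wide collaboration is broadening or consolidating, which is the kind of signal the evaluation's bibliometric analyses were after.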

Page 13: Reflections

• Evaluation in complex context
– Pressure from government funder to demonstrate results even during first five-year period
– Change in PSLC organization and goals over time

• Required nimble evaluation approach as a result
– Evaluation plans developed in Years 1 and 5 served as point of departure rather than blueprint
– Difficult-to-maintain balance between need for continuity in data collection and shifting priorities