
The Mechanisms Behind the Results: Exploring the Sources of Building Blocks Curricular Effects

PRESENTED AT SREE


MARCH 3-5, 2011

KERRY G. HOFER - PRESENTER

DALE C. FARRAN

MARK W. LIPSEY

CAROL BILBREY

ELIZABETH VORHAUS

Introduction to Our Study

The Building Blocks for Math PreK Curriculum (Clements & Sarama, 2007) was designed to help young children learn math.

Nashville was one location of a multi-site scale-up study.

Key components of the curriculum to facilitate learning involve:
- Increasing children’s engagement in math activities
- Encouraging teachers to have children talk about math

Determining the effectiveness of a curriculum often involves investigating the practices of target teachers and how those practices relate to child outcomes.

This study instead looked at the effect of the curriculum on teacher and child behaviors and the resulting link to math achievement gains.

Nashville Scale-Up Site

20 schools randomly assigned to conditions:
- 16 Metropolitan Public schools
- 4 Head Start centers

57 classrooms:
- 31 treatment classrooms (16 public, 15 Head Start)
- 26 control classrooms (17 public, 9 Head Start)

Approximately 680 children with PK pre- and post-data

Sample was predominantly Black and from low-income households

Timeline for Measures of Interest

Child Assessments: beginning of PK and end of PK; individual pull-out direct assessments

Classroom and Child Observations: 3 times during the PK year, 8:00-12:00 (“Prime Instructional Time”)

Outcomes of Interest

HOW DID WE LOOK AT CHILD LEARNING?

Measures: Child Outcomes

REMA (Research-based Elementary Math Assessment; Clements, Sarama & Liu, 2008)

Proximal measure of children’s knowledge; Number and Geometry components; T-scores used

Woodcock-Johnson: more distal measures of children’s math knowledge

Applied Problems and Quantitative Concepts subtests

Literacy subtest (Letter-Word Identification) to assess possible negative effects when math is more emphasized in classrooms; W-scores used

Treatment Effects on Child Gain

[Figure: Treatment effects on child gain, expressed as Cohen's d effect sizes by measure. Values recoverable from the chart labels are approximately 0.59 for the REMA, 0.19-0.29 for the WJ Applied Problems and Quantitative Concepts subtests, and 0.08 for WJ Letter-Word Identification; starred bars are significant at p < .10.]
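For readers who want the arithmetic behind the bars, here is a minimal sketch (not the authors' code) of how a Cohen's d like those above is computed from treatment and control gain scores; the input values are made up.

    import numpy as np

    def cohens_d(treatment, control):
        """Standardized mean difference: group mean gap over the pooled SD."""
        t = np.asarray(treatment, dtype=float)
        c = np.asarray(control, dtype=float)
        n_t, n_c = len(t), len(c)
        pooled_var = ((n_t - 1) * t.var(ddof=1) + (n_c - 1) * c.var(ddof=1)) / (n_t + n_c - 2)
        return (t.mean() - c.mean()) / np.sqrt(pooled_var)

    # Illustrative call with made-up gain scores:
    print(cohens_d([52, 55, 49, 58], [48, 50, 47, 51]))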

Variability in Child Gain

[Figure: REMA residualized gain across classrooms, pretest to posttest, plotted separately for Treatment Metro, Treatment Head Start, Control Metro, and Control Head Start classrooms; a bracket marks a 1 effect-size difference for scale.]
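The gains plotted above are residualized: each child's posttest is adjusted for pretest by regressing posttest on pretest and keeping the residual. A minimal sketch of that idea, with made-up scores (not the study's code):

    import numpy as np

    def residualized_gain(pretest, posttest):
        """Regress posttest on pretest (simple OLS) and return the residuals;
        positive values mean more gain than the pretest alone would predict."""
        x = np.asarray(pretest, dtype=float)
        y = np.asarray(posttest, dtype=float)
        slope, intercept = np.polyfit(x, y, 1)
        return y - (intercept + slope * x)

    # Classroom means of these residuals give the gains compared across conditions.
    print(residualized_gain([40, 42, 45, 47], [44, 43, 50, 52]))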

Classroom Differences

WHAT MIGHT EXPLAIN THE VARIABILITY IN GAIN THAT WE OBSERVED?

Two Approaches to Measuring Classroom Instruction

Top-Down v. Bottom-Up

“As-Delivered” v. “As-Received”

Most “fidelity” measures or observations of instructional quality look at the teacher. What is he/she doing?

A complementary approach involves looking at the children. What are they doing and how engaged are they in doing those things related to the curriculum focus?

Measures: Instruction As-Delivered

Measured in Treatment and Control Classrooms: COEMET (Classroom Observation of Early Mathematics—Environment and Teaching; Sarama & Clements, 2007)

Classroom Culture

Specific Math Activities (SMA’s)

Miniature Specific Math Activities (miniSMA’s)

Narrative Record (Farran & Bilbrey, 2004): running record of everything that occurs in the classroom during the 8:00-12:00 observation

Length of time of an episode

Content of an episode

Measured in Treatment Classrooms only: Near Fidelity (Sarama & Clements, 2008)

Classroom Culture

[Figure: COEMET Classroom Culture scores (rated 1-5) by group. Reading the bars in order: Treatment Metro (N=16) 4.02, Treatment Head Start (N=15) 3.66, Control Metro (N=17) 3.29, Control Head Start (N=9) 2.99.]

SMA Numbers

[Figure: Classroom mean numbers of SMAs and miniSMAs for Treatment (Metro, Head Start) and Control (Metro, Head Start) classrooms. Treatment classrooms averaged roughly 5.5-9.7 SMAs and miniSMAs per observation, versus roughly 1.1-2.9 in control classrooms.]

SMA Quality

[Figure: Mean SMA Quality ratings (0-5 scale). Reading the bars in order: Treatment Metro 3.6, Treatment Head Start 3.2, Control Metro 3.2, Control Head Start 2.2.]

Narrative Record

[Figure: Proportion of the 8:00-12:00 observation spent in instructional activities; all four groups (Treatment and Control, Metro and Head Start) were similar, roughly 0.52-0.60.]

Near Fidelity (Treatment Only)

[Figure: Near Fidelity ratings for treatment classrooms only, Metro versus Head Start, across General Curriculum, Hands-On Centers, Whole Group, Small Group, and Computers; ratings ranged from roughly 2.2 to 3.8 across settings.]

Measure: Instruction As-Received

COP (Child Observation in Preschool; Farran, Plummer, Kang, Bilbrey, & Shufelt, 2006)

Children were observed in their classrooms, hallways, and lunchrooms, but not during naps or while outdoors. Uses a behavioral sampling method: at the end of each day, we generally had about 24 “snapshots” per child describing behavior, activities, and engagement. Nine dimensions were coded, but the variables of interest here include:

- Focus: Math
- Type of Task: Sequential
- Engagement: Mean Level
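To make the as-received variables concrete, here is a small hypothetical sketch of how snapshot-level COP codes could be aggregated to the child level; the column names and values are illustrative, not the study's actual data files.

    import pandas as pd

    # Hypothetical snapshot-level records: one row per COP snapshot per child.
    snapshots = pd.DataFrame({
        "child_id":   [1, 1, 1, 2, 2, 2],
        "focus":      ["math", "literacy", "math", "math", "other", "math"],
        "engagement": [4, 3, 5, 2, 3, 4],   # engagement rated on a 1-5 scale
    })

    in_math = snapshots["focus"] == "math"

    # Proportion of each child's snapshots coded as a math activity.
    prop_math = in_math.groupby(snapshots["child_id"]).mean()

    # Mean engagement rating across the snapshots that were in math.
    engagement_in_math = snapshots.loc[in_math].groupby("child_id")["engagement"].mean()

    print(prop_math)
    print(engagement_in_math)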

COP: Instances of Children in Math Activities

[Figure: Proportion of all COP snapshots in a math activity. Reading the bars in order: Treatment Metro 0.16, Treatment Head Start 0.13, Control Metro 0.10, Control Head Start 0.04.]

COP: Ratings of Engagement in Math

[Figure: Mean engagement ratings when in a math activity (1-5 scale); all four groups were similar, roughly 3.0-3.2.]

COP: Instances of Sequential Math Activities

[Figure: Proportion of math snapshots that were in a sequential math activity; values across the four groups ranged from roughly 0.53 to 0.68.]

COP: Ratings of Engagement in Sequential Math

[Figure: Mean engagement ratings when in a sequential math activity (1-5 scale); values across the four groups ranged from roughly 2.90 to 3.42.]

Analyzing Classroom Characteristics’ Effects

HOW DO CHILD LEARNING OUTCOMES RELATE TO OUR OBSERVATIONAL VARIABLES?

“Horizontal” Comparisons

HOW DIFFERENT ARE TREATMENT AND CONTROL CLASSROOMS IN THEIR EMPHASIS ON MATH LEARNING?

Treatment/Control Effect Sizes for Horizontal As-Delivered Instruction

[Figure: Treatment/control effect sizes (Cohen's d) for the as-delivered measures. The COEMET measures (# SMAs, SMA Quality, # miniSMAs, Classroom Culture) showed large differences, roughly d = 0.85 to 1.87, while the Narrative Record proportion of instructional time showed the smallest difference, roughly d = 0.22.]

Treatment/Control Effect Sizes for Horizontal As-Received Instruction

[Figure: Treatment/control effect sizes (Cohen's d) for the as-received COP measures. The largest difference was for proportion of snapshots in math (d ≈ 1.44); engagement in math, proportion in sequential math, and engagement in sequential math showed smaller differences (d ≈ 0.36-0.48).]

Exploring Horizontal Measures as Mediators

Instruction As-Delivered: COEMET factor created from all COEMET variables

Instruction As-Received: COP variable of Proportion of Observations in Math

Mediational Model

Path diagram: Curriculum Condition → Classroom Characteristics (path a); Classroom Characteristics → Children’s Achievement (path b); and a direct path from Curriculum Condition to Children’s Achievement (path c’).
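As a simplified, single-level illustration of the two regression steps this diagram implies (the study used its own models and data; everything below is synthetic and the variable names are placeholders):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic data: 60 classrooms, half assigned to the curriculum condition.
    rng = np.random.default_rng(0)
    condition = np.repeat([0, 1], 30)
    mediator = 2.5 + 1.0 * condition + rng.normal(0, 0.5, 60)               # path a
    outcome = 45 + 2.0 * mediator + 0.5 * condition + rng.normal(0, 2, 60)  # paths b and c'
    df = pd.DataFrame({"condition": condition, "mediator": mediator, "outcome": outcome})

    # Path a: does assignment to the curriculum predict the classroom mediator?
    path_a = smf.ols("mediator ~ condition", data=df).fit()

    # Paths b and c': does the mediator predict achievement once condition is controlled?
    path_bc = smf.ols("outcome ~ mediator + condition", data=df).fit()

    print(path_a.params)
    print(path_bc.params)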

Mediation Step 1: Path a

Regression Results

Potential Mediator               b      SE      p
As-Delivered Fidelity (COEMET)   1.37   0.290   .000
As-Received Fidelity (COP)       0.07   0.020   .002

Mediation Step 2: Path b

Mediator: As-Delivered (COEMET) Regression Results
Outcome                         b      SE     p      Mediator p
REMA                            0.55   0.29   .067   <.10
WJ Letter-Word Identification   0.99   1.67   .558   n.s.
WJ Applied Problems             1.73   1.11   .124   n.s.
WJ Quantitative Concepts        2.70   0.82   .002   <.05

Mediator: As-Received (COP) Regression Results
Outcome                         b       SE      p      Mediator p
REMA                            11.23   4.52    .016   <.05
WJ Letter-Word Identification   48.61   25.82   .065   <.10
WJ Applied Problems             55.50   15.49   .002   <.05
WJ Quantitative Concepts        60.81   11.35   .000   <.05

“Vertical” Fidelity

FIDELITY MEASURED ONLY IN TREATMENT CLASSROOMS

Vertical Fidelity Variation

Is there variation in child outcomes and variation in vertical fidelity?

Between-Classroom Variation in Child Outcomes

Shown in earlier slide

Between-Classroom Variation in Fidelity (Example of General Curriculum Scores)

Linking Child Outcomes and Vertical Fidelity

If we have observed variation in both child outcomes and Vertical Fidelity, the question becomes whether they covary.

Correlations Among Fidelity and Residualized Gain

                     REMA   Applied Problems   Quant. Concepts   Letter-Word ID
General Curriculum   0.45   0.39               0.41              0.24
Hands-On Centers     0.21   0.20               0.12              0.08
Whole Group          0.47   0.34               0.35              0.24
Small Group          0.09   0.29               0.16              0.19
Computers            0.49   0.28               0.35              0.33
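A minimal sketch of how classroom-level correlations like these can be computed once fidelity ratings and mean residualized gains are merged by classroom; the column names and numbers below are placeholders, not study data.

    import pandas as pd

    # Hypothetical classroom-level data: one row per Building Blocks classroom.
    classrooms = pd.DataFrame({
        "general_curriculum": [3.2, 3.8, 2.9, 3.5, 4.0],
        "whole_group":        [3.0, 3.6, 2.7, 3.3, 3.9],
        "computers":          [2.1, 2.5, 1.8, 2.4, 2.6],
        "rema_resid_gain":    [-0.5, 1.2, -1.1, 0.4, 1.6],
    })

    # Pearson correlations of each fidelity scale with classroom mean residualized gain.
    print(classrooms.corr()["rema_resid_gain"].drop("rema_resid_gain"))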


Conclusions: Horizontal Measures

As-Delivered: BB classrooms differed from Control classrooms in the amount and quality of math instruction provided. These differences were associated with differential gain. Better math instruction led to better math outcomes.

As-Received: Children in BB classrooms differed from those in Control classrooms in the frequency of involvement in math activities. These differences were associated with differential gain. Involvement in more math-focused activities led to better outcomes.

Conclusions: Vertical Measures

Fidelity mattered in our study.

In the Building Blocks classrooms, teachers whose instruction was more closely aligned with the curriculum in the General Curriculum, Whole-Group, and Computer activities had higher classroom gain than teachers whose instruction was rated lower on these measures.

Discussion Points

Building Blocks had an effect on children’s math achievement during Pre-K.

The effect of BB might not have been seen without the effective change in classroom instruction and in children’s activities.

REMA scores were affected by both horizontal and vertical measures, which makes sense given the proximity of the measure and the curriculum.

Discussion Points

The top-down approach tends to be more aligned to the curriculum, while this bottom-up approach is not specific to Building Blocks but to math activity in general.

Curriculum developers should consider ways to use these two views, but develop a more curriculum-specific look at the instruction as-received.

Bottom-up approaches are an added cost.

Bottom-up approaches can give very different pictures of what is happening in the classroom than more top-down looks. Next approaches might involve examining the additive contribution of analyzing the child perspective with that of the teacher.

Acknowledgements

A special thanks to our collaborators, Doug Clements and Julie Sarama at University at Buffalo (SUNY), as well as our funding source, IES.

Other Project Staff:
– Karen Anthony
– Canan Aydogan
– Tracy Cummings
– Linda Dake
– Kelley Drennan
– Sue Ganguly
– Sarah Shufelt
– Beth Storey
– Rachael Tanner-Smith
– Filiz Varol
– Betsy Watson
– Cathy Yun

Thank you!