ICF_AEA_multipaper


Transcript of ICF_AEA_multipaper

Page 1: ICF_AEA_multipaper


Evaluation 2013: The State of Evaluation Practice in the Early 21st Century
27th Annual Conference of the American Evaluation Association

Washington, DC, USA

Design Considerations in Evaluating the Implementation of College Access, College Readiness, and Career Pathway Initiatives

Thomas Horwood, Chair
Barbara O’Donnel, Discussant

October 18, 2013 | 4:30 - 6:00 PM

Page 2: ICF_AEA_multipaper


Determining Dosage: Evaluating the Implementation of a State GEAR UP Initiative

Ashley Briggs
Charles Dervarics

October 18, 2013

Presented to:

American Evaluation Association 2013 Conference

Page 3: ICF_AEA_multipaper


GEAR UP Nationally

First grants funded in 1999 – Originally based on I Have a Dream program concept

Cohort approach – Most grantees follow a cohort of students from 7th grade through to postsecondary education, with support services (tutoring, mentoring, college visits) available

FY12 – 132 federal grants serving 647,772 students

Multiple award types – State grants and local partnership grants

OVERVIEW

Page 4: ICF_AEA_multipaper


About the Texas GEAR UP State Grant (SG)
OVERVIEW

FY12 state grant from USED – approx. $5M per year (2012 to 2019)

Focused on a single cohort of students starting in Grade 7 (students are in Grade 8 in 2013–14)

Includes district and statewide services

District services
– Support schools in four districts (7 middle schools, 5 high schools) to increase academic rigor
– Increase the number of Grade 8 students succeeding in Algebra I (short-term goal)
– Provide teacher professional development to support delivery of rigorous courses (such as Pre-AP training)
– Provide teacher professional development to support postsecondary goals (financial literacy)
– Promote vertical alignment of core subject teachers across the grades
– Support college visits, summer learning opportunities, and tutoring services

Page 5: ICF_AEA_multipaper


About the Texas GEAR UP State Grant (SG) (cont’d)
OVERVIEW

Statewide Services
– Postsecondary information dissemination to students and families statewide
– Active, in-depth web site with information for students and families
– Online communication and teaching platform available statewide
– Statewide coalition of GEAR UP grantees (including local partnership grants not directly under the SG)

TEA GEAR UP Partners
– The University of Texas at Austin’s Institute for Public School Initiatives [IPSI]
– TG [Texas Guaranteed Student Loan Corporation]
– College Board
– AMS Pictures

Page 6: ICF_AEA_multipaper


About the Texas GEAR UP SG Evaluation
EVALUATION DESIGN AND METHODOLOGY

The external evaluation is a longitudinal 7-year study using a quasi-experimental design that started in January 2013 to:

Provide ongoing formative evaluation of facilitators/barriers, promising practices, and recommended next steps

Explore implementation status, trends in the mix of implementation, and relationships between implementation and outcomes

Determine impact, including short-, intermediate-, and long-term student outcomes

Identify impact on relevant family, school, and community partnership outcomes

Examine access to and use of statewide opportunities

Understand cost, spending, and sustainability

Page 7: ICF_AEA_multipaper


Data Sources
EVALUATION DESIGN AND METHODOLOGY

Extant Data
– Documents: Texas GEAR UP SG Grant Application, Notices of Grant Award (NOGAs), and implementation plans
– Student-level data: Demographics, attendance, high school course completion and high school completion, school personnel, and district organizational information
– School-level data: Profile information about campus-level performance, staff, finances, and programs

Student Tracking System (Annual Performance Report – APR)
– Format: Submission by 4 subgrantee districts using a prepopulated spreadsheet
– Topics: Advanced course-taking; Academic services; Student services; Student events and attendance; Parent events and attendance; Teacher professional development and enrollment; Community partners

Page 8: ICF_AEA_multipaper


Data Sources (cont.)
EVALUATION DESIGN AND METHODOLOGY

Surveys with Parents and Students
– Format: Online and paper-based versions in English and Spanish
– Topics: Aspirations and expectations; Knowledge of financial aspects; Knowledge of college requirements; Perceptions of Texas GEAR UP SG

Site Visits to Texas GEAR UP SG Schools
– Format: 1-1.5-day visits including interviews and focus groups with school staff, teachers, students, parents, and community partners
– Topics: GEAR UP activities and events (school and statewide); Knowledge of college requirements and financial aspects; Perceptions of Texas GEAR UP SG; Readiness for success in college

Interviews with Key Leaders from TEA and Partner Organizations
– Format: Telephone interviews
– Topics: Level of partner involvement; Perceptions of program; Progress on statewide implementation

Page 9: ICF_AEA_multipaper


Initial Analysis: Implementation
ANALYSIS

Data Source: Student tracking system (APR) and site visits

Primary Analysis: Descriptive statistics on participation, dosage (number of hours, events), and mix (range of services/activities); disaggregation by school, subject area, and format (virtual or in-person)

| Implementation Strategy | A | B | C | D | E | F | G |
| Adv. Course | X | X | X | X | X | X | X |
| SSS: Tutoring | X | X | X | X | X (math) | X (math) | X |
| SSS: Mentoring |  |  |  |  | X |  | X |
| SSS: Counseling/Advising |  |  |  |  |  |  | X |
| SSS: Other Activities |  |  |  |  | X (math) | X (math) |  |
| College Visit | X | X |  | X |  |  | X |
| Job Site Visit |  |  |  |  |  |  | X |
| Student Events | X | X | X |  | X | X | X |
| Parent Events |  | X | X |  | X | X | X |
| Teacher PD |  |  |  | X | X | X | X |
| Community Partners |  | X | X | X |  |  | X |
| Use Statewide Services |  |  |  |  | X | X | X |
| Total | 4 | 6 | 5 | 5 | 8 | 7 | 11 |
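For reference, the dosage analysis described above reduces to grouped descriptive statistics. A minimal sketch in Python with pandas, assuming a hypothetical tidy layout of APR service records (the column names school, service, format, and hours are illustrative, not the study's actual field names):

```python
import pandas as pd

# Hypothetical APR-style service records; column names are illustrative only.
records = pd.DataFrame({
    "school":  ["A", "A", "B", "B", "B", "C"],
    "service": ["Tutoring", "College Visit", "Tutoring", "Tutoring", "Mentoring", "Tutoring"],
    "format":  ["In-person", "In-person", "Virtual", "In-person", "In-person", "Virtual"],
    "hours":   [12.0, 4.0, 8.5, 6.0, 10.0, 5.0],
})

# Dosage: total hours and number of service events per school.
dosage = records.groupby("school")["hours"].agg(total_hours="sum", n_events="count")

# Mix: number of distinct service types per school (cf. the Total row above).
mix = records.groupby("school")["service"].nunique().rename("n_service_types")

# Disaggregation by format (virtual or in-person).
by_format = records.groupby(["school", "format"])["hours"].sum().unstack(fill_value=0)

print(dosage.join(mix))
print(by_format)
```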

Page 10: ICF_AEA_multipaper


Initial Analysis: Plans, Knowledge, and Perceptions
ANALYSIS

Data Source: Student and parent surveys

Primary Analysis
– Descriptive statistics (frequencies, averages, ranges)
– Crosstabs (chi-square analyses comparing frequency distributions by subgroup)
– Analysis of variance (ANOVA) comparing means by subgroup
– Correlation
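Both subgroup comparisons reduce to standard tests. A minimal sketch with pandas and scipy, using made-up responses (the variable names and the 1-4 knowledge scale are illustrative, not the actual instrument):

```python
import pandas as pd
from scipy import stats

# Made-up survey data; columns and coding are illustrative only.
survey = pd.DataFrame({
    "gender":     ["F", "F", "M", "M", "F", "M", "F", "M"],
    "aspiration": ["4-year", "2-year", "4-year", "4-year", "4-year", "2-year", "2-year", "4-year"],
    "knowledge":  [2, 3, 1, 2, 4, 2, 3, 1],  # e.g., 1 = not at all ... 4 = very knowledgeable
})

# Crosstab with a chi-square test: does the aspiration distribution differ by subgroup?
table = pd.crosstab(survey["gender"], survey["aspiration"])
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)

# One-way ANOVA: do mean knowledge ratings differ by subgroup?
groups = [g["knowledge"].to_numpy() for _, g in survey.groupby("gender")]
f_stat, p_anova = stats.f_oneway(*groups)

print(f"chi-square p = {p_chi2:.3f}; ANOVA p = {p_anova:.3f}")
```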

Key Baseline Takeaway: Both parent and student aspirations often exceeded expectations, suggesting that families hope for more education than they expect to be able to achieve.

Key Baseline Takeaway: Few students or parents perceive themselves as very knowledgeable, a perception that participation in Texas GEAR UP SG can potentially change.

Key Baseline Takeaway: Student overall satisfaction with Texas GEAR UP SG was highest at one school, where 41% of students indicated they were very satisfied.

Page 11: ICF_AEA_multipaper


Initial Analysis: Costs and Lessons Learned
ANALYSIS

Cost
– Data Source: Budgets and reported draw downs
– Primary Analysis: Descriptive statistics, breakdown by cost categories

Facilitators and Barriers
– Data Source: Survey and site visit data
– Primary Analysis: Descriptive statistics, analysis of open-ended survey responses, qualitative analysis
– Key Baseline Takeaway: Parents reported that engagement in activities is facilitated when topics are of interest to them, when events are held at times appropriate for their schedule, and when their student is also engaged.

Potentially Promising Practices
– Data Source: Site visit data
– Primary Analysis: Qualitative analysis
– Key Baseline Takeaway: Early successes at some schools related to afterschool mathematics programs, enhanced college visits, and family events.

Page 12: ICF_AEA_multipaper


Forthcoming Analysis beyond Year 1
ANALYSIS

Level and Mix of Implementation: Analysis of various service factors
– Provision type (virtual or in-person)
– Frequency of delivery (number of hours, number of sessions)
– Mix of services (e.g., enrollment in and tutoring in an advanced course)
– Quality of implemented activities

Plans, Knowledge, and Perceptions: Disaggregation by student characteristics
– Gender, race/ethnicity, LEP status, special education status
– Participation in advanced coursework

Cost
– Descriptive analysis of actual expenditures (annual and cumulative) by cost category

Types of Analysis
– Hierarchical linear modeling (HLM, with student, school, and district levels) and cluster analysis
– Impact analysis using extant outcome data
– Comparisons using propensity score matching (PSM; see the sketch below)
– Linkages between outcomes and implementation
– Change in implementation over time
– Relationship of actual implementation to proposed plans
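The slides do not specify the matching algorithm, so the following is only a sketch of one common PSM variant (logistic-regression propensity scores with 1:1 nearest-neighbor matching with replacement); the covariate names are hypothetical:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Simulated student records; covariate names are illustrative only.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "gear_up":     rng.integers(0, 2, 500),   # 1 = attends a GEAR UP school
    "prior_math":  rng.normal(0, 1, 500),     # e.g., a prior achievement score
    "econ_disadv": rng.integers(0, 2, 500),   # economic disadvantage flag
})

# 1. Estimate propensity scores: P(GEAR UP | covariates).
X = df[["prior_math", "econ_disadv"]]
df["pscore"] = LogisticRegression().fit(X, df["gear_up"]).predict_proba(X)[:, 1]

# 2. Match each GEAR UP student to the comparison student with the
#    nearest propensity score.
treated = df[df["gear_up"] == 1]
control = df[df["gear_up"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched_control = control.iloc[idx.ravel()]

# Outcome contrasts (e.g., Algebra I success) would then be computed
# between `treated` and `matched_control`.
```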

Page 13: ICF_AEA_multipaper


Lessons About This Evaluation from Year 1
LESSONS

Interpret findings with caution based on the period of data collection.

Use a crosswalk to address 60+ evaluation questions.

Ensure common definitions of program services.

Consider ways to verify information across data sources.

Maximize the use of online surveys.

Leverage various strategies to obtain sufficient parent response rates.

Analyze data at multiple levels (school and district).

Utilize district-level case studies to understand the context in which implementation occurs.

Page 14: ICF_AEA_multipaper


For more information, the report is publicly available:

O’Donnel, B., Briggs, A., Dervarics, C., Horwood, T., Sun, J., Alexander, A., Zumdahl, J., & Rhodes, J. (2013, September). Annual Implementation Report #1: Texas GEAR UP State Grant Evaluation. Report prepared for the Texas Education Agency by ICF International. Available online at: http://www.tea.state.tx.us/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=25769807659&libID=25769807662#25769807659

Page 15: ICF_AEA_multipaper


Assessing the Fidelity of Implementation in the Diplomas Now Evaluation

October 18, 2013

Presented to:

American Evaluation Association 2013 Conference

Felix Fernandez
Aracelis Gray

Page 16: ICF_AEA_multipaper


Overview of Diplomas Now Study

Diplomas Now is a school turnaround model that unites three organizations: Talent Development, City Year, and Communities In Schools.

The study uses a random assignment design.

Sixty-two schools in 11 districts across the country are participating in the study.

The study will compare student outcomes in the 32 middle and high schools that implement DN to outcomes in the 30 schools that do not.

Page 17: ICF_AEA_multipaper


Overview of Diplomas Now Implementation Study

Overall goal: document implementation in the 32 DN schools.

Research Questions:
– How much variation in implementation fidelity was there across sites?
– What were the largest challenges to implementing the DN model?
– What were the most important ways in which the intervention as implemented differed from the intervention as planned?

Page 18: ICF_AEA_multipaper


DN Fidelity of Implementation

Fidelity of implementation is based on the DN Logic Model and measured by the Fidelity of Implementation Matrix.

The matrix is made up of 111 separate components, 62 of which were identified as critical to adequate implementation.

The fidelity matrix consists of 9 inputs, ranging from program staff training to family and community involvement to student supports.
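One way to picture the matrix, as a hypothetical sketch only: each input holds a list of components flagged critical or not (the component names come from the Strong Learning Environments example later in the deck):

```python
# Hypothetical representation of the fidelity matrix; only one of the
# nine inputs is shown, using components from the example slide.
fidelity_matrix = {
    "Strong Learning Environments": [
        {"component": "Small Learning Communities",  "critical": True},
        {"component": "Interdisciplinary Teams",     "critical": True},
        {"component": "DN Site-Based Meeting",       "critical": True},
        {"component": "DN Site-Based Collaboration", "critical": False},
        {"component": "4x4 Block",                   "critical": True},
    ],
    # ... 8 more inputs (111 components in total, 62 of them critical)
}

n_components = sum(len(comps) for comps in fidelity_matrix.values())
n_critical = sum(c["critical"] for comps in fidelity_matrix.values() for c in comps)
```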

Page 19: ICF_AEA_multipaper


DN Fidelity of Implementation

That is, each input comprises multiple components (e.g., Input 1 consists of Components X, Y, and Z).

Page 20: ICF_AEA_multipaper


DN Fidelity of Implementation

And overall fidelity is built up from the nine inputs (Input 1 through Input 9).

Page 21: ICF_AEA_multipaper


Page 22: ICF_AEA_multipaper


Overview of Fidelity Matrix

– Program Staff Training and Professional Development
  • 18 individual components, 15 of which are critical
– Integrated On-Site Support (Critical Input)
  • 11 individual components, 9 of which are critical
– Family and Community Involvement
  • 6 individual components, 1 of which is critical
– Tiered Intervention Model (Critical Input)
  • 3 individual components, 2 of which are critical
– Strong Learning Environments (Critical Input)
  • 6 individual components, 4 of which are critical (includes 1 MS- and 1 HS-specific critical component)

Page 23: ICF_AEA_multipaper


Overview of Fidelity Matrix

– Professional Development and Peer Coaching (Critical Input)
  • 5 individual components, 2 of which are critical (includes 1 HS-specific component)
– Curriculum for College Readiness
  • 24 individual components, 4 of which are critical (includes 7 MS- and 17 HS-specific components)
– Student Supports (Critical Input)
  • 24 individual items, 19 of which are critical
– Student Case Management (Critical Input)
  • 14 individual items, 5 of which are critical

Page 24: ICF_AEA_multipaper


DN Fidelity of Implementation

Fidelity is divided into two metrics, a categorical rating and a continuous score:
1. Implementation Rating (categorical measure): focuses on critical components
2. Implementation Score (continuous measure): allows for assessment of greater variability between sites

Together they provide the flexibility to:
– Look at categories that emerge
– See if scores vary by rating, the amount of overlap, etc.
– Study relationships between implementation and outcomes

Page 25: ICF_AEA_multipaper


Data Sources

Fidelity of implementation data stem from the following sources:
– Diplomas Now Implementation Support Team (DNIST) Survey
– School Transformation Facilitator (STF) Survey
– City Year Program Manager (CYPM) Survey
– Communities In Schools (CIS) Site Coordinator (SC) Survey
– Communities In Schools (CIS) Site Records

Page 26: ICF_AEA_multipaper


Fidelity of Implementation: Strong Learning Environments

| Component | Operational definition | Fidelity scale | Criterion | Critical | Sample response |
| Small Learning Communities | Teams of teachers working with the same small group of students | 0: No; 1: Yes | 1: Adequate/High Fidelity | Yes | 1: Yes |
| Interdisciplinary Teams | Frequency of interdisciplinary team meetings | 0: do not/rarely occur; 1: occur monthly; 2: occur bi-weekly; 3: occur weekly; 4: occur multiple times a week; 5: occur daily | 4: Adequate; 5: High Fidelity | Yes | 4: occur multiple times a week |
| DN Site-Based Meeting | Admin, STF, Program Manager, and Site Coordinator hold a brief review of program implementation (approx. 30 minutes) | 0: once a month or less; 1: biweekly; 2: weekly or more frequently | 1: Adequate; 2: High Fidelity | Yes | 0: once a month or less |
| DN Site-Based Collaboration | Site-based collaborative (Admin, STF, PM, Site Coordinator) has norms for collaboration, standards for communication, and frameworks for decision making | 0: No; 1: Partially/In Process; 2: Yes | 1: Adequate; 2: High Fidelity | No | 1: Partially/In Process |
| 4x4 Block (high school only) | Four 75-90 minute class periods that meet daily | 0: No; 1: Hybrid/Acceptable Alternative; 2: Yes | 1: Adequate; 2: High Fidelity | Yes | 0: No |

Page 27: ICF_AEA_multipaper


Implementation Rating

The implementation rating focused on the “Critical to Fidelity/Adequate Rating” column within the fidelity matrix.

Using this column, each input (e.g., program staff professional development) of the DN model was assigned one of two ratings:
1. “Successful”: met all components identified as critical
2. “Developing”: did not meet one or more critical components

In addition to critical components, critical inputs have also been identified (i.e., inputs critical to an adequate implementation).

Page 28: ICF_AEA_multipaper


Implementation Rating

Individual input ratings served as the basis for the site-level fidelity rating, which has four levels:

1. Low: successful on less than 3 critical inputs

2. Moderate: successful on at least 3 critical inputs

3. Solid: successful on at least 5 critical inputs

4. High: successful on 8 or more inputs including 5 critical inputs

Page 29: ICF_AEA_multipaper


Example: Implementation Rating

The Implementation Rating only takes into account components identified as critical. In this case:
– Teams of teachers working with the same small group of students
– Frequency of interdisciplinary team meetings
– DN site-based meeting
– 4x4 Block

Given our sample responses, this site has met the criterion for adequate implementation for teams of teachers and frequency of interdisciplinary team meetings, but not for DN site-based meetings or 4x4 classroom blocks.

It would therefore receive an implementation rating of “Developing” on this input.
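A minimal sketch of this two-level rating logic, assuming hypothetical data structures and that each response can be compared numerically against its adequacy threshold (the evaluation's actual variable names are not shown in the slides); it reproduces the Strong Learning Environments example above:

```python
# Critical inputs, per the fidelity matrix overview slides.
CRITICAL_INPUTS = {
    "Integrated On-Site Support",
    "Tiered Intervention Model",
    "Professional Development and Peer Coaching",
    "Student Supports",
    "Student Case Management",
    "Strong Learning Environments",
}

def rate_input(components):
    """'Successful' if every critical component meets its adequacy
    criterion, otherwise 'Developing'."""
    critical = [c for c in components if c["critical"]]
    met_all = all(c["response"] >= c["adequate"] for c in critical)
    return "Successful" if met_all else "Developing"

def rate_site(input_ratings):
    """Roll the nine input ratings up to the site-level rating."""
    successful = {name for name, r in input_ratings.items() if r == "Successful"}
    n_critical = len(successful & CRITICAL_INPUTS)
    if len(successful) >= 8 and n_critical >= 5:
        return "High"
    if n_critical >= 5:
        return "Solid"
    if n_critical >= 3:
        return "Moderate"
    return "Low"

# Strong Learning Environments sample responses from the matrix slide
# (response and adequacy threshold on each component's own scale).
sle = [
    {"component": "Small Learning Communities",  "critical": True,  "response": 1, "adequate": 1},
    {"component": "Interdisciplinary Teams",     "critical": True,  "response": 4, "adequate": 4},
    {"component": "DN Site-Based Meeting",       "critical": True,  "response": 0, "adequate": 1},
    {"component": "DN Site-Based Collaboration", "critical": False, "response": 1, "adequate": 1},
    {"component": "4x4 Block",                   "critical": True,  "response": 0, "adequate": 1},
]
print(rate_input(sle))  # -> "Developing" (two critical criteria unmet)
```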

Page 30: ICF_AEA_multipaper


DN Fidelity Implementation Rating Flowchart

Each of the nine inputs is rated Successful or Developing (* indicates critical inputs):
– Program Staff Professional Development
– Integrated On-Site Support*
– Tiered Intervention Model*
– Professional Development and Peer Coaching*
– Student Supports*
– Student Case Management*
– Family and Community Involvement
– Strong Learning Environments*
– Curriculum for College Readiness

The input ratings then determine the site-level rating:
– Low Implementation (successful on less than 3 critical inputs)
– Moderate Implementation (successful on at least 3 critical inputs)
– Solid Implementation (successful on 5 critical inputs)
– High Implementation (successful on 8+ inputs)

Page 31: ICF_AEA_multipaper


Implementation Score

The implementation score focused on the “Fidelity Scale” column within the fidelity matrix.

Using this column, each site received an input score, calculated as the equally weighted sum of the site’s fidelity scale responses divided by the total number of components.

The average of the 9 individual input scores then formed the site-level implementation score.

Page 32: ICF_AEA_multipaper


Example: Implementation Score

Implementation scores are calculated by taking the sum of the weighted responses divided by the total number of components.
– Scale scores are equally weighted; for example, a component scaled 0-2 would be recoded 0 = 0, 1 = .5, and 2 = 1.

Adding up the weighted fidelity scale responses would equal 2.3 (1+.8+0+.5+0).

There are 5 Strong Learning Environments components.

The site’s implementation score for this input would then equal 2.3 divided by 5, or .46.
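A minimal sketch of this arithmetic (the function names are illustrative); it reproduces the worked example above:

```python
def weighted_response(response, scale_max):
    """Recode a fidelity-scale response to an equal 0-1 weight
    (e.g., on a 0-2 scale: 0 -> 0, 1 -> .5, 2 -> 1)."""
    return response / scale_max

def input_score(components):
    """Input score: sum of weighted responses divided by the number
    of components in the input."""
    return sum(weighted_response(r, m) for r, m in components) / len(components)

def site_score(input_scores):
    """Site-level implementation score: the average of the 9 input scores."""
    return sum(input_scores) / len(input_scores)

# (response, scale maximum) for the five Strong Learning Environments
# components from the example: weighted values 1 + .8 + 0 + .5 + 0 = 2.3.
sle = [(1, 1), (4, 5), (0, 2), (1, 2), (0, 2)]
print(input_score(sle))  # 2.3 / 5 = 0.46
```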

Page 33: ICF_AEA_multipaper


DN Fidelity Implementation Score Flowchart

Each input score is the weighted component sum (X) divided by that input’s component count; each input score is then divided by 9, and the nine results are summed to form the site-level score:

| Input | Input score | Contribution to site-level score |
| Program Staff Professional Development | X / 18 | (X / 18) / 9 |
| Integrated On-Site Support | X / 11 | (X / 11) / 9 |
| Family and Community Involvement | X / 6 | (X / 6) / 9 |
| Tiered Intervention Model | X / 3 | (X / 3) / 9 |
| Strong Learning Environments | X / 5 | (X / 5) / 9 |
| Professional Development and Peer Coaching | X / 5 | (X / 5) / 9 |
| Curriculum for College Readiness | X / 17 | (X / 17) / 9 |
| Student Supports | X / 24 | (X / 24) / 9 |
| Student Case Management | X / 14 | (X / 14) / 9 |

Note: X is the equally weighted sum of fidelity scale components. Sample calculations provided are only for HS data.

Page 34: ICF_AEA_multipaper


Fidelity of Implementation

Independently, each measure provides useful but different information.

Together, they provide flexibility in understanding implementation, allow for detailed discussion of site fidelity, and help to shape an implementation story.

Page 35: ICF_AEA_multipaper


Evaluating a Career Pathways Research Initiative
PathTech: Successful Academic and Employment Pathways in Advanced Technology

Kristen Peterson

October 18, 2013

Presented to:

American Evaluation Association 2013 Conference

Page 36: ICF_AEA_multipaper


The PathTech Program

Page 37: ICF_AEA_multipaper


Background on the PathTech Project

Funded through a grant from the National Science Foundation (NSF) under the Advanced Technological Education (ATE) Program

ATE promotes the improvement of education for science and engineering technicians entering high-technology fields

The ATE program supports many different types of activities, including:
– Articulation between two-year and four-year programs
– Career pathways
– Curriculum development
– Educator professional development
– General research advancing the understanding of educating technicians for careers in high-technology fields

Page 38: ICF_AEA_multipaper


Background on the PathTech Project

Successful Academic and Employment Pathways in Advanced Technologies (PathTech)
– A research study examining the progression of students from high school into advanced technology programs and into the workforce
– A four-year study currently entering its third year

Collaborative grant awarded to the University of South Florida (USF)
– The grant partnership includes USF researchers, the Florida Advanced Technological Education Center (FLATE), and four south Florida community colleges:
  • Hillsborough Community College
  • Polk State College
  • St. Petersburg College
  • State College of Florida

Page 39: ICF_AEA_multipaper


PathTech Research Design and Evaluation

Page 40: ICF_AEA_multipaper


PathTech Research Questions

1. Who enrolls in engineering technology (ET) programs out of high school?
– How are student demographic and academic characteristics related to ET enrollment?
– How do students learn about ET programs?
– How can the pathway from high school into ET programs be improved?

2. How do ET students benefit from enrolling in degree programs and earning degrees through these programs?
– What are the most critical steps in ET degree attainment, from enrollment through gatekeeper courses to the degree?
– How do these students become ET program graduates?
– How do ET students differ from comparable students in their degree and employment outcomes?

Page 41: ICF_AEA_multipaper


Design Considerations for PathTech

Mixed-methods study

Quantitative Data and Analysis:
– Descriptive statistics and empirical analysis with quantitative data from state databases

Qualitative Data and Analysis:
– Ethnographic and qualitative analyses of engineering technology programs
– Three data sources:
  • Interviews with community college students
  • Interviews with students at feeder high schools
  • Interviews with local industry partners

Page 42: ICF_AEA_multipaper


Evaluation Approach

The external evaluation of PathTech complements and supports the efforts of the PathTech research team and involves:

1. Monitoring the progress of various aspects of the project

2. Providing objective reviews of project instruments, plans, reports and other materials

3. Serving as an external resource for technical advice

Designed a flexible evaluation model to account for a dynamic work plan
– Developed an annual workplan tracker
– Update the workplan tracker monthly
– Facilitate monthly status calls with the research team

Page 43: ICF_AEA_multipaper


PathTech Evaluation Challenges

Page 44: ICF_AEA_multipaper


Evaluation Challenges: Quantitative Data Analysis

Using data from the Florida Department of Education’s Florida Education and Training Placement Information Program (FETPIP)
– Initial requests for data went unanswered
– A new policy prohibits releasing FETPIP employment data in conjunction with supplemental educational data that includes demographic data

Pursuing data from other sources, including the National Academy Foundation (NAF)
– NAF is a network of over 500 career-themed academies across the country, which include engineering as a career theme.
– The data would provide longitudinal, student-level data at the national, state, and regional levels, including academic performance, demographic characteristics, and academy assessments.

Page 45: ICF_AEA_multipaper


Evaluation Challenges: Qualitative Data Collection

Multiple sources of qualitative data, including high schools with a focus on engineering technology, community colleges, and engineering technology industry partners
– Initial pilot study at one community college, one high school, and one industry partner
– Data collection and analyses are underway for community college and industry employers and recruiters
– Challenge recruiting high schools with relevant engineering technology programs

Participant and stakeholder buy-in across sites
– A particular challenge among area high schools

Page 46: ICF_AEA_multipaper


Evaluation Challenges: External Evaluation Considerations

Challenge: Monitoring progress with a flexible project design and dynamic work plan
– Tracking changing timelines
– Reporting on initial and updated benchmarks

Response: Task management and evaluation tools
– Monthly progress meetings
– Flexible workplan task tracker, designed to accommodate high-level and individual task changes

Page 47: ICF_AEA_multipaper


Workplan Tracker

Page 48: ICF_AEA_multipaper


Current Project Status and Next Steps

Qualitative pilot studies have been completed with high school students, community college students, and industry partners
– Pilot interviews were transcribed and thematically coded
– Initial findings indicate three main factors influencing engineering technology pathways and several pathways leading into engineering technology fields

Interview protocols have been finalized and community college students have been recruited into the study

20 in-depth interviews with industry members were completed, and transcriptions and analysis are underway

Ongoing challenges obtaining quantitative data; pursuing alternate data sources

Multiple publications in progress

Continue to monitor the research team’s progress and help overcome barriers

Page 49: ICF_AEA_multipaper


Contact Information

Program Evaluation
Barbara O’Donnel, Principal, [email protected]
T.J. Horwood, Senior Manager, [email protected]

Texas GEAR UP State Grant
Ashley Briggs, Senior Associate, [email protected]
Chuck Dervarics, Expert Consultant, [email protected]

Diplomas Now
Felix Fernandez, Technical Specialist, [email protected]
Aracelis Gray, Senior Manager, [email protected]

PathTech
Kristen Peterson, Senior Associate, [email protected]