Extension Program Evaluation: Michigan Planning and Reporting System (MI PRS) Winter 2011 Training

Transcript of Extension Program Evaluation: Michigan Planning and Reporting System (MI PRS) Winter 2011 Training.

Page 1

Extension Program Evaluation

Michigan Planning and Reporting System (MI PRS)

Winter 2011 Training

Page 2

Part of Planning Process

• Evaluation is an up-front activity in the design or planning phase of a program.
• Evaluation is not an after-program activity.

Page 3

Why Outcomes?

In a time of continued reductions in government funding, Extension professionals are challenged more than ever to document program outcomes and address stakeholder demands for accountability.

Page 4

Review of Part of Bennett's Hierarchy

As one moves up the hierarchy, the evidence of program impact gets stronger.

[Diagram of Bennett's hierarchy, from bottom to top: Resources, Activities, People Involvement, Reactions, KASA, Practice Change, End Results]

Page 5

• Collecting impact data on programs is costly, time consuming, and requires skill (but not impossible!).
• Extension professionals are expected to evaluate a minimum of one program a year at the impact level.

Page 6

Example

• A pre/post measure can assess short-term outcomes in knowledge, attitudes, skills, and aspirations (motivation to change); a minimal analysis sketch follows below.
• A plan for participant follow-up is required to assess behavior or practice change.
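As a hypothetical illustration of the pre/post idea above, the sketch below compares paired pre- and post-program scores on one knowledge item with a paired t-test. The scores, sample size, and the choice of scipy are assumptions for illustration, not part of the MI PRS training materials.

```python
# Minimal sketch: comparing paired pre/post knowledge scores.
# The scores below are hypothetical placeholders for one KASA item.
from scipy import stats

pre_scores = [3, 4, 2, 5, 3, 4, 2, 3]    # before the program
post_scores = [5, 6, 4, 6, 5, 6, 4, 5]   # same participants, after the program

# Paired t-test: did the mean score change from pre to post?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

mean_change = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)
print(f"Mean change: {mean_change:.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```

A result like this would speak only to short-term KASA change; the follow-up plan in the second bullet is still needed before claiming practice change.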

Page 7

Plan early

• Plan early for the cost, time, skills (data collection, analysis, interpretation), and resources needed to evaluate an Extension program.
• Work with Institute teams/groups.

Page 8

• Evaluating programs at the lower levels (inputs, participation, collaboration, activities, and reactions) may require little effort and is less expensive. This is process evaluation.

Page 9

Process Evaluation

• Process evaluation, also called formative evaluation, helps program staff assess ongoing programs for improvement and implementation.
• Examples: program fidelity, reaching target audiences (see the sketch below).
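A minimal sketch of how the two example process indicators might be quantified; the counts are hypothetical placeholders and not drawn from the training.

```python
# Hypothetical process (formative) evaluation indicators.
sessions_planned = 10
sessions_delivered_as_designed = 8        # basis for program fidelity
participants_total = 120
participants_in_target_audience = 90      # basis for target-audience reach

fidelity = sessions_delivered_as_designed / sessions_planned
reach = participants_in_target_audience / participants_total

print(f"Program fidelity: {fidelity:.0%}")
print(f"Target-audience reach: {reach:.0%}")
```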

Page 10

Outcome Evaluation

• Documenting impact or community-level outcomes requires skills in questionnaire development, data collection and analysis, interpretation, and reporting.

Page 11

• Summative evaluation, also called impact or outcome evaluation, may require an understanding of evaluation designs, data collection at multiple points, and sophisticated statistical analyses such as analysis of covariance (ANCOVA) and the use of covariates.
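As a rough illustration (not part of the training), the sketch below fits an ANCOVA-style model: post-program scores are regressed on group membership while adjusting for pre-program scores as a covariate. The data, column names, and the use of statsmodels are assumptions for illustration.

```python
# Minimal ANCOVA-style sketch: compare program vs. comparison group on
# post scores, using pre scores as a covariate. Data are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "group": ["program"] * 4 + ["comparison"] * 4,
    "pre":  [3, 4, 2, 5, 3, 4, 2, 5],
    "post": [5, 6, 4, 6, 3, 5, 2, 5],
})

# post ~ group + pre: the group coefficient estimates the adjusted difference
model = smf.ols("post ~ C(group) + pre", data=data).fit()
print(model.summary())
```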

Page 12

A Framework for Linking Costs and Program Outcomes Using Bennett's Hierarchy

Inputs, Activities, Participation, and Reactions fall under process (formative) evaluation; KASA, Practice/Behavior Change, and SEEC (social, economic, and environmental conditions) fall under outcome evaluation.

Cost & Outcomes | Inputs | Activities | Participation | Reactions | KASA | Practice/Behavior Change | SEEC
Short Term      |   X    |     X      |       X       |     X     |  XX  |           XXX            | ----
Intermediate    |   X    |     X      |       X       |   ----    |  XX  |           XXX            | XXXX
Long Term       |   X    |     X      |       X       |   ----    |  XX  |           XXX            | XXXX

X = low cost, effort, and evidence;
XX = requires questionnaire development, data collection, and analysis skills;
XXX = requires understanding of evaluation designs, multiple data collections, additional analysis skills, and interpretation;
XXXX = all of the above, plus more time and increased costs, potentially resulting in stronger evidence of program impact.

Page 13

Professional Development

• Plans for professional development are captured in MI PRS; consider building skills in evaluation.
• Work with Institute work teams to develop program evaluation plans that fit with logic models.

Page 14

To make an Evaluation Plan:

1. Decide if the program is ready for formative/process or summative/outcome evaluation.

2. Link program objectives to evaluation questions that address community outcomes.

Page 15

To make an Evaluation Plan, Cont.

3. Identify key indicators for evaluation (make sure they are measurable and relevant).

4. Consider evaluation costs (follow-up techniques and comparison groups used in summative designs are more expensive).

5. Develop a cost matrix (a sketch covering steps 1 through 5 follows below).
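To make steps 1 through 5 concrete, here is a hypothetical sketch of how a single program's evaluation plan and cost matrix could be recorded; the objective, evaluation question, indicator, and cost figures are invented for illustration and are not MI PRS fields.

```python
# Hypothetical evaluation plan entry covering steps 1-5 above.
evaluation_plan = {
    "evaluation_type": "summative/outcome",                                # step 1
    "objective": "Participants adopt at least one recommended practice",
    "evaluation_question": "What share of participants report a practice "
                           "change six months after the program?",         # step 2
    "indicators": ["self-reported practice change at 6-month follow-up"],  # step 3
    "cost_matrix": {                                                       # steps 4-5
        "questionnaire_development_hours": 20,
        "follow_up_survey_dollars": 1500,
        "comparison_group_recruitment_dollars": 800,
        "analysis_and_reporting_hours": 15,
    },
}

for key, value in evaluation_plan.items():
    print(f"{key}: {value}")
```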

Page 16

• Tracking program and project processes and outputs, as well as outcomes, will require data collection and analysis systems outside of MI PRS.
• Link program costs and the cost of evaluation to the outcomes (see the sketch below).
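One simple way to link costs to an outcome, shown only as a hypothetical example, is a cost-per-outcome figure; the dollar amounts and participant count below are invented.

```python
# Hypothetical cost-per-outcome calculation.
program_cost = 12_000.00       # delivery cost
evaluation_cost = 3_000.00     # cost of the follow-up evaluation
participants_with_practice_change = 60

cost_per_outcome = (program_cost + evaluation_cost) / participants_with_practice_change
print(f"Cost per participant reporting a practice change: ${cost_per_outcome:,.2f}")
```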

Page 17

Conclusion

In the end, evaluation questions that address the “so what” issue are connected to outcomes and costs, and ultimately justify the value of Extension programs to the public good.

Page 18

Key Reference

Radhakrishna, R., & Bowne, C. (2010). Viewing Bennett’s hierarchy from a different lens: Implications for Extension program evaluation. Journal of Extension, 48(6). Retrieved January 24, 2011, from http://www.joe.org/joe/2010december/tt1.php

Page 19

MSUE Resources

• Organizational Development webpage – Planning, Evaluation, and Reporting section

Page 20

Evaluation Resources will Grow!

Page 21

Other Extension materials on evaluation, with future MSU-specific resources to be released in 2011.

Page 22

MSU Evaluation Specialist

• Assists work teams in developing logic model objectives and evaluation strategies
• Consults on evaluation designs
• Provides guidance on data analysis and selecting measures
• Develops and delivers educational programs related to Extension program evaluation
• Facilitates evaluation plan development or brainstorming for Institute work teams

Page 23

Organizational Development team member

• Dr. Cheryl Peters, Evaluation Specialist
• Statewide coverage
• [email protected]
• 989-734-2168 (Presque Isle)
• 989-734-4116 (fax)
• Campus office: Room 11, Agriculture Hall
• Campus phone: 517-432-7605
• Skype: cpeters.msue

Page 24

MI PRS Resources