Conducting Programme Evaluation

Puja Shrivastav, JRF (UGC)

If the goal of evaluation is to improve a program, then no evaluation is good unless the findings are used to make a difference.

Programme Evaluation

Any evaluation that examines and assesses the implementation and effectiveness of specific instructional activities, in order to make adjustments or changes in those activities, is often labelled "process or programme evaluation."

The focus of process evaluation includes a description and assessment of the curriculum, teaching methods used, staff experience and performance, in-service training, and adequacy of equipment and facilities.

When to Conduct Evaluation?

The stage of programme development influences the reason for programme evaluation:

o The design stage.
o The start-up stage.
o While the programme is in progress.
o After the programme wraps up.
o Long after the programme finishes.

Steps of Conducting Evaluation

1. Planning for evaluation: identify the problem and review program goals.
2. Identify stakeholders and their needs: identify and contact the evaluation stakeholders.
3. Determine the evaluation purpose: revisit the purpose and objectives of the evaluation.
4. Decide who will evaluate: decide whether the evaluation will be in-house or contracted out.
5. Report results.
6. Justify conclusions.

1. Planning for Evaluation: Identify the Problem and Review the Goals

Focus on the mission and objectives of the instructional program.

Include information about its purpose, expected effects, available resources, the program’s stage of development, and instructional context.

Descriptions set the frame of reference for all subsequent planning decisions in an evaluation.

Planning for Evaluation

o Determine data-collection methods.
o Create the data-collection instrument.
o Test the data-collection instrument (see the sketch below).
o Evaluate the collected data.
o Summarize and analyze the data, and prepare reports for stakeholders.
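Purely as an illustration, here is a minimal sketch of what creating and testing a data-collection instrument could look like if the questionnaire were captured in code. The item wording, the 1-to-5 scale, and the validate_response helper are invented for this example and are not part of the original material.

```python
# Hypothetical questionnaire instrument: each item has an id, a prompt,
# and the allowed response scale (1 = strongly disagree ... 5 = strongly agree).
INSTRUMENT = [
    {"id": "Q1", "prompt": "The teaching methods helped me learn.", "scale": (1, 5)},
    {"id": "Q2", "prompt": "The equipment and facilities were adequate.", "scale": (1, 5)},
]

def validate_response(item: dict, answer: int) -> bool:
    """Testing the instrument: reject answers outside the allowed scale."""
    low, high = item["scale"]
    return low <= answer <= high

# Pilot the instrument with a few trial answers before real data collection.
trial_answers = {"Q1": 4, "Q2": 7}  # 7 is deliberately out of range
for item in INSTRUMENT:
    ok = validate_response(item, trial_answers[item["id"]])
    print(item["id"], "valid" if ok else "out of range")
```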

Planning for Evaluation: Gather Data

Data gathering focuses on collecting information that conveys a holistic picture of the instructional program.

Data gathering includes consideration of what indicators, data sources, and methods to use; the quality and quantity of the information; human-subject protections; and the context in which the data gathering occurs.

Create an Evaluation Plan

The evaluation plan outlines how to implement the evaluation, including:

i. Identification of the sponsor and the resources available for implementing the plan,
ii. What information is to be gathered,
iii. The research method(s) to be used,
iv. A description of the roles and responsibilities of sponsors and evaluators, and
v. A timeline for accomplishing tasks (a minimal sketch of such a plan follows this list).
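For illustration only, the plan elements above could be kept in a small structured record so they can be shared with sponsors and checked for completeness. The class and field names below (EvaluationPlan, sponsor, information_needs, methods, roles, timeline) are assumptions made for this sketch, not terms from the source.

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationPlan:
    """Hypothetical container for the five plan elements named above."""
    sponsor: str                      # i. who sponsors the evaluation
    resources: list[str]              # i. resources available for the plan
    information_needs: list[str]      # ii. what information is to be gathered
    methods: list[str]                # iii. research method(s) to be used
    roles: dict[str, str]             # iv. roles of sponsors and evaluators
    timeline: dict[str, str] = field(default_factory=dict)  # v. task -> deadline

    def missing_elements(self) -> list[str]:
        """Flag any plan elements that are still empty."""
        return [name for name, value in vars(self).items() if not value]

# Example use: an incomplete plan is flagged before the evaluation starts.
plan = EvaluationPlan(
    sponsor="School board",
    resources=["evaluation budget", "staff time"],
    information_needs=["student attainment", "teacher feedback"],
    methods=["questionnaire", "classroom observation"],
    roles={"sponsor": "fund and review", "evaluator": "collect and analyze data"},
)
print(plan.missing_elements())  # -> ['timeline']
```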

2. Identify stakeholders and their needs

Stakeholders are the individuals and organizations involved in program operations, those served or affected by the program, and the intended users of the assessment or evaluation.

Stakeholder needs generally reflect the central questions which they have about the instructional activity, innovation, or program.

Determining stakeholder needs helps to focus the evaluation process so that the results are of the greatest utility.

Three Principal Groups of Stakeholders

Persons involved in program operations
◦ Staff and partners

Persons affected or served by the program
◦ Clients, their families and social networks, providers, and community groups

Intended users of the evaluation findings
◦ Policy makers, managers, administrators, advocates, funders, and others

3. Determining the evaluation purpose

The general purposes for instructional evaluations are:

a. Gain Insight
o Assess needs and wants of community members
o Identify barriers to use of the program
o Learn how best to describe and measure program activities

b. Change Practice - to improve the quality, effectiveness, or efficiency of instructional activities.

Determining the evaluation purpose

◦ Refine plans for introducing a new practice
◦ Determine the extent to which plans were implemented
◦ Improve educational materials
◦ Enhance cultural competence
◦ Verify that participants' rights are protected
◦ Set priorities for staff training
◦ Make mid-course adjustments
◦ Clarify communication
◦ Determine if client satisfaction can be improved
◦ Compare costs to benefits
◦ Find out which participants benefit most from the program
◦ Mobilize community support for the program

Determining the evaluation purpose

c. Measure Effects of the Program: to examine the relationship between instructional activities and observed consequences.

◦ Assess skills development by program participants
◦ Compare changes in behavior over time
◦ Decide where to allocate new resources
◦ Document the level of success in accomplishing objectives
◦ Demonstrate that accountability requirements are fulfilled
◦ Use information from multiple evaluations to predict the likely effects of similar programs

Determining the evaluation purpose

d. Affect Participants

◦ Empower program participants (for example, being part of an evaluation can increase community members' sense of control over the program);

◦ Supplement the program (for example, using a follow-up questionnaire can reinforce the main messages of the program);

◦ Promote staff development (for example, by teaching staff how to collect, analyze, and interpret evidence); or

◦ Contribute to organizational growth (for example, the evaluation may clarify how the program relates to the organization's mission).

Determining the evaluation purpose

◦ Reinforce messages of the program
◦ Stimulate dialogue and raise awareness about community issues
◦ Broaden consensus among partners about program goals
◦ Teach evaluation skills to staff and other stakeholders
◦ Gather success stories
◦ Support organizational change and improvement

Identify Intended Uses

Intended uses are the specific ways evaluation results will be applied.

They are the underlying goals of the evaluation, and are linked to the central questions of the study that identify the specific aspects of the instructional program to be examined.

The purpose, uses, and central questions of an evaluation are all closely related.

4. Decide Who Will Evaluate

Decide whether the evaluation will be conducted in-house or contracted out.

In-house: the principal, teachers, students, or parents.

Contracted out: agencies can be hired to help, and retired professionals from the same field can also evaluate the program.

5. Reporting Results: Analyze Data

Data analysis involves identifying patterns in the data, either by isolating important findings (analysis) or by combining sources of information to reach a larger understanding (synthesis), and making decisions about how to organize, classify, interrelate, compare, and display information.

These decisions are guided by the questions being asked, the types of data available, and by input from stakeholders.
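As a purely illustrative sketch (the ratings, the element names, and the observation notes below are invented), the difference between analysis and synthesis might look like this when working with collected data:

```python
from statistics import mean

# Invented example data: student ratings (1-5) per instructional element,
# plus a second source of evidence from classroom observations.
ratings = {
    "lectures": [4, 5, 4, 3, 5],
    "lab_sessions": [2, 3, 2, 2, 3],
    "course_materials": [4, 4, 5, 4, 4],
}
observation_notes = {"lectures": "high engagement", "lab_sessions": "equipment shortages"}

# Analysis: isolate an important finding from a single source --
# here, the element with the lowest average rating.
averages = {element: mean(scores) for element, scores in ratings.items()}
weakest = min(averages, key=averages.get)
print(f"Analysis: lowest-rated element is {weakest} ({averages[weakest]:.1f}/5)")

# Synthesis: combine sources to reach a larger understanding --
# pair each element's average rating with the observer's note.
for element, avg in averages.items():
    note = observation_notes.get(element, "no observation recorded")
    print(f"Synthesis: {element}: average {avg:.1f}/5; observation: {note}")
```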

Report results

Factors to consider when reporting results, or dissemination, include tailoring report content for a specific audience, explaining the focus of the study and its limitations, and listing both the strengths and weaknesses of the study.

It may also include the reporting of active follow-up and interim findings.

Reporting interim findings is sometimes useful to instructors or staff in making immediate instructional adjustments.

Report results

Describe the accomplishments of the program, identifying those instructional elements that were the most effective;

Describe instructional elements that were ineffective and problematic as well as areas that need modifications in the future; and

Describe the outcomes or the impact of the instructional unit on students. 

Report results

Complete documentation will make the report useful for making decisions about improving curriculum and instructional strategies.

In other words, the evaluation report is a tool supporting decision making, program improvement, accountability, and quality control in curriculum.

This will help in reframing the curriculum, if needed.

Make conclusions and recommendations

Conclusions are linked to the evidence gathered and judged against agreed-upon standards set by stakeholders.

Recommendations are actions for consideration that are based on conclusions but go beyond simple judgments about efficacy or interpretation of the evidence gathered.

Justify the Conclusions

Conclusions become justified when they are linked to the evidence gathered and judged against agreed-upon values set by the stakeholders.

Stakeholders must agree that conclusions are justified in order to use the evaluation results with confidence.

The principal elements involved in justifying conclusions based on evidence are:

Justify the Conclusions

Standards: Standards reflect the values held by stakeholders about the program. They provide the basis for making program judgments.

Analysis and synthesis: Analysis and synthesis are the methods used to discover and summarize an evaluation's findings.

Interpretation: Interpretation is the effort to figure out what the findings mean; it goes beyond simply uncovering facts about a program's performance.

Judgments: Judgments are statements about the merit, worth, or significance of the program.

Recommendations: Recommendations are actions to consider as a result of the evaluation (a short sketch linking these elements follows).
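The elements above can be read as a chain from evidence to recommendation. The sketch below is only an illustration: the satisfaction scores, the 3.5 threshold, and the wording of the judgment and recommendation are invented assumptions, not part of the source.

```python
from statistics import mean

# Invented evidence: participant satisfaction scores on a 1-5 scale.
scores = [4, 3, 5, 4, 2, 4]

# Standard: a value agreed with stakeholders (hypothetical threshold).
SATISFACTION_STANDARD = 3.5

# Analysis/synthesis: summarize the evidence.
average_score = mean(scores)

# Interpretation and judgment: compare the finding against the standard.
meets_standard = average_score >= SATISFACTION_STANDARD
judgment = "satisfactory" if meets_standard else "below the agreed standard"

# Recommendation: an action to consider, based on the judgment.
recommendation = (
    "Continue the programme as designed."
    if meets_standard
    else "Revise the instructional activities and re-evaluate next term."
)

print(f"Average score {average_score:.2f} vs standard {SATISFACTION_STANDARD}: {judgment}")
print(f"Recommendation: {recommendation}")
```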

Standards for Effective Evaluation

(Diagram) The four standards (utility, feasibility, propriety, and accuracy) sit at the centre of six evaluation steps: engage stakeholders, describe the program, focus the evaluation design, gather credible evidence, justify conclusions, and ensure use and share lessons learned.

The Four Standards

Utility: Who needs the information, and what information do they need?

Feasibility: How much money, time, and effort can we put into this?

Propriety: What steps need to be taken for the evaluation to be ethical?

Accuracy: What design will lead to accurate information?

Standard: Utility

Ensures that the information needs of intended users are met.

Who needs the evaluation findings?
What do the users of the evaluation need?
Will the evaluation provide relevant (useful) information in a timely manner?

Standard: Feasibility

Ensures that evaluation is realistic, prudent, diplomatic, and frugal.

Are the planned evaluation activities realistic given the time, resources, and expertise at hand?

Standard: Propriety

Ensures the evaluation is conducted legally, ethically, and with due regard for the welfare of those involved and those affected.

Does the evaluation protect the rights of individuals and protect the welfare of those involved?

Does it engage those most directly affected by the program and by changes in the program, such as participants or the surrounding community?

Standard: Accuracy

Ensures that the evaluation reveals and conveys technically accurate information.

Will the evaluation produce findings that are valid and reliable, given the needs of those who will use the results?

Utilizing the Evaluation Result

The evaluator records the actions, features, and experiences of students, teachers, and administrators. People who read the report will be able to visualise what the setting looks like and the processes taking place, and so will understand the areas that require improvement.

The evaluator interprets and explains the meaning of the events reported by putting them in context: for example, why academically weak students were motivated to ask questions, why reading comprehension skills improved, why enthusiasm for doing science experiments increased, and so forth.

Utilization of Evaluation Results

Use of available resources: organization of staff for learning, and administrative and physical conditions.

Decision areas of the teacher: identifying the objectives and selecting the teaching-learning process.

Communication: carried out properly with the stakeholders.

Utilization of Evaluation Results

The results show whether or not the information needs of the intended users have been met; if not, the further recommendations made by the evaluator can be acted on.

The feedback obtained can be used to revise and improve instruction, or to decide whether or not to adopt the programme before full implementation.

It also supports the development of the overall programme or curriculum.
