Monitoring and Evaluation (M&E) for Community Driven Development (CDD)
Programs – Introduction to Concepts and Examples
Janmejay Singh, CDD Coordinator
Social Development Department, The World Bank
(Adapted from Susan Wong CDD M&E Curriculum)
May 2012
Overview
1. Introduction – M&E Basics; why it's important and why it's tough
2. Elements of a Good M&E system for CDD – what we monitor/evaluate and how
3. MIS and Reporting Systems – some tips
4. Results Frameworks for CDD projects (in next session)
M&E Basics
Monitoring: Project managers learn what works, what doesn't, and why. Measures progress against work plans. Provides feedback for real-time decision-making.
Evaluation: What impact are we having? Have we reached the project's stated goals over the long term? Assesses changes in the well-being of individuals, households, communities, etc. that can be attributed to a particular project/program.
Studies: Test innovative approaches to poverty reduction. Pioneer new approaches to decide if they should be expanded and pursued on a larger scale.
Why Monitor Results or Evaluate Impact?
• To measure success in achieving goals and objectives
• Helps us take corrective actions during implementation, improve, and learn from mistakes in future
• Promotes efficiency and accountability
• Makes 'development effective' – builds the case for more aid
• What gets measured gets done!
Despite this, the emphasis on evaluation and results has only emerged over the last 10-15 years
Why M&E for CDD is more necessary and more challenging…
• Dispersed and Large Scale (can cover 1000s of communities):
  • Dispersed implementation makes M&E for CDD extremely important
  • Flow of information into the MIS is more difficult to manage
  • Loss of control/comparison groups makes strategies for impact evaluation more difficult to implement
• Diversity of CDD Programs:
  • Open menu for sub-projects means investments are not predetermined and are multi-sectoral – can cover a combination of infrastructure, human, social and economic activities
  • Participant communities may be unknown beforehand
  • Objective of good governance and building social capital necessitates evaluating less easily measurable goals (e.g. transparency, accountability, empowerment)
• Lack of In-country Capacity to Conduct M&E Work:
  • CDD operations often operate in fragile/emergency/poor contexts
  • Government: concept, design and analytical capacity is relatively weak; firms contracted to conduct studies do not provide data of high enough quality to support adequate research
  • Research institutions: the small number of institutions are rarely strong in "full service" (design, analysis and report writing) capability
Elements of a Good M&E System for CDD
• Results Framework
• Reporting and Management Information System (MIS) – sub-project data, HR data, financial data, data on trainings, etc.
• Evaluation Strategy
• Complaints Handling System
• Participatory M&E/Social Accountability Tools
• Special Studies
Note: most projects don't have all of these, and even when they do, they aren't linked to each other.
What Do We Normally Monitor?
• The monitoring system (regular reporting, MIS, implementation and post-implementation monitoring, community monitoring, technical and financial supervision) should be able to answer questions of process.
• That is: achievement of intermediate results, and progress against the work plan (inputs, outputs).
• Goal is to provide real-time feedback for decision making.
• Examples: Are funds being used as planned? Are project interventions reaching the intended beneficiaries? What is the quality of inputs? Are poor, women, and vulnerable groups participating in the process?
Methods for Monitoring
Examples of monitoring mechanisms in CDD projects:
• Field monitoring by project staff
• Participatory/community monitoring by village committees using SA tools (e.g. Community Scorecards, social audit, etc.)
• Independent/third-party monitoring by NGOs, civil society groups, journalists, media
• Grievance & complaint resolution mechanisms
• Financial reviews, audits, procurement post reviews
• Special/case studies by independent researchers
• Bank supervision missions
Types of Data Collection Methods
A continuum from informal/less structured to more structured/formal methods:
• Reviews of official records
• Field visits
• Community interviews
• Participant observation
• Direct observation
• Key informant interviews
• Focus group interviews
• Citizen Report Card surveys, Community Scorecards
• One-time survey questionnaires
• Panel surveys
• Census
Adapted from "Designing and Building a Results-Based Monitoring and Evaluation System: A Tool for Public Sector Management", World Bank, 2000
Monitoring in Action
Photos: community scorecards being prepared in the Malawi Social Fund; members of a Community Participatory Monitoring Group in Aceh discussing findings.
What would we like to evaluate for impact?
A. Poverty/Welfare Dimensions
Has CDD reduced poverty? Has it reached the poor?
What are the impacts on livelihoods and employment?
B. Infrastructure
Has CDD improved access to services, quality, utilization?
Are CDD projects cost effective compared to other mechanisms?
Is the infrastructure maintained?
C. Local Governance/Empowerment
Do CDD projects promote improvement in local governance?
Do they build stronger, more responsive local institutions?
Has transparency, participation, inclusion (esp. of women/vulnerable groups) increased?
D. Social Dynamics
Do CDD projects improve social relations and cohesion?
Do they reduce incidents of conflict?
How Do We Evaluate?
Key guiding principles in impact evaluation:
• The counterfactual – comparison/control groups
• Sample size large enough to generate statistically significant results
• Baseline data
• A mix of quantitative & qualitative methods is ideal
Types of Evaluation Work
• Impact Evaluation: rigorous quantitative evaluations which attribute impact on outcome indicators to the project (e.g. Philippines Kalahi-CIDSS or Afghanistan NSP)
  • Purpose: establish effectiveness of the project in achieving development objectives
  • Best practice: treatment and control groups measured ex-ante and ex-post project implementation, through:
    • Randomized Controlled Trials (RCTs)
    • Non-/quasi-experimental techniques (e.g. Propensity Score Matching)
  • Outcome indicators based on overall project objectives (including per capita consumption, access to health care and education, employment)
  • Qualitative component to determine how and why impacts are occurring
Types of Evaluation Work (contd.)
• Infrastructure Studies: rigorously developed methods using economists and engineers to assess sub-project infrastructure (e.g. Burkina Faso Community-Based Rural Development Project)
  • Purpose: establish effectiveness of the project as an infrastructure delivery system
  • EIRR: direct and indirect economic impact on the local economy
  • Quality: based on existing standards
  • Cost-effectiveness: relative to equivalent government construction
• Thematic Evaluations: can be done on specific issues of interest (e.g. gender impacts, procurement, micro-finance, corruption, etc.), normally using qualitative approaches (e.g. PNPM Marginalized Groups study)
  • Smaller sample size allows methodological flexibility
  • Techniques include focus group discussions, key informant interviews and direct observation
• Randomized Pilots for new programs: offer a mechanism to rigorously test new design options during the umbrella project's implementation period, using smaller-scale interventions with rigorous impact evaluation built in (e.g. TASAF Community-Based Conditional Cash Transfer Pilot)
Keys to Effectiveness of M&E Systems
• Build evaluation work into the project work plan and budget, and sustain it over time
  • Ex-ante preparation, including M&E framework and baseline, in place before implementation
  • Trust funds (e.g. SIEF, JSDF) often required to allow flexibility in resource disbursement relative to loan funds, particularly at baseline
• Utilize the most rigorous methods available
  • Specific talent and skills needed in M&E personnel
  • Careful balance between cost, timeliness and rigor of methods
  • PNPM/KDP experience: full-time M&E staff plus numerous external consultants and experts
Keys to Effectiveness of M&E Systems (contd.)
• Clear understanding of what questions are being answered and how indicators are being measured
  • Strong interaction with and input from the project team at every stage of the evaluation process
  • Timeline for evaluation fits the expected trajectory of impact for the outcomes being measured
  • Good practice: research questions and indicators based on consultation with counterparts and the project results framework
• Link findings to improvement of operations
  • Commitment from counterparts and project team to use findings to improve the project
  • Use M&E inputs for adaptive management and learning
  • Dissemination strategy which effectively communicates results and recommendations
  • M&E resources used to test new design features and previously unexplored research topics; findings have led to changes in project design and implementation
Some Common Constraints Faced
• Difficulties in reporting and collecting real-time, accurate data, especially in large, complex operations
• Problems with MIS – garbage in/garbage out
• Lack of proper analysis of data
• Lack of in-country specialized skills and capacity, esp. for impact evaluations
• Time to prepare baselines, evaluations, etc.
• Maintaining control/comparison groups over a long enough time to measure impacts
Various Levels of Reporting Exist
• Reporting from Implementing Agency to WB
  - FMRs
  - Semi-annual progress reports
• Implementing Agency to various constituents at multiple levels
  - Annual reports to general public – e.g., national and district parliaments, village meetings
• Within-project reporting
  - Village/commune >> district >> province >> national & vice versa
Within Project Reporting
Sample CDD reporting & information systems (databases):
• Basic project & subproject management information
• Human resources
• Financial information
• Capacity building/training
• Grievance redress
See sample list of indicators for each database.
Within Project Reporting: Developing the MIS
• Step 1: Refer to your overall questions and Results Framework.
• Step 2: Compose a list of key indicators to monitor and track periodically for reporting.
• Step 3: Check the list with stakeholders, including project managers and the WB.
• Step 4: Design the reporting system and forms. Usually included in the Project Operations or Implementation Manual. The reporting section should describe the reporting system, including: (i) information flows; (ii) frequency; (iii) who's responsible; and (iv) reporting formats, if any. An annex usually includes specific reporting formats.
Tips:
• Ensure forms are self-explanatory and have instructions and examples
• Field test forms prior to mainstreaming
• Ensure that financial and project reports match
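The "field test forms" and "financial and project reports match" tips above can be sketched as a machine-checkable form schema. This is a hedged illustration only: the field names and rules below are invented, not taken from any project's Operations Manual.

```python
# Hypothetical sketch: a reporting form defined in the Operations Manual is
# mirrored by a machine-readable schema, so submitted forms can be checked
# during field testing before the system is mainstreamed. All field names
# and rules here are illustrative assumptions.

FORM_SCHEMA = {
    "village_code": str,      # should match the national location codes
    "subproject_id": str,
    "reporting_period": str,  # e.g. "2012-Q1"
    "funds_disbursed": float,
    "funds_spent": float,
}

def validate_report(report: dict) -> list:
    """Return a list of problems found in a submitted report form."""
    problems = []
    for field, expected_type in FORM_SCHEMA.items():
        if field not in report:
            problems.append(f"missing field: {field}")
        elif not isinstance(report[field], expected_type):
            problems.append(f"bad type for {field}: expected {expected_type.__name__}")
    # A simple cross-check so financial and project figures stay consistent:
    # spending should not exceed disbursement
    if not problems and report["funds_spent"] > report["funds_disbursed"]:
        problems.append("funds_spent exceeds funds_disbursed")
    return problems

# Example: a form missing its reporting period
report = {"village_code": "ID-3201", "subproject_id": "SP-001",
          "funds_disbursed": 1000.0, "funds_spent": 1200.0}
print(validate_report(report))
```

Running such checks at entry time is one way to catch problems while the reporter can still fix them, rather than at aggregation time.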
Developing the MIS Database
• Step 1: Hire a specialized MIS firm or specialist.
• Step 2: Review the indicators list and reporting forms to see which fields should be included in the MIS. Review the flow of reports. Be specific about what information you expect from the MIS and how often.
• Step 3: See what national codes are available for locations.
• Step 4: Specialists go to the field to observe the reporting context and speak to field staff.
• Step 5: Design the program together with project management.
• Step 6: Run several tests of the system before mainstreaming.
• Step 7: Training in the field for consultants and data entry personnel.
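As a minimal sketch of what the steps above produce, the following uses an in-memory SQLite database with invented table and field names (nothing here is prescribed by the presentation): location codes are reused, subproject fields are derived from the reporting forms, and a query generates the kind of periodic report that was specified up front.

```python
# Illustrative-only MIS sketch: table names, codes and amounts are assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE locations (          -- Step 3: reuse national location codes
    code TEXT PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE subprojects (        -- Step 2: fields drawn from reporting forms
    id TEXT PRIMARY KEY,
    location_code TEXT NOT NULL REFERENCES locations(code),
    sector TEXT NOT NULL,         -- open menu: roads, irrigation, etc.
    grant_amount REAL NOT NULL,
    status TEXT NOT NULL          -- planned / ongoing / completed
);
""")
conn.executemany("INSERT INTO locations VALUES (?, ?)",
                 [("ID-3201", "District A"), ("ID-3202", "District B")])
conn.executemany("INSERT INTO subprojects VALUES (?, ?, ?, ?, ?)",
                 [("SP-001", "ID-3201", "roads", 12000.0, "completed"),
                  ("SP-002", "ID-3201", "irrigation", 8000.0, "ongoing"),
                  ("SP-003", "ID-3202", "roads", 5000.0, "completed")])

# A periodic report: grants committed and completion counts by district
rows = conn.execute("""
    SELECT l.name, COUNT(*) AS n, SUM(s.grant_amount) AS total,
           SUM(s.status = 'completed') AS done
    FROM subprojects s JOIN locations l ON s.location_code = l.code
    GROUP BY l.name ORDER BY l.name
""").fetchall()
for name, n, total, done in rows:
    print(f"{name}: {n} subprojects, {total:.0f} granted, {done} completed")
```

A real MIS would be far larger, but even this scale shows why Step 6 (testing before mainstreaming) matters: the report query only works if the forms feed the fields it expects.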
Common Problems Faced & Ways to Avoid Them
• Reporting forms/MIS overly complex >> Start simple, field test first
• Project is not clear on what forms/info need to be included in MIS >> Review reporting needs carefully, consult stakeholders, start simple
• Project relies on Excel spreadsheets as a database >> DON'T
• MIS specialist/firm receives no guidance from program management >> Provide regular guidance and clear instructions in the TOR and during the consultancy
• MIS doesn't take into account capability of data entry staff or time needed >> Field test, provide training, start simple
Common Problems Faced & Ways to Avoid Them (cont.)
• Different parts of the MIS don't match >> Hold regular meetings to tally information and fix anomalies
• No feedback on reports, leading to "garbage in, garbage out" >> Provide feedback on reports, do not collect information if it's not needed, revise the reporting system over time
• No analysis of data; reports from different units simply stapled together >> Need skilled project staff responsible for aggregating and analysing data and providing feedback
• No sanctions for non-reporting >> Sanctions are usually needed, e.g., withholding of disbursements
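The "tally information and fix anomalies" routine can itself be automated as a first pass before the meeting. The sketch below is hypothetical (database names and amounts are invented): it compares disbursement totals recorded in the financial database against those in the subproject database and flags every disagreement for follow-up instead of silently aggregating them.

```python
# Hypothetical reconciliation sketch; the two "databases" are stand-ins for
# the financial and subproject parts of a real MIS.
finance_db = {"SP-001": 12000.0, "SP-002": 8000.0, "SP-003": 5000.0}
subproject_db = {"SP-001": 12000.0, "SP-002": 7500.0}  # SP-003 not yet reported

def reconcile(finance, subprojects, tolerance=0.01):
    """Return (subproject ID, problem) pairs where the two records disagree."""
    anomalies = []
    for sp_id in sorted(set(finance) | set(subprojects)):
        a, b = finance.get(sp_id), subprojects.get(sp_id)
        if a is None or b is None:
            anomalies.append((sp_id, "missing in one database"))
        elif abs(a - b) > tolerance:
            anomalies.append((sp_id, f"mismatch: {a} vs {b}"))
    return anomalies

for sp_id, problem in reconcile(finance_db, subproject_db):
    print(sp_id, problem)
```

The point is not the code but the discipline: anomalies become an explicit list someone must clear, which is exactly what the regular tally meetings are for.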
Traditional Logframe to WB Results Framework
• Logical Framework Approach developed in the late 1960s for USAID.
• Management tool mainly used in the design, monitoring and evaluation of international development projects.
• Used widely by bilateral and multilateral donors, some NGOs.
• Move to focus more on intermediate results (rather than outputs) in the late '90s by some donors. >> Results Framework
Traditional Logframe
A 4x4 matrix. Columns: Hierarchy of Objectives; Objectively Verifiable Indicators; Means of Verification; Important Assumptions/Risks. Rows (hierarchy of objectives): Goal; Purpose; Outputs; Activities.
WB Results Framework
• Results Framework (RF) summarizes what the project is trying to achieve and how results are measured, monitored and evaluated.
• It is a variation of the traditional Logical Framework.
• Links together the Project Development Objective (PDO), the intermediate outcomes/outputs to be delivered by each component, and the indicators to be used to measure results.
• The RF is useful for both project management and supervision by NSP, WB, donors and other stakeholders. Helpful to track progress towards the PDO and to make changes if necessary during implementation.
Results Framework Template
Columns: Indicators | Baseline | Target values (Yr 1, Yr 2, etc.) | Frequency & Reports | Data Collection Instruments | Responsibility for Data Collection
Rows: PDO Indicator; Intermediate Outcome/Output Indicators
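One way to make the template concrete is to treat each row of the Results Framework as a structured record, so the MIS can track baselines and targets per indicator. The example indicator, values and field names below are invented purely for illustration.

```python
# Illustrative sketch: an RF row as a structured record. The indicator and
# all values are hypothetical, not from any actual project.
from dataclasses import dataclass

@dataclass
class IndicatorRow:
    name: str
    level: str            # "PDO" or "Intermediate Outcome/Output"
    baseline: float
    targets: dict         # project year -> target value
    frequency: str        # e.g. "semi-annual, progress reports"
    instrument: str       # e.g. "MIS", "household survey"
    responsibility: str   # who collects the data

row = IndicatorRow(
    name="Share of villages with completed development plans (%)",
    level="Intermediate Outcome/Output",
    baseline=0.0,
    targets={1: 40.0, 2: 75.0, 3: 95.0},
    frequency="semi-annual, progress reports",
    instrument="MIS",
    responsibility="Implementing agency M&E unit",
)
print(row.targets[2])
```

Keeping the template in this machine-readable form makes it easy to compare reported values against the year's target during supervision.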
The generic CDD results chain
INPUTS
• Funds (loan, govt & community contributions)
• Technical assistance (design, program rules)

OUTPUTS
• Community participation in activities
• Small-scale infrastructure: e.g. roads, irrigation, health centers, schools built, of high quality & tailored towards community needs
• Income-generating activities supported
• Training provided to communities

INTERMEDIATE RESULTS
• Building social capital (trust, association, community activities)
• Improved access and use of services – e.g. access to roads and markets, school enrollment, attendance, professional deliveries, access to healthcare
• Community skills improvements
• Jobs created

LONGER TERM OUTCOMES
• Community empowerment
• Improved local governance
• Household welfare increases (consumption, income, assets)
• Sustainable job creation
• Improved educational and health outcomes

RISKS/ASSUMPTIONS
• Funds are available and disburse in a timely manner
• Design is sound & promotes real participation rather than patronage
• Qualified project staff are in place
• Communities are given genuine opportunities to receive info & participate
• TA & capacity building provided is sufficient and of high quality
• Quality of supply-side interventions
• Economic growth
• Enabling environment for social, political reforms
• External shocks are minimized (economic, financial crises, natural disasters)
Some Typical Questions
IMPACT LEVEL – Does the CDD Project:
• reduce poverty in the long term? (Long-term Outcome/Goal)
• improve access/use of services (clean water, sanitation, roads, markets, irrigation, health, education, etc.)? (PDO/Intermediate Outcome level)
• contribute to local economic growth (job creation, local economic development)? (PDO level)
• provide high quality services and public goods to the community? (Intermediate Outcome level)
• Is project infrastructure cost-effective? Rates of return? (PDO/IO level)
• Is the project sustainable? (Depends on meaning – PDO/IO level)
Policy Questions (cont.)
Does your CDD Project:
• strengthen local representative bodies, e.g. village committees? (PDO or IO level)
• increase citizens' participation in local development? (IO level)
• improve women's participation and voice? (IO level)
• improve community satisfaction with government and services? (PDO/IO level)
• improve social capital? (IO level)
• help with conflict mitigation/resolution? (PDO/IO level)
Process Questions
• Is the CDD Project proceeding according to plan? (IO level)
• Are results (project intermediate objectives/outputs) being achieved? (IO Component level)
  - Councils/committees formed, re-elected, strengthened?
  - Local development plans completed?
  - Block grants distributed and used appropriately?
  - Subprojects implemented and maintained?
• Quality issues: participation, technical quality, fund usage, transparency (IO Component level)
Results Framework Tips
PDO:
• Pay attention to the wording of the Project Development Objective
• Only one PDO
Indicators:
• Rule of thumb: 1-3 indicators per PDO level and per IO level
• Make sure indicators reflect your PDO and IO statements
• Ensure indicators are as unambiguous as possible
Implementation Arrangements Table:
• As much as possible, insert baseline values. You have until the 1st ISR.
• Think carefully about the timing of accomplishments, esp. in early years of the project when start-up may be slow or delayed due to procurement.
Overall:
• Don't get into "wordsmithing" as a group
• At some point, TTL and Govt decide & bring closure
Finally, consider a Results Framework Workshop during Design
Several options:
(1) Use as a full design workshop to go over the entire project design with all stakeholders – normally 4-5 days, depending upon project complexity
(2) Use just for M&E purposes (indicators & implementation arrangements) – 1-2 days
(3) Do it within the task team and with key members of Govt.
The Results Chain…
INPUTS → OUTPUTS → OUTCOMES → IMPACT
• Inputs refer to activities undertaken, and physical or financial resources spent by, the project/program. These are directly under the control of the project/program.
• Outputs refer to the assets, events, goods and services produced by a project/program. These too are directly under the control of the project/program.
• Outcomes refer to the use, adoption, uptake, or other behavior change among target beneficiaries induced by the outputs of a project/program. These are not directly under the control of the project/program, but are expected to materialize with effective implementation.
• Impacts are the long-term well-being and development goals that may result from the program/project. Impacts depend on many other things and only occur over time.
Outcomes and impacts are the "results" that development agencies now want to focus on and hold projects accountable for.
An example: A typical Water and Sanitation Project
• INPUTS: Construction of wells and sanitary facilities; training on hygiene & sanitation
• OUTPUTS: Number of wells and toilets built; number of village members trained
• OUTCOMES: Increased access (shorter distance and time) to water and sanitation facilities; better quality water; adoption of better hygiene methods
• IMPACT: Improved health and lower disease incidence; lower mortality from water-borne diseases
Reporting on inputs and outputs falls under the realm of "monitoring", which gives real-time feedback on progress against workplans. Measurement of outcomes and impacts comes under the area of "evaluation".
How Far Can the Project Manager Reach?
• Activities and Intermediate Results/Outputs lie within the manager's "manageable interests".
• The Project Development Objective and Goal rest on "development hypotheses".
Source: Adapted from Team Technologies, Inc.
What is a SMART indicator?
• Specific = indicator is related to the results we want to achieve and only to those results.
• Measurable = indicator is very clearly defined and that there is an agreement about how to measure it.
• Attributable = changes in the indicator can be attributed to the project.
• Realistic = target for the indicator can be achieved in a practical manner.
• Timely = indicator can be reported upon in a timely manner.
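The five criteria above can be run as a review checklist for each proposed indicator. The sketch below is an invented illustration: the example indicator and the yes/no answers are hypothetical, and in practice each criterion is a judgment call made with the project team rather than a boolean.

```python
# Illustrative-only SMART checklist; the indicator and answers are assumptions.
SMART = ["specific", "measurable", "attributable", "realistic", "timely"]

def smart_review(indicator: dict) -> dict:
    """Return criterion -> True/False based on the reviewer's answers."""
    return {criterion: bool(indicator.get(criterion)) for criterion in SMART}

indicator = {
    "name": "Women's attendance at village planning meetings (%)",
    "specific": True,       # tied to one result: inclusive participation
    "measurable": True,     # counted from meeting attendance sheets
    "attributable": True,   # the meetings are convened by the project
    "realistic": True,      # data already collected by facilitators
    "timely": False,        # attendance sheets reach the MIS with long lags
}
review = smart_review(indicator)
print([c for c, ok in review.items() if not ok])  # the criteria to fix
```

Any criterion that comes back False is a prompt to redefine the indicator or its data collection, not a reason to discard it.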
Example from Indonesia
PNPM-Rural/KDP spent <1% of the program budget on M&E, producing over 75 studies in 10 years.
The system was set up along 3 pillars:
• Monitoring – field monitoring, community participatory monitoring, case studies, NGO/journalist independent monitoring, grievance/complaints, financial reviews, audits, supervisions
• Evaluation – impact evaluation, audits, special thematic studies (infrastructure, loans, cost-effectiveness, corruption, communications)
• Studies – "shine a light on dark areas"; learn what works and what doesn't; move the knowledge frontier forward; not project-specific (poverty targeting, corruption study, village finances and O&M)
KDP/PNPM-Rural Findings
Poverty Impacts
• The 2008 impact evaluation of KDP2 showed real per capita consumption gains 11% higher among poor households in treatment compared to control areas.
• The proportion of households moving out of poverty in poor subdistricts was 9.2% higher in KDP2 areas compared with control.
• Vulnerable households near the poverty line were less at risk of falling into poverty as a result of KDP participation.
• Limited impacts on per capita consumption and poverty for female-headed households and households whose heads had no primary education.
• Unemployment: unemployment rates in control areas increased by 1.5% more than in KDP2 areas.
KDP/PNPM-Rural Findings (cont)
Infrastructure (75% of block grant funds)
• Cost effectiveness – on average 56% less expensive than equivalent works under the Ministry of Public Works
• High rates of return – average EIRR of 39% to 68%
• High quality infrastructure – periodic independent evaluation found that 94% of 108 projects sampled were ranked good or very good technical quality
KDP/PNPM-Rural Findings (cont)
• Participation rates: Community participation is high. Participation of women in PNPM-Rural meetings averaged 48% in 2008. Nearly 60% of those who attend KDP/PNPM-Rural planning meetings are from the poorer segments of the community. A recent impact evaluation and gender review did find, however, that PNPM could do much more to promote participation of women and vulnerable groups.
• Promoting stability:
  - In 2 provinces studied using qualitative techniques, KDP had significantly contributed to improvements in inter-group relations. Where a village had had the program for longer, impacts were greater.
  - No macro impact on levels of violent conflict.
Examples of how PNPM/KDP M&E findings were used
Example 1: For expansion
• Evaluations and field reports found that the program was popular in communities, delivered the goods, and had high rates of return → Govt expanded the program
Example 2: 2004 Corruption Study
• Randomized interventions for: (a) increasing community participation in monitoring; (b) increasing the probability of external audits
• 600 villages in East/Central Java, road projects
• → Findings on the importance of announcing audits and increasing the audit sample were incorporated (partially) into PNPM-Rural design
Examples of how “negative” PNPM/KDP M&E findings were used
4 (real) examples:
Example 3: MIS data, some supervision reports, and a 2002 external review of the microcredit component found that it was not working well. → Mgmt decided to redesign the component.
Example 4: Evaluation of the livelihoods pilot (SADI) came back with negative findings → WB recommended stopping the pilot.
Examples of how “negative” PNPM/KDP M&E findings were used (continued)
Example 5: Supervision reports, qualitative case studies & the impact evaluation found that marginalized groups' needs were not being met through the program → Launched complementary programs to address these special needs, plus follow-up studies.
Example 6: The first-round impact evaluation for PNPM-Generasi showed no/negative impact on education compared to control areas → (i) find out what was going on in the field to produce these results; (ii) see what the follow-up survey tells us; & (iii) rethink the education target indicators.
Results Framework Template

Project Development Objective | Project Outcome Indicators | Use of Project Outcome Information
By end of the project, what should have been achieved? | How do you measure achievement of the PDO? | How will you use the info?

Intermediate Outcomes | Intermediate Outcome Indicators | Use of Intermediate Outcome Information
Program Component 1: intermediate outcome or output; Program Component 2; Program Component 3, etc. | How do you measure achievement of intermediate outcomes/outputs under each component? | How will you use the info?