Economic Empowerment of Adolescent Girls & Young Women (EPAG) Project
Monitoring & Evaluation System
AGI Technical Meeting, Monrovia, Liberia
Wednesday, 29 August 2012
Dala T. Korkoyah, Jr., EPAG M&E Director
OVERVIEW
Monitoring & Evaluation efforts have been undermined by two inhibitors:
1. Compliance syndrome – a system built mainly to satisfy donor requirements.
2. Vertical posture – a rigid system that uses a top-down approach, intimidating and policing end-users.
OBJECTIVE
To share lessons learned in the monitoring and evaluation of the EPAG Project.
Main focus:
1. Developing the EPAG M&E System
2. Gaining the buy-in of the service providers
3. Building the monitoring team
4. Conducting random, unannounced classroom monitoring visits
5. Problem solving with service providers
DESIGNING AN M&E SYSTEM
The development of the EPAG M&E System involved the following core activities:
a. Reviewing the EPAG Operational Manual
b. Revising the Results Framework
c. Developing an M&E Plan
KEY LESSONS
1. Operational Manual
The Operational Manual (OM) was somewhat detached from prevailing realities:
a. No capacity building plan for SPs,
b. Inconsistent score weights on indicators (e.g. training venues should not be in a noisy area),
c. No tools for monitoring critical design elements (child care, job/business performance, employment verification, etc.).
The Operational Manual must be harmonized with prevailing operational needs of the project.
KEY LESSONS
2. Results Framework
Different versions of the results framework were discovered:
a. Language inconsistencies (e.g. “%” vs. “share of”),
b. Impractical indicators (e.g. an annual report to Parliament).
A standardized results framework should be used by all stakeholders.
KEY LESSONS
3. Monitoring & Evaluation Plan
The above findings informed the M&E Plan, which:
a. Aligned the project results framework with the Poverty Reduction Strategy (PRS) and Millennium Development Goal 3,
b. Integrated a capacity building strategy for key project stakeholders,
c. Supported the drafting of monitoring tools for important project elements,
d. Promoted a realistic monitoring plan to accommodate volunteer monitors.
An M&E Plan is critical to developing an effective M&E System – keep it simple and participatory.
FRAMEWORK OF EPAG M&E SYSTEM
The framework links three actors around the training services:
1. EPAG M&E – coordination, quality control, capacity building, technical assistance & skills transfer, feedback to SPs.
2. Quality monitors (QM visits) – venue assessment, classroom observation, trainee interviews, verification of trainers.
3. Service providers’ M&E teams (internal monitoring) – quality control, training site visits, implementation timeline, attendance logs, trainees’ performance, reporting.
OTHER CORE M&E ROLES
In addition to M&E of training services, the M&E unit served the IE survey firm and the Ministry of Gender M&E Unit:
1. IE Survey Firm
a. Coordinating activities between the WB, the IE firm, and the service providers,
b. Participating in the revision of survey instruments,
c. Conducting quality control visits during data collection and entry,
d. Ensuring compliance with the recruitment strategy,
e. Reviewing reports, etc.
2. MoGD
a. Technical assistance
b. Coaching
c. Capacity building
SERVICE PROVIDERS’ BUY-IN
Monitoring and evaluation can become a fulfilling relationship, once built on the following principles:
1.Mutual respect and trust – the monitor is not a boss or supervisor, and should serve with integrity.
2.Transparency and equity – all SPs should be appraised on a set of common, agreed standards.
3.Opportunity for capacity building and support – be available to help find solutions, provide technical assistance and offer moral support.
When the service providers realize that you serve in their best interest, they tend to cooperate.
BUILDING MONITORING TEAM
The overall team characteristics have a great influence on the quality of service:
1. Recruit and contract qualified monitors – academic qualifications and previous experience working with a similar target group (monthly stipend paid at a daily rate).
2. Team capacity building – organize training sessions to help the team understand the project (goals, objectives, timeline, etc.).
3. Involve the team in other project activities – engage team members at various levels; meetings, project launch, social events, etc.
BUILDING MONITORING TEAM CONT’D
4. Involve monitors in the definition of indicators and the development of monitoring tools.
5. Set clear boundaries for the roles and responsibilities of the monitors (define the scope of work).
6. Involve service provider staff (M&E officers, supervisors, trainers, etc.) in the training for the monitors.
A strong collaboration between the monitoring team and the service providers ensures:
a. Synergy of efforts by all parties,
b. Service providers are familiar with the tools,
c. Effective internal monitoring.
RANDOM, UNANNOUNCED SPOT CHECKS
The strategy that kept service providers on their toes, sending a clear message for quality improvement:
1. Development of checklists – together with the service providers, a common, agreed set of indicators and score scales was developed.
2. Development of a monitoring schedule – ensuring monitoring schedules fit into monitors’ existing plans.
3. Appointment of a Monitors’ Supervisor – coordinates field work: distributes checklists, maintains working tools and equipment, collects scored checklists, writes the monthly activities report, etc.
RANDOM, UNANNOUNCED SPOT CHECKS CONT’D
4. Weekly classroom monitoring – the monitoring plan allowed for two days of monitoring per week.
5. A typical visit – monitors visit classes in pairs: both observe training delivery for about 15 minutes; one scores the indicators while the other interviews two randomly selected trainees.
6. Data analysis and reporting – both monitors tally scores and submit completed forms to the Supervisor, who forwards the data to the M&E Director.
RANDOM, UNANNOUNCED SPOT CHECKS CONT’D
Data analysis and reporting – completed monitoring forms are reviewed by the M&E Director:
a. Reviewed forms are submitted to data clerks for entry into a computerized database,
b. Data are analyzed and quality performance scores calculated,
c. A consolidated monthly quality monitoring report is written, showing the quality performance of all service providers,
d. The report is shared with service providers for feedback.
Sharing the consolidated quality monitoring report promotes positive competition among the service providers.
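The tally-and-consolidate workflow described above can be sketched in a few lines of code. The checklist fields, the 1–5 score scale, and the provider names below are illustrative assumptions, not the project’s actual instruments or data.

```python
# Illustrative sketch of the monthly quality-score consolidation described
# above. Checklist fields, the 1-5 score scale, and provider names are
# hypothetical; the real EPAG tools and scales may differ.
from collections import defaultdict

def consolidate(visits):
    """Average the checklist scores per service provider across all
    spot-check visits, expressed as a percentage of the maximum score."""
    MAX_SCORE = 5  # assumed top of the score scale
    per_provider = defaultdict(list)
    for visit in visits:
        scores = visit["scores"].values()
        per_provider[visit["provider"]].append(sum(scores) / len(scores))
    return {
        provider: round(100 * sum(avgs) / (len(avgs) * MAX_SCORE), 1)
        for provider, avgs in per_provider.items()
    }

# Example: two visits to provider "SP-A", one visit to "SP-B"
visits = [
    {"provider": "SP-A", "scores": {"venue": 4, "delivery": 5, "attendance": 3}},
    {"provider": "SP-A", "scores": {"venue": 5, "delivery": 4, "attendance": 4}},
    {"provider": "SP-B", "scores": {"venue": 3, "delivery": 3, "attendance": 2}},
]
report = consolidate(visits)  # e.g. {"SP-A": 83.3, "SP-B": 53.3}
```

A percentage scale makes the consolidated report comparable across providers with different numbers of visits, which is what allows the monthly report to rank all service providers side by side.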
PROBLEM SOLVING
Engaging service providers on monitoring findings involved diplomacy – enforcing compliance with standards, yet remaining respectful and supportive.
1. Ensure monitors comply with guidelines
2. Develop follow-up procedures:
a. Regular communication with field teams,
b. Review completed forms immediately.
PROBLEM SOLVING CONT’D
3. Communicate promptly with project managers to highlight problems (e.g. an unsanitary, untidy childcare facility), and require immediate action within a defined timeline.
4. Follow up – phone call, next visit, or report
5. Coordinator engages with Executive Directors or Country Directors… follow up
The performance-based nature of contracts provided the incentive for service providers to submit deliverables on time; payments were based on satisfying agreed requirements.
EPAG MONITORING TOOLS
A list of the monitoring tools used by EPAG:
1. Community Assessment Tool
2. Training Venue Assessment Tool
3. Classroom Observation Checklist
4. Trainee Interview Form
5. Trainees Attendance Tracking Tool
6. EPAG Trainees Directory
7. Job Performance Assessment Tool
8. Business Performance Assessment Tool
9. Job/Business Verification Tool
10. Quality Contests Tool
HIGH-QUALITY TRAINING
Two mechanisms help to keep training quality high:
1. Project quality monitoring,
2. Withholding incentive payments from underperforming service providers.