Use of routinely collected service delivery and M&E indicator data for timely feedback
Denis Nash, PhD, MPH
Associate Professor of Epidemiology
Director, ICAP M&E Unit
Mailman School of Public Health, Columbia University, NYC, [email protected]
Common M&E Challenges in scale-up (1)
• Large number of sites, with relevant information residing with multiple individuals
  – e.g., sites, districts, partner country teams, partner HQ, etc.
• Increasingly complex array of services to report on/evaluate
  – Collection, management, and use of indicator data within country
• Traditionally siloed areas of reporting for program activities that are integrated at the site level
  – e.g., care and treatment, PMTCT, TB/HIV, testing & counseling
• Separate M&E reports for each program area
• Comprehensive program evaluation? Triangulation?
• MOH vs. donor reporting requirements
• Many important aspects of implementation and program quality are not captured in conventional, routinely collected M&E indicators
  – M&E systems generally do not take context into account
Common M&E Challenges in scale-up (2)
• Providing timely data processing and feedback of information to implementation staff for program improvement
  – National level (i.e., technical and management staff, IPs)
  – District level
  – Site level (and below)
• Program improvement ultimately happens, and most often starts, at the site level
• Integrated data management
  – An adequate database to house M&E indicator data is essential
  – Capture/store/process/utilize reported data in a streamlined and efficient way
  – Dynamic and flexible to accommodate changes in indicators (see the sketch after this slide)
• Data quality
  – Missing or incomplete data
  – Incorrect data
• Demand for indicators that reflect quality of care/program
  – M&E indicators do not typically measure quality of care/program
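The flexibility requirement above can be illustrated with a minimal sketch: storing indicator data in a "long" format (one row per site, reporting period, and indicator), so that new or revised indicators become new rows rather than schema changes. This is not ICAP's actual database design; the column names and indicators are assumptions for illustration, using Python/pandas.

```python
import pandas as pd

# A minimal sketch (not ICAP's actual system) of a "long" indicator store:
# one row per site x reporting period x indicator. New indicators are added
# as rows, so the structure does not change when definitions change.
records = [
    # site_id,  period,    indicator,           value  (illustrative values only)
    ("SITE-001", "2010-Q1", "patients_enrolled",  240),
    ("SITE-001", "2010-Q1", "patients_on_art",    180),
    ("SITE-002", "2010-Q1", "patients_enrolled",  310),
    ("SITE-002", "2010-Q1", "patients_on_art",    150),
]
indicators = pd.DataFrame(records, columns=["site_id", "period", "indicator", "value"])

# Pivot to a wide, report-ready table: one row per site/period, one column per indicator
report = indicators.pivot_table(index=["site_id", "period"],
                                columns="indicator",
                                values="value").reset_index()
print(report)
```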
Feeding data back to programs in the form of information
• Scale-up, the sheer number of sites, and geographic spread make regular and timely feedback challenging, especially at the site level
• Need for information at multiple levels
  – For implementation teams at the national and district levels
    • Which sites should scarce mentoring and implementation support resources focus on? (see the prioritization sketch after this list)
    • Are efforts to maximize quality of care having an impact?
  – For site staff
    • How is our site doing? Where can we improve?
    • Are our efforts to improve things working?
• Difficult to do without some form of automation (e.g., reports) and decentralization of information (e.g., web-based)
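As a concrete, purely illustrative example of the prioritization question above, a simple rule over reported indicators could flag sites for focused mentoring visits. The thresholds, indicator names, and values below are assumptions, not ICAP criteria; the sketch uses Python/pandas.

```python
import pandas as pd

# Illustrative only: flag sites whose reported indicators suggest they may need
# focused mentoring/implementation support. Thresholds and columns are assumptions.
sites = pd.DataFrame({
    "site_id":             ["SITE-001", "SITE-002", "SITE-003"],
    "pct_reports_on_time": [0.95, 0.60, 0.85],
    "pct_missing_cd4":     [0.05, 0.30, 0.40],
})

needs_support = sites[(sites["pct_reports_on_time"] < 0.80) |
                      (sites["pct_missing_cd4"] > 0.25)]
print(needs_support["site_id"].tolist())   # sites to prioritize for a support visit
```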
[Chart] Number of sites by country, as of March 31, 2010 (Source: ICAP Site Census, March 2010):
Tanzania 405, Kenya 153, Mozambique 138, Nigeria 137, Cote d'Ivoire 107, South Africa 73, Ethiopia 68, Rwanda 57, Swaziland 47, Lesotho 34
Total number of sites supported by ICAP: 1,219
Priority indicators by site
Examples of feedback tools used by ICAP
• Mainly aimed at providing feedback from ICAP-NY to ICAP country teams on reported data
• But some tools can also be used to feed data back to districts and sites
Examples:
• ICAP URS dashboards and reports
• Maps (static and interactive)
• PFaCTS reports
• Quarterly eUpdate
• Patient-level data reports
Patient-level data
ICAP patient-level data warehouse elements
• Enrollment table: basic demographic information (age, sex), enrollment date, prior ARV use, point of entry, transfer
• Visit table: visit date, WHO stage, height, weight, Hb, ALT, next scheduled visit date
• CD4 table: CD4 test date, CD4 count, CD4 percent
• ART table: ART regimen, regimen start & end dates, reason(s) for switching ART regimen
• Medication table: TB screening date and result, TB medication reason (treatment or prophylaxis) and dates, CTX & fluconazole
• Status table: patient disposition status (dead, transferred, withdrew, LTF, stopped ART, etc.) and status date
• Pregnancy table: visit date, weeks of gestation at visit, due date, actual pregnancy end date
Baseline data: 1 row per patient. Follow-up data: 1 row per measure per patient.
*Measures at key points of interest (e.g., enrollment, ART initiation) are calculated based on visit dates (see the sketch below).
Databases are anonymized using an automated tool. Data use is governed by MOH-approved protocols.
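The note above that measures at key points of interest are calculated from visit dates can be sketched as follows: deriving a "CD4 at ART initiation" value by linking the CD4 table to the ART table and keeping the test closest to the ART start date. The 90-day look-back window, column names, and example values are assumptions for illustration only (Python/pandas).

```python
import pandas as pd

# Illustrative sketch: derive "CD4 at ART initiation" from the CD4 and ART tables
# by taking the CD4 test closest to (and within 90 days before) the ART start date.
cd4 = pd.DataFrame({
    "patient_id":    [1, 1, 2],
    "cd4_test_date": pd.to_datetime(["2009-01-10", "2009-06-01", "2009-03-15"]),
    "cd4_count":     [180, 250, 90],
})
art = pd.DataFrame({
    "patient_id":     [1, 2],
    "art_start_date": pd.to_datetime(["2009-06-15", "2009-04-01"]),
})

merged = cd4.merge(art, on="patient_id")
merged["days_before_art"] = (merged["art_start_date"] - merged["cd4_test_date"]).dt.days
eligible = merged[(merged["days_before_art"] >= 0) & (merged["days_before_art"] <= 90)]

# Keep, for each patient, the CD4 test nearest to ART initiation
cd4_at_art = (eligible.sort_values("days_before_art")
                      .groupby("patient_id", as_index=False)
                      .first()[["patient_id", "cd4_count"]])
print(cd4_at_art)
```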
Patient-level data feedback reports
• Multi-site feedback reports
  – Combine and compare data across multiple sites
  – One for adult patients and one for pediatric patients
• Site-specific feedback reports
  – General feedback report: summary of information on currently enrolled patients
  – Standards of care (SOC) report: quality of care indicators
• Reports are:
  – 100% automated and in PDF format (see the sketch after this list)
  – Generated and shared with sites within two weeks of submission of the database
  – Currently generated in NYC at ICAP HQ
  – Report generation tools can be deployed, owned, and maintained by MOHs where capacity exists or where it can be developed
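For a sense of what "100% automated, PDF format" report generation can look like, here is a minimal sketch that writes a one-page site feedback report from a summary table. It is not ICAP's actual report generation tool; the indicators, file name, and layout are illustrative assumptions (Python with pandas and matplotlib).

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # render without a display, e.g., on a server
import matplotlib.pyplot as plt
from matplotlib.backends.backend_pdf import PdfPages

# Illustrative data for one site; the indicators shown are assumptions, not the
# actual content of ICAP's general feedback or SOC reports.
site_summary = pd.DataFrame({
    "quarter":              ["2009-Q3", "2009-Q4", "2010-Q1"],
    "currently_enrolled":   [410, 455, 502],
})

def make_site_report(site_id: str, data: pd.DataFrame, out_path: str) -> None:
    """Write a simple, fully automated one-page PDF feedback report for one site."""
    with PdfPages(out_path) as pdf:
        fig, ax = plt.subplots()
        ax.plot(data["quarter"], data["currently_enrolled"], marker="o")
        ax.set_title(f"{site_id}: currently enrolled patients by quarter")
        ax.set_ylabel("Patients")
        pdf.savefig(fig)
        plt.close(fig)

make_site_report("SITE-001", site_summary, "SITE-001_feedback_report.pdf")
```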
Multi-site report
Site-specific general feedback report
PDF format, 100% automated
Site-specific SOC report
Dissemination of patient-level data reports
M&E Indicator data
Integrated data at site level
Filterable home page and program area dashboards
Example of care and treatment dashboard table
Filterable home page and program area dashboards
Conclusions
• Timely feedback and dissemination of routinely collected service data and M&E data is an increasing challenge, especially as the number of sites increases (i.e., scale-up)
  – National, district, site, and IP levels
• Database tools, automation, and decentralization of information are critical
  – They improve data quality and the utility of information!
• Capacity building on interpreting and applying disseminated data for program improvement is needed
Thank you!