Use of routinely collected service delivery and M&E indicator data for timely feedback


Page 1: Use of routinely collected service delivery and M&E indicator data for timely feedback

Use of routinely collected service delivery and M&E indicator data for timely feedback

Denis Nash, PhD, MPH
Associate Professor of Epidemiology
Director, ICAP M&E Unit
Mailman School of Public Health, Columbia University, NYC, USA
[email protected]

Page 2: Use of routinely collected service delivery and M&E indicator data for timely feedback

Common M&E Challenges in scale-up (1)

• Large number of sites, with relevant info residing with multiple individuals
  – e.g., sites, districts, partner country teams, partner HQ, etc.
• Increasingly complex array of services to report on/evaluate
  – Collection, management, and use of indicator data within country
• Traditionally siloed areas of reporting for program activities that are integrated at the site level (joining these streams is sketched below)
  – e.g., care and treatment, PMTCT, TB/HIV, testing & counseling
• Separate M&E reports for each program area
• Comprehensive program evaluation? Triangulation?
• MOH vs. donor reporting requirements
• Many important aspects of implementation and program quality are not captured in conventional, routinely collected M&E indicators
  – Generally, M&E systems do not take context into account
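The siloed-streams point above invites a concrete illustration: joining separate program-area extracts on a common site identifier is the first step toward the triangulation the slide asks about. A minimal sketch in Python/pandas; the file and column names are invented for the example, not ICAP's actual extracts:

```python
import pandas as pd

# Hypothetical quarterly extracts, one per program area, each keyed by site_id.
care  = pd.read_csv("care_treatment_q1.csv")  # site_id, enrolled, on_art
pmtct = pd.read_csv("pmtct_q1.csv")           # site_id, anc_tested, arv_prophylaxis
tbhiv = pd.read_csv("tb_hiv_q1.csv")          # site_id, tb_screened, ipt_started

# Outer-join so a site missing from any one reporting stream still appears;
# the gaps themselves are useful signal for follow-up.
integrated = (
    care.merge(pmtct, on="site_id", how="outer")
        .merge(tbhiv, on="site_id", how="outer")
)

# Sites absent from at least one program-area report.
gaps = integrated[integrated.isna().any(axis=1)]
print(f"{len(gaps)} of {len(integrated)} sites missing from at least one report")
```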

Page 3: Use of routinely collected service delivery and M&E indicator data for timely feedback

Common M&E Challenges in scale-up (2)

• Providing timely data processing and feedback of information to implementation staff for program improvement
  – National-level (i.e., technical and management staff, IPs)
  – District-level
  – Site-level (and below)
• Program improvement ultimately happens, and most often starts, at the site level
• Integrated data management
  – Adequate database to house M&E indicator data is essential
  – Capture/store/process/utilize reported data in a streamlined and efficient way
  – Dynamic and flexible to accommodate changes in indicators
• Data quality (automated checks of this kind are sketched below)
  – Missing or incomplete data
  – Incorrect data
• Demand for indicators that reflect quality of care/program
  – M&E indicators do not typically measure quality of care/program
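The deck does not say how these data quality problems are detected, but the two categories above (missing/incomplete and incorrect) map directly onto automated validation rules. A minimal sketch, assuming a hypothetical patient-level visit extract with invented column names:

```python
import pandas as pd

visits = pd.read_csv("visits.csv", parse_dates=["visit_date", "next_visit_date"])

issues = []

# Missing or incomplete data: required fields left blank.
for col in ["visit_date", "weight", "who_stage"]:
    n = visits[col].isna().sum()
    if n:
        issues.append(f"{col}: {n} missing values")

# Incorrect data: values outside plausible ranges or in an impossible order.
bad_weight = visits.query("weight <= 0 or weight > 250")
if len(bad_weight):
    issues.append(f"weight: {len(bad_weight)} implausible values")

future = visits[visits["visit_date"] > pd.Timestamp.today()]
if len(future):
    issues.append(f"visit_date: {len(future)} dates in the future")

backwards = visits[visits["next_visit_date"] < visits["visit_date"]]
if len(backwards):
    issues.append(f"next_visit_date: {len(backwards)} scheduled before the visit itself")

print("\n".join(issues) or "No issues found")
```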

Page 4: Use of routinely collected service delivery and M&E indicator data for timely feedback

Feeding data back to programs in the form of information

• Scale-up, the sheer number of sites, and their geographic spread make regular and timely feedback challenging, especially at the site level
• Need for information at multiple levels
  – For implementation teams at national and district levels
    • Which sites to focus scarce mentoring and implementation support resources on? (a triage sketch follows below)
    • Are efforts to maximize quality of care having an impact?
  – For site staff
    • How is our site doing? Where can we improve?
    • Are our efforts to improve things working?
• Difficult to do without some form of automation (e.g., reports) and decentralization of information (e.g., web-based)
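As an illustration of the triage question flagged above, one could rank sites on a single routinely reported indicator and flag the lowest decile for support visits. A sketch, assuming a hypothetical site-level extract; the indicator (12-month ART retention) and the 10% cutoff are illustrative choices, not ICAP's method:

```python
import pandas as pd

# Hypothetical site-level indicator extract: one row per site per quarter.
df = pd.read_csv("site_indicators.csv")  # site_id, district, on_art, retained_12m

# Proportion retained on ART at 12 months -- a routinely reported indicator
# that can stand in for program quality when triaging support visits.
df["retention"] = df["retained_12m"] / df["on_art"]

# Flag the lowest-performing decile for targeted mentoring visits.
cutoff = df["retention"].quantile(0.10)
priority = df[df["retention"] <= cutoff].sort_values("retention")

print(priority[["site_id", "district", "retention"]].to_string(index=False))
```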

Page 5: Use of routinely collected service delivery and M&E indicator data for timely feedback

Number of sites by country, as of March 31, 2010

Tanzania: 405
Kenya: 153
Mozambique: 138
Nigeria: 137
Cote d'Ivoire: 107
South Africa: 73
Ethiopia: 68
Rwanda: 57
Swaziland: 47
Lesotho: 34

Total number of sites supported by ICAP: 1,219

Source: ICAP Site Census, March 2010

Page 10: Use of routinely collected service delivery and M&E indicator data for timely feedback

Priority indicators by site

Page 11: Use of routinely collected service delivery and M&E indicator data for timely feedback

Examples of feedback tools used by ICAP

• Mainly aimed at providing feedback from ICAP-NY to ICAP country teams on reported data
• But some tools can also be used to feed back data to districts and sites

Examples:
• ICAP URS dashboards and reports
• Maps (static and interactive)
• PFaCTS reports
• Quarterly eUpdate
• Patient-level data reports

Page 12: Use of routinely collected service delivery and M&E indicator data for timely feedback

Patient-level data

Page 13: Use of routinely collected service delivery and M&E indicator data for timely feedback

ICAP patient-level data warehouse elements

Enrollment Table: basic demographic information (age, sex, enrollment date), prior ARV use, point of entry, transfer
Visit Table: visit date, WHO stage, height, weight, Hb, ALT, next scheduled visit date
CD4 Table: CD4 test date, CD4 count, CD4 percent
ART Table: ART regimen, regimen start & end date, reason(s) for switching ART regimen
Medication Table: TB screening date and result, TB medication reason (treatment or prophylaxis) and dates, CTX & fluconazole
Status Table: patient disposition status (dead, transferred, withdrew, LTF, stopped ART, etc.) and status date
Pregnancy Table: visit date, weeks gestation at visit, due date, actual pregnancy end date

Baseline data: 1 row per patient. Follow-up data: 1 row per measure per patient.*

*Measures at key points of interest (e.g., enrollment, ART initiation) are calculated based on visit dates.

Databases are anonymized using an automated tool. Data use is governed by MOH-approved protocols. (A schema sketch follows below.)
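The slide lists the warehouse tables but not their physical form. For concreteness, here is a minimal sketch of two of them as SQLite DDL issued from Python; the table and column names paraphrase the slide and are not the actual ICAP schema:

```python
import sqlite3

conn = sqlite3.connect("icap_warehouse.db")
conn.executescript("""
-- Enrollment: one baseline row per patient.
CREATE TABLE IF NOT EXISTS enrollment (
    patient_id      TEXT PRIMARY KEY,   -- anonymized identifier
    age             INTEGER,
    sex             TEXT,
    enrollment_date TEXT,
    prior_arv_use   TEXT,
    point_of_entry  TEXT,
    transfer_in     INTEGER
);

-- Visit: one row per visit per patient (follow-up data).
CREATE TABLE IF NOT EXISTS visit (
    patient_id      TEXT REFERENCES enrollment(patient_id),
    visit_date      TEXT,
    who_stage       INTEGER,
    height_cm       REAL,
    weight_kg       REAL,
    hb              REAL,
    alt             REAL,
    next_visit_date TEXT
);
""")
conn.commit()
```

The CD4, ART, medication, status, and pregnancy tables would follow the same pattern, each keyed on the anonymized patient_id.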

Page 14: Use of routinely collected service delivery and M&E indicator data for timely feedback

Patient-level data feedback reports

• Multi-site feedback reports
  – Combines and compares data across multiple sites
  – One for adult patients and one for pediatric patients
• Site-specific feedback reports
  – General feedback report
    • Summary of information on currently enrolled patients
  – Standards of care (SOC) report
    • Quality of care indicators
• Reports are:
  – 100% automated and in PDF format (a generation sketch follows below)
  – Generated and shared with sites within two weeks of submission of the database
  – Currently generated in NYC at ICAP HQ
  – Report generation tools can be deployed, owned, and maintained by MOHs where capacity exists or where it can be developed
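ICAP's report generator itself is not shown; as a rough sketch of the "100% automated, PDF format" idea, the following renders a one-page site feedback chart to PDF with matplotlib. The input file, indicator, and 85% target line are invented for the example:

```python
import pandas as pd
import matplotlib.pyplot as plt
from matplotlib.backends.backend_pdf import PdfPages

# Hypothetical per-site indicator history: one row per quarter.
df = pd.read_csv("site_042_indicators.csv", parse_dates=["quarter_end"])

with PdfPages("site_042_feedback.pdf") as pdf:
    fig, ax = plt.subplots(figsize=(8.5, 11))
    ax.plot(df["quarter_end"], df["retention_12m"], marker="o",
            label="12-month retention")
    ax.axhline(0.85, linestyle="--", label="target (85%, invented)")
    ax.set_title("Site 042: standard-of-care indicators")
    ax.set_ylabel("Proportion retained")
    ax.legend()
    pdf.savefig(fig)   # one page per figure; more pages = more savefig calls
    plt.close(fig)
```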

Page 15: Use of routinely collected service delivery and M&E indicator data for timely feedback

Multi-site report

Page 18: Use of routinely collected service delivery and M&E indicator data for timely feedback

Site-specific general feedback report

Page 19: Use of routinely collected service delivery and M&E indicator data for timely feedback

PDF format, 100% automated

Page 27: Use of routinely collected service delivery and M&E indicator data for timely feedback

Site-specific SOC report

Page 30: Use of routinely collected service delivery and M&E indicator data for timely feedback

Dissemination of patient-level data reports

Page 31: Use of routinely collected service delivery and M&E indicator data for timely feedback

M&E Indicator data

Page 32: Use of routinely collected service delivery and M&E indicator data for timely feedback

Integrated data at site level

Page 34: Use of routinely collected service delivery and M&E indicator data for timely feedback

Filterable home page and program area dashboards

Page 35: Use of routinely collected service delivery and M&E indicator data for timely feedback

Example of care and treatment dashboard table

Page 41: Use of routinely collected service delivery and M&E indicator data for timely feedback

Conclusions

• Timely feedback and dissemination of routinely collected service data and M&E data is an increasing challenge, especially as the number of sites increases (i.e., scale-up)
  – National, district, site, IPs
• Database tools, automation, and decentralization of information are critical
  – Improves data quality and utility of information!
• Capacity building on interpreting and applying disseminated data to program improvement is needed

Page 42: Use of routinely collected service delivery and M&E indicator data for timely feedback

Thank you!