
1

Using Data for Program Improvement

New Hampshire

Presenter: Carolyn H. Stiles

Part C Coordinator/Program Coordinator

Family Centered Early Supports and Services Program

Bureau of Developmental Services/NH DHHS

2

About NH….

• Granite State

• Lead Agency: Department of Health and Human Services/Bureau of Developmental Services

• 10 regions/18 programs

• Name: Family Centered Early Supports and Services (ESS)

• Increase in the number of children served, from 2.7% in FFY2004 to 3.16% in FFY2008

• Minimal resources

3

Let’s talk about the data….

• NH conducts monitoring visits with data collection every year, every program

• Each program conducts a self-assessment using the same checklist as the monitoring team

• The monitoring team is composed of the BDS ESS representative, the BDS liaison for the region, a regional representative, and the ESS program director

4

Let’s talk about the data…., continued

During the monitoring visit, data from the program self-assessment are confirmed (or not!)

Onsite technical assistance is provided if needed

Root cause of noncompliance used to determine possible solution

5

Let’s talk about the data…., continued

• Verbal notification of the need for further action is given at the completion of the monitoring visit

• Written confirmation of the program’s standing is typically provided within one week

6

How is the data used?

• Improve performance and quality

• Determinations (local and statewide)

• Public reporting

• Federal reporting: APR

• Ensure that Part C of the IDEA is being implemented

7

Most Significant NH Improvement Resulting From Monitoring Data in the Past 3 Years

Creation of an Early Intervention Specialist Certification Process

8

So, what does this have to do with data?

• Program monitoring data showed that ESS programs were struggling to meet timelines

• “Balancing Act”

• Program directors began to use personnel who were unqualified (by NH standards) to complete evaluations

• Program directors complained that they did not have enough staff to complete requirements on time.

9

Background, continued:

• Waiver requests began to arrive to allow unqualified personnel to conduct evaluations

• Part C staff said “no” to waivers of state requirements

• Agencies complained

• Lead Agency director said “solve the problem or grant the waivers”

10

Early Intervention Specialist Certification Process (Pilot 2008, Implemented 2009)

Why was this significant?

• Indicator timeline data improved

• Number of evaluators and qualified personnel increased

• Created a career ladder for providers in the Family Centered Early Supports and Services Program (ESS, Part C)

11

Other ways that the data is used for improvement…

• Some, but not all, program directors were struggling to integrate changes (child outcome, notification, and timeline requirements)

Peer Mentoring

12

Other ways that the data is used for improvement…

• Many program reviews demonstrated the same noncompliance

(functional vision and hearing not described on IFSP)

Statewide Training

(followed by monitoring and technical assistance)

13

Other ways that the data is used for improvement…

• “Soft data” regarding a change in the definition of timely services, gathered during conversations, indicated that local providers were very confused (followed by collection of hard data to quantify the problem)

On-site technical assistance with ESS program staff

14

Other ways that the data is used for improvement…

• Increase in the number of requests for mediation

Statewide annual training in a format that could be used by directors with local program staff

15

In general:

Monitoring → Data Analysis (including root cause analysis) → Develop Improvement Strategy → Implement Improvement Strategy → Monitoring for Improvement