Mental Health Measure Development Meeting
September 26, 2018
Agenda
• Introduction
• Overview of Human Services Performance Management System
• Overview of Measure Development Process
• Mental Health Critical Issues
• Review of Prior Mental Health Measure Development
• Overview of Balanced Sets of Measures
• Measure Development Exercises
Agenda
Overview of Human Services Performance Management System
History
• 2009 Service Delivery Act: established Steering Committee; recommendations made in 2012 report
• 2013 Legislation: adopted recommendations; provided resources; established Performance Council
• 2014 Planning: built structure; hired staff; provided baseline reports; established thresholds
• 2015 Implementation: children’s services, adult services, income supports
Mission and Values
Mission: to improve outcomes for people through creativity, flexibility, accountability,
collaboration, and performance management
Values: accountability, collaboration, continuous improvement, equity, flexibility,
inclusiveness, reliance on data, sustainability, and transparency
No single entity can achieve client outcomes alone.
We need to work together to improve lives for the people we serve.
Performance Management System Overview
The Human Services Performance Management System creates an opportunity for the Minnesota Department of Human Services, counties, and community partners to work more closely together to improve the lives of people served.
Goals:
• Establish shared outcomes and measures
• Allow counties more flexibility in the “how”
• Emphasize continuous improvement
• Ensure achievement of positive outcomes
• Create accountability and provide transparency
[Diagram: a three-part cycle. Measurement: develop performance framework. Performance: assure performance thresholds are met. Improvement: provide technical assistance and develop Performance Improvement Plans.]
Outcomes and Measures
Adults and children are safe and secure
• Of all children who were victims of a substantiated maltreatment report during a 12-month reporting period, the percent who were not victims of another substantiated maltreatment report within 12 months of their initial report. (A calculation sketch follows this list.)
• Percent of vulnerable adult maltreatment allegations where there is not a repeat of the same type within six months.
Children have stability in their living situation
• Of all children who enter foster care in a 12-month period, the percent who are discharged to permanency within 12 months of entering foster care. (Includes discharges from foster care to reunification with the child’s parents or primary caregivers, living with a relative, guardianship, or adoption.)
Children have the opportunity to develop to their fullest potential
• Of all days that children spent in family foster care settings during a 12-month reporting period, the percentage of days spent with a relative.
People are economically secure
• Percent of MFIP/DWP adults working 30 or more hours per week or off cash assistance three years after baseline (Self-Support Index)
• Percent of expedited SNAP applications where support was issued within one business day of application
• Percent of public assistance applicants who received benefits within mandated timeframes
• Percent of current child support that is paid
• Percent of open child support cases for which paternity is established
• Percent of open child support cases with a child support order established
Vulnerable adults experience quality of life
People have access to health care and receive effective services
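Most of the measures above are rates computed over a 12-month reporting window. As an illustration only, here is a minimal Python sketch of how the first safety measure (non-recurrence of substantiated maltreatment) might be computed; the record layout, field names, and the exact 365-day window are assumptions for this sketch, not the official measure specification.

```python
from collections import defaultdict
from datetime import date, timedelta

def non_recurrence_rate(reports, period_start, period_end):
    """Of all children with a substantiated maltreatment report in the
    reporting period, the percent with no second substantiated report
    within 12 months of the initial report. `reports` is a hypothetical
    list of (child_id, report_date) pairs for substantiated reports."""
    by_child = defaultdict(list)
    for child_id, report_date in reports:
        by_child[child_id].append(report_date)

    victims = 0    # children whose initial report falls in the period
    no_repeat = 0  # of those, children with no repeat within 12 months
    for dates in by_child.values():
        dates.sort()
        initial = next((d for d in dates if period_start <= d <= period_end), None)
        if initial is None:
            continue
        victims += 1
        if not any(initial < d <= initial + timedelta(days=365) for d in dates):
            no_repeat += 1
    return 100.0 * no_repeat / victims if victims else None

# Child A has a repeat report within 12 months; child B does not -> 50.0
reports = [("A", date(2017, 2, 1)), ("A", date(2017, 9, 1)),
           ("B", date(2017, 3, 15))]
print(non_recurrence_rate(reports, date(2017, 1, 1), date(2017, 12, 31)))
```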
Agenda
Overview of Measure Development Process
Measure Development Approach
• Bring together stakeholders from DHS, Counties, and the community to develop shared measures.
• Performance management is more than developing measures. A successful system integrates the use of data into all facets of the organization.
• We meet our partners where they are on their performance management journey to develop a system that meets their organizations’ unique needs.
• We leverage our partners’ expertise throughout the process of developing a performance management system.
Team Roles and Responsibilities
Steering Committee
• Members: representatives from DHS; representatives from Counties; Agency & County Performance
• Responsibilities: provide background information and understanding of DHS Mental Health services; provide feedback on key deliverables; guide measure development work
Data Team
• Members: representatives from Policy, Program, and Data areas
• Responsibilities: conduct research and provide information about Mental Health; collect information needed to review Adult Protection
Measure Development Team
• Members: representatives from DHS; representatives from Counties; provider representatives; partners and advocates
• Responsibilities: provide subject matter expertise, experience, and strategic thinking to develop performance measures for Mental Health
Agency & County Performance Project Team
• Members: Carol Becker, project manager; Olufemi Fajolu, county data; Gary Mortensen, HSPM
• Responsibilities: manage the measurement development, gaps analysis, and data infrastructure phases of the project; facilitate the outcome measures development conversations; collect and manage project information from County and DHS teams; synthesize key findings to develop measurement inventory, data inventory, and program inventory
Project Overview (page 1 of 2)
• Steering committee of DHS and county staff identifies participants and reviews meeting agenda.
• Host a meeting of DHS, county, and community folks to discuss current measures, the Balanced Framework, and potential new measures.
• Outcome: list of potential new measures.
• Give the list of potential new measures to the data team to determine what is feasible.
• Get participants from the meeting back together to discuss which potential new measures:
• are feasible.
• have data quality issues, and what could be done about that.
• could be built into future system development.
• will not be possible anytime soon.
Project Overview (page 2 of 2)
• For measures that are currently possible, move them into the existing measurement processes.
• Develop draft measures for each county. Give them a year to adjust their processes before moving the measure into the PIP process.
• For measures that have data quality issues, carry out discussions among DHS, counties and providers to identify what is needed to improve data quality.
• For measures that could be included in future system development, ensure that these needs are included in planning documents.
• For measures that are not possible, hold further discussions on how to address these needs.
• Repeat process as necessary.
Agenda
Critical Issues for Mental Health
Agenda
Overview of Balanced Sets of Measures
Background
Statutory Requirement - 402A.10 Definitions
Subd. 1a. Balanced set of program measures.
A "balanced set of program measures" is a set of measures that, together, adequately quantify achievement toward a particular program's outcome. As directed by section 402A.16, the Human Services Performance Council must recommend to the commissioner when a particular program has a balanced set of program measures.
Subd. 4d. Performance management system for human services.
A "performance management system for human services" means a process by which performance data for essential human services is collected from counties or service delivery authorities and used to inform a variety of stakeholders and to improve performance over time.
Background
Research
• Partnered with the Management Analysis and Development team at MN Management and Budget to guide the planning and development of Balanced Sets of Measures
• Researched scorecards and performance measures used by other organizations
• Interviewed organizations with strong performance management systems
• Presented to MACSSA Forum for feedback
Balanced Sets of Measures – Guiding Principles
• A list of ten guiding principles was created to ensure measures are developed with people and communities at the core, with stakeholders involved, and using the principles of Results-Based Accountability.
• The measures developed will represent the core components that create a strong program, so that we can identify and promote what is working, identify systemic issues, and work with counties to improve performance.
Balanced Sets of Measures
Measure categories used to assess a balanced set of program measures:
• Client Engagement: How satisfied/respected are the people receiving services?
• Equity: Do diverse groups have different experiences or outcomes? (e.g., racial disparities measures)
• Financial: What are the costs of providing these activities? (e.g., program ROI)
• Operations: How efficiently do we do our work? (e.g., staff training, staffing levels, data collection)
• Program Effectiveness: How well do we do our work? (e.g., application processing times, quality of services, impact on individuals)
Balanced Sets of Measures – Sample Measures
• Client Engagement: Were you treated with dignity and respect? Were staff members courteous and helpful? Did staff have the knowledge to handle my request?
• Equity: County staff reflects population served; rates of out-of-home placement by race; Family Assessment versus Investigation by race
• Financial: Cost effectiveness of services; return on investment; timeliness of payments / submitting requests for reimbursement
• Operations: Percentage of staff trained in a timely manner; staff-to-caseload ratio; data integrity
• Program Effectiveness: Percent of applications processed within one business day; number of hours between initial contact and crisis assessment; percent of people with paying jobs
Balanced Sets of Measures
Measure criteria used to select and create a balanced set of measures:
• Data Power: timely, reliable, currently exists, easily accessible, high validity, low human error
• Communication Power: easily understood, compelling to stakeholders
• Proxy Power: says something of central importance; matches direction of other measures in the mix
Comprehensive Performance Measurement Framework
Questions?
Agenda
Results-Based Accountability
Key RBA concepts
• Two levels of accountability: population and program
• Three questions for program accountability
• How much did we do?
• How well did we do it?
• Is anyone better off?
Results accountability is made up of two parts:
• Population accountability: about the well-being of whole populations (for communities, cities, counties, states, nations)
• Performance accountability: about the well-being of client populations (for programs, agencies, service systems)
DEFINITIONS
• Result or Outcome (population level): a condition of well-being for children, adults, families or communities. Examples: children born healthy, children ready for school, safe communities, clean environment, prosperous economy.
• Indicator or Benchmark (population level): a measure that helps quantify the achievement of a result. Examples: rate of low-birthweight babies, percent ready at K entry, crime rate, air quality index, unemployment rate.
• Performance Measure (performance level): a measure of how well a program, agency or service system is working. Three types: (1) How much did we do? (2) How well did we do it? (3) Is anyone better off? (= customer results)
From Ends to Means
• Population level (ends): Result or Outcome; Indicator or Benchmark
• Performance level: Performance Measure, where customer result = ends and service delivery = means
From Talk to Action
Performance measures sit along two dimensions:
• Quantity vs. Quality: How much did we do? (#) vs. How well did we do it? (%)
• Effort vs. Effect: How hard did we try? vs. Is anyone better off?
Program performance measures
• How much did we do? (quantity of effort, #)
• How well did we do it? (quality of effort, %)
• Is anyone better off? (effect, # and %)
Education
• How much did we do? (quantity of effort): Number of students
• How well did we do it? (quality of effort): Student-teacher ratio
• Is anyone better off? (effect): Number of high school graduates (#); Percent of high school graduates (%)
(The same quadrant layout is restated as a small data structure below.)
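Because the 2×2 quadrant layout does not survive extraction well, here is a compact restatement of the education example as a small Python data structure; the tuple keys are just illustrative labels for the slide’s rows and columns.

```python
# The RBA 2x2 for the education example: rows are effort/effect,
# columns are quantity (#) / quality (%), values are the sample measures.
rba_education = {
    ("effort", "quantity"): "Number of students",               # How much did we do?
    ("effort", "quality"): "Student-teacher ratio",             # How well did we do it?
    ("effect", "quantity"): "Number of high school graduates",  # Is anyone better off? (#)
    ("effect", "quality"): "Percent of high school graduates",  # Is anyone better off? (%)
}
for (row, col), measure in sorted(rba_education.items()):
    print(f"{row}/{col}: {measure}")
```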
RBA Categories Account for All Performance Measures (in the history of the universe)
[Quadrant diagram mapping common measurement terms onto the Quantity/Quality × Effort/Effect grid: efficiency, admin overhead, unit cost; staffing ratios, staff turnover; staff morale, access, waiting time, waiting lists, worker safety; customer satisfaction (quality service delivery & customer benefit); cost/benefit ratio; return on investment; client results or client outcomes; effectiveness; value added; productivity; benefit value; product; output; impact; process; input; cost; TQM]
Not All Performance Measures Are Created Equal
• Most important: Is anyone better off? (effect)
• Also very important: How well did we do it? (quality of effort)
• Least important: How much did we do? (quantity of effort)
The Matter of Control
• Most control: effort measures (How much did we do? How well did we do it?)
• Least control: effect measures (Is anyone better off?); improving them requires PARTNERSHIPS
Types of measures found in each quadrant
• How much did we do? # clients/customers served; # activities (by type of activity)
• How well did we do it? % common measures (client-staff ratio, workload ratio, staff turnover rate, staff morale, % staff fully trained, % clients seen in their own language, worker safety, unit cost); % activity-specific measures (% timely, % clients completing activity, % correct and complete, % meeting standard)
• Is anyone better off? # and % for each of: skills/knowledge (e.g., parenting skills); attitude/opinion (e.g., toward drugs); behavior (e.g., school attendance); circumstance (e.g., working, in stable housing); measured point in time vs. point-to-point improvement
The matter of use
1. The first purpose of performance measurement is to IMPROVE PERFORMANCE.
2. Avoid the “performance measurement equals punishment” trap:
• Create a healthy organizational environment.
• Start small.
• Build bottom-up and top-down simultaneously.
Different Descriptions of Progress
1. Data
a) Population indicators: movement for the better away from the baseline
b) Program performance measures: customer progress and better service (How much did we do? How well did we do it? Is anyone better off?)
2. Accomplishments: positive activities not included above
3. Stories behind the statistics that show how individuals are better off
Agenda
Measure Development Exercises
Ground Rules
• All voices hold equal weight
• Respect others, respect yourself
• This is only one step in the process
• Do not assume we are all on the same page
Table Introductions
• Share your name
• What “hat” you’re wearing today (your role or roles)
• Ice Breaker
Balanced Sets of Measures – Sample Measures
• Client Engagement: Were you treated with dignity and respect? Were staff members courteous and helpful? Did staff have the knowledge to handle my request?
• Program Effectiveness: Percent of applications processed within one business day; number of hours between initial contact and crisis assessment; percent of people with paying jobs
Balanced Sets of Measures – Sample Measures
• Operations: Percentage of staff trained in a timely manner; staff-to-caseload ratio; data integrity
• Finance: Cost effectiveness of services; return on investment; timeliness of payments / submitting requests for reimbursement
Balanced Sets of Measures – Sample Measures
• Equity: County staff reflects population served; rates of out-of-home placement by race; Family Assessment versus Investigation by race
County Mental Health Performance Report Planning
Behavioral Health Division (Previously the Mental Health Division and Alcohol and Drug Abuse Division)
Research, Evaluation, & Technical Support Team, September 2018
Overview of Planning Report (pg. 2)
• Based on stakeholder feedback received at the December 2016 meeting, the RETS Team has provided:
1. description of relevant data (if any) currently being collected,
2. description of how much relevant data is currently being collected, to determine whether it is valid for county- and tribal*-level reporting,
3. reliability estimates of relevant data,
4. validity estimates of relevant data, and
5. what database(s) currently house(s) relevant data.
• Defining “County* Mental Health Performance”: RBA
Results-Based Accountability (RBA; pgs. 2-3)
• working definition of the term “county* mental health performance”:
• “the extent to which county* mental health services are working; i.e., how well funding is being used to treat MN residents equitably via programs, agencies, and/or service systems; this includes 3 types of measures:
1. Quantity of services delivered,
2. Quality of services delivered, and
3. Impact on individuals’ lives.”
* Working definition may be modified, if recommended by stakeholders.
Figure 1. Results-Based Accountability (RBA)
Organization of Report (pg. 3)
• RBA:
1. Quantity indicators (2 proposed indicators)
2. Quality indicators (4 proposed indicators)
3. Impact indicators (7 proposed indicators)
• Lifespan Perspective
1. children and/or
2. adults
Figure 1. Results-Based Accountability (RBA)
Organization of Report (continued; pgs. 3-4)
• Operational Definitions of Performance Targets
• Working definitions
• Up for discussion
• Estimating Reliability and Validity of Existing Data
• Reliability: consistency of data reporting (an illustrative sketch follows this list)
• Validity: reliable reporting + meets operational definition
• Table Format for Visual Comparison of Proposed Indicators for Children versus Adults
• Summary Statements of Table Contents
• Conclusions, Recommendations, and Next Steps
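The report defines reliability as consistency of data reporting but does not give a formula. As one assumed, illustrative operationalization (not the RETS Team’s actual method), consistency could be proxied by the share of expected reporting periods for which each county actually submitted data:

```python
# Illustrative only: a crude consistency-of-reporting proxy.
# `submissions` maps each (hypothetical) county to the number of monthly
# reports received out of `expected_months` expected submissions.
expected_months = 12
submissions = {"County A": 12, "County B": 9, "County C": 4}

for county, received in submissions.items():
    consistency = received / expected_months
    print(f"{county}: {consistency:.0%} of expected periods reported")
```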
2 Potential Quantity Indicators (pg. 5)
Proposed for Children: None
Proposed for Adults:
1. Emergency Services Utilization: number of clients using crisis services.
Data being Collected: yes; How Much Data: …; Reliability Estimate: …; Validity Estimate: …; Database(s): …
2. Percent of Individuals with Voluntary Services: legal status at the start of services.
Data being Collected: yes; How Much Data: …; Reliability Estimate: …; Validity Estimate: …; Database(s): …
Summary of 2 Potential Quantity Indicators (pg. 6)
• Child and Adult Indicators (0): None proposed.
• Child Only Indicators (0): None proposed.
• Adult Only Indicators (2): Data relevant to both of the proposed quantity indicators is estimated to be moderately to highly reliable and valid. Thus, both indicators are considered viable for inclusion in county mental health performance reporting for adults.
• Currently Viable (2)
• Emergency Services Utilization
• Percent of Individuals with Voluntary Services
• Recommendation: quantity indicators are needed for children, ideally overlapping with those for adults.
4 Potential QUALITY Indicators (pgs. 7-9)
Proposed for Children:
1. Wait Time: average number of days between initial contact and services start date (for crisis services, average number of hours between initial contact and crisis assessment). (A calculation sketch follows this slide.)
Data being Collected: some; How Much Data: …; Reliability Estimate: …; Validity Estimate: …; Database(s): …
Proposed for Adults:
1. Wait Time: see Children.
Data being Collected: some; How Much Data: …; Reliability Estimate: …; Validity Estimate: …; Database(s): …
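Because Wait Time is defined as an average over client records, the calculation itself is simple once the two dates are available. A minimal sketch, assuming one initial-contact date and one services-start date per client (the field layout is hypothetical; per the report, the underlying data actually sits across several databases):

```python
from datetime import date

# Hypothetical (initial_contact, services_start) date pairs, one per client.
records = [
    (date(2018, 1, 2), date(2018, 1, 16)),
    (date(2018, 1, 5), date(2018, 1, 12)),
]

# Wait Time: average number of days between initial contact and services
# start date. (For crisis services, the same calculation would use hours
# between initial contact and crisis assessment.)
avg_wait_days = sum((start - contact).days for contact, start in records) / len(records)
print(avg_wait_days)  # 10.5
```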
4 Potential QUALITY Indicators (cont., pg. 8)
Proposed for Children:
2. Contact with Caregivers: number of minutes a service provider has had contact with a person who has primary responsibility for the wellbeing of the child receiving services (e.g., face-to-face, phone, video, or email contact with a parent or other individual legally responsible for the care of the child receiving services).
Data being Collected: none; How Much Data: …; Reliability Estimate: …; Validity Estimate: …; Database(s): …
Proposed for Adults: None
4 Potential QUALITY Indicators (cont., still pg. 8)
Proposed for Children:
3. Face-to-Face Contact with Caregivers: number of times a service provider meets with a person who has primary responsibility for the wellbeing of the child receiving services (excludes video contact).
Data being Collected: none; How Much Data: …; Reliability Estimate: …; Validity Estimate: …; Database(s): …
Proposed for Adults: None
4 Potential QUALITY Indicators (cont., pg. 9)
Proposed for Children: None
Proposed for Adults:
4. Client Perception of Care: report, by person receiving services, of how well services (a) align with the person’s service preferences (Service Choice), and/or (b) led to achieving identified goals (Service-Driven Outcomes).
Data being Collected: some; How Much Data: …; Reliability Estimate: …; Validity Estimate: …; Database(s): …
Summary of 4 Potential QUALITY Indicators (pg. 10)
• Child and Adult Indicators (1): The indicator proposed for both children and adults is insufficient in both reliability and validity, though it may be a feasible indicator in the future. Technical support is being provided to improve the reliability and validity of CMH grants data, and CCBHC data is being examined to determine its reliability and validity. A state-wide expansion of the CCBHC approach to wait time is also being considered.
• Currently Viable (0)
• None
• Not Viable (0)
• None
Summary of 4 Potential QUALITY Indicators (cont., still pg. 10)
• Child Only Indicators (2): Of the two quality indicators proposed for children only, neither is feasible, as relevant data is not being collected. There are no feasible plans to do so in the foreseeable future.
• Currently Viable (0)
• None
• May be Viable in the Future (0)
• None
• Not Viable without Significant Data Management Overhaul (2)
• Contact with Caregivers
• Face-to-Face Contact
* Of 4 Potential QUALITY measures, 2 are traditionally relevant only to children.
* CAUTION WARRANTED: Data development plans aspire to be person- AND family-focused.
Summary of 4 Potential QUALITY Indicators (cont., still pg. 10)
• Adult Only Indicators (1): The indicator proposed for adults is currently not viable due to (1) limited data collection and (2) data not meeting the operational definition. Efforts are underway to expand data collection, and stakeholders are invited to reconsider the definition, which could make this a feasible indicator in the future.
• Currently Viable (0)
• None
• Not Viable (0)
• None
One more lap!
Quantity and QUALITY down!!
Just IMPACT to go!!!
Here’s the real meat of our performance indicators
7 Potential IMPACT Indicators (pgs. 11-14)
Proposed for Children:
1. Youth Symptoms: caregiver report of youth behaviors that suggest possible mental illness.
Data being Collected: yes; How Much Data: …; Reliability Estimate: …; Validity Estimate: …; Database(s): …
Proposed for Adults: None
7 Potential IMPACT Indicators (cont., still pg. 11)
Proposed for Children:
2. Caregiver Ability to Manage: caregiver report of how capable they feel in managing (a) life and (b) their child.
Data being Collected: none; How Much Data: …; Reliability Estimate: …; Validity Estimate: …; Database(s): …
Proposed for Adults: None
7 Potential IMPACT Indicators (cont., pg. 12)
Proposed for Children:
3. Juvenile Justice Involvement: percent of Children’s Mental Health Targeted Case Management (CMH-TCM) youth who are involved in the juvenile justice system.
Data being Collected: some; How Much Data: …; Reliability Estimate: …; Validity Estimate: …; Database(s): …
Proposed for Adults: None
7 Potential IMPACT Indicators (cont., still pg. 12)
Proposed for Children:
4. Mobile Crisis Ending in Retention of Youth in Community Setting: percent of mobile crisis services ending in retention of youth in community setting (i.e., if assessed away from residence, youth was able to remain in a community setting upon completion of assessment).
Data being Collected: yes; How Much Data: …; Reliability Estimate: …; Validity Estimate: …; Database(s): …
Proposed for Adults: None
7 Potential IMPACT Indicators (cont., pg. 13)
Proposed for Children: None
Proposed for Adults:
5. Number of People Engaged in Community Life: the extent to which an individual feels socially connected, is engaged in desired education and/or employment, and is living in the most integrated setting possible, given the individual’s needs and proximity to family, friends, faith communities, etc.
Data being Collected: some; How Much Data: …; Reliability Estimate: …; Validity Estimate: …; Database(s): …
7 Potential IMPACT Indicators (cont., pg. 14)
Proposed for Children: None
Proposed for Adults:
6. Employment: percent of people with paid jobs; people without paid jobs who are looking for paid jobs and are available for work are counted as "unemployed"; people who are not employed, are not looking for a paid job, and/or are not available for work are counted as "not in the labor force" (based on U.S. Department of Labor definitions). (A classification sketch follows this slide.)
Data being Collected: yes; How Much Data: …; Reliability Estimate: …; Validity Estimate: …; Database(s): …
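The Employment indicator embeds the U.S. Department of Labor classification quoted above. Here is a minimal sketch of that classification; the boolean fields and the choice of denominator are assumptions for illustration (the slide does not say whether the percent is taken over all clients or only those in the labor force):

```python
def labor_force_status(has_paid_job, looking_for_work, available_for_work):
    """Classify one person using the DOL definitions quoted in the indicator."""
    if has_paid_job:
        return "employed"
    if looking_for_work and available_for_work:
        return "unemployed"
    return "not in the labor force"

# Hypothetical clients: (has_paid_job, looking_for_work, available_for_work)
clients = [(True, False, False), (False, True, True), (False, False, False)]
statuses = [labor_force_status(*c) for c in clients]

# Percent of people with paid jobs, here taken over all clients (assumption).
percent_employed = 100.0 * statuses.count("employed") / len(statuses)
print(round(percent_employed, 1))  # 33.3
```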
7 Potential IMPACT Indicators (cont., still pg. 14)
Proposed for Children: None
Proposed for Adults:
7. Stable Housing: percent of clients who consistently live in a residential setting that is not a homeless shelter.
Data being Collected: starting; How Much Data: …; Reliability Estimate: …; Validity Estimate: …; Database(s): …
Summary of 7 Potential IMPACT Indicators (pgs. 15-16)
• Child and Adult Indicators (0): None proposed.
• Child Only Indicators (4):
• Currently Viable (2)
• Youth Symptoms: currently assessed with two standardized measures…discussions underway related to lifespan implications…
• Mobile Crisis Ending in Retention of Youth in Community Setting: crisis data is being collected with at least moderate reliability and high validity
• Not Viable (1)
• Caregiver Ability to Manage: no data and no plans …see earlier comments about data development aspirations…
Summary of 7 Potential IMPACT Indicators (cont., pg. 16)
• Adult Only Indicators (3):
• Currently Viable (1)
• Employment
• Not Viable (0)
• None
• These 3 potential IMPACT indicators are closely connected…recommend considering this in ongoing discussions with stakeholders.
Conclusions & Recommendations (pgs. 17-18)
• 2 Potential Quantity Indicators proposed:
• Emergency Services Utilization
• Percent of Individuals with Voluntary Services
• Both appear viable
• Both were proposed for adults only
• This leaves a gap in County Mental Health Performance monitoring for children.
Conclusions & Recommendations (cont., still pg. 17)
• 4 Potential QUALITY Indicators:
• 0 are currently viable; Wait Time and Client Perception of Care may become viable in the future
• The remaining 2 indicators, proposed for children only, are not viable in the foreseeable future:
• Contact with Caregivers
• Face-to-Face Contact
• We encourage further consideration of indicators relevant across the lifespan.
Conclusions & Recommendations (cont., pgs. 17-18)
• 7 Potential IMPACT Indicators:
• 3 seem viable now:
• 2 proposed for children only: Youth Symptoms; Mobile Crisis: Youth Retained in Community
• 1 proposed for adults only: Employment
• The remaining indicators (Juvenile Justice Involvement, Engagement in Community Life, Stable Housing) are not currently viable.
• We again encourage further consideration of indicators relevant across the lifespan.
• We also recommend explicitly conceptualizing the three proposed impact indicators for adults as closely interconnected: Employment, Community Life, and Stable Housing.
Conclusions & Recommendations (cont., pg. 18)
• Summary of Recommendations:
1. consider identifying indicators that might quantify county mental health performance for children,
2. pursue the possibility of using wait time (proposed quality indicator) as an indicator of quality that is relevant across the lifespan, and
3. consider the possibility that some impact indicators might also be relevant across the lifespan.
• Based on working definition of “County* Mental Health Performance”, we also recommend considering how performance monitoring might:
4. be tied to financial expenditures (e.g., calculate county mental health performance, at the client level, according to county of financial responsibility); and
5. be guided by an equity lens (e.g., consider the role of tribes, and identify/track disparities in quantity, quality, and/or impact indicators of mental health performance).
Next Steps
Time for stakeholders
to help guide our next steps!
Mental Health Measures Development Meeting Outcomes: Measures and Reports
2016 Stakeholder Meeting Results
In December 2016, the Human Services Performance Measurement Team and counties met to identify potential performance measures for county mental
health activities. A list of potential measures was generated. This list was given to the DHS Mental Health Data Team, who analyzed the measures for their
reliability and their validity as a performance measure. These suggested measures and the data team’s conclusions are listed in the table below.
Proposed Measures, with Reliability, Validity and Data Availability
• Adult Emergency Services Utilization: number of clients using crisis services.
Reliability Estimate: High; Validity Estimate: High; Database(s): MMIS
• Adult Percent of Individuals with Voluntary Services: legal status at the start of services.
Reliability Estimate: Moderate to High; Validity Estimate: Moderate; Database(s): MHIS
• Adult and Children Wait Time: average number of days between initial contact and services start date (for crisis services, average number of hours between initial contact and crisis assessment).
Reliability: too early for CCBHCs; low for grants. Validity: Low; Database(s): CCBHC Secured Data Portal; Excel spreadsheets (ECMH, SLMH, & CEMIG); MHIS
• Children: Contact with Caregivers: number of minutes a service provider has had contact with a person who has primary responsibility for the wellbeing of the child receiving services.
Data being Collected: none
• Children: Face-to-Face Contact with Caregivers: number of times a service provider meets with a person who has primary responsibility for the wellbeing of the child receiving services.
Data being Collected: none
• Adult Client Perception of Care: report, by person receiving services, of how well services align with the person’s service preferences (Service Choice), and/or led to achieving identified goals (Service-Driven Outcomes).
Reliability Estimate: Low; Validity Estimate: Low; Database(s): Snap Survey
• Youth Symptoms: caregiver report of youth behaviors that suggest possible mental illness.
Reliability Estimate: Moderate; Validity Estimate: High; Database(s): Outcomes Measures Database
• Child Caregiver Ability to Manage: caregiver report of how capable they feel in managing life and their child.
Data being Collected: none
• Juvenile Justice Involvement: percent of Children’s Mental Health Targeted Case Management (CMH-TCM) youth who are involved in the juvenile justice system.
Reliability Estimate: Low; Validity Estimate: Low; Database(s): Snap Survey
• Mobile Crisis Ending in Retention of Youth in Community Setting: percent of mobile crisis services ending in retention of youth in community setting (i.e., if assessed away from residence, youth was able to remain in a community setting upon completion of assessment).
Reliability Estimate: Moderate; Validity Estimate: High; Database(s): MHIS
• Employment: percent of people with paid jobs; people without paid jobs who are looking for paid jobs and are available for work are counted as "unemployed"; people who are not employed, are not looking for a paid job, and/or are not available for work are counted as "not in the labor force" (based on U.S. Department of Labor definitions).
Reliability Estimate: Moderate; Validity Estimate: High; Database(s): MHIS
• Stable Housing: percent of clients who consistently live in a residential setting that is not a homeless shelter.
Reliability Estimate: Low (Housing Status) to Moderate (Residential Status); Validity Estimate: Low (Housing Status) to Moderate (Residential Status); Database(s): MHIS
• The extent to which an individual feels socially connected.
Reliability Estimate: Low; Validity Estimate: Low; Database(s): Snap Survey
• The extent to which an individual is engaged in desired education.
Data being Collected: none
• The extent to which an individual is engaged in desired employment.
Reliability Estimate: High; Validity Estimate: High; Database(s): MHIS
• Individual is living in the most integrated setting possible, given the individual’s needs.
Reliability Estimate: High; Validity Estimate: High; Database(s): MHIS
• Number of people living in proximity to family, friends, faith communities, etc.
Data being Collected: none
2018 stakeholder meeting: Measures generated
In September 2018, the results of this earlier work were presented to a group from counties and DHS. Additional
measure ideas were generated at this meeting. These are listed below. Note that some of these desired measures have
already been explored and are listed above.
Potential measures: Intake
Five-day timeline met for client engagement by case manager
Time between referral and first contact
Number of individuals who rejected case management
Potential measures: Services
Number of services utilized before case management versus after (Did the number of services needed decline
after case management?)
Number of hospitalizations prior to case management initiation versus after
Number of unsuccessful attempts to access crisis, hospital, and other services
Percent of client-identified goals in plans achieved
Percent of adults with improved DLA 20 scores
Percent of adults with reduced symptoms after treatment
Case management client contact per month decrease over time
Number of clients leaving case management
Number of return cases within six months of close
Number of missed visits by client/client no shows
Number of client contacts per month per worker
Number of times the client’s caseworker has changed
Cost of hospitalization and cost savings from hospital diversion by county
Transactional cost to client to engage in services (phone minutes, transport time, in-sessions, complete forms)
Potential measures: Impact of services
Number of crisis calls before and after services
Number of times law enforcement involved before and after service
Number of out of home placements due to treatment needs (child)
Potential measures: Quality of life
Number of visits/contacts with family
Number of visits/contacts with parents (for children)
Number of visits/contacts with social worker
Duration of contacts
Number of clients achieving employment
Number of children attending school
Number of clients graduating (children)
Number of clients in stable housing
Number of clients in stable housing due to flex funds
Number of clients moving from more restrictive to less restrictive environments
Number of clients moving from less restrictive to more restrictive environments
Number of clients with self-harm ideation
Number of clients with one close relationship in their life
Number of clients with healthcare
Number of clients able to finish applications
Percent of clients reporting improvements in functioning
2018 stakeholder meeting: Requested data
During the meeting, participants shared a number of data points they felt could help in their improvement efforts. These
items may not directly relate to reported measures, but reporting on or making this data available may be useful to
counties.
Data requested: Intake
Number of intakes by county
Number of functional assessments completed (behavioral tool scores – CASII SDQ Support, etc.)
Number of total reports that are screened in
Number of crisis plans created
Number of case plans created
Number of client-identified goals created in each plan
Number of clients requesting services/self-referrals
Number of involuntary clients
Number of involuntary clients that become voluntary clients
Data requested: Client services
Number of referrals made (collateral contact/network/wraparound)
Number of services each client receives in a 9- or 12-month period
Number of clients with transportation available for each goal
Number of parents charged parental fees
Number of parents charged for outplacement costs via child support
Percent of clients treated in their own language
Cost/pool ratio
Data requested: Client characteristics
Number of clients by race
Number of clients by ethnicity
Number of clients hospitalized
Data requested: Population-level statistics
Percent of county population receiving mental health services
Number of mental health services per capita
Cost per capita for mental health costs
Per capita expenditures per client
Number of hospital days per capita
Percent of students completing “therapy evaluation questionnaire” (need to increase the number of surveys completed to beyond 10%)
Data requested: Staffing
Client/staff ratio or caseload size
Turnover rates
Supervisor to staff ratio
Percent of people of color providing services (workforce)
Number of bilingual staff
Percent of workforce that have been trained in everything they should have been trained in (requires developing a standardized curriculum first)
Percent of employees with diversity training
Complaint calls/ appeals/ombudsman
SAMHSA – TA staff satisfaction survey
Data requested: County community education activities
Number of community outreach/educational events held