University of Kansas School of Social Welfare

Twente Hall 1545 Lilac Lane

Lawrence, Kansas 66044-3184 785-864-2269

Task Order #18

Web-Based Client Status Reports (CSRs) FY 2009/2010

Stephen A. Kapp, Ph.D., Principal Investigator

Karen Flint Stipp, MSW, LSCSW

Graduate Research Assistant

June 30, 2010

This report has been supported through a contract with the Kansas Department of Social and Rehabilitation Services and prepared under grant No. 0704-HCP-0605-030.

Executive Summary

Purpose of the Study

A University of Kansas social worker developed and introduced web-based Results-Oriented Management (ROM) software for Kansas children’s community-based services (CBS), providing online access to disaggregated Client Status Report (CSR) data updated every 10 days. The software was developed and implemented collaboratively with Kansas CBS directors, and in cooperation with the state mental health authority. Prior study of Kansas CBS directors’ use of Children’s Client Status Reports suggested that use would increase if local data were more accessible (Kapp & Stipp, 2006/2007, 2007/2008). A preliminary study indicated that CBS directors anticipated that the bottom-up approach to information technology would support knowledge transfer in Kansas CBS (Kapp & Stipp, 2008/2009). This study followed, to determine whether the new technology supported CSR utilization by Kansas CBS directors.

Implementation

A literature review suggested that formal ties and familiarity between practitioners and researchers improve technology-supported information transfer between groups (Isett & Phillips, 2009). Collaboration between people with clinical expertise and people with technical expertise is thought to be useful for supporting technology’s role in local knowledge acquisition (Cnaan & Parsloe, 1989). There was some anticipation that the technology’s bottom-up, collaborative development would support data utilization. Seven of the state’s directors of community-based mental health programs for children were members of the outcomes subcommittee that served as the ROM field test audience. This study followed up on the field test group’s early feedback about CSR utilization via results-oriented management. Analyzed data from the first round of interviews informed development of a PowerPoint presentation designed to report results and elicit member feedback for verifying the authenticity of the results. Participants also volunteered updates about their data access since the initial round of interviews. Member checking and follow-up interviews with the seven outcomes subcommittee members were recorded, transcribed, and analyzed using qualitative data analysis methods. The transcript of the member checking/follow-up interview was added to the study data for further analysis of unique and common themes, codes, and patterns. Researchers triangulated director reports of time spent in the software against the website’s log of actual time spent logged in, and found director reports consistent with log-ins. Researchers used word processing software to support data management.

Study Challenges

The personal relationship between CBS directors and the website developer created some reluctance to report limits to the development of CSR utilization. Directors’ recall was supported by triangulating director reports with website log-in data.

Findings

Technology is well suited to information management, but in mental health as elsewhere, a glass wall separates data from data users. Kansas children’s community mental health directors had limited access to their local data, collected by the state’s automated information management system. Directors collaborated with a software developer, building web-based results-oriented management software to support local access. This qualitative inquiry used director interviews to examine whether collaboration permeated the glass wall. Interview results indicate that the new software created unanticipated organizational demands. The glass wall stands, but adding organizational supports may yet facilitate local utilization of state-collected mental health data.

Implications

Further CSR utilization will be supported by (1) agency and state support for ongoing building of skills for using the online reporting software; (2) agency support for the new kind of space the online data delivery demands; (3) agency and state strategies for creating reports of local management benefit from CSR data; and (4) establishment of new IT priorities for activating the software’s full capabilities.

CONTENTS

1. BACKGROUND

1.1 Bottom-Up Development of Results Oriented Management (ROM) Software

1.2 Prior Study and Findings

1.3 Study Purpose

2. LITERATURE REVIEW – THROUGH THE GLASS WALL: LESSONS LEARNED FROM CHANGING TECHNOLOGY FOR KNOWLEDGE TRANSFER IN KANSAS CHILDREN’S COMMUNITY-BASED SERVICES

3. METHODS

3.1 Research Objectives

3.2 Research Design and Instrumentation

3.3 Sample and Data Collection

3.4 Data Analysis

4. FINDINGS

4.1 Anticipated Uses of CSR Data

4.1.1 Resource Allocation

4.1.2 Supervising Staff and Caseloads

4.1.3 Developing Local Practice Guidelines

4.2 Realized Uses for ROM-Supported CSR Data

4.2.1 Improved Data Entry

4.2.2 Answered Questions

4.3 Unanticipated Barriers: Data Delivery Change as Organizational Change

4.3.1 Ongoing Training

4.3.2 Reflecting on Available Data

4.3.3 IT Support

5. DISCUSSION

5.1 Study Limitations

5.2 Implications

5.3 Recommendations

6. REFERENCES

Appendix A – Human Subjects Committee Lawrence Project Approval Letter

Appendix B – Information Statement

Appendix C – Member Check PowerPoint handout

Results Oriented Management Software in Kansas Children’s Community Based Services:

Expanding Local Uses of Automated Information Management System Data

1. BACKGROUND

The State of Kansas administers children’s mental health services through a system of community mental health centers (CMHCs), which are gateways to services within geographically defined catchment areas across the state. Services for children and youth include community-based services (CBS), developed to reduce the risk of state psychiatric hospitalization for children and youth with severe emotional disturbance (Kansas Social and Rehabilitative Services [SRS], 2005a). CBS include case management, attendant care, psychosocial groups, partial hospitalization, home-based family therapy, and respite care. Each CBS has a director whose duties might include direct practice, clinical supervision, and administration. This is an early report on the effects of a second generation of collaborative efforts between Kansas SRS and the University of Kansas School of Social Welfare (KU), and on uses of the data in local Kansas children’s community-based services. The collaboration commenced in 2002, when the Kansas mental health authority introduced Children’s Client Status Reports (CSR) for evaluating programs that receive state or federal funds for treatment of severe mental illness, and an automated information management system (AIMS) for collecting, storing, aggregating, and disseminating client status and other data (Kansas SRS, 2005b). Client status is the level of restriction in client living situations, ranked on a continuum from least restrictive (permanent home) to most restrictive (institutional placement) (Rapp, Gowdy, Sullivan & Wintersteen, 1988). The CSR also includes clinical information such as behavior and functioning scores, school data such as attendance, grades, and special education status, community data such as contact with law enforcement and child welfare, Medicaid eligibility, and demographics (Kansas SRS, 2005b). The CSR was designed to measure system-level performance for state and federal reporting, and to inform local quality improvement efforts (Kansas SRS, 2005b).

A prior study indicated that only about half of the state’s CBS directors used the CSR for local quality improvement, and only at the program level; only one director identified a CSR application at the client level (Kapp & Stipp, 2010). Agencies’ access to their own CSR data was restricted after it was entered into the state’s repository. CSR data was disseminated back to the CBS in quarterly print reports of aggregate data, with a lag time of up to six months between data entry and report dissemination. Kansas CBS directors were eager for change in their data delivery (Kapp & Stipp, 2010). There continue to be national and international calls for technology to improve local mental health service delivery (President’s New Freedom Commission, 2003; World Health Organization, 2004). In Kansas CBS as elsewhere, however, technology has fallen short of its promise to amass data that transfers knowledge back into local quality improvement. The 2009 iteration of the collaboration introduced web-based results oriented management (ROM) software, which provides nearly real-time access to local CSR data (Kansas SRS, 2009).

1.1 Bottom-Up Development of Results Oriented Management (ROM) Software

To that end, a University of Kansas social worker developed and introduced web-based reporting software for Kansas CBS, providing online access to disaggregated CSR data, updated every 10 days. ROM was developed and implemented collaboratively with Kansas CBS directors, and in cooperation with the state mental health authority. There was some anticipation that the bottom-up approach to information technology would support knowledge transfer in Kansas CBS. The Kansas CBS directors’ outcomes subcommittee served as the field test group for development of ROM. Each committee member had at least four and as many as a dozen interactions with the developer, who also served as the trainer. Ongoing interactions between the developer and the field test group included field testing in outcomes subcommittee meetings. The subcommittee also attended the developer’s presentations at statewide CBS directors’ meetings, and trainings on ROM, which occurred in online and telephone formats. Each of the outcomes subcommittee members also made contact with the developer outside scheduled venues, by phone, email, through a link in the ROM website, or face-to-face. ROM provides local access to CSR data in what one director called “nearly real time.” Software capabilities make it possible for CBS directors to run reports aggregated to the agency and state levels, as had been available in the quarterly print report, as well as reports disaggregated to a particular time period, to subgroups of children and youth, and to any of the CSR domains. Users can also cross-tabulate reports by client age, custody status, ethnic group, gender, Medicaid eligibility, grade level, or special education services, in tables or graphs, using counts or percentages (Kansas SRS, 2009).
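For illustration only, the sketch below shows the kind of subgroup-by-domain cross-tabulation described above, using Python with pandas. The records and column names (client_age, in_foster_care, age_group) are hypothetical stand-ins for CSR domains, not the actual CSR schema or the ROM implementation.

```python
# Illustrative sketch only: hypothetical CSR-style records, not the actual
# ROM software or Kansas CSR schema. Shows the kind of subgroup-by-domain
# cross-tabulation, in counts or percentages, that the report describes.
import pandas as pd

# Hypothetical disaggregated client records
csr = pd.DataFrame({
    "client_age":     [6, 9, 12, 15, 16, 8, 11, 14],
    "in_foster_care": [True, False, True, True, False, False, True, False],
})

# Bin ages into groups, then cross-tabulate age group by foster care status
csr["age_group"] = pd.cut(csr["client_age"], bins=[0, 10, 14, 18],
                          labels=["0-10", "11-14", "15-18"])
print(pd.crosstab(csr["age_group"], csr["in_foster_care"]))

# The same table as row percentages, comparable to ROM's percentage option
print((pd.crosstab(csr["age_group"], csr["in_foster_care"],
                   normalize="index") * 100).round(1))
```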

1.2 Prior Study and Findings

Prior to data collection, for first-hand knowledge of field testing and training for ROM, the interviewer attended a Kansas CBS directors’ meeting at which the developer demonstrated the website, and participated in trainings alongside CBS directors and other data users from the agencies. Researchers developed a semi-structured interview guide informed by the interviewer’s training on features of ROM, and by the study’s literature review. The interview guide included questions about each director’s (1) exposures to and training in ROM, (2) comparisons between web and print access to the CSR, (3) plans to use ROM, including applications for filters and subfilters, (4) purposes for which directors were accessing the CSR through ROM, and (5) agency involvement with the new CSR delivery. This study commenced two months after online CSR availability replaced the print reports, and member checking occurred five months after the initial interviews. The research team invited the seven field test group members to participate in interviews about how well ROM was supporting their CSR access. Six group members were able to participate in the first round of interviews. Study participants represented the state’s urban, semi-urban, rural, and frontier catchment areas. The interviewer emailed an information statement and a guide for the semi-structured interviews to each study participant, and used the guide for leading 30-45 minute telephone conversations with each director. Interviews were audio recorded with directors’ oral consent.

Kansas CBS directors prepared for the pending data delivery change, anticipating the benefits of what they called “drilling down” into a variety of CSR domains. Directors anticipated being able to create reports specific to particular programs and client groups that would inform their resource allocation, staff and caseload supervision, and development of local practice guidelines. Directors wanted better access to the domains they had been accessing in the quarterly, aggregate print reports. Four of the six directors reported that in the print reports, they had routinely compared their center’s client status outcomes to state client status outcomes. There were also directors who routinely looked at Medicaid status, school domains, behavior scores, client age, and penetration rate per 10,000 children in the population. Each of the directors wanted to use ROM for drilling down into the domains they had been in the habit of looking at in the print reports. All but one of the directors reported looking at multiple domains, and no two directors looked at the same sets of domains. Directors were curious about what they might be able to glean from domains they had not been in the habit of perusing.

To those ends, directors collaborated in development of the software and prepared themselves for the new data delivery technology. Beyond their exposures to the website as part of the field test group, each of the directors experienced between one and three hour-long training sessions. Five of the six directors involved their treatment team leaders in the training sessions, to “get it into the hands of people who would use the data, perhaps actively.” Directors wanted team leaders to be able to get into the data for themselves, to look at which sets of local practices were correlated with positive outcomes. After the training, each director spent hours in their office familiarizing themselves with the software. They invited team leaders to join them, so team leaders could also learn to “drill down and actually look at their team and how their team is doing.” Directors also invited other agency personnel with particular interest in either outcomes or software, including IT personnel, site managers, and an executive director, to join them as they familiarized themselves with the website. Data timeliness was supporting directors’ supervision of data entry accuracy. In the print CSR, there had been no way of isolating or correcting apparent data inaccuracies.

We got the [print] CSR usually about five months from the day it was submitted. That doesn’t help. The thing that is very good about ROM is the timeliness of it. I love the very near real time component to it.

ROM’s access features helped directors track inaccuracies back to the source.

It has led us to ask, “Are we reporting data accurately?” When we were getting this data three or four months after we reported it, it was hard to go back and even know where you start to look, where the problems might be, so it’s a timeliness that is really helpful.

Four directors were working with their local IT departments to correct the inaccuracies they had been able to isolate via ROM. Three directors were working with team leaders and case managers to prevent further inaccuracies, and were developing training for team leaders and case managers around domain definitions, to improve the consistency of data collection. Directors further anticipated that improved CSR data accuracy would increase local trust in, and uses for, the CSR data. Directors saw improved data accuracy as a step toward their systemic uses for the data. Directors were creating some reports, somewhat tentatively, and not yet in the systematic manner they were anticipating. A director described, for instance, the surprise of successfully creating a report using ROM. The director was “getting ready to have a meeting with the foster care contractors here, and I set it in that perusal thing to look at ages, and there it was - specific age groups of children in foster care!” There were few such examples, however, and no examples of reports that systematically supported resource allocation, supervision, or practice protocols. Directors referred to the time they spent familiarizing themselves with ROM, apart from formal training, as “playing,” which did not sound like, or feel to them like, a productive use of their management time.

My staff person and I played for awhile. I mean, there is a point at which I cannot play any longer. We must have taken half an hour at that point, and that is probably more time than I could spare.

Directors were reluctant to spend more time “playing” with how to get the reports they wanted. Directors intended to become proficient in their use of ROM, and to lead their staff and other CBS directors to achieve proficiency, but “I don’t feel like I’m quite there, yet.” Five of the six directors in the initial interviews wanted formal follow-up trainings that would help them become “savvy enough” to generate the reports they wanted, both for their own use and for supporting data utilization by their team leaders. All five wanted more of the web-based trainings, and two additionally requested a face-to-face venue that would be more comfortable for them as “old school” learners. The arrival of the old quarterly print reports had been just “bloop, there you were, presented the report and that was it.” The bulky print reports had taken up space on agency shelves, but neither demanded much from, nor gave much back to, the directors or the agencies. ROM did not take up physical space, but demanded the work and time commitments of creating reports specific to local questions.

Everybody’s excited. It’s just a matter of knowing it’s there and getting used to using it. We’ve been used to everything coming to us. . . . It’s less convenient, so it’s just something that you just have to make sure you’re thinking about.

Neither the directors nor their agencies had anticipated the new kind of space in their schedules that data utilization via ROM would demand. “It’s just a change for all of us. Just getting used to this is the new way of getting the data, and for us to carve out that time.” Directors had to schedule a time to even look at the reports, and were reminding their team leaders to look at the reports, “because they are not used to using this. . . . It’s just not on their list of many things to do, I’m sure.” Directors needed organizational support, including identified time, for including the CSR in their evaluative activities. Directors had a sense that there might be valuable information even in domains they had not perused in the print reports. Three participants suggested that follow-up training include a sort of peer training, a venue for exchanging ideas about what reports were being used and for what purposes, across the state. Not all exploration has to be by trial and error; it would take any one director a very long time to explore every possible combination of measures, filters, and time periods. After directors achieve proficiency in using CSR data via ROM, they may be positioned to be one another’s best teachers about what domains are locally useful. Directors suggested that a report that has already proven useful in another setting might also inform their own data utilization. That no two directors were using the same sets of domains was consistent with directors’ intuition that peer training would be valuable. Directors needed the support of local IT to realize the possibilities of ROM. The website allows users to drill down by time period and subpopulation, but if local identifiers are loaded, the data user can also drill down by treatment team and even by individual client. Built-in confidentiality safeguards precluded looking at data on individual clients and on treatment teams without local IT support for setup.

If it’s not connected to a team or a client, if you can’t figure out when that kid came in for services, what the behavior score is, how many absences the kid has, and what point they are at seven months later, it’s not meaningful. My team leaders feel like this would be really helpful if we could track how our kids are doing.

1.3 Study Purpose

The presumption that enough data would automatically suggest objective and correct decisions has been replaced with an understanding that data are useful only as they are endowed with meaning by a data user (Davenport & Prusak, 2000; Drucker, 1988). There is little local value in an organization’s central information repository without a mechanism in place for transferring data back to the local practitioners and managers who endow the data with local meaning. A glass wall has separated Kansas’ automated information management system from the CBS directors who would endow data with local meaning. This study examines the anticipated benefit to CBS directors of serving as the field test group, that is, what CBS directors anticipated doing with improved access to the CSR data. The study examines the benefit directors realized from ROM. The study also examines barriers to anticipated benefits of improved technology for data delivery, and organizational supports that might yet affect the glass wall.

2. LITERATURE REVIEW – THROUGH THE GLASS WALL: LESSONS LEARNED FROM CHANGING TECHNOLOGY FOR KNOWLEDGE TRANSFER IN KANSAS CHILDREN’S COMMUNITY-BASED SERVICES

Technology is so well suited to managing evaluative data that business strategists at one time viewed computers with an eye toward their performing formulaic decision making that would render middle management obsolete. The argument was that with enough data, “objectively correct decisions will automatically suggest themselves” (Davenport & Prusak, 2000, p. 3). Organizations gathered data and developed central information repositories to manage the data. Experts predicted that computers, developed for use in warfare and the hard sciences, would have a tremendous impact on business strategy, policy, planning, and management, “on none of which the computer has, however, had the slightest impact at all” (Drucker, 2008, p. 332). A “glass wall” (Basu & Jarnagin, 2008, p. R4; Pettit, 2008, p. A21) separates communities that generate data from communities that would use data for building local knowledge. Differences in mindset, language, social influence, expertise, and areas of control between the information-generating communities and the local-use communities make the glass wall impervious to penetration (Basu & Jarnagin, 2008; Jacobson, 2007; Jacobson, Butterill & Goering, 2004; Pfeffer & Sutton, 2000; Weiss & Weiss, 1981). Knowledge transfer to the local level happens, when it does happen, within a complex interface of researchers and organizations (Canadian Institutes of Health Research, 2004). Knowledge transfer involves program evaluators who decide what to measure, information technologists who decide how to collect and disseminate the data, local decision makers who would use the data, and the dynamics of the local organizations where data would be used (Jacobson, Butterill & Goering, 2004, 2005; Kapp & Anderson, 2010). Evaluation is integral to mental health service delivery, and practice-based researchers in health and mental health settings need access to existing sources of local data for building at least correlational knowledge about what seems to be working, and for which groups, in a way that informs them about what to do next (Grasso & Epstein, 1992; Epstein & Grasso, 2004; Peake & Epstein, 2004; Thyer, 2007). A task of health and mental health services researchers is to transfer knowledge back to local decision makers (Jacobson, Butterill & Goering, 2005). Academics have, at times, erroneously labeled local managers and practitioners as research-resistant, but it seems that the perceived resistance is an aversion to non-utilitarian data collection rather than an aversion to evaluative research (Peake & Epstein, 2004). In work environments with few resources available for managers and practitioners to develop their own data sources, they rely on existing data for local evidence of practice effectiveness. There is a frustration with mental health data sources that meet contract requirements without transferring information back to the agency for the development of local knowledge (Kapp & Stipp, 2010). Social workers want access to performance measures at the system, program, and clinical levels, to advance knowledge about effective and efficient services for policy and program development (Mullen, 2004). Opportunity is lost to social work when the glass wall separates clear-cut numeric data sequences from the highest practice priority, which is the “whole person in the context of the social environment” (Peake & Epstein, 2004). Program evaluators, information technologists, local decision makers, and organizations all support the idea of knowledge transfer, yet there remains a glass wall between reposited mental health data and would-be local data users.

3. METHODS

3.1 Research Objectives

The research objectives were to discover the field test group’s (1) anticipated benefit from investing themselves in the new online reporting software, (2) actual uses of the online reporting software, and (3) recommendations for organizational supports to further the effect of online reporting software on the glass wall in Kansas CBS.

3.2 Research Design and Instrumentation

This study of an effort to improve data access for Kansas CBS was a qualitative inquiry built on a research tradition useful for exploring mental health service delivery processes (Luchins, 2003; Shaw & Gould, 2001). The research team consisted of a principal investigator and a graduate research assistant who was the study interviewer. Approval for implementation of this project was granted in February 2009 by the Human Subjects Committee Lawrence Campus, University of Kansas Institutional Review Board, #17871. Approval was extended in February 2010, through February 2011 (Appendix A). Transcriptions of six interviews comprised the data from the first round of interviews, as described in the background section. A member check interview occurred at an outcomes subcommittee meeting at which the developer and trainer for ROM presented website updates. Five of the six initial interviewees were present for the meeting, as was a committee member who had not participated in the initial interviews. The developer stayed for the member checking. The interviewer distributed an information statement (Appendix B) to each of the participants and received participants’ verbal consent to tape the group’s feedback after the presentation. The interviewer presented interview results, supported by PowerPoint, to elicit member feedback (Appendix C). The feedback included member check information, as well as information about participants’ utilization of data made accessible to them by ROM in the five months since the first interviews. The transcript of the taped member checking/follow-up interview was added to the study data, for further analysis of unique and common themes, codes, and patterns. Researchers used word processing software to support data management.

3.3 Sample and Data Collection

A seven-member CBS outcomes subcommittee had served as the field test audience for ROM. Each of the outcomes subcommittee members was invited via email to participate in this study; six of the seven subcommittee members participated. In subsequent email exchanges, the interviewer sent an interview guide to each participant and set appointments for telephone interviews. The interviewer followed the interview guide in 30-45 minute telephone conversations with each of the six participants. One of the participants included the agency’s ROM site coordinator in the conversation. The interviewer asked for and obtained permission of each participant to audio record the interviews. Study data consisted of the six transcribed interviews.

3.4 Data Analysis

The first coding level was developed from the five question categories in the semi-structured interview guide. The second coding level was developed from open coding of responses that expanded on the interview guide, capturing unanticipated uniqueness and commonality of director responses (Boeije, 2002; Drisko, 2001). Researchers analyzed responses by reading and rereading the interview text, for a comparative analysis within and between texts (Patton, 2002). Researchers achieved inter-rater agreement by discussing and identifying themes, codes, and patterns that represented the uniqueness and commonality of CBS director responses, and strove for internal validity by reflexively developing both anticipated and serendipitous themes (Boeije, 2002; Drisko, 2001). Researchers conducted member checking to further ascertain authenticity of the results.

4. FINDINGS

This study explored Kansas CBS directors’ (1) anticipated return on investment in the new online reporting software, (2) realized benefit from the online reporting software, and (3) organizational supports needed to affect the glass wall in Kansas CBS. Directors in the field test group collaborated in software development and prepared themselves for the new data delivery technology, in anticipation of using the data for what one called a “targeted approach” to management. With newly developed software skills, and existing organizational support, directors were already beginning to improve data entry and create reports for building their local knowledge. The timeliness of the data made it possible for directors to better trace apparent inaccuracies to their sources for improving local data entry, and to find immediate answers to local questions. The bottom-up approach, however, had not yet proven adequate to support the systematic development and use of reports that directors envisioned. At the time of this study, directors had begun to create reports on a small scale, but experienced barriers to what they anticipated doing with data made accessible to them by ROM. The change in data delivery created a need for enhanced organizational support, without which knowledge transfer from the Kansas CSR to the directors’ local agencies would not be realized.

4.1 Anticipated Uses of CSR Data

Directors anticipated the transfer of timely CSR data, and were eager to use filters for focusing their attention on particular subpopulations, for informing local (1) resource allocation, (2) staff supervision, and (3) development of local program and treatment protocols.

4.1.1 Resource Allocation

Directors would derive meaning for local resource allocation from client subpopulation data. Directors who looked at client status, for instance, wanted to know not only how many clients were living in a permanent home, but also who those children and youth were. They wanted to look at client status by domains including behavior scores and admission date.

If we’re lower than the state average on permanent home, then which kids are low? Are they the ones that just came in? Are they the ones that have been here for a while? Where are their behavior scores? Who are those kids?

Similarly, directors who looked at behavior scores in the old print reports wanted to use data made accessible to them by ROM to “run different combinations,” such as whether children and youth with behavior scores in the clinical range had a lot of school absences, whether they had an individualized education plan, or whether the school had assessed their need for special education supports.

More than just to say, “we’re a little bit below the state average,” or “we’re above the state average so we don’t have to worry about that one,” it’s really helpful to us as a program to see which of our kids are doing better, or doing worse, and for the ones that are doing worse, what else is going on with them?

Directors wanted to access data that would allow them to assess outcomes for groups of consumers, for applying limited resources to subpopulations with the greatest need.

4.1.2 Supervising Staff and Caseloads

CBS directors would assign meaning to client outcome data in terms of areas in which clients were experiencing successes, as well as areas in which workers and teams needed additional support, including more appropriate caseloads and additional training. The print reports had only allowed directors to say:

These are areas that are low and this is what we’re getting good in. . . . With this new CSR report we’re going to be able to find out which providers and consumers we need to work with.

Directors collaborated in ROM development in hopes of better monitoring and maintaining a balance of caseload severity, for enhancing supports to workers with more demanding caseloads.

We’re going to be developing a system by which we can send information by team, and by case manager, so that when we go in and look at the information we can actually drill down and see, how well does this team compare to that team?

Directors needed data for comparing outcomes for teams and clients, to better supervise staff for increasing successful client outcomes.

4.1.3 Developing Local Practice Guidelines

Directors would derive meaning from outcome data by examining correlations between satisfactory outcomes and local programs and interventions, for informing development of local practice guidelines. Directors wanted to use the software’s real time feature to look at correlations between local practices and client outcomes, including improved grades and school attendance and better behavior scores. “My plan is,” said one director, “that team leaders and I will at least monthly track trends and consumers, that these are the ones whose grades aren’t getting better, or they’re missing this much school.” Directors would look at behavior scores longitudinally for “whether or not that client’s behavior score was going up or down, so you would have an idea of that parent’s perception of that client, whether that client was improving or getting worse.” Directors needed data that supported between-team comparisons of the sets of local programs and practices that are time-correlated with better local outcomes.
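As a minimal sketch of the longitudinal tracking directors describe, the following Python fragment computes each client’s change in behavior score across reporting periods. The table and column names are hypothetical illustrations, not the CSR’s actual fields.

```python
# Minimal sketch, with hypothetical fields: the change in each client's
# behavior score from the first to the most recent reporting period, the
# kind of up-or-down trend directors describe wanting to track.
import pandas as pd

scores = pd.DataFrame({
    "client_id":      [1, 1, 1, 2, 2, 2],
    "period":         [1, 2, 3, 1, 2, 3],
    "behavior_score": [60, 55, 50, 48, 52, 58],
})

# Per-client difference between last and first recorded scores
trend = (scores.sort_values("period")
               .groupby("client_id")["behavior_score"]
               .agg(lambda s: s.iloc[-1] - s.iloc[0]))
print(trend)  # whether a change means improvement depends on the scale's direction
```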

4.2 Realized Uses for ROM-Supported CSR Data

Directors reported that in the months since the reporting software had been available online, its timeliness had helped them improve data accuracy, and its features enabled them to create reports that answered some local questions. Although directors were not yet creating reports in the systematic ways they anticipated, they believed improved data accuracy and even anecdotal report development were precursors to their systematically drilling down into the data.

4.2.1 Improved Data Entry

Four of the study participants described tracking data entry accuracy with the online reporting software.

When we were getting this data three or four months after we reported it, it was hard to go back and even know where you start to look, where the problems might be, so it’s a timeliness that is really helpful. It has led us to ask, “Are we reporting data accurately?”

The timeliness and specificity of the online reporting software helped directors track data inaccuracies back to their sources.

Having tracked that information, four directors were working with their local IT departments to correct data inaccuracies at their sources; three directors were developing further training around domain definitions for case managers identified by the software as sources of inaccuracies. Directors reported that improved CSR accuracy would further encourage their own efforts to develop reports in a systematic way.

4.2.2 Answered Questions

Directors gave anecdotal accounts of reports they created along the way, with some surprise that clicking on the designated icons actually gave them what they wanted. A director described, for instance, the surprise of successfully creating a useful report for a meeting with foster care contractors. In preparing for the meeting, the director became curious about the ages of children receiving CBS who were in foster care. The director “set it in that perusal thing to look at ages and there it was - specific age groups of children in foster care.” Although these kinds of returns on investment were not yet at the level directors were anticipating, directors found those preliminary experiences of finding immediate answers to local questions encouraging to their development of more systematic reports.

4.3 Unanticipated Barriers: Data Delivery Change as Organizational Change

The value of the print CSR had been for comparing agency-wide performance against state performance, and against other agencies’ performance. Some of the field test group, however, had not yet found the system-level comparisons they had perused in the old print reports. The bottom-up approach to software development created a user-friendly format, but the bottom-up approach to change in data delivery did not generate adequate organizational support. None of the directors indicated that their community mental health center was resistant to the transition from the print to the web-based reports. But neither had the agencies carved out a niche for the new data delivery system in their evaluative activities, or established accessing the CSR data through ROM as an agency IT priority. The arrival of the old quarterly print reports had been an “event” to which directors responded by perusing familiar domains. The print reports had been just “bloop, there you were, presented the report and that was it.” The bulky print reports had taken up space on agency shelves, but neither demanded much from, nor gave much back to, the directors or the agencies. ROM did not take up physical space, but demanded the work and time commitments of creating reports specific to local questions. Now that the data were always available for directors to use, directors were fitting them in as they could, but on their own.

Everybody’s excited. It’s just a matter of knowing it’s there and getting used to using it. We’ve been used to everything coming to us. . . . It’s less convenient, so it’s just something that you just have to make sure you’re thinking about.

Neither the directors nor their agencies had anticipated the new kind of space in their schedules that data utilization via ROM would demand. “It’s just a change for all of us. Just getting used to this is the new way of getting the data, and for us to carve out that time.” Directors needed further support from their local agencies, and in some instances from the state, for adopting the web-accessed CSR data into their local evaluations. Directors reported that they needed (1) ongoing training, (2) a niche within the agency for looking at the data, (3) opportunity for reflecting on possible domain applications, and (4) ready IT support for loading local identifiers.

4.3.1 Ongoing Training

At the member checking, directors still wanted to learn how to create the reports they needed, but had done very little additional “playing” since their formal training concluded. Directors at the member check asked for ongoing training, at least quarterly but as often as monthly, until they achieved what they considered to be adequate proficiency.

4.3.2 Reflecting on Available Data

It turned out that the work directors had done to learn the mechanics of the website was only the beginning of the time commitment for utilizing ROM. The new data delivery system was not taking up space on a shelf, but running reports, and even before that, deciding what reports to run, would take up space in directors’ and team leaders’ schedules.

4.3.3 IT Support

At member checking, four directors were in conversations with their local IT departments about loading local identifiers. Directors were accepting for themselves the onus of engaging IT support, but:

This isn’t a priority project, so I just kind of have to wait in line a little bit. . . . I’m going to have a hard time with getting my IT department to do that. My IT person says “well, I can give you those reports,” but she does not have the time. . . . But that’s on my end. I mean, that’s talking my person into whatever she has to do.

The field test group did not know of any agency in the state for which local identifiers were loaded. Even the director in a multi-site agency was still waiting for IT to load identifiers that would support looking at site-specific data. Kansas has multi-site agencies that cover geographically large but sparsely populated catchment areas. Until local identifiers are loaded, data remain aggregated across dozens of counties. A director at the member check interview suggested that rural and frontier agencies, some of which still used paper documentation, might have to expect that they will not get the local IT support they need for loading local identifiers. Others suggested that the state will have to be involved in loading local identifiers in sparsely populated catchment areas. As the field test group, study participants had more exposures to ROM than will most other CBS directors in the state, yet even this group will need further support if the software is to be successful as a vehicle for bringing CSR program- and client-level performance data to local evaluators. Directors need agency involvement for engaging local IT support and for determining which domains and filters are best suited for providing local evidence of what works.
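A minimal sketch of the mechanism directors describe, under assumed table and column names: state-level CSR extracts carry only a state client ID, and a locally loaded identifier table maps those IDs to treatment teams, enabling team-level drill-down. This is an illustration, not the ROM implementation or its confidentiality safeguards.

```python
# Illustrative sketch with hypothetical tables: joining a locally maintained
# identifier table to state-level CSR records so outcomes can be compared
# by treatment team. Not the actual ROM setup process.
import pandas as pd

# Hypothetical de-identified CSR extract (state client IDs only)
csr = pd.DataFrame({
    "state_client_id": [101, 102, 103, 104],
    "behavior_score":  [42, 67, 55, 71],
    "school_absences": [3, 12, 5, 9],
})

# Hypothetical local identifier table, loaded and maintained by agency IT
local_ids = pd.DataFrame({
    "state_client_id": [101, 102, 103, 104],
    "treatment_team":  ["Team A", "Team A", "Team B", "Team B"],
})

# With identifiers loaded, the data can be "drilled down" by treatment team
by_team = (csr.merge(local_ids, on="state_client_id")
              .groupby("treatment_team")[["behavior_score", "school_absences"]]
              .mean())
print(by_team)
```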

5. DISCUSSION

5.1 Study Limitations

Personal interactions encourage participants to provide what they believe to be socially desirable responses (Dillman, 2000). This study employed personal interactions between interviewer and interviewees during software training and throughout the interview process. The limitation was accepted for its value in supporting the interviewer’s understanding of participant experiences with learning the software. The process was similar to participant observation, which capitalizes on subjective knowing. At member checking, the software developer was present to hear respondent feedback, which might have led participants to provide socially desirable responses. The developer, however, encouraged participants to be forthright in their responses by explaining that his access to the website let him know the extent to which each participant had accessed the data. The interviewer asked CBS directors to recall what data they had used to inform decisions. It is difficult to recall data use after the fact; an in vivo study, in which directors recorded their actual data utilization, might have provided a more accurate picture (Reid & Fortune, 1992). In this study, however, the novelty of the software, and the short time elapsed since the software had become available, likely supported directors’ recall. This study focused specifically on data utilization supported by a new online reporting software system in Kansas CBS. Qualitative methods and the small sample size do not support generalizability to other settings. In-depth information about the cooperative process of introducing the website into local management processes, and about resultant organizational barriers, may nonetheless transfer at least across Kansas CBS, and perhaps inform support for knowledge transfer in other settings (Lincoln & Guba, 1985). The study examined several facets of what might comprise a glass wall. One facet not included in this study was the actual data collected; the study focused solely on the viability of ROM, without considering the degree to which the adopted measures are potentially useful to local settings.

5.2 Implications

IT capabilities have become the maxim for improving efficiency in 21st century mental health services, and technology plays an important role in measuring outcomes for Kansas CBS. Kansas’ automated information management system is impressive technology for managing statewide CBS outcomes data collection. ROM, which did not change the outcome measures but did change the dissemination of those measures, is likewise impressive new technology for making local access to the outcomes data possible. The promise of the information age was that people would be able to do what they wanted with their information, supported by information technologies that made the impossible commonplace. Although impressive in their own right, however, information technologies are means to ends rather than ends in themselves (Davenport & Prusak, 2000; Drucker, 2008). Technology’s value is measured not by its capacities and capabilities, but by the degree to which it connects potential data users with local data (Ichijo & Nonaka, 2007). CBS directors were engaged in collaborating in ROM development by the promise that their work would lead to the transfer of information back into their own local knowledge building. The software was introduced collaboratively rather than from the top down, out of a collegial relationship between the developer, who is a social worker, and the potential data users, who had been calling for change in their data delivery status quo. Few directors had input into the development of the existing measures a decade prior, but directors were now trying to look at that data on their own online, in hopes of gleaning some client and program information that would be useful to their local evaluations. Directors had begun using the software to improve the accuracy of their local data, but the data were not yet in their hands to the degree they had envisioned. Despite the data timeliness, accessibility, and ease of use, and despite the bottom-up approach to the software development, the glass wall was still apparent. Measurement designs change organizational policy and dynamics (Diamond & Shirky, 2008; Kapp & Anderson, 2010). The glass wall occurs at the point where technology meets the organization (Basu & Jarnagin, 2008; Pettit, 2008). From an organizational perspective, the new vehicle for CSR delivery had not yet interfaced with the Kansas mental health agencies. The software developer and field testers were ahead of organizational support for the changes the new data delivery system would precipitate. IT improves care when it is transformed at the same rate as the “underlying system processes” (Diamond & Shirky, 2008, w383). CBS system processes, including time for learning and using the software, time for reflecting on available domains, and IT support for loading local identifiers, lagged behind the developer’s work of implementing and training in the software, and behind directors’ investment in bringing local data to their own desktops.

Organizational change often creates more work than is required by the status quo (Kapp & Anderson, 2010). Directors had already invested heavily in preparing themselves and their teams for the new data delivery, but the work had only begun. Directors had perused outcomes in the print CSR, but they were moving from the work required for perusing to the work required for drilling. Study participants envisioned how drilling down by subpopulation or team would help them better allocate resources to clients and workers experiencing the greatest need, and how drilling down by time period would help them determine the effectiveness of particular programs and practices. The drilling metaphor directors used conjures up images of road crews and heavy equipment. Likewise for Kansas CBS directors, the resources, including time, required for drilling down into the data are not incidental to the process. Directors remained enthusiastic about the new data delivery system, but at the member check interviews, the drilling was stalled. Directors were still trying to find time to become proficient in the software, to identify useful domains for drilling, and to engage local IT support for loading local identifiers. That no two directors had been in the habit of looking at the same sets of domains indicates that the access problem was not specific to a particular domain, but that the need for further support was systemic. Agencies were not involved in carving out time for directors to first become proficient in the software so they could create useful reports, or in reflecting with the directors on the types of reports that would be locally valuable. Nor had the agencies made IT support a priority. Directors looked forward to accessing their data, supported by software that allows them to identify correlations between practices and outcomes; evidence of what works makes it possible to develop informed improvements to local policies and programs (Mullen, 2004). The practice of tracing outcomes to practices within a time period within an agency was the forerunner to the evidence-based practice movement, and remains valuable for developing local evidence of practices correlated with better outcomes (Walkup & Yanos, 2005). Directors had plans to do that, using software they described as “elegant” and “intuitive,” and with the support of the “very accessible” developer and trainer, yet the glass wall has not yet yielded, which is consistent with decades of experience with transferring data out of its repository into local use. There remains the promise, however, that organizational support for technology introduced in response to directors’ demand for change may yet create an interface between technology and local mental health agency data utilization.

What study participants wanted from the software was consistent with the software’s capabilities, and with what social work practitioners and managers have long wanted from their data, including the ability to look for group differences and time correlations. Collaboration between people with clinical expertise and people with technical expertise is time consuming, but thought to be worthwhile for supporting technology’s role in local knowledge acquisition (Cnaan & Parsloe, 1989). Kansas CBS directors participated in development of software that met their demands for improving data timeliness and accessibility, but the glass wall was still apparent where it has been observed elsewhere, at the point where technology meets the organization (Basu & Jarnagin, 2008; Pettit, 2008). ROM did not change what was being measured or the way it was measured, but its change to data dissemination precipitated organizational change. When new technology precipitates organizational change, potential data users weigh their perceptions of what they think the technology can do for them against how easy or difficult the technology seems to be, in deciding whether to incorporate that technology into daily use (Venkatesh & Morris, 2000). Technologies involved in organizational change lie unused when the effort seems too great for the return on investment. Beyond the skill-building tasks directors had already accepted, other apparent facets of the organizational change process would include developing new strategies for looking at the data, developing strategies for creating reports with optimal management

benefit, and tracking down IT support for activating the software’s full capabilities.

5.3 Recommendations

In prior study, Kansas CBS directors have shown themselves to be fairly sophisticated users of data. Technologies reconstitute the roles and identities of people in groups (Kline & Pinch, 1996; Pinch, 2010). Prior study of Kansas CBS directors’ uses of data in the state’s automated information management system described their efforts to use the CSR data “rather like an angler might describe tenaciously trolling for whatever might bite” (Kapp & Stipp, 2010, p. 140). Directors in this study, however, described using the online reporting software for “drilling,” which requires a different set of habits and behaviors, a different set of software skills, and a different set of specialized knowledge and tools. Technology plays an important role in outcome measurement in Kansas CBS. The eight-year-old automated information management system stores and aggregates copious CSR data for tracking CBS outcomes. The year-old ROM creates the potential for better local access to the data. For CBS directors to realize the new technology’s capabilities, however, they need organizational support for new patterns of information seeking. Technology’s influence on patterns of information seeking presents an ongoing challenge to breaking down the glass wall for Kansas CBS. Changes in technology create changes in daily activities (Venkatesh & Morris, 2000). IT improves care when it is transformed at the same rate as the “underlying system processes” (Diamond & Shirky, 2008, w383). Change in CBS system processes required time for learning and using the software, time and collaboration for developing useful reports from existing domains, and IT support for loading local identifiers. Changes in system processes have lagged behind the developer’s work of bringing the software online and training directors in the software, and behind directors’ investment in bringing local data to their own desktops. The new vehicle for CSR delivery had not yet fully interfaced with the Kansas mental health agencies. The Kansas CBS directors are preparing themselves to change their data use activities from trolling to drilling, and continue to await the skill and knowledge building, scheduling, and technological supports still required for permeating the glass wall.

The study recommends that Kansas CMHCs and the state mental health authority make loading CSR identifiers a priority. It is common for system processes, in this case data dissemination, to change more rapidly than local procedures (Diamond & Shirky, 2008). This study therefore recommends system process changes that will support the interface of the new technology with Kansas mental health agency organizational systems. Local agencies need:

(1) agency and state support for ongoing building of skills for using the online reporting software;

(2) agency support for the new kind of space that online data delivery demands;

(3) agency and state strategies for creating reports from CSR data that benefit local management; and

(4) new IT priorities for activating the software's full capabilities.

Bottom-up development, collaboration, and training in software utilization have not yet proven sufficient to bring down the barriers to knowledge transfer that remain between technology and the organization.

6. REFERENCES

Basu, A. & Jarnagin, C. (2008, March 10). Business insight (a special report): Information technology; how to tap IT's hidden potential: Too often, there's a wall between a company's information-technology department and everything else. The Wall Street Journal, Eastern Edition, p. R4.

Boeije, H. (2002). A purposeful approach to the constant comparative method in the analysis of qualitative interviews. Quality & Quantity, 36, 391-409. doi: 10.1023/A:1020909529486

Canadian Institutes of Health Research (2004). Knowledge translation strategy 2005-2009 (Cat. No. MR21-56/2004E-HTML). Ottawa: Her Majesty the Queen in Right of Canada. Retrieved from http://www.cihr-irsc.gc.ca/e/26574.html

Cnaan, R. & Parsloe, P. (Eds.). (1989). The impact of information technology on social work practice. New York: The Haworth Press.

Davenport, T. & Prusak, L. (2000). Working knowledge: How organizations manage what they know. Boston: Harvard Business School Press.

Diamond, C. & Shirky, C. (2008). Health information technology: A few years of magical thinking? Health Affairs, 27, w383-w390. doi: 10.1377/hlthaff.27.5.w383

Dillman, D. (2000). Mail and internet surveys: The tailored design method. New York: John Wiley & Sons.

Drisko, J. (2001). How clinical social workers evaluate practice. Smith College Studies in Social Work, 71, 419-439. doi: 10.1080/00377310109517638

Drucker, P. (1988). The coming of the new organization. Harvard Business Review, 66(1), 45-53.

Drucker, P. F. (2008). Management: Tasks, responsibilities, practices. New Brunswick, NJ: Transaction Publishers.

Grasso, A. & Epstein, I. (1992). Research utilization in the social services: Innovations for practice and administration. New York: The Haworth Press.

Ichijo, K. & Nonaka, I. (2007). Knowledge creation and management: New challenges for managers. New York: Oxford University Press.

Isett, K. & Phillips, S. (2009). Improving practice-research connections through technology transfer networks. Journal of Behavioral Health Services & Research, 37, 111-123. doi: 10.1007/s11414-009-9183-1

Jacobson, N. (2007). Social epistemology: Theory for the "fourth wave" of knowledge transfer and exchange research. Science Communication, 29, 116-127. doi: 10.1177/1075547007305166

Jacobson, N., Butterill, D. & Goering, P. (2004). Organizational factors that influence university-based researchers' engagement in knowledge transfer activities. Science Communication, 25, 246-259. doi: 10.1177/1075547003262038

Jacobson, N., Butterill, D. & Goering, P. (2005). Consulting as a strategy for knowledge transfer. The Milbank Quarterly, 83, 299-321.

Kansas Department of Social and Rehabilitation Services: Children's Mental Health Services (2009). Results Oriented Management website. Retrieved from https://rom.socwel.ku.edu/csr/Login.aspx

Kansas Department of Social and Rehabilitation Services (2005a). Child welfare mental health referral guide, Appendix 3J, p. 2. Retrieved from http://www.srskansas.org/CFS/PolicyDraft07012010/FormsAppendices/Appendix3JChildWelfareMentalHealthReferralGuide.doc

Kansas Department of Social and Rehabilitation Services (2005b). AIMS_V3.0 manual. Retrieved from http://www.srskansas.org/hcp/MHSIP/AIMS/aims_v30entire_revjune272005.pdf

Kapp, S. & Anderson, G. (2010). Agency-based program evaluation: Lessons from practice. Thousand Oaks, CA: Sage.

Kapp, S. & Stipp, K. (2010). Trolling for useful data in an automated information management system: Experiences of Kansas community mental health managers. Administration in Social Work, 34, 135-147. doi: 10.1080/03643101003608927

Kline, R. & Pinch, T. (1996). Users as agents of technological change: The social construction of the automobile in the rural United States. Technology and Culture, 37, 763-795.

Lincoln, Y. & Guba, E. (1985). Naturalistic inquiry. Newbury Park, CA: Sage Publications.

Luchins, D. (2003). The qualitative and quantitative traditions within mental health administration. Administration and Policy in Mental Health, 31, 183-186. doi: 10.1023/B:APIH.0000003050.68888.51

Mullen, E. (2004). Outcomes measurement: A social work framework for health and mental health policy and practice. Social Work in Mental Health, 2, 77-93. doi: 10.1300/J200v02n02_06

Patton, M. (2002). Qualitative research & evaluation methods (3rd ed.). Thousand Oaks, CA: Sage.

Peake, K. & Epstein, I. (2004). Theoretical and practical imperatives for reflective social work organizations in health and mental health: The place of practice-based research. Social Work in Mental Health, 3(1/2), 23-37. doi: 10.1300/J200v03n01_02

Pettit, D. (2008, March 25). IT leaders must change, not the business side. The Wall Street Journal, Eastern Edition, p. A21.

Pfeffer, J. & Sutton, R. (2000). The knowing-doing gap: How smart companies turn knowledge into action. Boston: Harvard Business School Press.

Pinch, T. (2010). The invisible technologies of Goffman's sociology from the merry-go-round to the internet. Technology and Culture, 51, 409-424. doi: 10.1353/tech.0.0456

President's New Freedom Commission on Mental Health (2003). Achieving the promise: Transforming mental health care in America. Final report (U.S. DHHS Pub. No. SMA-03-3832). Rockville, MD: U.S. Department of Health and Human Services. Retrieved from http://www.mentalhealthcommission.gov/reports/FinalReport/FullReport.htm

Rapp, C., Gowdy, E., Sullivan, W. & Wintersteen, R. (1988). Client outcome reporting: The status method. Community Mental Health Journal, 24, 118-133. doi: 10.1007/BF00756654

Reid, W. & Fortune, A. (1992). Research utilization in direct social work practice. In A. Grasso & I. Epstein (Eds.), Research utilization in the social services: Innovations for practice and administration (pp. 97-111). New York: Haworth Press.

Shaw, I. & Gould, N. (2001). Qualitative research in social work. Thousand Oaks, CA: Sage Publications.

Stipp, K. & Kapp, S. (2010). Building organizational knowledge and value: Informed decision making in Kansas children's community-based mental health services. Community Mental Health Journal. doi: 10.1007/s10597-010-9334-0

Thyer, B. (2007). Social work in mental health: An evidence-based approach. Hoboken, NJ: John Wiley & Sons.

Venkatesh, V. & Morris, M. (2000). Why don't men ever stop to ask for directions? Gender, social influence, and their role in technology acceptance and usage behavior. MIS Quarterly, 24, 115-139.

Walkup, J. & Yanos, P. (2005). Psychological research with administrative data sets: An underutilized strategy for mental health services research. Professional Psychology, 36, 551-557.

Weiss, J. & Weiss, C. (1981). Social scientists and decision makers look at the usefulness of mental health research. The American Psychologist, 36, 837-847. doi: 10.1037/0003-066X.36.8.837

World Health Organization (2004). The world health report 2004: Changing history. Geneva, Switzerland: Author.

APPENDIX A

Human Subjects Lawrence Project Approval Letter

APPENDIX B

Information Statement

APPENDIX C

Member Checking PowerPoint