Advising Project. Dana Clark, Beth Nuccio, Julia Teahen, Mike Tyler. October 3, 2012.


Advising Project
Dana Clark
Beth Nuccio
Julia Teahen
Mike Tyler

October 3, 2012

PROJECT SELECTION

Why Advising?

- Customer service surveys indicated that students are not satisfied with the current advising process (Noel-Levitz)
- Students who left Baker College indicated that the advising process was one reason for leaving the institution (Eduventures Retention Study)
- A Flint campus study of advising indicated inconsistent quality of advising visits

Project Selection
- Selected by the Lean Six Sigma Training Group
- Integrated into an existing project
- Worked with the Directors of Advising to define the scope
- Narrowed the project to try to create a standardized advising process
- Tackled the project in a more focused way using Lean Six Sigma tools

Key Stakeholders
- Directors of Advising
- Academic Advisors
- Students

Impact on Stakeholders
- The project was designed to create a consistent process for advising visits

- Clear expectations for Advisors were established
- A model for consistent advising was created
- Students received consistent advising services, both reactive and proactive (no longer a missed opportunity to discuss current and future progress)

Lessons Learned
- Involvement in the creation of a project selection rubric may create a clearer selection process and buy-in for selected projects
- More experience in identifying good Lean Six Sigma projects may be helpful. What are the key characteristics of a good project? What characteristics indicate a poor fit?
- Project scope defines stakeholder involvement – very useful!

Lessons Learned
- How inclusive should we be in team selection?
- Moved into an existing project:
  - It took time to build synergy with the team
  - Buy-in and agreement on what we were doing was valuable and took a few meetings to establish

Lessons Learned
- Lacked the data necessary to understand the direction of the project
- Expected impact was not defined or clearly articulated at the beginning of the project due to lack of data
- Need to start with the question: “What will we measure?”
- DMAIC vs. DMEDI: used DMAIC, but due to lack of baseline data ended up closer to the DMEDI model

TEAM EFFECTIVENESS

We Rocked!
- Lean Six Sigma Leadership Team
- Academic Advising Work Team

Team Effectiveness
- Leadership Team members volunteered to participate in the project
- The Academic Advising Work Team was established prior to project selection

Preparation to Participate
- Explained the purposes of the project
- Introduced quality improvement concepts
- Placed to help the team achieve their goal
- Described the purpose of tools as they were used throughout the project

Work Group Contribution
- All campus advising departments were represented on the work team
- All contributed in meetings; not all were able to participate in the pilot phase due to timing and other responsibilities
- All work group members were invested in the project and eager to participate
- Most of the work was completed in meetings

Work Group Effectiveness
- We joined an existing and high-functioning team
- An agenda with goals was provided at each meeting
- Reviewed past sessions at each meeting
- Concluded each meeting with detailed responsibilities addressed
- Always used VOC and the project scope to limit derailers, which kept us on track for a manageable project

Project Scope

Charter Gate Review

Leadership Team Effectiveness

- Parts of the project were done individually, so some effectiveness was lost
- Team members volunteered to tackle tools based on comfort level
- Used Blackboard to post documents for sharing
- Should have established regular appointments instead of meeting as needed
- Lack of coordination/timing between meetings, Course Topics, and Advising Project team meetings

Leadership Team Effectiveness

- On-the-job training and use of tools was an excellent way to learn the process
- Four members is a good team size
- One team member was not as familiar with the project scope, and so had to learn two languages (Advising and Lean Six Sigma)

Lessons Learned
- It would be beneficial to orient the team to Lean Six Sigma tools at the beginning rather than as they were used
- More lead time would have been useful in facilitating the project (we were only a half step ahead of the work group)

ANALYSIS

Analysis and Root Causes
- Voice of Customer
- SIPOC
- CTQ
- Five Whys
- Fishbone Diagram
- FMEA (not effective)

Voice of the Customer Data

- Students felt well-served
- The survey did not measure if a student received a quality advising visit
- Very little data exists regarding the accuracy of advising decisions

Voice of Customer

SIPOC

CTQ

Advising Five Whys

Advising Fishbone


FMEA


Root Causes
Fishbone and Five Whys helped change our direction to two root causes:

Cause 1: Inconsistent Advising
- Lack of expectations
- Lack of training
- Lack of audit

OUTCOME: Led to development of a checklist and training in using the checklist

Cause 2: Lack of Communication and Quality Program Information

Will be addressed in future phases

Lesson Learned
- Not sure we really found the root cause of the problem

Define Gate Review

SOLUTION DEVELOPMENT

Solution Development
- VOC, the Process Map, and other tools were used to brainstorm the “future state” of advising appointments
- The “Future State” defined the elements of a good advising appointment
- Created an advising checklist, as a work group, to implement consistency in advising visits within the academic office

Process Map
Directors of Advising gathered input from their advisors to create process maps for each function within our project scope:
- Class Selection
- Drop
- Withdrawal
- Program Inquiry
- Program Change

Class Selection Process Map

Drop Process Map

Withdrawal Process Map

Program Inquiry Process Map

Program Change Process Map

Advising Checklist


Expected Benefits
- Expected benefits = more consistent advising and stated advisor expectations
- Reviewed VOC from initial research
- Asked: “If we did all of these things in the advising appointment, would we meet the needs of the customer?”

Lessons Learned
- Measurements to determine impact were difficult because we didn’t have baseline data
- Expected benefits were difficult to state specifically due to lack of benchmark data
- Impact was not clearly defined and not apparent in the outcome
- There is still a lot of waste in each process that we did not address

SOLUTION IMPLEMENTATION

Solution Implementation
A pilot was developed to:
- determine usability
- gather VOC after the new process
- gather advisor feedback
- identify potential challenges
- determine whether advising appointments were more consistent
- provide expectations through training

Solution Implementation
- The Leadership Team met to discuss possible implementation strategies for the pilot
- The larger work group met to brainstorm implementation strategies for the pilot
- Reviewed results of the student survey, Director observations, advisors’ self-reported use of the checklist, and feedback from advisors
- The work group met to decide on a final implementation strategy by discussing the data collected and the feedback received, and by brainstorming further challenges

Research Methodology
- Pre- and post-test pilot on four campuses with ten advisors
- Pre-test: the Director of Advising observed advising appointments and documented activity on the checklist; students completed a survey
- Post-test: advisors who participated in the pre-test phase were trained; advisors used the checklist during appointments, the Director of Advising observed advising appointments and documented on the checklist, and students completed a survey

Hypothesis


Measurements
- Gathered the data
- Compiled and presented it to the Work Group
- Result: very little variation between pre- and post-test use of the checklist
- Some suggestion that the tool was not used consistently or reliably
- The tools may not be discriminating enough to identify behavior at a fine enough level to detect differences that may have occurred between pre- and post-test
- Data from students suggest they are generally satisfied with one-on-one meetings with an advisor, which is at odds with other data we have from previous research
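The “very little variation” conclusion above could be checked with a simple two-proportion z-test on checklist completion rates before and after training. This is a minimal sketch only: the counts are hypothetical and the choice of test is an assumption, since the original analysis method is not specified.

```python
import math

def two_proportion_z(hits_pre, n_pre, hits_post, n_post):
    """z statistic for the difference between two proportions,
    using a pooled estimate of the standard error."""
    p_pre = hits_pre / n_pre
    p_post = hits_post / n_post
    p_pool = (hits_pre + hits_post) / (n_pre + n_post)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_pre + 1 / n_post))
    return (p_post - p_pre) / se

# Hypothetical counts: checklist elements observed as completed,
# out of elements checked (illustrative numbers only).
z = two_proportion_z(52, 80, 58, 80)
```

With |z| below the 1.96 critical value at the 5% level, the pre/post difference would not be statistically significant, consistent with the finding that the checklist changed little between phases.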

Measure Gate Review

Force Field Analysis

Challenges of Pilot and Full Implementation

- Identified through brainstorming prior to the pilot phase
- Additional challenges identified during the pilot phase through data analysis of pre- and post-test results

- The sign-in procedure for walk-in appointments was a problem on some campuses
- Challenges in using the checklist consistently
- Inconsistent documentation in Student Tracking
- Week Seven Advising: to use or not to use the checklist?

Analyze Gate Review

Advisor Feedback


Procedural Changes
- A wait time was implemented to allow time to review student records prior to meeting with the student
- Every advisor will use the checklist for each visit, even during Week Seven
- Summary form for students
- Collection of checklist data
- Analysis of checklist data

Support and Buy-In
- All Directors participated in the Advising Work Group Team
- Advisors used the checklist and had the opportunity to provide feedback to improve it
- The Chief Academic Officer Committee and Presidents’ Council are needed for full implementation

Implementation Plan
- Revision of checklist (completed)
- Approval needed to implement across all advising offices across the System in the Winter 2013 quarter (Denise Bannan, Dana Clark, & Julia Teahen)
- Revision of training (Dana Clark & Academic Advising Work Group)
- Provide in-person training of all advisors during Fall Quarter 2012 (Dana Clark & Julia Teahen)
- Full-scale implementation in January 2013 (Academic Advising Work Group)
- Directors of Advising will collect the completed checklists
- Reevaluate data gathered in Spring 2013 (Academic Advising Work Group)
- Determine impact of checklist in Spring 2013 (Academic Advising Work Group)

Sustainability Issues
- Controls have not yet been implemented for sustainability
- Need to determine overall effectiveness in May 2013 and then decide whether the checksheet process should continue
- Long-term strategies were discussed and noted for future sustainability, to be used after data collection in May 2013

Sustainability Issues
- Consistency in using the checklist
- Buy-in on the importance of checklist use
- Observation of use
- Periodic survey of students to verify meeting VOC
- Ongoing training
- Demonstrate the checklist is a value-added tool

Improve Gate Review

RESULTS, KNOWLEDGE SHARING, AND APPLICATION

Results
- Not sure we made a difference; need to evaluate after full implementation
- An expectation of what a “good appointment” looks like was established
- Buy-in from Directors of Advising
- Pilot results were shared with the Advising Work Team, and some advisors received the information on results; it is not clear with whom and how to share the results

Lessons Learned
- The FMEA developed was very broad; it could have been refined after the post-test pilot phase
- Cost/benefit analysis was not completed
- Force Field Analysis was helpful in generating challenges and addressing sustainability
- Gate Reviews were not used to their fullest potential in the first phase of learning; they will be useful for overseeing future projects

Control Gate Review

Questions?