Tips for Writing SACSCOC Non-Academic Program Assessment Reports
Office of Planning, Institutional Research, and Assessment (PIRA)
Fall 2015
Relation Between Existing Assessment and SACSCOC Reports
Ideally, you already evaluate your unit's effectiveness.
• Don’t create a special data-collection process for SACSCOC; just summarize existing processes
• Save time and unnecessary work by adapting your existing annual report to the SACSCOC Program Assessment Report template
Tips for an Efficient Report
• Study the resources and template before starting
• Use existing assessments, available documentation, and your current reports whenever possible
• Start with your existing assessments (measures) and then write outcomes to go with them
Do You Have Survey Data?
• Non-academic units often use survey data for their assessment
• Surveys are indirect measures of student learning, but they are direct measures of customer (client, employee, patient, student) experience
Source: Mary Harrington, Univ of Mississippi
Types of Non-Academic Units
1) Administrative support services
2) Academic and student support services
3) Research
4) Community/public service
We Offer Help With The 4 Types
Larger Administrative Units . . .
may prefer to submit a Program Assessment Report (PAR) for each office within the division, particularly if outcomes are not the same across those offices.
Show Clear Evidence That You Have
• defined desired mission, program outcomes or objectives, and related measures,
• collected and evaluated results from ongoing assessment (multiple years),
• undertaken actions for continuous improvement.
Help reviewers find key components quickly & easily
[Assessment cycle diagram: Define Outcomes & Measures → Collect Findings → Evaluate Results → Implement Change (Improve)]
Use PIRA Template for Key Elements
• Mission and program outcomes (objectives)
• Operational and/or student learning outcomes (2+) and related measures (2+ each, 1 should be a direct measure)
• Assessment findings: results of measures from multiple years (if feasible)
Use PIRA Template for Key Elements
• Discussion of results: review of findings, including whether performance meets expectations
• Discussion of changes: initiatives to improve program and whether continuous improvement has occurred
• Clear narrative and organization to make compliance obvious (does everything make sense?)
Your Mission Statement Should
• tie to the UM Mission: “The University of Miami’s mission is to educate and nurture students, to create knowledge, and to provide service to our community and beyond. Committed to excellence and proud of the diversity of our University family, we strive to develop future leaders of our nation and the world.” and to your strategic plan,
• describe program outcomes/objectives (e.g., purpose of the unit, type of support for students, including any research or service components)
When Writing Operational Outcomes:
• Describe reasonable expectations in measurable terms (efficiency, accuracy, effectiveness, comprehensiveness, etc.)
• Include at least 2 outcomes
• Make outcomes easy to identify (e.g., use bolding & numbering) and clearly stated (follow the expected structure)
An Operational Outcome Should
• Focus on a current service or process
• Be under the control or responsibility of the unit
• Be measurable
• Lend itself to improvements
• Be singular, not “bundled”
• Be meaningful and not trivial
• Not lead to a “yes/no” answer
Source: Mary Harrington, Univ of Mississippi
Sample Operational Outcomes
• Efficiency: The Registrar’s Office processes transcript requests in a timely manner.
• Accuracy: Purchasing accurately processes purchase orders.
• Effectiveness: Human Resources provides effective new employee orientation services.
• Comprehensiveness: Financial Aid provides comprehensive customer service.
Source: Mary Harrington, Univ of Mississippi
Sample Learning Outcomes (with Focus on Students or Others)
• Library: Students will have basic information literacy skills.
• Career Services: Students will be able to create an effective resume.
• Information Technology: Staff will know how to use the student information system.
• Human Resources: New employees will be familiar with the benefit package.
Source: Mary Harrington, Univ of Mississippi
Examples of Measures
• Research: number of grants, total funding, number of peer-reviewed publications, conference presentations
• Administrative support: timeliness in processing orders, budget growth (or savings), complaint tracking/resolution, public safety improvements, audits
• Academic/student support: number of students counseled, job placements, scholarship awards, seminar participation, leadership training participation
• Community/public service: number of patients seen, community event participation, annual volunteer commitments
When Writing Assessment Findings:
• Ensure each measure has corresponding findings (and no findings without a previously stated measure)
• Insert the corresponding outcome/measure as a heading for each set of results
• Include multi-year data or explain if measures are new:
“As part of the major three-year continuous improvement update of our program assessment report in FY 2014, we decided to start using customer satisfaction surveys in conjunction with service requests. Because this is a new measure, we have data for only FY 2015, but we will continue to update the data in upcoming years to monitor continuous improvement.”
Writing Assessment Findings
• If a measure is a narrative rather than data, include a summary plus sample evaluations or an insert statement
• Ensure results are presented clearly (tables)
• Decide if an appendix of findings (e.g., survey instrument) will be necessary (usually not)
Common error: Programs simply state that they evaluate outcomes, or omit measure(s). Solution: Provide evidence of assessment activity (table/text summary of findings).
When Writing the Discussion Section:
• Provide a statement explaining why these particular assessment instruments were used
• Include an analysis of the assessment findings and evidence of improvement:
– general trends
– specifically in response to improvement initiatives
Avoid These Common Errors In Discussion
• When describing initiatives to improve outcomes: The report simply lists initiatives.
• Solution: Include brief commentary on which outcome will benefit.
• When describing continuous improvement: The report does not include any evidence of improvement over time.
• Solution: At least discuss efforts to improve outcomes.
Help Reviewers Find What They Need
• Add bold, indents, and/or underlines
• Nest measures under related outcomes
• Label/nest outcomes and measures in the Findings section
• Remove yellow template instructions
• Expand acronyms (e.g., RSMAS, PRISM, UMHC)
• Spell check and fix typos
Questions for PIRA?
Contact:
Dr. Claudia Grigorescu
Compliance Specialist (Assessment)
305-284-4714
Dr. David E. Wiles
Executive Director, Assessment and Accreditation
Institutional Accreditation Liaison
305-284-3276