Lance Speelmon Scholarly Technologist Enhancing OSP for Programmatic and Institutional Assessment.
Many thanks…
• Shoji Kajita, Nagoya University
• Lynn Ward, Indiana University
• John Gosney, Indiana University
IUPUI: Institutional Profile
• Indiana University Purdue University at Indianapolis
• Founded 1969 with a strong local mission
• Blended campus
• Metropolitan research university
• 20+ schools (15 with professional/pre-professional foci)
• Commuter campus, ~30,000 students (~20,000 undergraduates)
Approach to General Education
• Early 1990s – General Education based in the schools (distributive model)
• Accreditation prompts internal reflection
• Campus mandate for change, specifically a centrally coordinated approach and specific learning outcomes for general education
• 1998 – Campus adopts a competency- or ability-based model
Principles of Undergraduate Learning (PULs)
• Core Skills
  • Written and oral communication
  • Ability to comprehend, interpret, and analyze texts
  • Analytical skills (quantitative reasoning)
  • Information and technological literacy
• Critical Thinking
• Integration and Application of Knowledge
• Intellectual Depth, Breadth, and Adaptiveness
• Understanding Society and Culture
• Values and Ethics
Assessment Needs
• Document and demonstrate the effectiveness of IUPUI’s approach to general education (HLC/NCA and ICHE)
• Document student achievement in programs subject to specialized accreditation (Education, Engineering, Visual Communications, etc.)
• Standard reports that aggregate and summarize assessment data across courses and programs
• Filter and group on demographic and academic criteria
The Elephant in the US Living Room
• Department of Education – Spellings Commission
• 2006 Report: A Test of Leadership: Charting the Future of U.S. Higher Education
• Recommendation 3:
Higher education institutions should measure student learning using quality assessment data from instruments such as, for example, the Collegiate Learning Assessment…
• Risks homogenized education
• We want a better solution…
Challenges
• Site-centric nature of OSP tools; no way to easily aggregate data across sites
• No tools to simplify management of very large sites
• Customization also makes it difficult to aggregate data; each department uses a different evaluation form and rating scale.
• No canned reports; every report requires an experienced XML programmer who understands the underlying data structures
• Academic programs more concerned with their own disciplinary outcomes than PULs
Current Vision – Phase 1: Goal/Outcome Linking
• Instructor or program administrator creates and publishes a goal set; the goal set becomes the aggregation point
• Instructors in program can link any course assignment, matrix cell, or wizard page to one or multiple goals
• Students can attach examples of their work directly to one or multiple goals.
• Standardization of evaluation form elements makes it possible to aggregate and report data across courses and programs
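The linking and aggregation idea above can be sketched roughly as follows. This is a minimal illustration with invented names and an assumed 1–4 rating scale, not OSP's actual (Java-based) implementation: artifacts link to one or more goals, and because the evaluation form elements are standardized, scores roll up per goal across courses.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical sketch of Phase 1 goal/outcome linking (names invented).
# A published goal set is the aggregation point; assignments, matrix cells,
# and wizard pages link to one or more goals.
goal_set = {"PUL-1": "Core Communication", "PUL-2": "Critical Thinking"}

# Each evaluated artifact links to goals with a score on a standardized scale.
evaluations = [
    {"course": "ENG-W131", "artifact": "essay-1", "goals": ["PUL-1"], "score": 3},
    {"course": "PHIL-P120", "artifact": "paper-2", "goals": ["PUL-2"], "score": 4},
    {"course": "ENG-W131", "artifact": "essay-2", "goals": ["PUL-1", "PUL-2"], "score": 2},
]

def aggregate_by_goal(evals):
    """Average standardized scores per goal, across all linked artifacts."""
    scores = defaultdict(list)
    for e in evals:
        for g in e["goals"]:
            scores[g].append(e["score"])
    return {g: mean(s) for g, s in scores.items()}

print(aggregate_by_goal(evaluations))  # {'PUL-1': 2.5, 'PUL-2': 3}
```

The key point the slide makes is the last step: without a standardized scale, the per-goal averages would be meaningless across departments.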
Phase 2: Goal/Outcome Mapping
[Diagram: program outcomes mapped to institutional outcomes]
Linkable Tools
• Assignments
• Matrices (cells can be linked to other cells)
• Wizards (pages can be linked to cells)
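The Phase 2 mapping can be sketched in the same spirit: program outcomes are mapped onto institutional outcomes, so evidence collected against disciplinary outcomes rolls up to the institutional level. All names below are invented for illustration.

```python
# Hypothetical sketch of Phase 2 goal/outcome mapping (names invented).
# Each program outcome maps to one or more institutional outcomes.
outcome_map = {
    "ENG-OUT-1": ["PUL-1"],
    "ENG-OUT-2": ["PUL-1", "PUL-2"],
}

# Counts of evidence items linked to each program outcome.
program_evidence = {"ENG-OUT-1": 12, "ENG-OUT-2": 5}

def roll_up(evidence, mapping):
    """Sum program-level evidence counts into institutional outcomes."""
    totals = {}
    for prog_outcome, count in evidence.items():
        for inst_outcome in mapping.get(prog_outcome, []):
            totals[inst_outcome] = totals.get(inst_outcome, 0) + count
    return totals

print(roll_up(program_evidence, outcome_map))  # {'PUL-1': 17, 'PUL-2': 5}
```

This addresses the challenge noted earlier: programs care about their own disciplinary outcomes, and the mapping lets them keep those while still reporting against the PULs.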
The Problem
Matrix authors must select forms and evaluators in each cell, even when the same choices are used in every cell.
The Problem
• No workflow support for reviewers (providers of formative feedback)
• Students cannot solicit feedback from peers, advisors, etc.
• No way for individuals who are not CIG members to provide feedback
Reviewer Workflow
• “Request Feedback” button
• Email notification
• Eventually…
  • Recipient notification
  • Visual indicator of new feedback
External Reviewers
• Email notification provides a direct link to the cell
• Reviewer need not be a member of the portfolio site
• Reviewer must have a server login and password
• Eventually…
  • Reviewer dashboard in My Workspace to aggregate pending feedback requests
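The envisioned dashboard query can be sketched like this. The data model is hypothetical (invented field names); the point is simply that a "Request Feedback" action creates a pending request, and the dashboard aggregates every request awaiting a given reviewer across portfolio sites.

```python
from datetime import date

# Hypothetical sketch of a reviewer dashboard (not OSP's real data model).
requests = [
    {"reviewer": "lspeelmo", "student": "jdoe", "cell": "Writing",
     "done": False, "requested": date(2009, 2, 1)},
    {"reviewer": "lspeelmo", "student": "asmith", "cell": "Ethics",
     "done": True, "requested": date(2009, 1, 15)},
    {"reviewer": "lward", "student": "jdoe", "cell": "Ethics",
     "done": False, "requested": date(2009, 2, 3)},
]

def pending_for(reviewer, reqs):
    """Pending (not yet completed) feedback requests for one reviewer, oldest first."""
    mine = [r for r in reqs if r["reviewer"] == reviewer and not r["done"]]
    return sorted(mine, key=lambda r: r["requested"])

print([r["student"] for r in pending_for("lspeelmo", requests)])  # ['jdoe']
```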
The Problem
Users in the evaluator (or reviewer) role can open all cells in a user's matrix, even if not selected as an evaluator for those cells.
More granular access control is needed to support range of implementation scenarios from highly sensitive and secure to open and collaborative.
Per Cell Access Control
• Cells can only be opened by the designated evaluators/reviewers
• Revision of matrix permissions to provide much greater flexibility and granularity of access
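The per-cell check amounts to replacing a role-wide permission with a per-cell designation, roughly as sketched below (invented names; OSP's real permission model is richer):

```python
# Hypothetical sketch of per-cell access control: a cell opens only for its
# owner and its designated evaluators/reviewers, not for everyone in the role.
cell = {
    "owner": "jdoe",
    "evaluators": {"lward"},
    "reviewers": {"lspeelmo"},
}

def can_open(user, cell):
    """True only for the cell's owner or its designated evaluators/reviewers."""
    return (user == cell["owner"]
            or user in cell["evaluators"]
            or user in cell["reviewers"])

print(can_open("lward", cell), can_open("gosney", cell))  # True False
```

Dialing the designation sets wider or narrower is what supports the range of scenarios the slide mentions, from highly secure to open and collaborative.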
The Problem
Cells that have been evaluated and returned for additional evidence or other modifications look just like cells that have never been submitted.
For Spring and Summer 2009
• Standardized evaluation form and reports
• Auto-population of portfolio sites based on membership of associated course sites
• Participant and evaluator notifications
• Bring Wizards into functional parity with Matrices and consolidate into a single tool
• Merge IU enhancements with community code, pending community acceptance
QUESTIONS?
Lance Speelmon, [email protected]
Lynn Ward, [email protected]
John Gosney, [email protected]