
Crossing the Rubricon

Assessing the Instructor

Ned Fielden, Mira Foster
San Francisco State University
San Francisco, California USA

Case Study: Assessment of Librarian Instructors

• Literature Review
• Theoretical Issues
• Rubric Design and Implementation
• Preliminary Review

Instructor Assessment: Several Methods

• Supervisor Review
• Peer Evaluation
• Surveys
• Performance Assessment (learning outcomes of students assessed)

Institutional Need for Instructor Assessment

• Retention of probationary candidates; tenure and promotion (RTP)

• CSU, as a public institution, has criteria-based personnel review with strict rules

• Summative vs. formative assessment

Process

• Literature Review
• Identify Suitable Mechanism for Review
• Create Draft
• Consult with Library Education Committee
• Formal Adoption by Library Faculty

Rubrics

• Powerful, easy to use, standardized

• Considerable literature on rubric use for students/programs/outcomes

• Little on using rubrics to assess library instructors

Value of Rubrics

• Standardized
• Easy to use (minimal training)
• Ensures all criteria of review are met
• Possibilities for quantitative data analysis and the introduction of new values (see the sketch below)
• Can be employed for both summative and formative assessment
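
Not part of the original slides: a minimal Python sketch of the kind of quantitative analysis the bullet above alludes to. The criteria names and ratings are hypothetical; only the 0–2 Beginning/Developing/Exemplary scale is borrowed from the rubric example later in this deck.

from statistics import mean

# Hypothetical ratings for one instructor across three observed
# sessions, on the 0-2 scale (0 = Beginning, 1 = Developing,
# 2 = Exemplary) used in the rubric example later in this deck.
observations = [
    {"preparation": 2, "delivery": 1, "assessment": 1},
    {"preparation": 2, "delivery": 2, "assessment": 1},
    {"preparation": 1, "delivery": 2, "assessment": 2},
]

# Per-criterion means highlight strengths and weaknesses (formative
# use); an overall mean can feed a summative summary.
per_criterion = {
    criterion: round(mean(obs[criterion] for obs in observations), 2)
    for criterion in observations[0]
}
overall = round(mean(per_criterion.values()), 2)

print(per_criterion)  # {'preparation': 1.67, 'delivery': 1.67, 'assessment': 1.33}
print(overall)        # 1.56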

Rubric Basics

• Glorified “checklist,” annotated to establish criteria as distinct items, e.g.:

A. Preparation
  1. Communicated with course instructor before the session to determine learning objectives and activities
  2. Learned about course assignment(s) specifically related to library research
  3. Customized instruction session plan to curriculum, specific course assignments, and/or faculty/student requests

Rubric Complexity

• May be designed to reflect highly nuanced categories*

*Oakleaf, M.L., 2006. Assessing information literacy skills, Dissertation, University of North Carolina.

Evaluation Criterion: Articulates Criteria
• Beginning (0) – Student does not address authority issues
• Developing (1) – Student addresses authority issues but does not use criteria terminology
• Exemplary (2) – Student addresses authority issues and uses criteria terminology such as: author, authority, authorship or sponsorship
Student Learning Outcome: LOBO 3.1.1 – The student will articulate established evaluation criteria (ACRL 3.2 a)
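
A hypothetical encoding (not from the slides) of the rubric row above as a small data structure, so that scores can be recorded and reported consistently; the names ARTICULATES_CRITERIA and describe are illustrative only.

# Each performance level maps a numeric score to its descriptor.
ARTICULATES_CRITERIA = {
    "outcome": "LOBO 3.1.1 (ACRL 3.2 a): articulate established evaluation criteria",
    "levels": {
        0: "Does not address authority issues",                        # Beginning
        1: "Addresses authority issues without criteria terminology",  # Developing
        2: "Addresses authority issues using criteria terminology",    # Exemplary
    },
}

def describe(score: int) -> str:
    """Return the descriptor for a recorded score, rejecting off-scale values."""
    levels = ARTICULATES_CRITERIA["levels"]
    if score not in levels:
        raise ValueError(f"score must be one of {sorted(levels)}")
    return levels[score]

print(describe(2))  # Addresses authority issues using criteria terminology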

Types of Rubrics

• Analytic – specific criteria, isolated facets, capacity for highly granular scoring. Analytic rubrics “divide … a product or performance into essential traits or dimensions so that they can be judged separately…” *

• Holistic – big picture, fuzzier focus: an “overall, single judgment of quality” * (the sketch below contrasts the two)

* Arter and McTighe, Scoring Rubrics in the Classroom, 2001.
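
To make the distinction concrete, a small illustrative sketch (assuming the same hypothetical 0–2 trait ratings as in the earlier example): the analytic function reports a separate judgment per trait, while the holistic one collapses them into a single overall level.

from statistics import mean

LEVELS = {0: "Beginning", 1: "Developing", 2: "Exemplary"}

def analytic_score(ratings):
    # Analytic: judge each trait separately for granular, diagnostic output.
    return {trait: LEVELS[score] for trait, score in ratings.items()}

def holistic_score(ratings):
    # Holistic: one overall judgment of quality, here approximated by
    # rounding the mean trait rating to the nearest whole level.
    return LEVELS[round(mean(ratings.values()))]

session = {"preparation": 2, "delivery": 1, "assessment": 1}
print(analytic_score(session))  # {'preparation': 'Exemplary', 'delivery': 'Developing', 'assessment': 'Developing'}
print(holistic_score(session))  # Developing

Rounding the mean is only one possible way to collapse trait ratings; a genuinely holistic rubric would apply its own single descriptor rather than averaging.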

Rubric Design

• What criteria to include
• Opportunity to introduce specific values into the program
• Involvement of all constituents (evaluators/evaluatees)

Rubric Implementation

• Formative
  – Raw data given to candidate
  – Pre- and post-consultation
  – Candidate may use the data however desired

• Summative
  – Framework for the formal letter in the RTP file

Summary

• Powerful, easy-to-use tool; levels the playing field; highly customizable

• Issues arise when mixing formative and summative functions

Further Study

• Explore different varieties of instructor assessment tools
• Test different rubrics
• Establish a balance point between depth of data and ease of use
• Evaluate outcomes

Crossing the Rubricon: Assessing the Instructor

• Bibliography – http://online.sfsu.edu/~fielden/rbib.html
• Sample Rubric – http://online.sfsu.edu/~fielden/rubrics.html

Bridge photo used with permission from robep, http://www.flickr.com/photos/robep/