Assessment for Research Universities and Graduate Programs
Barbara Wright, Associate Director, WASC
Our roadmap:
- Why is assessment a challenge?
- What is assessment, really?
- How can it work in a research or graduate environment?
- A case study
Why is assessment a challenge?
- Complex learning goals
- Fewer models, less information
- Less interest
- Less felt need
- Competition for faculty time and attention
- Dominance of research as a value
The Assessment Loop
1. Goals, questions
2. Gathering evidence
3. Interpretation
4. Use
So what is assessment?
It’s a systematic process of 1) setting goals for, or asking questions about, student learning; 2) gathering evidence; 3) interpreting it; and 4) using it to improve the effects of college on students’ learning and development, at any level of analysis from the individual student to the course, program, or institution.
Assessment is not:
- Just the first step
- Just the first and second steps
It’s the WHOLE process.
Methods matter
- Descriptive data – good
- Indirect evidence – better
- Direct evidence of learning – best
Direct methods include …
- Portfolios
- Capstones (e.g., dissertations)
- Performances (e.g., poster or conference presentations)
- Assignments (e.g., papers, research projects)
- Secondary readings
- Comps, qualifying exams
- Commercial tests
Four dimensions of learning
- What students learn (cognitive as well as affective, social, civic, professional, spiritual, and other dimensions)
- How well (thoroughness, complexity, subtlety, agility, transferability)
- What happens over time (cumulative, developmental effects; “value added”)
- Is this good enough? (the ED question)
Thinking about standards . . .
- Absolute standards: the knowledge/skill level of champions, award winners, top experts
- Contextual standards: appropriate expectations for, e.g., a 10-year-old, a college student, an experienced professional
- Developmental standards: amount of growth or progress over time, e.g., 2 years of college, 5 years
- Institutional, regional, national standards?
Keys to successful assessment
- SLOs (student learning outcomes) are meaningful
- Methods align with learning goals and questions
- Methods are embedded, not add-ons
- Validity is in balance with reliability
- Analysis is collective, collegial
- Findings are actionable and lead to improvement
Successful assessment, cont.
- Student learning is part of program review
- Faculty get support and development
- Rewards are provided
- Innovations are planned for and funded
- The approach is sustainable – and intellectually interesting