Integrating Student Learning into Program Review

Barbara Wright, Associate Director, WASC
[email protected]

February 1, 2008 Retreat on Student Learning and Assessment, Irvine

Assessment & Program Review: related but different

Program review typically emphasizes Inputs, e.g.

- Mission statement, program goals
- Faculty, their qualifications
- Students, enrollment levels, qualifications
- Library, labs, technology, other resources
- Financial support

Assessment & Program Review: related but different, cont.

Program review typically emphasizes Processes, e.g.

- Faculty governance
- Curriculum review
- Planning
- Follow-up on graduates
- Budgeting
- And yes, assessment may be one of these

Assessment & Program Review: related but different, cont.

Program review typically emphasizes indirect indicators of student learning and academic quality, e.g.

- Descriptive data
- Surveys of various constituencies
- Existence of relationships, e.g. with area businesses, professional community

Program review has traditionally neglected actual student learning outcomes.

Assessment & Program Review: related but different, cont.

PR is typically conceived as

- Data-gathering
- Looking at the past 5-8 years
- Reporting after the fact where the program has been
- Using PR to garner resources – or at least protect what the program has
- Projecting needs into the future
- Expressing “quality” & “improvement” in terms of a case for additional inputs

Capacity vs. Educational Effectiveness for Programs:

Capacity questions: What does the program have in the way of inputs, processes, and evidence of outputs or outcomes? What does it need, and how will it get what it needs?

EE questions: How effectively do the inputs and processes contribute to desired outcomes? How good are the outputs? The student learning?

Assessment & Program Review: related but different

Assessment is all about

- Student learning & improvement at individual, program & institutional levels
- Articulation of specific learning goals (as opposed to program goals, e.g. “We will place 90% of graduates in their field.”)
- Gathering of direct, authentic evidence of learning (as opposed to indirect evidence, descriptive data)

Assessment & Program Review: related but different, cont.

Assessment is all about

- Interpretation & use of findings to improve learning & thus strengthen programs (as opposed to reporting of data to improve inputs)
- A future orientation: Here’s where we are – and here’s where we want to go in student learning over the next 3-5 years
- Understanding the learning “problem” before reaching for a “solution”

Assessment & Program Review: related but different, cont.

Assessment of student learning and program review are not the same thing. However, there is a place for assessment as a necessary and significant input in program review. We should look for

- A well-functioning process
- Key learning goals
- Standards for student performance
- A critical mass of faculty (and students) involved
- Verifiable results, and
- Institutional support

The Assessment Loop

1. Goals, questions
2. Gathering evidence
3. Interpretation
4. Use
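A hypothetical illustration of one pass through the loop (the program and details are invented for clarity): a history department articulates the goal that seniors write evidence-based arguments (1), gathers senior theses and scores them with a shared rubric (2), meets to interpret the pattern it finds, e.g. weak use of primary sources (3), and uses that finding to redesign its research methods course (4).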

The Assessment Loop – Capacity Questions

1. Does the program have student learning goals, questions?
2. Do they have methods, processes for gathering evidence? Do they have evidence?
3. Do they have a process for systematic, collective analysis and interpretation of evidence?
4. Is there a process for use of findings for improvement? Is there admin. support, planning, budgeting? Rewards for faculty?

The Assessment Loop – Effectiveness Questions

1. How well do they achieve their student learning goals, answer questions?
2. How aligned are the methods? How effective are the processes? How complete is the evidence?
3. How well do processes for systematic, collective analysis and interpretation of evidence work? What have they found?
4. What is the quality of follow-through on findings for improvement? Is there improvement? How adequate, effective are admin. support, planning, budgeting? Rewards for faculty?

Don’t confuse program-level assessment and program review

- Program-level assessment means we look at learning on the program level (not just the individual student or course level) and ask what all the learning experiences of a program add up to, at what standard of performance (results).
- Program review looks for program-level assessment of student learning but goes beyond it, examining other components of the program (mission, faculty, facilities, demand, etc.)

What does WASC want? Both!

- Systematic, periodic program review, including a focus on student learning results as well as other areas (inputs, processes, products, relationships)
- An improvement-oriented student learning assessment process as a routine part of the program’s functioning

Institutionalizing Assessment – 2 aspects:

- The PLAN for assessment (i.e. shared definition of the process, purpose, values, vocabulary, communication, use of findings)
- The STRUCTURES and RESOURCES that make the plan doable

How to institutionalize --

- Make assessment a freestanding function
- Attach to an existing function, e.g.
  - Accreditation
  - Academic program review
  - Annual reporting process
  - Center for Teaching Excellence
  - Institutional Research

Make assessment freestanding -- Positives and Negatives

Positives:
- Maximum flexibility
- Minimum threat, upset
- A way to start

Negatives:
- Little impact
- Little sustainability
- Requires formalization eventually, e.g. Office of Assessment

Attach to Office of Institutional Research -- Positives and Negatives

Positives:
- Strong data gathering and analysis capabilities
- Responds to external expectations
- Clear responsibility
- IR has resources
- Faculty not “burdened”

Negatives:
- Perception: assessment = data gathering
- Faculty see little or no responsibility
- Faculty uninterested in reports
- Little or no use of findings

Attach to Center for Teaching Excellence -- Positives and Negatives

Positives:
- Strong impact possible
- Ongoing, supported
- Direct connection to faculty, classroom, learning
- Chance for maximum responsiveness to “use” phase

Negatives:
- Impact depends on how broadly assessment is done
- No enforcement
- Little/no reporting, communicating
- Rewards, recognition vary, may be lip service

Attach to annual report -- Positives and Negatives

Positives:
- Some impact (depending on stakes)
- Ongoing
- Some compliance
- Habit, expectation
- Closer connection to classroom, learning
- Cause/effect possible
- Allows flexibility

Negatives:
- Impact depends on how seriously, how well AR is done
- No resources
- Reporting, not improving, unless specified
- Chair writes; faculty involvement varies

Attach to accreditation -- Positives and Negatives

Positives:
- Maximum motivation
- Likely compliance
- Resources available
- Staff, faculty assigned
- Clear cause/effect

Negatives:
- Resentment of external pressure
- Us/them dynamic
- Episodic, not ongoing
- Reporting, gaming, not improving
- Little faculty involvement
- Little connection to the classroom, learning
- Main focus: inputs, process

Attach to program review -- Positives and Negatives

Positives:
- Some impact (depending on stakes)
- Some compliance
- Some resources available
- Staff, faculty assigned
- Cause/effect varies

Negatives:
- Impact depends on how seriously, how well PR is done
- Episodic, not ongoing
- Inputs, not outcomes
- Reporting, not improving
- Generally low faculty involvement
- Anxiety, risk-aversion
- Weak connection to the classroom, learning

How can we deal with the disadvantages?

- Strong message from administration: PR is serious, has consequences (bad and good)
- Provide attentive, supportive oversight
- Redesign PR to be continuous
- Increase weighting of assessment in the overall PR process
- Involve more faculty, stay close to classroom, program
- Focus on outcomes, reflection, USE
- Focus on improvement (not just “good news”) and REWARD IT

How can we increase weighting of learning & assessment in PR? E.g., move from... to:

From:
- Optional part
- One small part of total PR process
- “Assessment” vague, left to program
- Various PR elements of equal value (or no value indicated)
- Little faculty involvement

To:
- Required
- Core of the process (so defined in instructions)
- Assessment expectations defined
- Points assigned to PR elements; student learning gets 50% or more (see the illustration below)
- Broad involvement
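To make that weighting concrete, here is a hypothetical point allocation for a 100-point PR rubric (the numbers are illustrative only, not a WASC prescription):

  Student learning (goals, evidence, findings, use): 50
  Curriculum & pedagogy: 15
  Faculty & governance: 15
  Resources (library, labs, technology): 10
  Enrollment, demand, placement: 10
  Total: 100

On this scheme student learning alone outweighs any other single element of the review.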

Assessment serves improvement and accountability

A well-functioning assessment effort systematically improves curriculum, pedagogy, and student learning; this effect is documented.

At the same time,
- The presence of an assessment effort is an important input & indicator of quality,
- The report on beneficial effects of assessment serves accountability; and
- Assessment findings support $ requests

New approaches to PR/assessment

- Create a program portfolio
- Keep program data continuously updated
- Do assessment on an annual cycle (a sample rhythm is sketched below)
- Enter assessment findings, uses, by semester or annually
- For periodic PR, review the portfolio and write a reflective essay on student AND faculty learning
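One possible annual rhythm for the portfolio (illustrative only; each program would set its own calendar and steps):

  Fall: collect samples of student work tied to one or two learning goals
  Winter: faculty score the samples against agreed standards
  Spring: discuss findings collectively and agree on one concrete change
  End of year: log findings, decisions, and follow-up in the portfolio
  Periodic PR year: review the accumulated record and write the reflective essay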