International Faculty Workshop for Continuous Program Improvement
September 28-29, 2006, Moscow

Transcript of the International Faculty Workshop for Continuous Program Improvement, September 28-29, 2006, Moscow.

Page 1

International Faculty Workshop for Continuous Program Improvement

Page 2

Introductions

Page 3

Continuous Program Improvement

Moderator: Gloria Rogers

Associate Executive Director

Professional Services

ABET, Inc.

Facilitator: David Hornbeck

Adjunct Accreditation Director for Technology

ABET, Inc.

Page 4

ABET Faculty Workshop

To Promote Continuous Quality Improvement in Engineering Education

Page 5

Workshop Expectations

Page 6

Workshop Will Develop:

1. An understanding of program development and management based on learning outcomes.

2. An awareness of definitions and linkages among

• Program Educational Objectives
• Program Outcomes
• Assessment
• Evaluation
• Constituencies

Page 7

Workshop Will Develop:

3. An awareness of assessment tools and their
• Variety
• Assets
• Utility
• Relevance
• Limitations

4. An understanding of the structure & cyclic nature of Continuous Quality Improvement
• Planning
• Implementation
• Assessment
• Evaluation
• Feedback
• Change

Page 8

Workshop Format

• We utilize both small group and plenary sessions

• We introduce concepts via critique of case study examples

• We apply concepts through group preparation of example scenarios

• We share results & develop understanding through interactive plenary sessions

Page 9

Workshop Day 1

• Identify attributes of effective educational objectives

• Identify attributes of effective program outcomes

• Investigate key components of effective assessment plans and processes

• Prepare written program outcomes

Page 10

Workshop Day 2

• Investigate the attributes of a variety of assessment tools

• Develop assessment & evaluation plans for the program educational objectives

• Develop assessment & evaluation plans for the set of program outcomes

• Summarize points of learning
• Discuss lessons learned by ABET in its experience with outcomes-based criteria

Page 11

Workshop Procedures

A. Record all your work produced in small group sessions

B. Identify recorded work by table and breakout room number

C. Reporting in Plenary Sessions: Each group selects a leader, a recorder & a reporter for each exercise

D. A workbook of all material & exercises will be provided to each participant

Page 12

Introduction to ABET Continuous Program Improvement

Page 13

Goal of ABET

• To promote Continuous Quality Improvement in Applied Sciences, Computing, Engineering, and Technology education through faculty guidance and initiative.

Page 14

Accreditation Reform

The Paradigm Shift

Page 15

Philosophy

• Institutions & programs define missions and objectives

– Focus on the needs of their constituents
– Enable program differentiation
– Encourage creativity in curricula

• Emphasis on outcomes
– Skills/knowledge required for professional practice
– Technical and non-technical elements

• Programs demonstrate that they are
– Meeting their objectives
– Satisfying accreditation criteria

Page 16

Emphases

• Practice of Continuous Improvement
– Input of constituencies
– Process reliability & sustainability
– Outcomes, Objectives, and Assessment
– Technical and Professional Knowledge required by the Profession

• Resources linked to Program Objectives
– Student
– Faculty and Support Personnel
– Facilities
– Institutional Support and Funding

Page 17

Primary Expectations of Programs

• Adequate preparation of graduates for engineering careers

• Effective Continuous Quality Improvement Processes

Page 18

The Focus

• Meaningful Educational Objectives

• Effective Program Outcomes

• Practical Assessment Tools

• Effective & Sustainable Assessment Plan

• Robust and Credible Evaluation Plan

Page 19

ABET Definitions 

Program Educational Objectives – broad statements that describe the career and professional accomplishments that the program is preparing graduates to achieve within the first few years after graduation.

Program Outcomes – narrower statements that describe what students are expected to know and be able to do by the time of graduation. These are the skills, knowledge, and behaviors that enable graduates to achieve the Program Educational Objectives. They are acquired by students as they progress through the program.

Page 20

ABET Definitions

Assessment – processes to identify, collect, and prepare data that are needed to evaluate the achievement of Program Outcomes and Program Educational Objectives.

Evaluation – processes that interpret data accumulated through assessment. Evaluation determines the extent to which Program Outcomes or Program Educational Objectives are being achieved. Evaluation results in decisions & actions that improve a program.

Page 21

Continuous Quality Improvement is

• a systematic pursuit of excellence and
• satisfaction of the needs of constituencies in
• a dynamic and competitive environment.

Page 22

Continuous Quality Improvement

• Must be systematic and systemic
• Is the dynamic behavior of an organization
• Must be shared at all organizational levels
• May be motivated by external factors
• Must be sustained by internal behavior
• Requires that the continuous pursuit of excellence determine the philosophies, plans, policies, and processes of the organization

• Requires continuous interaction between internal and external constituencies

• Focuses on the needs of constituencies

Page 23

CQI Starts with Basic Questions

• Who are our constituencies?
• What services do we provide?
• Do constituencies understand our objectives?
• What services, facilities, and policies are necessary to ensure that we continue to satisfy our constituencies?

• Do our suppliers and institutional leadership understand and support our needs?

Page 24

… More Basic Questions

• What steps do we perform to provide our services?

• Are our constituencies satisfied with our services?

• How do we measure our effectiveness?
• How do we use these measures to continuously improve our services?
• Are we achieving our objectives and improving?

Page 25

Assessment: Foundation of CQI

• Assessment of inputs & processes establishes the capability or capacity of a program

• Assessment of outcomes measures how effectively the capability has been used

• Outcomes assessment improves:
– Effectiveness
– Learning
– Accountability

Page 26

CQI as an Operating Philosophy

• Quality improvement comes from within the institution

• Continuous improvement requires the planned integration of objectives, performance metrics, & assessment

• Continuous improvement is cyclical

• Assessment of performance is the baseline for future assessment

• Educational objectives, mission, and needs of constituencies must be harmonized to achieve CQI

Page 27

Role of ABET Accreditation

ABET accreditation provides periodic external assessment in support of the continuous quality improvement program of the institution.

Page 28

Potential Constituencies

• Students, parents, employers, faculty, alumni
• Industry advisors, accrediting agencies
• Educational administration: department, school, college, etc.
• Government agencies: local, state, federal
• Transfer colleges that supply students
• Graduate programs that accept graduates
• Donors, contributors, supporters

Page 29

Step 1: Who are your constituencies?

• Identify possible constituencies.

• What are the expectations of each constituency?

• How will constituencies be satisfied?

• When will constituencies be satisfied?

• What relative priority do constituencies hold?

• How will constituencies be involved in your CQI?

Page 30

Pick Your Constituencies

• Select no more than three constituencies to focus on for the workshop exercises

• Assign a person to represent each of these constituencies at each table

• Consider what influence the choice of constituencies will have on Educational Objectives and Outcomes

Page 31

Objectives: Exercise 1

Page 32

Outcomes: Exercise 2

Page 33

Report Out on Exercise 1 and Exercise 2

Page 34

Objectives Summary

• Each addresses one or more needs of a constituency

• Must be understandable by the constituency being served

• Should be limited to a manageable number of statements

• Should be broader statements than the Program Outcomes

• Every Objective must be supported by at least one Program Outcome

Page 35

Outcomes Summary

• Each describes an area of knowledge and/or skill that a person can demonstrate

• Should be stated such that a student can demonstrate it upon completion of the program and before graduation

• Must be a unit of knowledge/skill that supports at least one Educational Objective

• Collectively, Outcomes define the skills and knowledge imparted by the degree program

• Outcomes statements normally do not include measures or performance expectations

Page 36

Assessment Basics

Page 37

Program Assessment of Student Learning ©

Gloria Rogers, Ph.D.
Associate Executive Director, Professional Services
ABET, Inc.

Page 38

Foundational Truths

Programs are at different places in the maturity of their assessment processes

Programs have different resources available to them (e.g., number of faculty, availability of assessment expertise, time)

Each program has faculty who are at different places in their understanding of good assessment practice

Page 39

Hierarchy of assessment learning

[Figure: Bloom's levels (Knowledge, Comprehension, Application, Analysis, Synthesis, Evaluation) mapped to three stages of assessment learning.]

NOVICE: “Everyone who makes a presentation is an expert and I am a sponge.”

INTERMEDIATE: “I apply what I have learned and begin to analyze the effectiveness of my assessment processes.”

ADVANCED: “I can take what I have learned and put it in context. I begin to question what I hear, challenge assumptions and make independent decisions about effective practices for my program.”

Page 40

[Figure: Input → Processes → Outputs → Outcomes]

Input (What comes into the system?): faculty background; student background; educational resources

Processes (What are we doing with the inputs?): programs & services offered; populations served; policies, procedures, governance; faculty teaching loads/class size

Outputs (How many?): publication numbers/faculty development activities; credit hours delivered; statistics on resource availability, participation rates; student grades; graduation rates; employment statistics

Outcomes (What is the effect?): student learning and growth; faculty publication citations data; faculty development; what have students learned, what skills have they gained, what attitudes have they developed?

Page 41

[Figure: the Input and Processes portions of the preceding diagram]

Assessment of inputs and processes only establishes the capability or capacity of a program (how many courses and what is “covered”, background of faculty, nature of facilities, etc.).

Page 42

[Figure: the Outputs portion of the preceding diagram]

Assessments of outputs serve as indirect measures or proxies for effectiveness—they provide general indicators of achievement.

Page 43

[Figure: the Outcomes portion of the preceding diagram]

Assessment of outcomes provides direct measures of the effectiveness of what has been done with that capability/capacity, related to individual learning and growth.

Page 44

Taxonomy of Approaches to Assessment (Terenzini, JHE Nov/Dec 1989)

Dimensions:
• Purpose of Assessment (Why?): Learning/Teaching (Formative) vs. Accountability (Summative)
• Level of Assessment (Who?): Individual vs. Group
• Object of Assessment (What?): Knowledge, Skills, Attitudes & Values, Behavior

Individual / Formative: Competency-Based Instruction; Assessment-Based Curriculum; Individual Performance Tests; Placement and Advanced Placement Tests; Vocational Preference Tests; Other Diagnostic Tests

Individual / Summative (“Gatekeeping”): Admissions Tests; Rising Junior Exams; Comprehensive Exams; Certification Exams

Group / Formative (Program Enhancement): Individual assessment results may be aggregated to serve program evaluation needs

Group / Summative (Campus and Program Evaluation): Program Reviews; Retention Studies; Alumni Studies; “Value-added” Studies

Page 45

ABET Terms, Definitions, and Some Other Terms for the Same Concept

Objectives: Broad statements that describe the career and professional accomplishments that the program is preparing graduates to achieve. (Other terms: goals, outcomes, purpose, etc.)

Outcomes: Statements that describe what students are expected to know and be able to do by the time of graduation. (Other terms: objectives, standards, etc.)

Performance Criteria: Specific, measurable statements identifying the performance(s) required to meet the outcome; confirmable through evidence. (Other terms: performance indicators, standards, rubrics, specifications, metrics, outcomes, etc.)

Assessment: Processes that identify, collect, use, and prepare data that can be used to evaluate achievement. (Other term: evaluation)

Evaluation: Process of reviewing the results of data collection and analysis and making a determination of the value of findings and action to be taken. (Other term: assessment)

Page 46

Assessment for Quality Assurance ©

[Figure: continuous-improvement cycle (Gloria Rogers, ABET, Inc.), centered on Constituents: Mission, Educational Objectives, Learning Outcomes, Measurable Performance Criteria, Educational Practices/Strategies, Assessment (collection and analysis of evidence), Evaluation (interpretation of evidence), and Feedback for Continuous Improvement, with assess/evaluate links back through the cycle.]

Page 47

Classroom Assessment ©

Assessment focus: evaluate individual student performance (grades); evaluate teaching/learning

Context: subject matter, faculty member, pedagogy, student, facility

Timeline: 1 semester/quarter

[Figure: Subject → Topics → Concepts map (G. Rogers, ABET). Subject: Strength of Materials. Topics (with concepts): Terminology (stress, strain); Material Properties (tensile strength, ductility); Beams (shear force, bending moment); Torsion (angle of twist, power transmission); Columns (Euler buckling); Fatigue (crack growth, S-N curves).]

Page 48

(G. Rogers, ABET, Inc.)

Objective: Work effectively with others

Outcome: Ability to function on a multidisciplinary team

Performance Criteria:
• Makes contributions: researches and gathers information
• Takes responsibility: fulfills duties of team roles
• Shares work equally
• Values other viewpoints: listens to other teammates

Page 49

Program Assessment

[Figure: reciprocal-causation model, adapted from Terenzini et al., 1994, 1995. Student pre-college traits and the institutional context feed into the classroom experience (pedagogy, facilities, climate, faculty & student characteristics), out-of-class experiences (co-curricular activities, co-ops, internships, support services), and coursework & curricular patterns (classes chosen, major); together with environmental factors, these produce educational outcomes. Timeline: xx years.]

Page 50

Differences between classroom and program assessment

• Degree of complexity
• Time span
• Accountability for the assessment process
• Cost
• Level of faculty buy-in
• Level of precision of the measure

Page 51

Work Effectively in Teams (rubric outline)

Scale: Unsatisfactory (1), Developing (2), Satisfactory (3), Exemplary (4); plus a Score column

Criteria:
• Contribute: research & gather information
• Take responsibility: fulfill team role's duties
• Share equally
• Value others' viewpoints: listen to other teammates
• Average

Page 52

Work Effectively in Teams

Contribute: Research & gather information
1 Unsatisfactory: Does not collect any information that relates to the topic.
2 Developing: Collects very little information--some relates to the topic.
3 Satisfactory: Collects some basic information--most relates to the topic.
4 Exemplary: Collects a great deal of information--all relates to the topic.

Take responsibility: Fulfill team role's duties
1 Unsatisfactory: Does not perform any duties of the assigned team role.
2 Developing: Performs very few duties.
3 Satisfactory: Performs nearly all duties.
4 Exemplary: Performs all duties of the assigned team role.

Share equally
1 Unsatisfactory: Always relies on others to do the work.
2 Developing: Rarely does the assigned work--often needs reminding.
3 Satisfactory: Usually does the assigned work--rarely needs reminding.
4 Exemplary: Always does the assigned work without having to be reminded.

Value others' viewpoints: Listen to other teammates
1 Unsatisfactory: Is always talking--never allows anyone else to speak.
2 Developing: Usually doing most of the talking--rarely allows others to speak.
3 Satisfactory: Listens, but sometimes talks too much.
4 Exemplary: Listens and speaks a fair amount.

(Each criterion receives a score; the Average row combines them.)

Page 53

Developing performance criteria

• Two essential parts
– Content reference
• Subject content that is the focus of instruction (e.g., steps of the design process, chemical reaction, scientific method)
– Action verb
• Direct students to a specific performance (e.g., “list,” “analyze,” “apply”)

Page 54

[Figure: Bloom's levels (Knowledge, Comprehension, Application, Analysis, Synthesis, Evaluation) mapped to learner stages (NOVICE, INTERMEDIATE, EXPERT) and to curricular roles (INTRODUCE, REINFORCE, DEMONSTRATE/CREATE).]

Page 55

Clarity of performance criteria

• Use of action verbs consistent with appropriate level of learning

• Reference table

Page 56

Writing Measurable Outcomes: Exercise 3

Page 57

Report Out on Exercise 3

Page 58

Examples

www.engrng.pitt.edu/~ec2000

Page 59

Page 60

What is an ‘acceptable’ level of performance?

Developing scoring rubrics

Page 61

What is a rubric, anyway?

• A rubric is a set of categories which define and describe the important components of the work being completed, critiqued, or assessed.

• Each category contains a gradation of levels of completion or competence, with a score assigned to each level and a clear description of the performance that needs to be met to attain the score at each level.
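
To make the definition concrete, here is a minimal sketch (not from the original slides; the two categories are borrowed from the "Work Effectively in Teams" rubric shown earlier, and everything else is illustrative) of an analytic rubric represented in Python, with a function that averages a student's category scores the way the rubric's Average row does:

    # A rubric: each category maps score levels (1-4) to a description of the
    # performance required to earn that score.
    RUBRIC = {
        "Research & gather information": {
            1: "Does not collect any information that relates to the topic.",
            2: "Collects very little information--some relates to the topic.",
            3: "Collects some basic information--most relates to the topic.",
            4: "Collects a great deal of information--all relates to the topic.",
        },
        "Fulfill team role's duties": {
            1: "Does not perform any duties of the assigned team role.",
            2: "Performs very few duties.",
            3: "Performs nearly all duties.",
            4: "Performs all duties of the assigned team role.",
        },
    }

    def average_score(ratings):
        """Average one student's category scores (the 'Average' row of the rubric)."""
        for category, level in ratings.items():
            if level not in RUBRIC[category]:
                raise ValueError(f"{category}: level {level} is not on the scale")
        return sum(ratings.values()) / len(ratings)

    # Example: a hypothetical student's ratings on the two categories above.
    print(average_score({"Research & gather information": 3,
                         "Fulfill team role's duties": 4}))   # 3.5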

Page 62

Purpose of Rubric (What do you want it to do?)

• Information to/about student competence (Analytic)
– Communicate expectations
– Diagnosis for purpose of improvement and feedback
• Overall examination of the status of student performance (Holistic)

Page 63

Generic or Task-Specific?

• Generic
– General rubric that can be used across similar performances (used across all communication tasks or problem-solving tasks)
• Big-picture approach
• Element of subjectivity

• Task-specific
– Can only be used for a single task
• Focused approach
• Less subjective

Page 64

How many points on the scale?

• Consider both the nature of the performance and purpose of scoring

• Recommend 3 to 6 points to describe student achievement at a single point in time.

• If focused on a developmental curriculum (growth over time), more points are needed (i.e., 6-11).

Page 65

RUBRIC TEMPLATE

Student Outcome _______________________________

[Table template: columns are scale levels (numeric with a descriptor); rows are performances; each cell holds identifiable performance characteristics reflecting that level.]

Page 66

Effective Writing Skills

Content: Supporting detail
1 Unsatisfactory: Includes inconsistent or few details, which may interfere with the meaning of the text.
2 Developing: Includes some details, but may include extraneous or loosely related material.
3 Satisfactory: Provides adequate supporting detail to support solution/argument.
4 Exemplary: Provides ample supporting detail to support solution/argument.

Organization: Organizational pattern
1 Unsatisfactory: Little evidence of organization or any sense of wholeness or completeness.
2 Developing: Achieves little completeness and wholeness though organization is attempted.
3 Satisfactory: Organizational pattern is logical and conveys completeness and wholeness with few lapses.
4 Exemplary: Organizational pattern is logical and conveys completeness and wholeness.

Style: Language and word choice
1 Unsatisfactory: Has limited or inappropriate vocabulary for the audience and purpose.
2 Developing: Limited and predictable vocabulary, perhaps not appropriate for intended audience and purpose.
3 Satisfactory: Uses effective language and appropriate word choices for intended audience and purpose.
4 Exemplary: Uses effective language; makes engaging, appropriate word choices for audience and purpose.

Style: Standard English
1 Unsatisfactory: Does not follow the rules of standard English.
2 Developing: Generally does not follow the rules of standard English.
3 Satisfactory: Generally follows the rules of standard English.
4 Exemplary: Consistently follows the rules of standard English.

(Average across criteria.)

Page 67

Work Effectively in Teams: completed rubric (identical to the rubric shown on Page 52).

Page 68

Example of Results

[Figure: bar chart for “Work effectively in teams,” scale 0% to 100%: percentage of students performing at a level expected for a student who will graduate.]

Page 69

Example of Results: Teaming Skills

[Figure: results by performance criterion: 1. Research & gather information; 2. Fulfill team role's duties; 3. Shares equally; 4. Listens to teammates.]

Page 70

Example of Results: Communication Skills

[Figure: results by performance criterion.]

Page 71

Linking results to Practice

• Development of Curriculum Map

• Linking curriculum content/pedagogy to knowledge, practice and demonstration of learning outcomes
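
As an illustration of what such a map can look like in machine-readable form, the sketch below links each outcome to the courses where it is introduced, reinforced, or demonstrated (the Introduce/Reinforce/Demonstrate progression from the earlier slide). The course codes are taken from the curriculum-map slide that follows, but their pairing with specific outcomes here is hypothetical:

    # Illustrative curriculum map: outcome -> {course: curricular role}.
    CURRICULUM_MAP = {
        "Communicate effectively in written form": {
            "RH 131 Fresh Comp": "Introduce",
            "CH 402 ChE Lab I": "Reinforce",
            "CH 407 Design II": "Demonstrate",
        },
        "Work effectively in teams": {
            "EM 103 Int Design": "Introduce",
            "CH 406 Design I": "Demonstrate",
        },
    }

    def courses_covering(outcome, role=None):
        """Courses that address an outcome, optionally filtered by curricular role."""
        return [course for course, r in CURRICULUM_MAP.get(outcome, {}).items()
                if role is None or r == role]

    print(courses_covering("Communicate effectively in written form"))
    print(courses_covering("Work effectively in teams", role="Demonstrate"))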

Page 72

Legend:
• Outcome Explicit: this outcome is explicitly stated as being a learning outcome for this course.
• Demonstrate Competence: students are asked to demonstrate their competence on this outcome through homework, projects, tests, etc.
• Formal Feedback: students are given formal feedback on their performance on this outcome.
• Not Covered: this outcome is not addressed in these ways in this course.
Note: clicking on the link “view rubric” shows the scoring rubric for that particular performance criterion related to the outcome.

Outcome/Performance Criteria (columns: Outcome Explicit | Demonstrate Competence | Formal Feedback | Not Covered)

Recognition of ethical and professional responsibilities
1. Demonstrate knowledge of professional codes of ethics. Yes | Yes | Yes
2. Evaluate the ethical dimensions of professional engineering, mathematical, and scientific practices. Yes | Yes | Yes

An ability to work effectively in teams
1. Share responsibilities and duties, and take on different roles when applicable. Yes | Yes | Yes
2. Analyze ideas objectively to discern feasible solutions by building consensus. Yes | Yes | Yes
3. Develop a strategy for action. Yes | Yes | Yes

An ability to communicate effectively in oral, written, graphical, and visual forms
1. Identify the readers/audience, assess their previous knowledge and information needs, and organize/design information to meet those needs. Yes | Yes | Yes
2. Provide content that is factually correct, supported with evidence, explained with sufficient detail, and properly documented. Yes | Yes | Yes
3. Test readers/audience response to determine how well ideas have been relayed. Yes | Yes | Yes
4. Submit work with a minimum of errors in spelling, punctuation, grammar, and usage. Yes | Yes | Yes

(Each criterion also offers a “view rubric or make a comment (optional)” link.)

Page 73

Curriculum map for Communication Skills (course, credit hours)

FALL
1st Year: CM 111 Chem I (4); EM 100 Life Skills (1); EM 104 Graph Comm (2); RH 131 Fresh Comp (4); MA 111 Calc I (5)
2nd Year: CH 01 Cons Principles (4); CM 251 O Chem I (4); MA 221 DE I (4); HSS Elective (4); CH 200 Career P I (0)
3rd Year: CH 414 Heat Transfer (4); CH 415 Materials (4); CM 225 A Chem I (4); CH 304 Thermo II (4); Elective (4)
4th Year: CH 400 Career P III (0); CH 401 Mass II (4); CH 403 Lab II (2); CH 404 Kinetics (4)

WINTER
1st Year: CM 113 Chem II (4); PH 111 Physics I (4); HSS Elective (4); MA 112 Calc II (5); MS 120 M. History (1)
2nd Year: CH 202 ChE Proc Calc (4); CM 252 O Chem II (4); MA 222 DE II (4); EM 101 Statics I (2); HSS Elective (4)
3rd Year: CH 300 Career P II (0); CM 360 P Chem (4); CH 305 Mass I (4); MA 227 Statistics (4); Elective (4)
4th Year: CH 406 Design I (4); CH 408 Lab III (2); CH 440 P Control (4); HSS Elective (4)

SPRING
1st Year: CM 115 Chem III (4); CS 100 Program. (2); EM 103 Int Design (2); MA 113 Calc III (5); PH 112 Physics II (4)
2nd Year: CH 301 Fluids (4); Elective (4); HSS Elective (4); CH 303 Thermo I (4); HSS Elective (4)
3rd Year: EE 206 EEE (4); CH 402 ChE Lab I (1); Elective (4); Elective (4); Elective (free) (4)
4th Year: CH 407 Design II (4); CH 409 Prof Prac (1); HSS Elective (4); Elective (Des) (4)

Page 74

Assessment Methods

Page 75

Assessment Methods

• Written surveys and questionnaires
• Exit and other interviews
• Standardized exams
• Locally developed exams
• Archival records
• Focus groups
• Portfolios
• Simulations
• Performance appraisal
• External examiner
• Oral exams
• Behavioral observations

Page 76

Direct Measures

Direct measures provide for the direct examination or observation of student knowledge or skills against measurable learning outcomes

Page 77

Indirect Measures

Indirect measures of student learning ascertain the opinion or self-report of the extent or value of learning experiences

Page 78

Direct
• Exit and other interviews
• Standardized exams
• Locally developed exams
• Portfolios
• Simulations
• Performance appraisal
• External examiner
• Oral exams
• Behavioral observations

Indirect
• Written surveys and questionnaires
• Exit and other interviews
• Archival records
• Focus groups

Page 79

Tools: Exercise 4

Page 80

Assignment

• After you have shared methods, choose at least two methods (preferably three) that are appropriate for the performance criteria chosen

• At least one DIRECT measure
• Use overhead transparency to record your findings
• Include your rationale for the decision

Page 81

Report out on Exercise 4

Page 82

Validity

• relevance - the assessment option measures the educational outcome as directly as possible

• accuracy - the option measures the educational outcome as precisely as possible

• utility - the option provides formative and summative results with clear implications for educational program evaluation and improvement

Page 83

“Bottom Lines”

• All assessment options have advantages and disadvantages

• “Ideal” methods are those that provide the best fit among program needs, satisfactory validity, and affordability (time, effort, and money)

• Crucial to use multi-method/multi-source approach to maximize validity and reduce bias of any one approach


Page 84

Assessment Method Truisms

• There will always be more than one way to measure any learning outcome

• No single method is good for measuring a wide variety of different student abilities

• There is generally an inverse relationship between the quality of measurement methods and their expediency

• It is important to pilot test to see if a method is appropriate for your program

Page 85

Data Collection Process

• Why?
– Know your question
• What?
– Focus on a few criteria for each outcome
• Who?
– Students (cohorts); faculty (some)
• When?

Page 86

Sampling

• For program assessment, sampling is acceptable and even desirable for programs of sufficient size.
– Sample is representative of all students
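
A small sketch of what drawing such a sample might look like (illustrative only; the artifact names and sample size are hypothetical, not from the slides):

    import random

    # Hypothetical pool of student artifacts (e.g., design reports) for one outcome.
    artifacts = [f"report_{i:03d}.pdf" for i in range(1, 121)]   # 120 students

    random.seed(2006)   # fixed seed so the same sample can be re-drawn and audited
    sample = random.sample(artifacts, k=30)   # score 30 artifacts instead of all 120
    print(sorted(sample)[:5])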

Page 87

Data collection

• How do objectives differ from outcomes in the data collection process?

[Figure: multi-year assessment timeline. Yr 1: define outcomes, map curriculum. Yr 2: data collection. Yr 3: evaluation & design of improvements. Yr 4 and beyond: implement improvements & data collection.]

Page 88

Learning Outcomes related to: (columns: 03-04, 04-05, 05-06, 06-07, 07-08, 08-09)

• A recognition of ethical and professional responsibilities
• An understanding of how contemporary issues shape and are shaped by mathematics, science, & engineering
• An ability to recognize the role of professionals in the global society
• An understanding of diverse cultural and humanistic traditions
• An ability to work effectively in teams
• An ability to communicate effectively in oral, written, graphical, and visual forms

[In the original table, cells indicate the academic years in which each outcome is assessed.]

Page 89

Closing the loop

[Figure: annual cycle, January through December:]

• The Institute assessment committee prepares reports of the collected data (e.g., surveys, e-portfolio ratings) for submission to Department Heads.
• The Eval Committee receives and evaluates all data, makes a report, and refers recommendations to the appropriate areas.
• The Institute acts on the recommendations of the Eval Committee.
• Reports of actions taken by the Institute and the targeted areas are returned to the Eval Committee for iterative evaluation.

Page 90

Student Learning Outcomes at the PROGRAM level ©

Learning Outcome ________________________________________________________________________

Template columns: Performance Criteria | Strategies | Assessment Method(s) | Context for Assessment | Time of Data Collection | Assessment Coordinator | Evaluation of Results

Results _____ (date):

Actions _____ (date):

Second-Cycle Results ____ (date):

[email protected]

Page 91

Checklist

• Assessment question is known and explicit
• Outcomes are defined and the number of performance criteria is manageable
• Data are efficiently and systematically collected
• Assessment methods are appropriate to the program context
• Results are evaluated
• Evaluation is more than looking at the results of learning outcomes
• Action is appropriate

Page 92

Things I wish I had known:

• Capitalize on what you are already doing
• One size does not fit all
• You don't have to measure everything all the time
• More data are not always better
• Pick your battles
• Take advantage of local resources
• Don't wait for perfection
• Go for the early win
• Decouple from faculty evaluation

Page 93

Page 94

• Tools to help you work through the assessment process
• Assessment of student learning outcomes
• Assessment processes in business and industry
• Assessment rubrics
• Electronic portfolios
• Assessment terminology
• Using grades for assessment
• Using surveys and questionnaires for assessment
• Data collection
• General assessment articles and presentations
• Assessment workshops and conferences

Page 95

Page 96

April 13-14, 2007

www.rose-hulman.edu/assessent2007

Page 97

ABET Lessons Learned

Page 98

ABET Lessons Learned (1/6)

• Start as soon as possible

• Develop a comprehensive plan

• Begin implementing the plan as quickly as possible

• Do not allow the early steps to consume excessive time and create delays in the process

• Close Continuous Improvement loops as soon as possible

• Use consultants with caution - there can be positive and negative effects

Page 99

ABET Lessons Learned (2/6)

• It is extremely important to define terminology

• When reported to constituents or external evaluators, evidence should be organized by Outcomes and Objectives rather than by courses

• Evidence should show evaluation and assessment processes are in place and working

• The accumulation of experience with outcomes assessment and continuous improvement will build confidence for all constituencies

Page 100

ABET Lessons Learned (3/6)

• Coordination between program assessment and institutional assessment can enhance both

• When presenting information for accreditation reviews:

– Descriptions of the CI process should be accompanied by evidence of data reduction, analysis, and the resultant actions

– Text should be used to explain, interpret, and strengthen tabular or statistical data

Page 101

ABET Lessons Learned (4/6)

• Each program should have some unique Outcomes that are different from those in the accreditation criteria and those in other programs at the same institution. The absence of unique Outcomes can imply that the program does not have a clear sense of mission.

• The most successful programs are those with faculty members who have participated in training sessions and communicated with faculty at other institutions

• It is important for the program Administration to be aware and supportive of Continuous Improvement activities

Page 102

ABET Lessons Learned (5/6)

• Continuous Improvement programs should employ a variety of assessment tools with a mixture of short and long time cycles

• Surveys should be only one of several evaluation tools used in Continuous Improvement

• Requirements for faculty, facilities, etc. should be linked to objectives, outcomes, and Continuous Improvement

• There has been no apparent relationship between the degree of success and the size of the institution

Page 103

ABET Lessons Learned (6/6)

• Programs that have successfully implemented Continuous Improvement have had two characteristics in common:

– At least one faculty member who is highly committed to developing and guiding the implementation

– Sincere involvement of the faculty members in the program

Page 104

Introduction to ABET

Page 105

Introduction to ABET Accreditation

• Federation of 28 professional societies
• Board of Directors representing those societies
• Four Commissions
– Applied Science Accreditation Commission (ASAC)
– Computing Accreditation Commission (CAC)
– Engineering Accreditation Commission (EAC)
– Technology Accreditation Commission (TAC)
• Accreditation Council
– Representatives of each commission
– Coordination, harmonization of processes

Page 106

Accreditation Process

• Commission responsibilities
– Conduct evaluations of programs
– Determine accreditation actions
• Commission makeup
– Commissioners are volunteers appointed by societies
– Commissioners chair accreditation teams
• Accreditation Team
– Chair + one Program Evaluator for each program
– Program Evaluators (PEVs) are volunteers from societies

Page 107

ABET Accreditation

• Federation of 28 professional societies
• Board of Directors represents those societies
• Four Commissions
– Applied Science Accreditation Commission (ASAC)
– Computing Accreditation Commission (CAC)
– Engineering Accreditation Commission (EAC)
– Technology Accreditation Commission (TAC)
• Accreditation Council
– Representatives of each commission
– Coordination, harmonization of processes

Page 108

ABET Accreditation Statistics

Commission | ASAC | CAC | EAC | TAC
Total Programs Accredited | 72 | 240 | 1793 | 740
Programs Evaluated in 2004-05 | 15 | 70 | 373 | 206
Change in Number of Programs from 1995-2005 | +57% | +85% | +18% | -16%

Page 109

ABET Longitudinal Study

Page 110

Engineering Change: A Study of the Impact of EC2000*

Lisa R. Lattuca, Project Director and Co-PI
Patrick T. Terenzini, Co-PI

J. Fredericks Volkwein, Co-PI

Pennsylvania State University Center for the Study of Higher Education

*EC2000 = Outcomes-based accreditation criteria for the Engineering Accreditation Commission of ABET

Page 111

Key Questions

1. What impact, if any, has EC2000 had on the preparation of graduating seniors to enter the engineering profession?

2. What impact, if any, has EC2000 had on practices that may be related to changes in student preparation?

Page 112

Significance of the Engineering Change Study

• The first national study of the impact of outcomes-based accreditation in the U.S.

• A model for assessments in other ABET Commissions.

• A pre-EC2000 benchmark (1994) for graduating seniors’ preparation.

• The first post-EC2000 data point (2004) on graduating seniors’ preparation.

Page 113

Engineering Change: Studying the Impact of EC2000

[Figure: conceptual model. EC2000 drives program changes (curriculum & instruction; faculty culture; policies & practices; continuous improvement), which shape student experiences (in-class and out-of-class), which in turn produce the outcomes of interest (student learning on Criteria 3.a-k; employer ratings).]

Page 114

Engineering Disciplines Examined

Aerospace, Chemical, Civil, Computer, Electrical, Industrial, Mechanical

Page 115

Data Sources and Response Rates

Data SourcesTarget

PopulationNumber of Responses

Response

Rate

Programs 203 147 72%

Faculty 2,971 1,243 42%

Deans 40 40+ 98%

1994 Graduates (Pre-) 13,054 5,494 42%

2004 Graduates (Post-) 12,921 4,330 34%

Employers unknown 1,622 N/A

Page 116

Conclusions

• Recent graduates are measurably better prepared than those of a decade ago in all nine EC2000 outcomes.

• The most substantial improvements are in Societal and Global Issues, Applying Engineering Skills, Group Skills, and Ethics and Professionalism.

• Changes in faculty practices are empirically linked to these increases in preparation.

• Although 25% of employers report decreases in problem-solving skills, 80% still think graduates are adequately or well-prepared in that skill area.

Page 117

Conclusions

• A complex array of changes in programs, faculty practices, and student experiences systematically enhances student learning.

• These changes are consistent with what one would expect to see if EC2000 was having an impact.

• Changes at the classroom level are particularly effective in promoting the a-k learning outcomes.

Page 118

Conclusions

• Students also learn engineering skills through out-of-class experiences.

• Finally, a faculty culture that supports assessment and continuous improvement is also important.

• Most deans’ comments echoed the study findings:

– EC2000 is an accelerant for change in engineering programs.

Page 119

Looking Forward

• ABET has set the stage for systematic continuous review of engineering education.

• Engineering Change provides important evidence that an outcomes-based model is an effective quality assurance mechanism.

• Evidence arrives just in time to inform the national debate.

Page 120

ABET Participation Project

Page 121

Participation Project PILOT Report

July 22, 2006

Page 122

Partnership to Advance Volunteer Excellence (PAVE)

Design and implement a comprehensive and effective program that optimizes the use of the expertise and experience of the volunteer professionals that participate in ABET’s outcomes-based accreditation process.

Page 123

Key Components

• Develop competency model for Program Evaluators

• Design a more effective recruitment and selection process

• Design a more effective training process

• Design a method of performance assessment and improvement

Page 124

What are competencies?

• Competencies are behaviors (which include knowledge, skills, and abilities) that define a successful PEV (program evaluator)

• Set expectations
• Align with vision, values, and strategy
• Drive continuous improvement

Page 125

Competencies

Effective Communicator
• Easily conducts face-to-face interviews
• Writes clearly and succinctly
• Presents focused, concise oral briefings

Professional
• Conveys professional appearance
• Is committed to contributing and adding value
• Is considered a person with high integrity and ethical standards

Page 126

Competencies

Interpersonally Skilled
• Friendly and sets others at ease
• Listens and places input into context
• Open-minded and avoids personal bias
• Forthright – doesn't hold back what needs to be said
• Adept at pointing out strengths & weaknesses in a non-confrontational manner

Technically Current
• Demonstrates required technical credentials for the position
• Engaged in lifelong learning and current in their field

Page 127

Competencies

Organized
• Is focused on meeting deadlines
• Focuses on critical issues and avoids minutiae
• Displays take-charge initiative
• Takes responsibility and works under minimum supervision

Team Oriented
• Readily accepts input from team members
• Works with team members to reach consensus
• Values team success over personal success

Page 128

Becoming an ABET Program Evaluator

[Figure: flowchart in three phases (Phase I-III). The Member Society selects a PEV candidate via the competency model and assigns a mentor; the candidate works the preliminary on-line modules and successfully completes them; the candidate attends visit-simulation training, supported by a Lead Facilitator and Support Facilitators from the Society, and successfully completes it; the candidate attends program-specific training (Society); an observer visit is optional; the Society then approves the PEV for assignment.]

Page 129

Training Pilot

• Pre-Work CD with Checks for Understanding
– Mentor assigned
– Self-study
– Complete pre-visit forms
• 1.5 days simulating a campus visit
– Sunday team meeting
– Display materials and lab interview
– Draft statement homework
– Monday night meeting

Page 130

Evaluation Pilot

• Performance Appraisal forms:
– Describe how competencies are demonstrated pre-visit and during the visit
– Provide performance metrics
– Require comments for ratings below “met expectations”
– Peer, Team Chair, Program

Page 131

Partnership to Advance Volunteer Excellence

• Determine best implementation strategies together

• Information-sharing, action planning and collaboration to carry the good work forward

• Increase the value of accreditation for your programs

Page 132

Points of Learning

Page 133

Questions & Answers