Transcript of Workshop 2B: What to Look for in an Outcomes-Based Process

Page 1:

Workshop 2B: What to Look for in an Outcomes-Based Process

Susan McCahan, Vice-Dean, Undergraduate, Faculty of Applied Science, University of Toronto
Peter Wolf, Director, Centre for Open Learning & Educational Support, University of Guelph

Brian Frank (project coordinator), Queen's University
Susan McCahan, University of Toronto
Lata Narayanan, Concordia University
Nasser Saleh, Queen's University
Nariman Sepehri, University of Manitoba
Peter Ostafichuck, University of British Columbia
K. Christopher Watts, Dalhousie University
Peter Wolf, University of Guelph

Page 2:

Workshop Outcomes: What makes for a sustainable, effective outcomes-based curriculum improvement process? In this workshop we will examine the parts of an outcomes-based curriculum improvement process and identify the characteristics of a high-quality process. We will also discuss common flaws that can undermine an outcomes-based process, learn how to identify such flaws, and learn how to correct them. Short case studies will be used to give participants an opportunity to apply what they are learning in the workshop.

You should be able to:
• Identify the characteristics of a high-quality outcomes-based curriculum improvement process
• Begin to provide an informed critique of a continuous curriculum improvement process

Page 3:

Agenda: What to look for, overall and at each step

1. Program Evaluation: Defining purpose and indicators
2. Mapping the Curriculum: Stakeholder input
3. Identifying and Collecting Data
4. Analyzing and Interpreting the Data
5. Data-Informed Curriculum Improvement: Setting priorities and planning for change

Page 4:

Perspective: Sec 3.1 of CEAB Procedures

• “The institution must demonstrate that the graduates of a program possess the attributes under the following headings... There must be processes in place that demonstrate that program outcomes are being assessed in the context of these attributes, and that the results are applied to the further development of the program.”

Page 5:

Activity: How do we ideally critique a research report, journal article, or grant proposal?

Page 6:

Frame this as a research study on your curriculum

• From the perspective of learners and outcomes
• NOT inputs or teaching

Page 7:

Overall process

What to look for:
• Research questions and methodology are well defined and align with outcomes
• Process includes all key elements
• Process is well defined and sustainable
• Process is continuous: cycle of data collection and analysis is explained

Page 8:

Research Questions: case study

1. What are students’ strengths and weaknesses in communication ability after completing our program?
2. There are several courses we think teach and utilize investigation skills; where are students really learning to investigate a problem?
3. Where does the program cover project management?
4. How many times do students participate in team-based projects?
5. Does our students’ problem-solving ability meet our expectations?

Page 9:

Sample Process Framework (cont’d)

Example 1: data collection by attribute

Task                          | 2011/12 | 2012/13 | 2013/14 | 2014/15
Group 1 Attributes            |    X    |         |         |
Group 2 Attributes            |         |    X    |         |
Group 3 Attributes            |         |         |    X    |
Integration of all Attributes |         |         |         |    X

Example 2: classic longitudinal study in 12 dimensions (i.e. cohort follow)

Task   | 2011/12 | 2012/13 | 2013/14 | 2014/15 | 2015/16
Year 1 |    -    |    X    |         |         |    X
Year 2 |    -    |         |    X    |         |
Year 3 |    -    |         |         |    X    |
Year 4 |    -    |    X    |         |         |    X

Page 10:

Sample Process Framework (cont’d)

Example 3: data collection by snapshot

Task                  | 2011/12 | 2012/13 | 2013/14 | 2014/15 | 2015/16
All attribute areas   |    X    |         |         |    X    |
Report data for visit |         |         |         |         |    X

Example 4: data collection on all attributes at graduation

Task   | 2011/12 | 2012/13 | 2013/14 | 2014/15 | 2015/16
Year 4 |    X    |    X    |    X    |    X    |    X

Example 5: collect data on every attribute every year across the whole curriculum

Page 11:

Sample Process Framework (cont’d)

Example 1 (tasks scheduled over 2012/13 through 2015/16):
• Graduate Survey: every year
• Student Portfolios Review: one year
• Student/Faculty Feedback: one year
• Alumni Survey: one year
• Employer Focus Group: one year
• Faculty & Student Workshops/Retreat: three of the four years
• Review assessment process & adapt: one year

Page 12:

5. Data-Informed Curriculum Improvement: Setting Priorities and Planning for Change

Page 13:

1. Program Evaluation: Defining purpose and indicators

Graduate Attributes: 12 defined by CEAB
• Characteristics of a graduating engineer
• A broad ability or knowledge base to be held by graduates of a given undergraduate engineering program

Indicators:
• Descriptors of what students must do to be considered competent in an attribute; the measurable and pre-determined standards used to evaluate learning.

Page 14:

Indicators

Investigation: An ability to conduct investigations of complex problems by methods that include appropriate experiments, analysis and interpretation of data, and synthesis of information in order to reach valid conclusions.

1) For Attribute #3 (Investigation), which of the following potential indicators are appropriate?
   a) Complete a minimum of three physical experiments in each year of study.
   b) Be able to develop an experiment to classify material behaviour as brittle, plastic, or elastic.
   c) Be able to design investigations involving information and data gathering, analysis, and/or experimentation.
   d) Learn the safe use of laboratory equipment.
   e) Understand how to investigate a complex problem.

2) What are other potential indicators for this attribute?

3) How many indicators are appropriate for this attribute? Why?

Page 15:

1. Program Evaluation: Defining purpose and indicators

What to look for:
• Indicators align with attributes and research questions
• Indicators are “leading indicators”: central to the attribute; indicate competency
• Enough indicators defined to identify strength areas and weak areas (but not too many)
• Indicators are clearly articulated and measurable

Page 16:

Example: Adapted from Queen’s, 2010

# | Attribute         | Shortname               | Primary Year | Description
2 | Problem analysis  | Identify problem        | First        | Identifies known and unknown information, uncertainties, and biases when presented a complex ill-structured problem
2 | Problem analysis  | Identify problem        | Graduating   | Identifies problem, known and unknown information, uncertainties, and biases
2 | Problem analysis  | Create process          |              | Creates process for solving problem including justified approximations and assumptions
2 | Problem analysis  | Select model            | First        | Selects and applies appropriate quantitative model and analysis to solve problems
2 | Problem analysis  | Select model            | Graduating   | Selects and applies appropriate model and analysis to solve problems
2 | Problem analysis  | Evaluate solution       |              | Evaluates validity of results and model for error, uncertainty
3 | Investigation     | Generates ideas         | First        | Generates ideas and working hypothesis
3 | Investigation     | Designs investigation   | First        | Designs investigations involving information and data gathering, analysis, and/or experimentation
3 | Investigation     | Synthesizes data        | First        | Synthesizes data and information to reach conclusion
3 | Investigation     | Appraise conclusions    | First        | Appraises the validity of conclusion relative to the degrees of error and limitations of theory and measurement
4 | Design            | Uses process            | First        | Adapts general design process to design system, component, or process to solve open-ended complex problem
4 | Design            | Identify design problem | First        | Accurately identifies significance and nature of a complex, open-ended problem
4 | Design            | Identify design problem | Graduating   | Identifies problem and constraints including health and safety risks, applicable standards, economic, environmental, cultural and societal considerations
… | …                 | …                       | …            | …

Page 17:

2. Mapping the Curriculum

• Goal:
  – Where are the students learning?
  – Where are we already assessing learning?
  – Start to identify assessment checkpoints

Page 18:

2. Mapping the Curriculum

What to look for:
• Information in the map is
  – Accurate, with some depth; identifies outcomes
  – Not simply a list of topics “covered”
• Map provides information for each attribute
  – Can include curricular and other experiences
• Map indicates where the attribute is:
  – Taught: possibly with some information
  – Assessed
  – Points of planned data collection

Page 19:

Curriculum Assessment Case Study

Curriculum Context:
• Small applied sciences undergraduate program with approximately 200 students
• 20 faculty (40% of whom are non-native English speakers), with no sessional/contract instructors

Question:
There is a suspicion and concern among faculty that the writing skills of students are lower than desired. Is this the case? If so, how should the curriculum and related practices be adapted to further enhance student writing?

Page 20:

Data Collection:

• Map writing to courses

• Survey of student work

• Student survey on writing development

• Department meeting discussion (including TAs, contract instructors, academic counselors, etc.)

Page 21:

Relevant qualitative data:
• Students wish they had more opportunities to develop writing skills
• Samples show consistently lower-than-desired level of sophistication
• The department meeting included discussion about:
  – The large international proportion of faculty
  – The appropriateness of scientists teaching writing
  – A general reluctance among faculty to teach and assess writing
  – A lack of resources and tools for those faculty interested but unsure how to go about it

Page 22:

[Chart: Courses available to Majors; number of Major courses, Pre-Requisites, and Free electives (0-25) in each of 1st through 4th year]

Page 23:

[Chart: Mapping Writing; number of 3rd- and 4th-year courses (0-18) in which writing is Taught/Assessed, Taught/Not Assessed, Not Taught/Assessed, or Not Taught/Not Assessed]

Page 24:

Continuous improvement of the curriculum can lead to:
• Superior graduating students
• Evidence of graduating student quality
• Opportunity for individual student & programme autonomy
• Enhanced time & resource usage

Note: in the Graduate Attributes process, curriculum mapping
• is a step toward outcomes assessment, not the end goal
• can yield important insights into curriculum and improvement opportunities

Page 25:

3. Collecting Data on Student Learning

• Data collection can include:
  – Qualitative data
  – Quantitative data
• Ultimately is translated into information that addresses the research questions
  – On the indicator being assessed
  – At a particular, identified point in the program

Page 26:

3. Collecting Data on Student Learning

What to look for:
• Assessment aligns with the indicator, i.e. valid data
• Triangulation is used, i.e. reliable data collection, within reason
• Assessment scoring is well designed: levels are well described and appropriate
• Assessment avoids “double-barreled” (or more) scoring
• Sampling is used appropriately
• Data are collected to assess the quality of the program/cohort, not to assess individual students

Page 27:

Case Studies

1. Communication: Ability to develop a credible argument is assessed using a multiple choice test.

2. Communication: Ability to develop a credible argument is assessed using a lab report discussion section. Grading is done based on word count.

3. Investigation: Ability to develop an investigation plan is assessed using a lab report that requires experiment design.

4. Ethics: Ethics is assessed only using the grade in an ethics course.

5. Design: Ability to generate creative design ideas is assessed using a student survey.

6. Knowledge base: A course grade in physics is used to assess the physics knowledge base.

Page 28:

Examples of Rubrics

Page 29:

Sample Rubric (Queen’s)

Scale: 1 (not demonstrated), 2 (marginal), 3 (meets expectations), 4 (outstanding); the original slide marks the threshold and target levels. Each criterion is marked out of 4.

Gathers information from appropriate sources (3.04-FY4: Gathers info)
  1: No significant information used, not cited; blatant plagiarism.
  2: Insufficient usage; improper citations.
  3: Gathers and uses information from appropriate sources, including applicable standards, patents, regulations as appropriate, with proper citations.
  4: Uses information from multiple authoritative, objective, reliable sources; cited and formatted properly.

Plans and manages time and money (3.11-FY1: Manage time and money)
  1: No useful timeline or budget described; poorly managed project; safety issues.
  2: Poor timeline or budget; infrequent meetings; minor safety problems.
  3: Plans and efficiently manages time and money; team effectively used meetings; safety considerations are clear.
  4: Efficient, excellent project plan presented; detailed budget; potential risks foreseen and mitigated.

Describes design process (3.04-FY1: Uses process)
  1: No discussion of design process.
  2: Generic design process described.
  3: Describes design process used to design system, component, or process to solve open-ended complex problem.
  4: Comprehensive design process described, with appropriate iterations and revisions based on project progress.

Incorporates social, environmental, and financial factors (3.09-FY4: Sustainability in decisions)
  1: No consideration of these factors.
  2: Factors mentioned but no clear evidence of impact on decision making.
  3: Incorporated appropriate social, environmental, and financial factors in decision making.
  4: Well-reasoned analysis of these factors, with risks mitigated where possible.

Demonstrates appropriate effort in implementation
  1: Insufficient output.
  2: Sufficient implementation but some opportunities not taken, or feedback at proposal not incorporated in implementation.
  3: Appropriate effort, analysis, and/or construction demonstrated to implement product, process, or system.
  4: Outstanding implementation.

Compares design solution against objectives (3.04-FY7: Compares solution)
  1: No evaluation of design solution.
  2: Some factors missed in evaluating design solution.
  3: Compares the design solution against the project objectives and functional specifications, providing qualitative evaluation where appropriate.
  4: Comprehensive evaluation of design solution, with well-defended recommendations for future work or implementation.

Creates report following requirements
  1: Poorly constructed report.
  2: Some organization problems, minor formatting problems, redundancy, spelling/grammar errors.
  3: Report achieves goal using formal tone, properly formatted, concisely written, appropriate use of figures, few spelling/grammar errors.
  4: Professional tone, convincing argument, authoritative, skillful transitions.

Overall Grade: /28

Page 30:

Mapping Indicators to Existing Evaluation (UofT)

1. Ability to define the problem
   • State the problem, its scope and importance
   • Describe the previous work
   • State the objective of the work

2. Ability to identify and credibly communicate engineering knowledge
   • Situate, in document or presentation, the solution or design in the world of existing engineering, taking into account social, environmental, economic and ethical consequences
   • Recognize a credible argument (reading)
   • Construct a credible argument in written or spoken form: to persuasively present evidence in support of a claim
   • Organize written or spoken material: to structure overall elements so that their relationship to a main point and to one another is clear
   • Create “flow” in document or presentation: flow is a logical progression of ideas, sentence to sentence and paragraph to paragraph

Page 31:

Old Evaluation Form (UBC)

Each item is scored on a 0-5 scale:
• Is the parameter/factor being studied important to the overall project success? The team should be able to describe why they are conducting the prototype test and what they hope to find with it. They should be able to explain why this particular prototype test is preferred over a calculation or simulation.
• Has an appropriate prototyping method been selected? Given what the team wants to find, have they selected a good approach? (Does it have sufficient accuracy? Is it reasonably insensitive to other parameters? Is there an obvious better/simpler/more accurate way to run the test?)
• What is the quality of the prototype, the test execution, and the results? Did the team do a good job in building their prototype, running their tests, and analyzing/interpreting the data?
• Are the findings being used appropriately? How does the team plan to incorporate the results of the prototype test into their design? Do they understand the limitations of the data they have collected?

Totals

Page 32:

Evaluation Reformatted as Rubric (UBC)

Levels of mastery: Unacceptable (0), Below Expectations (1), Meets Expectations (2), Exceeds Expectations (3)

2.1 Problem Identification
  0: Team is NOT able to identify the parameter they are using the prototype to study.
  1: Parameter studied is NOT directly relevant to project success.
  2: Parameter studied is appropriate for project, AND the team is able to provide some justification why.
  3: Parameter studied is appropriate for project, AND the team is able to provide strong justification why.

3.2 Investigation Design
  0: Team has NOT built a prototype.
  1: Prototyping method is NOT appropriate for the parameter being studied (i.e. will not yield desired data).
  2: Prototyping method is at least somewhat appropriate for the parameter being studied; a simpler approach MAY exist.
  3: Prototyping method is appropriate for the parameter being studied, AND the team is able to clearly justify why the physical prototype used is superior to other physical or virtual prototypes.

3.3 Data Collection
  0: No data collected; prototype does NOT work.
  1: The prototype works BUT data collection/analysis techniques are inappropriate.
  2: Data collection and analysis are done appropriately AND data quality is fair.
  3: Data collection and analysis are done appropriately AND data is of high quality.

3.4 Data Synthesis
  0: No conclusions are drawn, OR inappropriate conclusions are drawn.
  1: Appropriate conclusions are drawn from the data, BUT the team is NOT able to explain how the data affects the project.
  2: Appropriate conclusions are drawn from the data, AND the team is able to provide some explanation of how the data affects the project. Some implications are overlooked.
  3: Appropriate conclusions are drawn from the data, AND the team is able to provide a strong and complete explanation of how the data affects the project.

3.5 Analysis of Results
  0: The team does NOT consider limitations or errors in the tests, or validity of the conclusions.
  1: The team considers errors, limitations, and validity in the tests, BUT does NOT quantify errors or take appropriate action.
  2: The team quantifies errors, and considers limitations and validity, AND takes action, BUT action is limited or somewhat inappropriate.
  3: The team quantifies errors, and considers limitations and validity, AND is able to justify and take appropriate action.

Page 33:

4. Analyzing and Interpreting the Data

• Timing of data collection and analysis
• Analysis of the data
• Data used to inform the improvement plan

Page 34:

4. Analyzing and Interpreting the Data

What to look for:
• Timing of data collection and analysis is clear and continuous (cyclic)
• Analysis is high quality and addresses the data
• Improvement plan aligns with the analysis and data
• Improvement plan is implemented

Page 35:

5. Data-Informed Curriculum Improvement

• The process of “closing the loop”
• Information collected, analyzed, and used for curriculum improvement

Page 36:

5. Data-Informed Curriculum Improvement

What to look for:
• Integrity of the overall research method:
  – Quality of the research questions
  – Quality of the methodology
• Indicators
• Curriculum mapping
• Data collection process
  – Valid, reliable data collected
  – Analysis of the data is clear and well grounded
• Results used to inform curriculum change

Page 37:

Disaggregating the data to get more information

Investigation: performance histograms for Indicator #1, Indicator #2, and Indicator #3, each showing the share of students at each level: Fails, Below Expectation, Meets Expectation, Exceeds Expectation
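The disaggregation step described on this slide can be sketched in code. This is a minimal illustration, not part of the workshop materials; the records and indicator names are invented, and the four levels follow the scale shown above:

```python
from collections import Counter

# Four-level performance scale from the slide.
LEVELS = ["Fails", "Below Expectation", "Meets Expectation", "Exceeds Expectation"]

# Hypothetical per-student assessment records: (indicator, level achieved).
records = [
    ("Indicator #1", "Meets Expectation"),
    ("Indicator #1", "Exceeds Expectation"),
    ("Indicator #2", "Below Expectation"),
    ("Indicator #2", "Meets Expectation"),
    ("Indicator #3", "Fails"),
    ("Indicator #3", "Meets Expectation"),
]

def disaggregate(records):
    """Group records by indicator and count students at each performance level."""
    histograms = {}
    for indicator, level in records:
        histograms.setdefault(indicator, Counter())[level] += 1
    return histograms

# Print one histogram row per indicator, in the fixed level order.
for indicator, counts in sorted(disaggregate(records).items()):
    print(indicator, [counts.get(level, 0) for level in LEVELS])
```

Keeping the counts per indicator (rather than one aggregate number) is exactly what lets a program see where students are strong or weak.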

Page 38:

Disaggregating the data to get more information

Investigation: performance histograms for Indicator #1, Indicator #2, and Indicator #3, disaggregated by stage: first year, middle year, final year

Page 39:

Why not use grades to assess outcomes?

Student transcript:
  Electric Circuits I: 78
  Electromagnetics I: 56
  Signals and Systems I: 82
  Electronics I: 71
  Electrical Engineering Laboratory: 86
  Engineering Communications: 76
  Engineering Economics: 88
  ...
  Electrical Design Capstone: 86

• How well does the program prepare students to solve open-ended problems?
• Are students prepared to continue learning independently after graduation?
• Do students consider the social and environmental implications of their work?
• What can students do with knowledge (plug-and-chug vs. evaluate)?

Course grades usually aggregate assessment of multiple objectives, and are indirect evidence for some expectations.

Page 40:

Rubrics

Scale (Level of Mastery): Not demonstrated | Marginal | Meets expectations | Exceeds expectations

Dimensions (Indicator):
  Indicator 1: Descriptor 1a | Descriptor 1b | Descriptor 1c | Descriptor 1d
  Indicator 2: Descriptor 2a | Descriptor 2b | Descriptor 2c | Descriptor 2d
  Indicator 3: Descriptor 3a | Descriptor 3b | Descriptor 3c | Descriptor 3d

• Reduces variation between grades (increases reliability)
• Describes clear expectations for both instructor and students (increases validity)
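The generic rubric above maps naturally onto a small data structure: one row of descriptors per indicator, one chosen level per indicator when scoring. This sketch is illustrative only (it reuses the placeholder names from the slide and assumes a 1-4 numeric scoring, as in the /4-per-row sample rubric earlier):

```python
# Scale levels from the slide, in ascending order of mastery.
SCALE = ["Not demonstrated", "Marginal", "Meets expectations", "Exceeds expectations"]

# Rubric: each indicator has one descriptor per scale level.
rubric = {
    "Indicator 1": ["Descriptor 1a", "Descriptor 1b", "Descriptor 1c", "Descriptor 1d"],
    "Indicator 2": ["Descriptor 2a", "Descriptor 2b", "Descriptor 2c", "Descriptor 2d"],
    "Indicator 3": ["Descriptor 3a", "Descriptor 3b", "Descriptor 3c", "Descriptor 3d"],
}

def score(ratings):
    """Convert one chosen level per indicator into a 1-4 score and total them."""
    assert set(ratings) == set(rubric), "expected one rating per indicator"
    return sum(SCALE.index(level) + 1 for level in ratings.values())

total = score({
    "Indicator 1": "Meets expectations",    # 3
    "Indicator 2": "Marginal",              # 2
    "Indicator 3": "Exceeds expectations",  # 4
})
print(total)  # 9 out of a possible 12
```

Because the descriptor chosen fixes the score, two assessors who agree on the descriptor necessarily agree on the mark, which is the reliability benefit the slide points to.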

Page 41:

Histograms for Lifelong Learning (Queen’s)

[Chart: percentage of students (0-60%) at each level (1 - Not Demonstrated, 2 - Marginal, 3 - Meets Expectations, 4 - Outstanding) for FEAS indicators 3.12-FY1, 3.12-FY2, 3.12-FY5, and 3.12-FY6]

3.12-FY1: Uses information effectively, ethically, and legally to accomplish a specific purpose, including clear attribution of information sources.
3.12-FY2: Identifies a specific learning need or knowledge gap.
3.12-FY5: Identifies appropriate technical literature and other information sources to meet a need.
3.12-FY6: Critically evaluates the procured information for authority, currency, and objectivity.

Page 42:

Histogram for Communication (UofT): several assessment points in ECE496

[Chart: percentage of students (0-100%) who meet or exceed performance expectations in each indicator: Define the Problem; Devise and execute a plan to solve the problem; Use critical analysis to reach valid conclusions]
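The statistic plotted here, the percentage of students at or above "meets expectations" per indicator, is simple to compute. This is a minimal sketch with invented scores, assuming the four-point scale used elsewhere in the workshop (3 = meets, 4 = exceeds):

```python
# Hypothetical per-student scores (1-4 scale) for each indicator on the slide.
scores = {
    "Define the Problem": [3, 4, 2, 3, 3],
    "Devise and execute a plan to solve the problem": [2, 3, 4, 4, 1],
    "Use critical analysis to reach valid conclusions": [3, 3, 2, 2, 4],
}

def pct_meets_or_exceeds(values, threshold=3):
    """Percentage of scores at or above the 'meets expectations' level."""
    return 100.0 * sum(v >= threshold for v in values) / len(values)

# One bar per indicator, as in the histogram.
for indicator, values in scores.items():
    print(f"{indicator}: {pct_meets_or_exceeds(values):.0f}%")
```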

Page 43:

Histogram for Communication (UofT)

[Chart: percentage of students (0-100%) who meet or exceed performance expectations in each indicator: Define the Problem; Devise and execute a plan to solve the problem; Use critical analysis to reach valid conclusions]

Page 44:

Histograms / Summary for Design (UBC)

Attribute 4: Design
An ability to design solutions for complex, open-ended engineering problems and to design systems, components or processes that meet specified needs with appropriate attention to health and safety risks, applicable standards, and economic, environmental, cultural and societal considerations.

Overall: Below Expectations 6%, Meets Expectations 75%, Exceeds Expectations 19%

[Charts: percentage of students (0-100%) at each level, by discipline (BE, ME, EE), shown for 1st Year, 2nd Year, 3rd Year, 4th Year, and Overall]

Indicator summary, with courses and elements assessed:

4.4 Solution Generation: Produce a variety of potential design solutions suited to meet functional specifications
  Assessed in MECH 223 (Oral presentation 1 & 2; Concept selection report) and MECH 45X (Formal report 1 & 2)

4.5 Solution Evaluation: Perform systematic evaluations of the degree to which several design concept options meet project criteria
  Assessed in MECH 223 (Oral presentation 1 & 2; Concept selection report) and MECH 45X (Formal report 1 & 2)

4.6 Detailed Design: Apply appropriate engineering knowledge, judgement, and tools, in creating and analyzing design solutions
  Assessed in MECH 223 (Assignments 1-5), MECH 325 (Preliminary design report), and MECH 45X (Formal report 1 & 2)