
Report on the Development of the

Graduate Attributes Process

Faculty of Applied Science and Engineering

University of Toronto

August 2012


Table of Contents

Faculty Report ............................................................. 3
  1. Introduction .......................................................... 3
  2. Methodology ........................................................... 3
  3. Outcomes and Indicators ............................................... 5
  4. Curriculum Mapping .................................................... 7
  5. Development of Rubrics and Support for Assessment ..................... 8
  6. Implementation of Assessment Plan: Data Collection to Date ............ 9
  7. Plan for Continuous Curriculum Improvement ............................ 9
  8. Conclusion ........................................................... 11
Appendix A: Global Outcomes and Indicators ................................ 12
Appendix B: Example Rubrics ............................................... 22
Appendix C: Program Reports ............................................... 36
  Appendix C: Engineering Science Report .................................. 37
  Appendix C: Core 8 First Year Report ................................... 129
  Appendix C: Chemical Engineering Report ................................ 169
  Appendix C: Civil Engineering Report ................................... 182
  Appendix C: Electrical and Computer Engineering Report ................. 256
  Appendix C: Materials Engineering Report ............................... 287
  Appendix C: Mechanical and Industrial Engineering Report ............... 329
  Appendix C: Mineral Engineering Report ................................. 458


Faculty Report

1. Introduction to the Faculty process

In 2010 the Faculty created a Graduate Attributes Committee (GAC), with representation from our 9 programs and from other units that support student learning directly, such as the Engineering Communication Program (ECP). The GAC was tasked with managing the development of the continuous curriculum improvement process for the Faculty. It determined which aspects of the process would be consistent across all 9 programs (Faculty-wide) and which would be handled at the departmental level. The GAC has met frequently over the last two years and has made substantial progress toward a functional continuous curriculum improvement process.

The members of the GAC have been responsible for developing many of the supporting materials in this report, such as indicators and rubrics. However, the GAC has also frequently invited other faculty to participate in the process. For example, an early draft of the indicators was vetted by an expert on assessment from the Ontario Institute for Studies in Education (OISE) at the University of Toronto. This feedback helped us refine our indicators. We also drew on engineering faculty with special interests and areas of expertise, including members of the Engineering Communication Program, who devised the first version of our indicators for the Communication attribute. In total, more than 20 faculty have participated actively in the graduate attribute process to date, and we anticipate that this number will grow as the improvement process is implemented.

This participation rate has added substantially to the resources available to successfully implement this system. The GAC members, and other faculty and staff who worked with the GAC, have developed a level of understanding that will allow them to successfully translate the Faculty materials we have developed into effective, useful instruments for assessing learning outcomes and interpreting the resulting data. The programs will report their outcomes, interpretations of data, and analyses to the GAC during the implementation phase of the project. This exchange of information, and vetting of the results, will simultaneously serve to keep all programs moving forward on implementation and promote a high level of quality.

2. Methodology

The methodology used at the University of Toronto follows the process suggested by the Engineering Graduate Attributes Development (EGAD) group, an advisory group to the National Council of the Deans of Engineering and Applied Science. The five-step process involves:

1. Program evaluation
2. Curriculum mapping
3. Collecting data on student learning
4. Analyzing and interpreting the data
5. Data-informed curriculum improvement

Step 1: The Graduate Attributes Committee (GAC) has spent most of its effort over the last two years on Program Evaluation. This work involved the development of a set of global learning outcomes that are common across the Faculty and address all 12 of the graduate attributes. The global outcomes define the competencies that we believe all University of Toronto engineering students should achieve by graduation in the 12 dimensions of the attributes. However, the global outcomes represent fairly high-level goals that are not directly measurable. To assess the global outcomes, the GAC has developed a set of measurable indicators. This process, and its results, are discussed in more detail in section 3.

Step 2: Concurrent with the GAC work, each program was tasked with developing a set of curriculum maps (see section 4). These curriculum maps illustrate: (i) where material associated with the attributes is taught; (ii) where students have an opportunity to practice skills associated with the attributes; and (iii) where students are assessed in these dimensions and demonstrate their competency. The curriculum maps are unique to each program. They show the direct formal assessment that we propose to use to collect data on student learning outcomes. They also suggest indirect measures that will be used to round out the data set.

Step 3: We have taken a two-pronged approach to the data collection process. At the Faculty level, the GAC has developed rubrics for the indicators. The rubrics developed by the GAC act as a starting point, or exemplar set, for instructors who will be collecting data for the graduate attribute process. Simultaneously, at the program level, data is being collected in a select set of courses. This pilot data collection project is allowing instructors to anticipate the adjustments they will need to make to their assessment methods to align the assessment with the outcomes they want to measure. Through this process, we have learned that many of the rubrics and other assessment tools we have been using require some modification to improve the validity of the data collection methodology. This step is discussed in more detail in sections 5 and 6.

Step 4: We have just begun to analyze the data we have collected. To date, the data collected is very sparse and primarily concentrated on the major design components in the curricula. At this stage in the implementation of the graduate attribute process, the data collection is teaching us more about our collection methodology than producing meaningful results. We can draw few conclusions from the limited data we have collected. However, the process of collecting the preliminary data is helping us immensely to further the development of a robust, valid and reliable data collection plan. The data collected to date, and a discussion of the results, are given in section 6 and the appendices.


Step 5: We have not yet progressed to the point where we can substantially begin data-informed curriculum improvement. However, we have developed a plan for continuous curriculum improvement, which will be discussed in section 7. Our plan will be implemented in 2012/13 and will result in data that can be used for curriculum reform. Following the analysis of the data, the first significant round of data-informed curriculum improvement will occur in 2013/14.

3. Outcomes and Indicators

The Graduate Attributes Committee (GAC) took a three-step approach to the development of the indicators that will be used as the foundation for the continuous curriculum improvement process. In this explanation of the development of our indicators, we make reference to “assessment points”. These are points in the students’ development where we are assessing their proficiency in relation to a particular indicator. The assessment measures the quality of a performance (e.g. oral presentation) or artifact (e.g. written response to an exam question) as a demonstration of learning. The assessment point could be direct, such as an exam or lab report, or indirect, such as a survey or annual data on academic offense cases.

As a first step, the GAC developed global outcomes based on each attribute. The global outcomes1 represent our learning priorities for our students, and define broadly the competencies we want our students to achieve by graduation. The global outcomes developed by the GAC are common across all 9 of the programs in the Faculty of Applied Science and Engineering at the University of Toronto. The global outcomes associated with each attribute are shown in Appendix A. We have developed between 2 and 4 essential global outcomes for each attribute. This system should allow us to identify areas of strength and weakness across the attributes and within each attribute (to inform our curriculum evolution). At the same time, we should not assess so many aspects of each attribute that we create an unsustainably onerous data collection system. The global outcomes are important because they create a shared understanding of the competencies we want our students to achieve. Any program is free to add outcomes to this basic list if it wishes. However, global outcomes are not specific enough to be measured directly. They serve as an intermediate step to the creation of measurable indicators.

In the second step, the GAC developed indicators based on the global outcomes that we had defined. Many educational outcomes, i.e. indicators, could be associated with each of the global outcomes. To create a meaningful and sustainable data collection methodology, we used the concept of leading indicators2, which serve to identify key performance indicators of learning for each of the global outcomes. The GAC developed these indicators at the Faculty level, without reference to a specific program or point of assessment within a given program. The indicators at this level are, therefore, necessarily somewhat general. The GAC also identified more indicators than any one program could, or should, use. The list is not exhaustive, but each indicator is central to the global outcome it purports to measure. Providing a range of indicators allows each of the 9 programs to select a subset that best fits with their current courses, extra- or co-curricular activities, and assessment methods.

In the third step, each program needs to select a subset of indicators to use when assessing the global outcomes. Programs are generally choosing from the Faculty list of indicators and adapting these to fit the specific type of performance they are assessing. For example, a program might pick "Describe the causes of a problem and its effects" from the Faculty list of indicators for Problem Analysis. Suppose they want to apply this indicator to assess student performance on a report on traffic congestion in Toronto, written in a third-year Civil Engineering course. The program may choose to modify the indicator to read "Describes the causes of traffic congestion in Toronto and its effects," to be included as a criterion on their marking rubric. The results from this assessment will indicate the cohort's "demonstrated ability to identify and characterize an engineering problem," which is the global outcome associated with this indicator. This data, combined with results from other assessment information in this global outcome category, will produce a clear picture of the achievement level of the cohort on this outcome.

A different program may choose "Classify a given problem, and the type of solution sought" from the same global outcome indicator list. They may, for example, modify this indicator to conform to a final exam question, e.g. "Part a) Given the stated problem, classify this problem as underdetermined, overdetermined or indeterminate, and identify the type of solution sought." The performance results from this particular question would be collected to indicate the class's "demonstrated ability to identify and characterize an engineering problem." Although this indicator is different from the example in the Civil course, the data indicates the level of proficiency of the students on the same global outcome. Essentially, these are two different ways to measure performance quality on the same outcome. This system allows us to use a wide variety of different assessment points, yet produce data that can be compared.

To create a reliable data set, each program needs to select a broad set of indicators. The chosen set should cover every outcome, and the indicators need to be applied at several assessment points (pre-existing or new) in the program; in CEAB terminology, this is referred to as 'triangulation'. If none of the indicators developed by the Faculty GAC suits the assessment point a program has chosen, the program is free to develop another indicator that aligns well with the global outcome. However, this new indicator will be brought to the GAC for review to confirm that it is, in fact, a leading indicator. Not every outcome can be assessed at every assessment point. For example, it is not reasonable to assess every outcome based on a final design report in a capstone design course. Therefore, it is important to consider whether the indicator is well aligned with the outcome, and also whether the indicator is well aligned with the assessment (performance or artifact). This review process by the GAC not only ensures the integrity of the indicator list in the Faculty, but also makes the new indicator available to other programs, in the event that it fits an assessment point they would like to use for gathering data.

1 The term "global outcomes" as used here is synonymous with "global objectives" as defined by Anderson, Krathwohl et al. in A Taxonomy for Learning, Teaching, and Assessing, 2001. They state: "Global objectives are complex, multifaceted learning outcomes that require substantial time and instruction to accomplish. They are broadly stated and encompass a large number of more specific objectives." (p. 15)

2 The term "leading indicator" is most often associated with economic assessment. Gloria Rogers, an expert in educational assessment, has suggested that the concept of leading indicators is a useful way of understanding the role of performance indicators in outcomes-based accreditation processes. See G. Rogers, blog entry posted May 29, 2010: http://programassessment.blogspot.com/2010/05/what-is-performance-indicator-anyway.html
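The relationship among global outcomes, indicators, and assessment points described above can be sketched as a small data model. This is an illustrative sketch only; the assessment-point names, indicator wording, numeric levels, and the simple averaging roll-up are hypothetical, not the Faculty's actual tooling:

```python
# Illustrative data model: each assessment result is tied to an indicator,
# and each indicator rolls up to one global outcome. The records below are
# hypothetical examples, not real Faculty data.
from collections import defaultdict

# (global_outcome, indicator, assessment_point, cohort_mean_level)
# Levels: 1 = fails, 2 = below, 3 = meets, 4 = exceeds expectations.
records = [
    ("2.A", "Describe the causes of a problem and its effects",
     "third-year report on traffic congestion", 3.1),
    ("2.A", "Classify a given problem, and the type of solution sought",
     "final exam question, part a", 2.8),
]

def outcome_profile(records):
    """Average cohort performance per global outcome, across all the
    indicators and assessment points that feed that outcome."""
    by_outcome = defaultdict(list)
    for outcome, _indicator, _point, level in records:
        by_outcome[outcome].append(level)
    return {o: sum(levels) / len(levels) for o, levels in by_outcome.items()}
```

In this sketch, two different indicators from two different assessment points both feed global outcome 2.A, so their results can be combined into one picture of the cohort's achievement on that outcome, as the two examples above describe.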

4. Curriculum mapping

Curriculum mapping is a crucial step in a continuous curriculum improvement process. Each program must identify how they support student competency development, from the time the student enters the program to the point when the student is eligible to graduate. In particular, we needed to map the curriculum in the 12 dimensions that represent the graduate attributes so we can identify assessment points, while also identifying gaps, overlaps, and points of weakness in the curriculum for improvement purposes.

The curriculum maps developed by each of the 9 programs are given in Appendix C. The programs took a variety of different approaches to curriculum mapping. Several of the programs took an Introduce, Teach, Utilize (ITU) approach to mapping their curricula3. In addition, points of assessment (A) were noted and described. Some programs also chose to identify courses that emphasize (E) a particular attribute; e.g. a course with a significant design experience, or where communication content is explicitly taught, would be marked "E" in those attributes. Other programs simply identified courses where the graduate attributes are developed, and described the type of teaching and assessment used without a formal ITU approach. The specifics of the curriculum mapping process for each program are explained in the Appendix C reports. In every case, the approach was carried out at the global outcomes level for each attribute so the programs can identify (i) where the specific global outcomes are taught, and (ii) how and where they are assessed. This process allows us to develop an assessment plan for the continuous curriculum improvement process.

The assessment plan involves selecting from the curriculum map a set of existing assessment points to target for data collection, and adding any new or indirect assessments necessary to generate a reliable data set for evaluating cohort performance on each global outcome. In many cases, the direct measures will be part of an assignment grade or other type of assessment. For example, a measure could be the mark assigned to one targeted question on an exam. Alternatively, it could constitute one row in a rubric that assesses a specific criterion (indicator) among several criteria, which may also cut across the different attributes being assessed on a report (e.g. various aspects of communication, critical thinking, problem analysis, investigation, etc.). Choosing the points for data collection selectively helps us keep the data collection process manageable. It also helps us create a consistent monitoring system, like placing weather stations across the curriculum map, allowing us to ensure that the assessment, year over year, continues to provide data on a particular outcome.

3 Bankel, J. et al., "Benchmarking Engineering Curricula with the CDIO Syllabus", Int. J. Engng Ed., Vol. 21, No. 1, pp. 121-133, 2005.

The assessment points are carefully chosen to target the indicator. We have generally avoided using composite marks, such as the score on a whole test or the summative mark in a course, to ensure that the data is aligned with, and specific to, the indicator we are purporting to measure. However, in some instances a composite mark is well aligned with an indicator. In particular, the knowledge base attribute lends itself to using a combination of composite marks and partial marks. For example, we can generally assess the strength of our students in basic statics by looking at the average and grade distribution in the statics course and comparing these to how our students perform in first-year calculus. This data may help us decide whether our students are meeting our expectations more readily in physics than in math, and inform where we focus our attention to improve the curriculum or teaching in the future. To determine which areas of math our students struggle with most, we need to look at their performance in more specific detail using partial marks (e.g. a specified set of test questions in a particular area of mathematics). This information allows us to sharpen our curriculum improvements to address more specific weaknesses within the subject field.

5. Development of rubrics and support for assessment

Assessment of a performance or learning artifact can take many forms. One method used extensively in outcomes-based assessment is rubrics. Ideally, instructors use consistent rubrics across the program so data can be compared across assessment points. However, it is also necessary for rubrics to be customized, to some degree, to align with the specific assessment point where they are being used. To support both of these goals, the Faculty Graduate Attribute Committee (GAC) developed generic rubrics for each of the Faculty indicators (see Appendix B for examples). The rubric for an indicator comprises one row of a table, and describes the characteristics of the performance from a low level of quality to a high level of quality. The levels we chose for the rubrics are: fails, below expectations, meets expectations, and exceeds expectations. An instructor can create a customized rubric table by (i) identifying the indicators that are assessed on a particular assignment (possibly from a variety of graduate attributes), (ii) selecting the rubrics that pertain to those indicators, and (iii) compiling the individual indicator rubrics into a complete table4 for the assignment. The instructor can then add to, or adapt, the rubric as needed to align it with the assignment.

4 This would usually take the form of a table, often referred to as an “analytic rubric” where each row represents an individual criterion (i.e. indicator) that aligns to the assignment.
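Steps (i) to (iii) above can be sketched in code. This is a minimal illustration of compiling an analytic rubric table from generic per-indicator rubric rows; the indicator IDs, descriptor wording, and adaptation mechanism below are hypothetical, not the Faculty's actual materials:

```python
# Hypothetical generic rubrics: one row (criterion + four quality-level
# descriptors) per indicator. Wording is abbreviated and invented.
LEVELS = ("fails", "below expectations", "meets expectations",
          "exceeds expectations")

GENERIC_RUBRICS = {
    "2.A.1": {
        "criterion": "Describe the causes of a problem and its effects",
        "descriptors": {
            "fails": "causes and effects absent or incorrect",
            "below expectations": "causes or effects incomplete",
            "meets expectations": "causes and effects correctly described",
            "exceeds expectations": "causes and effects analyzed in depth",
        },
    },
    "7.A.1": {
        "criterion": "Identify and credibly communicate engineering knowledge",
        "descriptors": {level: "..." for level in LEVELS},
    },
}

def compile_rubric(indicator_ids, adapted_criteria=None):
    """(i) take the indicators assessed on the assignment, (ii) pull their
    generic rubric rows, and (iii) compile them into one analytic rubric
    table, optionally adapting criterion wording to the assignment."""
    adapted_criteria = adapted_criteria or {}
    return [
        {
            "indicator": iid,
            "criterion": adapted_criteria.get(
                iid, GENERIC_RUBRICS[iid]["criterion"]),
            "descriptors": dict(GENERIC_RUBRICS[iid]["descriptors"]),
        }
        for iid in indicator_ids
    ]
```

For the traffic-congestion report example from section 3, an instructor might call compile_rubric(["2.A.1", "7.A.1"], {"2.A.1": "Describes the causes of traffic congestion in Toronto and its effects"}) to obtain a two-row analytic rubric with the first criterion adapted to the assignment.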


The generic rubrics set a target level of performance that is common across the Faculty. The rubrics describe the quality of performance that we expect from every student graduating in any Engineering program. Because these standards are Faculty-wide (and defined for every indicator), the rubrics create a shared understanding of the level of learning and quality of work we expect our students to demonstrate. If an instructor decides to use a numerical score (e.g. a score on an exam question) to indicate the quality of performance, this score can be related back to a qualitative description in one of the rubrics so the data can be compared to data coming from other assessment points. We believe this will facilitate the comparison of data coming from different assessment points and different programs. As a result, data sets can be mined for information more effectively to answer questions about the strengths and weaknesses in our curricula.

The development of the Faculty rubrics is still in progress. A first draft for many of the rubrics has been developed (see Appendix B for examples), and the rubrics are being piloted this year. There will continue to be further work on, and refinement of, the rubrics as the Faculty moves toward a steady and continuous curriculum improvement process.

6. Implementation of assessment plan: Data collection to date

To date, every program has started to collect information at some assessment points in the program. During this pilot phase, the programs are not expected to fully implement indicator assessment at every point in the curriculum. The purpose of the pilot is for programs to gain experience with this new system, and for instructors to begin examining their teaching from an outcomes perspective. The results of the pilot data collection can be found in the individual program reports appended to this report (Appendix C).

7. Plan for continuous curriculum improvement

The Faculty is planning to begin a systematic cycle of outcomes-based continuous curriculum improvement starting in 2012/13. To date, the programs have been working on the assessment of outcomes in select courses to gain experience with this new approach; there has not yet been a globally coordinated cycle of data collection. Starting in 2012/13, the programs will follow a Faculty plan that will ensure we collect sufficient data, in a manner that allows us to answer significant questions about the quality of our curricula.

We have decided to follow a plan that is typical for longitudinal studies in engineering education research. In studies of this nature, a cohort is followed through their development. This allows the researcher to observe both the development of the cohort as a whole and the progress of individuals. Such a study enables a wide range of research questions to be addressed, e.g.:


• What are the most significant learning experiences in the program that contribute to competency development in a particular area?

• Which competencies are/are not effectively developed through the undergraduate experience?

• What are the characteristics of the developmental pathways followed by a particular group of students (e.g. students from socio-economically disadvantaged backgrounds, or students who struggle academically in year 1)?

• How do student attitudes change during their development through the undergraduate program?

• How do professional identity and practice emerge during the undergraduate years?

In our plan, each program will identify where in their program they will collect data on the global outcomes defined by the Faculty, i.e. create an effective curriculum map and assessment plan with identified assessment points and methodology. Programs are expected to have assessment plans that cover all global outcomes at least once near graduation (year 3 or 4), and also to collect data on the global outcomes earlier in the program, to create a clear picture of competency development. Starting in 2012/13, each program will collect data from its identified assessment points in years 1 and 4 of the program. Then, in 2013/14, it will collect data at the assessment points in year 2 of the program, and in 2014/15 collection will occur in year 3 (see table below). This will produce a data record of the 2012 first-year cohort, starting from first year and following them through graduation in 2016. Similarly, we will follow the cohort that starts in 2015 through their graduation in 2019, and so on. We plan to analyze the data as it is collected so that we can begin rolling evidence-based improvements into our curriculum starting in 2013/14. With the longitudinal approach, we will also use the full cycle of data every three years to investigate our curricula in more depth. For example, our first opportunity for this kind of comprehensive examination will be in 2016/17, when we will have four years of data on the class of 2016, giving us an opportunity to explore the development of this student cohort in more depth.

Table 1. Cyclic data collection plan (program year where data collection is occurring)

Academic year   Year 1   Year 2   Year 3   Year 4
2012/13           X                          X
2013/14                    X
2014/15                             X
2015/16           X                          X
2016/17                    X
2017/18                             X
2018/19           X                          X
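The three-year cycle in Table 1 can be expressed as a short function. This is a sketch of the schedule described in the plan (the function name and parameters are illustrative): the cycle begins in 2012/13 with program years 1 and 4 assessed together, followed by year 2, then year 3, repeating every three academic years.

```python
def collection_years(start_of_academic_year, cycle_start=2012):
    """Return the program years assessed in the academic year that begins
    in `start_of_academic_year` (e.g. 2012 for 2012/13), following the
    three-year cycle: years 1 and 4 together, then year 2, then year 3."""
    phase = (start_of_academic_year - cycle_start) % 3
    return {0: [1, 4], 1: [2], 2: [3]}[phase]
```

This reproduces Table 1: collection_years(2012) gives [1, 4], collection_years(2013) gives [2], and collection_years(2018) again gives [1, 4], matching the 2018/19 row.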

8. Conclusion

The Faculty has been actively engaged in the development of a robust, outcomes-based, continuous curriculum improvement process since 2010. Our approach has been to work, both at the Faculty level and at the program level, toward a cyclic process that will produce meaningful information on which to base curriculum change. This effort has engaged a wide range of faculty and has included student consultation. The result is a framework that sets outcome goals and expectations in the form of global outcomes, indicators and rubrics. At the program level, every program has made significant progress toward mapping its curriculum and identifying assessment points that can be used in this process. The programs have piloted the system in select courses this year, and developed tools (e.g. curriculum mapping tools) to help them with the process. It will take considerable sustained effort to turn this plan into a fully functioning, steady-state data collection process. However, we are confident that we have developed the capacity, and the framework, to fully operationalize this system.


Appendix A. Global outcomes and indicators

The list below is organized as follows:

  3.1.x   Graduate Attribute
  x.A     Global Outcome
  x.A.i   Indicators

The Faculty Graduate Attribute Committee, in consultation with a wide range of stakeholders and contributors, developed a list of global outcomes that describe the University of Toronto engineering graduate in the 12 dimensions of the graduate attributes. Each of the global outcomes is further characterized by a set of measurable indicators. The list of global outcomes and indicators was presented to the Faculty Council in spring 2012 and was approved. The global outcomes have also been represented graphically (see Figure 1 below). This diagram visually organizes the outcomes. The outcomes, indicators and diagram will continue to evolve as we further develop the process.


1. Knowledge base for engineering. Demonstrate competence in:
1.A. mathematics and modeling.
1.B. natural sciences and engineering fundamentals.
1.C. specialized engineering knowledge appropriate to the program.

2. Problem analysis. Demonstrate the ability to:
2.A. identify and characterize an engineering problem.
2.B. formulate a solution plan (methodology) for an engineering problem.
2.C. formulate and interpret a model.
2.D. execute a solution process for an engineering problem.

3. Investigation. Demonstrate the ability to:
3.A. define a problem.
3.B. devise and execute a plan to solve a problem.
3.C. use critical analysis to reach valid conclusions supported by the results of the plan.

4. Design. Demonstrate the ability to:
4.A. frame a complex, open-ended problem in engineering terms.
4.B. generate a diverse set of candidate engineering design solutions.
4.C. select candidate engineering design solutions for further development.
4.D. advance an engineering design to a defined end state.

5. Use of engineering tools. Demonstrate the ability to:
5.A. use fundamental modern techniques, resources and engineering tools.
5.B. use discipline-specific techniques, resources and engineering tools.
5.C. show recognition of the limitations of the tools used.

6. Individual and team work. Demonstrate the ability to:
6.A. establish and monitor team organizational structure.
6.B. promote team effectiveness through individual action.
6.C. be successful in a team-based project.

7. Communication skills. Demonstrate the ability to:
7.A. identify and credibly communicate engineering knowledge.
7.B. use different modes of communication.
7.C. develop communication through an iterative process.

8. Professionalism. Demonstrate the ability to:
8.A. describe engineering roles in a broader context, e.g. as pertains to the environment, health, safety, and public welfare.
8.B. recognize the impacts of engineering within a global society (the broader public interest).
8.C. behave in a professional manner.

9. Impact of engineering on society and the environment. Demonstrate:
9.A. understanding of the relationships among technology and the social, cultural, economic and environmental conditions of society, locally and globally, in both the short and long term.
9.B. the ability to identify and choose alternative ways to mitigate or prevent adverse social, environmental, human health and safety impacts.
9.C. awareness of legal issues relevant to an engineering activity.

10. Ethics and equity. Demonstrate the ability to:
10.A. recognize ethical and equity-based dilemmas.
10.B. apply the Code of Ethics and equity principles.
10.C. act ethically and demonstrate individual accountability.

11. Economics and project management. Demonstrate the ability to:
11.A. estimate the life-cycle economic and financial costs and benefits for relevant engineering activities.
11.B. evaluate the economic and financial performance of an engineering activity and compare alternative proposals on the basis of these measures.
11.C. read and understand financial statements for engineering activities.
11.D. plan and manage engineering activities to be within time and budget constraints.

12. Life-long learning. Demonstrate the ability to:
12.A. independently summarize, analyze, synthesize and evaluate information from a wide variety of sources.
12.B. develop a strategy to identify and address gaps in knowledge.

[Diagram group labels: Interpersonal Ability; Standards of Practice; Perspectives & Management; UofT Engineering]

Figure 1. Graphical representation of the global outcomes for the Faculty of Applied Science and Engineering at the University of Toronto, created to help faculty, students, and staff conceptualize this information.


3.1.1. A knowledge base for engineering

1.A. Demonstrate competence in mathematics and modeling.
1.A.i Recognize functional relationships between independent and dependent variables.
1.A.ii Describe the physical meaning of functions, derivatives of functions, and integrals of functions.
1.A.iii Demonstrate the ability to construct a mathematical model to describe a physical system.
1.A.iv Demonstrate the ability to obtain valid solutions to a model, including the application of integral and differential calculus and linear algebra.

1.B. Demonstrate competence in natural sciences and engineering fundamentals. These are the fundamental scientific and physical principles essential to engineering; ideally they should include conservation laws, Newtonian mechanics, fundamentals of electricity and magnetism, thermodynamics, the atomic structure of matter, and chemical interactions.

1.B.i Identify fundamental scientific and engineering principles that govern the performance of a given process or system.

1.B.ii Accurately apply fundamental natural science and engineering science knowledge and principles.

1.C. Demonstrate competence in specialized engineering knowledge appropriate to the program. Each program lists the areas of scientific and engineering knowledge relevant to the student's discipline (e.g. for ECE: photonics and semiconductor physics; electromagnetics and energy systems; analog and digital electronics; control, communications and signal processing; computer hardware and computer networks; software).

1.C.i Identify scientific and engineering principles that govern the performance of a given process or system.

1.C.ii Accurately apply the natural and engineering principles relevant to the student’s discipline.

3.1.2. Problem analysis

2.A. Demonstrate the ability to identify and characterize an engineering problem.
2.A.i Recall the types of problems that engineers encounter: simple or complex; determinate or indeterminate; open-ended or closed-ended; the degree of accuracy required in the solution.
2.A.ii Describe the causes of the problem and its effects.
2.A.iii Classify a given problem, and the type of solution sought.
2.A.iv Recognize the mathematical, engineering and other relevant knowledge that applies to a given problem.

2.B. Demonstrate the ability to formulate a solution plan (methodology) for an engineering problem.
2.B.i Determine primary objectives and key constraints (think primary stakeholders).
2.B.ii Reframe complex problems into interconnected sub-problems.
2.B.iii Identify existing solution processes that can be applied to solve a problem.


2.B.iv Recognize missing information: information that needs to be gathered, or assumptions that need to be made.

2.B.v Compare and contrast alternative solution processes to select the best process.
2.B.vi Plan a systematic solution process.
2.B.vii Determine evaluation criteria for assessing alternative solution plans to open-ended problems.

2.C. Demonstrate the ability to formulate and interpret a model.
2.C.i Choose a model (mathematical or otherwise) of a system or process that is appropriate in terms of applicability and required accuracy.
2.C.ii Identify assumptions (mathematical and physical) necessary to allow modeling of a system at the level of accuracy required.
2.C.iii Formulate the model in engineering terms.
2.C.iv Interpret modeling results of processes or systems using scientific and engineering principles.

2.D. Demonstrate the ability to execute a solution process for an engineering problem.
2.D.i Implement solutions to simple problems.
2.D.ii Evaluate alternative solutions to open-ended problems and select the final solution.
2.D.iii Identify sources of error in the solution process, and limitations of the solution.
2.D.iv Validate the solution to a complex engineering problem.

3.1.3. Investigation

3.A. Demonstrate the ability to define a problem.
Lower level (appropriate for years 1 and 2):
3.A.i Describe how to define a problem for purposes of investigation.
3.A.ii Explain how to find previous work.
Graduating level:
3.A.iii State the problem, its scope and importance.
3.A.iv Describe the previous work.
3.A.v State the objective of the work.

3.B. Demonstrate the ability to devise and execute a plan to solve a problem.
Lower level (appropriate for years 1 and 2):
3.B.i Describe standard tests (experiments) and methods for information collection.
Graduating level:
3.B.ii Select a set of tests to be conducted.
3.B.iii Select, plan and apply the methods for collecting the information.
3.B.iv Identify limitations of the tests and methods used and their impact on the results.

3.C. Demonstrate the ability to use critical analysis to reach valid conclusions supported by the results of the plan.
3.C.i Analyze the results.
3.C.ii Formulate the conclusions.


3.C.iii Validate conclusions by induction or deduction.
3.C.iv Compare conclusions with previous work.
3.C.v Characterize the limitations and implications of the conclusions.

3.1.4. Design

4.A. Demonstrate the ability to frame a complex, open-ended problem in engineering terms.
4.A.i Elicit and document engineering requirements from stakeholders.
4.A.ii Synthesize engineering requirements from a review of the state of the art.
4.A.iii Extract engineering requirements from relevant engineering codes and standards.
4.A.iv Explore and synthesize engineering requirements from larger social and professional concerns.

4.B. Demonstrate the ability to generate a diverse set of candidate engineering design solutions.
4.B.i Apply formal idea generation tools to develop a diverse set of candidate engineering design solutions.
4.B.ii Adapt reference designs to generate a diverse set of candidate engineering design solutions.
4.B.iii Use models, prototypes, etc., to generate a diverse set of candidate engineering design solutions.

4.C. Demonstrate the ability to select candidate engineering design solutions for further development.
4.C.i Apply formal multi-criteria decision-making tools to select candidate engineering design solutions for further development.
4.C.ii Use the results of experiments and analysis to select candidate engineering design solutions for further development.
4.C.iii Consult with domain experts and stakeholders to select candidate engineering design solutions for further development.

4.D. Demonstrate the ability to advance an engineering design to a defined end state.
4.D.i Refine a conceptual design into a detailed design.
4.D.ii Implement, or provide a plan to implement, a conceptual or detailed design.
4.D.iii Redevelop or iterate a conceptual design.

3.1.5. Use of engineering tools

5.A. Demonstrate the ability to use fundamental modern techniques, resources and engineering tools.
5.A.i Demonstrate the ability to apply fundamental engineering tools, techniques and resources in engineering activities for the purpose of:
o Acquiring information: use of library and internet references
o Modeling and simulating systems: use of simulation software, prototype/simplified physical models


o Monitoring system performance: measuring instruments, monitoring software
o Creating engineering designs: use of CAD tools

5.B. Demonstrate the ability to use discipline-specific techniques, resources and engineering tools. (Departments fill in here a short list of tools that are important and specific to the discipline.)
5.B.i Demonstrate the ability to use engineering tools, techniques and resources specific to the student's discipline.

5.C. Show recognition of the limitations of the tools used.
5.C.i Recognize the assumptions and simplifications used in a model or a simulation and their impact on the results.
5.C.ii Recognize the limitations of the capabilities of the tools used.
5.C.iii Recognize accuracy and sources of error in measurements, modeling or simulations.

3.1.6. Individual and team work

6.A. Demonstrate the ability to establish and monitor team organizational structure.
6.A.i Establish and use norms of practice (e.g. rules, roles, charters, agendas, etc.).
6.A.ii Re-assess and refine the team's norms of practice during the course of a project.
6.A.iii Assess individual contributions to a team activity and provide feedback.

6.B. Demonstrate the ability to promote team effectiveness through individual action.
6.B.i Apply formal models of teams and individuals (e.g. psychometrics, team role models, etc.) to adapt individual actions to team norms.
6.B.ii Demonstrate effective communication within the team.
6.B.iii Demonstrate trust and accountability within the team.

6.C. Demonstrate success in a team-based project.
6.C.i Complete a successful project. Measurement methods could include peer assessment, attribution tables, reflection, and observation by an instructor, among others.

6.C.ii Present results as a team, with smooth integration of contributions from all individual efforts.

3.1.7. Communication skills

7.A. Demonstrate the ability to identify and credibly communicate engineering knowledge.
7.A.i Recognize and explain the context of a particular engineering design or solution in relation to past and current work, as well as future implications.
7.A.ii Recognize credible evidence in support of claims, whether the evidence is presented in written, oral or visual form (reading).


7.A.iii Formulate, in written, visual and/or spoken form, credible and persuasive support for a claim.

7.A.iv Organize written or spoken material to structure overall elements so that their relationship to a main point and to one another is clear.

7.A.v Create "flow" in a document or presentation: a logical progression of ideas, sentence to sentence and paragraph to paragraph.

7.B. Demonstrate the ability to use different modes of communication.
7.B.i Relate ideas in a multi-modal manner: visually, textually and/or orally.
7.B.ii Incorporate various media effectively in a presentation or written document.
7.B.iii Tailor the mode of presentation to the situation, whether a formal talk, a technical report, a poster presentation, etc., and to accommodate different learning styles.

7.C. Demonstrate the ability to develop communication through an iterative process.
7.C.i Use iteration to clarify and amplify understanding of the issues being communicated.
7.C.ii Use reflection to determine and guide self-development.

3.1.8. Professionalism

8.A. Demonstrate the ability to describe engineering roles in a broader context, e.g. as pertains to the environment, health, safety, and public welfare.
8.A.i Identify and describe various engineering roles, particularly as pertains to protection of the public and the public interest.
8.A.ii Explain the basic concepts of risk management (hazard vs. risk; identification, assessment, mitigation, tolerance, etc.).
8.A.iii Consider the entire life cycle within such an explanation, including risk to the public and the environment.

8.B. Demonstrate the ability to recognize the impacts of engineering within a global society (the broader public interest).
8.B.i Recognize the limitations of regulations, codes and standards, and the limits of a life-cycle analysis, when engineering in a global context.
8.B.ii Identify options, then select and defend a preferred option as part of an integrated design experience.
8.B.iii Recognize stakeholders and their interests and objectives.

8.C. Demonstrate the ability to behave in a professional manner.
8.C.i Demonstrate RAGAGEP (Recognized And Generally Accepted Good Engineering Practice) appropriate to the discipline, including regulations, standards, guidelines, and quality of written work.
8.C.ii Demonstrate professional etiquette and conduct as illustrated in the Guideline on Professional Practice, particularly with regard to interpersonal relations.


8.C.iii Demonstrate duty to the profession.
8.C.iv Demonstrate the ability to recognize and avoid misconduct.

3.1.9. Impact of engineering on society and the environment

9.A. Demonstrate understanding of the relationships among technology and the social, cultural, economic and environmental conditions of society, locally and globally, in both the short and long term.
9.A.i Explain the interconnectedness of social and technological development, including both the impacts of technology on society and the impacts of society on technology.
9.A.ii Identify the possible social, cultural, environmental and human health-related impacts over the life cycle of an engineering product or activity relevant to the student's discipline.
9.A.iii Identify and evaluate the potential risks (likelihood and consequences) to human health and the environment of an engineering product or activity relevant to the student's discipline.
9.A.iv Identify relevant viewpoints and stakeholders in an engineering activity.

9.B. Demonstrate the ability to identify and choose alternative ways to mitigate or prevent adverse social, environmental, human health and safety impacts.
9.B.i Compare technological alternatives and identify means to mitigate the social, environmental, human health and safety impacts.
9.B.ii Apply principles of preventive engineering and sustainable development to an engineering activity or product relevant to the student's discipline.

9.C. Demonstrate awareness of legal issues relevant to an engineering activity.

9.C.i Identify legal requirements relevant to a specific engineering activity, such as standards, codes or regulations.

3.1.10. Ethics and equity

10.A. Demonstrate the ability to recognize ethical and equity-based dilemmas.
10.A.i Distinguish between ethics and legality (i.e. the legal standard).
10.A.ii Articulate the issues involved in ethical case studies (given a case study).
10.A.iii Articulate the issues involved in case studies involving equity problems.

10.B. Demonstrate the ability to apply the Code of Ethics and equity principles.
10.B.i Analyze a case, then describe and defend an appropriate response in which the Code of Ethics is applied.
10.B.ii Demonstrate the ability to work with a diverse group of people in a mutually respectful manner.
10.B.iii Apply a code of ethics and/or equity principles in the context of a course project or team project.


10.C. Demonstrate the ability to act ethically and demonstrate individual accountability.

10.C.i Demonstrate the ability to behave in accordance with the Code of Behaviour on Academic Matters.

10.C.ii Demonstrate the ability to behave in accordance with other Codes of the University (e.g. the Code of Student Conduct).

3.1.11. Economics and project management

11.A. Demonstrate the ability to estimate the life-cycle economic and financial costs and benefits for relevant engineering activities.
11.A.i Identify the various types of economic and financial benefits and costs of an engineering activity.
11.A.ii Identify credible estimates of the costs and benefits.
11.A.iii Recognize the uncertainties in the estimates.

11.B. Demonstrate the ability to evaluate the economic and financial performance of an engineering activity and compare alternative proposals on the basis of these measures.
11.B.i Calculate appropriate economic and financial performance measures of an engineering activity.
11.B.ii Choose the most appropriate alternative based on economic and financial considerations.
11.B.iii Explain the implications of inflation, taxes and uncertainties on these values and comparisons.

11.C. Demonstrate the ability to read and understand financial statements for engineering activities.
11.C.i Explain the various types of financial statements and the terminology used in them, and calculate key financial measures.

11.D. Demonstrate the ability to plan and manage engineering activities to be within time and budget constraints.
11.D.i Identify the tasks required to complete an engineering activity, and the resources required to complete the tasks.
11.D.ii Determine and adjust the schedule of tasks and their resources to complete an engineering activity on time and within budget.

3.1.12. Life-long learning

12.A. Demonstrate the ability to independently summarize, analyze, synthesize and evaluate information from a wide variety of sources (learning independently).
12.A.i Summarize the key points in an assigned reading.
12.A.ii Connect new, self-taught material to more formally taught knowledge.


12.A.iii Organize knowledge independently, using methods distinct from instructor-provided materials, such as creating notes, outlines, concept maps, responses to reflective questions, arguments or a workbook.

12.B. Demonstrate the ability to develop a strategy to identify and address gaps in knowledge (becoming a self-directed learner).
12.B.i Identify the deficiencies or gaps in understanding uncovered through reading or reflective review.
12.B.ii Create a plan to address new material that has not been formally taught.


Appendix B. Example Rubrics

The Graduate Attribute Committee (GAC) is developing a rubric for each of the approved indicators. The purpose is to give faculty a starting point for developing their own assessment tools, and to set a Faculty-wide benchmark for the quality of performance we expect (i.e. a target level of quality). The target is the "meets expectations" level of performance quality.

The rubrics can be combined and adapted by faculty to fit a particular assessment. The results will allow us to compare demonstrated competency of different cohorts at different assessment points, e.g. compare year 1 to year 3; or compare mechanical engineering students to chemical engineering students; or compare how the same cohort performed during the same term on two different oral presentations in different courses.

These rubrics are still in draft form and will continue to be refined as the GAC gains experience implementing them in our assessment system. In general, the levels of every rubric are:

Fails: Fails to demonstrate; work quality is poor.
Below expectations: Demonstrates to some degree; work quality is marginally acceptable but below standard.
Meets expectations: Demonstrates correctly and completely; good work quality.
Exceeds expectations: Demonstrates to a high degree; shows deep understanding and exceptional work quality.

Examples

The following examples illustrate how we are using rubrics to set expectations for the quality of performance we expect to see from a graduating engineer with regard to our indicators.

3.1.2. Problem analysis

2.A. Demonstrate the ability to identify and characterize an engineering problem.

2.A.i Recall the types of problems that engineers encounter: simple or complex; determinate or indeterminate; open-ended or closed-ended; the degree of accuracy required in the solution.

Fails: Recalls few if any types of problems.
Below expectations: Recalls several types of problems, but lacks accuracy in the definition of problem types.
Meets expectations: Recalls many types of problems, and provides a generally accurate characterization of problem types.
Exceeds expectations: Recalls many types of problems and provides an accurate and insightful characterization of the problem.

2.A.ii Describe the causes of the problem and its effects.

Fails: The description misses several major causes, and effects are poorly described.
Below expectations: Describes some causes but misses at least one significant cause, and effects are only marginally described.
Meets expectations: Describes all of the major problem causes correctly, and effects are well described.
Exceeds expectations: All major causes are described clearly, and effects are described well from multiple perspectives.

2.A.iii Classify a given problem, and the type of solution sought.

Fails: Does not recognize the type of the problem or the solution type.
Below expectations: Superficially or partially identifies the problem type; the solution type is not accurately characterized.
Meets expectations: Describes the problem and type of solution accurately.
Exceeds expectations: Provides an insightful characterization of the problem and the solution type.

2.A.iv Recognize the mathematical, engineering and other relevant knowledge that applies to a given problem.

Fails: Does not identify any significant, relevant knowledge.
Below expectations: Identifies some of the relevant knowledge.
Meets expectations: States the problem correctly in applicable mathematical or other terms; accurately identifies the relevant mathematical and engineering principles that apply.
Exceeds expectations: Accurately identifies all relevant mathematical and engineering principles that apply; makes excellent choices of models and justifies the choices or demonstrates their applicability to the problem.

2.B. Demonstrate the ability to formulate a solution plan (methodology) for an engineering problem.

2.B.i Determine primary objectives and key constraints (think primary stakeholders).

Fails: Missing key constraints or objectives; little explanation of constraints and objectives.
Below expectations: Determines some of the key constraints and some objectives; constraints and objectives are only marginally explained.
Meets expectations: Accurately determines the key constraints that need to be taken into account in the solution plan; accurately describes the key objectives for the solution. Constraints and objectives are explained to some degree.
Exceeds expectations: Accurately determines the key constraints and explains/justifies the choices; accurately describes the key objectives for the solution and clearly connects these objectives to engineering principles and/or stakeholder interests.

2.B.ii Reframe complex problems into interconnected sub-problems.

Fails: Unable to identify an appropriate approach to reframe the problem; or the reframing is incorrect or very difficult to understand.
Below expectations: The reframing is incomplete or contains some errors; the reframing process can be understood with some effort.
Meets expectations: The problem is properly reframed into sub-problems; the process of reframing is clear, logical and easy to understand.
Exceeds expectations: Finds an elegant and efficient reframing; the process is clear and logical, and provides insight or commentary that contributes significantly to the reader's understanding.

2.B.iii Identify existing solution processes that can be applied to solve a problem.

Fails: Unable to identify an approach for solving the problem.
Below expectations: Identifies a somewhat related, but incorrect, solution process; or the identified solution process is incomplete.
Meets expectations: Identifies an effective solution process, or set of processes, that can be applied to solve the problem.
Exceeds expectations: Identifies a correct, concise and elegant solution process; the justification for the selection is clearly presented.

2.B.iv Recognize missing information: information that needs to be gathered, or assumptions that need to be made.

Fails: Unable to recognize that there is missing information in a problem statement or assumptions that need to be made.
Below expectations: Identifies some of the missing information and makes some assumptions, though not necessarily entirely correct ones.
Meets expectations: Identifies all the missing information and makes all the assumptions needed.
Exceeds expectations: Identifies all missing information, with explanation; makes well-informed and well-justified assumptions as needed.

2.B.v Compare and contrast alternative solution processes to select the best process.

Fails: Fails to identify more than one solution process; no comparison made; no clear selection process.
Below expectations: Alternative solution processes are identified, but the comparison is missing or superficial.
Meets expectations: Identifies alternative solution processes; a well-grounded comparison is made; the selection process is clear and justified.
Exceeds expectations: All alternative solution processes are identified; a thorough comparison is made at a professional level; the selection process is clear and a well-developed argument is used to justify the choice.

2.B.vi Plan a systematic solution process.

Fails: No solution plan in evidence, or the solution plan is deeply flawed.
Below expectations: A solution plan is given, but it is superficial or somewhat incomplete, and may have some flaws.
Meets expectations: The solution plan is clear, complete and usable.
Exceeds expectations: The solution plan is exceptionally clear and complete; the level of detail is exemplary and would allow another to use it without further explanation.

2.B.vii Determine evaluation criteria for assessing alternative solution plans to open-ended problems.

Fails: No criteria given, or the criteria are substantially incomplete or incorrect; little or no justification for the criteria.
Below expectations: Some criteria given, but some key criteria are missing; or the criteria are not well justified.
Meets expectations: All key criteria determined; the criteria are well justified and correct.
Exceeds expectations: All criteria thoroughly determined; the justification is thorough, clear and well-grounded in the problem, the literature or other sources.

2.C. Demonstrate the ability to formulate and interpret a model.

2.C.i Choose a model (mathematical or otherwise) of a system or process that is appropriate in terms of applicability and required accuracy.

Fails: Unable to find a reasonable model for the problem.
Below expectations: Chooses a reasonable but incomplete model; or chooses a related but not entirely correct model.
Meets expectations: Chooses a model that is applicable and accurate.
Exceeds expectations: Chooses the best model in terms of applicability, accuracy and other considerations (such as solvability or efficiency of solution); the choice is well supported and deep understanding is demonstrated.

2.C.ii Identify assumptions (mathematical and physical) necessary to allow modeling of a system at the level of accuracy required.

Fails: Fails to correctly identify the necessary assumptions.
Below expectations: Correctly identifies many of the key assumptions.
Meets expectations: Correctly identifies virtually all of the key assumptions necessary to allow modeling of a system at the level of accuracy required; some justification given.
Exceeds expectations: Correctly identifies all key assumptions and provides a clear and thorough justification that demonstrates deep understanding.


2.C.iii Formulate the model in engineering terms.

Fails: Fails to translate the problem into engineering terms, or the formulation is significantly flawed.
Below expectations: The problem is formulated in engineering terms with some errors or inconsistencies.
Meets expectations: The problem is correctly formulated in engineering terms; the formulation is generally complete and consistent.
Exceeds expectations: The formulation is clear, easy to understand, and correct; it is complete and consistent; the explanation of the formulation is clear.

2.C.iv Interpret modeling results of processes or systems using scientific and engineering principles.

Fails: Unable to interpret the modeling results correctly; or no interpretation given.
Below expectations: The interpretation is somewhat correct; it may be incomplete or somewhat inaccurate.
Meets expectations: The interpretation of the modeling results is clear, complete and correct.
Exceeds expectations: The interpretation of the modeling results is very clear, thorough and correct, and provides useful insight into the process or system.

2.D. Demonstrate the ability to execute solution process for an engineering problem.

2.D.i Implement solution to simple problems.
Fails: Unable to solve the problem
Below expectations: Provides a partial solution; or solution is given but the steps to the answer are difficult to follow
Meets expectations: Provides a complete and correct solution; easy to follow solution
Exceeds expectations: Provides an elegant and efficient solution; solution is well explained, easy to follow, and commentary provides insight

2.D.ii Evaluate alternative solutions to open-ended problems and select final solution.
Fails: Fails to appropriately evaluate alternative solutions; selection process is shallow or poorly justified
Below expectations: Some evaluation evident; selection process is justified
Meets expectations: Evaluation is clear, appropriate, and complete; selection process is clear, appropriate, and supported (i.e. explained well)
Exceeds expectations: Evaluation process is exceptionally clear and complete; selection process is very clear and well-grounded in the problem requirements, literature, and other sources

2.D.iii Identify sources of error in the solution process, and limitations of the solution.
Fails: Does not recognize sources of error or limitations of solution
Below expectations: Recognizes and identifies some sources of error, but may be somewhat superficial; limitations of the solution are given but not clearly described
Meets expectations: Correctly identifies sources of error and describes limitations of the solution accurately
Exceeds expectations: Excellent thorough discussion of sources of error; limitations of the solution are clearly and accurately described in depth

2.D.iv Validate the solution to a complex engineering problem.
Fails: No evidence of validation; or validation process is superficial or highly flawed (e.g. circular)
Below expectations: Makes a reasonable but incomplete or somewhat inaccurate attempt at validating the solution
Meets expectations: Provides a validation of the solution which is generally complete
Exceeds expectations: Validates the solution thoroughly and provides insights into the solution process and/or results

3.1.4. Design

4.A. Demonstrate ability to frame a complex, open-ended problem in engineering terms.

4.A.i. Elicit and document engineering requirements from stakeholders.
Fails:
- Little consideration of stakeholders
- Limited application of stakeholder input
Below expectations:
- Some key stakeholders missing
- Some application of stakeholder input to requirements
- Fair documentation of engineering requirements
Meets expectations:
- All key stakeholders identified
- Well developed documentation
- Clearly defined requirements
- Logical relationship between stakeholder interests and requirements
Exceeds expectations:
- Comprehensive list of stakeholders
- Thorough and comprehensive requirements
- Extensively developed
- Insightful relationship to stakeholder interest

4.A.ii. Synthesize engineering requirements from a review of the State of the Art.
Fails:
- Minimal review of state of the art
- Minimal consideration of engineering requirements
Below expectations:
- Fair review of state of the art
- Some understanding of engineering requirements, with some missing
Meets expectations:
- Good review of state of the art
- Solid links to engineering requirements
Exceeds expectations:
- Comprehensive review of state of the art
- Well developed links to a complete list of engineering requirements

4.A.iii. Extract engineering requirements from relevant engineering Codes and Standards.
Fails:
- Minimal review of codes and standards
- Minimal linkage to engineering requirements
Below expectations:
- Some review of codes and standards
- Some linkage to engineering requirements
Meets expectations:
- Good review of relevant codes and standards
- Clear links to engineering requirements
Exceeds expectations:
- Comprehensive review of relevant codes and standards
- Well developed links to engineering requirements

4.A.iv. Explore and synthesize engineering requirements from larger social and professional concerns.
Fails:
- Minimal consideration
- Minimal linkage to engineering requirements
Below expectations:
- Some consideration of concerns
- Some linkage to engineering requirements
Meets expectations:
- Good consideration of relevant concerns
- Solid links to engineering requirements
Exceeds expectations:
- Comprehensive consideration of relevant concerns
- Well developed links to engineering requirements

4.B. Demonstrate ability to generate a diverse set of candidate engineering design solutions.

4.B.i. Apply formal idea generation tools to develop a diverse set of candidate engineering design solutions.
Fails:
- Only a few solutions proposed, with one clearly favored
- Little use of formal ideation tools
Below expectations:
- Only a few solutions proposed
- Some appropriate use of formal ideation tools
Meets expectations:
- A diverse set of feasible solutions proposed
- Appropriate use of formal ideation tools
Exceeds expectations:
- A diverse set of desirable solutions proposed
- Excellent use of formal ideation tools

4.B.ii. Adapt reference designs to generate a diverse set of candidate engineering design solutions.
Fails:
- Inappropriate or irrelevant reference designs cited
- Limited analysis of reference designs
Below expectations:
- Identification of relevant reference designs
- Some analysis of reference designs
Meets expectations:
- Identification of relevant reference designs
- Good analysis of reference designs
- Application to proposed solutions
Exceeds expectations:
- Complete identification of relevant reference designs
- Thorough analysis of reference designs
- Clear linkage and application to proposed solutions

4.B.iii. Use models, prototypes, etc., to generate a diverse set of candidate engineering design solutions.
Fails:
- Limited use of models or prototypes (if relevant)
Below expectations:
- Some use of models or prototypes
Meets expectations:
- Good use of well-developed models or prototypes
Exceeds expectations:
- Excellent use of well-developed models or prototypes

4.C. Demonstrate ability to select candidate engineering design solutions for further development.

4.C.i. Apply formal multi-criteria decision making tools to select candidate engineering design solutions for further development.
Fails:
- Limited justification for candidate designs
Below expectations:
- Some justification of candidate designs
Meets expectations:
- Clear justification of ranking of candidate designs, with use of formal decision making tools or methods
Exceeds expectations:
- Thorough justification of ranking of candidate designs, with logical arguments based on the use of formal decision making tools

4.C.ii. Use the results of experiments and analysis to select candidate engineering design solutions for further development.
Fails:
- Limited justification for candidate designs
Below expectations:
- Some justification of candidate designs
Meets expectations:
- Clear justification of ranking of candidate designs or design considerations based on experiment and analysis
Exceeds expectations:
- Thorough justification of ranking of candidate designs or design considerations based on comprehensive experiment and analysis
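A common formal multi-criteria decision making tool of the kind referenced in 4.C.i is a weighted-sum decision matrix. The sketch below is illustrative only; the criteria, weights, candidate design names, and scores are hypothetical and not drawn from any program report.

```python
# Weighted-sum decision matrix: rank candidate designs by weighted
# criterion scores. All names and numbers here are hypothetical.
criteria = {"cost": 0.4, "safety": 0.35, "manufacturability": 0.25}  # weights sum to 1

# Raw scores for each candidate design on a 1-5 scale
candidates = {
    "Design A": {"cost": 4, "safety": 3, "manufacturability": 5},
    "Design B": {"cost": 2, "safety": 5, "manufacturability": 3},
    "Design C": {"cost": 3, "safety": 4, "manufacturability": 4},
}

def weighted_score(scores, weights):
    """Weighted sum of criterion scores for one candidate."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank candidates from highest to lowest weighted score
ranking = sorted(candidates,
                 key=lambda d: weighted_score(candidates[d], criteria),
                 reverse=True)
```

A "Meets expectations" submission would document the weights and scores and justify the resulting ranking; an "Exceeds expectations" submission would also ground the weights in stakeholder requirements or a sensitivity analysis.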

4.C.iii. Consult with domain experts and stakeholders (DES) to select candidate engineering design solutions for further development.
Fails:
- Little or no consultation with DES
Below expectations:
- Some consultation with DES
- Limited application of DES input in proposed design solutions
Meets expectations:
- Consultation with most stakeholders
- Clear application of DES input in proposed design solutions
Exceeds expectations:
- Thorough consultation with DES
- Thorough analysis of stakeholder input
- Well developed application in proposed design solutions

4.D. Demonstrate ability to advance an engineering design to a defined end state.

4.D.i. Refine a conceptual design into a detailed design.
Fails:
- Little or no refinement
Below expectations:
- Some refinement of design
Meets expectations:
- Appropriate refinement into a detailed design
Exceeds expectations:
- Extensive refinement into a detailed design

4.D.ii. Implement, or provide a plan to implement, a conceptual or detailed design.
Fails:
- Cursory or unrealistic work plan for implementation
Below expectations:
- Description of work plan with some missing pieces or unrealistic goals
Meets expectations:
- Complete and realistic work plan for implementation
Exceeds expectations:
- Detailed description of realistic and thorough work plan

4.D.iii. Redevelop or iterate a conceptual design.
Fails:
- Little or no iteration
Below expectations:
- Some iteration to arrive at a final design
Meets expectations:
- Clear use of iteration to evolve a design
Exceeds expectations:
- Very clear use of iteration to evolve and significantly improve a design

3.1.7. Communication skills

7.A. Demonstrate the ability to identify and credibly communicate engineering knowledge.

7.A.i Recognize and explain context of a particular engineering design or solution in relation to past and current work as well as future implications.
Fails: No discussion is given about relevant current work and/or future implications
Below expectations: Relevant current work and/or future implications are somewhat identified and discussed
Meets expectations: Relevant current work and/or future implications are identified and discussed in reasonable detail
Exceeds expectations: Relevant current work and/or future implications are identified and discussed in comprehensive detail

7.A.ii Recognize credible evidence in support of claims, whether the evidence is presented in written, oral or visual form (reading).
Fails: Evidence that supports claims is drawn from unreliable sources
Below expectations: Evidence that supports claims is drawn from somewhat reliable sources
Meets expectations: Evidence that supports claims is drawn from fairly reliable sources
Exceeds expectations: Evidence that supports claims is drawn from highly reliable sources

7.A.iii Formulate, in written, visual and/or spoken form, credible and persuasive support for a claim.
Fails: No claims are formulated, or most claims are not credible in themselves or are not supported logically with evidence from reliable sources
Below expectations: Some claims and conclusions are credible in themselves and are supported logically with evidence from reliable sources
Meets expectations: Most claims and conclusions are credible in themselves and are supported logically with evidence from fairly reliable sources
Exceeds expectations: All claims and conclusions are credible in themselves and are supported logically with evidence from highly reliable sources

7.A.iv Organize written or spoken material – structure the overall elements so that their relationship to a main point and to one another is clear.
Fails: Material is poorly organized with no clear structure to its elements
Below expectations: Material is partially organized with some clear structure to its elements
Meets expectations: Material has good organization with clear relationships among its elements
Exceeds expectations: Material is very well organized with excellent relationships among its elements

7.A.v Create “flow” in a document or presentation – flow is a logical progression of ideas, sentence to sentence and paragraph to paragraph.
Fails: Ideas are not presented in a clear logical progression, sentence to sentence and paragraph to paragraph
Below expectations: Some ideas are presented in a clear logical progression, sentence to sentence and paragraph to paragraph
Meets expectations: Most ideas are presented in a clear logical progression, sentence to sentence and paragraph to paragraph
Exceeds expectations: All ideas are presented in a clear logical progression, sentence to sentence and paragraph to paragraph

7.B. Demonstrate the ability to use different modes of communication.

7.B.i Relate ideas in a multi-modal manner – visually, textually and/or orally.
Fails: Ideas are mostly related in a single mode
Below expectations: Ideas are related in more than one mode with some effectiveness
Meets expectations: Ideas are related in more than one mode with good effectiveness
Exceeds expectations: Ideas are related effectively and appropriately using visual, textual and oral modes

7.B.ii Incorporate various media effectively in a presentation or written documents.
Fails: Limited or no use of multiple media (e.g. tables, figures or pictures)
Below expectations: Multiple media used with some effectiveness
Meets expectations: Multiple media used with good effectiveness
Exceeds expectations: Multiple media used appropriately and effectively throughout the report/presentation


7.B.iii Tailor the mode of presentation, depending on the situation, whether it is a formal talk, a technical report or a poster presentation, etc., and to accommodate different learning styles.
Fails: Limited or no use of multiple media (e.g. tables, figures or pictures)
Below expectations: Multiple media used with some effectiveness
Meets expectations: Multiple media used with good effectiveness
Exceeds expectations: Multiple media used appropriately and effectively throughout the report/presentation

7.C. Demonstrate the ability to develop communication through an iterative process.

7.C.i Use iteration to clarify and amplify understanding of issues being communicated.
Fails: No significant improvement in clarity and depth of ideas from one draft to the next
Below expectations: Some improvement in clarity and depth of ideas from one draft to the next
Meets expectations: Good improvement in clarity and depth of ideas from one draft to the next; demonstrated ability to effectively use feedback and/or self-assessment to improve work
Exceeds expectations: Excellent improvement in clarity and depth of ideas from one draft to the next; demonstrates valuing of iteration to improve quality of work

7.C.ii Use reflection to determine and guide self-development.
Fails: Self-reflection inaccurate and not useful for development of strategies for improvement
Below expectations: Self-reflection somewhat accurate and somewhat useful for development of strategies for improvement
Meets expectations: Self-reflection mostly accurate and useful for development of strategies for improvement
Exceeds expectations: Self-reflection accurate, insightful and very useful for development of strategies for improvement

3.1.11. Economics and project management

11.A. Demonstrate ability to estimate the life-cycle economic and financial costs and benefits for relevant engineering activities.

11.A.i Identify the various types of economic and financial benefits and costs of an engineering activity.
Fails: Provides inappropriate analysis of economic and financial benefits
Below expectations: Identifies some of the important benefits
Meets expectations: Recognizes most of the important benefits
Exceeds expectations: Provides an excellent discussion of benefits

11.A.ii Identify credible estimates of the costs and benefits.
Fails: Estimates are unreasonable
Below expectations: Some but not all estimates are reasonable
Meets expectations: All estimates are reasonable
Exceeds expectations: Estimates are excellent and well justified

11.A.iii Recognize the uncertainties in the estimates.
Fails: Poor understanding of uncertainties
Below expectations: Reasonable understanding of uncertainties
Meets expectations: Identifies all important uncertainties correctly
Exceeds expectations: Excellent discussion of uncertainties

11.B. Demonstrate ability to evaluate the economic and financial performance of an engineering activity and compare alternative proposals on the basis of these measures.

11.B.i Calculate appropriate economic and financial performance measures of an engineering activity.
Fails: Unable to identify appropriate measures or calculate them correctly
Below expectations: Calculates some measures correctly
Meets expectations: Properly identifies all important measures and calculates them correctly
Exceeds expectations: Excellent discussion of economic impact and financial considerations

11.B.ii Choose the most appropriate alternative based on economic and financial considerations.
Fails: Unable to compare alternatives correctly
Below expectations: Makes some reasonable but incomplete financial comparisons
Meets expectations: Makes a reasonable choice taking all important factors into consideration
Exceeds expectations: Provides an excellent financial comparison

11.B.iii Explain the implications of inflation, taxes and uncertainties on these values and comparisons.
Fails: Unable to provide a reasonable accounting
Below expectations: Takes some but not all factors into account
Meets expectations: Provides a reasonable accounting of the important factors
Exceeds expectations: Provides an excellent discussion of the implications of various factors
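Typical performance measures for 11.B.i include net present value and payback period. The sketch below shows how two such measures are computed; the project figures are hypothetical, chosen only to illustrate the calculations, and are not taken from the report.

```python
def npv(rate, cash_flows):
    """Net present value of end-of-period cash flows.
    cash_flows[0] occurs at time 0 (typically a negative initial cost)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def payback_period(cash_flows):
    """First period at which cumulative (undiscounted) cash flow becomes
    non-negative, or None if the investment never pays back."""
    total = 0.0
    for t, cf in enumerate(cash_flows):
        total += cf
        if total >= 0:
            return t
    return None

# Hypothetical project: $10,000 up-front cost, then $4,000/year for 4 years,
# evaluated at an 8% discount rate.
flows = [-10_000, 4_000, 4_000, 4_000, 4_000]
project_npv = npv(0.08, flows)          # positive NPV favors the project
project_payback = payback_period(flows) # cost recovered in year 3
```

A submission meeting 11.B.ii expectations would compute such measures for each alternative and justify the choice; 11.B.iii would additionally discuss how inflation, taxes, and uncertainty in the cash-flow estimates shift these values.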


11.C. Demonstrate ability to read and understand financial statements for engineering activities.

11.C.i Explain the various types of financial statements and the terminology used in them, and calculate key financial measures.
Fails: Unable to read financial statements correctly
Below expectations: Demonstrates understanding of some of the terms
Meets expectations: Correctly interprets financial statements and draws reasonable conclusions
Exceeds expectations: Demonstrates mastery of the subject

11.D. Demonstrate ability to plan and manage engineering activities to be within time and budget constraints.

11.D.i Identify the tasks required to complete an engineering activity, and the resources required to complete the tasks.
Fails: Unable to analyze an engineering activity properly
Below expectations: Identifies some of the required resources correctly
Meets expectations: Provides a proper accounting of the tasks and corresponding resources
Exceeds expectations: Provides an excellent discussion of the resources and any trade-offs

11.D.ii Determine and adjust the schedule of tasks and their resources to complete an engineering activity on time and within budget.
Fails: Unable to prepare a reasonable schedule
Below expectations: Schedule is reasonable but does not fully account for contingencies, etc.
Meets expectations: Schedule is reasonable and accounts for the most important factors
Exceeds expectations: Prepares a tight schedule with excellent discussion of contingencies