
NMSU Pathways Workshop, September 18-20, 2014

Design Thinking, Low Res Prototyping, Assessment and ABET Mashup

Lisa Getzler-Linn, Director, Baker Institute for Entrepreneurship, Creativity and Innovation; Integrated Product Development Program

Integrating assessment of innovation, creativity and entrepreneurial skills in the undergraduate engineering curriculum

Integrated Product Development Program (IPD):

Authentic, experiential learning through projects with established companies, local entrepreneurs, student entrepreneurs

• >19 years

• >250 Industry Sponsors

• ~3,000 students in over 400 project teams

• 2014 Project Year: 32 teams, 210 students, 15 majors, 18 team advisors

• Assessing student performance in an experiential, problem-based, multidisciplinary team project course is a challenge: much of the learning is unstructured and the body of knowledge students are expected to apply is variable. Even so, the assessment can be direct and authentic.

• Tools used to assess a student’s performance should represent all meaningful aspects of that performance as well as provide equitable grading standards.

• IPD has developed direct, authentic and formative measurement tools that are tied directly to course learning objectives.

Assessment of Student Performance in IPD

Student Performance Assessment Primer

Direct Measures: Tools used to measure what a student can do.

Indirect Measures: Tools used to measure what is perceived by the student.

Authentic Measures: Tools used to measure an act that occurs in a real setting, as opposed to a simulation.

Performance Criteria: The standards by which student performance is measured.

Formative Assessment: Tools that measure attainment and provide feedback so the student may adjust, improve or reiterate their work product, performance or behavior.

Summative Assessment: Tools that measure skill attainment or knowledge gained during a period of time, where the measurement is taken at the end of the process.


IPD Objectives:

• Design effective solutions to industry problems

• Demonstrate an understanding of technical entrepreneurship

• Participate in & lead an interdisciplinary product development team

• Effectively communicate through written, oral & graphical presentations

• Develop engineering design solutions in a broad global context

• Address aesthetics & ergonomics issues in product development

• Develop a value statement for the product to be developed

• Design, create & evaluate technical and financial feasibility studies

• Experience project management including people & financial resources

Lisa Getzler-Linn, Integrated Product Development Program

John B. Ochs, Integrated Product Development Program

Todd A. Watkins, College of Business & Economics

Can we measure a student’s understanding of the underlying process, entrepreneurial mindset, use of higher order skills, and willingness to immerse themselves in the product development/innovation journey?

What “measurable moments” occur during the lifecycle of an IPD project?

Which are the appropriate assessment tools for each measurable moment? Behaviors, attitudes, mindset? What about project artifacts?

How and What to Measure?

“A well-articulated and publicly visible rubric can become the meeting ground that facilitates a shared understanding of what the students should know and be able to do” (Bush & Timms, 2000).

IPD + Assessment = rubrics

A rubric is:

• An assessment tool used to create a clear picture of what is expected of students in a given task or deliverable.

• Designed to both set expectations and measure the learning that has occurred.

• Used by IPD to directly measure the authentic learning that has occurred during the life of the team project, as well as to give formative feedback.


The IPD Toolset of Rubrics

Rubrics are used by all 18 team advisors to grade all 210 students across all 32 teams and measure:

• student performance by team: Midterm Presentation, Final Presentation, Written Reports, Posters and Briefs

• individual student performance: Notebook, Contribution to Project
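One way to picture how a shared rubric supports equitable grading across many advisors is as a small data structure: each criterion carries a weight, and each performance level maps to points. The following is a minimal sketch; the level names, weights and `score` helper are illustrative assumptions, not the IPD program's actual scoring scheme.

```python
from dataclasses import dataclass

# Hypothetical performance levels and point values (illustrative only).
LEVELS = {"beginning": 1, "developing": 2, "proficient": 3, "exemplary": 4}

@dataclass
class Criterion:
    name: str
    weight: float  # relative importance within the rubric

def score(rubric: list[Criterion], ratings: dict[str, str]) -> float:
    """Weighted score on a 0-100 scale from per-criterion level ratings."""
    total_weight = sum(c.weight for c in rubric)
    points = sum(c.weight * LEVELS[ratings[c.name]] for c in rubric)
    return 100 * points / (total_weight * max(LEVELS.values()))

# Criteria named after the Presentation #1 rubric; equal weights are assumed.
presentation_rubric = [
    Criterion("Project Scope", 1.0),
    Criterion("Value Statement", 1.0),
    Criterion("Communication", 1.0),
]

# One advisor's ratings for a team's first presentation.
ratings = {"Project Scope": "exemplary",
           "Value Statement": "proficient",
           "Communication": "exemplary"}

print(round(score(presentation_rubric, ratings), 1))  # 91.7
```

Because every advisor applies the same weights and level definitions, two graders who agree on a team's level per criterion necessarily produce the same grade, which is the equitable-grading property the rubric toolset is meant to provide.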


The IPD Toolset of Rubrics activity - artifact - criteria

Spring semester – Month #1: Background and overview of the industry, company and competitive landscape. Problem definition, business opportunity and technical contextualization of the problem, including customer and stakeholder identification and needs, plus current practices, specifications and constraints.

Presentation #1: the team describes, discusses and presents evidence of the above. Rubric: a team measure for capturing the artifacts, experiences and authentic moments when the students actually discovered these, and for measuring the level to which they did so.

Presentation #1 – team’s first attempt

Project Scope: Mission Statement and/or Project Description are clear, concise & demonstrate a solid understanding of project scope.

Value Statement: Tackboard reflects that the team has an understanding of both the business & technical challenges of this project and that they are building a foundation which will allow them to move forward in a business context.

Communication: Team is able to clearly articulate the scope of the project. They are able to answer questions appropriate for this timeframe and have taken ownership of the project.

How and Why – not so much What

Spring semester - Month #2, Presentation #2:

Generating concepts then combining and selecting the one(s) that will solve the technical problem in a business context through innovative, appropriate means and the process followed to do so.

Technical analyses of concept(s) through modeling, simulation, mock-up development to create a clear path toward recognizing parameters, performance characteristics and user requirements.

Tying the customer/stakeholder needs back to the concept selection process and quantifying those needs.

Presentation #2 – deeper dive

Target Specifications: Team has established a set of specifications that spell out in measurable detail what the product/process has to be or do to be commercially acceptable. Customer needs, aesthetics & ergonomics, technical & financial feasibility, and factors that differentiate the best concept specs from alternatives have clearly been considered in the process.

Market / Business Context: Target customers/stakeholders & their needs, as well as the competitive environment of the particular industry or market, have been researched; this market context clearly supports the target specs being developed; clear competitive benchmarking of key specs was discussed.

Rationale for Concept(s): Team can clearly articulate why their top concepts were chosen & how they are relevant to the market/business context. Without presenting a laundry list of what they did, the team shows/tells why many concepts that were generated & screened were rejected and how they justified & supported the concept(s) selected.

Status of Best Concept(s): The team's best concept(s) is (are) clearly represented by way of a physical or analytical model, simulation, series of sketches, process/functional diagram, actual product/prototype or some combination of the above. Team is able to articulate their best concept, its relative merits, trade-off issues & all of its attributes to date.

Spring semester - Individual Lab or Maker Notebook:

This living document is used throughout the project as both a record of work done by the individual student as a member of the course/team/project, and as a legal record of Intellectual Property if invention occurs.

Reflection on the design process has been included as a metric that measures professional skills beyond those of a student in a course. Notebooks are collected three times per semester and the rubric is applied. One grade is given at the end for the overall document and the process followed.

Record, Reflect, Reiterate

Individual Notebook Rubric

IPD Process: Project Journal: Daily written evidence of use of the IPD process from text, lecture, team & individual work that is a clear chronological record of what is being learned, applied & accomplished. Consideration is given to BOTH project development and team leadership/team member behavior.

Content: Extensive daily written evidence of reflections on what is being learned by the writer, based on experiences during the execution of the project.

Record of Intellectual Property: Clear written evidence of all ideas, both carried out and abandoned. All analytical work, design ideas, technical specifications, experiments, financial estimates, sources and significant thinking are noted and dated in this record.

Format: The Notebook itself is the correct one. The requisite care was taken to reserve pages for a Table of Contents; each page was numbered, signed and dated. Authorship is clearly identified. Entries were not crowded, ink was used when any patenting questions were involved, and all data was entered contemporaneously. There is a clear and concise link to the weekly team progress reports, including time spent.

Essential Elements

Experimental/business/market data, interview data, analyses & interpretation thereof; analytical work, calculations & conclusions; graphs, charts with labels, titles and interpretations; sketches, CAD drawings, models & photographs, titled and interpreted; references to information sources; reflections on work, ideas, input and the project as a whole throughout.

(Level shown: exemplary)

Individual Contribution to Team

Technical Contribution: Technical knowledge gained and contributed set the course of the project. Amount and quality of work was paramount to the successful outcome of the project.

Contribution / Resourcefulness: Took on more than their share of the workload & identified & pursued most of the resources needed to find the best solution for almost every aspect of the project.

Leadership & Team Work: Inspired the vision of the team, nurtured team harmony, and took on the role of leader when appropriate. Always a team player. Guided the progress of the project and delegated responsibilities; was paramount in the project's success.

Professionalism & Interaction with Sponsor: Level of professionalism and maturity was exemplary. Fostered a positive professional relationship with others outside the team who were involved in the project, which added greatly to the success of the project.

(Level shown: exemplary)

Presentations, Weekly Briefs and Executive Summaries:

Team artifacts with measures to capture the professional skills of graphical, oral and written communications in the context of presenting evidence of the project’s status.

By communicating the actual events that led to the discoveries and solutions to the problem, students both achieve and demonstrate learning; what is measured includes the presentation skills themselves as well as the authentic events that are documented.

Project Artifacts

Presentation Rubric

Program Assessment - indirect

IPD uses surveys, interviews and focus groups:

• For student feedback on program efficacy, for the purpose of continuous improvement. Areas covered are course objectives as related to project deliverables, students' understanding of their own capabilities as a result of the course, and student satisfaction with faculty, staff, process and facilities.

• For sponsor satisfaction: surveys and individual interviews are conducted as an external evaluation of the above metrics.

IPD provides assessment data, documents and protocols to participating departments and colleges for accreditation purposes.

Design Your Assessment Tools: focus

Higher order skills like creativity, innovation, communication, critical thinking, design thinking, etc. can be measured.

What are you measuring for? Attainment of knowledge? Application of techniques? Evidence of work accomplished?

Which skills should be measured for grading purposes?

Are there activities that indicate that learning has occurred?

How should domain knowledge be measured?

Design Your Assessment Tools: define

Purpose of assessment: grading? Learning outcomes? Pre/post?

Student performance, self-efficacy or program efficacy?

Direct or Indirect?

Formative or Summative?

Type of learning being measured? experiential, authentic?

Multiple graders or one faculty member?

Learning objectives?

Criteria for each objective?
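On the multiple-graders question above: when several advisors grade the same deliverable with the same rubric, a quick consistency check is to compare their scores. A minimal sketch follows; the advisor labels and score values are invented for illustration.

```python
from statistics import mean, pstdev

# Hypothetical 0-100 rubric scores that three advisors gave one team's
# midterm presentation; all values are illustrative.
scores = {"advisor_a": 88.0, "advisor_b": 92.0, "advisor_c": 90.0}

team_grade = mean(scores.values())   # consensus grade across graders
spread = pstdev(scores.values())     # inter-rater spread; large values
                                     # suggest the criteria need sharpening

print(round(team_grade, 1), round(spread, 2))  # 90.0 1.63
```

A small spread suggests the rubric's criteria are being read the same way by all graders; a large spread is a signal to revisit how the performance criteria are worded.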

Thank You!

Lisa Getzler-Linn, Director, Baker Institute for Entrepreneurship, Creativity and Innovation; Integrated Product Development Program

Lehigh University, 11 E Packer Avenue, Bethlehem, PA 18015, [email protected]