Transforming Assessment in Education

Transforming Assessment in Education: Implementing the Instructional Decision Support System of the AEFIS Solution Platform
Education and Information Systems, Technologies and Applications: EISTA 2010
June 30, 2010, 10:10am–12:10pm, Orlando, Florida, USA

Transcript of Transforming Assessment in Education

Page 1: Transforming Assessment in Education

Transforming Assessment in Education: Implementing the

Instructional Decision Support System of the AEFIS Solution Platform

Education and Information Systems, Technologies and Applications: EISTA 2010
June 30, 2010, 10:10am–12:10pm

Orlando, Florida, USA

Page 2: Transforming Assessment in Education

Presenting Team

► Donald McEachron, Ph.D.*

Drexel University School of Biomedical Engineering, Science and Health Systems, Philadelphia, Pennsylvania, USA

► Metta Alsobrook, University of Texas at Dallas, Office of Student Success and Assessment, Richardson, Texas, USA

► Craig Bach, Ph.D., Drexel University, Office of the Provost, Philadelphia, Pennsylvania, USA

► Elisabeth Papazoglou, Ph.D., Drexel University School of Biomedical Engineering, Science and Health Systems, Philadelphia, Pennsylvania, USA

► Mustafa Sualp**, Untra Academic Management Solutions, LLC, Philadelphia, Pennsylvania, USA

► Antoinette Torres, Drexel University, Office of the Provost, Philadelphia, Pennsylvania, USA

EISTA 2010 | Orlando, Florida, USA

* Session Organizer  ** Session Co-Organizer

Page 3: Transforming Assessment in Education

Presentation Outline

► Creating and Sustaining Change: Assessment of Student Learning Outcomes – 15 minutes

► Small Group Activity and Discussion – 15 minutes

► An Iterative Mapping Strategy for Improved Curriculum Design and Assessment – 15 minutes

► Break – AEFIS 3.0 Demo – 15 minutes

► Learning Analytics: Targeting Instruction, Curricula and Student Support – 15 minutes

► Drexel EduApps: Freeing Faculty for Innovative Teaching – 15 minutes

► Instructional Decision Support Systems: A New Approach to Integrating Assessment, Teaching and Learning – 15 minutes

► Question and Answer Discussion – 15 minutes

Each presenter will speak for 10 minutes, followed by a 5-minute question/answer/discussion period.



Page 5: Transforming Assessment in Education

EISTA 2010: The 8th International Conference on Education and Information Systems, Technologies and Applications

Orlando, Florida, USA

June 29 – July 2, 2010

Creating and Sustaining Change: Assessment of Student Learning Outcomes

Metta Alsobrook


University of Texas at Dallas

Page 6: Transforming Assessment in Education

Problem

Concern about the quality of higher education institutions

Diagram: concern flows from the government (Higher Education Act of 1965 & Higher Education Opportunity Act of 2008) through accrediting agencies to higher education institutions, driving the process of assessment of SLOs.

Student Learning Outcomes = SLO


Page 7: Transforming Assessment in Education

Process at UT Dallas

Diagram: elements of the process – support team, leadership, communication, good relationships, assessment tool, change agents, faculty, and resources.


Page 8: Transforming Assessment in Education

Evaluating the Process

Is it working? Is it beneficial/meaningful? Can it be done?


Page 9: Transforming Assessment in Education

The Reality

Yes, we have a process of assessment
No, it is not beneficial for improving our program
Cannot be done (too much work to do, too little time)
One size fits all
Insufficient message from the leadership


Page 10: Transforming Assessment in Education

How to Sustain the Effort?

Clear and correct message from the institution's leaders
An assessment process that fits schools and departments
Aligning program assessment with other reviews, such as school review, department review, and other accreditation reviews
Intensive communication about the goals, what is going on, benefits, and champions


Page 11: Transforming Assessment in Education

Presentation Outline

► Creating and Sustaining Change: Assessment of Student Learning Outcomes – 15 minutes

► Small Group Activity and Discussion – 15 minutes

► An Iterative Mapping Strategy for Improved Curriculum Design and Assessment – 15 minutes

► Break – 15 minutes

► Learning Analytics: Targeting Instruction, Curricula and Student Support – 15 minutes

► Drexel EduApps: Freeing Faculty for Innovative Teaching – 15 minutes

► Instructional Decision Support Systems: A New Approach to Integrating Assessment, Teaching and Learning – 15 minutes

► Question and Answer Discussion – 15 minutes



Page 13: Transforming Assessment in Education

EISTA 2010: The 8th International Conference on Education and Information Systems, Technologies and Applications

Orlando, Florida, USA

June 29 – July 2, 2010

An Iterative Mapping Strategy for Improved Curricular Design and Assessment

Fred Allen, Elisabeth Papazoglou
Drexel University, Philadelphia, PA


School of Biomedical Engineering, Science and Health Systems

Page 14: Transforming Assessment in Education

Summary

• Definition of Assessment
• Student Outcomes vs. Student Learning Outcomes
• Product vs. Process Quality
• "Student as Product" Paradigm
• Role of Mapping
• Multiple Assessment Points
• Importance of Iterative Mapping


Page 15: Transforming Assessment in Education

Modes of Assessment
• Multiple Levels
– Institutional
– Programmatic
– Course/Activity
– Instructor
– Student
• Student Outcomes vs. Student Learning Outcomes
– Student outcome – SAT scores, retention, graduation rate
– Student learning outcome – what value has been added to each individual student?


Page 16: Transforming Assessment in Education

Product versus Process Quality

• Product Quality - ensures that the end product meets specifications

• Process Quality - relates to the management process or the means by which product quality is achieved and monitored

– Continuous Quality Improvement (CQI)
– Process Quality Management (PQM)


Page 17: Transforming Assessment in Education

Student as Product Paradigm

• Why?
• Strengths
– Manufacturing processes are well understood – many quality management models available
– The process management concept is generally accepted by engineering faculty
• Limitations
– Student as human being and customer
– Product must collaborate in its own manufacture


Page 18: Transforming Assessment in Education

Establishing Goals
• Manufacturing quality goal: to ensure that products meet specifications
• Academic translation: to ensure that students achieve program learning outcomes upon graduation and that alumni meet program objectives
• Thus,
– Students = products
– Specifications/requirements = outcomes and objectives



Page 19: Transforming Assessment in Education

Two Possible Results
• Products meet specifications
– All is well
– No action required
• Products do not meet specifications
– What specification(s) are not being met?
– Why are these specification(s) not being met?
– Where in the process is the problem occurring?
– What action(s) must be taken to correct the problem?
• Good process quality management models allow all of these questions to be answered


Page 20: Transforming Assessment in Education

Multiple Levels

Diagram: raw material (sample point A) flows through Stages 1–6 to the final product (sample point E); points B and D monitor intermediate stages, and point C samples a subcontractor feeding the process.

Page 21: Transforming Assessment in Education

Multiple Sample (Assessment) Points

• A – Raw Materials
– Must know properties of starting materials
– If starting materials change, a change in the manufacturing process will be needed
• Students
– What characteristics are important?
– How do we measure them?
– How do we use the results?


Page 22: Transforming Assessment in Education

Multiple Sample (Assessment) Points

• B and D – Stage Monitoring
– Measure effects of a selected number of processes/activities
– Measure progress towards ultimate goals
• Students
– Mapping process
– Monitor student progress in order to intervene effectively
– Multiple levels of intervention:
• Student
• Instructor
• Course/Activity
• Programmatic

Page 23: Transforming Assessment in Education

Multiple Sample (Assessment) Points

• C – Subcontractor/Outside Source
– Need to determine if the subcontractor/outside source is meeting expectations
– Allows for the possibility of replacement
• Students – More Mapping
– Extracurricular activities
– Co-operative education
– Service and/or service learning
• Students and the activity are both assessed
– Not just to monitor students
– Also assess effects of the non-classroom experience
– Part of the curricular paradigm

Page 24: Transforming Assessment in Education

Multiple Sample (Assessment) Points

• E – Final Product
– Must determine if products met acceptable levels of specifications
– Final confirmation that all other measures and activities are valid
– Also determines if the process quality management plan is adequate
• If A–D predict a successful product and the final product does not meet requirements, the process quality management plan needs to be revised
• Students – More Mapping
– Capstone experiences
• Senior Design
• Senior Sequence
– Limited intervention for the individual student
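The five sample points above translate naturally into curricular checkpoints. A minimal sketch follows; the checkpoint descriptions are illustrative assumptions, not taken from the presentation:

```python
# Sketch: the five sample points of the manufacturing analogy mapped to
# curricular checkpoints. Descriptions are illustrative assumptions.
ASSESSMENT_POINTS = {
    "A": ("Raw materials", "entering-student characteristics (placement tests, surveys)"),
    "B": ("Stage monitoring", "course-embedded assessments early in the curriculum"),
    "C": ("Subcontractor/outside source", "co-op, service learning, extracurricular assessment"),
    "D": ("Stage monitoring", "course-embedded assessments later in the curriculum"),
    "E": ("Final product", "capstone / senior design evaluation"),
}

def checkpoints(role):
    """Return the sample-point letters that play a given role."""
    return [point for point, (r, _) in ASSESSMENT_POINTS.items() if r == role]
```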

Page 25: Transforming Assessment in Education

Quality Control Questions

– What is our error margin for a defective product?
– What happens to a defective product?
– What frequency of defects calls the process into question?


Page 26: Transforming Assessment in Education

Correct Mapping is Vital

• What is being mapped?
– Performance criteria
• Constituent elements of student learning outcomes
• Measurable
• Indicate when and where students are exposed to learning opportunities
• Can also be used to indicate level of knowledge/skill acquisition
– Assessments
• Depend on performance mapping but are not the same
• Serve a different purpose

Page 27: Transforming Assessment in Education

Performance Mapping is Iterative

• Initial Stages
– Develop student learning outcomes and program objectives
– Decompose student learning outcomes into performance criteria
• Example: School of Biomedical Engineering, Science and Health Systems at Drexel
– 14 general student learning outcomes
– Concentration-area-specific outcomes
– Approximately 70 performance criteria

Page 28: Transforming Assessment in Education

Diagram: student learning outcomes and program objectives are decomposed into performance criteria (Map 1 – Performance Criteria: What?), which are associated with educational activities such as courses, co-operative education, and extracurricular activities (Map 2 – Educational Activities: Where? Why? How?), and then placed in the curriculum sequence (Map 3 – Curriculum Sequence: When?).

Page 29: Transforming Assessment in Education

First Map

• Set performance levels:
• Introduce; Reinforce; Emphasize
• Introduce; Practice; Review; Utilize
• Not critical – refine later
• Associate performance criteria with activity
• Small group (e.g., curriculum committee)
• Use syllabi to associate criteria with courses
• After initial analysis, submit to faculty for review – buy-in

Page 30: Transforming Assessment in Education


Page 31: Transforming Assessment in Education

Utility of First Mapping

• Reveals uneven distributions of performance criteria
• Indicates gaps or over-emphasis
• Begins to reveal the true nature of the current curriculum
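A first map of this kind is easy to represent and interrogate in code. The sketch below uses hypothetical criteria and course codes, with levels I/R/E (Introduce/Reinforce/Emphasize); the two helpers surface exactly the gaps and uneven distributions described above:

```python
# Sketch of a "first map": performance criteria x courses, recording the
# level at which each course addresses a criterion (I = Introduce,
# R = Reinforce, E = Emphasize). Criteria and course codes are hypothetical.
first_map = {
    "PC-1 design experiments":  {"BMES 201": "I", "BMES 302": "R", "BMES 401": "E"},
    "PC-2 communicate results": {"BMES 201": "I"},
    "PC-3 apply statistics":    {},  # a gap: no course addresses this criterion
}

def gaps(mapping):
    """Criteria that no course in the curriculum addresses."""
    return [pc for pc, courses in mapping.items() if not courses]

def coverage_counts(mapping):
    """How many courses touch each criterion - exposes uneven distribution."""
    return {pc: len(courses) for pc, courses in mapping.items()}
```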


Page 32: Transforming Assessment in Education

Second Map

• First Map: Performance Criteria vs. Course
– Reveals relative importance of various performance criteria in the curriculum
– Not enough
• Second Map: Course vs. Performance Criteria
– Shows relative roles of courses in the curriculum
– Reveals disconnects between course requirements and performance criteria
– Uncovers 'core' or 'gateway' courses
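Because the second map is simply the transpose of the first, it can be derived rather than maintained separately. A sketch with hypothetical data; the threshold for calling a course 'core' or 'gateway' is an arbitrary illustrative choice:

```python
# The second map inverts the first: course -> the criteria it covers.
# Courses touching many criteria are candidate "core"/"gateway" courses.
# Criteria, course codes and the threshold are hypothetical.
first_map = {
    "PC-1": {"BMES 201": "I", "BMES 302": "R"},
    "PC-2": {"BMES 201": "I", "BMES 401": "E"},
    "PC-3": {"BMES 201": "I"},
}

def second_map(mapping):
    """Invert criteria->courses into courses->criteria."""
    by_course = {}
    for pc, courses in mapping.items():
        for course, level in courses.items():
            by_course.setdefault(course, {})[pc] = level
    return by_course

def gateway_courses(mapping, threshold=3):
    """Courses covering at least `threshold` criteria."""
    return [c for c, pcs in second_map(mapping).items() if len(pcs) >= threshold]
```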

Page 33: Transforming Assessment in Education


Page 34: Transforming Assessment in Education

Refining the Mapping
• Place the performance criteria in a temporal context
• Look at when performance criteria are covered, as well as where
• Can reveal additional disconnects
– Learning as a developmental process
– Does the curriculum develop learning properly?
– Do not desire to reinforce or emphasize a concept yet to be introduced.
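The last kind of disconnect can be checked mechanically: a criterion that is reinforced or emphasized in a term before it has ever been introduced. A sketch with a hypothetical curriculum sequence:

```python
# Sketch: flag a temporal disconnect - a criterion Reinforced (R) or
# Emphasized (E) in a term before it has ever been Introduced (I).
# The sequence below is hypothetical: (term, criterion, level).
sequence = [
    (1, "PC-1", "I"),
    (2, "PC-1", "R"),
    (2, "PC-2", "R"),  # disconnect: reinforced before being introduced
    (3, "PC-2", "I"),
]

def temporal_disconnects(seq):
    """Return (term, criterion, level) records that precede introduction."""
    first_intro = {}
    for term, pc, level in seq:
        if level == "I" and (pc not in first_intro or term < first_intro[pc]):
            first_intro[pc] = term
    return [(term, pc, level) for term, pc, level in seq
            if level in ("R", "E") and term < first_intro.get(pc, float("inf"))]
```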

Page 35: Transforming Assessment in Education


Page 36: Transforming Assessment in Education

Additional Refinements
• Translate levels into Bloom's Taxonomy
– Reinforce the developmental aspects of learning
• Place additional resources into key courses
– Many performance criteria
– Transitional timing
• Begin mapping assessments into the curriculum
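One way to perform the translation is to align each performance level with a band of Bloom's taxonomy. The particular alignment below is an illustrative assumption, not one prescribed by the talk:

```python
# Sketch: translating performance levels into bands of Bloom's taxonomy.
# This specific alignment is an illustrative assumption.
BLOOM = ["Remember", "Understand", "Apply", "Analyze", "Evaluate", "Create"]

LEVEL_TO_BLOOM = {
    "Introduce": ["Remember", "Understand"],
    "Reinforce": ["Apply", "Analyze"],
    "Emphasize": ["Evaluate", "Create"],
}

def bloom_band(level):
    """Bloom levels (assumed alignment) for a performance level."""
    return LEVEL_TO_BLOOM[level]
```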

Page 37: Transforming Assessment in Education

Presentation Outline

► Creating and Sustaining Change: Assessment of Student Learning Outcomes – 15 minutes

► Small Group Activity and Discussion – 15 minutes

► An Iterative Mapping Strategy for Improved Curriculum Design and Assessment – 15 minutes

► Break – AEFIS 3.0 Demo – 15 minutes

► Learning Analytics: Targeting Instruction, Curricula and Student Support – 15 minutes

► Drexel EduApps: Freeing Faculty for Innovative Teaching – 15 minutes

► Instructional Decision Support Systems: A New Approach to Integrating Assessment, Teaching and Learning – 15 minutes

► Question and Answer Discussion – 15 minutes



Page 39: Transforming Assessment in Education

EISTA 2010: The 8th International Conference on Education and Information Systems, Technologies and Applications

Orlando, Florida, USA

June 29 – July 2, 2010

Learning Analytics: Targeting Instruction, Curricula and Student Support

Craig Bach


Drexel University, Philadelphia, PA
Office of the Provost

Page 40: Transforming Assessment in Education

INTRODUCTION
Common Applications of Analytics
- Insurance industry
- Pharmaceutical industry
- Credit and financial services
- Academic (e.g., enrollment)

How do we apply these techniques to improve and inform learning?

Definition
By learning analytics we mean the application of advanced statistical modeling to target instructional, curricular, learning, and advising actions in support of achieving specific learning goals.

Learning analytics provides actionable insight from data.


Page 41: Transforming Assessment in Education

THE PROBLEM
We are inundated by data…

- Test scores (e.g., ACT, SAT, CLA, MAPP)
- Class and project grades
- Demographic, psychographic, bio data
- Learning styles, characteristics or preferences data
- LMS/CMS activity data
- Survey data (e.g., CIRP, NSSE)

How do we select the most salient data that can inform how we support student learning?
How do we integrate data from across the institution in support of student learning?


Page 42: Transforming Assessment in Education

ACADEMIC ANALYTICS


Page 43: Transforming Assessment in Education

APPLICATIONS TO LEARNING
Possible Applications
- Predicting outcome achievement
- Curricular sequencing
- Complexity index
- Prioritizing learning outcomes
- Setting course and instructional policies
- Defining academic quality
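As a concrete illustration of the first application, predicting outcome achievement can be framed as a classification problem. The sketch below hard-codes a logistic model with invented feature names, weights and bias; a real model would be fit to institutional data (e.g., with scikit-learn's `LogisticRegression`):

```python
import math

# Sketch: predicting outcome achievement with a logistic model.
# Feature names, weights and bias are invented for illustration.
WEIGHTS = {"gpa": 1.2, "lms_logins_per_week": 0.15, "prereq_grade": 0.9}
BIAS = -4.0

def p_achieves_outcome(student):
    """Probability (via the logistic link) that the student achieves the outcome."""
    z = BIAS + sum(w * student[k] for k, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

def flag_for_support(students, threshold=0.5):
    """IDs of students whose predicted probability falls below the threshold."""
    return [s["id"] for s in students if p_achieves_outcome(s) < threshold]
```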


Page 44: Transforming Assessment in Education

LIMITATIONS
1. Quality of data

Garbage In => Garbage Out

2. Communication of Data (dashboarding)

3. Appropriateness of the assumptions of the analysis to learning data

4. Potential for misuse


Page 45: Transforming Assessment in Education

ETHICAL CONSIDERATIONS
► What data is appropriate (legal) to collect about students? What data is inappropriate?
► Who should be able to access the data and view results? Which data should be reported anonymously? Which can be tagged to students for educational purposes?
► What is the impact of showing faculty modeling results? Do any of the data bias faculty instruction and evaluation of students?


Page 46: Transforming Assessment in Education

Group Discussion
Convergence in Higher Education
- The increasing focus on applying analytics to problems in higher education and the attention on learning outcomes assessment provide us with a unique opportunity to support student learning in new ways.

Discussion Points:

- How can analytic tools be used to address the most pressing instructional, curricular or operational problems your faculty confront?

- What are the benefits and challenges of developing a learning analytics program?


Page 47: Transforming Assessment in Education

Presentation Outline

► Creating and Sustaining Change: Assessment of Student Learning Outcomes – 15 minutes

► Small Group Activity and Discussion – 15 minutes

► An Iterative Mapping Strategy for Improved Curriculum Design and Assessment – 15 minutes

► Break – 15 minutes

► Learning Analytics: Targeting Instruction, Curricula and Student Support – 15 minutes

► Drexel EduApps: Freeing Faculty for Innovative Teaching – 15 minutes

► Instructional Decision Support Systems: A New Approach to Integrating Assessment, Teaching and Learning – 15 minutes

► Question and Answer Discussion – 15 minutes


Page 48: Transforming Assessment in Education

EISTA 2010: The 8th International Conference on Education and Information Systems, Technologies and Applications

Orlando, Florida, USA

June 29 – July 2, 2010

EduApps: Freeing Faculty for Innovative Teaching

Craig Bach¹, Donald McEachron²


Drexel University, Philadelphia, PA
¹Office of the Provost
²School of Biomedical Engineering, Science and Health Systems

Page 49: Transforming Assessment in Education

THE PROBLEM
Definition and Context
There are several barriers to faculty adoption of innovative teaching methodologies:

- Time and effort involved
- Insufficient opportunity and reward
- Lack of demonstrated effectiveness
- Insufficient support and resources
- Significant personnel dependence
- Lack of scalability
- Lack of faculty ownership
- Current educational "apps" not particularly good


How do we overcome these barriers?

Page 50: Transforming Assessment in Education

THE PROBLEM
The Audience: Faculty, in terms of adoption of instructional innovations, can be separated into three groups:

- Usual Suspects: Faculty members who regularly attend professional development activities, attend educational conferences, and use educational research to improve their instructional methodologies

- The Majority: Faculty members with little training in teaching methods who are heavily focused on their research and scholarly activities and have little or no time to seek out and try new educational pedagogies

- Untouchables: Faculty members who will never voluntarily spend time on these issues and grouse at the smallest provocation

The first group is always already in; the third group never will be. Our focus is on supporting the second group!


Page 51: Transforming Assessment in Education

AN EduApps PORTAL

The Approach:

- Modeled after the iPhone approach for distributing applications

- Not necessarily a software application or technological in nature

- Small, transferable and modular

- No significant learning curve required in order for faculty to implement

- No significant additional investment of time and/or resources to implement

- Provides instructional, curricular or operational support


- Responds to a specific educational need

Page 52: Transforming Assessment in Education

AN EduApps PORTAL

Identifying and Prioritizing Problems:
- Focus on faculty-identified problems (may also include students/advisors)

- Build in feedback loops from faculty (and students/advisors?)

- Build early success across broad range of faculty (and students/advisors?)


Page 53: Transforming Assessment in Education

AN EduApps PORTAL
Point-of-Use Integration
Training and support are needed for faculty to understand how to respond to the range of data collected about learner characteristics and course feedback. EduApps can "plug into" specific points of use – educable moments.
- Pre-/Post-Course Surveys: EduApps aligned to questions, specific constructs, or results
- Outcome Assessment Results: EduApps aligned to specific areas of student challenge
- Learning Styles Inventory: EduApps targeted to specific styles
- Institution Learning Goals: EduApps developed to support instruction or evaluation of specific learning goals
- Accreditation Standards: EduApps focused on supporting compliance with specific standards (e.g., syllabus content)

Page 54: Transforming Assessment in Education

EXAMPLE | Clickers
Problem: During a lecture, an instructor wants to determine student understanding in order to pace or redirect the discussion

We have an EduApp for that: Clickers

Main Features:
- Contact information for several faculty with experience using clickers
- One-page setup/implementation instruction set
- Best use, potential pitfalls and suggestions
- Directions to use collected data
- Evaluation and usage data collection tools


Page 55: Transforming Assessment in Education

EXAMPLE | Syllabus
Problem: A program director wants to assure consistent quality of course information while encouraging faculty to re-think the syllabus around student learning outcomes

We have an EduApp for that: The Learning Syllabus

Main Features:
- Syllabus tool (technology) will lead users through syllabus development focused around a set of learning outcomes
- Delineate required, optional, and suggested items
- Support decision points to determine the length of the syllabus and what other related documents are created to supplement it
- Contact information for expert resources
- Evaluation and usage data collection tools

Page 56: Transforming Assessment in Education

Future Directions
Problem: How to Best Utilize EduApps

Solution: EduApps in Context

Associate and Integrate:
Incorporate into Instructional Decision Support System

Provide faculty with EduApps in the Context of a Class or Other Educational Experience

Incorporate into Learning Decision Support System
Provide students with a personalized learning support system

Provide faculty and students with the right information, in the right format, in the right place, at the right time

Page 57: Transforming Assessment in Education

Group Discussion

EduApp Examples:
- What are the most pressing instructional, curricular or operational problems your faculty confront?
- Identify several ways to address these problems that could be implemented using the EduApp model.

Implementation
- What do you see as the most significant roadblocks to implementing an EduApps portal?


Page 58: Transforming Assessment in Education

Presentation OutlinePresentation Outline► Creating and Sustaining Change: Assessment of Student Learning Outcomes – 15 minutes

► Small Group Activity and Discussion 15 minutes► Small Group Activity and Discussion – 15 minutes

► An Iterative Mapping Strategy for Improved Curriculum Design and Assessment – 15 minutes

► Break – 15 minutes

► Learning Analytics: Targeting Instruction, Curricula and Student Support – 15 minutes

► Drexel EduApps: Freeing Faculty for Innovative Teaching – 15 minutes

► Instructional Decision Support Systems: A New Approach

to Integrating Assessment, Teaching and Learning – 15 minutes

► Question and Answer Discussion – 15 minutes

EISTA 2010 | Orlando, Florida, USA

Page 59: Transforming Assessment in Education

EISTA 2010: The 8th International Conference on Education and Information Systems, Technologies and Applications

Orlando, Florida, USA

June 29 – July 2, 2010

Instructional Decision Support Systems: A New Approach to Integrating Assessment, Teaching and Learning

Innovations in Engineering Education, Curriculum, and Infrastructure (IEECI) Funding
Supported by Grant #: NSF 0835985 awarded to D. McEachron

Donald McEachron¹, Antoinette Torres², David Delaine³
Drexel University, Philadelphia, PA
¹School of Biomedical Engineering, Science and Health Systems
²Office of the Provost
³Department of Electrical and Computer Engineering

Page 60: Transforming Assessment in Education

Problem Statement

► "Engineering education must change in light of the changing workforce and demographic needs"

► Much data on student performance and perceptions of courses is recorded, but then where does it go?

► Assessment data is pushed up the administrative hierarchy before returning to where it is needed
– Loss of time
– Data becomes less detailed, loses resolution

How can this loop be closed and essential faculty receive the information when and where it is needed?


Page 61: Transforming Assessment in Education

Proposed Solution

► An evidence-based intervention system is proposed for the guided evolution of engineering education programs.

► The implementation of Instructional Decision Support System (IDSS) approaches will provide rapid feedback of assessment data combined with student characteristics to empower faculty instructors and enhance student learning.

Preliminary data is provided as proof of concept of this approach.


Page 62: Transforming Assessment in Education

Outline

1. Problem Statement
2. Concept Overview
3. Proposed Solutions
4. Current Studies
5. Conclusion


Page 63: Transforming Assessment in Education

Our Approach

1. Employ a dynamic view of learning and teaching styles, in which the characteristics of students and faculty are periodically measured to establish an assessment process calibration.

2. Use knowledge management systems to handle voluminous data collection and analysis in an efficient and flexible manner.

3. Use a modular design of an established assessment paradigm that provides points of real‐time intervention to responsively optimize educational practices


Page 64: Transforming Assessment in Education

Instructional Decision Support System

The potential of a web-based knowledge management system that promotes personalized learning is investigated.

It provides rapid feedback of assessment data, combined with student characteristics, to empower faculty instructors and enhance student learning.

By analogy, Dr. Robert Hayward of the Centre for Health Evidence notes that in medical practice, clinical decision support systems (CDSS) “link health observations with health knowledge to influence health choices by clinicians for improved health care”.


Page 65: Transforming Assessment in Education

Knowledge Management

1. Develop and implement an information system for the collection and analysis of student and faculty instructor characteristics.

2. Develop and implement an information system for the collection and analysis of course and curricular characteristics.

3. Develop and implement an information system for the collection and analysis of student performance.

4. Develop and implement a method for instructional support that ensures these data are used to enhance student learning.

This information must be collected without overburdening the users with data, and delivered in context for maximum usability.


Page 66: Transforming Assessment in Education

Outline

• Problem Statement
• Concept Overview
• Proposed Solutions
• Current Studies
• Conclusion


Page 67: Transforming Assessment in Education

What is an IDSS?

An interactive, computer-based information system that links student characteristics, student performance, instructor characteristics, learning outcomes, and instructional methods to inform faculty decisions on the appropriate educational pedagogy to improve student learning.
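As a concrete illustration only – the names, rules, and thresholds below are hypothetical, not the AEFIS implementation – the linkage this definition describes can be sketched as a rule-based mapping from a cohort profile to suggested pedagogy:

```python
# Hypothetical sketch of the IDSS linkage: student characteristics and
# performance are combined to suggest instructional methods.
# All field names, rules and thresholds here are illustrative.
from dataclasses import dataclass

@dataclass
class CohortProfile:
    pct_visual: float        # share of visual (vs. verbal) learners
    pct_sequential: float    # share of sequential (vs. global) learners
    mean_performance: float  # mean score on course performance criteria

def suggest_methods(profile: CohortProfile) -> list[str]:
    """Map a cohort profile to candidate instructional methods."""
    suggestions = []
    if profile.pct_visual > 0.5:
        suggestions.append("add diagrams, flowcharts and demonstrations")
    if profile.pct_sequential > 0.5:
        suggestions.append("present material in small, ordered steps")
    else:
        suggestions.append("open each topic with the big picture")
    if profile.mean_performance < 2.0:
        suggestions.append("schedule an early review of prerequisites")
    return suggestions

# A mostly visual, mostly global cohort performing adequately:
print(suggest_methods(CohortProfile(0.9, 0.38, 2.5)))
```

In a real deployment the rules would come from the assessment literature and from the correlations the system itself accumulates, rather than being hard-coded.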


Page 68: Transforming Assessment in Education


Page 69: Transforming Assessment in Education

What is AEFIS?

Academic Evaluation, Feedback and Intervention System (AEFIS) is the web-based academic assessment management solution that automates best practices in assessment and evaluation in order to enhance curriculum development and streamline the accreditation process.


Page 70: Transforming Assessment in Education

IDSS Approach (Active AEFIS Research/Development)

Instructional Decision Support System (IDSS) features on the AEFIS Solution Platform:

► Incoming Student/Course Profile (ISCP)
► Course Rationale and History Profile (CRHP)
► Evaluation Results, Notes and Recommendations (ERNR)
…and other features and functionality to be added to the IDSS implementation.

[Diagram: program course syllabi – history and details, course objectives, learning outcomes, and performance criteria – feed direct assessment and evaluation feedback, student/faculty/alumni feedback, evaluations and surveys, meeting minutes and documentation, and recommendations. See www.goaefis.com/]

Page 71: Transforming Assessment in Education

Avoiding Data Overload

Three standard reports are presented to each faculty instructor prior to the beginning of any term in which that instructor is teaching – a useful data “snapshot” to facilitate instructional decisions without requiring significant additional effort:

1. Incoming Student/Course Profile (ISCP)
2. Course Rationale and History Profile (CRHP)
3. Evaluation Results, Notes and Recommendations (ERNR)
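One possible shape for such a pre-term snapshot is sketched below; the field names are hypothetical illustrations, not the actual AEFIS schema.

```python
# Illustrative sketch only: a possible shape for the three pre-term reports
# bundled into a single instructor "snapshot". Field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ISCP:
    """Incoming Student/Course Profile."""
    learning_styles: dict[str, float]  # e.g. {"visual": 0.9, "verbal": 0.1}
    performance: dict[str, float]      # performance metric -> achievement

@dataclass
class CRHP:
    """Course Rationale and History Profile."""
    curriculum_role: str
    outcomes: list[str]

@dataclass
class ERNR:
    """Evaluation Results, Notes and Recommendations."""
    summary: str
    recommendations: list[str] = field(default_factory=list)

@dataclass
class TermSnapshot:
    """The single snapshot handed to an instructor before the term."""
    iscp: ISCP
    crhp: CRHP
    ernr: ERNR

snapshot = TermSnapshot(
    iscp=ISCP({"visual": 0.9, "verbal": 0.1}, {"PC1": 2.8, "PC2": 3.1}),
    crhp=CRHP("Core sophomore course", ["Apply signal analysis to biosignals"]),
    ernr=ERNR("PC1 scores were weak last year",
              ["Add a worked example before the first lab"]),
)
print(snapshot.iscp.learning_styles["visual"])  # -> 0.9
```

Bundling the three reports into one object mirrors the “snapshot” idea: everything the instructor needs arrives together, in context.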

Page 72: Transforming Assessment in Education

IDSS Format
► Incoming Student/Course Profile (ISCP)

I. Relevant student characteristics (learning styles, course load, work load, lifestyle, etc.)
II. Current performance – achievement on performance metrics related to the course materials
III. Suggestions for instructional approaches
  I. Definitions of terms (what is meant by global or visual learning, etc.)
  II. Links to possible instructional approaches for students with such characteristics
IV. Clear, simple format with links to additional information


Page 73: Transforming Assessment in Education

IDSS Format
► Course Rationale and History Profile (CRHP)

I. How does the course fit into the program curriculum?
  I. What performance criteria and/or student learning outcomes are associated with the course?
  II. What educational experiences came before this class? What can students expect to encounter afterwards?
II. What is the value of the course?
  I. How does learning this material and/or skill set facilitate program goals?
  II. How does learning this material and/or skill set facilitate student goals?
    I. Employment
    II. Professional advancement

Page 74: Transforming Assessment in Education

IDSS Format
► Evaluation Results, Notes and Recommendations (ERNR)

I. Summarize assessment data for this course
  I. Student/instructor observations/opinions/insights
  II. Any direct measures of performance for previous students
    I. Grouped data
    II. Correlations with student characteristics and instructional approaches
II. Recommendations
  I. Archival recommendations (searchable)
  II. Current recommendations from the latest assessment and evaluation

Page 75: Transforming Assessment in Education

Prototype IDSS Summary Report (Active AEFIS Research/Development)

[Charts: incoming learning-style distribution – 90% visual vs. 10% verbal, 70% global vs. 30% sequential, 60% active vs. 40% reflective, 60% sensing vs. 40% intuitive; number of students (0–25) and average GPA by concentration – A: 2.9, B: 3.2, C: 3.4, D: 3.4; performance levels (0–4) on performance criteria PC 1–PC 4.]

Page 76: Transforming Assessment in Education

Outline

• Problem Statement
• Concept Overview
• Proposed Solutions
• Current Studies
• Conclusion


Page 77: Transforming Assessment in Education

Current Studies

► Drexel University School of Biomedical Engineering, Science and Health Systems students are being used as the test case.

► Data are collected from students at three points within the curriculum: 1) Freshman, 2) Pre-Junior, and 3) Senior years.

► Surveys:
  Inventory of Learning Styles
  Myers-Briggs Personality Inventory
  Student Developmental Task and Lifestyle Inventory
  Multiple Intelligence Inventory
  Perspectives and Motivation Inventory
  Student Lifestyle Impact Survey

Page 78: Transforming Assessment in Education

Current Studies

► Data collected from 150+ students over the past academic year.

► Survey data collected are initially sorted by academic level and gender.

► Continuously obtain valuable insight into the student body’s:
  Personality and characteristics as a whole
  Characteristics within particular demographics
  Characteristics for an individual class or course
  Characteristics on an individual student basis

► Provides the ability to precisely modify teaching methodologies to best fit the student body and maximize learning outcomes.
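The first sorting step described above – tallying survey responses by academic level and gender, then drilling into one demographic slice – can be sketched in a few lines; the records and field names here are hypothetical examples, not the study’s actual data.

```python
# Sketch of sorting survey responses by academic level and gender.
# The records and field names are hypothetical, for illustration only.
from collections import Counter

responses = [
    {"level": "freshman", "gender": "F", "style": "visual"},
    {"level": "freshman", "gender": "M", "style": "visual"},
    {"level": "senior",   "gender": "F", "style": "verbal"},
    {"level": "senior",   "gender": "F", "style": "visual"},
]

# Tally respondents per (academic level, gender) group:
by_group = Counter((r["level"], r["gender"]) for r in responses)
print(by_group[("senior", "F")])  # -> 2

# Share of visual learners within one demographic slice:
senior_f = [r for r in responses
            if (r["level"], r["gender"]) == ("senior", "F")]
pct_visual = sum(r["style"] == "visual" for r in senior_f) / len(senior_f)
print(pct_visual)  # -> 0.5
```

The same group-then-aggregate pattern scales from the whole student body down to a single class or student, matching the four levels of insight listed above.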


Page 79: Transforming Assessment in Education

Index of Learning Styles

[Pie charts of student learning-style preferences:
  Active Learner 40.73% vs. Reflective Learner 59.27%
  Visual Learner 88.90% vs. Verbal Learner 11.10%
  Sensing Learner 62.10% vs. Intuitive Learner 37.90%
  Sequential Learner 40.80% vs. Global Learner 59.20%]

Page 80: Transforming Assessment in Education

Multiple Intelligence Survey

[Bar chart of average scores: Logical-Mathematical 7.05, Interpersonal 5.54, Bodily-Kinesthetic 5.52, Intrapersonal 5.30, Musical 5.28, Spatial 4.79, Linguistic 3.96, Naturalistic 3.47.]

Page 81: Transforming Assessment in Education

Rokeach Survey

[Bar chart of Rokeach Value Survey results, scale 0–16.]

Page 82: Transforming Assessment in Education

Myers-Briggs Analysis

[Scatter plots, male vs. female, by age (17–27): educational goal level vs. age, and student satisfaction level vs. age.]

Page 83: Transforming Assessment in Education

Additional Surveys

[Radar chart of Student Developmental Task and Lifestyle Inventory scores (scale 0–6) for Freshman, Pre-Junior, and Senior students across dimensions including Salubrious Lifestyle, Mature Interpersonal Relationships, Peer Relationships, Tolerance, Academic Autonomy, Instrumental Autonomy, Developing Autonomy, Emotional Autonomy, Interdependence, Educational Involvement, Cultural Participation, Career Planning, and Lifestyle Planning; bar chart of Engineering Perspectives scores (values 4.63, 5.30, 5.60, 5.70).]

Page 84: Transforming Assessment in Education

Results in Action

► Students prefer having material presented as a series of small steps from which they will derive an overall understanding – a kind of ‘bottom-up’ method.

► Dr. Papazoglou is a global thinker and was using a ‘big picture’ approach in her instructional delivery – a type of ‘top-down’ methodology.

► Having the information about the specific learning styles of her students in time to adjust her instructional delivery enabled Dr. Papazoglou to enhance those students’ educational experience.

[Pie chart: Sequential Learner vs. Global Learner, 62.10% / 37.90%. Survey results – Course Poorly Organized: Global 42%, Sequential 1 26%, Sequential 2 10%; Ability to Follow Course: Global 35%, Sequential 1 N/A, Sequential 2 N/A.]
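The check behind this anecdote – comparing a class’s sequential/global balance with the instructor’s own style and flagging a mismatch – could be expressed as a simple rule; the function and threshold below are a hypothetical sketch, not an AEFIS feature.

```python
# Hypothetical sketch: flag a mismatch between the class's dominant
# sequential/global balance and the instructor's own thinking style.
def delivery_advice(pct_sequential: float, instructor_style: str) -> str:
    """Compare class learning-style majority with instructor style."""
    class_style = "sequential" if pct_sequential >= 0.5 else "global"
    if class_style != instructor_style:
        return (f"Mismatch: class is mostly {class_style}, instructor is "
                f"{instructor_style}; consider adjusting delivery.")
    return "Styles aligned; no change suggested."

# A mostly sequential class taught by a global-thinking instructor:
print(delivery_advice(0.62, "global"))
```

Delivered before the term starts (as part of the ISCP), such a flag gives the instructor time to adjust, which is exactly what happened in the case above.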

Page 85: Transforming Assessment in Education

Outline

• Problem Statement
• Concept Overview
• Proposed Solutions
• Current Studies
• Conclusion


Page 86: Transforming Assessment in Education

Future Work

► Future studies will focus on other student characteristics, the interactions of these characteristics with faculty instructor teaching styles, and the effect these interactions have on metrics of student performance across the entire curriculum.

►Faculty Characteristics

►Repetition and increased analysis of survey data

► Learning Decision Support System


Page 87: Transforming Assessment in Education

Conclusions

► Data collection is actually not a significant issue in assessment. There are many methods and techniques available for the collection and storage of assessment results.

► The real problem is getting the right data to the right people, in the right format, and at the right time.

►It is also important to build into any such system the flexibility to adapt to new circumstances. 


Page 88: Transforming Assessment in Education

Conclusions

► The use of an integrated KM platform has demonstrated the capability to manage information and deliver real-time data to all user groups appropriately.

► Through the development of faculty-friendly IDSS structures, this work can lead to enhanced student learning, continuous quality improvement, and the necessary validation to support accreditation.

► The system is being implemented to provide a continuous and ongoing process of data collection, analysis, use, and evaluation, so that as the student body changes – and their needs and the needs of society change – instructional delivery can adapt.

Page 89: Transforming Assessment in Education

References

Armstrong, T. (2000). Multiple Intelligences in the Classroom, 2nd Edition. Association for Supervision and Curriculum Development.

Anderson, L. and Krathwohl, D. (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. New York: Addison Wesley Longman.

Borg, M. and Stranahan, H. (2002). Personality type and student performance in upper-level economics courses: The importance of race and gender. Journal of Economic Education, 33: 3-14.

Cole, J. and Denzine, G. (2004). “I’m not doing as well in this class as I’d like to”: Exploring achievement, motivation and personality. Journal of College Reading and Learning, 34: 29‐44.

Diaz‐Lefebvre, R. (2004). Multiple intelligences, learning for understanding, and creative assessment: Some pieces to the puzzle of learning. Teachers College Record, 106: 49‐57.

DiMuro, P. and Terry, M. (2007).  A matter of style: Applying Kolb’s learning style model to college mathematics teaching practices. Journal of College Reading and Learning, 38: 53‐60.

Dunn, R. and Stevenson, J. (1997). Teaching diverse college students to study with a learning‐styles prescription. College Student Journal, 31: 333‐339.

Entwistle, N. and McCune, V. (2004). The conceptual basis of study strategy inventories. Educational Psychology Review, 16: 325-345.

Felder, R. and Spurlin, J. (2005). Applications, reliability and validity of the index of learning styles. International Journal of Engineering Education, 21: 103-112.


Page 90: Transforming Assessment in Education

References

Felder, R. (1995). A longitudinal study of engineering student performance and retention. IV. Instructional methods and student responses to them. Journal of Engineering Education, 84: 361-367.

Felder, R. and Silverman, L. (1988). Learning and teaching styles in engineering education. Engineering Education, 78: 674‐681.

Felder, R., Felder, G., and Dietz, E.J. (1998). A longitudinal study of engineering student performance and retention: V. Comparisons with traditionally-taught students. Journal of Engineering Education, 87: 469-480.

Felder, R.M., Felder, G. and Dietz, E.J. (2002). The effects of personality type on engineering student performance and attitudes. Journal of Engineering Education, 91: 3-17.

Gardner, H. (1999). Intelligence Reframed: Multiple Intelligences for the 21st Century. New York: Basic Books.

Graff, M. (2003). Learning from web‐based instructional systems and cognitive style. British Journal of Educational Technology, 34: 407‐418.

Information Builders. (2010). Decision Support Systems – DSS (definition). Downloaded from http://www.informationbuilders.com/decision‐support‐systems‐dss.html

Ishiyama, J. (2005). The structure of an undergraduate major and student learning: A cross‐institutional study of political science programs at thirty‐two colleges and universities. The Social Science Journal, 42: 359‐366.

Kunzman, R. (2002). Extracurricular activities: Learning from the margin to rethink the whole. Knowledge Quest, 30: 22-25.

Li, O., McCoach, B., Swaminathan, H. and Tang, J. (2008). Development of an instrument to measure perspectives of engineering education among college students. Journal of Engineering Education, 97: 47‐56.

Litzinger, T., Lee, S.H., Wise, J., and Felder, R. (2007). A psychometric study of the index of learning styles. Journal of Engineering Education, 96: 309-319.

Page 91: Transforming Assessment in Education

References

Martin, G.P. (2000). Maximizing multiple intelligences through multimedia: A real application of Gardner’s theories. Multimedia Schools, 7: 28-33.

McCoog, I.J. (2007). Integrated instruction: Multiple intelligences and technology. The Clearing House, 81: 25‐28.

National Science Board (2007). Moving Forward to Improve Engineering Education. NSB‐07‐122 (November 19, 2007).

Noble, T. (2004). Integrating the revised Bloom’s taxonomy with multiple intelligences: A planning tool for curriculum differentiation. Teachers College Record, 106: 193‐211.

Raven, M., Cano, J., Carton, B. and Shelhamer, V. (1993). A comparison of learning styles, teaching styles and personality styles of preservice Montana and Ohio agriculture teachers. Journal of Agricultural Education, 34: 40‐50.

Terry, M. (2001). Translating learning style theory into university teaching practices: An article based on Kolb’s experiential learning model. Journal of College Reading and Learning, 32: 68‐85.

Trianatafillou, E., Pomportsis, A., Demetriadis, S. and Georgiadou, E. (2004). The value of adaptivity based upon cognitive style: An empirical study. British Journal of Educational Technology, 35: 95‐106.

Van der Hulst, M. and Jansen, E. (2002). Effects of curriculum organization on study progress in engineering studies. Higher Education, 43: 489-506.

White, J., Shiffman, R., Middleton, B. and Caban, T.Z. (2008). A National Web Conference on Using Clinical Decision Support to Make Informed Patient Care Decisions. Downloaded from http://healthit.ahrq.gov/images/sep08cdswebconference/textonly/index.html

Winston, R.B. (1990). The student developmental task and lifestyle inventory: An approach to measuring students’ psychosocial development. Journal of College Student Development, 31: 108-120.

Page 92: Transforming Assessment in Education

Presentation Outline

► Creating and Sustaining Change: Assessment of Student Learning Outcomes – 15 minutes

► Small Group Activity and Discussion – 15 minutes

► An Iterative Mapping Strategy for Improved Curriculum Design and Assessment – 15 minutes

► Break – 15 minutes

► Learning Analytics: Targeting Instruction, Curricula and Student Support – 15 minutes

► Drexel EduApps: Freeing Faculty for Innovative Teaching – 15 minutes

► Instructional Decision Support Systems: A New Approach to Integrating Assessment, Teaching and Learning – 15 minutes

► Question and Answer Discussion – 15 minutes


Page 93: Transforming Assessment in Education

Thank You!

► Thank you for joining us!
► Please see us if you have any questions

► Interested in getting involved?
  – AEFIS is currently seeking 3.0 Partner Program Participants
  – Learn more at www.goAEFIS.com/partner
  – Contact us at [email protected]
