
Transcript: Jim Julius, SDSU Course Design Institute, May 27, 2009.

Page 1:

Jim Julius
SDSU Course Design Institute
May 27, 2009

Page 2:

Guiding Questions

Why collect formative feedback on course design?

How should one decide what kind of feedback to seek?

What tools are available to collect feedback?

What do I do with the data?

Page 3:

What (and why) are you measuring?

Page 4:

What (and why) are you measuring?

Outcomes: tell you what you got, not how or why

Inputs
Processes

Seeking continuous improvement
Approaching course design from an inquiry mindset

Page 5:

Outcomes

Satisfaction
Retention
Success
Achievement
External proficiencies
Real-world performance

Page 6:

Inputs

Learner characteristics
Context
Design
Learning resources
Faculty development

Page 7:

Processes

Pedagogies
Presentation media
Assignments/assessments
Student use of technologies
Community of Inquiry model (social, cognitive, teaching presence)
Interactions (content, peers, instructor, technology itself)

Page 8:

Community of Inquiry Model

Page 9:

CoI - Interactions

Page 10:

Narrowing Your Inquiry

Do you want to evaluate your course according to “best practices”, i.e. standard course design quality criteria?

Do you want to know more about your learners in general: needs, preferences, motivation, satisfaction?

Do you want to focus on student achievement?

Do you want feedback on your facilitation of learning?

Do you want feedback on specific course elements and/or technologies?

Page 11:

Course Design Quality Criteria

Chico rubric
Quality Matters
Related to Chickering and Gamson’s “7 Principles for Good Practice in Undergraduate Education”:
  From Indiana University, 2001
  From VCU, 2009
Paid tool: Flashlight

Page 12:

Learning about Learners

Direct:
Learning styles surveys
Parallel faculty-student surveys
ELI – student and faculty
SDSU’s LRS faculty and student surveys, adapted from LITRE (NC State)
Distance Education Learning Environment faculty and student surveys

Indirect:
National and institutional data (aggregate)
Institutional data (for your learners)
LMS data

Page 13:

Student Achievement

Direct:
Low-stakes: muddiest point, minute papers, clickers, discussion boards
Pre- and post-tests (see the sketch after this list)

Indirect:
Grade data
Attendance/participation
Outcome comparisons (different technology/pedagogy and same outcome, or same technology/pedagogy and different outcomes)
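As an illustration of the pre-/post-test idea above, a paired t-test is one standard way to summarize gains. A minimal sketch in Python, with invented scores (scipy assumed available; this is not the method described in the presentation):

```python
# Minimal sketch of a pre-/post-test comparison (hypothetical scores).
from scipy import stats

pre = [52, 61, 48, 70, 55, 66, 59, 63]   # pre-test scores (invented)
post = [68, 74, 60, 82, 71, 79, 70, 77]  # same students after instruction

# Paired t-test: did the same students' scores change significantly?
t, p = stats.ttest_rel(post, pre)
mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"mean gain = {mean_gain:.1f} points, t = {t:.2f}, p = {p:.4f}")
```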

Page 14:

Teacher Behaviors/Overall

Direct:
Community of Inquiry Survey
Small Group Analysis
Mid-semester surveys
End of course evaluations
Assessing online facilitation
Paid: IDEA survey of student ratings of instruction

Indirect:
Observation Protocols

Page 15:

Course Elements

Direct:
Student Assessment of Learning Gains (SALG)
Clicker opinions survey

Indirect:
Examine usage data from Blackboard (see the sketch after this list)
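As one way to act on the Blackboard item above, the sketch below tallies tool activity from a hypothetical LMS activity export. The file name and column names ("tool", "clicks") are assumptions; real Blackboard exports vary by institution and version:

```python
# Hypothetical sketch: summarize tool usage from an LMS activity export.
import csv
from collections import Counter

tool_use = Counter()
with open("bb_activity_export.csv", newline="") as f:  # assumed file name
    for row in csv.DictReader(f):                      # assumed columns
        tool_use[row["tool"]] += int(row["clicks"])

for tool, clicks in tool_use.most_common():
    print(f"{tool}: {clicks} total clicks")
```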

Page 16:

Data from M. Laumakis

pICT fellow in 2005
Began teaching parallel 500-student sections of PSYCH 101 in 2006, one traditional and one hybrid
First fully online PSYCH 101, Summer 2008

Page 17:

Evaluating the Face-to-Face Class

Evaluated Fall 2005 innovations via the Student Assessment of Learning Gains (SALG)
How much did the following aspects of the class help your learning?
Rated from 1 (no help) to 5 (great help)

Page 18:

Evaluating the Face-to-Face Class

What did the data show?

Question                 MWF Section   TTH Section
ConceptCheck Questions   4.1           4.1
Discussion Boards        2.9           3.1

Page 19:

Evaluation Findings: IDEA Diagnostic Survey

Page 20:

Evaluation Findings: IDEA Diagnostic Survey

                         Fall 2006   Fall 2006     Spring 2007   Spring 2007
                         Blended     Traditional   Blended       Traditional
Progress on objectives   70          73            77            77
Excellent teacher        65          68            69            68
Excellent course         62          72            73            71

Note: Top 10% = 63 or more

Page 21:

Evaluation Findings: Departmental Course Evaluations

Page 22:

Evaluation Findings: Course Grades, Fall 2007

[Bar chart “Fall 2007 Course Grades”: % in Category (0-40) by Grade (A-F) for the Blended and Traditional sections. Reported percentages, one series: 12.8, 15, 34.6, 35.8, 3.9; the other: 15, 12.1, 33.1, 31, 8.9.]
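A distributional comparison like this chart can also be tested formally; a chi-square test on raw grade counts is one common choice. A minimal sketch in Python, with invented counts (the chart reports percentages, so actual enrollment counts would be needed; scipy assumed available):

```python
# Sketch: chi-square test comparing two sections' grade distributions.
# Counts are invented placeholders, not the slide's data.
from scipy.stats import chi2_contingency

#              A    B    C    D   F
blended     = [45, 160, 170, 60, 20]
traditional = [40, 150, 165, 65, 40]

chi2, p, dof, expected = chi2_contingency([blended, traditional])
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```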

Page 23:

Clicker Data: Spring 2007

Percentage of students agreeing or strongly agreeing:

Class clicker usage makes me more likely to attend class: 93%
Class clicker usage helps me to feel more involved in class: 84%
Class clicker usage makes it more likely for me to respond to a question from the professor: 91%
I understand why my professor is using clickers in this course: 90%
My professor asks clicker questions which are important to my learning: 90%

Page 24:

Summer 2008 Fully Online: SALG Data

How much did the following aspects of the class help your learning?
Rated from 1 (no help) to 5 (great help)

Page 25:

Summer 2008 Fully Online: SALG Data

Question                                                      Summer 2008 Online
Taking the test online                                        4.27
Discussion Forums                                             3.00
Introduction e-mail that explained the basics of the course   4.50

Page 26:

SALG Data over time

Question                                       Fall 2007   Fall 2007   Spring 2008   Spring 2008   Summer 2008
                                               Blended     F2F         Blended       F2F           Online
Questions, answers, and discussions in class   3.96        4.04        4.10          4.01          4.36
Live online class sessions                     3.39        n/a         4.20          n/a           4.15
Archives of live online class sessions         4.15        n/a         4.50          n/a           4.44
Quality of contact with the teacher            3.41        3.48        3.94          3.90          4.26
Working with peers outside of class/online     3.12        3.22        3.31          3.39          3.82

Page 27:

Summer 2008: Community of Inquiry Survey

Statements rated from 1 (strongly disagree) to 5 (strongly agree)
Based on the Community of Inquiry framework’s three elements:
1. Social Presence
2. Cognitive Presence
3. Teaching Presence

Page 28:

Summer 2008: Community of Inquiry Survey

CoI Dimension               Student Rating
Social Presence             3.94
  Affective Expression      3.56
  Open Communication        4.29
  Group Cohesion            3.97
Cognitive Presence          3.96
  Triggering Event          3.91
  Exploration               3.73
  Integration               4.09
  Resolution                4.10
Teaching Presence           4.38
  Design and Organization   4.50
  Facilitation              4.38
  Direct Instruction        4.23
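Dimension scores like those in the table are typically simple means of the 1-5 item ratings grouped by dimension. A minimal sketch in Python, with invented items and ratings (the grouping below is illustrative only, not the actual CoI survey's item map):

```python
# Sketch: average survey items into CoI dimension scores.
# Item names, ratings, and groupings are invented for illustration.
from statistics import mean

responses = {  # item -> list of 1-5 ratings from students (invented)
    "affective_1": [4, 3, 4], "affective_2": [3, 4, 3],
    "open_comm_1": [5, 4, 4], "cohesion_1": [4, 4, 4],
}
dimensions = {
    "Affective Expression": ["affective_1", "affective_2"],
    "Open Communication": ["open_comm_1"],
    "Group Cohesion": ["cohesion_1"],
}
for dim, items in dimensions.items():
    ratings = [r for item in items for r in responses[item]]
    print(f"{dim}: {mean(ratings):.2f}")
```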

Page 29:

So, what would you like to further explore?