
Assessing the Impacts of a Highly Interactive Planetary Science Elective Course for Senior Science Students

Francis Jones and Catherine Johnson
Dept. of Earth and Ocean Sciences

Outline

A. Purpose of upper level electives

B. Assessing the target with diagnostics

C. Impacts of seven interactive strategies

D. Impact on instructors

E. Lessons learned and adjustments made


A. Purpose of upper level electives

• Diverse student backgrounds and intentions
• Limited options for studying any discipline in depth
• BUT … senior science students should be gaining maturity in "generic scientific skills".

HENCE: Learning goals sound like …
– Emulate the thinking of scientists
– Relate models to observations or data
– Develop and articulate ideas (hypotheses) about how …
– Pose a clear question, research the current state of the art, and communicate / debate findings in a scholarly manner.


B. Assessing the target: diagnostics

• What collective foundation do we have?
– Mass / density, Newton's laws, seasons / tides, geoscience / maps
• Results help …
– Students target catch-up study time using materials provided.
– Instructors can anticipate specific challenges.

[Tables: frequency counts of students' choices (options A–E) on six multiple-choice questions per topic. Mass: everyone is OK. Density: one misconception. Newton's law of gravitation: problems here.]
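As an aside, a minimal sketch (Python; hypothetical data layout, not the authors' actual tooling) of how such frequency counts of answer choices can be tallied:

```python
from collections import Counter

# Hypothetical layout: each student's diagnostic answers, one choice (A-E)
# per multiple-choice question (the diagnostics used 6 questions per topic).
responses = [
    {"Q1": "A", "Q2": "C", "Q3": "B", "Q4": "D", "Q5": "A", "Q6": "C"},
    {"Q1": "A", "Q2": "B", "Q3": "B", "Q4": "D", "Q5": "E", "Q6": "C"},
    # ... one dict per student
]

# Tally how often each option was chosen for each question.
questions = sorted({q for r in responses for q in r})
for q in questions:
    counts = Counter(r[q] for r in responses if q in r)
    row = "  ".join(f"{opt}: {counts.get(opt, 0):2d}" for opt in "ABCDE")
    print(f"{q}  {row}")
```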


Seven interactive strategies
1. Teams – survey feedback

"Partial" Team-Based Learning*
– Permanent, random team assignments
– Used for quizzing and in-class activities (NOT homework)

* See http://www.teambasedlearning.org/ (Michaelsen & others)

[Figure: Quiz 3, 2010 — quiz scores (0–100%) by team number for 13 teams of 4–6 individuals; team results plotted alongside individual scores.]

1. Teams – survey feedback

Do students “like” teams? Yes … mostly

End-of-term survey questions (N = 59 of 63; scale: 1 = never … 5 = always)

• Team work is more effective than doing the same work alone: 4.17
• Team activities helped improve thinking skills needed for assignments, exams & projects: 4.16
• Doing quizzes individually and in teams was worthwhile: 4.10
• All team members provided balanced contributions: 3.54
• Clicker questions effectively fostered discussion: 4.49

They like CLICKERS even better!

3. Combining classroom strategies

– Teams, i.e. "peer instruction"
– Worksheets (annotate images)
– Clickers for milestones; results help ASSESS impacts
– Whole-class discussions

Students practice:
– High-level thinking
– Evidence-based judgments
– Non-unique outcomes

Detailed sequence in appendix.

[Figure: example of a 20-minute exercise using an annotated worksheet image — identify major regions (labelled A–D) on the basis of crater densities and grayscale.]


4. Three module assignments (1 per module)

Observe, estimate, synthesize, interpret, etc.

• E.g., Assignment 2's goal: Test two hypotheses for Venus' geological history using observations of cratering, volcanism, and tectonism from radar images. Decide which of the hypotheses your observations best support.

• Instructions, "data packets", and worksheets are provided.

• Workloads: 75% of students took 3–9 hrs.

[Figure: Results — histogram of number of students vs. assignment score.]

5. Pre/post tests for modules

Online: "17 practice questions for the midterm". Averaged scores show class gains for all questions.

Types of questions:
• Map use: gains; the lowest scorers improve most.
• Physics: gains, yes, but abilities mixed.
• Interpreting photos: best improvements.

[Figure: post-test vs. pre-test scores per question (gains when above the diagonal); legend: maps (6), image physics (6), image interpretation (5).]
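A sketch of the underlying calculation (Python; scores and question labels are illustrative, not from the talk): class-average pre- and post-test scores per question, where a question "gains" when its post average exceeds its pre average, i.e. the point sits above the diagonal. The normalized (Hake) gain is an added convenience, not something the slide reports.

```python
# Hypothetical per-question fractional scores (0..1), averaged over the class.
# The talk grouped 17 questions into three types:
# maps (6), image physics (6), image interpretation (5).
pre  = {"map1": 0.35, "map2": 0.50, "phys1": 0.60, "interp1": 0.25}
post = {"map1": 0.55, "map2": 0.62, "phys1": 0.68, "interp1": 0.70}

for q in pre:
    gain = post[q] - pre[q]                               # distance above the diagonal
    norm = gain / (1.0 - pre[q]) if pre[q] < 1 else 0.0   # normalized (Hake) gain
    print(f"{q}: pre={pre[q]:.0%}  post={post[q]:.0%}  "
          f"gain={gain:+.0%}  normalized={norm:.2f}")
```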

6. Midterms: Assessing the assessment

Was the exam appropriate? Use item analysis.

Question #         A5   B14   B10   B13   B20
top 50% avg        97    78    93    77    33
bottom 50% avg     93    67    45    38     7
whole class avg    95    73    69    58    20
discrimination      2     8    35    34    66
category (E/H/O)    E     O     O     H     H

Total # of questions: EASY 9, HARD 7, OTHER 17.

Example of 5 questions:
• A5 – easy
• B20 – hard
• B14 – a poor discriminator
• Others are OK.
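A minimal sketch of a top-half/bottom-half item analysis of the kind shown above (Python; the exact discrimination formula the authors used isn't stated, so this illustrates the common upper-minus-lower definition, with made-up scores):

```python
# Hypothetical data: each row is one student's per-question scores (0..100),
# plus the exam total used to rank students into top and bottom halves.
students = [
    {"total": 92, "A5": 100, "B14": 80, "B20": 40},
    {"total": 75, "A5": 100, "B14": 75, "B20": 30},
    {"total": 60, "A5":  90, "B14": 70, "B20": 10},
    {"total": 45, "A5":  90, "B14": 60, "B20":  0},
]

ranked = sorted(students, key=lambda s: s["total"], reverse=True)
half = len(ranked) // 2
top, bottom = ranked[:half], ranked[half:]

def avg(group, q):
    return sum(s[q] for s in group) / len(group)

for q in ("A5", "B14", "B20"):
    t, b = avg(top, q), avg(bottom, q)
    whole = avg(students, q)
    discrim = t - b   # upper-minus-lower discrimination index
    print(f"{q}: top={t:.0f}  bottom={b:.0f}  class={whole:.0f}  discrim={discrim:.0f}")
```

An easy item (like A5) scores high in both halves and discriminates little; a hard item (like B20) scores low overall; a poor discriminator (like B14) separates strong from weak students only weakly.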


7. Posters: peer assessment

[Figure: instructor average vs. peer average grades — instructor versus peer assessments for 25 projects.]

• Peers seem poor at discriminating quality.
– Instructor grading is "lower".
– Peer grading is "narrower".
• Peer assessment evidently needs practice!
• Feedback is positive, except for workloads.
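The "lower"/"narrower" observations amount to comparing the mean and spread of the two sets of grades; a quick sketch (Python; the grades are made up to mimic the pattern on the slide, not the study's data):

```python
from statistics import mean, stdev

# Hypothetical paired grades (%) for the same set of posters.
instructor = [72, 85, 65, 90, 78, 60, 88, 70]
peers      = [80, 84, 78, 86, 82, 76, 85, 79]

for name, grades in (("instructor", instructor), ("peers", peers)):
    print(f"{name}: mean={mean(grades):.1f}  spread (stdev)={stdev(grades):.1f}")
# A lower instructor mean with a larger stdev matches the slide's pattern:
# instructor grading is "lower", peer grading is "narrower".
```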


D. Assessing impact on two instructors

• Instructor 1 (planetary geophysicist)
– Developed the course; taught it for 3 terms.
• Instructor 2 (tectonics / earthquakes geophysicist)
– Taught ~40% of the course once; no prior active-learning experience.
• Assessment purposes
– Assess satisfaction with active strategies in this context.
– Identify challenges of sustaining the course in its present form.
• Assessment methods
– Interview each instructor pre-course and post-course
– Post-course questionnaire
– In-class observations: what worked, plus recommendations
– Instructional support as needed during the term


D. Assessing impact on two instructors: from questionnaires

Messages:
- This term went well.
- BUT the new instructor was "worried" about taking over.

Comments and interviews provide details of needs and concerns.

[Figures: questionnaire ratings (No … Yes, scale 0–5) for Inst1 and Inst2 across three categories — logistics (12), in-class (14), assignments/projects (8); number of questions in brackets. Left: "Did THIS term work well?" Right: "Will NEXT term be easy to run?"]


E. Lessons learned by assessing activities

• Positive:
– Teams
– Assignments
– Module pre/posts
– Active lessons (clickers, worksheets, etc.)
– Projects (peer assessment needs work)
– Some testing is liked
– Surveying is helpful
– Diagnostics

[Figure: time spent on 355 compared to other courses (much more / more / similar / less / much less), as a percentage of the total; Fall (N=221), Spr (N=176).]

• Negative: workloads!
– Students spend more time on this course compared to others.
– Instructors find the complexity daunting.

Conclusions

• Assessing the impacts of strategies helps students by
– providing timely feedback,
– optimizing learning strategies and balancing workloads,
– improving student-instructor and instructor-instructor collegiality.

• Assessing impacts helps instructors
– discover misconceptions, gains, thinking, workloads, motivational challenges, etc.
– identify challenges for sustaining the course

Thanks to …
– Grad student TAs
– Students
– CWSEI & colleagues
– GSA organizers

http://www.eos.ubc.ca/research/cwsei/


Appendix I: Learning Goals from http://www.eos.ubc.ca/courses/eosc355/eosc355.htm

1. Emulate the thinking of specialists when addressing questions or hypotheses by referring to measurements & observations, existing knowledge, and accepted or proposed models.

2. When relating models and observations or data, recognize the relevant assumptions and limitations of both model and data, and recommend observations, further theory or model refinement that might improve the model.

3. Estimate basic whole-body parameters of any object (planet, moon, small body) using relationships between those parameters and data that describe the motions of relevant bodies.

4. Use observable surface features to discuss models of surface age and geological history of a body.

5. Develop and articulate hypotheses about how internal structure, dynamics and evolution relate to surface features, atmosphere, bulk properties, and magnetic fields.

6. Pose a clear question, hypothesis, or mission plan regarding any aspect of planetary science, research the current state-of-the-art and communicate / debate your findings in a scholarly manner.
