NTU Innovations in Teaching Seminar - students as co-creators


Transcript of NTU Innovations in Teaching Seminar - students as co-creators

NTU Singapore — Innovations in Teaching Seminar

Expert-guided, crowdsourced learning content advancing knowledge co-creation & peer learning

[email protected] | @simonpbates


Overview

❖ Principles: ownership, how learning works, effective practices

❖ A story in two parts

❖ PeerWise - an online space for student-generated MCQ assessments

❖ Learning objects - going beyond MCQs.

• Web-based multiple-choice question repository built by students

• Students:

– develop new questions with associated explanations

– answer existing questions and rate them for quality and difficulty

– take part in discussions

– can follow other authors

peerwise.cs.auckland.ac.nz


As a question author…


Badges, Points, Leaderboards

Implementation

Minimum participation requirements for each of two assessment exercises (PW1, PW2)

Write 1, Answer 5, Rate / comment 3

5% course credit

Physics 101, Energy & Waves. Winter semester: 3 sections, ~800 students
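As a rough illustration of how these minimums might translate into the 5% course credit (the actual grading code and the exact split across PW1 and PW2 are not given in the talk, so both are assumptions), a minimal sketch:

```python
# Minimal sketch of the minimum-participation rule above.
# Thresholds match the slide (write 1, answer 5, rate/comment 3);
# the even 2.5% + 2.5% split across PW1/PW2 is an assumption.

MINIMUMS = {"written": 1, "answered": 5, "rated_or_commented": 3}

def meets_minimum(activity: dict) -> bool:
    """True if a student's PeerWise activity satisfies every minimum."""
    return all(activity.get(key, 0) >= needed for key, needed in MINIMUMS.items())

def participation_credit(pw1: dict, pw2: dict, total_credit: float = 5.0) -> float:
    """Course credit (percent), assuming the 5% splits evenly over PW1 and PW2."""
    per_exercise = total_credit / 2
    return sum(per_exercise for ex in (pw1, pw2) if meets_minimum(ex))

# Example: the minimums are met in PW1 only, so half the credit is earned.
print(participation_credit(
    {"written": 2, "answered": 7, "rated_or_commented": 4},
    {"written": 1, "answered": 3, "rated_or_commented": 3},
))  # -> 2.5
```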

Implementation - scaffolding

[Annotated hand-drawn diagram: submit and answer questions on topics in the target region, just above the physics you have already mastered. The diagram marks the physics knowledge and conceptual understanding you have already constructed; the physics topics in your target region (your chosen topic, common misconceptions and errors, distracters); and an outer region containing knowledge and concepts you cannot learn yet because the foundations are not in place. Handwritten annotations on the diagram are not legible in this transcript.]

Photo by Seth Casteel, http://www.littlefriendsphoto.com (permission to use agreed)

Writing original questions is a demanding activity

Extensive scaffolding exercises

Revisited in subsequent tutorials

All scaffolding materials available on the PeerWise Community website: www.PeerWise-Community.org

Selected results and analysis

Engagement - how do students use the system?

Benefits - what is the impact on learning?

Question quality - how good is what students produce?

Relevant publications:

Scaffolding student engagement via online peer learning - European Journal of Physics 35 (4), 045002 (2014)

Student-Generated Content: Enhancing learning through sharing multiple-choice questions. International Journal of Science Education, 1-15 (2014).

Assessing the quality of a student-generated question repository - Phys Rev ST PER (2014) 10, 020105

Student-generated assessment - Education in Chemistry (2013) 13 1

Engagement

• Generally, students did:

– Participate beyond minimum requirements

– Engage in community learning, correcting errors

– Create problems, not exercises

– Provide positive feedback

Correlation with learning outcomes

Quality of student authored content

Bloom’s Taxonomy of levels in the cognitive domain

Score Level Description

1 Remember Factual knowledge; trivial plugging in of numbers

2 Understand Basic understanding of content

3 Apply Implement, calculate / determine. Typically a one-stage problem

4 Analyze Typical multi-step problem; requires identification of a strategy

5 Evaluate Compare & assess various possible options; often conceptual

6 Synthesize Ideas and topics from disparate course sections combined. Significantly challenging problem.


Question quality

[Bar chart: percentage of submitted questions (0-50%) in each taxonomic category, 1-6. First semester N = 350; second semester N = 252.]

Explanation quality

0 Missing No explanation provided or explanation incoherent/irrelevant

1 Inadequate Wrong reasoning and/or answer; trivial or flippant

2 Minimal Correct answer but with insufficient explanation/justification. Some aspects may be unclear/incorrect/confused.

3 Good Clear and detailed exposition of correct method & answer.

4 Excellent Thorough description of relevant physics and solution strategy. Plausibility of all answers considered. Beyond normal expectation for a correct solution.

[Bar charts: number of questions (0-60) at each explanation quality score, 0-4, for Assessment 1 and Assessment 2.]

Question quality summary (UoE 2011)

2 successive years of the same course (N=150, 350)

‘High quality’ questions: 78%, 79%

Over 90% (most likely) correct, and the majority of those that were wrong were identified by students.

69% (2010) and 55% (2011) rated 3 or 4 for explanations

Only 2% (2010) and 4% (2011) were rated 1/6 for taxonomic level.

That’s not common

Bottomley & Denny, Biochem. and Mol. Biol. Educ. 39(5), 352-361 (2011)

107 Year 2 biochemistry students; 56 / 35 / 9 % of questions in the lowest 3 levels.

Momsen et al., CBE-Life Sci. Educ. 9, 436-440 (2010)

“9,713 assessment items submitted by 50 instructors in the United States reported that 93% of the questions asked on examinations in introductory biology courses were at the lowest two levels of the revised Bloom’s taxonomy”


Beyond MCQs

Why not short answer Qs?

Why not … anything? LEARNING OBJECTS

Adaptive Comparative Judgement
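The talk names Adaptive Comparative Judgement as the approach for ranking learning objects but does not walk through the algorithm, so the following is only a minimal, illustrative sketch of the general idea: judges repeatedly compare pairs of similarly ranked items, and an overall rank order emerges from the accumulated judgements. Real ACJ tools typically fit a Rasch/Bradley-Terry model; this sketch uses a simpler Elo-style update, and all names (acj_rank, the random judge) are hypothetical.

```python
import random
from typing import Callable, List, Tuple

def acj_rank(items: List[str],
             judge: Callable[[str, str], str],
             rounds: int = 10,
             k: float = 32.0) -> List[Tuple[str, float]]:
    """Rank items from repeated pairwise judgements (Elo-style simplification of ACJ)."""
    scores = {item: 1000.0 for item in items}
    for _ in range(rounds):
        # Adaptive step: pair items whose current scores are close,
        # so each comparison is maximally informative.
        ordered = sorted(items, key=lambda it: scores[it])
        pairs = [(ordered[i], ordered[i + 1]) for i in range(0, len(ordered) - 1, 2)]
        for a, b in pairs:
            winner = judge(a, b)              # judge returns the better of the pair
            loser = b if winner == a else a
            # Elo update: the bigger the upset, the bigger the score change.
            expected = 1.0 / (1.0 + 10 ** ((scores[loser] - scores[winner]) / 400.0))
            scores[winner] += k * (1.0 - expected)
            scores[loser] -= k * (1.0 - expected)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    # Hypothetical learning objects; a random 'judge' stands in for a
    # human comparing two submissions and picking the better one.
    los = [f"LO-{i}" for i in range(8)]
    for item, score in acj_rank(los, judge=lambda a, b: random.choice([a, b])):
        print(f"{item}: {score:.0f}")
```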

PHYS101: Energy and Waves

Implementation logistics

Cohort split into 4 groups

Each week one group tasked with creating LOs

Each submission counts for 2.5% of final grade

Repeat cycle twice per semester

Students can submit >2 LOs & receive grade for best 2 (see the sketch below)

Short survey on submission

Students encouraged to apply CC licenses
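As referenced above, a minimal sketch of the "best two count" rule: each submission is worth up to 2.5% of the final grade, and only a student's two best submissions contribute. The 0-1 scoring of each submission and the function name are assumptions for illustration.

```python
# Minimal sketch of the LO grading rule: 2.5% of the final grade per LO,
# counting only a student's best two submissions.
# Assumption: each submission has already been scored on a 0-1 scale.

def lo_credit(submission_scores: list, per_lo: float = 2.5, best_n: int = 2) -> float:
    """Credit (percent of final grade) earned from learning-object submissions."""
    best = sorted(submission_scores, reverse=True)[:best_n]
    return sum(score * per_lo for score in best)

# Example: three submissions; only the best two (1.0 and 0.9) count.
print(lo_credit([0.9, 0.6, 1.0]))  # -> 4.75
```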

Results: engagement

[Bar chart: number of students (0-300) submitting each learning object, LO 1 through LO 8, split into assigned and optional submissions.]

Results: time on task

[Bar chart: number of students (0-400) in each time-on-task band, from less than 0.5 h to more than 5 h.]

Results: self-reported change in understanding

[Paired bar charts: number of students (0-800) reporting None / Little / Moderate / Good / Excellent understanding of the topic their LO was based on, before creating it and after creating it.]

PHYS101: Energy and Waves

3. Successes

Sample 1 - http://youtu.be/BObyt_NJYrE

Sample 2 - Standing Wave in a bowl

Sample 3 - Colour Loss Underwater

Student-generated exam content


Resources for use in class

http://blogs.ubc.ca/phys101

In summary

technology-enabled peer learning,

as authentic assessment

Not quite the whole story

• Despite these outstanding examples, many students didn't like the LO assessment

• difficulty level vs other assessed components of the course

• credit weighting

• Students dropped these assessments more than other coursework

• Strange ‘phase transition’ for LO vs exam grades

NTU Singapore — Innovations in Teaching Seminar

Expert-guided, crowdsourced learning content advancing knowledge co-creation & peer learning

[email protected] | @simonpbates | bit.ly/batestalks

Copyright 2013 Graham Fowell / The Hitman, reproduced with permission, Education in Chemistry, Vol 50 No 1 (2013)

Photo credits

Community: http://www.flickr.com/photos/kubina/471164507/

Screen grab from M. Wesch, 'A Vision of Students Today' http://www.youtube.com/watch?v=dGCJ46vyR9o
