
The Luddite Exam: Not Using Technology to Gauge Student Writing Development

John Brocato, June 2007


Summary

This presentation discusses a case-study-based writing exam, written by hand during a three-hour exam period, that gauges the writing development of engineering students.


Research Questions

1. How can we reliably gauge student writing development over time?

2. How can we do so in a controlled, fair setting?


Why Answer These Questions?

• To Ensure Graduate Proficiency
• To Assess Program Effectiveness
• To Gauge Student Progress
• To Measure Student Performance
• To Improve Communication Pedagogy


Why not use technology?

• Plagiarism (wireless access)
• Reliance on software
• Security of exam materials

All three are necessary evils we accept for routine writing, but not for this exam.

Far Side cartoon: “Wait! Wait! … Cancel that. I guess it says ‘helf’” [1]


What does Luddite mean?

Luddite describes a “[m]ember of organized groups of early 19th-century English craftsmen who surreptitiously destroyed the textile machinery that was replacing them…. The term Luddite was later used to describe anyone opposed to technological change” [2].

Luddites Smashing a Loom [3]

DISCLAIMER: Our program contains no Luddites (that we’re aware of). We simply think the name is apt and attention-getting. We embrace technology!


Approach

Application – Allow students to apply what they have learned.

Purity – Ensure their work is untainted.

Reuse – Restrict access to exam materials so they can be reused without compromising purity.

Fairness – Weight “capstone” documents fairly.


Methods

1. We prepared students by explicitly describing the exam requirements and using case studies prior to the exam.

2. We allowed students to use their own copies of required textbooks (but nothing else).


Methods (cont.)

3. We controlled content by providing the case study (see provided sample) at the exam.

4. We required students to write their exams by hand.

5. We monitored students closely, assisting when necessary.


Methods (cont.)

6. We required students to turn the case study back in with their exams.

7. We recorded the times students submitted their exams.

8. We graded exams with the same type of rubric we use for routine writing (see provided sample).


Findings

Grades increased on the Luddite Exam compared to documents written in unrestricted settings.


Findings (cont.)

Term                        Paper Averages*   Exam Averages   Difference
Fall 2005 (43 students)     80.82             80.89           +0.07
Spring 2006 (51 students)   83.63             85.10           +1.47
Summer 2006 (37 students)   85.36             88.01           +2.65
Fall 2006 (25 students)     82.43             86.01           +3.58
Spring 2007 (29 students)   86.15             91.90           +5.75

*Scores are out of 100 possible points, divided according to standard letter-grade breakdowns: A = 90-100, B = 80-89, C = 70-79, D = 60-69, F = 0-59.
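As a quick sanity check on the arithmetic above, the short Python sketch below recomputes the Difference column from the paper and exam averages and maps each average onto the letter-grade scale given in the footnote. It is illustrative only: the term labels and scores are copied from the table, and the letter_grade helper is not part of the original presentation.

```python
# Illustrative sketch only (not from the original presentation): recompute the
# Difference column and map each average onto the stated letter-grade scale.

averages = {
    # term: (paper average, exam average), copied from the table above
    "Fall 2005":   (80.82, 80.89),
    "Spring 2006": (83.63, 85.10),
    "Summer 2006": (85.36, 88.01),
    "Fall 2006":   (82.43, 86.01),
    "Spring 2007": (86.15, 91.90),
}

def letter_grade(score: float) -> str:
    """Standard breakdown from the footnote: A = 90-100, B = 80-89, C = 70-79, D = 60-69, F = 0-59."""
    if score >= 90:
        return "A"
    if score >= 80:
        return "B"
    if score >= 70:
        return "C"
    if score >= 60:
        return "D"
    return "F"

for term, (paper, exam) in averages.items():
    print(f"{term}: paper {paper:.2f} ({letter_grade(paper)}), "
          f"exam {exam:.2f} ({letter_grade(exam)}), difference {exam - paper:+.2f}")
```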


Conclusions

The Luddite Exam is an effective way to gauge student writing development over time.

The consistent increase in student performance likely stems from both improved student writing ability and increased instructor skill in preparing students for the exam.


Future Work

Implement the Luddite Exam program-wide, outside the specialized course (already begun).

Formally survey student perceptions of the exam’s usefulness.

Enable computer use on the exam without compromising its major objectives.


References

[1] G. Larson and S. Martin, “Wait! Wait! … Cancel that. I guess it says ‘helf,’” in The Complete Far Side 1980-1994 (2 volumes), Andrews & McMeel, Kansas City, MO, 2003, p. 205.

[2] “Luddite,” in Britannica Concise Encyclopedia, June 19, 2007. [Online]. Available: http://www.britannica.com/ebc/article-9370681

[3] “Luddites Smashing a Loom,” in “Smash the (Bad) Machines!,” June 19, 2007. [Online]. Available: http://www.stephaniesyjuco.com/antifactory/blog/2005_07_01_archive.html