Online Course Evaluations: Lessons Learned

With a cast of thousands, including: Susan Monsen, W. Ken Woo, Carrie Mahan Groce, & Wayne Miller

Susan Monsen

Yale Law Experience

Course Evaluations were run by Student Representatives

Introduced first online system in 2001

Changed the system twice and introduced incentives

For Spring 2005, reached a 90% response rate

YLS OCE Version 1

First online course evaluation (OCE) Fall 2001-Spring 2003

Home-grown web application with 18 questions

System did not scale for in-class completion

General email reminders sent to all students

No incentives

Response rate less than 20%

Back to Paper

Returned to paper after 3 semesters of use

Reasons:

Low response rate

Wanted an easier-to-use interface for completing and viewing results

Wanted ability to add incentives

OCE Version 2—Design

Designed with input from student representatives and faculty

Modeled after the Yale College system

Reduced the number of questions to 8

Added a comment question

Students with evaluations to complete received a weekly email reminder (see the sketch below)
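The weekly reminder described above amounts to a scheduled job that collects each student's unfinished evaluations and sends one message. A minimal sketch under assumed names (an `enrollments` table with a `completed` flag, a local SMTP relay); the actual YLS system is not described at this level of detail:

```python
import smtplib
import sqlite3
from email.message import EmailMessage

# Hypothetical schema: enrollments(student_email, course_title, completed)
conn = sqlite3.connect("evaluations.db")

def pending_by_student():
    """Group each student's unevaluated courses for a single weekly email."""
    rows = conn.execute(
        "SELECT student_email, course_title FROM enrollments WHERE completed = 0"
    )
    pending = {}
    for email, course in rows:
        pending.setdefault(email, []).append(course)
    return pending

def send_weekly_reminders(smtp_host="localhost"):
    with smtplib.SMTP(smtp_host) as smtp:
        for email, courses in pending_by_student().items():
            msg = EmailMessage()
            msg["From"] = "registrar@example.edu"  # placeholder address
            msg["To"] = email
            msg["Subject"] = "Course evaluations awaiting completion"
            msg.set_content(
                "You have not yet evaluated:\n" + "\n".join(courses)
            )
            smtp.send_message(msg)

# Run once per week from a scheduler such as cron.
send_weekly_reminders()
```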

Incentives

Tested class time for completion

Worked for small to mid-size classes

Response rate about 90%

Load testing indicated the system could handle up to 75 simultaneous users

Introduced Grade Blocking

Students see an “*” instead of a grade for those classes not evaluated (a sketch follows below).
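Grade blocking of this kind is essentially a display-time mask rather than a change to the underlying records. A minimal sketch, assuming hypothetical `grades` and `evaluations` tables; the real YLS implementation is not shown in the source:

```python
import sqlite3

conn = sqlite3.connect("evaluations.db")

# Hypothetical tables:
#   grades(student_id, course_id, grade)
#   evaluations(student_id, course_id, completed)
def transcript_view(student_id: int) -> list[tuple[str, str]]:
    """Return (course, grade) pairs, masking grades for unevaluated courses."""
    rows = conn.execute(
        """
        SELECT g.course_id,
               CASE WHEN e.completed = 1 THEN g.grade ELSE '*' END
        FROM grades g
        LEFT JOIN evaluations e
          ON e.student_id = g.student_id AND e.course_id = g.course_id
        WHERE g.student_id = ?
        """,
        (student_id,),
    )
    return rows.fetchall()

# Once the student submits the evaluation, the flag flips and the
# real grade appears on the next page load; nothing is ever deleted.
```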

OCE 2 Results View

Response Rate by System

[Bar chart: percent completed (0–100) by system – OCE 1 (Fall 2001–Fall 2002), Paper (Spring 2003–Fall 2003), OCE 2 (Spring 2004–Fall 2004), OCE 2 with in-class time, 2 classes (Fall 2004), OCE 2 with Grade Blocking (Spring 2005)]

OCE 2 Response Rates

[Line chart: percent completed (0–100) by week (Week 1 through Week 4, Final) for Spring 03, Fall 04, and Spring 05]

What did we learn?

Don’t:
Too many questions
No automated reminders
No incentives

Do:
Incentives work!
Reminders help
Load test the system

CTEs Online

Presented by:

Ken Woo

Director, Law School Computing

Northwestern University School of Law

When?

1st semester online: Spring 2004
2nd semester online: Fall 2004
3rd semester online: Spring 2005

Only 1.5 years into it online

When? (continued)

Paper system, Fall 2003: 80%
Paper system, Spring 2003: 77%
Paper system, Fall 2002: 70%
1st semester online, Spring 2004: N/A
2nd semester online, Fall 2004: 70%
3rd semester online, Spring 2005: 67.8%

Why?

Wanted to push everything onto the Web

Everyone had some sort of web access

Lose papers and go paperless

Centralized storage location

On a centralized server

No data steward available

Access by Registrar and Registrar team only

Professors can view their own results

Why? (continued)

Perceived as easier to manage

Changes were easier for the Registrar

3 types of forms:

Standard (19 questions)
CLR (23 questions)
Clinic (18 questions)

Legibility was a small issue

Lessons Learned

Very similar to paper questions with some added questions for clarity

Participation rate is falling

Some ideas to increase participation:

Withhold transcripts – no

Withhold final grades – no

Let students know they will see no results if they do not participate – starting next semester, Fall 06

Q & A


Online Course Evaluations: Lessons Learned

Carrie Mahan Groce

University of Denver Sturm COL Experience

Why Online Evaluations

The Academic Dean was the instigator, wanting better, more timely access to evaluations, particularly comments.

Hoped to get more meaningful written comments, both good and bad.

Our school has a culture of students and search committees making use of written comments.

Web Manager built a homegrown ColdFusion application, using the current evaluation form and procedures as a start.

Data pulled from the administrative (Banner) system.

Course and student data stored in one database, results in a separate DB (for anonymity; a sketch follows below).

Questions are generated dynamically.
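Read one way, the two-database design means completion status is recorded against the student in one database while answers are written to a second database that carries no student identifier, so the two can never be joined back together. A minimal sketch of that split under assumed table and column names (`enrollments`, `responses`); the actual ColdFusion implementation is not described in the source:

```python
import sqlite3

# Two physically separate databases: the link between a student and
# their answers is broken at write time.
enroll_db = sqlite3.connect("enrollment.db")   # knows WHO has completed
results_db = sqlite3.connect("results.db")     # knows WHAT was said

def submit_evaluation(student_id: int, course_id: str,
                      answers: dict[int, str]) -> None:
    """Record completion and store answers, never together."""
    # 1. Mark the student as done; no answers are stored here.
    enroll_db.execute(
        "UPDATE enrollments SET completed = 1 "
        "WHERE student_id = ? AND course_id = ?",
        (student_id, course_id),
    )
    enroll_db.commit()
    # 2. Store the answers keyed only by course and question; this
    #    database has no student_id column at all.
    results_db.executemany(
        "INSERT INTO responses (course_id, question_id, answer) VALUES (?, ?, ?)",
        [(course_id, qid, text) for qid, text in answers.items()],
    )
    results_db.commit()
```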

Initial concerns taken into account

Faculty – only registered students, one evaluation per student. No evaluation after the exam.

Students – retain anonymity, no faculty access before grades.

Additional Student Concern

Students complained this format would be too time consuming – not addressed; later feedback suggests students appreciate freeing up class time.

Additional Faculty Concerns – how addressed

Lower response rates – pilot conducted to get a feel for response rates before faculty approval of online evals.

Concern that comments would be too accessible leaving “less popular” professors vulnerable – agreed that Academic Dean could remove very negative comments from public view.

Not all courses followed standard exam schedule – handled case by case.

Assoc. Dean wanted data to take to faculty – came to Ed. Tech.

Started with a pilot group in Fall 02 – 7 profs, 10 courses participated.

Spring 03: all adjuncts and a handful of appointed faculty – 80 courses in all.

Summer 03: all courses participated.

Evaluation Procedures

Evaluation goes online 2 weeks prior to semester end – available through the day prior to exams beginning. Originally only last two weeks of class – extended during 1st pilot.

Students receive emails with links to all their course evaluations and detailed instructions.

Reminder emails sent every other day or so to those who have not completed.

Results from pilots were encouraging. Response rates were good (higher than paper), though inflated due to incentives and babysitting.

Summer was low, but it had a very short evaluation period.

Dean took data to faculty for approval to move all courses online. Approval given beginning Fall 2003.

[Bar chart: average response rates during pilots – Fall 02: 83%, Spring 03: 81%, Summer 03: 64%]

Response Rates – real-use setting

[Bar chart: Fall 03 – 83%, Spring 04 – 77%, Summer 04 – 67%, Fall 04 – 72%, Spring 05 – 72%]

Reasons for drop in response rates – speculation

Change of Academic Dean. Current dean not invested, less hands on encouragement.

Novelty wearing off. This year we had our first incoming class who never did a paper evaluation. No novelty factor – just another chore.

What should we do?

Nothing? Assessment department happy with 70% and we are getting better rates than other divisions.

My preference – get the new dean back on board, even more reminders, advertisement.

Better communication to faculty about timing so they can tell students what to expect.

Next steps

More sophisticated results generation. Advanced searching: ability to compare profs side by side, show all evals for a professor or a course.

Streamline course list interaction. Build direct access to Banner system rather than pulling data out of the admin system. Not likely to happen.

Move from Access back-end to SQL Server.

Potholes to watch out for

Difficult to know how good the data is. We realized late that the person pulling lists didn’t have permissions to get non-law students enrolled in law classes. There is no way to spot that by looking at such large amounts of data (150 courses, nearly 5,000 individual evaluations).

Different schedules for different courses can cause headaches. 1st-year Legal Writing wanted complete control over timing. Some courses finish early, and it is hard to keep those in institutional memory. Any time an individual eval has a different schedule, its response rate is lower.

Potholes (cont.)

Complete anonymity made it tedious to fix the few instances of students filling out one evaluation as though it were for a different professor. Mostly resolved by adding the professor’s name throughout the text of the eval, in as many places as possible.

Students want to retract an evaluation (usually negative). This semester was the first time we heard this request. Academic dean turned down all requests and shut the door to additional requests.

And a sinkhole…

A more pervasive problem: with any ed tech project, once we do something it becomes “ours.”

Problematic because we don’t have the staff to take on administrative functions, nor have we been given the power to handle issues with those functions.

Remedies?

Proactive – never take too much control of a project. Build as much administrative functionality in as possible at the beginning.

If you’ve taken on too much, give it back: if it was their job before it was online, it should still be their job.

Easier said than done.

Final words of wisdom

Don’t try to reinvent the wheel. We found we had better buy-in when we agreed to keep system as close to original as possible.

Contact information

Carrie Mahan Groce
Web Manager
University of Denver Sturm College of Law

[email protected]

Online Course Evaluations: Lessons Learned

Wayne Miller

The Duke Law Experience

Introduced in Summer 2003 without much planning, when Scantron equipment failed and a replacement was deemed too expensive

My motivation was to provide a service to the law school that would benefit all: more efficient for staff and students; unmediated access for faculty; better community access to public information (summaries)

The Duke Law Experience

Homegrown, PHP-based survey software was employed

Student Information System provided rosters

Local email system provided authentication (through LDAP) for both students and faculty
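Authenticating against an existing email or directory service usually means attempting an LDAP bind with the credentials the user supplies. A minimal sketch using the Python ldap3 package, with a hypothetical host and DN layout; Duke's PHP implementation would differ in its details:

```python
from ldap3 import Server, Connection, ALL

# Hypothetical directory host and DN pattern -- substitute your own.
LDAP_HOST = "ldap.example.edu"
USER_DN = "uid={username},ou=people,dc=example,dc=edu"

def authenticate(username: str, password: str) -> bool:
    """Return True if the directory accepts a bind with these credentials."""
    server = Server(LDAP_HOST, use_ssl=True, get_info=ALL)
    conn = Connection(server, user=USER_DN.format(username=username),
                      password=password)
    ok = conn.bind()  # the directory verifies the password for us
    conn.unbind()
    return ok

# A survey app would call authenticate() at login, then consult the
# roster to decide which evaluations to show the user.
```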

Shortsightedness….

Paper form was copied without re-evaluation

10 minutes for in-class completion of paper evaluations was “given back” to faculty

Incentives for students were not thought through

“Click the radio button” is awkward at best

Scale changes are very problematic

Things we designed right

Registrar has direct control over which classes are included, which faculty are associated with each class, etc.

Students can submit “conditional evaluations” when they fail to log in correctly or are not in our roster (a sketch follows below)
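One way to model a conditional evaluation is as an ordinary submission that is flagged for registrar review instead of being rejected. A minimal sketch under assumed names (`Submission`, `ROSTER`); the source does not describe Duke's actual data model:

```python
from dataclasses import dataclass

@dataclass
class Submission:
    claimed_user: str          # who the submitter says they are
    course_id: str
    answers: dict[int, str]
    conditional: bool = False  # held for registrar review if True

# Hypothetical roster: course -> set of enrolled usernames.
ROSTER: dict[str, set[str]] = {"LAW101": {"jdoe", "asmith"}}

def accept(sub: Submission, authenticated: bool) -> Submission:
    """Accept every submission, but flag it as conditional when the
    user failed authentication or is not on the course roster."""
    on_roster = sub.claimed_user in ROSTER.get(sub.course_id, set())
    sub.conditional = not (authenticated and on_roster)
    return sub

# The registrar later resolves conditional submissions by hand,
# counting or discarding them, instead of losing the data at submit time.
```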

Things people want

Students want to be able to edit and save, and come back to evaluations

Registrar and some faculty members want individualized time windows for certain classes

Student Response Rate

70% response rate required to share course eval summaries with community

Students need constant cajoling or we need to provide a better incentive

Some faculty are apprehensive about including students who would not have been in attendance on the day of paper evaluations, and uneasy about cajoled students

Student Response Rate

Semester      Total Response Rate                   Percentage of Class/Instr Making Cutoff
Fall 2003     66% (extended into exam period)       24/82 = 29%
Spring 2004   60%                                   36/119 = 30%
Fall 2004     52%                                   8/93 = 8%
Spring 2005   67% (dropped non-law students)        48/117 = 41%
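The “making cutoff” column above is just the share of classes whose own response rate reaches the 70% release threshold mentioned earlier. A minimal sketch of that bookkeeping, with entirely made-up numbers:

```python
CUTOFF = 0.70  # summaries are released only at or above this rate

def class_stats(classes: list[tuple[str, int, int]]):
    """classes: (class_id, responses, enrolled) triples.
    Returns overall rate, count making cutoff, and the fraction making it."""
    total_resp = sum(r for _, r, _ in classes)
    total_enr = sum(e for _, _, e in classes)
    making = [cid for cid, r, e in classes if e and r / e >= CUTOFF]
    return total_resp / total_enr, len(making), len(making) / len(classes)

# Example with invented numbers, in the spirit of the table above:
demo = [("LAW101", 50, 60), ("LAW202", 20, 45), ("LAW303", 70, 100)]
overall, n_making, frac = class_stats(demo)
print(f"overall {overall:.0%}, {n_making} classes make cutoff ({frac:.0%})")
```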

Student Response Rate

[Cumulative chart: number of submissions (0–600) over the evaluation period, annotated at three points – time scheduled for evals in large classes; automated and person-specific email from the Associate Dean; second automatic email from the Associate Dean and cajoling email from the Registrar]

Incentives under “consideration”

Withhold registration for the following semester

Withhold grades

Withhold free printing

Withhold firstborn….

Issues

Security – not discussed much, but was a big part of planning

Privacy – deal breaker for some students; responses are anonymized before release

Accuracy – faculty are suspicious of mix-ups; varying scales have confused students

Urban legends – stories abound among faculty about how Prof X saw everyone’s evaluations, etc.

Future

Evaluation form is being reworked: easier to fill out, less confusing

Incentives are being considered

Scantron on/offline solutions are being weighed

Support could at any point be withdrawn – and probably would have been, were another solution easy to implement….

Contact information

Wayne Miller
Director of Educational Technologies
Duke University School of Law

[email protected]

http://edtech.law.duke.edu/