Technology-Mediated Assessment
Transcript of Technology-Mediated Assessment
Technology-Mediated Assessment
Jack McGourty, Columbia University
John Merrill, Ohio State University
Mary Besterfield-Sacre & Larry Shuman, University of Pittsburgh
Gateway Engineering Education Coalition
Technology-Mediated Assessment
Introduction
Your Expectations
Applications: Drexel and Columbia's Course Evaluation; Ohio State's Activities; Team Evaluator
Your Experiences
Enablers and Barriers (Break-out Groups)
Conclusions
Introduction
Reasons for On-Line Assessment
Common Applications
Design and Development
Things to Think About
Reasons for On-Line Assessment
Customized development
Targeted communication
Ease of distribution / no boundaries
Automatic data collection and analyses
Real-time response monitoring
Timely feedback
Common Applications
Attitude surveys
Multisource assessment and feedback
Course evaluations
Portfolios
Technology-mediated interviews
Tests
Design and Development
Item/question development
Adaptive testing / expert systems
Multimedia tutorials
Dialogue boxes
Reporting wizards
Things to Think About
Confidentiality/Privacy
Response rates
Reliability/Validity
Ease of use (administrators, end users)
System growth: Can it easily be upgraded? Adding modules
System flexibility: survey/test construction
Data flexibility: item databases, reporting wizards, data storage
Platforms: specific vs. combination
Reporting: various levels
Dissemination mechanisms: real time vs. delayed
Technology in Education
Dr. John Merrill, The Ohio State University
Introduction To Engineering Program
Technology Enabled Assessment
The Wave of the Future
Objectives
Explanation of web-based assessment tools
Uses of assessment tools
Virtual run-through of student actions
Lessons learned
Q&A
Web-Based Assessment Tools
Course Sorcerer (through WebCT): online journal entries, course evaluations
Team Evaluator: peer evaluations
WebCT
WebCT is a commercial web-based tool used for course management.
IE Program uses/capabilities:
Electronic grade book, chat rooms, bulletin boards, calendars
Provides links to Course Material, Course Sorcerer, and Team Evaluations (Team Evaluator)
Course Sorcerer
A simple, web-based evaluation tool created by Scott Cantor at University Technology Services.
Technical specifications:
Written in ColdFusion
Runs on Windows NT with a Netscape Enterprise Web Server
Uses an MS SQL Server database with 15 tables
Server machine: PII-450 with 512 MB of RAM
Accesses Sybase running on Solaris 2.6 as a warehouse for roster data
Used for journal entries and course evaluations
Team Evaluator (Peer Evaluation)
Used by team members to provide confidential assessments of one another.
System requirements:
Operating system: Windows 2000 with ActivePerl, or UNIX with Perl 5.004 or higher
Perl modules: CGI, DBI (plus SQL drivers), POSIX
SQL server: MySQL 3.23 or higher
Web server: IIS (Windows) or Apache 1.3 (UNIX)
CPU: Pentium II 400 or better recommended
Memory: 128 MB or higher recommended
Disk space: 100 MB for adequate database space
(A rough sketch of the kind of back end these requirements imply follows.)
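As a rough illustration only: the actual Team Evaluator is a Perl CGI application backed by MySQL, but the following minimal sketch of a confidential peer-rating store uses Python and sqlite3 so it is self-contained. Every table, column, and function name here is invented for illustration.

```python
# Hypothetical sketch of a confidential peer-evaluation back end in the
# spirit of Team Evaluator (the real tool is Perl CGI + DBI + MySQL).
import sqlite3

conn = sqlite3.connect("team_evaluator.db")
conn.execute("""CREATE TABLE IF NOT EXISTS peer_ratings (
    team      TEXT    NOT NULL,
    rater     TEXT    NOT NULL,  -- kept confidential, never shown to the ratee
    ratee     TEXT    NOT NULL,
    criterion TEXT    NOT NULL,  -- e.g. 'contribution', 'reliability'
    score     INTEGER NOT NULL CHECK (score BETWEEN 1 AND 5))""")

def submit_rating(team, rater, ratee, criterion, score):
    """Record one teammate's confidential rating of another."""
    conn.execute("INSERT INTO peer_ratings VALUES (?, ?, ?, ?, ?)",
                 (team, rater, ratee, criterion, score))
    conn.commit()

def summary_for(ratee):
    """Return only aggregates, so individual raters stay anonymous."""
    rows = conn.execute("SELECT criterion, AVG(score), COUNT(*) "
                        "FROM peer_ratings WHERE ratee = ? "
                        "GROUP BY criterion", (ratee,)).fetchall()
    return {criterion: (avg, n) for criterion, avg, n in rows}

submit_rating("team7", "alice", "bob", "contribution", 4)
print(summary_for("bob"))   # e.g. {'contribution': (4.0, 1)}
```

The key design point, matching the slide, is that raters are stored but only aggregate results are ever reported back.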
Journal Entries
Students complete journal entries online every two weeks.
Submissions are anonymous.
All entries are read and summarized by a staff member and shared with the instructional team.
Instructional team members share the summaries with their classes.
Course Evaluations
Students in 181 & 182 complete online course evaluations at the end of each quarter.
Questions are designed to evaluate courses based on items a-k of Criterion 3, Program Outcomes and Assessment, in the ABET Engineering Criteria 2000.
Short-Term Uses: Journal Entries & Course Evaluations
Address immediate student concerns/questions about class, labs, or projects.
Inquire about student problems with specific topics and labs.
Discover general information from students regarding interests, influences, and attitudes.
Example: Addressing Immediate Student Concerns
“How are the figures supposed to be done? Strictly isometric or just drawn so you can see everything? What pieces need to be labeled?”
“What are we doing in labs 6 & 7? I know it says in the syllabus that we are incorporating the sorting mechanism, but is that going to take two weeks?”
Long-Term Uses: Journal Entries & Course Evaluations
Improve program content
Improve course materials
Modify teaching styles
Evaluate courses based on ABET criteria
Example: Improving Course Content
“Positive: I...
- Gained knowledge about circuits in general
- Learned how to read schematics
- Learned how to use breadboards
- Further developed team working skills
Negative:
- The circuits did not work the first time.
- Time ran short for both labs, but we did finish each circuit.”
How It Works
Start: WebCT site: http://courses2.telr.ohio-state.edu
Completion Tracking: Engineering 182
Journal Completion Rate (percent complete per week)

            Entry #1  Entry #2  Entry #3  Entry #4  Entry #5  All Entries Avg.
Dickinson    87.2%     76.2%     73.8%     78.0%     75.5%     78.1%
Hastings     92.7%     85.5%     80.1%     78.4%     73.0%     81.9%
Chubb        93.1%     86.1%     79.2%     80.6%     80.6%     83.9%
Herrera      93.0%     71.7%     81.8%     74.6%     74.6%     79.1%
Gustafson    71.9%     65.6%     70.3%     68.8%     68.8%     69.1%
Lessons Learned: Journal Entries & Course Evaluations
Students are more likely to complete if given credit.
Students are extremely responsive to the anonymity of the online survey.
Students respond positively when asked for suggestions/solutions to problems in the class.
Web-Enhanced Course Evaluation at Columbia University
Jack McGourty, Columbia University
Overview
A little history
How does course assessment fit into the “big picture”?
Why use web technology?
How is it being done?
Does it work?
History
Columbia’s Fu Foundation School of Engineering and Applied Science began using the web for course assessment about four years ago, starting with a student-administered web site for results.
Designed and developed a state-of-the-art system using student teams.
Now building on the current infrastructure to include on-line tutorials and increased flexibility for administration.
Student Web Site
Search by course or faculty
Current and past results
No comments
The Big Picture
Why are we assessing courses and programs?
Continuous improvement of the education process: what are we doing right, and what can we do better?
Integral part of our ABET EC2000 compliance: develop a process; collect and evaluate data; close the loop; document/archive results
Course evaluation is one of several outcome assessment measures, alongside senior exit surveys, enrolled student surveys, and alumni surveys.
How WCES Fits In
[Timeline: SEAS assessment processes, pre-1997 through 2001. Milestones: initiate course evaluation process; conduct first alumni survey (all alumni); conduct second alumni survey (1989 & 1994 graduates); benchmarking senior surveys (Class of 2000); start academic review cycle; create web-based course evaluation process; senior surveys (Class of 2001) and alumni survey (1996 graduates); initiate freshman pre-attitude survey.]
Using Technology
Pro:
Students have the time to consider their responses
Timely feedback
Responses are easily analyzed, archived, and distributed
Less paper
Lower cost / efficient administration
Con:
You lose the “captive audience”
You can't guarantee a diversity of opinions (motivated vs. non-motivated; like the course vs. dislike the course)
Not necessarily less effort
Course Assessment Details
10 core items: course quality, instructor quality
Relevant ABET EC2000 items, pre-selected by the faculty member
Customized questions for specific course objectives
Selecting EC2000 Questions
Monitoring Faculty Usage
One of our culture-change metrics is the percentage of faculty who are capitalizing on the system and adding custom and EC2000 questions; currently around 15%.
Course Evaluation Results
Web page access:
Current term's assessment: limited time window, limited access, secure site
Previous terms' results: open access to numerical results, not comments
Email results: individual faculty; aggregate data to department chairs
(A small sketch of this access policy follows.)
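The access rules above amount to a small policy check. A hypothetical sketch, with term labels, dates, and the viewing window invented for illustration (the real system enforces this on its results pages):

```python
# Hypothetical sketch of the results-access policy described above:
# current-term results are visible only inside a limited window, previous
# terms' numerical results are open, and comments are never shown publicly.
import datetime

def can_view(term, field, current_term, window, today):
    """window: (start, end) dates during which current-term results are open."""
    if field == "comments":
        return False                      # comments are never shown publicly
    if term == current_term:
        start, end = window
        return start <= today <= end      # limited time window, secure site
    return True                           # previous terms: open numerical results

window = (datetime.date(2001, 5, 1), datetime.date(2001, 5, 21))
print(can_view("Spring 2001", "numeric", "Spring 2001",
               window, datetime.date(2001, 5, 10)))   # True
```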
Reporting
Promoting Responses
Student-driven results website
Multiple targeted emails to students and faculty from the Dean
Announcements in classes
Posters all over the school
Random prize drawing
Closing the Loop
Does It Work?
Student response rates have steadily increased over the past two years, from 72% to 85%.
More detail in students' written comments in course assessments.
Data is available that we have never had before.
Faculty use of the ABET EC2000 and customized-question features is increasing but still limited (15%).
Cross Institutional Assessment with a Customized Web-Based Survey System
Mary Besterfield-Sacre & Larry Shuman, University of Pittsburgh
This work is sponsored by two grants: one from the Engineering Information Foundation (EiF 98-01, Perception versus Performance: The Effects of Gender and Ethnicity Across Engineering Programs) and one from the National Science Foundation (Action Agenda, EEC-9872498, Engineering Education: Assessment Methodologies and Curricula Innovations).
Why a Web-Based Survey System for Assessment?
Need for a mechanism to routinely elicit student self-assessments and evaluations and to facilitate both tracking and benchmarking
Most engineering schools lack sufficient resources (expertise, time, funds) to conduct the requisite program assessments
Triangulation of multiple measures
Pitt On-Line Student Survey System (Pitt-OS3)
Allows multiple engineering schools to conduct routine program evaluations using EC2000-related, web-based survey instruments.
Assesses and tracks students at appropriate points in their academic careers via questionnaires.
Surveys students throughout their undergraduate career: freshman (pre and post), sophomore, junior, senior, alumni.
Freshman orientation expanded to include: Math Placement Examinations, Mathematics Inventory, Self-Assessment
Student-Focused Model
[Diagram: the student's development toward the EC outcomes, with labels including knowledge-based competence, application area, synthesize multiple areas, can take on complexity, accept ambiguity, welcome environment, confidence, develop comfort, preparation, opportunity and application, work experience, and attitudes and valuing.]
System-Focused Model
[Diagram: core processes (who, what, how): the student, curriculum, culture, in-class instruction, and learning through experience. Enablers & enhancers: School of Engineering services, engineering management, advising/counseling, and university services. Outcomes: student growth in knowledge, skills, and attitudes.]
Pitt OS3
Conducts routine program evaluation via surveys through the web: data collection; report generation (under development)
Web versus paper surveys:
Pros: administration ease, minimized obtrusiveness, “cleaner” data
Cons: lower response than paper-and-pencil surveys; user/technical issues
Pitt OS3: System Components
[Diagram: the On-Line Student Survey System (OS3) connected through the Internet to a global administrator maintaining the system, local administrators controlling surveys "A" and "B", and students taking surveys "A" and "B".]
Pitt OS3: Local Administrator
An individual at the school where the surveys are being conducted, responsible for administering the surveys through a web interface.
Controls the appearance of the survey: selects school colors; uploads school emblem/logo
Selects survey beginning and ending dates
Composes initial and reminder email letter(s) to students
Cuts and pastes user login names and email addresses
Manages surveys in progress; extends surveys beyond original dates
(A sketch of the invitation-mailing step follows.)
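A hypothetical sketch of the invitation-mailing step: merge the pasted login names and email addresses into the composed letter and send one message per student. The template fields, function name, and SMTP host are invented; the real system drives this from its web interface.

```python
# Hypothetical sketch of the local administrator's invitation mailing.
# All names (template fields, sender address, SMTP host) are invented.
import smtplib
from email.message import EmailMessage

TEMPLATE = """Hello,

You are invited to take the {survey} survey, available until {end_date}.

Your username is: {username}
Your password is: {password}

Web location: {url}
"""

def send_invitations(students, survey, end_date, url,
                     sender="[email protected]", host="localhost"):
    """students: iterable of (email, username, password) tuples."""
    with smtplib.SMTP(host) as smtp:
        for email_addr, username, password in students:
            msg = EmailMessage()
            msg["From"] = sender
            msg["To"] = email_addr
            msg["Subject"] = f"{survey} Survey"
            msg.set_content(TEMPLATE.format(
                survey=survey, end_date=end_date,
                username=username, password=password, url=url))
            smtp.send_message(msg)   # one personalized message per student
```

The same function can be reused for the reminder letters by swapping in a different template.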
Pitt OS3: Local Administrator
[Screenshots: four views of the local administrator interface.]
Pitt OS3: Student
Java applet running in a web browser
One question per screen minimizes scroll-bar confusion
Once the student submits the questionnaire, the results are compressed and sent to the OS3 server
Results are stored and the student's password is invalidated
A confirmation screen thanks the student for taking the survey
Can accommodate users who do not have email accounts
(A server-side sketch of this submission flow follows.)
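A hypothetical server-side sketch of that submission flow: inflate the compressed answers, store them, and invalidate the one-time password so the questionnaire cannot be retaken. The storage layer, payload format, and all names are invented, since the talk does not describe the server internals.

```python
# Hypothetical sketch of the OS3 submission flow described above.
# The real server's payload format and storage are not specified;
# this sketch assumes zlib-compressed JSON and a sqlite3 store.
import json
import sqlite3
import zlib

conn = sqlite3.connect("os3.db")
conn.execute("CREATE TABLE IF NOT EXISTS results (login TEXT, answers TEXT)")
conn.execute("CREATE TABLE IF NOT EXISTS accounts "
             "(login TEXT PRIMARY KEY, password TEXT, valid INTEGER)")

def receive_submission(login, compressed_payload):
    """Store a completed questionnaire and burn the student's password."""
    answers = json.loads(zlib.decompress(compressed_payload))
    conn.execute("INSERT INTO results VALUES (?, ?)",
                 (login, json.dumps(answers)))
    # Invalidate the password so the survey cannot be taken twice.
    conn.execute("UPDATE accounts SET valid = 0 WHERE login = ?", (login,))
    conn.commit()
    return "Thank you for taking the survey."   # confirmation screen text

payload = zlib.compress(json.dumps({"q1": "agree"}).encode())
print(receive_submission("mary", payload))
```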
Pitt OS3: Sample Student Email

Subject: Freshman Engineering Attitudes Pre-Survey
To: [email protected]

Hello and Welcome to the Colorado School of Mines!

You are invited to participate in a research study designed to study students' attitudes about engineering, mathematics, and science. This information will help CSM to design more effective courses and programs to enhance your undergraduate education.

The survey is called the Freshman Engineering Attitudes Pre-Survey. If you decide to participate, you will be asked to complete this survey twice: once at the beginning of the semester and again at the end of the academic year. The questionnaire, which takes less than 15 minutes to complete, can be taken any time at your leisure; however, the pre-survey will only be available until 2000-09-22.

Please remember that there are no right or wrong answers, so be honest with your responses. Your responses will remain confidential. If you have questions about this study, please contact Dr. Barbara Olds [ext. 3991 or [email protected]] or Dr. Ron Miller [ext. 3892 or [email protected]].

Your decision to participate in this study is voluntary and there is no penalty if you decide not to participate.

For your convenience, the University of Pittsburgh has made it possible to take the survey online:
Web location: http://136.142.87.142/os3/SurveyClient.html?=4

Your username is: Mary
Your password is: Mary715

If you experience technical problems taking the survey, please contact Dr. Ray Hoare via email at [email protected].

Your participation in this project is important to us. Once you have completed the survey, please stop by the McBride Honors Program office to pick up a small token of our appreciation. Thank you for your help with this important project.

Barbara M. Olds, Professor of Liberal Arts & International Studies
Ronald L. Miller, Professor of Chemical Engineering
Pitt OS3: Student Screens
[Screenshots: student welcome, student instructions, and a questionnaire page.]
Pitt OS3: How It Works
Every day, OS3 summarizes all active surveys for each school.
The summary reports the number of students who have and have not taken the survey.
Specific students can also be viewed from the local administrator's account.
Upon completion of the survey dates, email addresses are stripped from the system; only login names remain with the results.
The only time the OS3 system holds student email addresses is while the local admin is receiving daily updates about their active surveys.
(A sketch of this daily cycle follows.)
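A hypothetical sketch of that daily cycle: count who has and has not taken each active survey, and strip email addresses once a survey's end date has passed. The schema and all field names are invented for illustration.

```python
# Hypothetical sketch of the OS3 daily update and email-stripping step.
# Table and column names are invented; ISO date strings compare correctly.
import datetime
import sqlite3

conn = sqlite3.connect("os3.db")
conn.execute("CREATE TABLE IF NOT EXISTS surveys (id TEXT, end_date TEXT)")
conn.execute("CREATE TABLE IF NOT EXISTS students "
             "(survey TEXT, login TEXT, email TEXT, taken INTEGER)")

def daily_update(today=None):
    today = today or datetime.date.today().isoformat()
    for survey_id, end_date in list(conn.execute(
            "SELECT id, end_date FROM surveys")):
        done, = conn.execute(
            "SELECT COUNT(*) FROM students WHERE survey = ? AND taken = 1",
            (survey_id,)).fetchone()
        pending, = conn.execute(
            "SELECT COUNT(*) FROM students WHERE survey = ? AND taken = 0",
            (survey_id,)).fetchone()
        if today <= end_date:
            # Active survey: report counts to the local administrator.
            print(f"{done} have taken the survey. "
                  f"{pending} have not yet taken the survey.")
        else:
            # Survey over: keep only login names with the results.
            conn.execute("UPDATE students SET email = NULL WHERE survey = ?",
                         (survey_id,))
    conn.commit()
```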
Pitt OS3: Sample Daily Report

Date: Mon, 18 Jun 2001 13:10:21 -0500 (EST)
Date-warning: Date header was inserted by pitt.edu
From: [email protected]
Subject: Math Inventory Daily Update
To: [email protected]

The Math Inventory survey for University of Pittsburgh Freshman was started on 2001-05-18. The last day for the survey is 2001-08-20.

227 have taken the survey.
3 have not yet taken the survey.

The survey system is online at http://166.153.77.154/os3/Student.html?=99,local. You can check the status of individual students as well as change other options, such as the color scheme, through your local administrator account:

Username: local
Password: xxx1234
Pitt OS3: Evaluation of the System
Piloted at five schools: multiple surveys run concurrently at each school; multiple schools at one time
Response rates vary (30-70% on average)
Example: University of Pittsburgh, April 2001; one initial email with two reminder emails over 2.5 weeks
Responses: freshmen 70%, sophomores 48%, juniors 44%
Varied by department
Some usernames had “+”
Pitt OS3: System Trace of One School (Freshman Post-Survey)
Survey available for two weeks, with one reminder message
57% overall response rate
Increased server traffic 2 to 24 hours after each email
Design concerns:
63% of students had to log in more than once; the multiple logins were due to case-sensitive passwords
14% never finished: browser problems, or didn't want to finish
10% gave up: just didn't complete the login
(A sketch of one fix for the password problem follows.)
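One plausible fix for the case-sensitivity problem is to compare passwords case-insensitively, or equivalently to issue lowercase-only passwords. A minimal sketch, assuming one-time survey passwords for which the reduced keyspace is acceptable:

```python
# Hypothetical fix for the case-sensitive password problem noted above:
# fold both the issued and the typed password to lowercase before comparing.
# Acceptable here because these are low-value, one-time survey passwords.
def credentials_match(issued_password, typed_password):
    return issued_password.strip().lower() == typed_password.strip().lower()

assert credentials_match("Mary715", "mary715")        # a case slip still works
assert not credentials_match("Mary715", "mary716")    # wrong password fails
```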
Pitt OS3: Issues to Consider
Consent for human subjects: discuss with your institution's Institutional Review Board; surveys are often exempt
Java applets are not supported by very old browsers; HTML as an alternative
Firewalls established by other organizations