e-Learning Systems as Research Platforms
Results from the Networked Education Database (NED) Project
Matthew Pittinsky, Hui Soo Chae, Anthony Cocciolo, Bert Ross, Tom Angelo
JULY 10, 2007
2
NED
The Problem
3
Original Data Collection
• Requires precious classroom time.
• Informed consents are difficult to secure.
• Customizing instruments for context across sites and time is costly and discourages certain types of data collection (e.g. sociometric).
• Data entry and coding inhibit sharing and re-use.
• Incomplete responses undermine results and inhibit certain types of data collection (e.g. full classrooms).
4
Major Secondary Datasets
• International studies (e.g. TIMSS).
• Federal studies (e.g. NELS, HSB).
• State data warehouses (e.g. Florida K-20 Education Data Warehouse (EDW)).
• Sponsored private studies (e.g. AddHealth).
5
Secondary Datasets: Issues
• Require tough trade-offs when operationalizing specific research questions.
– e.g. classmate effect studies.
• Often based on stratified samples, not whole classrooms and schools.
– e.g. same-teacher class periods.
• Rarely longitudinal within academic years.
• Rarely contextualized (e.g. relationship questions that require a roster).
6
NED: A Vision
• Could “generic” and custom data be collected through school systems automatically, and anonymously, massively reducing the cost and complexity of educational research?
• Schools have long invested in student administrative systems.
• Schools are now adopting eLearning systems equipped with gradebooks, class rosters and Web-based survey (assessment) tools.
• Both are Internet-enabled.
7
NED
The Solution
8
NED: Data Collection Model
• Asynchronous (outside class time)
• Automatic (pre-scheduled, add/drops)
• Contextual (draws on system data to generate questions)
• Non-duplicative (uses already stored or entered data where possible)
• Complete (form checks)
• Efficient (paperless and coded)
• Sustainable (self-perpetuating)
• Anonymous (unique ID)
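The “Contextual” point above — questions generated from system data such as the class roster — can be illustrated with a small sketch. The function name and output shape are hypothetical (the deck does not describe the actual survey-generation code); excluding the respondent’s own name reflects the sociometric self-reference concern noted later in the deck.

```python
def roster_question(prompt: str, roster: list[str], respondent: str) -> dict:
    """Render a roster-based (sociometric) item from class context.

    The respondent's own name is dropped from the choices so students
    cannot nominate themselves.  Choices are sorted for a stable display.
    """
    return {
        "prompt": prompt,
        "choices": sorted(name for name in roster if name != respondent),
    }
```

A hypothetical usage: `roster_question("Who do you study with?", roster, me)` would yield a multiple-choice item customized to each class without any manual instrument editing.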
9
Participant Anonymity
• Data arrives to NED as secondary data (anonymous and coded).
• Survey responses are tagged with unique participant IDs.
• Generic data tagged with same ID and automatically merged with survey responses (gradebook, demographic, course overlap).
• Structure of unique participant ID allows for sorting by class and school, however...
• Data arrives at NED without any knowledge of participant’s school or classroom identity.
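The deck does not specify how a participant ID can be both structured (sortable by school and class) and opaque (revealing no identity). One plausible sketch, assuming a secret key held only at the site and hypothetical school/class codes — none of these names come from the NED project itself — is a keyed hash of the local student ID:

```python
import hashlib
import hmac

def ned_participant_id(site_secret: bytes, school_code: str,
                       class_code: str, local_student_id: str) -> str:
    """Build a structured but opaque participant ID (illustrative only).

    The school and class codes are arbitrary local labels, so the
    receiving dataset can sort records by school and class without
    knowing which real school or classroom they refer to.  The student
    component is an HMAC of the local student ID under a secret key
    that never leaves the site, so the mapping back to a real student
    cannot be reconstructed downstream.
    """
    pseudonym = hmac.new(site_secret, local_student_id.encode(),
                         hashlib.sha256).hexdigest()[:12]
    return f"{school_code}-{class_code}-{pseudonym}"
```

The same student always maps to the same ID (enabling longitudinal merges), while anyone without the site secret sees only an arbitrary code.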
10
Participant Confidentiality
• Survey responses are encrypted and automatically self-delete on local server.
• No school official has access to student responses or completion status.
• Data transmission is via secure protocol.
11
NED Data Feed: Example

ned_id  ned_course_id  ned_questions_pk1  ned_answer                        role
------  -------------  -----------------  --------------------------------  ----
1       6395           1                  C9F0F895FB98AB9159F51FD0297E236D  P
1       6395           2                  B38A9DE63E6BA841FA6AD56AF95431FC  P
1       6395           3                  20D3DC7E7E8014DC92BD81AD0AE37B83  P
1       6395           4                  86E89BF7F214F556284F8577DA3ED1B8  P
1       6395           5                  0DB377921F4CE762C62526131097968F  P
1       6395           6                  63889CFB9D3CBE05D1BD2BE5CC9953FD  P

Grade Item  Item Grade  NED User  NED Course
----------  ----------  --------  ----------
asdf        NULL        1         6392
asdf        10          1         6392
asdf        10          31640     6392
asdf        10          31652     6392
Test 2      0           31650     6395
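To show how feed rows like those above could be combined once they arrive, here is a minimal sketch using in-memory stand-ins for the two tables. The dictionary layout, the sample grade rows, and the choice to join on `ned_id` alone are assumptions for illustration, not the project’s actual schema or join logic.

```python
# Hypothetical in-memory versions of the two NED feed tables.
responses = [
    {"ned_id": 1, "ned_course_id": 6395, "question": 1,
     "answer": "C9F0F895FB98AB9159F51FD0297E236D", "role": "P"},
    {"ned_id": 1, "ned_course_id": 6395, "question": 2,
     "answer": "B38A9DE63E6BA841FA6AD56AF95431FC", "role": "P"},
]
grades = [
    {"item": "asdf", "grade": None, "ned_id": 1, "ned_course_id": 6392},
    {"item": "asdf", "grade": 10, "ned_id": 1, "ned_course_id": 6392},
    {"item": "Test 2", "grade": 0, "ned_id": 31650, "ned_course_id": 6395},
]

def merge_feed(responses, grades):
    """Attach each participant's grade records to their survey rows.

    Joins on the anonymous ned_id alone, on the assumption that a
    student keeps one ID across courses, so gradebook data from any of
    their courses can follow the survey responses.
    """
    by_user = {}
    for g in grades:
        by_user.setdefault(g["ned_id"], []).append(g)
    return [dict(r, grades=by_user.get(r["ned_id"], [])) for r in responses]
```

In production this join would of course run as SQL at the receiving end; the sketch only makes the “same ID, automatically merged” idea from the anonymity slide concrete.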
12
How NED Works
• School installs NED extension and marks participating classes.
• School’s eLearning system automatically posts survey based on schedule (w/ announcement).
• Participant provides consent.
• Survey is delivered through eLearning system GUI.
• Survey draws on class context (roster, subject matter, student information, etc.) when phrasing customized questions.
• Survey enforces certain completion rules.
• Survey responses stored in special encrypted tables that self-delete after posting.
• Survey responses and pre-existing data (demographic, gradebook) are packaged and securely posted to NED.
• New students are “caught up” when added to course.
• Teacher knows how many completes, but not who.
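The scheduling and catch-up steps above can be sketched as follows. The window names and exact dates are purely illustrative — the deck only says the pilot ran administrations in October, January and May, each live for two weeks.

```python
from datetime import date

# Illustrative two-week administration windows (dates are assumptions,
# matching only the pilot's October / January / May pattern).
WINDOWS = [
    ("fall",   date(2006, 10, 2), date(2006, 10, 16)),
    ("winter", date(2007, 1, 8),  date(2007, 1, 22)),
    ("spring", date(2007, 5, 7),  date(2007, 5, 21)),
]

def due_now(today: date, completed: set[str]) -> list[str]:
    """Administrations to post for a participant right now.

    Any currently live window the participant has not completed is due;
    a student added to the course mid-window is 'caught up' simply by
    being offered whatever window is open on arrival.
    """
    return [name for name, start, end in WINDOWS
            if start <= today <= end and name not in completed]
```

Because the check is driven by the calendar and the participant’s own completion set, no teacher action is needed per student, and the teacher only ever sees aggregate completion counts.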
13
NED
The Pilot
14
NED Pilot
• Custom extension to the Blackboard Learning System.
• Solicited 15 sites, 5 agreed, 3 ultimately participated.
– All secondary schools (2 private / 1 public), 3 different states and regions.
• 19 teachers, 37 classes, 732 participants.
• Three pre-scheduled NED survey administrations (October, January, May).
– Each “live” for two weeks.
• NED staff know site names, but not names of participating schools (if district), teachers or class information.
• Students were provided an incentive to participate.
• Surveys included questions from other datasets to compare responses.
• Approximately 250 development hours.
15–22
[Slide content not captured in transcript]
23
NED
Instructor (additional examples)
24–25
[Slide content not captured in transcript]
26
NED Status
• All three administrations have been completed.
• Several showstopper technical issues identified and resolved.
• Participation varied:

School  T-Part.  S-Part.  C-Part.  T-Rec.    S-Rec.     C-Rec.
1       4        429      16       1 (25%)   111 (26%)  8
2       11       200      15       9 (82%)   120 (60%)  13
3       4        103      8        4 (100%)  66 (64%)   8
TOTAL   19       732      37       14 (74%)  297 (41%)  29 (78%)

* All surveys received were complete.
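The percentages in the totals row can be reproduced from the raw counts, assuming rounding to whole percents (the function name is mine, not the deck’s):

```python
def pct(received: int, eligible: int) -> int:
    """Participation rate as a whole percent, as shown on the slide."""
    return round(100 * received / eligible)

# Totals row: teachers, students, and classes received vs. eligible.
teacher_rate = pct(14, 19)    # 74
student_rate = pct(297, 732)  # 41
class_rate = pct(29, 37)      # 78
```

These reproduce the 74% / 41% / 78% figures in the table directly from the counts.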
27
Implementation Issues
• Not a standard “building block;” required custom coding.
– Bb installations vary, affecting custom code.
• Not Bb’s standard survey tool.
– Survey formatting limited.
– Save and start, adaptive, and timing features limited.
– Low ease of use (e.g. self-reference not grayed out in sociometric questions; matrix questions scroll off screen without freezing the roster).
• Gradebook entries are user-defined, without a standard taxonomy.
• Many schools create one mega-site for all class periods.
– Pilot leans away from core subjects.
– Pilot leans away from same-teacher course sections.
• Many schools do not use Bb as their gradebook or student profile of record.
• Relying on teacher responses for student-level data is not always viable (e.g. mixed age-grade classes).
28
Implementation Issues
• Required “enterprise license” of Blackboard.
• Data transmission via local SQL scripts, not a Web service.
• IP address of the sending site could allow matching of a school name with the unique ID schema.
• Different participant IDs across classes (if a student changed class periods).
• System reports fragmented and unusable without additional programming.
• Total eligible population not included in system report.
• Ideal survey length difficult to assess.
• Anonymity and class-time impact concerns during site solicitation.
• Will students participate?
29
Future Directions
• Implement through standardized APIs and via eLearning system’s survey tool.
• Pilot with larger number of sites.
• Pilot with smaller, more frequent surveys.
• Pilot with full site participation across all classes and grades.
• Pilot with full age-grade population over time.
• Include non-eLearning systems (e.g. TPR) and non-Bb eLearning systems.
– Formalize vendor NED interface program.
• Expand to higher education.
30
NED: A Dataset
• Addresses both eLearning-related and non-eLearning-related research concerns.
• Most appropriate for research questions that require:
– Contextualized instruments
– Anonymous responses
– Asynchronous administration
– Frequent longitudinal collection
– Parsimonious question sets
– A mix of custom and system data
– At scale
31
NED: Imagine
• A national dataset.
• Fed from tens of thousands of sites.
• Collecting unique classroom-level data.
• Throughout the academic year and a student’s educational career.
• With minimal site-specific maintenance.
• Efficiently and cost-effectively.