Using Assessment System Data to Generate Individualized Learning Materials for Students
What We Learned from Developing the iDAP
Alex Brodersen & Ying Cheng
LAMBS lab, Department of Psychology
University of Notre Dame
A presentation at the 19th MARC conference, Nov. 7, 2019
Overview
1. Background: AP-CAT; iDAP
2. Lessons learned: Teacher Access & Teacher Engagement; Plan for Heterogeneity; Toolchain; Quality Control; Process Data
3. Future: Learning Modules
Background
AP-CAT
Full Name: “Cognitive Diagnostic Computerized Adaptive Testing (CD-CAT) for AP Statistics”
A five-year CAREER project funded by the National Science Foundation
Two major components: research and educational outreach
[Figure: AP-CAT project timeline, 2015-2019, spanning Development, Testing & Observation, and RCT & Refinement phases, with milestones including the item bank, web platform, bugs, UX/UI, item keys, feedback types, learning outcomes, engagement survey, item calibration, adaptive testing, mobile, diagnostic score report, and pilot data]
Item Bank
850 items
4 sections, 16 main topics, 158 attributes
Scaffolding
Step-by-step Solutions
Visible after assignment completion
Feedback
Granular feedback
Areas of opportunity
iDAP
Intelligent Diagnostic Assessment Platform (iDAP) for High School Statistics Education
GOAL: Develop a holistic, personalized learning system integrated into the classroom
iDAP
Leverage the attribute system developed in the AP-CAT
Map to 36 Common Core State Standards on “statistics and probability”
Lessons learned
Lesson 1
Teacher Access & Teacher Engagement
Teacher Access & Teacher Engagement
Teacher approval: delivery engine, reporting, feature requests, rollout of assignments
Example: in-class assignment length
Lesson 2
Plan for Heterogeneity
in the learning environment and in students
Sample Heterogeneity
Not surprising: learning outcomes depend on school
Surprising: student self-predictions of their final exam scores depend on school (Ober et al., in prep); patterns of incorrect responses
Lesson 3
Choose your toolchain wisely
library(dplyr)                               ## tbl(), filter(), select(); database translation via dbplyr

con <- DBI::dbConnect(RMySQL::MySQL(),       ## Connect to DB
                      host = "domain.name.edu",
                      port = 3306,
                      user = "db_user",
                      password = "db_pw",
                      dbname = "db_name")

## dbplyr translates the verbs below into SQL that runs in the database
tbl(con, "activity") %>%
  filter(role == "student") %>%              ## Only student data
  select(-firstname, -lastname,              ## de-identify
         -role, -username,
         -id, -meta)
Lesson 4
Have Quality Control and Contingency Plans
Lesson 4
Not all data are worth collecting
Not all data that have been collected are worth analyzing
# Source:   lazy query [?? x 4]
# Database: mysql 5.7.27-0ubuntu0.18.04.1 [user@domain.name:/dname]
   url          tabname             t                   info
   <chr>        <chr>               <chr>               <chr>
 1 /results/461 results(student)    2019-09-25 11:30:28 <NA>
 2 /results/461 results(student)    2019-09-25 11:30:30 <NA>
 3 /results/461 student_info_btn    2019-09-25 11:30:34 "btn":"info","qid":"773","is_shown":true
 4 /results/461 student_info_btn    2019-09-25 11:30:36 "btn":"info","qid":"436","is_shown":false
 5 /results/461 student_info_btn    2019-09-25 11:31:30 "btn":"info","qid":"223","is_shown":true
 6 /results/461 student_info_btn    2019-09-25 11:31:38 "btn":"info","qid":"773","is_shown":false
 7 /results/461 Attributes(student) 2019-09-25 11:33:20 <NA>
 8 /results/461 results(student)    2019-09-25 11:33:29 <NA>
 9 /results/787 results(student)    2019-09-25 11:43:31 <NA>
10 /results/787 results(student)    2019-09-25 11:46:55 <NA>
# ... with many, many, many more rows
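A short sketch, not from the original slides, of one way to size up this activity log before deciding what is worth analyzing. It reuses the con connection and activity table from the toolchain example above, together with the tabname and role columns visible in this output:

library(dplyr)

## Count how often each interface element produced a log row
tbl(con, "activity") %>%
  filter(role == "student") %>%
  count(tabname, sort = TRUE) %>%   ## translated to SQL by dbplyr
  collect()                         ## pull only the small summary into R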
Lesson 4
Quality Control (a simple screening sketch follows below)
Dummy items
Response time data (Qualtrics hack)
Open text responses to surveys
Contingency Plan
End users are used to having "delete" mean "put in trash"
Choosing not to take the exam
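A minimal sketch of how two of these checks (dummy items and response times) might be screened, not from the original slides; responses, rt_seconds, dummy_correct, and student_id are hypothetical names, and the cutoffs are arbitrary placeholders:

library(dplyr)

## One row per student x item, with a per-item response time and a
## flag for whether an embedded dummy item was answered as instructed
responses %>%
  group_by(student_id) %>%
  summarise(median_rt  = median(rt_seconds, na.rm = TRUE),
            dummy_rate = mean(dummy_correct, na.rm = TRUE)) %>%
  mutate(flag_speeding = median_rt < 2,     ## implausibly fast responding
         flag_dummy    = dummy_rate < 1)    ## missed at least one dummy item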
Lesson 5
System paradata (process data) can provide a wealth of information
Example: Procrastination
Active procrastination (Chu & Choi, 2005)
Criticisms (e.g., delay?) (Krause & Freund, 2014)
Attempting to validate a survey scale via process data
Data Source
Data: lag = Assignment Deadline - Assignment Submission Time, across 5 assignments
Active Procrastination Scale
Modeling lag as a behavioral indicator of procrastination
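A minimal sketch of this lag computation, not from the original slides; submissions, deadline, submitted_at, and student_id are hypothetical names, and the unit of lag (days here) is an assumption:

library(dplyr)

## One row per student x assignment, with the assignment deadline
## and the submission timestamp
submissions %>%
  mutate(lag = as.numeric(difftime(deadline, submitted_at, units = "days"))) %>%
  group_by(student_id) %>%
  summarise(mean_lag = mean(lag, na.rm = TRUE),   ## x-axis of the histogram below
            sd_lag   = sd(lag, na.rm = TRUE),     ## y-axis of the mean-vs-SD plot
            n_assignments = n())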
[Figure: Histogram of Time to Due Date; x-axis: mean lag (0-200), y-axis: frequency (0-25)]
[Figure: Mean vs Standard Deviation of lag; x-axis: mean lag (0-200), y-axis: SD of lag (0-100)]
[Figure: path diagram relating the behavioral latent variable (indicators assgn1-assgn5) to the Active Procrastination Scale factors Outcome Satisfaction (os1-os4), Preference for Pressure (pp1-pp4), Intentional Decision (id1-id4), and Ability to Meet Deadlines (md1-md4); estimated coefficients shown include 0.07, 0.23*, -0.27*, -0.09, 0.23*, 0.76*, 0.48*, 0.14, -0.06, -0.47*, and 0.83]
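One way such a model might be specified in lavaan; a sketch, not the authors' code. The indicator names mirror the diagram, but the direction of the structural paths (regressing the behavioral factor on the scale factors) and the procrastination_data data frame are assumptions:

library(lavaan)

model <- '
  # Behavioral latent variable: lag on the five assignments
  lag_f       =~ assgn1 + assgn2 + assgn3 + assgn4 + assgn5

  # Active Procrastination Scale factors
  outcome_sat =~ os1 + os2 + os3 + os4
  pressure    =~ pp1 + pp2 + pp3 + pp4
  intentional =~ id1 + id2 + id3 + id4
  deadlines   =~ md1 + md2 + md3 + md4

  # Relate the behavioral factor to the self-report factors (assumed direction)
  lag_f ~ outcome_sat + pressure + intentional + deadlines
'

fit <- sem(model, data = procrastination_data)  ## hypothetical data frame
summary(fit, fit.measures = TRUE, standardized = TRUE)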
Model Fit

Model                    CFI    TLI    RMSEA (5%)   RMSEA (95%)   SRMR
Active Procrastination   0.964  0.957  0.074        0.082         0.071
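For reference, indices like those in the table can be extracted from a fitted lavaan object; a usage sketch, where fit is the hypothetical model object from the sketch above and the rmsea.ci bounds are lavaan's default 90% confidence limits (i.e., the 5% and 95% values reported here):

fitMeasures(fit, c("cfi", "tli", "rmsea.ci.lower", "rmsea.ci.upper", "srmr"))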
Summary
Limitations
Future studies: estimate latent procrastination to provide reminders; other applications
Future
A holistic student view
Multiple data sources: assessment data, Big 5, statistics anxiety, free response, help forum
Thank You
Acknowledgments
https://lambslab.nd.edu/
Thank you
Questions?
abroders@nd.edu