Measuring Student Success: Using Data to Inform Redesign
Efficacy Program & Course Redesign
Brian Buckley, Program Manager, Efficacy Research
Colleen Kochannek, Executive Marketing Manager, English, Humanities & Social Sciences
What is Efficacy?
ef·fi·ca·cy n. Power to produce a desired effect; effectiveness
ef·fi·ca·cy (Pearson)n. Power to change lives through learning
Agenda
• Pearson’s commitment to efficacy
• Measuring MyLabs’ impact on learning
• How Pearson can help
• Using the results
• Early intervention tool
• Course redesign resources
“We’re setting out to become the efficacy company… we need to define ourselves by how effective we are, by the impact we make.”
- Marjorie Scardino, Pearson CEO, 1996-2012
“…we want to be able to demonstrate that everything we do as a company delivers an improved learning outcome.”
- John Fallon, Pearson CEO, 2013-
Committed to Student Success
Efficacy Framework (each criteria area receives a rating and a rationale summary)

Outcomes / Impact
• Clarity of intended outcome
• Value for money
• Quality of design

Strength of evidence base
• Comprehensiveness of evidence
• Quality of evidence
• Effectiveness of evidence use

Quality of planning and implementation
• Clear and manageable plan of action
• Clear governance and lines of responsibility
• Monitoring and reporting system

Capacity to deliver
• Capacity and culture within Pearson team
• Customer capacity and culture
• Relationships with other stakeholders
Likelihood of impact
So what?
Efficacy
MyLab White Papers
“I’ve helped put more computers in more schools than anybody else in the world, and I’m absolutely convinced that technology is by no means the most important thing. The most important thing is a person.”
- Steve Jobs
Evidence of Success
Trends
• Recognize and embrace the educational value of technology integration
• Require MyLab for at least 10% of the final course grade
• Receive MyLab training and follow recommended best practices
• Provide students with an early introduction to MyLab
  • What it is, how to use it, and how it contributes to their grade
  • Use Pearson’s First Day of Class program
• Enable active class discussion by assigning pre-lecture homework
• Implement assessments to measure student learning gains
  • Pre- and post-tests, standardized exams, common final exams
• Provide historical data: pass rates, comparable-exam scores, etc.
Share your Story
Reading, Writing, and MyFoundationsLab
Colleen Kochannek, [email protected]
Sara Owen, [email protected]
Brian Buckley, [email protected]
Using Data to Inform Redesign
Experimental Study: Introductory Physics
Question: Does homework copying adversely impact learning/retention?
Measure: Pretest scores, exam scores, and rates of copying
Answer Submission Patterns
[Plot: homework answer-submission timing, distinguishing quick solvers, real-time solvers, delayed solvers, and copiers.]
Palazzo, D., et al., Patterns, correlates, and reduction of homework copying, Physics Education Research 6, 010104 (2010)
Exam Scores and Copy Rates
[Plot: exam scores vs. copy rates. The gap between copiers and non-copiers is < 1/3 SD on the pretest but > 2 SD on the final exam.]
Failure Rate and Copy Rates
Redesign
• Large lecture to studio physics
• More instructor contact
• Fewer, more frequent assignments
• Pass/fail to graded homework
Results…
Reduction in Copy Rates
Decrease in D/F rate
Early Intervention Tool
The Use of Fractal Dimension in the
Categorization of Students’ Response Patterns
and in the Prediction of Final Exam Scores
(Random Walk)
William Galen & Rasil Warnakulasooriya
Statistical Learning Algorithms Team
Learning Technologies Group
November 2011
How do we “see” a student in traditional assessment?
e.g., Score = 85%, Letter grade = “B”
How did the student get there?
Correct on First Try Grading and Walks
A step to the left: graded incorrect, score = −1
A step to the right: graded correct, score = +1
After each step, keep track of the Net Score = (# steps to the right) − (# steps to the left)
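The walk described above can be sketched in a few lines of Python (the boolean encoding of responses is an assumption; the slides track only correct-on-first-try vs. not):

```python
def net_score_walk(responses):
    """Cumulative net score: +1 for each correct-on-first-try response,
    -1 otherwise, recorded after every step."""
    net, walk = 0, []
    for correct_first_try in responses:
        net += 1 if correct_first_try else -1
        walk.append(net)
    return walk

# Four responses: right, right, left, right.
print(net_score_walk([True, True, False, True]))  # [1, 2, 1, 2]
```

Plotting this trajectory against the number of responses submitted gives the response-pattern curves shown on the following slides.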
Distinguishing Response Patterns
[Plots: net score vs. responses submitted for two students. Student 57 (fractal D = 1.60) does not show random-walk behavior; the second student’s trajectory does. How can we quantify the differences?]
Distinguishing Response Patterns
[Plots: the student showing random-walk behavior has fractal dimension 1.94; Student 57, not showing random-walk behavior, has fractal dimension 1.60.]
Fractal dimensions calculated following: T. Gneiting, H. Ševčíková, & D. B. Percival, Estimators of Fractal Dimension: Assessing the Roughness of Time Series and Spatial Data, Univ. of Washington (Seattle) Technical Report No. 577, 2010.
Distinguishing Response Patterns
Fractal dimension ranges from 1 (a straight line) to 2 (a plane).
• Near 1: less irregularity in the response pattern, i.e., persistent behavior (an increase in net score at one instance is more likely to be followed by an increase at a future instance).
• Near 2: high irregularity in the response pattern, i.e., anti-persistent behavior (an increase in net score at one instance is more likely to be followed by a decrease at a future instance).
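One simple way to estimate a fractal dimension for a net-score series, in the spirit of the Gneiting et al. report cited in this deck, is a madogram-style estimator: the scaling of mean absolute increments between lags 1 and 2 yields a Hurst-type exponent H, and D = 2 − H. This is only a sketch under that assumption, not the team’s production estimator:

```python
import math
import random

def madogram_dimension(x):
    """Fractal dimension via a madogram-style estimator (D = 2 - H),
    where H is read off the scaling of mean |increments| from lag 1 to lag 2."""
    def mean_abs_increment(h):
        return sum(abs(x[i + h] - x[i]) for i in range(len(x) - h)) / (len(x) - h)
    hurst = math.log(mean_abs_increment(2) / mean_abs_increment(1)) / math.log(2)
    return 2.0 - hurst

# A straight line is maximally smooth (persistent): D = 1 exactly.
line = [float(i) for i in range(1000)]
print(round(madogram_dimension(line), 2))  # 1.0

# A walk with independent steps sits in between: D comes out close to 1.5.
random.seed(0)
walk = [0.0]
for _ in range(20000):
    walk.append(walk[-1] + random.gauss(0.0, 1.0))
print(round(madogram_dimension(walk), 2))
```

Smoother, persistent trajectories push the estimate toward 1; rough, anti-persistent ones push it toward 2, matching the scale on this slide.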
Distinguishing Response Patterns
A Student with a Fractal Dimension 1.32
[Plots: Student 27 (fractal D = 1.32), net score vs. responses submitted, alongside a random-walk trajectory for comparison.]
Distinguishing Response Patterns
A Student with a Fractal Dimension 1.60
[Plots: Student 57 (fractal D = 1.60), net score vs. responses submitted, alongside a random-walk trajectory for comparison.]
Distinguishing Response Patterns
A Student with a Fractal Dimension 1.81
[Plots: Student 16 (fractal D = 1.81), net score vs. responses submitted, alongside a random-walk trajectory for comparison.]
Visualizing the Response Patterns of the Entire Class
[Plot: net score vs. responses submitted, start of course to end of course, for all students.]
• The majority of students’ responses crisscross in one region; net score increases by about 40 points for every 100 responses.
• A band of students responding randomly at the course level breaks away from the majority about a third of the way into the course.
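A quick consistency check on that slope, assuming the ±1 scoring defined earlier: a net gain of 40 points per 100 responses pins down the correct-on-first-try rate.

```python
# net = correct - incorrect = c - (100 - c) = 2c - 100
# Solve 2c - 100 = 40 for c:
net_gain, n = 40, 100
correct_first_try = (net_gain + n) / 2
print(correct_first_try)  # 70.0 -> about 70% correct on first try
```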
A Tale of Two Students
[Plot: net score vs. responses submitted, start of course to mid-course.]
Two skilled students’ response patterns overlap over a period of time; both students have the same net score.
The Struggling Student can be Identified
[Plots: net score vs. responses submitted, start of course to mid-course.]
• One student: short runs of smoothness followed by short runs of irregularity. Learning behavior?
• The struggling student: long runs of smoothness followed by long runs of irregularity, then a tipping point (onset) that triggers an alert.
Putting the Response Patterns of the Two Students in Perspective
[Plot: both students’ net-score trajectories, start of course to end of course.]
Conditions for Triggering Alerts
• Alert: random walking (fractal dimension in the random-walk region)
• Alert: no random walking, but the net score steadily decreases
• No alert: not enough information
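As a rough illustration only (the numeric thresholds and minimum-data rule below are hypothetical assumptions, not values from the deck), the three conditions might be expressed as:

```python
def check_alert(fractal_d, net_scores, min_responses=100, rw_threshold=1.85):
    """Hypothetical sketch of the slide's three alert conditions.
    min_responses and rw_threshold are illustrative assumptions."""
    if len(net_scores) < min_responses:
        return "no alert: not enough information"
    if fractal_d >= rw_threshold:
        return "alert: random walking"
    # "Steadily decreasing" proxy: the trajectory ends below its midpoint value.
    if net_scores[-1] < net_scores[len(net_scores) // 2]:
        return "alert: net score steadily decreasing"
    return "no alert"

print(check_alert(1.94, list(range(0, -200, -1))))  # alert: random walking
print(check_alert(1.30, list(range(0, -200, -1))))  # alert: net score steadily decreasing
print(check_alert(1.30, [1, 2, 3]))                 # no alert: not enough information
```

In practice the fractal dimension would be recomputed over a moving window, so an alert can fire as soon as a student drifts into the random-walk region.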
[Plot: fractal dimension (1.0–2.0) vs. responses submitted, with the random-walk region marked.]
[Plot: average fractal dimension over the first half of interactions vs. final exam score in std. dev. units, binned as (…, −1], (−1, +1), [+1, …). Error bars: 95% CI, SEM.]
Students who scored above 1 std. dev. on the final exam had significantly smoother, more consistent response patterns; lower-scoring students exhibited irregular, inconsistent response patterns. 95% confidence intervals shown.
Thanks to Profs. Randall Hall and Leslie Butler for providing the exam score data
Questions?