Learning analytics
Collecting learning insights: Unpacking the complexity of
learning analytics
Shane Dawson [email protected] Twitter: @shaned07
Question
How familiar are you with the field of learning analytics?
A. Never heard of it
B. A little; have read some research
C. Very familiar; follow the research
D. Expert; undertake research and actively contribute to the field's development
Overview
• Background and overview
• Data and algorithms
• Learning analytics
• Technologies and data
• Coarse-grained to fine-grained data
• Exemplars and case studies
• Complexity of LA
The more things change, the more they stay the same
http://www.pellinstitute.org/downloads/publications-Indicators_of_Higher_Education_Equity_in_the_US_45_Year_Trend_Report.pdf
High-income students are 8 times more likely than low-income students to obtain a bachelor's degree by age 24. In the US the gap is increasing:
• 1970 – 6 times
• 2013 – 8 times
• Student diversity increasing
• Increased part-time employment
• Gender distribution changing across fields (females under-represented in STEM; males under-represented in education, health, arts)
• Decreased HE funding and rising costs
• Increased demand for university places
http://theconversation.com/who-goes-to-university-the-changing-profile-of-our-students-40373
Minimal impact. Digital disruption?
• Student feedback under-utilised
• Assessment processes poor
• Lack of proficiency in self-regulated learning
Freeman, S., et al. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410-8415.
Hattie, J. (2008). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. Routledge.
Big data and analytics
Massive interest in data and analytics.
• > 5 billion mobile phones
• > 1 billion YouTube users
• 300 hours of video uploaded to YouTube per minute
• A $600 drive can store all of the world's music
• 60% increase in operating margin for retailers using big data

It's accessible, cheap and critical.
Manyika, J., et al. (2011). Big Data: The Next Frontier for Innovation, Competition, and Productivity. McKinsey Global Institute.
Algorithms – predictions
Predictive modelling
Target coupons inform a father of his daughter's pregnancy
Analytics for everything
Education is no different
• Huge investment in analytics
• Ease of access to learner data – LMS
• Growth in technical devices
• Growth in blended/online learning models
Education context
Examples in education
• Academic performance
• Student retention
• Pastoral care
• Academic literacies
• Social networks – collaborations
• Student from Shanghai-based East China Normal University
• "Last month, you spent less on meals. Are you in financial difficulty? If so, please contact me via phone, text message or e-mail."
http://www.bjreview.com.cn/nation/txt/2014-06/23/content_625466.htm
Education context
• Automatically track students' meal card spending.
• If spending falls under a threshold level, a designated faculty member sends the student a short message to check whether they are in financial difficulty.
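The threshold rule described above can be sketched in a few lines. This is an illustrative sketch only: the field names, student IDs and the threshold value are assumptions, not details of the university's actual system.

```python
# Hypothetical sketch of the meal-card alert rule described above.
# The threshold value and data are illustrative assumptions.

LOW_SPEND_THRESHOLD = 200.0  # monthly spend below which a student is flagged


def flag_low_spenders(monthly_spend, threshold=LOW_SPEND_THRESHOLD):
    """Return student IDs whose meal-card spend fell under the threshold."""
    return [sid for sid, spend in monthly_spend.items() if spend < threshold]


spend = {"s001": 450.0, "s002": 120.0, "s003": 310.0}
flagged = flag_low_spenders(spend)
# A designated staff member would then send each flagged student a short,
# private message asking whether they need support.
print(flagged)  # ['s002']
```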
Education context
What are learning analytics?
• Academic analytics
• Learning analytics
• Institutional analytics
• Educational data mining
• Assessment analytics
• Social learning analytics
• Business intelligence
Learning Analytics
• Learning analytics is a bricolage field • Incorporates diverse methods and approaches:
– Statistics – Data mining – Machine Learning – Artificial Intelligence – Text Mining – Information Visualization – Social Network Analysis
Learning Analytics
Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. EDUCAUSE review, 46(5), 30.
Definition
…is the collection, collation, analysis and reporting of data about learners and their contexts, for the purposes of understanding and optimizing learning
Learning Analytics • “game changer” for education
Definition
Learning analytics …the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data
Cooper, 2012
Definition
Similarities in definitions: Use of learner/educational data to provide actionable insight to aid the learning process
Large data sets – trends/ patterns or anomalies.
Learning Analytics
Get answers to your most important questions like: • How can I easily find students who are at-risk?
• Data environments
• Analysis methods for LA
• What tools?
• Learning analytics cycle
• Worked example
Technologies and analyses
Data types
Data environments
• Student information system (SIS)
• Learning management system
• Assessment
• Survey data (e.g. MSLQ; COI; Grit)
• Course evaluations
• Other related learning technologies: publishers, quizzes, video
• Structured data
– Data stored in database tables, highly organized
– e.g. student name, grades, address
• Unstructured data
– Everything else: text, audio, images, video
Data types
• Data environments
• Analysis methods for LA
• What tools?
• Learning analytics cycle
• Worked example
Technologies and analyses
Analysis methods See: “Big data and Education” Ryan Baker. http://www.columbia.edu/~rsb2162/bigdataeducation.html
Different data types - different analytical approaches.
Data analysis
• Unstructured data analysis
– Natural language processing and text analytics, e.g. topic modelling; sentiment analysis (+/-); entity extraction (people, places, dates, times, etc.)
– Audio analysis, e.g. speaker identification, language identification, speech to text
– Video analysis, e.g. facial and object recognition
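To make the sentiment (+/-) idea concrete, here is a toy lexicon-based scorer. The word lists are invented for illustration; real text-analytics tools use far larger lexicons and handle negation, intensifiers and context.

```python
# Toy lexicon-based sentiment scoring: count positive minus negative words.
# The word lists are illustrative assumptions, not a real lexicon.

POSITIVE = {"good", "great", "helpful", "clear", "enjoyed"}
NEGATIVE = {"bad", "confusing", "boring", "unclear", "difficult"}


def sentiment_score(text):
    """Return (#positive words - #negative words); >0 positive, <0 negative."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)


print(sentiment_score("the lectures were clear and helpful"))  # 2
print(sentiment_score("the forum was confusing and boring"))   # -2
```

Applied to forum posts or course evaluations, even this crude score can surface courses where discussion tone is trending negative.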
Data analysis
• Descriptive statistics:
– sums, means, standard deviations, basic plotting (graphs, charts, histograms)
• Machine learning:
– find patterns in data to perform classification, clustering or prediction
– e.g. decision trees, neural networks, support vector machines, linear regression, self-organising maps, k-means
• Predictive analytics:
– algorithmic approaches (generally machine learning) for predicting key target variables of interest
Data analysis
Technologies and analyses
Common approaches – structured data
• Prediction
• Structure discovery
• Relationship mining
Baker, R., & Salvador Inventado, P. (2014). Educational data mining and learning analytics. In J. Larusson & B. White (Eds.), Learning Analytics (pp. 61-75). Springer New York.
Technologies and analyses
• Prediction
• Classification
• Regression
• Latent knowledge estimation
Models to "predict" an outcome variable based on predictor variables
Technologies and analyses
Examples
• Predict:
• Which students will pass a course?
• Which students will stay in the course?
• Can inform:
• Support interventions
• Curriculum design, assessment, recruitment, marketing, pre-requisites
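A minimal sketch of pass/fail prediction, using a tiny nearest-neighbour classifier in plain Python. The features (weekly logins, forum posts) and the training data are invented for demonstration; real models use far richer features and tools such as scikit-learn, Weka or R.

```python
# Illustrative prediction sketch: classify a student as pass (1) / fail (0)
# by the most similar past student. All data here are invented.


def predict_pass(history, student):
    """Nearest-neighbour prediction from ((features), label) records."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = min(history, key=lambda rec: dist(rec[0], student))
    return nearest[1]


# ((logins_per_week, forum_posts), passed)
history = [((1, 0), 0), ((2, 1), 0), ((8, 5), 1), ((10, 7), 1)]
print(predict_pass(history, (9, 6)))  # 1 – resembles past passing students
```

The point is not the algorithm but the shape of the task: historical learner data with known outcomes is used to score current students, and the scores then inform interventions.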
Technologies and analyses
• Structure discovery
• Clustering
• Factor analysis
• Network analysis
Identifies emergent patterns and structure in data; no specific target or predictor variable. Network analysis is explored later.
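Structure discovery has no target variable: clusters emerge from the data alone. A minimal one-dimensional k-means sketch (k = 2) on invented weekly login counts shows the idea; real analyses would use multi-feature data and tools such as R, Weka or scikit-learn.

```python
# Minimal 1-D k-means (k = 2): engagement clusters emerge with no labels.
# The login counts are invented for illustration.


def kmeans_1d(values, iters=10):
    lo, hi = min(values), max(values)  # initialise centres at the extremes
    for _ in range(iters):
        a = [v for v in values if abs(v - lo) <= abs(v - hi)]
        b = [v for v in values if abs(v - lo) > abs(v - hi)]
        lo, hi = sum(a) / len(a), sum(b) / len(b)  # recompute centres
    return lo, hi


logins = [1, 2, 2, 3, 9, 10, 11, 12]
print(kmeans_1d(logins))  # (2.0, 10.5) – a low- and a high-engagement cluster
```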
Technologies and analyses
Analysis methods
• Relationship mining
• Discover relationships between variables
• Association rule mining
• If/then statements that analyse and predict behaviour
• Interpret with care (e.g. the diapers-and-beer and Target examples)
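Association rules are judged by support (how often antecedent and consequent co-occur) and confidence (how often the consequent follows the antecedent). A sketch over invented basket data, using the classic diapers-and-beer rule:

```python
# Support and confidence for an if/then rule over transaction data.
# The baskets are invented for illustration.


def rule_metrics(transactions, antecedent, consequent):
    """Return (support, confidence) for the rule antecedent => consequent."""
    n = len(transactions)
    has_a = [t for t in transactions if antecedent.issubset(t)]
    has_both = [t for t in has_a if consequent.issubset(t)]
    support = len(has_both) / n
    confidence = len(has_both) / len(has_a) if has_a else 0.0
    return support, confidence


baskets = [
    {"diapers", "beer", "milk"},
    {"diapers", "beer"},
    {"diapers", "bread"},
    {"milk", "bread"},
]
print(rule_metrics(baskets, {"diapers"}, {"beer"}))  # (0.5, 0.666...)
```

High confidence alone does not imply a causal or even useful relationship, which is exactly why the slide warns about interpretation.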
• Analysis methods for LA?
• What tools?
• Learning analytics cycle
• What questions?
Technologies and analyses
Technologies and analyses
Tools
• Weka: http://www.cs.waikato.ac.nz/ml/weka/
• R: www.r-project.org
• RapidMiner: https://rapidminer.com/
• SNA-specific tools: Gephi, NetDraw/UCINET, Pajek
• Analysis methods for LA?
• What tools?
• Learning analytics cycle
• Worked example
Technologies and analyses
The learning analytics cycle
Technologies and analyses
Clow, D. (2012). The learning analytics cycle: closing the loop effectively. In Proceedings of the 2nd international conference on learning analytics and knowledge (pp. 134-138). ACM.
Technologies and analyses
[Diagram: the learning analytics cycle links student demographics (FT/PT); learning design; engagement, time and technology choice; support processes, dashboards and sensemaking; assessment models; retention and student success]
Technologies and analyses
Develop an early alert process for student retention and success
Technologies and analyses
Data sources • Student Information System
• Entry pathway (high school; mature age)
• Past subjects studied/grades
• International/domestic
• Parental education
• Home address
• Program of study
• Course of study
• Course load
• PT/FT
Technologies and analyses
Data sources • Learning Management System
• Access
• Forum postings, reads, replies
• Quiz attempts and scores
• Lecture capture
• Downloads
• Video
• Assignment submissions
• Turnitin scores
Technologies and analyses
A simple model:
• Queensland University of Technology (QUT)
• Built on Vincent Tinto's work on social and academic integration
• First Year Experience
• Transition pedagogy
Technologies and analyses
• All student first-year data
• Regression analysis – students that pass/fail
• Identified significant variables:
• Students with no course access
• Students that fail or do not submit assessment
• Course access
Technologies and analyses
• Basic analytics to monitor engagement
• LMS (Blackboard) data • Failure of first assessment – trigger event
• Why a trigger at this point?
Technologies and analyses
A replicable/transferable model:
• ESAP – Enhancing Student Academic Potential
• Trained callers
• Trigger: failure or non-submission of first assessment
• Focus on first-year courses
Technologies and analyses
A worked example – a replicable model:
• ESAP – Enhancing Student Academic Potential
• Trained callers
• Trigger: failure or non-submission of first assessment
• Focus on first-year courses
Result: significant improvement in academic performance; no improvement in retention.
[Chart: student risk bands (high, medium, low, no risk) from course start to completion]
Technologies and analyses
Learning support intervention
[Chart: risk bands (high, medium, low, no risk) from course start to completion, with the intervention point marked]
Technologies and analyses
Establish earlier support processes – intervention touch points
Technologies and analyses
Earlier interventions required • CBA – Classification based on Associations
• Association mining – all associations • CBA - Pre-determined target in this case student
retention
Technologies and analyses
Risk indicators: student demographic data
Touch point 1 – pre-course commencement
• Parental level of education
• High school
• Distance from university campus
• International
[Chart: student risk bands (high, medium, low, no risk) from course start to completion]
Technologies and analyses
Establish earlier support processes – touch points
Technologies and analyses
What type of intervention does this information serve?
Technologies and analyses
Advisory and preparatory conversation only.
What can teaching staff do with this information?
Technologies and analyses
Risk indicators: student demographic + LMS data
• Course login
• Comparative engagement
• Parental level of education
• High school
• Distance from university campus
• International
Technologies and analyses
What type of intervention does this information serve? Poor SRL skills
[Chart: student risk bands (high, medium, low, no risk) from course start to completion]
Technologies and analyses
Establish earlier support processes – intervention touch points
Technologies and analyses
Risk indicators: student demographic + LMS + assessment data
• Assessment completion and score
• Course login
• Comparative engagement
• Parental level of education
• High school
• Distance from university campus
• International
Technologies and analyses
What type of intervention does this information serve? Confirmatory – students do have poor SRL skills
[Chart: student risk bands (high, medium, low, no risk) from course start to completion]
Technologies and analyses
Establish earlier support processes – intervention touch points
Summary – risk indicators at:
Pre-course commencement
• Student demographic data; secondary education or alternate entry
Course commencement
• Previous + engagement with the university (e.g. learnonline)
Mid-course
• Previous + formative and summative assessment
Technologies and analyses
Impact: retention initiative
• 11,160 students
• 2,158 (19.5%) classified as academically at risk
• Attempts to contact all 2,158 students
• 1,532 were contacted
• 5.2% improvement in retention
Technologies and analyses
Impact: retention initiative
• Trigger with greatest impact? Course access
• Scalable, easy to interpret
• Conversations have greater resonance
Technologies and analyses
Dashboard
Quick example: Purdue University
Arnold, K. E., & Pistilli, M. D. (2012). Course Signals at Purdue: Using learning analytics to increase student success. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 267-270), ACM.
Example
Tanes, Z., Arnold, K. E., King, A. S., & Remnet, M. A. (2011). Using Signals for appropriate feedback: Perceptions and practices. Computers & Education, 57(4), 2414-2422.
Example
More than simple prediction
Is this accurate for all courses?
Example patterns
[Charts: average student login frequency per week over weeks 1–13 – class average, high performing student, typical login engagement pattern]
[Charts: grades (0–100) vs time of first login, weeks -4 to 12 (0 = course start)]
[Chart: percentage breakdown (0–100%) by group – females; international students; other language at home; living in non-urban areas; part-time students; previously enrolled in a course; early access / did not access / late access]
[Chart: results (0–100%) per course for three models – Model 1, Moodle, Model 1 + Moodle – across all courses together and ACCT 1008 (n = 746), BIOL 1007 (n = 220), BIOL 1047 (n = 657), COMM 1060 (n = 499), COMP 1039 (n = 242), ECON 1008 (n = 661), *GRAP 1017 (n = 192), MARK 1010 (n = 723), MATH 1063 (n = 194)]
Instructional conditions
Instructional conditions shape LA results
Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68-84.
Summary
1. Multiple drivers and new opportunities for LA/EDM
2. LA is about generating insights into student learning for optimizing learning
3. Multiple analytical approaches – each has its strengths and weaknesses
4. Correlations for the sake of correlation are NOT important
5. What we do with information is critical – timing and content
6. Ultimately, to improve learning we also need to improve teaching.
Questions?
Next session
1. Moving from coarse-grained to fine-grained data
2. An introduction to social network analysis
3. Video analytics
4. Why so few large-scale examples of LA?
From coarse-grained to fine-grained
LMS/SIS data are coarse-grained. Can we gain further insight?
Learning theory
"Traditional" learning theory vs social learning theory:
• Where thinking happens: in the head vs in interactions with the world
• Unit of analysis: individual vs individual to community
• Measurement focus: outcomes vs process
• How learning happens: transmission of information vs knowledge construction
I've become convinced that understanding how networks work is an essential 21st century literacy
Howard Rheingold
Social Network Analysis
One of the most common methods in LA What is SNA? Social network analysis [SNA] is the mapping and measuring of relationships between actors (people, groups, organisations, etc.). SNA can identify patterns of information flow and structure
Social Network Analysis
History of SNA: graph theory, psychology and anthropology
Seven bridges of Königsberg
Social Network Analysis
Seven bridges of Königsberg – graph theory; Euler path (Leonhard Euler, 1707–1783)
Can you walk through the city crossing each bridge exactly once?
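Euler's answer reduces to counting: an Euler path (crossing every edge exactly once) exists in a connected graph only if zero or two vertices have odd degree. All four Königsberg land masses have odd degree, so no such walk exists. A short sketch (assuming the graph is connected, as Königsberg's is):

```python
# Degree-parity test for an Euler path (assumes a connected graph).
from collections import Counter


def has_euler_path(edges):
    """True iff zero or two vertices have odd degree."""
    degree = Counter()
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    odd = sum(1 for d in degree.values() if d % 2 == 1)
    return odd in (0, 2)


# Königsberg: island A, banks B and C, land mass D, seven bridges.
bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
           ("A", "D"), ("B", "D"), ("C", "D")]
print(has_euler_path(bridges))  # False – hence Euler's negative answer
```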
Social Network Analysis
Jacob Moreno (1930s), New York Training School for Girls
Problem: a large number of runaways in a short period of time
Social Network Analysis
Sociogram Graphic representation of the network structure • Actors (nodes) • Relations (lines) • Network (graph)
Basic concepts
• Tie strength – Identify strong/weak ties.
• Key players – Identify key/central nodes in the network
• Cohesion – A measure of the overall network structure
Social Network Analysis
Granovetter, M. S. (1973). The strength of weak ties. American Journal of Sociology, 78(6), 1360-1380.
Centrality
• Degree: number of connections an actor has (Kevin Bacon)
• Betweenness: high betweenness – links disparate clusters.
• Closeness: distance to all other actors in the network
• Density
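Two of these measures, degree and density, can be computed directly; tools such as Gephi, UCINET or the networkx library add betweenness and closeness. The network below is a made-up discussion forum where an edge means "replied to".

```python
# Degree and density for a small undirected network (invented forum data).


def degree(edges, node):
    """Number of ties incident to a node."""
    return sum(node in e for e in edges)


def density(edges, n_nodes):
    """Actual ties divided by possible ties n*(n-1)/2."""
    return len(edges) / (n_nodes * (n_nodes - 1) / 2)


replies = [("ana", "ben"), ("ana", "cam"), ("ana", "dee"), ("ben", "cam")]
print(degree(replies, "ana"))  # 3 – ana is the most connected actor
print(density(replies, 4))     # ≈ 0.67 – 4 of the 6 possible ties exist
```

In a course forum, a high-degree actor like "ana" is a candidate key player; low overall density suggests a fragmented, non-cohesive discussion.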
Social Network Analysis
Burt, R. (2005). Brokerage and closure: An introduction to social capital. Oxford, Oxford University Press.
Interpreting visualisations
Social Network Analysis
Betweenness Broker (connects clusters)
Social Network Analysis
Degree: Number of connections for a node (in degree and out degree) Degree = 7
Social Network Analysis
Closure: Tight cluster, strong ties – trust building
Networks
Dawson, S., Gašević, D., Siemens, G., & Joksimovic, S. (2014, March). Current state and future trends: A citation network analysis of the learning analytics field. In Proceedings of the Fourth International Conference, LAK (pp. 231-240). ACM.
Happy Network
Fowler, J. H., & Christakis, N. A. (2008). Dynamic spread of happiness in a large social network: longitudinal analysis over 20 years in the Framingham Heart Study. BMJ, 337, a2338.
Obesity Network
• 1 degree of separation: 45% increase in the probability a person is obese if a social contact is obese
• 2 degrees: 25%; 3 degrees: 10%; 4 degrees: <5%
Christakis, N. A., & Fowler, J. H. (2007). The spread of obesity in a large social network over 32 years. New England journal of medicine, 357(4), 370-379.
“single most potent source of influence”
Astin, A. (1993). What matters in college: Four critical years revisited. San Francisco: Jossey-Bass.
Learning Network
Learning Networks
Gašević, D., Zouaq, A., & Janzen, R. (2013). “Choose Your Classmates, Your GPA Is at Stake!”: The Association of Cross-Class Social Ties and Academic Performance. American Behavioral Scientist, 57(10), 1460-1479. DOI: 10.1177/0002764213479362
Choose your classmates carefully – your GPA is at stake
• Social Networks Adapting Pedagogical Practice
• Focus on student relationships (learning networks)
• Simple visualizations to assist with interpretation and evaluate the impact of activities
• Lightweight analytics tool: a bookmarklet for rapid and easy dissemination
Visualisation
Visualisations
• Forum A • Forum B
14 messages posted by 4 participants
Visualisations
• Forum A • Forum B
Visualisations
Visualisations
Forum 1 Forum 2
Visualisations
Forum 1 Forum 2
Visualisations for interpretation
Monitoring online networks:
• informed decisions for improving learning design
• evaluate the impact of implemented activities
Network examples
Facilitator-centric patterns
• Interactions occur between the facilitator and individual participants but not between participants
• An indication that knowledge sharing and collaboration may not be occurring, e.g. question-and-answer forums
Facilitator centric
Instructor
Disconnected students
Is this a learning community?
Facilitator centric
Instructor
Facilitator centric
Ego-networks • Top 10% • Bottom 10%
Ego-networks
Dawson, S. (2010). 'Seeing' the learning community: An exploration of the development of a resource for monitoring online student networking. British Journal of Educational Technology, 41(5), 736–752.
Top 10% student located in network
Student with a passing grade
Ego-networks
Bottom 10% student located in the network
Students with a grade >75% and <90%
Ego-networks
• Staff intervention
• High – 70% of networks
• Low – 10% of networks
• Why? The pursuit of community
Ego-networks
Course (unit)
Curriculum networks
Curriculum networks – dominant pathways
Curriculum networks
What outcomes, what experiences?
[Diagram: curriculum network of courses A–F]
Curriculum networks
What outcomes, what experiences?
• Assessment
• Learning outcomes
• Learning experiences
• Graduate attributes
Automated portfolio / learning relationship
Curriculum networks
Can we predict student progression? What major? What courses?
Analytical Hierarchy Process
• Modelling based on student preferences and motivation
• Grades, BBA, SEoT, professor, GPA, assessment, timetable, past course experience and demographics
Course prediction
Ognjanovic, I., Gasevic, D., & Dawson, S. (2016). Using institutional data to predict student course selections in higher education. The Internet and Higher Education, 29, 49-62.
• We can predict course and major choice: timetable, instructor
• Can we alter and influence course flow?
• What learning experiences offer optimal value at what time? e.g. international experience; work-integrated learning
Course Prediction
OVAL – Online Video Annotations for Learning
Video Analytics
Judgement of learning
Video Analytics
Student A (course 2 – graded)
Student B (course 4 – non-graded)
Video Analytics
Scalability and impact
Yet in terms of wide-scale institutional adoption there are few examples
Why?
LA/EDM research:
• Predictions of learning success (early alerts)
• Performance and retention
• Indicators of on/off-task attention
• Carelessness and gaming
• SRL proficiency
• NLP
• Learning dispositions
• Graduate qualities / 21st-century literacies
• Learning design
Research Summary
Great research BUT:
• Ignores the complexity of university-wide practices
• Small-scale and technology-specific
• Tends to be institution-specific
• Lacks guidance to aid further adoption
• Frequently requires high-level skills and capacities
LA Research
Innovative research BUT:
• Very few university-wide examples of LA adoption
• Obviously an area of increasing need and importance
LA Research
National project to benchmark LA status, policy and practices for Australian Universities Many thanks to: Cassandra Colvin, Alex Wade and Tim Rogers
National Project
• Understand current LA practice in Australia
• Unpack the challenges to institutional adoption
• Identify practices that can aid the implementation of LA
Aims
2 complementary but separate studies • Study 1 – interviews with senior institutional leaders • Study 2 – concept mapping with LA expert panel
Approach
First study: interviews with 32 universities
• Identification of current practice, methods and approaches
• Identification of key drivers for institutions, stage of development, process for implementation, project leads
Study 1
First study – coding:
• Development and application of a coding protocol
• Cluster analyses performed (PAM clusters)
Study 1
• Much interest in LA; a stated organisational priority
• LA projects were in the early phases of implementation and small-scale (at time of interview, July 2014)
• 2 distinct clusters across variables such as implementation, conceptualisation and readiness
Cluster 1 (n=15) – solutions-focused
Cluster 2 (n=17) – process-focused
Study 1
High interest – slow uptake – predominantly at the stage of basic reporting
• Goldstein & Katz (2005) reviewed US universities and noted the vast majority were in stage 1 or 2 (of 5 maturity stages)
• Yanosky (2009)
• Bichsel (2012)
Benchmarking
Clearly there remain challenges with implementing LA at scale
Benchmarking
2 distinct trajectories for implementation
• Differences in cluster variables such as:
• Conceptualisation of LA
• Readiness
• Implementation approach
Study 1
• Cluster 1
• Focused on retention outcomes
• Limited mention of LA as a means to improve learning
• Main driver is budget (cost savings)
• Cluster 2
• Broader view of learning analytics and its application to learning and teaching practice
Conceptualisation
• Cluster 1
• Limited to no articulated strategy
• Minimal capacity-building activities
• Success is seen as staff access to information
• Technology infrastructure sound and developing
• Cluster 2
• Defined strategy
• Developing capacity-building activities
• Technology – less emphasis on development
Readiness
• Cluster 1
• Minimal stakeholder engagement
• Leadership top-down and siloed
• Vendor tools heavily integrated
• Cluster 2
• Extensive stakeholder engagement
• Leadership top-down but widespread
• Multiple engaged units (IT, teaching, faculty)
• Where vendor tools/processes were adopted, a critical perspective was maintained
Implementation
Strategic capability
Context: student demographics; student retention and performance; government accreditation; university grouping (ATN / rural / Go8)
Strategic capability
1. Solutions-focused
• LA to address a pressing need
• Time-sensitive
2. Process-focused
• Networked and integrated model
• Minimal time pressures
• Innovation and experimentation
Strategic capability
What are the ideal dimensions for long-term sustainable uptake of LA?
• Invitations to Australian and international LA experts
• 28 completed the entire concept-mapping process
• 3 phases – brainstorming; sorting and ranking of statements
• Following the final ranking phase, a 7-cluster solution emerged
Study 2
Bringing it together
Study 1 – 2 clusters; Study 2 – 7 clusters
Essentially, how an organisation conceptualises LA (the 2 clusters) underpins its method for deployment and adoption (the 7 clusters)
Systems Model
[Diagram: strategic capability; implementation capability; tool/data quality; research/learning; educator uptake – institutions move from interested to implementing]
LA Maturity
• Solutions-focused: address an immediate concern, e.g. retention
• Process-focused: broad view of LA
• Merged model: responsive and agile
Challenges to be addressed:
• Leadership awareness
• Teams are seldom interdisciplinary
• IT-driven and system-focused
• Scale versus understanding
• Capabilities and skills deficit
• Over-reliance on current research – requires further validation across different contexts to demonstrate transportability of models
Bringing it together
Leveraging the outcomes of short-term goals for long-term gain
• How do we merge both models to gain both short- and long-term impact?
Complexity
Complexity
Law of requisite variety (Ashby, 1958)
• To control a system, the number of problems must be matched by at least an equal number of responses
• The more complex the system, the more problems
Ashby W.R. (1958) Requisite variety and its implications for the control of complex systems, Cybernetica 1:2, p. 83-99. (available at http://pcp.vub.ac.be/Books/AshbyReqVar.pdf, republished on the web by F. Heylighen—Principia Cybernetica Project)
Complexity
Law of requisite complexity (Boisot & McKelvey, 2011)
• It takes complexity to defeat complexity
• A system must possess a level of complexity that at least matches that of its environment to function effectively
Boisot, M., & McKelvey, B. (2011). Complexity and organization--Environment Relations: Revisiting Ashby's Law of Requisite Variety. The Sage handbook of complexity and management, 279-298.
Education is a complex system:
• Resilient to change
• Adaptive and self-organising clusters
• Change is non-linear and often unexpected
LA is complex – there is a tendency towards simplification for implementation
Complexity
LA is often reduced to independent components:
• Data
• Analysis
• Technology
• Dashboards/visualisations
• Staff training
This does not adequately deal with the inter-relationships or the overarching complexity of the system.
Complexity
• The drive to simplify can lead to fixed boundaries and organisational silos
• IT example
Complexity
Conclusion
• LA requires new models for implementation and leadership
• Enabling leadership
• Whole-of-organisation
• Models that are agile and research-informed
• Working in complexity creates friction
• Embrace the friction – it generates innovation
Conclusion
• A solutions-based model can drive change, but be mindful of responding to changing organisational needs
• A process-based model can drive innovation and interest, but be mindful of how to scale
Conclusion
• Any action aimed at only a single element of the system, such as student retention, is unlikely to have impact
• Target multiple elements in the system – be adaptive and monitor change
Forsman, J., Van den Bogaard, M., Linder, C., & Fraser, D. (2015). Considering student retention as a complex system: a possible way forward for enhancing student retention. European Journal of Engineering Education,40(3), 235-255.
Conclusion
• A combined model framed in the organisational context
• Small, diffuse pockets of innovation to build capacity and interest
• A view to scaling adoption – demonstration of impact (technical, pedagogical)
• Distributed enabling leadership (complexity leadership)
Conclusion
• Any "successful" adoption of LA will depend on an institution's ability to rapidly recognise and respond to the organisational culture and the concerns of all stakeholders.
Thank you…
[email protected] Twitter: @shaned07