Learning networks-2012 griffiths-richards-harrison
[Slide 1]
The role of feedback in the design of learning activities
or… how do we know that a good learning activity is 'good'?
Dai Griffiths – CETIS / Institute for Educational Cybernetics, The University of Bolton
Griff Richards, Michelle Harrison – Thompson Rivers University
[Slide 2]
Introduction

The Insiders
- Producing courses
- Research into instructional design practice

The Visitor
- Technology Enhanced Learning researcher
- Learning Design
[Slide 3]
Context

TRU Open Learning
- Over 400 courses – print/web, now moving to online (a shift in practice)
- Revision/new course development – large teams, work with external SMEs
- Online paced cohort – design for open contexts (new focus on OERs)

Instructional design team initiatives
- In a constantly changing environment, how do you more systematically improve, evaluate, share and reflect on practice?
[Slide 4]
Initiatives

So far…
- Workshops
- Attempt at activity tagging – development of a catalogue?
- Pilot survey for student feedback (online questionnaire)

To do…
- Focus groups (faculty/other stakeholders)
- Embed tools directly in courses (at the activity level)
- Survey questionnaire
- Analytics (some constraints)
- Development of an activity catalogue (higher level) for sharing
[Slide 5]
Idea: design patterns + analytics can provide feedback and improve practice
But it was not possible to inspect the courses and identify the patterns
So what feedback could help the design task?
We did interviews to establish:
- the factors which determine success in learning activities
- the feedback which designers would like
[Slide 6]
The interviews were very interesting!
We drew out the themes with a Qualitative Data Analysis tool
Obtained a list of factors determining success of learning activities
These are candidates for gathering feedback
We also found:
- approaches taken by designers in seeking effective activities
- constraints on designers in doing this
[Slide 7]
Factors determining the success of learning activities, identified across students, lecturers, designers and the delivery team:
- Accessibility issues
- Appropriate group formation
- Activity instructions
- Promptness/delay in the system
- Cultural fit with students
- Evident presence of lecturer online
- Amount of text and balance of media
- Reliability of technical systems
- Perceived activity usefulness
- Lecturer 'buy in' to the activity
- Degree of complexity
- Scheduling of courses
- Student learning process preferences
- Level of formative feedback
- Facilitator workload
- Technical barriers
- Preparation for the activity
- Activity structures familiar to students
- Time pressure
- Quality of facilitation
- Fit of pedagogy / context / student
- Uneven participation levels
- Relationship to learning objectives
- Rubrics for marking
[Slide 8]
How can institutions deal with the combinatorial states of these 24 factors?
We can attenuate the variety, through well-established methods which position students as being identical:
- Curricula
- Cohorts
- Assessments

We can amplify our response, for example:
- through peer learning
- through team work

Or we can shut it out and hope it goes away:
- through institutional doublespeak about the importance of the learning experience
- by isolating strategic planning, design and delivery

This simplifies the institution's strategy (at least in the short term!)
See Oleg Liber's application of Stafford Beer to education
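To give a sense of scale: if each of the 24 factors on the previous slide is treated as a simple binary (favourable/unfavourable) condition, an assumption made here purely for illustration, the variety an institution faces can be sketched as:

```python
# Illustrative only: treat each of the 24 success factors as a
# binary (favourable/unfavourable) condition.
factors = 24
states = 2 ** factors  # every combination of factor outcomes
print(states)  # 16777216 distinct states
```

Over 16 million combinations, which is why attenuation and amplification, rather than case-by-case handling, are the realistic institutional responses.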
[Slide 9]
What are the implications of the interviews for the designer's task?

Designers use professional and personal knowledge, skills and intuition to produce good solutions to impossible problems.

These major themes (and more) need to be balanced:
- institutional policy
- the learners, their capabilities, preferences, and available time
- the type and level of the knowledge
- the preferences and capabilities of the subject matter experts
- fit with learning outcomes
- the intricacies of copyright
- the nature of the online environment v. the face-to-face equivalent
[Slide 10]
How could we amplify the designers' response?

Designers currently wrestle with the combinatorial states:
- individually
- intuitively, based on their experience of learning and of the context
- unaware of the wider implications

The complexity and strategic importance of what they do is not recognised.

Make explicit the rules of thumb for:
- the problems faced
- how we deal with that particular problem around here

Start the design problem further along.

See Pawson & Tilley's Realistic Evaluation for an approach to rules
[Slide 11]
Off-load some of the effort to a document
See Hollan & Hutchins' work on distributed cognition
[Slide 12]
“In this instructional design group this is how we resolve that problem...”
a) There is a moral imperative for equality of opportunity
b) Many of our learners:
- are unused to group work
- are unused to online collaboration
- get stressed when they are being assessed

c) So it is our rule of thumb to enable learners to practise taking on roles in complex activities by providing a non-assessed activity, before using a complex activity in assessment
[Slide 13]
But of course, this is contested... Alternatively:

a) There is a moral and economic imperative to serve the learner

b) Our learners are focused on obtaining a qualification:
- they look for the assessment weighting of the activities
- they act strategically and avoid any learning activity which does not contribute to their grades

c) So it is our rule of thumb that all learning activities (especially complex activities which take a lot of time) will lead to assessment
The role of feedback and analytics is to enable us to choose between rival interpretations
[Slide 14]
More feedback, or a diktat, is just another thing to deal with.

The rules of thumb should always be:
- an answer to a problem identified by the people who have to apply them
- provisional hypotheses
- socially constructed
- contested

Feedback on learning activities should be focused on confirming or falsifying the hypotheses.

Feedback on activities then becomes a research-based capacity-raising exercise.
[Slide 15]
In summary, a methodology for feedback on learning activities...
- does not simply confirm successful delivery
- hypothesises what works where, when and why
- examines hypotheses in an iterative process of action research which informs practice
- is consultative and negotiated
- explicitly links feedback, analytics & design
- results in documents that:
  - take (a little) pressure off instructional designers by providing design principles
  - provide a basis for explaining the design task and decisions to students and colleagues

So another title for you: "How a learning designer learned to stop worrying and love learning analytics"
[Slide 16]
Thanks for your attention, and please feel free to contact us
Dai Griffiths: [email protected] Michelle Harrison: [email protected]
Dai Griffiths thanks Thompson Rivers Open Learning for the support provided during his visiting scholarship in 2011. Without it this research would not have taken place.