Research Methods in Distance Education: Design-Based Research
Terry Anderson
PhD Seminar, Nova University, Lisbon, March 2011
Tuesday’s Agenda
• Lecture – Research methods in DE – Design-Based Research
• Break – Design-Based Research in Action – Athabasca's Elgg Project
The context of Distance Education Implementation
• Disruptive innovation (Christensen, 2008): simpler, not wanted by mainstream customers
• Rapid gains in functionality
• Cheaper
• Adaptive
• Moving from peripheral to mainstream (blended and online for full-time students)
Good Research
• Good Theory (see http://www.learning-theories.com)
• Good question(s)
• Good methodology
• Brave conclusions
• Important applications
• Spans multiple iterations
Because many e-learning tools and contexts are so emergent, there is a need for:
• Demographic studies – who is using what, for doing what? How are they using multi-use tools?
• Visualization of activity and relationships
• Network analysis – SNAPP
• Connection between context and type of use
• Data mining/analytics
• Sharing and making visible research and practitioner results
Research Paradigms
• Quantitative ~ discovery of the laws that govern behavior
• Qualitative ~ understandings from an insider perspective
• Critical ~ investigate and expose power relationships
• Design-based ~ interventions, interactions and their effect in multiple contexts
Paradigm 1: Quantitative Research
• employs a scientific discourse derived from the epistemologies of positivism and realism.
• “those who are seeking the strict way of truth should not trouble themselves about any object concerning which they cannot have a certainty equal to arithmetic or geometrical demonstration”– (Rene Descartes)
• Inordinate support and faith in randomized controlled studies
Quantitative 1 – CMC Content Analysis
• Anderson, Garrison, Rourke 1997–2003
– http://communitiesofinquiry.com – 9 papers reviewing results, focusing on reliable, quantitative analysis
– Identified ways to measure teaching, social and cognitive 'presence'
– Most reliable methods are beyond the current time constraints of busy teachers
– Questions of validity
– Serves as basic research grounding AI methods and major survey work of the future
– Serves as a qualitative heuristic for teachers and course designers
Quantitative – Meta-Analysis
• Aggregates many effect sizes, creating large Ns and more powerful results
• Ungerleider and Burns (2003)
• Systematic review of the effectiveness and efficiency of online education versus face-to-face
• The interventions studied were extraordinarily diverse – the only criterion was the presence of a comparison group
• "Only 10 of the 25 studies included in the in-depth review were not seriously flawed, a sobering statistic given the constraints that went into selecting them for the review."
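The aggregation step a meta-analysis performs can be sketched in a few lines. This is a minimal fixed-effect pooling with invented study values (the reviews cited here used hundreds of effect sizes), not the method any particular paper used:

```python
# Minimal fixed-effect meta-analysis sketch: pool per-study effect sizes (g)
# by inverse-variance weighting and test homogeneity with Cochran's Q.
# The study values below are invented for illustration.

def pooled_effect(studies):
    """studies: list of (g, variance) pairs -> (g+, SE, Q)."""
    weights = [1.0 / var for _, var in studies]
    g_plus = sum(w * g for (g, _), w in zip(studies, weights)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    # Cochran's Q: a large Q signals heterogeneous effects,
    # as the slides below report for the DE/CI comparisons
    q = sum(w * (g - g_plus) ** 2 for (g, _), w in zip(studies, weights))
    return g_plus, se, q

studies = [(0.10, 0.02), (-0.05, 0.03), (0.02, 0.01)]
g_plus, se, q = pooled_effect(studies)
print(f"g+ = {g_plus:.3f}, SE = {se:.3f}, Q = {q:.3f}")
```

Precise studies get large weights, so the pooled g+ leans toward them; the standard error shrinks as studies accumulate, which is the "more powerful results" point above.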
Is DE Better than Classroom Instruction? Project 1: 2000–2004
• Question: How does distance education compare to classroom instruction? (inclusive dates 1985-2002)
• Total number of effect sizes: k = 232
• Measures: Achievement, Attitudes and Retention (opposite of drop-out)
• Divided into Asynchronous and Synchronous DE
Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., Wallet, P. A., Fiset, M., & Huang, B. (2004). How does distance education compare to classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379-439.
Primary findings
• DE and CI are essentially equal (g+ ≈ 0.0 to low average effect) on all measures
• Effect size distributions are heterogeneous; some DE >> CI, some DE << CI
• Generally poor methodological quality
• Pedagogical study features account for more variation than media study features (Clark, 1994)
• Interactive DE an important variable*
*Lou, Y., Bernard, R.M., & Abrami, P.C. (2006). Media and pedagogy in undergraduate distance education: A theory-based meta-analysis of empirical literature. Educational Technology Research & Development, 54(2), 141-176.
Summary of results: Achievement
Type of DE      k      g+        Sig.
Combined        318*   0.013*    > 0.05
Synchronous     92    –0.102*    < 0.05
Asynchronous    174    0.053*    < 0.05
Achievement Outcomes
*Significantly heterogeneous average effect
Summary of results: Attitudes
Type of DE      k     g+        Sig.
Combined        154  –0.081*    < 0.05
Synchronous     83   –0.185*    < 0.05
Asynchronous    71   –0.034*    > 0.05
Attitude Outcomes
*Significantly heterogeneous average effect
Summary of results: Retention
Retention Outcomes
Type of DE      k     g+        Sig.
Combined        103  –0.057*    < 0.05
Synchronous     17    0.005     > 0.05
Asynchronous    53   –0.093*    < 0.05
*Significantly heterogeneous effect sizes
Equivalency: Are all types of Interaction necessary?
Anderson (2003), IRRODL
Anderson’s Equivalency Theorem (2003)
Moore (1989) distinguishes three types of interaction:
o student–student interaction
o student–teacher interaction
o student–content interaction
Anderson's (2003) hypotheses state:
 A high level of any one of the three interactions will produce a satisfying educational experience
 Increasing satisfaction through teacher–learner interaction may not be as time- or cost-effective as student–content interactive learning sequences
Do the three types of interaction differ? Moore’s distinctions
Achievement and Attitude Outcomes
                       Achievement        Attitudes
Interaction Category    k    g+adj.       k    g+adj.
Student–Student        10    0.342        6    0.358
Student–Teacher        44    0.254       30    0.052
Student–Content        20    0.339        8    0.136
Total                  74    0.291       44    0.090
Between-class (Q)            2.437             6.892*
Moore’s distinctions seem to apply for achievement (equal importance), but not for attitudes (however, samples are low for SS and SC)
Does strengthening interaction improve achievement and attitudes? Anderson’s hypotheses
Anderson’s first hypothesis about achievement appears to be supported
Anderson’s second hypothesis about satisfaction (attitude) appears to be supported, but only to an extent (i.e., only 5 studies in High Category)
Achievement and Attitude Outcomes
                       Achievement                Attitudes
Interaction Strength    k    g+adj.   SE          k    g+adj.   SE
Low Strength           30    0.163    0.043      21    0.071    0.042
Med Strength           29    0.418    0.044      18    0.170    0.043
High Strength          15    0.305    0.062       5   –0.173    0.091
Total                  74    0.291    0.027      44    0.090    0.029
(Q) Between-class           17.582*                  12.060*
Bernard, Abrami, Borokhovski, Wade, Tamin, & Surkes, (in press). Examining Three Forms of Interaction in Distance Education: A Meta-Analysis of Between-DE Studies. Review of Research in Education
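The between-class Q values reported in these tables partition total heterogeneity into within- and between-category parts; a large between-class Q means the category means (e.g. low/medium/high interaction strength) genuinely differ. A rough sketch of that partition, with invented values rather than the studies' actual data:

```python
# Sketch of a between-class heterogeneity test: Q_total is partitioned into
# Q_within (variation inside each category) and Q_between (variation across
# category means). Category names and data are invented for illustration.

def weighted_mean(studies):
    """Inverse-variance weighted mean of (g, variance) pairs."""
    w = [1.0 / var for _, var in studies]
    return sum(wi * g for (g, _), wi in zip(studies, w)) / sum(w)

def q_stat(studies, mean):
    """Cochran's Q around a given mean."""
    return sum((g - mean) ** 2 / var for g, var in studies)

classes = {  # category -> list of (g, variance)
    "low":  [(0.10, 0.02), (0.20, 0.03)],
    "med":  [(0.40, 0.02), (0.45, 0.04)],
    "high": [(0.30, 0.05), (0.25, 0.03)],
}

all_studies = [s for group in classes.values() for s in group]
q_total = q_stat(all_studies, weighted_mean(all_studies))
q_within = sum(q_stat(g, weighted_mean(g)) for g in classes.values())
q_between = q_total - q_within  # large => category means differ
print(f"Q_between = {q_between:.3f}")
```

A significant Q_between (compared against a chi-square with classes − 1 degrees of freedom) is what licenses statements like "medium-strength interaction outperforms low."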
Because much Web 2.0 technology use is so emergent, there is a need for:
• Demographic studies – who is using what, for doing what? How are they using multi-use tools?
• Visualization of activity and relationships
• Network analysis
• Connection between context and type of use
• Data mining
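The network-analysis item above is the kind of question tools such as SNAPP answer by building a who-replies-to-whom graph from forum posts. A toy sketch, with invented post data:

```python
# Toy social-network-analysis sketch: count reply links per participant in a
# discussion forum. Tools like SNAPP visualize this as a graph; here we just
# compute degree (links written + received). The reply data is invented.
from collections import Counter

replies = [  # (author, author being replied to)
    ("ana", "tutor"), ("ben", "tutor"), ("ana", "ben"),
    ("tutor", "ana"), ("carla", "tutor"), ("ben", "ana"),
]

degree = Counter()
for src, dst in replies:
    degree[src] += 1  # replies written
    degree[dst] += 1  # replies received

# High degree suggests centrality in the discussion; 0 or 1 flags isolation.
for person, d in degree.most_common():
    print(person, d)
```

Even this crude count surfaces the pattern researchers look for: a teacher-centred star (everyone replying to the tutor) versus a denser student-to-student network.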
Quantitative Research Summary
• Can be useful, especially when fine-tuning well-established practice
• Provides incremental gains in knowledge, not revolutionary ones
• The need to "control" context often makes results of little value to practicing professionals
• In times of rapid change, premature quantitative testing may mask beneficial capacity
• Will we ever be able to afford blind-reviewed, random-assignment studies?
Paradigm 2: Qualitative Research
• Many different varieties
• Generally answers the question 'why' rather than 'what', 'when' or 'how much'
• Presents special challenges in distributed contexts due to distance between participants and researchers
• Currently the most common type of DE research (Rourke & Szabo, 2002)
Qualitative study of Social Software
• Critically important:
– In early stages of adoption
– To track effects of user competence and efficacy
– As contexts are personalized
– As tools are appropriated by users for entirely different tasks than those intended by developers
Qualitative Example
–Dearnley (2003) Student support in open learning: Sustaining the Process
–Practicing Nurses, weekly F2F tutorial sessions
–Phenomenological study using grounded theory discourse
Core category to emerge was “Finding the professional voice”
Dearnley and Matthew (2003 and 2004)
Qualitative example 2
• Mann, S. (2003) A personal inquiry into an experience of adult learning on-line. Instructional Science 31
• Conclusions:
– The need to facilitate the presentation of learner and teacher identities in a way that takes account of the loss of the normal channels
– The need to make explicit the development of operating norms and conventions
– In reduced communicative media there is potential for greater misunderstanding
– The need to consider ways in which the developing learning community can be open to the other of uncertainty, ambiguity and difference
Paradigm 3: Critical Research
• Asks: who gains in power?
• David Noble's critique of 'digital diploma mills' is the most prominent Canadian example
• Are profits generated from user-generated content exploitative?
• Confronting the "net changes everything" mantra of many social software proponents
• Who is being excluded from social software?
See Friesen, N. (2009). Re-thinking e-learning research: Foundations, methods, and practices. Peter Lang Publishers.
Is the extraction of information from the masses exploitative or empowering?
• Why does Facebook own all the content that we supply?
• Does the power of the net further marginalize the non connected?
• Who benefits from voluntary disclosure?
• Why did One Laptop Per Child fail?
Quantitative vs. Qualitative: Paradigm Wars Rekindled
• Current research "more resembles the pendulum swings characteristic of art or fashion, rather than the progressive improvements characteristic of science and technology" – Slavin (2002), Educational Researcher, p. 16
• Proposed solution: embrace "evidence-based learning"
• Projected to increase from 5% to 75% of US government funding by 2007 for "research that addresses causal questions and uses random assignments…" (Slavin, 2002, p. 15)
Do Either Qualitative or Quantitative Methods Meet the Real Needs of Practicing Distance Educators?
But what type of research has most effect on practice?
– Kennedy (1999): teachers rated the relevance and value of results from each of the major paradigms
– No consistent results – teachers are not a homogeneous group of consumers, but they do find research of value
– “The studies that teachers found to be most persuasive, most relevant, and most influential to their thinking were all studies that addressed the relationship between teaching and learning.”
But what type of research has most effect on Practice?
– “The findings from this study cast doubt on virtually every argument for the superiority of any particular research genre, whether the criterion for superiority is persuasiveness, relevance, or ability to influence practitioners’ thinking.” Kennedy, (1999)
Paradigm 4: Design-Based Research
• Related to engineering and architectural research
• Focuses on the design, construction, implementation and adoption of a learning initiative in an authentic context
• Related to 'Development Research'
• The closest thing educators have to a "home-grown" research methodology
Design-Based Research Studies
– iterative, – process focused, – interventionist, – collaborative, – multileveled, – utility oriented, – theory driven and generative
• (Shavelson et al, 2003)
Critical characteristics of design experiments
• According to Reeves (2000:8), Ann Brown (1992) and Alan Collins (1992):– addressing complex problems in real contexts in
collaboration with practitioners,– integrating known and hypothetical design-principles
with technological affordances to render plausible solutions to these complex problems, and
– conducting rigorous and reflective inquiry to test and refine innovative learning environments as well as to define new design-principles.
Design-Based Research
• Methodology developed by educators for educators
• Developed from American pragmatism – Dewey (Anderson, 2005)
• Recent theme issues:
– The Journal of the Learning Sciences (13(1), 2004)
– Educational Researcher (32(1), 2003)
– Educational Psychologist (39(4), 2004)
– See bibliography at http://cider.athabascau.ca/CIDERSIGs/DesignBasedSIG/
• My article at www.cjlt.ca/abstracts.html
Integrative Learning Design (Bannan-Ritland, 2003)
• “design-based research enables the creation and study of learning conditions that are presumed productive but are not well understood in practice, and the generation of findings often overlooked or obscured when focusing exclusively on the summative effects of an intervention” Wang & Hannafin, 2003
• Iterative, because "innovation is not restricted to the prior design of an artifact, but continues as artifacts are implemented and used"
• Implementations are "inevitably unfinished" (Stewart and Williams, 2005)
• intertwined goals of (1) designing learning environments and (2) developing theories of learning (DBRC, 2003)
Amiel, T., & Reeves, T. C. (2008).
Design Based research and the Science of Complexity
• Complexity theory studies the emergence of order in multifaceted, changing and previously unordered contexts
• This emerging order becomes the focus of iterative interventions and evaluations
• Order emerges at the “edge of chaos” in response to rapid change, and failure of previous organization models
Call Centres at Athabasca
• Answer 80% of student inquiries
• Savings of over $100,000/year
Anderson, T. (2005). Design-based research and its application to a call center innovation in distance education. Canadian Journal of Learning and Technology, 31(2), 69-84.
D-B Research examples
Design-Based Research Strategies for Studying Situated Learning in a Multi-user Virtual Environment (Chris Dede, 2004)
Graduate Student Resource Hub in Design Research in Education
• http://www.lkl.ac.uk/projects/designresearch/
• Need to study usability, scalability and innovation adoption within bureaucratic systems
• Allow knowledge tools to evolve in natural context through supportive nourishment of staff
Conclusion
• Education research is grossly under-resourced to meet the magnitude of opportunity and demand
• Paradigm wars are unproductive
• Design-based research offers a promising new research design model
• It can be used for doctoral dissertations; see Herrington, J., McKenney, S., Reeves, T., & Oliver, R. (2007). Design-based research and doctoral students: Guidelines for preparing a dissertation proposal.
Design Based research in Action
• Phase 1: Exploration – surveys, talking to faculty and tutors, investigating open source tools, setting research questions
• Phase 2: Building the intervention – Elgg through two versions and 85 plugins (ongoing)
• Phase 3: Evaluation – before-and-after surveys; see:
– Anderson, T., Poelhuber, B., & McKerlich, R. (2010). Self-Paced Learners Meet Social Software. Online Journal of Distance Learning Administration, 13. http://www.westga.edu/~distance/ojdla/Fall133/anderson_poellhuber_mcKerlich133.html
– Doctoral students – use of past student archives
– Ongoing iterations and development of tools
• Phase 4: Testing in multiple contexts – development of design principles/patterns
Survey Results – Anderson et al., 2004
78% indicated they would interact with other students if they were also able to proceed through the course at their own pace.
Survey Results – Anderson et al., 2004
• Only 29% of the student respondents had participated in the optional (credit and non-credit) interactive computer conferences
Draft Results: AU Unpaced Learners Social Software Survey (Anderson, Sept. 2009). Sent to 3,763 undergraduate students who enrolled in AU undergraduate courses in Aug. 2009; 24.7% response rate, N = 820.
Undergrad Survey, Sept. 2009
[Chart slides: Draft Results, AU Unpaced Learners Social Software Survey, Anderson, Sept. 2009, N = 820; individual chart figures (25.12%, 47.93%, 61.95%, 31.47%, 6.59%) are not recoverable without their labels.]
Lots of Support
• “Not networking with other students, and not having peers is one drawback in doing individualized studies through Athabasca, with these technologies available could solve this problem.”
• “I think that hearing other people's opinions is a great way to spark new thoughts of your own. I also think that it is a great way to ask questions rather than emailing back and forth or making long distance phone calls.”
Lots of Concerns
• “People have other commitments and might not be able to join in, they like to do things on their own time.”
• “I am not part of a social network due to the fact that I work in mental health, I am concerned about my privacy.”
• “I'm scared as a first time user of e-learning, that I may miss something”
Survey Conclusions
• We have a very heterogeneous population of net users and non-users
• Many of our learners "don't know" about Web 2.0 tool use in formal education – are they literate?
Challenges to AU Moving to Connectivist Pedagogy
• Personal competence, literacy and tools
• Dealing effectively with disruptive technologies
• Crystallized ways of thinking about our educational development and delivery model
• Developing tutor networks
• Union contracts?
How do we ensure we are all learning professionals?
[Diagram: My Personal Learning Network (PLE) – identity, email and social networks linked to personal hosting (blogs, e-portfolios, presentations, profile), bookmarks/tags/resources, collections (photos, books), production tools, produsage networks, professional/hobby/personal news, formal education provider(s) and the open net.]
[Diagram: Athabasca University and the Athabasca Landing – e-portfolios, profiles, networks, bookmarks and blogs (Elgg), connected through single sign-on (MyAU login, passwords, registry) to the Media Lab, Second Life campus, AUspace, Alfresco CMS, Moodle, the Library, course development, CIDER research/community networks, and OERs/YouTube (discovery, read & comment; sample CC course units and branded OERs).]
Access Controls in Elgg
Design Based research in Practice
• Athabasca Landing
– Elgg-based
– Started in 2008
– 1,600 users (2011)
– Unpaced and paced courses
– Informal learning
• The Demo! – Elgg
Questions and Comments