
Transcript of Research in Practice: Using Better Research Design and Evidence to Evaluate the Impact of Technology on Student Outcomes

Page 1:

Research in Practice: Using Better Research Design and Evidence to Evaluate the Impact of Technology on Student Outcomes

Gail Wisan, Ph.D., University Director of Assessment
Institutional Effectiveness and Analysis, Florida Atlantic University

Presented at the FAU Faculty Technology Learning Community
Boca Raton, FL, November 12, 2010

Page 2:

Department of Education Evaluates Evidence

Page 3:

Perspective / point of view:

Evaluation Research should drive outcomes assessment because:
• it helps identify what works;
• it provides direct evidence;
• it helps improve educational outcomes.

Page 4:

Overview of Presentation: Benefits/Learning Outcomes

1. Be able to explain evaluation research;
2. Identify the benefits of evaluation research;
3. Be able to explain the use of experimental and quasi-experimental design evaluation research in education assessment;
4. Be able to apply evaluation research strategies to outcomes assessment at your institution to improve student learning outcomes.

After this presentation, you will be better able to improve student learning outcomes at your institution. Assessment should seek systematic evidence of the effectiveness of existing programs, pedagogies, methodologies, and approaches to improve student learning outcomes and instill a cycle of continuous improvement.

Page 5:

Outcomes Assessment and Evaluation Research

Outcomes Assessment, at its most effective, incorporates the tools and methods of evaluation research.

1. Outcomes Evaluation Research
2. Field Experiment Research

Page 6:

Outcomes Assessment and Evaluation Research

Field Experiment Research assesses the effects of new programs, pedagogies, and educational strategies on students’ learning, competencies, and skills.

Page 7:

Outcomes Assessment and Evaluation Research

Outcomes Evaluation Research assesses the effects of existing programs, pedagogies, and educational strategies on students’ learning, competencies, and skills.

Page 8:

Outcomes Assessment and Evaluation Research

 Evaluation Research can answer the question:

How can Assessment Improve Education?

Page 9:

Research Design Examples: Overview

Notation: X, O, R
Experimental Design
Pre-Experimental Design and its problems in educational research:
1. Threats to internal validity (Is X really having an effect?)
2. Threats to external validity (generalizability)

Page 10:

Research Design Examples: Quasi-Experimental Designs Versus Pre-Experimental Designs

Quasi-Experimental Designs: Better Answers
1. Better solutions to internal validity threats (Is X really having an effect?)
2. Better solutions to external validity threats (generalizability)

Page 11:

Notation on Diagrams

An X will represent the exposure of a group to an experimental variable or teaching method, the effects of which are to be measured.

O will refer to an observation or measurement. R refers to random assignment.
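To make the notation concrete, here is a minimal sketch, not from the presentation, of how R, X, and O map onto operations in a simulated study; all names and numbers are invented:

```python
# Minimal sketch: mapping the diagram notation to operations.
# R = random assignment, X = exposure to the treatment, O = observation.
import random

random.seed(0)
students = list(range(20))              # hypothetical student IDs
random.shuffle(students)                # R: random assignment
treated, control = students[:10], students[10:]

def observe(group):
    # O: measure an outcome (stubbed with random scores here)
    return {s: random.gauss(70, 10) for s in group}

pre = observe(treated + control)        # O before X
# X: deliver the new pedagogy/technology to the treated group only
post = observe(treated + control)       # O after X
```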

Page 12:

Research Design

How Quasi-experimental Design helps to solve the problems of Pre-experimental Design

Page 13:

Experimental Designs

Pretest-Posttest Control Group Design: Random assignment to two groups

R O X O
R O   O
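A sketch of how this design might be analyzed; the effect size and maturation term are invented for illustration. Because assignment is random, the difference in mean pretest-to-posttest gains estimates the effect of X:

```python
# Simulating R O X O / R O O and estimating the effect of X
# as the difference in mean gains (all numbers invented).
import random

random.seed(1)
N = 200
ids = list(range(N))
random.shuffle(ids)                      # R: random assignment
treated = set(ids[: N // 2])

TRUE_EFFECT = 5.0                        # hypothetical effect of X

rows = []
for s in range(N):
    pre = random.gauss(70, 10)           # O: pretest
    growth = random.gauss(3, 2)          # maturation affects both groups
    post = pre + growth + (TRUE_EFFECT if s in treated else 0.0)  # X, then O
    rows.append((s in treated, pre, post))

def mean_gain(flag):
    gains = [post - pre for t, pre, post in rows if t == flag]
    return sum(gains) / len(gains)

effect = mean_gain(True) - mean_gain(False)    # randomization balances the groups
print(f"estimated effect of X: {effect:.2f}")  # close to 5.0
```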

Page 14:

Experimental Designs

Pretest-Posttest Control Group Design
R O X O
R O   O

Sources of Invalidity
External:
Interaction of Testing and X
Interaction of Selection and X ?
Reactive Arrangements ?

Page 15:

Experimental Designs

Posttest-Only Control Group Design

R X O
R   O
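A minimal sketch of the analysis this design supports; the group means and sizes are invented. Because assignment is random, a simple comparison of posttest means estimates the effect of X without any pretest:

```python
# Posttest-only control group design: compare posttest means.
import random, statistics

random.seed(2)
post_treated = [random.gauss(75, 10) for _ in range(100)]  # R, X, O
post_control = [random.gauss(70, 10) for _ in range(100)]  # R, O

diff = statistics.mean(post_treated) - statistics.mean(post_control)
print(f"estimated effect of X: {diff:.2f}")  # close to the simulated 5-point effect
```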

Page 16:

Experimental Designs

Posttest-Only Control Group Design
R X O
R   O

Sources of Invalidity
External:
Interaction of Selection and X ?
Reactive Arrangements ?

Page 17:

Pre-Experimental Designs

One-Shot Case Study
X O

Sources of Invalidity
Internal:
History
Maturation
Selection
Mortality

External:
Interaction of Selection and X

Page 18:

Pre-Experimental Designs
One-Group Pretest-Posttest Design
O X O

Sources of Invalidity
Internal:
History
Maturation
Testing
Instrumentation
Interaction of Selection and Maturation, etc.
Regression ?

External:
Interaction of Testing and X
Interaction of Selection and X
Reactive Arrangements ?
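A sketch of why the one-group design is weak; all quantities are invented. With no control group, maturation and history are folded into the apparent effect of X:

```python
# One-group pretest-posttest (O X O): the naive gain confounds
# the treatment effect with maturation/history.
import random

random.seed(3)
TRUE_EFFECT = 2.0        # hypothetical effect of X
MATURATION = 4.0         # students would have improved anyway

pre = [random.gauss(70, 10) for _ in range(100)]                         # O
post = [p + MATURATION + TRUE_EFFECT + random.gauss(0, 2) for p in pre]  # X, O

naive_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"naive pre-post gain: {naive_gain:.2f}")  # about 6.0, not the true 2.0
```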

Page 19:

Pre-Experimental Designs

Static-Group Comparison
X O
  O

Sources of Invalidity
Internal:
Selection
Mortality
Interaction of Selection and Maturation, etc.
Maturation ?

External:
Interaction of Selection and X

Page 20:

Threats to Internal Validity

History, the specific events occurring between the first and second measurement in addition to the experimental variable.

Maturation, processes within the respondents operating as a function of the passage of time per se (not specific to the particular events), including growing older, growing hungrier, growing more tired, etc.

Testing, the effects of taking a test upon the scores of a second testing.

Page 21:

Threats to Internal Validity

Instrumentation, in which changes in the calibration of a measuring instrument, or changes in the observers or scorers used, may produce changes in the obtained measurements.

Regression. This operates where groups have been selected on the basis of their extreme scores.
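A worked sketch of the regression threat, with simulated scores: select the lowest scorers on a noisy test, retest with no treatment at all, and the group's average still rises toward the mean.

```python
# Regression to the mean: extreme groups drift back on retest
# even with no intervention.
import random

random.seed(4)
ability = [random.gauss(70, 10) for _ in range(1000)]        # stable true scores
test1 = [a + random.gauss(0, 8) for a in ability]            # noisy measurement
test2 = [a + random.gauss(0, 8) for a in ability]            # independent noise

lowest = sorted(range(1000), key=lambda i: test1[i])[:100]   # extreme selection
gain = sum(test2[i] - test1[i] for i in lowest) / len(lowest)
print(f"retest gain with no treatment: {gain:.2f}")          # clearly positive
```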

Page 22:

Threats to External Validity

Interaction of Testing and X. A pretest might increase/decrease the respondent’s sensitivity or responsiveness to the experimental variable, making the results obtained for a pretested population unrepresentative for the unpretested universe from which the respondents were selected.

Interaction of Selection and X

Page 23:

Threats to External Validity

Reactive Arrangements. This would preclude generalization about the effect of the experimental variable upon persons being exposed to it in nonexperimental settings.

Multiple-X Interference. This is likely to occur whenever multiple treatments are applied to the same respondents, because the effects of prior treatments are not usually erasable.

Page 24:

Threats to Internal Validity

Selection. There could be biases resulting in differential selection of respondents for the comparison groups.

Mortality. This refers to differential loss of respondents from the comparison groups.

Interaction of Selection and Maturation, etc., which in certain of the multiple-group quasi-experimental designs might be mistaken for the effect of the experimental variable.

Page 25:

Quasi-Experimental Designs:

Nonequivalent Control Group Design

O X O
O   O

Page 26:

Quasi-Experimental Designs:

Nonequivalent Control Group Design: Comparing Math Classes Example

O X O
O   O
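A sketch of the math-classes comparison; the sections, means, and growth rates are invented. With intact classes instead of random assignment, a difference-in-differences on the gains controls for stable between-section differences, though an interaction of selection and maturation remains a threat:

```python
# Nonequivalent control group design: two intact math sections,
# analyzed with a difference-in-differences on pre/post gains.
import random

random.seed(5)

def section(mean, growth, effect, n=30):
    pre = [random.gauss(mean, 8) for _ in range(n)]
    post = [p + growth + effect + random.gauss(0, 2) for p in pre]
    return pre, post

pre_a, post_a = section(mean=75, growth=3.0, effect=5.0)  # section A gets X
pre_b, post_b = section(mean=68, growth=3.0, effect=0.0)  # section B: comparison

def gain(pre, post):
    return sum(b - a for a, b in zip(pre, post)) / len(pre)

did = gain(pre_a, post_a) - gain(pre_b, post_b)  # difference-in-differences
print(f"estimated effect of X: {did:.2f}")       # near 5.0 only if growth rates match
```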

Page 27:

Quasi-Experimental Designs

Nonequivalent Control Group Design
O X O
O   O

Sources of Invalidity
Internal:
Interaction of Selection and Maturation, etc.
Regression ?

External:
Interaction of Testing and X
Interaction of Selection and X ?
Reactive Arrangements ?

Page 28:

Examples of Other Quasi-Experimental Designs

Time Series

O O O O X O O O O

Multiple Time Series
O O O O X O O O O
O O O O   O O O O
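A sketch of an interrupted time-series check; all values are invented. The run of observations before X establishes a baseline, and the level shift after X is the estimate:

```python
# Time series (O O O O X O O O O): estimate the level shift at X.
import random, statistics

random.seed(6)
before = [random.gauss(70, 2) for _ in range(4)]   # O O O O
after = [random.gauss(75, 2) for _ in range(4)]    # X, then O O O O

shift = statistics.mean(after) - statistics.mean(before)
print(f"level shift at X: {shift:.2f}")            # about the simulated 5-point jump
# In a multiple time series, an untreated parallel series observed at the same
# points should show no shift; a shift in both would point to history, not X.
```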

Page 29:

Quasi-Experimental Designs

Time Series
O O O O X O O O O

Sources of Invalidity
Internal:
History
Instrumentation ?

External:
Interaction of Testing and X
Interaction of Selection and X ?
Reactive Arrangements ?

Page 30:

U.S. Department of Education Focuses on Level of Evidence

The U.S. Department of Education highlights “What Works” in educational strategies;

“What works” is based upon an assessment of the level of evidence provided by educational research: evaluation research.

Page 31:

Department of Education Evaluates Evidence

Page 32:

General Education & Learning Outcomes Assessment: The National Context

At the National Symposium on Student Success, Secretary of Education Margaret Spellings and others called on colleges to measure and provide evidence of student learning.

“Measuring Up,” the national report card by state: little data on whether students are learning.

Outcomes assessment has two purposes:
• Accountability (standardized national tests?)
• Assessment/effectiveness: Are students learning? How much?

Page 33:

Performing Assessment as Research in Practice

Assessment should seek systematic evidence of the effectiveness of existing programs, pedagogies, methodologies and approaches to improve student learning outcomes and instill a cycle of continuous improvement.

Implementation Strategy: Aim for Quasi-Experimental Designs (or Experimental Designs)

Page 34:

Revitalizing Assessment: Consider these Next Steps

Encourage comparing teaching strategies when faculty are teaching more than one section of the same course

Communicate and Use Results

Page 35:

QUESTIONS? Please email [email protected]