Presenters

Richard W. Riley College of Education and Leadership Research Methodology for the Project Study EdD Residency Program

Presenters: Dr. Wade Smith, [email protected], 352-895-9900; Dr. Paul Englesberg, [email protected], 360-380-2238 (PST)

Transcript of Presenters

Page 1: Presenters

Richard W. Riley College of Education and Leadership

Research Methodology for the Project Study

EdD Residency Program

Page 3: Collaborative Space for Notes

• http://edd-res-method.wikispaces.com/

• Ask to join

• Upload notes for colleagues’ use

Page 4: Purpose of this Session

• Select and apply the appropriate method to a given problem statement and research questions.

• Align methodology with problem statement, research questions, data collection, and analysis.

• Recognize methodological alignment in a peer-reviewed article.

• Practice summarizing primary research.

Page 5: Methodology for a Project Study (EdD)

• STUDY TYPE

• PROJECT STUDY – Scholarly paper and project

• STUDY METHODOLOGY

• Quantitative

• Qualitative

• Mixed

Page 6: Quantitative Research

• Concise statement of purpose and question

• Specification of measurable constructs (variables)

• Question poses a relationship or comparison

• Results are numerically presented.

• Narrative is objective in orientation; all elements are congruent, with exactly consistent statements of key elements

• NOT creative or emotive rhetorical writing

Page 7: Research Questions and Purpose

The research question is the most important element of the research endeavor. More time and care should be invested in determining the right question in the correct form than in any other part of the process. Once a proper question is articulated, the rest of the process falls into place [paraphrase] (Creswell, personal communication, 1995).

Page 8: DVs and IVs for Quant. Problem Statement

• Study must include a conjectured relationship between at least 2 variables:

• Independent Variable (IV): The variable that is being manipulated or tested

• Dependent Variable (DV): The variable that varies based on manipulation of the IV

• Example: SIGNIFICANT differences in reading comprehension scores (DV) by level of parental involvement (IV)

Page 9: Dependent and Independent Variables

• According to Creswell (2003), independent variables are generally defined as consisting of the two or more treatment conditions to which participants are exposed. These variables “cause, influence, or affect outcomes” (Creswell, 2003, p. 94).

• Dependent variables are observed for changes in order to assess the effect of the treatment. They “are the outcomes or results of the influence of the independent variables” (Creswell, 2003, p. 94).

Page 10: Dependent and Independent Variables

For example: An independent variable could be parental involvement in the reading activities of English language learners, while the dependent variable is identified as reading comprehension performance as measured by the reading portion of the Criterion-Referenced Competency Test (CRCT).

Page 11: I. Quantitative purpose and questions

• Specific purpose, research questions, hypotheses, and/or research objectives are concise and clearly defined.

• Must include measurable elements.

Page 12: Quantitative Research Questions

• There are three basic types of questions

• Descriptive: When a study is designed primarily to describe what is going on or what exists

• Relational: When a study is designed to look at the relationships between two or more variables

• Comparative: When a study is designed to compare differences between groups or conditions in terms of measurable outcomes; sometimes called causal comparative

Page 13: Hypotheses

• Statements of relationships between variables you want to test—only when using statistical tests

• Expectations based on what the literature indicates about the relationship between the variables

• Stated in terms of a null hypothesis (no SIGNIFICANT difference or no SIGNIFICANT change) and research hypothesis (statement of conjectured SIGNIFICANT difference)

Page 14: Hypotheses as Operational Definitions

• Hypotheses are, in a sense, a statement of operational definitions

• An operational definition matches a concept, such as intelligence, with a measurement tool, such as the Wechsler Adult Intelligence Scale (WAIS)

• Your hypotheses MUST operationalize your concepts; this makes your hypotheses TESTABLE.

Page 15: Hypothesis Example

• You are interested in cognition and ADD

• Null (H01): There is no [statistically significant] difference in time on task [as measured by the TaskTest] between ADD children who are given 5 minutes of physical exercise every half hour and ADD children who follow the normal daily routine.

• Research (H1): There is a [statistically significant] difference in time on task [as measured by the TaskTest] between ADD children who are given 5 minutes of physical exercise every half hour and ADD children who follow the normal daily routine.
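A minimal sketch of how this null might be tested, assuming hypothetical time-on-task scores from the TaskTest for each group (all values are invented; scipy's independent-samples t test is one standard choice):

```python
# Hypothetical test of H01 above with an independent-samples t test.
from scipy import stats

# Invented time-on-task scores (minutes) for each group.
exercise_group = [14.2, 15.1, 13.8, 16.0, 14.9, 15.5, 13.9, 14.6]
routine_group = [12.1, 13.0, 11.8, 12.5, 13.2, 12.0, 11.5, 12.8]

t_stat, p_value = stats.ttest_ind(exercise_group, routine_group)

alpha = 0.05  # conventional significance level
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print("Reject H01" if p_value < alpha else "Fail to reject H01")
```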

Page 16: Hypotheses

• Statistical hypotheses should be straightforward statements of what is expected to happen in regard to the independent and dependent variables.

• NOTE: Descriptive questions do not suggest hypotheses.

Page 17: Hypotheses, cont’d.

• The NULL is a statement of NO SIGNIFICANT effect, NO SIGNIFICANT relationship, or NO SIGNIFICANT difference, depending on the research question and design. The statement should be clear, understandable, and not unnecessarily complex or abstruse.

Page 18: Hypotheses, cont’d.

Null hypotheses are in the following form:

The independent variable has NO SIGNIFICANT effect on the dependent variable.

Or

There is NO SIGNIFICANT relationship between the independent and dependent variable.

Or

There is no SIGNIFICANT difference between the treatment group and the placebo group in terms of the dependent variable.

Page 19: Hypotheses, cont’d.

• For complex designs, multiple hypotheses are sometimes required. For example, when the design incorporates multiple independent variables (factorial designs), there might be two or more sets of hypotheses.

Page 20: Research Question Example

Example: Research questions:

1. Does gender of student affect math performance?

2. Does pedagogy X affect math performance?

3. Is there an interaction between gender and pedagogy X in terms of math performance?

Page 21: Hypotheses Examples

• Hypotheses:

• Null 1: Gender has no SIGNIFICANT effect on math performance.

• Null 2: Pedagogy X has no SIGNIFICANT effect on math performance.

• Null 3: There is no SIGNIFICANT interaction between gender and pedagogy X in terms of math performance.
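As an illustration, all three nulls can be tested in a single two-way (factorial) ANOVA. A minimal sketch with pandas and statsmodels, using invented scores and column names:

```python
# Hypothetical factorial example: math performance by gender and pedagogy.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Invented data: 2 genders x 2 pedagogy conditions, 3 students per cell.
df = pd.DataFrame({
    "gender": ["M", "M", "M", "F", "F", "F"] * 2,
    "pedagogy": ["X"] * 6 + ["control"] * 6,
    "math": [78, 82, 75, 80, 85, 79, 70, 72, 68, 74, 71, 69],
})

# One model yields tests for Null 1 (gender), Null 2 (pedagogy X),
# and Null 3 (the gender-by-pedagogy interaction).
model = ols("math ~ C(gender) * C(pedagogy)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```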

Page 22: Measured Variables

• Note that quantitative analysis requires numeric measurement of variables. So, focus groups, interviews, and open-ended observations are NOT typically part of quantitative study. Quantitative inquiry focuses on variables that can be measured by reliable and valid instruments, such as tests, numerically represented attitude scales, or other tools that yield numerical results.

Page 23: I. Purpose of the Study

• A simple paragraph that describes the intent of your study. It should flow directly from the problem statement. Two to three sentences are sufficient.

• It should be logical and explicit.

• Verify that it relates directly to the problem statement and research questions.

Page 24: Purpose of the Study Example

• The purpose of this correlational study is to examine the relationship between the level of parental involvement and reading comprehension performance on the XYZ assessment among elementary English language learners.

• (Matches earlier DV and IV example)

Page 25: Quantitative Research – Design and Methods

• All elements of the quantitative study MUST be exactly aligned.

• The problem, purpose, questions, hypotheses, design, and methods MUST be congruent and directly aligned.

• The same terms must be used every time an element is discussed, described, or mentioned.

Page 26: Quantitative Research – Design and Methods, cont’d.

• The design and methods narrative includes:

• The larger population of interest, to which the results will be generalizable

• The location and context of the study (How do the location and context relate to the research purpose and questions?)

• Instruments or means of measuring variables

(Only variables mentioned in the purpose and question are measured.)

Page 27: Quantitative Research – Design and Methods, cont’d.

• The subjects who will provide the measurements on the variables of interest

(Why are these the best subjects for the purpose and questions?)

• The sampling strategy (for quantitative inquiry, a random/representative or equivalent sample is required)

Page 28: Quantitative Research – Design and Methods, cont’d.

• Data analysis—how will the data collected be analyzed to answer the research question?

Page 29: Main Research Designs

• Experimental

• Random assignment—Comparative design

• Quasi-experimental: pre-selected groups (not randomly assigned; convenience)

• Causal comparative design

• Within-group designs (pretest/posttest or matched-pairs comparisons); a very weak, often unconvincing design

• Between-group designs (comparisons between groups)

• Non-Experimental

• Descriptive

• Correlational/secondary data analysis

Page 30: Selecting a Research Design

• What are the underlying constructs (variables)?

• What is the intended population?

• What is the intended interpretation?

• Sometimes you need more than one type of design

• You can/should use an existing design (Creswell, pp.168–171)

Page 31: A Few Examples of Quantitative Studies

• Treatment/intervention outcomes (e.g., in schools/classrooms)

• Activity analysis (# of behaviors/episode)

• Policy analysis (Breakdown by criteria – content analysis)

• Program Evaluation

• Needs Assessment

• Surveying an organization to understand the impact of management practices on employees

• Secondary data analysis

• Developing reliable and valid scales to assess attitudes

Page 32: Methodology — Setting and Sample

• Setting and Sample

• Population from which sample will be drawn

• Describe and defend sampling method

• Describe and defend sample size

• Use sample size generators for random selection designs:

• See notes

• Eligibility criteria for participants

• Characteristics of selected sample

Page 33: Population and Sample Example

• What is the population of interest? (e.g., 4th-grade boys)

• Your sample is drawn from a population. Note: You can only generalize to the populations from which you have sampled. Describe how you will select your participants.

• Recruitment strategies

• What is your role as a researcher?

• Describe the demographics of your sample

• Gender

• Age

• Independent variables

Page 34: Types of Samples

• WEAK: Convenience sample – a subset of a population that is available for study (may or may not be representative of the population)

• Random Sample – each member of a population has an equal chance of being picked

• Equivalent to random/representative

• Website for further information: http://www.socialresearchmethods.net/tutorial/Mugo/tutorial.htm
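A small sketch of simple random sampling, assuming a hypothetical roster of 500 students (Python standard library only):

```python
# Hypothetical simple random sample: every member of the roster has an
# equal chance of being selected (sampling without replacement).
import random

population = [f"student_{i:03d}" for i in range(1, 501)]  # invented roster
sample = random.sample(population, k=50)
print(len(sample), sample[:5])
```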

Page 35: Sample Size

• Rule of thumb: In order to estimate the number needed for your sample, use 15–20 participants per variable. The larger the sample, the better!

• Statistical Power Analysis basically answers the question How large must my sample be to ensure a reasonable likelihood of detecting a difference if it really exists in the population?

• You will need to report statistical power in your study (http://statpages.org/).
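As a sketch of what an a priori power analysis computes (statsmodels here, rather than the statpages.org calculators; the medium effect size is an assumed value, in Cohen's, 1988, terms):

```python
# Hypothetical power analysis for a two-group comparison:
# how many participants per group to detect a medium effect?
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,  # assumed Cohen's d (medium effect)
    alpha=0.05,       # significance level
    power=0.80,       # desired chance of detecting a true effect
)
print(f"Required sample size per group: {n_per_group:.0f}")  # ~64
```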

Page 36: Methodology — Instrumentation

• Instrumentation and Materials—describe data collection tools

• Name of instrument

• Concepts measured by instrument

• How scores are calculated and what they mean

• Processes for assessing the reliability and validity of the instrument (e.g., Cronbach’s alpha); see notes and the sketch after this list

• How participants will complete the instrument

• Detailed description of each variable
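A minimal sketch of one such reliability check, Cronbach's alpha, assuming item responses arranged as a respondents-by-items array (all data invented):

```python
# Hypothetical internal-consistency check: Cronbach's alpha for a
# 4-item attitude scale answered by 6 respondents (invented data).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Rows = respondents, columns = scale items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 4, 3, 3],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```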

Page 37: Instrumentation and Materials

• Describe the survey or other tools you will use in your investigation

• A brief description of the instrument, with references to previous studies that have used it

• Identify the underlying constructs. Constructs should derive from the theoretical foundation.

• Include copies of your measure in an Appendix.

Page 38: Test Validity

• A score from a test or other measurement instrument must represent what it is intended to represent.

• The researcher must provide some evidence and support for validity.

Page 39: Test Reliability

• Tests/measurement scores must be reliable.

• Reliability means that scores will be similar across different administrations, or that the score is based on an internally consistent set of items/tasks.

• Reliability is a technical issue.

Page 40: Methodology — Data Analysis

• An explanation of all descriptive and/or inferential analyses

• Null hypotheses as they relate to research questions

• Specific explanations of variables

• Best presented in a sequence that matches the research questions

Page 41: Statistical Analysis

• Based on frequency in category, average of measurement, variability of measurement

• Answers questions such as: Is the frequency of occurrence what we expect? Is the frequency the same across groups?

• Are the average results different between or among groups?

• Is there a relationship between or among measured variables?

Page 42: Characteristics of a Quantitative Method

• Rooted in testable and confirmable theories

• Looking for relationships

• Statistical tests are used to analyze the data:

• t tests

• Analysis of variance (ANOVA), analysis of covariance (ANCOVA)

• Chi-square analysis

• Correlation

• Linear/logistic regression (see notes)

Page 43: Types of Tests

• Independent-Samples t test (compare scores of two independent groups)

• Compare achievement of two groups

• Compare employees from two companies on morale

• Paired-Samples t tests (compare two groups of scores that are matched)

• Compare the pretest and posttest scores provided by participants of an intervention (pre-post design)

• ANOVA (comparing two or more levels of an independent variable)

• Can be between groups (independent groups) or repeated-measures (matched scores)
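A brief sketch of these tests with scipy (all scores invented, purely illustrative):

```python
# Hypothetical examples of the tests above (invented data throughout).
from scipy import stats

# Independent-samples t test: two unrelated groups.
group_a = [75, 82, 68, 90, 77, 85]
group_b = [70, 65, 72, 68, 74, 66]
print(stats.ttest_ind(group_a, group_b))

# Paired-samples t test: pretest vs. posttest for the same participants.
pretest = [55, 60, 52, 70, 65]
posttest = [62, 66, 59, 75, 71]
print(stats.ttest_rel(pretest, posttest))

# One-way ANOVA: three independent groups (levels of one IV).
group_c = [80, 79, 83, 81, 78, 84]
print(stats.f_oneway(group_a, group_b, group_c))
```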

Page 44: Types of Tests

• Chi-square (examine statistical relationship between categorical variables)

• Association – relationship between two categorical variables

• Goodness of fit: is the distribution in your sample the same as in the population, or the same as in another study?

• Correlation (relationship between 2 variables)

• Pearson r (parametric) or Spearman (non-parametric)

• Regression (examine the effect of multiple independent variables on one dependent variable); see notes

• How do various differentiation strategies predict achievement?
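A compact sketch of these three analyses with scipy, again on invented data:

```python
# Hypothetical examples: chi-square, correlation, and regression.
import numpy as np
from scipy import stats

# Chi-square test of association between two categorical variables
# (rows: two groups; columns: passed / did not pass).
table = np.array([[30, 10],
                  [22, 18]])
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi-square: chi2 = {chi2:.2f}, p = {p:.3f}")

# Pearson correlation between two continuous variables.
involvement = [1, 3, 2, 5, 4, 6, 2, 5]     # invented IV values
scores = [60, 72, 65, 85, 78, 90, 63, 82]  # invented DV values
r, p = stats.pearsonr(involvement, scores)
print(f"correlation: r = {r:.2f}, p = {p:.3f}")

# Simple linear regression: predict scores from involvement.
fit = stats.linregress(involvement, scores)
print(f"regression: slope = {fit.slope:.2f}, R^2 = {fit.rvalue ** 2:.2f}")
```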

Page 45: Which test should I use?

• Tutorial that helps you decide:

http://www.wadsworth.com/psychology_d/templates/student_resources/workshops/stat_workshp/chose_stat/chose_stat_01.html

• This site has four different interactive webpages that help you decide the correct analytical procedure:

http://statpages.org/#WhichAnalysis

Page 46: Tool for Aligning Methods and Questions

Problem

Purpose

Research Question

Hypothesis

Analysis

Page 47: GOAL = Overall Alignment in your Study

Problem Statement

Nature of the Study/Guiding Question

Purpose of the Study and Research Design

Setting and Sample

Data Collection

Data Analysis

Page 48: References

• Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Mahwah, NJ: Lawrence Erlbaum.

• Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed methods approaches (2nd ed.). Thousand Oaks, CA: Sage Publications.

• Gravetter, F. J., & Wallnau, L. B. (2004). Statistics for the behavioral sciences (6th ed.). Belmont, CA: Thomson-Wadsworth.

• Hallahan, M., & Rosenthal, R. (1996). Statistical power: Concepts, procedures, and applications. Behaviour Research and Therapy, 34, 489–499.

• Lipsey, M. W., & Wilson, D. B. (1993). The efficacy of psychological, educational, and behavioral treatment: Confirmation from meta-analysis. American Psychologist, 48(12), 1181–1209.

• Murphy, K. R., & Myors, B. (1998). Statistical power analysis: A simple and general model for traditional and modern hypothesis tests. Hillsdale, NJ: Erlbaum.

• Patten, M. L. (2007). Understanding research methods: An overview of the essentials (6th ed.). Los Angeles: Pyrczak Publishing.

• Rossi, J. (1990). Statistical power of psychological research: What have we gained in 20 years? Journal of Consulting and Clinical Psychology, 58(5), 646–656.

• Tabachnick, B. G., & Fidell, L. S. (2001). Using multivariate statistics (4th ed.). Needham Heights, MA: Allyn & Bacon.

Page 49: Qualitative Research/Naturalistic Inquiry: Project Study

• Natural setting & culture/context-bound

• Researcher as instrument (interpersonal skills essential)

• Use of researcher’s tacit/implicit knowledge

• Qualitative methods (narrative observation & in-depth interview)

• Purposive sampling (based on research question)

• Inductive data analysis (emic vs. etic)

• Grounded theory (emerges from data/represents “reality”)

Page 50: Qualitative Research/Naturalistic Inquiry (cont’d.)

• Negotiated outcomes (interpret with participants)

• In-depth contextualized reporting (thick description)

• Idiographic (vs. nomothetic) representation

• Tentative (conditional) application

• Focus-determined but emerging boundaries

• Criteria for trustworthiness (confidence in findings)

Page 51: Qualitative Research: Approaches

• Biography—life history of an individual (n=1)

• Phenomenology—meaning of experiences re: particular phenomenon/construct (5-10)

• Grounded theory—develop theory inductively (20-30+)

• Ethnography—describe/interpret culture

• Case study—in-depth study of one or more cases

• Action research—theory-research-action

• Participatory (action) research—stakeholders as partners

Page 52: Biography

• Involves the study of an individual’s experiences.

• Can be told to the researcher

• Can be biographies, autobiographies, life histories, and oral histories

Page 53: Researcher Role in Biographies

• Select the type of biographical study

• Begin with objective experiences, noting life-course stages and experiences

• Gather stories or concrete contextual biographical material

• Organize stories around themes that account for pivotal events/epiphanies

• Explore the meaning of the stories

• Look to explain the meaning

Page 54: Phenomenology

• Explores how people make meaning of their experiences.

• Based on the philosophical perspective of Husserl

• Focus is on the intentionality of consciousness (Experiences are mediated by memory, image, and meaning)

Page 55: Researcher Role in Phenomenology

• Reduction

• Analysis of specific statements and themes

• Search for all possible meaning

• Setting aside all prejudgments

• Bracketing researcher experiences

Page 56: Researcher Role in Phenomenology (cont’d.)

• Understand the philosophical perspectives of the approach

• Write research questions that explore what experiences mean to the people and ask individuals to describe everyday life experiences

• Collect data from people who have experienced the issue or phenomenon under investigation

• Analyze data

• Write reports that describe the essence of the experiences and describe similarities in meaning of these experiences

Page 57: Case Study

• Exploration of a case or multiple cases over time

• Involves detailed, in-depth data collection from multiple sources of information

• The case can be a program, an event, an activity, or individuals.

• The case should be situated in its physical, social, historical, and/or economic setting

Page 58: Researcher Role in Case Study

• Consider which type of case study will be most useful

• Select cases that show an array of perspectives on the problem, process, or event

• Collect data using multiple sources of information, such as observations, interviews, documents, and audiovisual materials

• Analyze data through holistic or embedded analysis, using within-case or cross-case analysis

• Report the lessons learned from the case.

Page 59: Action Research

• Seeks full collaborative inquiry from all participants

• Often aimed at sustained change in organizational, institutional, or community contexts

• The distinction between participants and researchers blurs.

Page 60: Researcher Role in Action Research

• Begin with a critical review of past actions

• Plan the next action using the information gained from that review.

• Engage in research to improve practice and to gain understanding and effect change at the same time

• Engage in a process of regular, critical, and systematic reflection as you act or practice at every stage

• Maintain flexibility

• Report

Page 61: Grounded Theory

• Generates the discovery of a theory closely related to the phenomenon being studied

• Focuses on how individuals interact, take actions or engage in a process in response to a phenomenon

Page 62: Researcher Role in Grounded Theory

• Conduct field interviews to the point of saturation for categories

• Choose participants theoretically (theoretical sampling)

• Collect and analyze observations and documents

• Analyze data as you go

• Use a systematic approach to data analysis that involves open, axial, and selective coding and a conditional matrix

• Produce a substantive-level theory

Page 63: Methods

• Participant (narrative) observation

• Key informant interviewing

• In-depth interviewing

• Focus group interviews

• Ethnographic (culture-specific) survey

• Freelist & pilesort (elicitation techniques)

• Fieldnotes (researcher)

• Journal/logs/diaries (participants)

• Social networks

• Spatial mapping

• Tangible products (“artifacts”)

Page 64: Trustworthiness (Lincoln & Guba, 1985)

• The extent to which one can have confidence in the study’s findings

• Parallel of reliability, validity, & objectivity in traditional “quantitative” research

• Criteria

• Credibility—plausible, reflect reality of participants (cf. internal validity)

• Transferability—applicable to other contexts/samples (cf. external validity)

• Dependability—accounts for instability/change (cf. reliability)

• Confirmability—authenticate process & document researcher bias (cf. objectivity)

Page 65: Ensuring Trustworthiness (Lincoln & Guba, 1985)

• Prolonged engagement (a)

• Persistent observation (a)

• Triangulation (a)

• Peer debriefing (a)

• Member checks (a)

• Thick description (b)

• Audit trail (c, d)

• Negative case analysis (a)

• Reflexive journal (a, b, c, d)

• Referential adequacy (a)

Key: (a) credibility; (b) transferability; (c) dependability; (d) confirmability

Page 66: Principles of Data Transformation (Description, Analysis, Interpretation)

• Ongoing and recursive

• Integrates emic (meaningful to participants) and etic (meaningful to observers) perspectives

• Requires systematic documentation of procedures

• Includes both descriptive and inferential language

• Maximizes data triangulation

• Involves search for confirming and disconfirming evidence

• Specific to purpose and audience

• Participatory process

Page 67: Qualitative Research: Sampling

• Sampling refers to selection of individuals, units, settings

• Typically purposeful or criterion-based, that is, driven by characteristics relevant to research questions

• Purposes/goals:

• Identify representative or typical

• Reflect heterogeneity or variation

• Find cases critical to theory/question (e.g., extreme, unique, ideal)

• For controlled comparison (to demonstrate differences based on criteria; e.g., multiple case studies)

Page 68: Sample Size: Rule of Thumb

• Biography/Case Study: 1

• Phenomenology: < 10

• Grounded theory/ethnography/action research: 20-30 to reach saturation

• Methods (examples)

• Key informants: 5

• In-depth interviews: 30

• Freelist / pilesort: 30

• Focus group: based on “groupings” represented

(e.g., male & female, 3 age levels)

• Ethnographic survey: typically larger & representative (purposeful or random based on purpose)

Page 69: References

• Creswell, J. W. (1998). Qualitative inquiry & research design: Choosing among five traditions. Thousand Oaks, CA: Sage.

• Maxwell, J. A. (1996). Qualitative research design: An interactive approach. Thousand Oaks, CA: Sage.

• Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis (2nd ed.). Thousand Oaks, CA: Sage.

• Schensul, J. J., & LeCompte, M. D. (Eds.). (1999). Ethnographer’s toolkit: Volumes 1–7. Walnut Creek, CA: AltaMira Press.

Page 70: Mixed-Methods Design

• Decisions

• Regarding the type of data to be given priority (qualitative or quantitative)

• When data are collected and analyzed

• Most common designs sequence the use of qualitative and quantitative methods

Page 71: Explanatory Design

• Data are collected in two phases

• Quantitative data collected first; qualitative data are collected later

• Quantitative data are the emphasis, with qualitative data used to explain the quantitative findings.

Page 72: Example of Explanatory Design

• Schools participating in a National Science Foundation program are surveyed, and teachers from those schools select the most important outcomes for their students. Schools with high numbers of students who received awards at science contests are then selected, and their teachers are interviewed to understand how they are using the NSF program.

Page 73: Exploratory Design

• Data are collected in two phases

• Qualitative data collected first (interviews); quantitative data are collected later (surveys)

• Advantage: Surveys are constructed from the language used in the interviews

Page 74: Example of Exploratory Design

• Ten schools are selected, and case studies are conducted using observations, interviews, and document review to describe the procedures and activities these schools used to address school improvement under No Child Left Behind. Teachers and administrators are interviewed. A survey is developed following analysis of the case studies, and one hundred schools with similar problems are then surveyed, producing extensive data from this larger sample.

Page 75: Triangulation Design

• Most complex design

• Qualitative and quantitative data are collected simultaneously

• Results from both types of data are compared (triangulated) to see if similar findings are produced.

Page 76: Example of Triangulation Design

• A researcher attempted to paint a portrait of effective teacher techniques and behaviors. He used qualitative techniques (classroom observations, daily logs, interviews with students and teachers) as well as quantitative instruments (performance checklists, rating scales, discussion flowcharts). Triangulation was achieved by comparing the qualitative data with the results of quantitative measures of classroom interaction and achievement to develop a detailed description of each teacher's teaching style, behaviors, and techniques and their effects on students. Teachers were then compared for similarities and differences.

Page 77: Guide to Conducting a Mixed-Methods Study

• Determine the feasibility

• Do I have the time to use a Mixed-Methods Design?

• Do I have the knowledge to use a Mixed-Methods Design?

• Do I have the resources to use a Mixed-Methods Design?

• Identify a clear rationale

• What are the reasons for needing both qualitative and quantitative data?

Page 78: Evaluating Mixed-Methods Research

• Complex

• Evaluated to determine whether the design and research procedures match the guide for conducting the mixed-methods study

• Consider whether the criteria for the qualitative/quantitative approaches are appropriate

• Was the survey pilot tested? What was the response rate? (Quantitative)

• Did the researcher spend an adequate amount of time in the field? Did the researcher use multiple sources? (Qualitative)

Page 79: Questions for Evaluating a Mixed-Methods Study

• Is there a clear rationale for using each type of data?

• Is the type of design clearly described or clearly presented in a visual manner?

• Are there research questions for each type of data, and are these appropriate for the sequence and priority of data collection and analysis?

Page 80: Questions for Evaluating a Mixed-Methods Study (cont’d.)

• Are the data collection procedures clearly described? Is evidence of the quality of the methods of data collection presented, as appropriate, for both quantitative (reliability and validity) and qualitative (credibility and dependability) measures?

• Are the procedures for data analysis consistent with the type of design chosen and appropriate to the research questions asked?

• Do the written results have a structure that is appropriate to the type of mixed-methods design being used?

(Creswell, 2010, p. 288)

Page 81: References

• Creswell, J. W., & Plano Clark, V. L. (2007). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage.

• Morse, J. M. (1991, March/April). Approaches to qualitative-quantitative methodological triangulation. Nursing Research, 40(2), 120–123.

• Tashakkori, A., & Creswell, J. W. (2007). Exploring the nature of research questions in mixed methods research. Journal of Mixed Methods Research, 1(3), 207–211.

• Tashakkori, A., & Teddlie, C. (1998). Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks, CA: Sage.

• Tashakkori, A., & Teddlie, C. (Eds.). (2003). Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage.

Page 82: Group Activity: Your turn to draft… DIRECTIONS

• Divide students into small groups.

• Instruct students to examine the exemplar study and rubric.

• Using the exemplar study, students should locate the chosen method in Section 2.

• Using the rubric and the exemplar, students will locate the standards and sub-standards in the rubric and match them to those elements in the study exemplar.

• Instruct students to enter the page numbers in the rubric where the standards are met in the study exemplar.

• Students will then complete an alignment grid to show how the chosen methodology aligns with the research questions and data collection/analysis.

• Notify students that the Tool for Aligning Methods and Questions handout is for quantitative studies, and quantitative portions of mixed methods designs, only.

Page 83: It’s Your Turn…

Examine your proposal, rubric and alignment grid.

Decide which methodology would be appropriate for your study.

Complete your alignment grid using 1–2 research questions.

Page 84: Questions?

Don’t leave until I get to answer your question.

Page 85: Evaluation of Session

• Submit Your Seminar Evaluation

• Provide Feedback

• Include Improvements!
