Jen Sweet, Office for Teaching, Learning & Assessment
DePaul University

Shannon Milligan, Faculty Center for Ignatian Pedagogy
Loyola University Chicago

ADVANCED SURVEY DESIGN
Tuesday, December 8, 2015, 2:00–3:30 pm
Workshop Outline
• Introductions
• Conflicting Recommendations for Survey Design?
• Cognition and Survey Design
• Recommendations to Reduce Cognitive Load
ACTIVITY!
• Affective Assessment and Surveys
ACTIVITY!
• Survey Design & Distribution Tools
• Analyses of your Survey Instrument
Workshop Outcomes
By the end of this workshop, participants will be able to:
• Apply knowledge of the cognitive processes students use to respond to surveys to design effective survey items and instruments.
• Identify non-cognitive variables that can be assessed with surveys.
• Use the tools available to them at their respective institutions to design and distribute surveys.
• Identify a variety of methods available for the analysis of survey data.
Conflicting Recommendations for Survey Design?
Recommendations for Survey Design That Seem to Conflict
Examples:

Neutral Point
• Always include?
• Never use?
• Sometimes yes; sometimes no?

Number of Scale Points to Include
• 2?
• 3?
• 4?
• 5?
• 6?
• 7?
• 9?
• The more the better?

All of This is in the Literature!
So, What’s Up with the Literature?

All of these recommendations may be appropriate depending on the specific context:
• respondent attributes
• nature of the items in the survey
• length of the survey
• etc.

Generally looking at things like:
• Reliability
• Validity
• Survey Outcomes
• Response Rate (high)
• Use of Response Sets (low)

Survey Design is as Much Art as Science!
“There is always a well-known solution to every human problem - neat, plausible, and wrong.” - H. L. Mencken
Cognition and Survey Design
Cognitive Load (Paas & Van Merrienboer, 1994)

The amount of cognitive effort (or thinking) students need to exert to respond to a survey item.

If cognitive load exceeds the student’s working memory capacity, they will take some sort of shortcut (Paas & Van Merrienboer, 1994), or satisfice (Krosnick, 1991):
• Read questions less carefully (skim)
• Use a response set
• Give same response for all questions, regardless of content
• Overuse neutral or N/A response option
• Skip the question (provide no response)
• Respond randomly
• Decide not to complete the survey
Cognitive Steps to Respond to Surveys (Tourangeau, 1984)

1. Interpretation
2. Retrieval
3. Judgment
4. Response
Recommendations to Reduce Cognitive Load
Step 1: Interpretation
• Use language that is clear and familiar to survey respondents.
• Avoid cognitively taxing wording.
• Avoid unfamiliar words and phrasing.
• Avoid jargon and acronyms.
• Ensure that question stems are clear and explicit.
• Do not use concepts that are unclear or unfamiliar to respondents.
• Avoid complex sentence structures.
• Ask about only one concept in each stem; avoid double-barreled questions.
• Use questions that do not make assumptions.
• Ask for information in a direct manner by avoiding double negatives.
• Ensure question stems are succinct, including only as much information as is necessary for respondents to properly interpret what is being requested of them.
Interpretation (Continued)
• Include clear instructions that clarify the purpose of the survey instrument, and provide respondents with expected procedures for responding to it.
• Ensure that every portion of a survey instrument is visible without the need for additional action by the respondent.
• Use radio buttons instead of drop-down boxes to display response options.
• Do not “hide” definitions respondents may need to interpret and respond to survey items.
• Use easy-to-read font size and type.
• Use high-contrast font and background colors.
Step 2: Retrieval
• Use stems that request information with which respondents have primary experience; avoid asking for second-hand information (i.e., information that the respondent has heard about but not experienced personally) or hypothetical information.
• Group conceptually similar items together.
Step 3: Judgment
• Use the smallest number of response options necessary to encompass all meaningful divisions of what you are asking about.
• General guideline: four or five response options, depending on whether or not there will be a neutral option.
• Include a neutral option if you reasonably expect participants to have no opinion; otherwise, it should be avoided.
  • Neutral responses can be difficult to interpret.
  • Offering a neutral option may encourage satisficing.
Step 4: Response
• Use the smallest number of response options necessary to encompass all meaningful divisions of what you are asking about.
• Label the scale options.
  • You may only need to label the most extreme options.
• Include a neutral option if you reasonably expect participants to have no opinion; otherwise, it should be avoided.
  • Neutral responses can be difficult to interpret.
  • Offering a neutral option may encourage satisficing.
Activity!
Practice Evaluating Survey Items!
Individually: Complete the worksheet.

In Groups: Compare responses.
• Did everyone identify the same items for improvement?
• Are there differences in the ways you edited items?
Surveys and the Rise of Affective Assessment
Growing Emphasis on Non-Cognitive Abilities
• Grit
• Growth
• Social-emotional development
• Self-awareness/management/efficacy
• General affect
• Engagement
• Mattering
• Climate

• Research shows strong relationships between these variables and overall success.
From NPR: “Nonacademic Skills Are Key to Success. But What Should We Call Them?” (May 28, 2015)
The Role of Surveys

From NPR: “To Measure What Tests Can’t, Some Schools Turn to Surveys” (December 2, 2015)

From that article: “A growing battery of school leaders, researchers and policymakers think surveys are the best tool available right now to measure important social and emotional goals for schools and students.”

Why?
• Easy to administer
• Easier to collect and analyze than reflection papers (and the like)
• Many surveys/survey questions already exist
• Faster data sharing = faster decision-making/implementation (maybe)
Activity!
Non-Cognitive Assessment and You
Individually:
• What non-cognitive variables might be of interest to your program?
• Are you already assessing any of these variables?

In Groups:
• Share what you are assessing and what you might be interested in assessing.
• What variables might be important to look at on an institutional level?
Survey Design/Distribution Tools
Three Main Tools Used at DePaul
1. Qualtrics
2. Google Forms
3. SurveyMonkey
Qualtrics

Advantages:
• Free to DePaul faculty and staff
• Supported by Information Services
• Very flexible and comprehensive system
• Lots of features

Disadvantages:
• Reporting features aren’t great
• Steeper learning curve than other systems
Google Forms

Advantages:
• Free to everyone
• Data is collected in Excel format
• Easier to learn than Qualtrics

Disadvantage:
• Not nearly as many features as Qualtrics
SurveyMonkey

Advantages:
• ?

Disadvantages:
• Most advanced features cost money
• To get all the features available in Qualtrics, cost is $780/year
• Limited to 10 questions and 100 responses on the free version
Overview of Survey Analysis
Survey Analysis

For when you want to:
• Group survey respondents
• Group survey items
• Make predictions/observe relationships
• Analyze “fit” of respondents and items

Which to use is based on:
1. Purpose
2. Audience
http://www.edmeasurement.net/5244/SPSS%20survey%20data.pdf
Basic Survey Analysis

Sometimes all you need are:
1. Means
2. Correlations
3. Frequency tables/graphs
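The deck leaves these three basics abstract. As a minimal sketch in Python with pandas (the item names and responses below are invented for illustration):

```python
import pandas as pd

# Invented responses to three hypothetical 5-point Likert items
# (1 = strongly disagree, 5 = strongly agree).
responses = pd.DataFrame({
    "q1_belonging":  [4, 5, 3, 4, 2, 5, 4, 3],
    "q2_engagement": [3, 5, 2, 4, 2, 4, 4, 3],
    "q3_support":    [5, 4, 3, 5, 1, 5, 3, 4],
})

# 1. Means: one summary value per item.
means = responses.mean()

# 2. Correlations: how strongly responses to the items move together.
correlations = responses.corr()

# 3. Frequency table: how often each scale point was chosen for an item.
freq_q1 = responses["q1_belonging"].value_counts().sort_index()

print(means.round(2))
print(correlations.round(2))
print(freq_q1)
```

For a graph, the same frequency table can be passed straight to `freq_q1.plot.bar()`.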
Survey Analysis - Group Respondents

• Cluster Analysis
• What it does: Creates groups or “clusters” of respondents based on similar responses to a set of survey questions
• Kind of intuitive because: Clustering is part of organizing (e.g., organization of a produce section, medical symptoms)
• What it answers: What do members of each cluster have in common?
  • Ex.: First-generation students tend to have less familiarity with research opportunities
• Useful for: Marketing, outreach, cohort creation
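The slide names cluster analysis generally; one common algorithm is k-means. A sketch with scikit-learn, assuming it is available and using invented responses:

```python
import numpy as np
from sklearn.cluster import KMeans

# Invented data: rows are respondents, columns are three 5-point items
# about familiarity with research opportunities.
X = np.array([
    [1, 1, 2], [2, 1, 1], [1, 2, 2],   # consistently low familiarity
    [5, 4, 5], [4, 5, 5], [5, 5, 4],   # consistently high familiarity
])

# Ask k-means for two clusters of respondents with similar response patterns.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
labels = km.labels_
print(labels)  # the first three respondents share one label, the last three the other
```

In practice the number of clusters is a judgment call; tools such as silhouette scores can guide the choice.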
Survey Analysis - Group Survey Items

• Factor Analysis
• What it does: Groups statistically related survey items into a number of “factors”
• Kind of intuitive because: Groupings can be done based on preconceived ideas or based on the data analysis; also largely correlation-based
• What it answers: Can items be removed from the survey? Do the factors relate to constructs that make sense (e.g., identified learning outcomes)?
• Useful for: Survey refinement, naming constructs

Example: Do our questions aimed at measuring a certain outcome actually seem to measure that outcome?
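As an illustration of the idea, here is an exploratory factor analysis via scikit-learn on simulated data (the latent construct names and all numbers are invented; real survey work would start from actual item responses):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Simulate item responses driven by two latent constructs.
rng = np.random.default_rng(0)
n = 300
engagement = rng.normal(size=n)   # latent factor 1 (hypothetical)
belonging = rng.normal(size=n)    # latent factor 2 (hypothetical)

def noise():
    return 0.3 * rng.normal(size=n)

X = np.column_stack([
    engagement + noise(), engagement + noise(), engagement + noise(),  # items 0-2
    belonging + noise(), belonging + noise(), belonging + noise(),     # items 3-5
])

# Fit a two-factor model; each row of components_ holds one factor's loadings.
fa = FactorAnalysis(n_components=2, random_state=0).fit(X)
loadings = fa.components_   # shape: (2 factors, 6 items)
print(np.round(loadings, 2))
```

Items that load heavily on the same factor appear to measure the same construct; items that load on nothing are candidates for removal.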
Survey Analysis - Make Predictions

• Regression
• What it does: Determines the relationship between multiple predictor variables and a dependent variable
• Kind of intuitive because: Similar to correlation (determining the relationship between two variables), but with the ability to control for other variables
• What it answers: Which item(s) are significant predictors of an outcome? Is there a positive or negative relationship between the predictor variable and the outcome variable? How well do our variables explain the observed outcome?
• Useful for: Looking at relationships between responses to certain questions (or factors) and outcomes; determination of resource allocation; possible survey refinement

Example: Do students who report greater library use have a higher GPA?
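A sketch of the library-use example with ordinary least squares regression (scikit-learn assumed available; the data are synthetic, with the relationships planted by construction):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data for the slide's example question: does reported library
# use predict GPA, controlling for study hours? All numbers are invented.
rng = np.random.default_rng(1)
n = 500
library_visits = rng.integers(0, 20, size=n)   # self-reported visits per term
study_hours = rng.integers(0, 30, size=n)      # control variable
gpa = 2.0 + 0.05 * library_visits + 0.02 * study_hours + rng.normal(scale=0.2, size=n)

# Fit GPA on both predictors at once, so each coefficient is the
# relationship holding the other variable constant.
X = np.column_stack([library_visits, study_hours])
model = LinearRegression().fit(X, gpa)
print(np.round(model.coef_, 3), round(model.intercept_, 2))
```

Here the fitted coefficients recover the planted relationships (roughly 0.05 extra GPA points per library visit, holding study hours constant); for significance tests and R², a package such as statsmodels reports them directly.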
Survey Analysis - Analyze “Fit” of Respondents and Items

• Rasch Analysis and Item Response Theory (IRT)
• What they do: Provide information about the “ability” of respondents and the “difficulty” of survey items
• But they differ in: Rasch only considers respondent ability and item difficulty; IRT can also account for guessing and greater differences between high- and low-ability respondents
• What it answers: Which item(s) are too easy or too difficult? Is the survey measuring a single variable? Are there trends in responses between respondent groups?
• Useful for: Survey refinement, detecting bias in survey items, determining the number of levels needed in a rating scale

Example: Is a survey item written in such a way that it is interpreted differently across student groups?
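The core of the Rasch model is a single logistic formula: the probability of endorsing (or correctly answering) an item depends only on the gap between respondent ability and item difficulty. A self-contained sketch:

```python
import math

def rasch_probability(ability: float, difficulty: float) -> float:
    """Rasch model: P(endorse/answer correctly) as a logistic function
    of the gap between respondent ability and item difficulty."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A respondent whose ability exactly matches the item's difficulty
# has a 50% chance of endorsing it.
print(rasch_probability(0.0, 0.0))   # 0.5
# As ability exceeds difficulty, the probability rises toward 1.
print(round(rasch_probability(2.0, 0.0), 2))
```

Estimating the ability and difficulty parameters from real response data requires dedicated software (e.g., Winsteps or the R `eRm`/`mirt` packages); the formula above is only the model those tools fit.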
Contact Information

Jen Sweet
DePaul University
Office for Teaching, Learning & Assessment
[email protected]

Shannon Milligan
Loyola University Chicago
Faculty Center for Ignatian Pedagogy
[email protected]