Introduction to Web Survey Usability Design and Testing
![Page 1: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/1.jpg)
Introduction to Web Survey Usability Design and Testing
DC-AAPOR Workshop
Amy Anderson Riemer & Jennifer Romano Bergstrom
The views expressed on statistical or methodological issues are those of the presenters and not necessarily those of the U.S. Census Bureau.
![Page 2: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/2.jpg)
2
Schedule
9:00 – 9:15 Introduction & Objectives
9:15 – 11:45 Web Survey Design: Desktop & Mobile
11:45 – 12:45 Lunch
12:45 – 2:30 Assessing Your Survey
2:30 – 2:45 Break
2:45 – 3:30 Mixed Modes Data Quality
3:30 – 4:00 Wrap Up
![Page 3: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/3.jpg)
3
Objectives

Web Survey Design: Desktop & Mobile
• Paging vs. Scrolling
• Navigation
• Scrolling lists vs. double-banked response options
• Edits & Input fields
• Checkboxes & Radio buttons
• Instructions & Help
• Graphics
• Emphasizing Text & White Space
• Authentication
• Progress Indicators
• Consistency

Assessing Your Survey
• Paradata
• Usability

Quality of Mixed Modes
• Mixed Mode Surveys
• Response Rates
• Mode Choice
![Page 4: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/4.jpg)
Web Survey Design
![Page 5: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/5.jpg)
5
Activity #1
1. Today’s date
2. How long did it take you to get to BLS today?
3. What do you think about the BLS entrance?
![Page 6: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/6.jpg)
6
Why is Design Important?
• No interviewer present to correct/advise
• Visual presentation affects responses
– (Couper’s activity)
• While the Internet provides many ways to enhance surveys, design tools may be misused
![Page 7: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/7.jpg)
7
Why is Design Important?
• Respondents extract meaning from how question and response options are displayed
• Design may distract from or interfere with responses
• Design may affect data quality
![Page 8: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/8.jpg)
8
Why is Design Important?
http://www.cc.gatech.edu/gvu/user_surveys/
![Page 9: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/9.jpg)
9
Why is Design Important?
• Many surveys are long (> 30 min)
• Long surveys have higher nonresponse rates
• Length affects quality
Adams & Darwin, 1982; Dillman et al., 1993; Heberlein & Baumgartner, 1978
![Page 10: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/10.jpg)
10
Why is Design Important?
• Respondents are more tech savvy today and use multiple technologies
• It is not just about reducing respondent burden and nonresponse
• We must increase engagement
• High-quality design = trust in the designer
Adams & Darwin, 1982; Dillman et al., 1993; Heberlein & Baumgartner, 1978
![Page 11: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/11.jpg)
11
http://www.pewinternet.org/Static-Pages/Trend-Data-(Adults)/Device-Ownership.aspx
![Page 12: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/12.jpg)
12
http://www.pewinternet.org/Static-Pages/Trend-Data-(Adults)/Device-Ownership.aspx
![Page 13: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/13.jpg)
13
http://www.nielsen.com/content/dam/corporate/us/en/reports-downloads/2012-Reports/Nielsen-Multi-Screen-Media-Report-May-2012.pdf
![Page 14: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/14.jpg)
14
http://www.nielsen.com/content/dam/corporate/us/en/reports-downloads/2012-Reports/Nielsen-Multi-Screen-Media-Report-May-2012.pdf
![Page 15: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/15.jpg)
15
Nielsen: The Cross-Platform Report, Quarter 2, 2012 - US
![Page 16: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/16.jpg)
UX Design Failure
• Poor planning
• “It’s all about me.” (Redish: filing cabinets)
• Human cognitive limitations
– Memory & Perception
– (fun activity time)
![Page 17: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/17.jpg)
![Page 18: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/18.jpg)
![Page 19: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/19.jpg)
![Page 20: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/20.jpg)
![Page 21: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/21.jpg)
![Page 22: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/22.jpg)
UX Design Failure
• Poor planning• “It’s all about me.” (Redish: filing cabinets)• Human cognitive limitations
– Memory & Perception– (fun activity time)
- Primacy- Recency
- Chunking- Patterns
![Page 23: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/23.jpg)
23
Web Survey Design
• Paging vs. Scrolling
• Navigation
• Scrolling vs. Double-Banked
• Edits and Input Fields
• Checkboxes and Radio Buttons
• Instructions and Help
• Graphics
• Emphasizing Text
• White Space
• Authentication
• Progress Indicators
• Consistency
![Page 24: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/24.jpg)
24
Paging vs. Scrolling
Paging
• Multiple questions per page
• Complex skip patterns
• Not restricted to one item per screen
• Data from each page saved
– Can be suspended/resumed
• Order of responding can be controlled
• Requires more mouse clicks

Scrolling
• All on one static page
• No data is saved until submitted at end
– Can lose all data
• Respondent can review/change responses
• Questions can be answered out of order
• Similar look-and-feel as paper
![Page 25: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/25.jpg)
25
Paging vs. Scrolling
• Little advantage (breakoffs, nonresponse, time, straightlining) of one over the other
• Mixed approach may be best
• Choice should be driven by content and target audience
– Scrolling for short surveys with few skip patterns; respondent needs to see previous responses
– Paging for long surveys with intricate skip patterns; questions should be answered in order

Couper, 2001; Gonyea, 2007; Peytchev, 2006; Vehovar, 2000
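The guidance above can be sketched as a toy decision rule (a hypothetical `chooseLayout` helper; the 20-question threshold is an assumption for illustration, not a figure from the workshop):

```javascript
// Illustrative paging-vs.-scrolling decision rule, following the slide's
// guidance: paging for intricate skip patterns, scrolling when respondents
// need to see previous answers, and scrolling for short surveys.
function chooseLayout(survey) {
  const { questionCount, hasComplexSkips, mustSeePreviousAnswers } = survey;
  if (hasComplexSkips) return "paging";           // paging controls question order
  if (mustSeePreviousAnswers) return "scrolling"; // all answers stay visible
  // Threshold is an assumption: short surveys scroll, long ones page.
  return questionCount > 20 ? "paging" : "scrolling";
}
```

In practice the choice also depends on audience and content, as the slide notes, so a rule like this is at best a starting point.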
![Page 26: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/26.jpg)
26
Web Survey Design
![Page 27: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/27.jpg)
27
Navigation
• In a paging survey, after entering a response:
– Proceed to next page
– Return to previous page (sometimes)
– Quit or stop
– Launch separate page with Help, definitions, etc.
![Page 28: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/28.jpg)
28
Navigation: NP
• Next should be on the left
– Reduces the amount of time to move cursor to primary navigation button
– Frequency of use
Couper, 2008; Dillman et al., 2009; Faulkner, 1998; Koyani et al., 2004; Wroblewski, 2008
![Page 29: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/29.jpg)
29
Navigation NP Example
Peytchev & Peytcheva, 2011
![Page 30: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/30.jpg)
30
Navigation: PN
• Previous should be on the left
– Web application order
– Everyday devices
– Logical reading order
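A minimal markup sketch of the PN ordering (hypothetical `navBar` helper; the attribute choices are illustrative, not from the slides):

```javascript
// Render the navigation bar in PN order: Previous on the left,
// Next on the right, matching reading order and everyday devices.
function navBar() {
  return '<button type="button">Previous</button> ' +
         '<button type="submit">Next</button>';
}
```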
![Page 31: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/31.jpg)
31
Navigation PN Example
![Page 32: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/32.jpg)
32
Navigation PN Example
![Page 33: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/33.jpg)
33
Navigation PN Example
![Page 34: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/34.jpg)
34
Navigation PN Example
![Page 35: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/35.jpg)
35
Navigation Usability Study/Experiment
Romano & Chen, 2011
![Page 36: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/36.jpg)
36
Method
• Lab-based usability study
• TA read introduction and left letter on desk
• Separate rooms
• R read letter and logged in to survey
• Think Aloud
• Eye Tracking
• Satisfaction Questionnaire
• Debriefing
Romano & Chen, 2011
![Page 37: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/37.jpg)
37
Results: Satisfaction I
* p < 0.0001
Romano & Chen, 2011
![Page 38: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/38.jpg)
38
Results: Satisfaction II
Overall reaction to the survey: terrible – wonderful. p < 0.05.
Information displayed on the screens: inadequate – adequate. p = 0.07.
Arrangement of information on the screens: illogical – logical. p = 0.19.
Forward navigation: impossible – easy. p = 0.13.
[Four bar charts: mean satisfaction rating for the N_P vs. PN conditions on each of the four measures above (y-axis 6–8.5)]
Romano & Chen, 2011
![Page 39: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/39.jpg)
Eye Tracking
39
• Participants looked at Previous and Next in PN conditions
• Many participants looked at Previous in the N_P conditions
– Couper et al. (2011): Previous gets used more when it is on the right.
![Page 40: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/40.jpg)
40
N_P vs. PN: Respondent Debriefing
• N_P version
– Counterintuitive
– Don’t like the “buttons being flipped.”
– Next on the left is “really irritating.”
– Order is “opposite of what most people would design.”
• PN version
– “Pretty standard, like what you typically see.”
– The location is “logical.”
Romano & Chen, 2011
![Page 41: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/41.jpg)
41
Navigation Alternative
• Previous below Next
– Buttons can be closer
– But what about older adults?
– What about on mobile?
Couper et al., 2011; Wroblewski, 2008
![Page 42: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/42.jpg)
42
Navigation Alternative
• Previous below Next
– Buttons can be closer
– But what about older adults?
– What about on mobile?
Couper et al., 2011; Wroblewski, 2008
![Page 43: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/43.jpg)
43
Navigation Alternative: Large primary navigation button; secondary smaller
![Page 44: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/44.jpg)
44
Navigation Alternative: No back/previous option
![Page 45: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/45.jpg)
45
Confusing Navigation
![Page 46: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/46.jpg)
46
Web Survey Design
![Page 47: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/47.jpg)
47
Long List of Response Options
• One column: Scrolling
– Visually appear to belong to one group
– When there are two columns, 2nd one may not be seen (Smyth et al., 1997)
• Two columns: Double banked
– No scrolling
– See all options at once
– Appears shorter
![Page 48: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/48.jpg)
48
1 Column vs. 2 Column Study
Romano & Chen, 2011
![Page 49: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/49.jpg)
49
Seconds to First Fixation
* p < 0.01
[Bar chart: seconds to first fixation on the first half vs. second half of the list, 2-column vs. 1-column versions (y-axis 0–25)]
Romano & Chen, 2011
![Page 50: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/50.jpg)
50
Total Number of Fixations
[Bar chart: total number of fixations on the first half vs. second half of the list, 2-column vs. 1-column versions (y-axis 0–40)]
Romano & Chen, 2011
![Page 51: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/51.jpg)
51
Time to Complete Item
[Bar chart: mean, minimum, and maximum seconds to complete the item, 1-column vs. 2-column versions (y-axis 0–120 seconds)]
Romano & Chen, 2011
![Page 52: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/52.jpg)
52
1 Col. vs. 2 Col.: Debriefing
• 25 had a preference
– 6 preferred one column
• They had received the one-column version
– 19 preferred 2 columns
• 7 had received the one-column version
• Prefer not to scroll
• Want to see and compare everything at once
• It is easier to “look through,” to scan, to read
• Re one column, “How long is this list going to be?”
Romano & Chen, 2011
![Page 53: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/53.jpg)
53
Long Lists
• Consider breaking list into smaller questions
• Consider series of yes/no questions
• Use logical order or randomize
• If using double-banked, do not separate columns widely
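The first two suggestions can be illustrated with a small transformation that turns one long "select all that apply" list into a series of yes/no items (hypothetical `toYesNoSeries` helper; the wording pattern is an assumption):

```javascript
// Convert a long check-all list into a series of yes/no questions,
// one per original response option.
function toYesNoSeries(questionStem, options) {
  return options.map((opt) => ({
    text: `${questionStem} ${opt}?`,   // e.g. "...did you use a tablet?"
    responses: ["Yes", "No"],
  }));
}
```

A yes/no series forces the respondent to consider every option, whereas a long check-all list invites satisficing on the items below the fold.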
![Page 54: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/54.jpg)
54
Web Survey Design
![Page 55: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/55.jpg)
55
Input Fields Activity
![Page 56: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/56.jpg)
56
Input Fields
• Smaller text boxes = more restricted
• Larger text boxes = less restricted
– Encourage longer responses
• Visual/Verbal Miscommunication
– Visual may indicate “Write a story”
– Verbal may indicate “Write a number”
• What do you want to allow?
![Page 57: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/57.jpg)
57
Types of Open-Ended Responses
• Narrative
– E.g., Describe…
• Short verbal responses
– E.g., What was your occupation?
• Single word/phrase responses
– E.g., Country of residence
• Frequency/Numeric response
– E.g., How many times…
• Formatted number/verbal
– E.g., Telephone number
![Page 58: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/58.jpg)
58
Open-Ended Responses: Narrative
• Avoid vertical scrolling when possible• Always avoid horizontal scrolling
![Page 59: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/59.jpg)
59
Open-Ended Responses: Narrative
• Avoid vertical scrolling when possible• Always avoid horizontal scrolling
Wells et al., 2012
32.8 characters vs. 38.4 characters (~700 respondents)
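One way to honor "always avoid horizontal scrolling" in a narrative box is a soft-wrapped text area; this sketch uses a hypothetical `narrativeBox` helper with arbitrary example dimensions:

```javascript
// Build a narrative text box that wraps text instead of scrolling
// horizontally. wrap="soft" is the standard HTML textarea attribute.
function narrativeBox(rows, cols) {
  return `<textarea rows="${rows}" cols="${cols}" wrap="soft"></textarea>`;
}
```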
![Page 60: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/60.jpg)
60
Open-Ended Responses: Numeric
• Is there a better way?
![Page 61: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/61.jpg)
61
Open-Ended Responses: Numeric
• Is there a better way?
![Page 62: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/62.jpg)
62
Open-Ended Responses: Numeric
• Use of templates reduces ill-formed responses
– E.g., $_________.00
Couper et al., 2009; Fuchs, 2007
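A template like `$_________.00` can also be approximated client-side by normalizing whatever the respondent types. This is a rough sketch (hypothetical `formatCurrencyInput` helper; US-style grouping is an assumption):

```javascript
// Normalize free-form input toward a "$X,XXX.00" currency template:
// strip non-digits, then format the whole-dollar amount.
function formatCurrencyInput(raw) {
  const digits = raw.replace(/[^0-9]/g, ""); // keep digits only
  if (digits === "") return "";              // nothing usable entered
  return "$" + parseInt(digits, 10).toLocaleString("en-US") + ".00";
}
```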
![Page 63: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/63.jpg)
63
Open-Ended Responses: Date
• Not a good use: intended response will always be the same format
• Same for state, zip code, etc.
• Note
– “Month” = text
– “mm/yyyy” = #s
![Page 64: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/64.jpg)
64
Web Survey Design
![Page 65: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/65.jpg)
65
Check Boxes and Radio Buttons
• Perceived Affordances
• Design according to existing conventions and expectations
• What are the conventions?
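The conventions the next slides illustrate (checkboxes = select all that apply, radio buttons = select only one) can be stated as a small validator; this is an illustration only, with a hypothetical `validateSelection` helper:

```javascript
// Enforce the standard convention: radio-button items accept exactly one
// selection, checkbox items accept any number (including none).
function validateSelection(inputType, selected) {
  if (inputType === "radio") return selected.length === 1;
  if (inputType === "checkbox") return true;
  throw new Error("unknown input type: " + inputType);
}
```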
![Page 66: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/66.jpg)
66
Check Boxes: Select all that apply
![Page 67: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/67.jpg)
67
Check Boxes in drop-down menus
![Page 68: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/68.jpg)
68
Radio Buttons: Select only one
![Page 69: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/69.jpg)
69
Radio Buttons: Select only one
![Page 70: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/70.jpg)
70
Radio Buttons: In grids
![Page 71: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/71.jpg)
71
Radio Buttons on mobile
• Would something else be better?
![Page 72: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/72.jpg)
72
Reducing Options
• What is necessary?
![Page 73: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/73.jpg)
73
Web Survey Design
![Page 74: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/74.jpg)
74
Placement of Instructions
• Place them near the item
• “Don’t make me think”
• Are they necessary?
![Page 75: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/75.jpg)
75
Placement of Instructions
• Place them near the item
• “Don’t make me think”
• Are they necessary?
![Page 76: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/76.jpg)
76
Placement of Instructions
• Place them near the item
• “Don’t make me think”
• Are they necessary?
![Page 77: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/77.jpg)
77
Instructions
• Key info in first 2 sentences
• People skim
– Rule of 2s: Key info in first two paragraphs, sentences, words
![Page 78: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/78.jpg)
78
Instructions
![Page 79: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/79.jpg)
79
Instructions
![Page 80: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/80.jpg)
80
Placement of Clarifying Instructions
• Help respondents have the same interpretation
• Definitions, instructions, examples
Conrad & Schober, 2000; Conrad et al., 2006; Conrad et al., 2007; Martin, 2002; Schober & Conrad, 1997; Tourangeau et al., 2010
![Page 81: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/81.jpg)
81
Placement of Clarifying Instructions
Redline, 2013
![Page 82: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/82.jpg)
82
Placement of Clarifying Instructions
• Percentage of valid responses was higher with clarification
• Longer response time when before item
• No effects of changing the font style
• Before item is better than after
• Asking a series of questions is best
Redline, 2013
![Page 83: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/83.jpg)
83
Placement of Help
• People are less likely to use help when they have to click for it than when it is near the item
• “Don’t make me think”
![Page 84: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/84.jpg)
84
Placement of Error Message
• Should be near the item
• Should be positive and helpful, suggesting HOW to help
• Bad error message:
![Page 85: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/85.jpg)
85
Placement of Error Message
• Should be near the item
• Should be positive and helpful, suggesting HOW to help
• Bad error message:
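Both guidelines can be sketched in markup: emit the message directly after the field it describes, and have the wording (invented here for illustration) say how to fix the problem. The `renderError` helper is hypothetical:

```javascript
// Place the error message immediately after its field so the respondent
// sees it near the item, and link the two for assistive technology.
function renderError(fieldId, message) {
  return `<input id="${fieldId}" aria-describedby="${fieldId}-err">` +
         `<span id="${fieldId}-err" class="error">${message}</span>`;
}
// A helpful message says HOW to fix it, e.g.
// "Please enter your age as a number, e.g. 34." rather than "Error 1024."
```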
![Page 86: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/86.jpg)
86
Error Message Across Devices
![Page 87: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/87.jpg)
87
Error Message Across Devices
![Page 88: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/88.jpg)
88
Web Survey Design
![Page 89: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/89.jpg)
89
Graphics
• Improve motivation, engagement, satisfaction with “fun”
• Decrease nonresponse & measurement error
• Improve data quality
• Gamification
Henning, 2012; Manfreda et al., 2002
![Page 90: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/90.jpg)
90
Graphics
• Use when they supply meaning
– Survey about advertisements
• Use when user experience is improved
– For children or video-game players
– For low literacy
Libman, 2012
![Page 91: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/91.jpg)
91
Graphics
![Page 92: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/92.jpg)
92
Graphics
http://glittle.org/smiley-slider/
http://www.ollie.net.nz/casestudies/smiley_slider/
![Page 93: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/93.jpg)
93
Graphics Experiment 1.1
• Appearance
– Decreasing boldness (bold → faded)
– Increasing boldness (faded → bold)
– Adding face symbols to response options ( )
• ~2400 respondents
• Rated satisfaction re health-related things
• 5-pt scale: very satisfied → very dissatisfied
Medway & Tourangeau, 2011
![Page 94: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/94.jpg)
94
Graphics Experiment 1.2
• Bold side selected more
• Less satisfaction when face symbols present

Medway & Tourangeau, 2011

[Example items for “Your physician”: a fully labeled 5-point scale (Very satisfied, Somewhat satisfied, Neutral, Somewhat dissatisfied, Very dissatisfied) and an endpoint-labeled scale (Very satisfied … Very dissatisfied), each with five radio buttons]
![Page 95: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/95.jpg)
95
Graphics Experiment 2.1
• Appearance
– Radio buttons
– Face symbols ( )
• ~1000 respondents
• Rated satisfaction with a journal
• 6-pt scale: very dissatisfied → very satisfied
Emde & Fuchs, 2011
![Page 96: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/96.jpg)
96
Graphics Experiment 2.2
• Faces were equivalent to radio buttons
• Respondents were more attentive when faces were present
– Time to respond
Emde & Fuchs, 2011
![Page 97: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/97.jpg)
97
Slider Usability Study
• Participants thought 1 was selected and did not move the slider. 0 was actually selected if they did not respond.
Strohl, Romano Bergstrom & Krulikowski, 2012
![Page 98: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/98.jpg)
98
Graphics Experiment 3.1
• Modified the visual design of survey items
– Increase novelty and interest on select items
– Other items were standard
• ~100 respondents in experimental condition
• ~1200 in control
• Questions about military perceptions and media usage
• Variety of question types
Gibson, Luchman & Romano Bergstrom, 2013
![Page 99: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/99.jpg)
99
Graphics Experiment 3.2
• No differences
Gibson, Luchman & Romano Bergstrom, 2013
![Page 100: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/100.jpg)
100
Graphics Experiment 3.3
• Slight differences:
  – Those with the enhanced version skipped more often
  – Those with the standard version responded more negatively
Gibson, Luchman & Romano Bergstrom, 2013
![Page 101: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/101.jpg)
101
Graphics Experiment 3.4
Gibson, Luchman & Romano Bergstrom, 2013
• Slight differences:
  – Those with the enhanced version skipped more often
![Page 102: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/102.jpg)
102
Graphics Experiment 3.5
• No major differences
Gibson, Luchman & Romano Bergstrom, 2013
![Page 103: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/103.jpg)
103
Graphics Considerations
• Mixed results
• "Ad blindness"
• Internet speed and download time
• Unintended meaning
![Page 104: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/104.jpg)
104
Graphics Considerations
![Page 105: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/105.jpg)
105
Graphics Considerations
![Page 106: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/106.jpg)
106
Graphics Considerations
![Page 107: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/107.jpg)
107
Web Survey Design
• Paging vs. Scrolling
• Navigation
• Scrolling vs. Double-Banked
• Edits and Input Fields
• Checkboxes and Radio Buttons
• Instructions and Help
• Graphics
• Emphasizing Text
• White Space
• Authentication
• Progress Indicators
• Consistency
![Page 108: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/108.jpg)
108
Emphasizing Text
• Font
  – Never underline plain text
  – Never use red for plain text
  – Use bold and italics sparingly
![Page 109: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/109.jpg)
109
Emphasizing Text
![Page 110: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/110.jpg)
110
Emphasizing Text
![Page 111: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/111.jpg)
111
Emphasizing Text
• Hypertext
  – Use meaningful words and phrases
  – Be specific
  – Avoid "more" and "click here"
![Page 112: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/112.jpg)
112
Web Survey Design
• Paging vs. Scrolling
• Navigation
• Scrolling vs. Double-Banked
• Edits and Input Fields
• Checkboxes and Radio Buttons
• Instructions and Help
• Graphics
• Emphasizing Text
• White Space
• Authentication
• Progress Indicators
• Consistency
![Page 113: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/113.jpg)
113
White Space
• White space on a page
• Differentiates sections
• Don't overdo it
![Page 114: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/114.jpg)
114
White Space
![Page 115: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/115.jpg)
115
Web Survey Design
• Paging vs. Scrolling
• Navigation
• Scrolling vs. Double-Banked
• Edits and Input Fields
• Checkboxes and Radio Buttons
• Instructions and Help
• Graphics
• Emphasizing Text
• White Space
• Authentication
• Progress Indicators
• Consistency
![Page 116: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/116.jpg)
116
Authentication
• Ensures respondent is the selected person
• Prevents entry by those not selected
• Prevents multiple entries by selected respondent
![Page 117: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/117.jpg)
117
Authentication
• Passive
  – ID and password embedded in URL
• Active
  – E-mail entry
  – ID and password entry
• Avoid ambiguous passwords (Couper et al., 2001)
  – E.g., those containing 1, l, 0, o
• Security concerns can be an issue
• Don't make it more difficult than it needs to be
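The ambiguous-character advice can be applied directly when generating respondent credentials. A minimal sketch in Python (the alphabet and password length are illustrative assumptions, not from the slides):

```python
import secrets

# Alphabet excludes the characters that are easy to confuse when
# printed on a mailed invitation: 0/O/o, 1/l/I (Couper et al., 2001).
SAFE_CHARS = "abcdefghijkmnpqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ23456789"

def make_password(length: int = 8) -> str:
    """Generate a respondent password with no ambiguous characters."""
    return "".join(secrets.choice(SAFE_CHARS) for _ in range(length))

pw = make_password()
assert len(pw) == 8
assert not set(pw) & set("01lIoO")  # no ambiguous characters
```

Using `secrets` rather than `random` keeps the credentials unpredictable, which matters for the passive (ID-in-URL) approach.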
![Page 118: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/118.jpg)
118
Authentication
![Page 119: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/119.jpg)
119
Web Survey Design
• Paging vs. Scrolling
• Navigation
• Scrolling vs. Double-Banked
• Edits and Input Fields
• Checkboxes and Radio Buttons
• Instructions and Help
• Graphics
• Emphasizing Text
• White Space
• Authentication
• Progress Indicators
• Consistency
![Page 120: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/120.jpg)
120
Progress Indicators
• Reduce breakoffs
• Reduce burden by displaying length of survey
• Enhance motivation and visual feedback
• Not needed in a scrolling design
• Little evidence of benefit
Couper et al., 2001; Crawford et al., 2001; Conrad et al., 2003, 2005; Sakshaug & Crawford, 2009
![Page 121: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/121.jpg)
121
Progress Indicators: At the bottom
![Page 122: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/122.jpg)
122
Progress Indicators: At the top
![Page 123: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/123.jpg)
123
Progress Indicators: Mobile
![Page 124: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/124.jpg)
124
Progress Indicators
• They should provide meaning
Strohl, Romano Bergstrom & Krulikowski, 2012
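One way to make an indicator meaningful is to base it on the questions the respondent will actually see, given their skip path, rather than on the full questionnaire. A hedged sketch (the function and its inputs are illustrative, not from the slides):

```python
def progress(answered: int, remaining_on_path: int) -> int:
    """Percent complete, counting only questions on the respondent's skip path."""
    total = answered + remaining_on_path
    return round(100 * answered / total) if total else 100

# A respondent routed past a 10-question module sees truthful progress:
assert progress(5, 5) == 50    # halfway through their own path
assert progress(5, 15) == 25   # same answers, longer path ahead
```

A bar driven this way can still jump when skip logic changes the path, so some designs smooth or cap the displayed change per page.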
![Page 125: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/125.jpg)
125
Web Survey Design
• Paging vs. Scrolling
• Navigation
• Scrolling vs. Double-Banked
• Edits and Input Fields
• Checkboxes and Radio Buttons
• Instructions and Help
• Graphics
• Emphasizing Text
• White Space
• Authentication
• Progress Indicators
• Consistency
![Page 126: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/126.jpg)
126
Consistency
• Predictable
  – User can anticipate what the system will do
• Dependable
  – System fulfills user's expectations
• Habit-forming
  – System encourages behavior
• Transferable
  – Habits in one context can transfer to another
• Natural
  – Consistent with user's knowledge
![Page 127: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/127.jpg)
127
Inconsistency
![Page 128: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/128.jpg)
128
Inconsistency
![Page 129: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/129.jpg)
129
Inconsistency
Strohl, Romano Bergstrom & Krulikowski, 2012
![Page 130: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/130.jpg)
Questions and Discussion
![Page 131: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/131.jpg)
Assessing Your Survey
The views expressed on statistical or methodological issues are those of the presenters and not necessarily those of the U.S. Census Bureau.
![Page 132: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/132.jpg)
Assessing Your Survey
Paradata
• Background
• Uses of Paradata by mode
• Paradata issues

Usability
• Usability vs. User Experience
• Why, When, What?
• Methods
  – Focus Groups, In-Depth Interviews
  – Ethnographic Observations, Diary Studies
  – Usability & Cognitive Testing
• Lab, Remote, In-the-Field
• Obstacles
![Page 133: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/133.jpg)
Paradata
![Page 134: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/134.jpg)
Types of Data
• Survey Data – collected information from R's
• Metadata – data that describes the survey
  – Codebook
  – Description of the project/survey
• Paradata – data about the process of answering the survey, at the R level
• Auxiliary/Administrative Data – not collected directly, but acquired from external sources
![Page 135: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/135.jpg)
Paradata
• Term coined by Mick Couper
  – Originally described data that were by-products of computer-assisted interviewing
  – Expanded to include data from other self-administered modes
• Main uses:
  – Adaptive / responsive design
  – Nonresponse adjustment
  – Measurement error identification
![Page 136: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/136.jpg)
Total Survey Error Framework
Groves et al. 2004; Groves & Lyberg 2010
![Page 137: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/137.jpg)
TSE Framework & Paradata
Kreuter, 2012
![Page 138: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/138.jpg)
Adaptive / Responsive Design
• Create process indicators
• Real-time monitoring (charts & "dashboards")
• Adjust resources during data collection to achieve higher response rate and/or cost savings
• Goal:
  – Achieve high response rates in a cost-effective way
  – Introduce methods to recruit uncooperative – and possibly different – sample members (reducing nonresponse bias)
![Page 139: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/139.jpg)
Nonresponse Adjustment
• Decreasing response rates have encouraged researchers to look at other sources of information to learn about nonrespondents:
  – Doorstep interactions
  – Interviewer observations
  – Contact history data
![Page 140: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/140.jpg)
Contact History Instrument (CHI)
• CHI developed by the U.S. Census Bureau (Bates, 2003)
• Interviewers take time after each attempt (refusal or non-contact) to answer questions in the CHI
• Use CHI information to create models (e.g., heat maps) to identify optimal contact times
• Typically a quick set of questions to answer
• European Social Survey uses a standard contact form (Stoop et al., 2003)
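The optimal-contact-time models mentioned above can start from a simple cross-tabulation of contact attempts. An illustrative sketch (the record format and toy data are assumptions):

```python
from collections import Counter

# One record per attempt, e.g. derived from CHI: (day, hour, contacted?)
attempts = [
    ("Sat", 10, True), ("Sat", 10, True), ("Tue", 14, False),
    ("Tue", 19, True), ("Sat", 10, False), ("Tue", 14, False),
]

# Contact rate per (day, hour) slot = successes / attempts.
hits = Counter((d, h) for d, h, ok in attempts if ok)
tries = Counter((d, h) for d, h, _ in attempts)
rates = {slot: hits[slot] / n for slot, n in tries.items()}

# The best slot in this toy data is Tuesday evening (1 of 1 contacted).
best = max(rates, key=rates.get)
assert best == ("Tue", 19)
```

Real contact-history models condition on much more (case history, interviewer, prior outcomes), but they reduce to this kind of rate-by-slot summary at their core.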
![Page 141: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/141.jpg)
Contact History Instrument (CHI)
U.S. Census Bureau CHI
![Page 142: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/142.jpg)
Paradata
• Background information about Paradata• Uses of Paradata by mode• Paradata issues
![Page 143: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/143.jpg)
Uses of Paradata by Mode
• CAPI
• CATI
• Web
• Mail
• Post-hoc
![Page 144: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/144.jpg)
Uses of Paradata - CAPI
• Information collected can include:
  – Interviewer time spent calling sampled households
  – Time driving to sample areas
  – Time conversing with household members
  – Interview time
  – GPS coordinates (tablets/mobile devices)
• Information can be used to:
  – Inform cost-quality decisions (Kreuter, 2009)
  – Develop cost per contact
  – Predict the likelihood of response by using interviewer observations of the response unit (Groves & Couper, 1998)
  – Monitor interviewers and identify any falsification
![Page 145: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/145.jpg)
Uses of Paradata - CATI
• Information collected can include:
  – Call transaction history (record of each attempt)
  – Contact rates
  – Sequence of contact attempts & contact rates
• Information can be used to:
  – Optimize callback times
  – Monitor interviewers
  – Inform a responsive design
![Page 146: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/146.jpg)
Uses of Paradata - Web
• Server-side vs. client-side
• Information collected can include:
  – Device information (e.g., browser type, operating system, screen resolution, detection of JavaScript or Flash)
  – Questionnaire navigation information
Callegaro, 2012
![Page 147: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/147.jpg)
Web Paradata - Server-side
• Page requests or “visits” to a web page from the web server
• Identify device information and monitor survey completion
![Page 148: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/148.jpg)
Web Paradata - Server-side cont.
• Typology of response behaviors in web surveys:
  1. Complete responders
  2. Unit non-responders
  3. Answering drop-outs
  4. Lurkers
  5. Lurking drop-outs
  6. Item non-responders
  7. Item non-responding drop-outs
Bosnjak, 2001
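Bosnjak's typology can be operationalized from server-side paradata. The sketch below is an illustrative simplification (the inputs and decision rules are my assumptions, not Bosnjak's exact definitions), classifying a case from pages viewed and items answered:

```python
def classify(pages_viewed: int, total_pages: int,
             items_answered: int, items_viewed: int) -> str:
    """Rough mapping of viewing/answering behavior to Bosnjak's (2001) types."""
    if pages_viewed == 0:
        return "unit non-responder"
    dropped_out = pages_viewed < total_pages
    if items_answered == 0:
        # Viewed questions but answered none: a "lurker"
        return "lurking drop-out" if dropped_out else "lurker"
    if items_answered == items_viewed:
        return "answering drop-out" if dropped_out else "complete responder"
    return ("item non-responding drop-out" if dropped_out
            else "item non-responder")

assert classify(0, 10, 0, 0) == "unit non-responder"
assert classify(10, 10, 30, 30) == "complete responder"
assert classify(10, 10, 0, 30) == "lurker"
```

In practice the counts would come from server logs (page requests) joined with the response data itself.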
![Page 149: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/149.jpg)
Web Paradata – Client-Side
• Collected on the R's computer
• Logs each "meaningful" action
• Heerwegh (2003) developed code/guidance for client-side paradata collected using JavaScript:
  – Clicking on a radio button
  – Clicking and selecting a response option in a drop-down box
  – Clicking a check box (checking/unchecking)
  – Writing text in an input field
  – Clicking a hyperlink
  – Submitting the page
![Page 150: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/150.jpg)
Web Paradata – Client-Side cont.
• Stern (2008) used Heerwegh's paradata techniques to identify:
  – Whether R's changed answers, and in what direction
  – The order in which questions are answered when more than one is displayed on the screen
  – Response latencies – the time elapsed between when the screen loaded on the R's computer and when an answer was submitted
• Heerwegh (2003) found that the longer the response time, the greater the probability of changing answers and of an incorrect response
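Client-side event logs of the kind Heerwegh describes make both measures easy to derive after collection. A minimal sketch (the event-record format is an assumption for illustration):

```python
from collections import Counter

# Each event: (seconds since page load, action, question id, value)
events = [
    (0.0,  "load",        None, None),
    (4.2,  "click_radio", "q1", "Agree"),
    (9.8,  "click_radio", "q1", "Disagree"),  # answer changed
    (12.1, "submit",      None, None),
]

# Response latency: page load to page submission (Stern, 2008).
latency = next(t for t, action, *_ in events if action == "submit")

# Answer changes per question: radio clicks beyond the first on an item.
clicks = Counter(q for _, action, q, _ in events if action == "click_radio")
changes = {q: n - 1 for q, n in clicks.items() if n > 1}

assert latency == 12.1
assert changes == {"q1": 1}
```

The same log supports order-of-answering analysis: sort the first click per question by timestamp and compare against the on-screen order.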
![Page 151: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/151.jpg)
Browser Information / Operating System Information
• Programmers use this information to ensure they are developing the optimal design
• Desktop, laptop, smartphone, tablet, or other device• Sood (2011) found a correlation between browser
type and survey breakoff & number of missing items– Completion rates for older browsers were lower– Using browser type as a proxy for age of device and
possible connection speed– Older browsers were more likely to display survey
incorrectly; possible explanation for higher drop-out rates
![Page 152: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/152.jpg)
JavaScript & Flash
• Helps to understand what the R can see and do in a survey
• JavaScript adds functionality such as question validations, auto-calculations, and interactive help
  – 2% or less of computer users have JavaScript disabled (Zakas, 2010)
• Flash is used for question types such as drag & drop or slide-bar questions
  – Without Flash installed, R's may not see the question
![Page 153: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/153.jpg)
Flash Question Example
![Page 154: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/154.jpg)
Questionnaire Navigation Paradata
• Mouse clicks/coordinates
  – Captured with JavaScript
  – Excessive movements can indicate:
    • An issue with the question
    • Potential for lower-quality data
• Changing answers
  – Can indicate potential confusion with a question
  – Paradata can capture answers that were erased
  – Changes are more frequent for opinion questions than factual questions
Stieger & Reips, 2010
![Page 155: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/155.jpg)
Questionnaire Navigation Paradata cont.
• Order of answering
  – When multiple questions are displayed on a screen
  – Can indicate how respondents read the questions
• Movement through the questionnaire (forward and back)
  – Unusual patterns can indicate confusion and a possible issue with the questionnaire (e.g., poor question order)
![Page 156: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/156.jpg)
Questionnaire Navigation Paradata cont.
• Number of prompts/error messages/data validation messages
• Quality Index (Haraldsen, 2005)
• Goal is to decrease number of activated errors by improving the visual design and clarity of the questions
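Counting activated edit messages per item, relative to completed questionnaires, gives a simple indicator in this spirit (the slide does not give Haraldsen's exact formula; this per-item ratio is an illustrative stand-in):

```python
def error_rates(errors_by_item: dict, completes: int) -> dict:
    """Edit-message activations per completed questionnaire, worst items first."""
    rates = {item: n / completes for item, n in errors_by_item.items()}
    return dict(sorted(rates.items(), key=lambda kv: -kv[1]))

# Hypothetical counts from a collection period with 600 completes:
rates = error_rates({"date_of_birth": 300, "income": 120}, completes=600)
assert list(rates) == ["date_of_birth", "income"]
assert rates["date_of_birth"] == 0.5  # an edit fires on half of all completes
```

Ranking items this way points redesign effort at the questions whose wording or visual design triggers the most validation failures.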
![Page 157: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/157.jpg)
Questionnaire Navigation Paradata cont.
• Clicks on non-question links
  – Help, FAQs, etc.
  – Indicate when and where Rs use help or other information built into the survey and displayed as a link
• Last question answered before dropping out
  – Helps determine whether the data collected can be classified as complete, partial, or breakoff
  – Used for response rate computation
  – Peytchev (2009) analyzed breakoff by question type:
    • Open-ended questions increased breakoff chances by 2.5x; long questions by 3x; slider bars by 5x; introductory screens by 2.6x
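Classifying a case from the last item answered can be sketched as follows (the cutoffs are illustrative placeholders, not the AAPOR definitions; real studies set them against the set of applicable questions):

```python
def case_status(last_item_index: int, total_items: int) -> str:
    """Classify a web case by how far the respondent got before stopping."""
    frac = last_item_index / total_items
    if frac >= 0.95:   # illustrative cutoff
        return "complete"
    if frac >= 0.5:    # illustrative cutoff
        return "partial"
    return "breakoff"

assert case_status(100, 100) == "complete"
assert case_status(60, 100) == "partial"
assert case_status(10, 100) == "breakoff"
```

Joining this status with the item where the breakoff occurred is what enables analyses like Peytchev's by question type.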
![Page 158: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/158.jpg)
Questionnaire Navigation Paradata cont.
• Time per screen / time latency, related to:
  – Attitude strength
  – Response uncertainty
  – Response error
• Examples
  – Heerwegh (2003): R's with weaker attitudes take more time answering survey questions than R's with stronger attitudes
  – Yan and Tourangeau (2008): higher-educated R's respond faster than lower-educated R's; younger R's respond faster than older R's
![Page 159: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/159.jpg)
Uses of Paradata – Call Centers
• Self-administered (mail or electronic) surveys
• Call transaction history software
  – Incoming calls
    • Date and time: useful for hiring, staffing, and workflow decisions
    • Purpose of the call
      – Content issue: useful for identifying problematic questions or support information
      – Technical issue: useful for identifying usability issues or system problems
    • Call outcome: type of assistance provided
![Page 160: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/160.jpg)
Paradata
• Background information about Paradata• Uses of Paradata by mode• Paradata issues
![Page 161: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/161.jpg)
Paradata Issues
• Reliability of data collected• Costs• Privacy and Ethical Issues
![Page 162: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/162.jpg)
Reliability of data collected
• Interviewers can erroneously record housing unit characteristics, misjudge features about respondents & fail to record a contact attempt
• Web surveys can fail to load properly, and client-side paradata fails to be captured
• Recordings of interviewers can be unusable (e.g., background noise, loose microphones)
Casas-Cordero, 2010; Sinibaldi, 2010; West, 2010
![Page 163: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/163.jpg)
Paradata costs
• Data storage – very large files• Instrument performance• Development within systems• Analysis
![Page 164: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/164.jpg)
Privacy and Ethical Issues
• IP addresses along with e-mail address or other information can be used to identify a respondent
• This information needs to be protected
![Page 165: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/165.jpg)
Paradata Activity
• Should the respondent be informed that the organization is capturing paradata?
• If so, how should that be communicated?
![Page 166: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/166.jpg)
Privacy and Ethical Issues cont.
• Singer & Couper asked members of the Dutch Longitudinal Internet Studies for the Social Sciences (LISS) panel, at the end of the survey, if they could collect paradata – 38.4% agreed
• Asked before the survey – 63.4% agreed
• Evidence that asking permission to use paradata might make R's less willing to participate in a survey
Couper & Singer, 2011
![Page 167: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/167.jpg)
Privacy and Ethical Issues cont.
• Reasons for failing to inform R's about paradata or get their consent:
  – The concept of paradata is unfamiliar and difficult for R's to grasp
  – R's associate it with the activities of advertisers, hackers, or phishers
  – Asking for consent gives it more salience
  – It is difficult to convey the benefits of paradata to the R
![Page 168: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/168.jpg)
Questions and Discussion
![Page 169: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/169.jpg)
Usability Assessment
![Page 170: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/170.jpg)
Usability Assessment
• Usability vs. User Experience
• Why, When, What?
• Methods
  – Focus Groups, In-Depth Interviews
  – Ethnographic Observations, Diary Studies
  – Usability and Cognitive Testing
• Lab, Remote, In-the-Field
• Obstacles
![Page 171: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/171.jpg)
Background Knowledge
• What does usability mean to you?
• Have you been involved in usability research?
• How is "user experience" different from "usability"?
![Page 172: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/172.jpg)
Usability Assessment
• Usability vs. User Experience
• Why, When, What?
• Methods
  – Focus Groups, In-Depth Interviews
  – Ethnographic Observations, Diary Studies
  – Usability and Cognitive Testing
• Lab, Remote, In-the-Field
• Obstacles
![Page 173: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/173.jpg)
Usability vs. User Experience
• Usability: "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use" (ISO 9241-11; see also Usability.gov)
• User experience includes emotions, needs, and perceptions
![Page 174: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/174.jpg)
Understanding Users
• Whitney's 5 E's of Usability
• Peter's User Experience Honeycomb

The 5 Es to Understanding Users (W. Quesenbery): http://www.wqusability.com/articles/getting-started.html
User Experience Design (P. Morville): http://semanticstudios.com/publications/semantics/000029.php
![Page 175: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/175.jpg)
User Experience
![Page 176: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/176.jpg)
![Page 177: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/177.jpg)
![Page 178: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/178.jpg)
Measuring the UX
• How does it work for the end user?
• What does the user expect?
• How does it make the user feel?
• What are the user's story and habits?
• What are the user's needs?
![Page 179: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/179.jpg)
What people do on the Web
Krug, S. Don’t Make Me Think
![Page 180: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/180.jpg)
Usability Assessment
• Usability vs. User Experience
• Why, When, What?
• Methods
  – Focus Groups, In-Depth Interviews
  – Ethnographic Observations, Diary Studies
  – Usability and Cognitive Testing
• Lab, Remote, In-the-Field
• Obstacles
![Page 181: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/181.jpg)
Why is Testing Important?
• Put it in the hands of the users.
• Things may seem straightforward to you but maybe not to your users.
![Page 182: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/182.jpg)
Why is Testing Important?
• Put it in the hands of the users.
• Things may seem straightforward to you but maybe not to your users.
![Page 183: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/183.jpg)
Why is Testing Important?
![Page 184: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/184.jpg)
Why is Testing Important?
• Put it in the hands of the users.
• Things may seem straightforward to you but maybe not to your users.
• You might have overlooked something big!
![Page 185: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/185.jpg)
When to test
• Test with users at every stage: concept → prototype → final product
![Page 186: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/186.jpg)
What can be tested?
• Existing surveys
• Low-fidelity prototypes
  – Paper mockups or mockups on computer
  – Basic idea is there, but not functionality or graphical look
• High-fidelity prototypes
  – As close as possible to the final interface in look and feel
![Page 187: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/187.jpg)
Usability Assessment
• Usability vs. User Experience
• Why, When, What?
• Methods
  • Focus Groups, In-Depth Interviews
  • Ethnographic Observations, Diary Studies
  • Usability and Cognitive Testing
  • Lab, Remote, In-the-Field
• Obstacles
![Page 188: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/188.jpg)
Methods to Understand Users
[Diagram: each method paired with what it assesses.]
• Linguistic Analysis: assess interactional motivations and goals
• Usability Testing: ensure users can use products efficiently and with satisfaction
• Cognitive Testing: ensure content is understood as intended
• User Experience Research: assess emotions, perceptions, and reactions
• Surveys: randomly sample the population of interest
• Ethnographic Observation: understand interactions in natural environment
• Focus Groups and In-Depth Interviews: discuss users’ perceptions and reactions
![Page 189: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/189.jpg)
Focus Groups
• Structured script
• Moderator discusses the survey with actual or typical users:
  – Actual usage of the survey
  – Workflow beyond the survey
  – Expectations and opinions
  – Desire for new features and functionality
• Benefit: participants stimulate each other’s conversation; risk: “group think”
![Page 190: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/190.jpg)
In-Depth Interviews
• Structured or unstructured
• Talk one-on-one with users, in person or remotely:
  – Actual usage of the survey
  – Workflow beyond the survey
  – Expectations and opinions
  – Desire for new features and functionality
![Page 192: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/192.jpg)
Ethnographic Observations
• Observe users at home, at the office, or any place that is “real-world.”
• Observer is embedded in the user’s culture.
• Allows conversation & activity to evolve naturally, with minimum interference.
• Observe settings and artifacts (other real-world objects).
• Focused on context and meaning making.
![Page 193: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/193.jpg)
Diaries/Journals
• Users are given a journal or a web site to complete on a regular basis (often daily).
• They record how and when they used the survey, what they did, and what their perceptions were.
  – User-defined data
  – Feedback/responses develop and change over time
  – Insight into how technology is used “on-the-go”
• There is often a daily set of structured questions and/or free-form comments.
![Page 196: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/196.jpg)
Usability Testing
• Participants respond to survey items
• Assess interface flow and design:
  • Understanding
  • Confusion
  • Expectations
• Ensure intricate skip and response patterns work as intended
• Can test final product or early prototypes
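Skip patterns are a common source of silent failure, so routing logic is worth checking programmatically as well as in testing sessions. A minimal sketch, assuming invented question IDs, a hypothetical rule table, and a simple sequential default:

```python
# Hypothetical skip logic: rules map a question to {answer: next question};
# anything not covered by a rule falls through to the next question in order.
skip_rules = {
    "q1": {"no": "q4"},  # e.g., q1 = "Do you use the internet?"; 'no' skips q2/q3
}
order = ["q1", "q2", "q3", "q4"]

def next_question(current, answer):
    """Return the next question ID for a given answer, or None at the end."""
    jump = skip_rules.get(current, {}).get(answer)
    if jump:
        return jump
    i = order.index(current)
    return order[i + 1] if i + 1 < len(order) else None
```

Usability testing can then walk every branch: a respondent answering "no" at q1 must land on q4, never on q2.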
![Page 197: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/197.jpg)
Cognitive Testing
• Participants respond to survey items
• Assess text:
  • Confusion
  • Understanding
  • Thought process
• Ensure questions are understood as intended and the resulting data are valid
• Proper formatting is not necessary.
![Page 198: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/198.jpg)
Usability vs. Cognitive Testing
Usability Testing Metrics
• Accuracy
  • In completing item/survey
  • Number/severity of errors
• Efficiency
  • Time to complete item/survey
  • Path to complete item/survey
• Satisfaction
  • Item-based
  • Survey-based
• Verbalizations

Cognitive Testing Metrics
• Accuracy
  • Of interpretations
• Verbalizations
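The usability metrics above can be computed mechanically from session logs. A minimal sketch with made-up session records; the field names are assumptions for illustration, not a standard logging format:

```python
# Hypothetical per-participant task records: completion, seconds, error count.
sessions = [
    {"completed": True,  "seconds": 42.0, "errors": 0},
    {"completed": True,  "seconds": 55.5, "errors": 2},
    {"completed": False, "seconds": 90.0, "errors": 3},
]

def accuracy(sessions):
    """Accuracy: share of participants who completed the task."""
    return sum(s["completed"] for s in sessions) / len(sessions)

def mean_time_on_task(sessions, completed_only=True):
    """Efficiency: mean seconds on task, often restricted to successes."""
    times = [s["seconds"] for s in sessions
             if s["completed"] or not completed_only]
    return sum(times) / len(times)

def error_rate(sessions):
    """Mean number of errors per participant."""
    return sum(s["errors"] for s in sessions) / len(sessions)
```

Satisfaction and verbalizations, by contrast, come from questionnaires and transcripts rather than logs.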
![Page 199: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/199.jpg)
Moderating Techniques

Concurrent Think Aloud (CTA)
• Pros: Understand participants’ thoughts as they occur and as they attempt to work through issues they encounter; elicit real-time feedback and emotional responses
• Cons: Can interfere with usability metrics, such as accuracy and time on task

Retrospective Think Aloud (RTA)
• Pros: Does not interfere with usability metrics
• Cons: Overall session length increases; difficulty remembering thoughts from up to an hour before = poor data

Concurrent Probing (CP)
• Pros: Understand participants’ thoughts as they attempt to work through a task
• Cons: Interferes with the natural thought process and progression that participants would make on their own, if uninterrupted

Retrospective Probing (RP)
• Pros: Does not interfere with usability metrics
• Cons: Difficulty remembering = poor data
Romano Bergstrom, Moderating Usability Tests: http://www.usability.gov/articles/2013/04/moderating-usability-tests.html
![Page 200: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/200.jpg)
Choosing a Moderating Technique
• Can the participant work completely alone?
• Will you need time-on-task and accuracy data?
• Are the tasks multi-layered and/or require concentration?
• Will you be conducting eye tracking?
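These questions can be folded into a rough decision helper. The mapping below is one plausible reading of the pros/cons above (concurrent techniques interfere with timing, accuracy, and gaze data; probing interrupts layered tasks), not a fixed rule:

```python
# Illustrative chooser among the four moderating techniques; the branch
# order reflects one reasonable prioritization, not an official procedure.
def choose_technique(works_alone, needs_timing_accuracy,
                     complex_tasks, eye_tracking):
    """Return 'CTA', 'RTA', 'CP', or 'RP'."""
    if needs_timing_accuracy or eye_tracking:
        # Concurrent talking/probing contaminates time, accuracy, and gaze.
        return "RTA" if works_alone else "RP"
    if complex_tasks:
        # Mid-task probes interrupt layered work; think-aloud is lighter.
        return "CTA"
    return "CP"
```

In practice moderators mix techniques within a session; this only makes the tradeoffs explicit.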
![Page 201: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/201.jpg)
Tweaking vs. Redesign
Tweaking
• Less work
• Small changes occur quickly.
• Small changes are likely to happen.

Redesign
• Lots of work after much has already been invested
• May break something else
• A lot of people
• A lot of meetings
![Page 203: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/203.jpg)
Lab vs. Remote vs. In the Field

Laboratory
• Controlled environment
• All participants have the same experience
• Record and communicate from the control room
• Observers watch from the control room and provide additional probes (via moderator) in real time
• Incorporate physiological measures (e.g., eye tracking, EDA)
• No travel costs

Remote
• Participants in their natural environments (e.g., home, work)
• Use video chat (moderated sessions) or online programs (unmoderated)
• Conduct many sessions quickly
• Recruit participants in many locations (e.g., states, countries)

In the Field
• Participants tend to be more comfortable in their natural environments
• Recruit hard-to-reach populations (e.g., children, doctors)
• Moderator travels to various locations
• Bring equipment (e.g., eye tracker)
• Natural observations
![Page 204: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/204.jpg)
Lab-Based Usability Testing
[Photos: observation area for clients; we maneuver the cameras, record, and communicate through microphones and speakers from the control room so we do not interfere; live streaming close-up of the participant’s screen; participant in the testing room; large screens display material during focus groups.]
Fors Marsh Group UX Lab
![Page 205: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/205.jpg)
Eye Tracking
• Desktop• Mobile• Paper
Fors Marsh Group UX Lab
![Page 206: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/206.jpg)
Remote Moderated Testing
Participant working on the survey from her home in another state
Moderator working from the office
Observer taking notes, unseen by the participant
Fors Marsh Group UX Lab
![Page 207: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/207.jpg)
Field Studies
Participant is in her natural environment, completing tasks on a site she normally uses for work
Researcher goes to participant’s workplace to conduct session. She observes and takes notes
Participant uses books from her natural environment to complete tasks on the website
![Page 209: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/209.jpg)
Obstacles to Testing
• “There is no time.”
  – Start early in the development process.
  – One morning a month with 3 users (Krug)
  – 12 people in 3 days (Anderson Riemer)
  – 12 people in 2 days (Lebson & Romano Bergstrom)
• “I can’t find representative users.”
  – Everyone is important.
  – Travel
  – Remote testing
• “We don’t have a lab.”
  – You can test anywhere.
![Page 210: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/210.jpg)
Final Thoughts
• Test across devices.
  • “User experience is an ecosystem.”
• Test across demographics.
  • Older adults perform differently than younger adults.
• Start early.
Kornacki, 2013, The Long Tail of UX
![Page 211: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/211.jpg)
Questions & Discussion
![Page 212: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/212.jpg)
Quality of Mixed Modes
The views expressed on statistical or methodological issues are those of the presenters and not necessarily those of the U.S. Census Bureau.
![Page 213: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/213.jpg)
213
Quality of Mixed Modes
• Mixed Mode Surveys• Response Rates• Mode Choice
![Page 214: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/214.jpg)
214
Mixed Mode Surveys
• Definition: any combination of survey data collection methods/modes
• Mixed vs. Multi vs. Multiple Modes
• Survey organization goals:
  – Identify the optimal data collection procedure (for the research question)
  – Reduce Total Survey Error
  – Stay within time/budget constraints
![Page 215: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/215.jpg)
215
Mixed Mode Designs
• Sequential
  – Different modes for different phases of interaction (initial contact, data collection, follow-up)
  – Different modes used in sequence during data collection (e.g., a panel survey that begins in one mode and moves to another)
• Concurrent
  – Different modes implemented at the same time

de Leeuw & Hox, 2008
![Page 216: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/216.jpg)
216
Why Do Mixed Mode?
• Cost savings
• Improve timeliness
• Reduce Total Survey Error
  – Coverage error
  – Nonresponse error
  – Measurement error
![Page 217: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/217.jpg)
217
Mixed Modes – Cost Savings
• Mixed mode designs give an opportunity to compensate for the weaknesses of each individual mode in a cost effective way (de Leeuw, 2005)
• Dillman 2009 Internet, Mail, and Mixed-Mode Surveys book:– Organizations often start with lower cost mode
and move to more expensive one• In the past: start with paper then do CATI or in person
nonresponse follow-up (NRFU)• Current: start with Internet then paper NRFU
![Page 218: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/218.jpg)
218
Mixed Modes – Cost Savings cont.
• Examples: • U.S. Current Population Survey (CPS) – panel survey
– Initially does in-person interview and collects a telephone number
– Subsequent calls made via CATI to reduce cost• U.S. American Community Survey
– Phase 1: mail– Phase 2: CATI NRFU– Phase 3: face-to-face with a subsample of remaining
nonrespondents
![Page 219: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/219.jpg)
219
Mixed Mode - Timeliness
• Collect responses more quickly• Examples:
– Current Employment Statistics (CES) offers 5 modes (Fax, Web, Touch-tone Data Entry, Electronic Data Interchange, & CATI) to facilitate timely monthly reporting
![Page 221: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/221.jpg)
Total Survey Error Framework
Groves et al. 2004; Groves & Lyberg 2010
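The framework is commonly summarized as a mean-squared-error decomposition of a survey estimate; sketched below in the usual notation, where the grouping of coverage, nonresponse, and measurement biases into one net bias term is an illustrative simplification:

```latex
\mathrm{MSE}(\hat{\theta})
  = \underbrace{\left(B_{\mathrm{cov}} + B_{\mathrm{nr}} + B_{\mathrm{meas}}\right)^{2}}_{\text{squared net bias}}
  \;+\; \underbrace{\mathrm{Var}(\hat{\theta})}_{\text{sampling and other variance}}
```

Mixed-mode designs aim to shrink the bias terms (better coverage, higher response) without inflating measurement differences between modes.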
![Page 222: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/222.jpg)
222
Mixed Mode - Coverage Error
• Definition: the proportion of the target population not covered by the survey frame, and the difference in the survey statistic between those covered and not covered
• Telephone penetration
  – Landlines vs. mobile phones
• Web penetration
Groves, 1989
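That two-part definition yields a simple product: coverage bias in a mean is the uncovered share of the population times the covered/not-covered difference. A tiny numeric sketch with invented numbers:

```python
# Groves-style coverage bias of a frame mean: the share of the target
# population the frame misses, times the covered-vs-uncovered difference.
def coverage_bias(prop_not_covered, mean_covered, mean_not_covered):
    return prop_not_covered * (mean_covered - mean_not_covered)

# Illustration: if 20% of the population lacks internet access and the
# covered and uncovered groups differ by 5 points on the statistic,
# a web-only estimate carries 1 point of coverage bias.
bias = coverage_bias(0.20, 70.0, 65.0)
```

The formula makes clear why even high coverage can still bias an estimate when the uncovered group differs sharply on the variable of interest.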
![Page 223: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/223.jpg)
223
Coverage – Telephone
• 88% of U.S. adults have a cell phone.
• Young adults and those with lower education and lower household income are more likely to use mobile devices as their main source of internet access.
Smith, 2011; Zickuhr & Smith, 2012
![Page 224: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/224.jpg)
224
Coverage - Internet
• Coverage is limited
  – No systematic directory of addresses
• 1 in 5 in the U.S. do not use the Internet.
Zickuhr & Smith, 2012
![Page 225: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/225.jpg)
225
![Page 226: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/226.jpg)
226
World Internet Statistics
![Page 227: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/227.jpg)
227
Coverage – Web cont.

• Indications that Internet adoption rates have leveled off
• Demographics least likely to have Internet:
  – Older
  – Less education
  – Lower household income
• Main reason for not going online: not relevant
Pew, 2012
![Page 228: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/228.jpg)
228
European Union – Characteristics of Internet Users
![Page 229: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/229.jpg)
229
Coverage - Web cont.
• R’s reporting via Internet can be different from those reporting via other modes
  – Internet vs. mail (Diment & Garrett-Jones, 2007; Zhang, 2000)
• R’s cannot be contacted through the Internet because e-mail addresses lack the structure needed for generating random samples (Dillman, 2009)
![Page 230: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/230.jpg)
230
Mixed Mode – Nonresponse Error
• Definition: the inability to obtain complete measurements on the survey sample (Groves, 1998)
  – Unit nonresponse: the entire sampling unit fails to respond
  – Item nonresponse: R’s fail to respond to some questions
• Concern: respondents and nonrespondents may differ on the variable of interest
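The two nonresponse types can be tallied directly from raw records. A short sketch; the record layout and the use of `None` for missing values are assumptions for illustration:

```python
# Invented response records: None at the record level = unit nonresponse
# (no interview at all); None at the field level = item nonresponse.
responses = [
    {"q1": 5, "q2": 3},     # complete interview
    {"q1": 4, "q2": None},  # item nonresponse on q2
    None,                   # unit nonresponse
    {"q1": 2, "q2": 1},
]

# Unit nonresponse rate: share of sampled units with no interview.
unit_nr_rate = sum(r is None for r in responses) / len(responses)

# Item nonresponse rate for q2, among units that did respond.
answered = [r for r in responses if r is not None]
item_nr_rate_q2 = sum(r["q2"] is None for r in answered) / len(answered)
```

Note the denominators differ: unit nonresponse is computed over the whole sample, item nonresponse only over responding units.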
![Page 231: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/231.jpg)
231
Mixed Mode – Nonresponse cont.
• Overall response rates have been declining• Mixed mode is a strategy used to increase
overall response rates while keeping costs low • Some R’s have a mode preference (Miller,
2009)
![Page 232: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/232.jpg)
232
Mixed Mode – Nonresponse cont.
• Some evidence of a reduction in overall response rates when multiple modes are offered concurrently in population/household surveys
  – Examples: Delivery Sequence File Study (Dillman, 2009); Arbitron Radio Diaries (Gentry, 2008); American Community Survey (Griffin et al., 2001); Survey of Doctorate Recipients (Grigorian & Hoffer, 2008)
• Could assign R’s to modes based on known preferences
![Page 233: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/233.jpg)
233
Mixed Mode – Measurement Error
• Definition: “observational errors” arising from the interviewer, instrument, mode of communication, or respondent (Groves, 1998)
• Providing mixed modes can help reduce the measurement error associated with collecting sensitive information
  – Example: an interviewer begins a face-to-face interview (CAPI), then lets the R continue on the computer with headphones (ACASI) to answer sensitive questions
![Page 234: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/234.jpg)
234
Mode Comparison Research
• Meta-analysis of articles:
  – Harder to get mail responses
  – Overall nonresponse rates and item nonresponse rates are higher in self-administered questionnaires, BUT answered items are of high quality
  – Small difference in quality between face-to-face and telephone (CATI) surveys
  – Face-to-face surveys had slightly lower item nonresponse rates
de Leeuw, 1992
![Page 235: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/235.jpg)
235
Mode Comparison Research cont.
• Question order and response order effects are less likely in self-administered than telephone surveys
  – R’s more likely to choose the last option heard in CATI (recency effect)
  – R’s more likely to choose the first option seen in self-administered (primacy effect)
  – Mixed results on item-nonresponse rates in Web
de Leeuw, 1992; 2008
![Page 236: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/236.jpg)
236
Mode Comparison Research cont.
• Some indication that Internet surveys are more like mail than telephone surveys
  – Visual presentation vs. auditory
• Conflicting evidence on item nonresponse (some studies show higher item nonresponse on Internet vs. mail, while others show no difference)
• Some evidence of better quality data
  – Fewer post-data-collection edits needed for electronic vs. mail responses

Sweet & Ramos, 1995; Griffin et al., 2001
![Page 237: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/237.jpg)
237
Disadvantages of Mixed Mode
• Mode effects
  – Concerns about measurement error due to the mode
    • R’s providing different answers to the same questions displayed in different modes
  – Different contact/cooperation rates because of the different strategies used to contact R’s
![Page 238: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/238.jpg)
238
Disadvantages of Mixed Mode
• Decrease in overall response rates
  – Why: to assess the effects of offering a mail/web mix
  – What: meta-analysis of 16 studies that compared mixed-mode surveys with mail and web options
  – Results: empirical evidence that offering mail and Web concurrently resulted in a significant reduction in response rates
Medway & Fulton, 2012
![Page 239: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/239.jpg)
239
Response Rates in Mixed Mode Surveys

• Why is this happening?
  – Potential Hypothesis #1: R’s are dissuaded from responding because they have to make a choice
    • Offering multiple modes increases burden (Dhar, 1997)
    • While judging the pros/cons of each mode, neither appears attractive (Schwartz, 2004)
  – Potential Hypothesis #2: R’s choose Web but never actually do it
    • If R’s receive the invitation in the mail, there is a break in their response process (Griffin et al., 2001)
  – Potential Hypothesis #3: R’s that choose Web may get frustrated with the instrument and abandon the whole process (Couper, 2000)
![Page 240: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/240.jpg)
240
Overall Goals
• Find the optimal mix given the research questions and the population of interest
• Other factors to consider:
  – Reducing Total Survey Error (TSE)
  – Budget
  – Time
  – Ethics and/or privacy issues
Biemer & Lyberg, 2003
![Page 241: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/241.jpg)
241
Quality of Mixed Modes
• Mixed Mode Surveys• Response Rates• Mode Choice
![Page 242: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/242.jpg)
242
Technique for Increasing Response Rates to Web in Multi-Mode Surveys
• “Pushing” R’s to the web
  – Send R’s an invitation to report via Web
  – No paper questionnaire in the initial mailing
  – The invitation contains information for obtaining the alternative version (typically paper)
  – Paper versions are mailed out during follow-up to capture responses from those who do not have web access or do not want to respond via web
  – “Responding to Mode in Hand” principle
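One way to think about a push strategy is as a data-driven contact schedule: web-only contacts first, a paper NRFU mailing later. The phases and day offsets below are illustrative assumptions, not taken from any of the studies that follow:

```python
# Hypothetical push-to-web contact schedule; "day" is days since the
# initial mailing, and every value here is invented for illustration.
push_to_web = [
    {"day": 0,  "contact": "mail invitation",   "mode_offered": "web"},
    {"day": 14, "contact": "reminder letter",   "mode_offered": "web"},
    {"day": 28, "contact": "NRFU mailing with paper questionnaire",
     "mode_offered": "paper (web still open)"},
]

def next_contact(days_since_start, schedule=push_to_web):
    """Return the first phase scheduled strictly after the given day,
    or None once the schedule is exhausted."""
    for phase in schedule:
        if phase["day"] > days_since_start:
            return phase
    return None
```

Encoding the schedule as data makes it easy to compare "web-intense" and "paper-intense" variants by swapping the phase list.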
![Page 243: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/243.jpg)
243
“Pushing” Examples
• Example 1: Lewiston-Clarkson Quality of Life Survey
• Example 2: 2007 Stockholm County Council Public Health Survey
• Example 3: American Community Survey• Example 4: 2011 Economic Census Re-file
Survey
![Page 244: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/244.jpg)
244
Pushing Example 1 – Lewiston-Clarkson Quality of Life Survey
• Goals: increase web response rates in a paper/web mixed-mode survey and identify mode preferences
• Method:
  – November 2007 – January 2008
  – Random sample of 1,800 residential addresses
  – Four treatment groups
  – To assess mode preference, this question appeared at the end of the survey:
    • “If you could choose how to answer surveys like this, which one of the following ways of answering would you prefer?”
    • Answer options: web, mail, or telephone
Miller, O’Neill, Dillman, 2009
![Page 245: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/245.jpg)
245
Pushing Example 1 – cont.
• Group A: Mail preference with web option
  – Materials suggested mail was preferred but web was acceptable
• Group B: Mail preference
  – Web option not mentioned until first follow-up
• Group C: Web preference
  – Mail option not mentioned until first follow-up
• Group D: Equal preference
![Page 246: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/246.jpg)
246
Pushing Example 1 – cont.
• Results
![Page 247: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/247.jpg)
247
Pushing Example 1 – cont.
“If you could choose how to answer surveys like this, which one of the following ways of answering would you prefer?”
![Page 248: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/248.jpg)
248
Pushing Example 1 – cont.
![Page 249: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/249.jpg)
249
Pushing Example 1 – cont.
Group C = Web Preference Group
![Page 250: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/250.jpg)
250
Pushing Example 1 – cont.
• Who can be pushed to the Web?
![Page 251: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/251.jpg)
251
Pushing Example 2 – 2007 Stockholm County Council Public Health Survey
• Goal: increase web response rates in a paper/web mixed-mode survey
• Method:
  – Sample of 50,000 (62% response rate)
  – 4 treatments that varied in “web intensity”
  – Plus a “standard” option: paper questionnaire and web login data
Holmberg, Lorenc, Werner, 2008
![Page 252: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/252.jpg)
252
Pushing Example 2 – Cont.
• Overall response rates
S = Standard; A1 = very paper “intense”; A2 = paper “intense”; A3 = web “intense”; A4 = very web “intense”
![Page 253: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/253.jpg)
253
Pushing Example 2 – Cont.
• Web responses
S = Standard; A1 = very paper “intense”; A2 = paper “intense”; A3 = web “intense”; A4 = very web “intense”
![Page 254: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/254.jpg)
254
Pushing Example 3 – American Community Survey
• Goals:
  – Increase web response rates in a paper/web mixed-mode survey
  – Identify ideal timing for non-response follow-up
  – Evaluate advertisement of the web choice
Tancreto et. al., 2012
![Page 255: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/255.jpg)
255
Pushing Example 3 – Cont.
• Method:
  – Push: 3 versus 2 weeks until paper questionnaire
  – Choice: prominent and subtle
  – Mail only (control)
  – Tested among segments of the US population:
    • Targeted
    • Not targeted
![Page 256: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/256.jpg)
256
Response Rates by Mode in Targeted Areas
[Chart: Internet and mail response rates for the Ctrl (mail only), Prominent Choice, Subtle Choice, Push (3 weeks), and Push (2 weeks) conditions.]
![Page 257: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/257.jpg)
257
Response Rates by Mode in Not Targeted Areas
[Chart: Internet and mail response rates for the Ctrl (mail only), Prominent Choice, Subtle Choice, Push (3 weeks), and Push (2 weeks) conditions.]
![Page 258: Introduction to Web Survey Usability Design and Testing](https://reader038.fdocuments.in/reader038/viewer/2022102815/554a5520b4c90531228b4d44/html5/thumbnails/258.jpg)
258
Example 4: Economic Census Refile
• Goal: increase Internet response rates in a paper/Internet establishment survey during non-response follow-up
• Method: 29,000 delinquent respondents were split between two NRFU mailings:
  – Letter-only mailing mentioning the Internet option
  – Letter and paper form mailing
Marquette, 2012
Example 4: Cont.
Quality of Mixed Modes
• Mixed Mode Surveys
• Response Rates
• Mode Choice
Why Do Respondents Choose Their Mode?
• Concern about “mode paralysis”
– When two options are offered, respondents must choose between tradeoffs
– This choice makes each option less appealing
– Offering a choice between Web and mail may therefore discourage response
Miller and Dillman, 2011
Mode Choice
• American Community Survey – Attitudes and Behavior Study
• Goals:
– Measure why respondents chose the Internet or paper mode during the American Community Survey Internet Test
– Determine why there was nonresponse and whether it was linked to the multi-mode offer
Nichols, 2012
Mode Choice – cont.
• CATI questionnaire was developed in consultation with survey methodologists
• Areas of interest included:
– Salience of the mailing materials and messages
– Knowledge of the mode choice
– Consideration of reporting by Internet
– Mode preference
Mode Choice – cont.
• 100 completed interviews per notification strategy (push
Mode Choice – cont.
• Results
– Choice/Push Internet respondents opted for perceived benefits – easy, convenient, fast
– Push respondents noted that not having the paper form motivated them to use the Internet to report
– Push respondents who reported via mail did so because they did not have Internet access or had computer problems
– The placement of the message about the Internet option was reasonable to respondents
– Respondents often recalled that the letter accompanying the mailing package mentioned the mode choice
Mode Choice – cont.
• Results cont.
– Several nonrespondents cited not knowing that a paper option was available as a reason for not reporting
– Very few nonrespondents attempted to access the online form
– Salience of the mailing package and being busy were main reasons for nonresponse
– The ABS study did NOT find “mode paralysis”
Questions and Discussion
Amy Anderson Riemer
US Census Bureau

Jennifer Romano Bergstrom
Fors Marsh Group