© Fred Van Bennekom, Great Brook, 2001 Page 1
The Art & Science of Designing a Survey Instrument
Frederick C. Van Bennekom, Dr.B.A.
Great Brook Consulting
Enhancing Organizational Improvement Through Customer Feedback
421 Main Street Bolton, MA 01740 (978) 779-6312 (877) GreatBr Toll Free
[email protected] www.greatbrook.com
Art versus Science
A continuum from Art to Science:
– None – ignorance
– Know a good outcome from a bad one
– Know the characteristics of a quality outcome
– Prioritization of these quality characteristics
– Know the variables that lead to these outcomes
– Know the impact of individual variables
– Know the interaction effects among variables
– Able to measure the variables
– Able to control the process to achieve quality outcomes – repeatedly & consistently
Art and Science?
The Art
– Crafting the wording of the questions
The Science
– The design process
– Design of scales
– Not to mention the survey administration
[Figure: Surveying a Sample is More Efficient Than a Full Census – a small random sample (r) drawn from the larger population (x).]
What is a Survey?
Design & Administer Instrument to a Sample
Generalize Results to the Population
Instrument Validity + Administration Accuracy = Reliability
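The sample-versus-census point can be sketched numerically. A minimal Python sketch, not from the presentation – the population of ratings and the sample size of 400 are invented – showing that a modest random sample estimates the population mean with a quantifiable margin of error:

```python
import math
import random
import statistics

def margin_of_error(sample, z=1.96):
    """Approximate 95% margin of error for a sample mean."""
    return z * statistics.stdev(sample) / math.sqrt(len(sample))

# Hypothetical population: 10,000 satisfaction ratings on a 1-5 scale.
random.seed(7)
population = [random.choice([1, 2, 3, 4, 5]) for _ in range(10_000)]

# Administering the instrument to 400 respondents instead of all
# 10,000 still yields a tight estimate of the population mean.
sample = random.sample(population, 400)
estimate = statistics.mean(sample)
moe = margin_of_error(sample)
```

The sample estimate typically falls within the computed margin of error of the true population mean, which is the sense in which results from a well-drawn sample generalize to the population.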
The Role of Surveying in Achieving Loyal Behavior
Value added chain
Design... Replication… Sales… Service...
Good service delivery
Continuous improvement
Problem solicitation
Effective problem handling
A Rigorous Instrument Design Process
1) Interview management
2) Identify questions to ask
3) Draft survey instrument
4) Review by project team
5) Revision iterations
6) Conduct pilot
7) Redraft & finalize instrument
Science
Identifying Questions to Ask
Attributes of Service Delivery – that need to be understood & tracked
Attitudinal Outcomes – driven by perceptions of service delivery performance
Demographic Segmentations – for data analysis
Identify the Attributes
Draw a Service Blueprint
– Process flow diagram
– Highlights the Moments of Truth – where we “touch” the customer
Review complaint data; conduct focus groups, interviews, or other critical incident studies
– What are the critical service attributes?
– What are customers’ major concerns?
Art
Classify the Attributes – Service Quality Dimensions
– Reliability: Delivering on promises
– Responsiveness: Being willing to help
– Assurance: Inspiring trust and confidence
– Empathy: Treating customers as individuals
– Tangibles: Representing the service physically
• A useful framework for thinking about the instrument design – and analyzing the data
Science
Instrument Design – Attitudinal Outcomes
Perception of service delivery leads to attitudes
– Likelihood of repurchase
– Willingness to provide a reference
– Overall satisfaction
– Any others?
Uses of attitudinal measures
– Summary measure for the survey
– Dependent variable for regression tests
– Link attributes to true behavioral outcomes if data are available
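Using the attitudinal outcome as a dependent variable can be sketched as a simple driver analysis. A minimal Python sketch – the ten respondents' ratings are invented for illustration – fitting an ordinary least-squares line from one attribute rating to overall satisfaction:

```python
# Hypothetical ratings from ten respondents, both on a 1-5 scale.
responsiveness = [2, 3, 3, 4, 4, 5, 5, 5, 1, 2]   # service attribute
overall_sat    = [2, 3, 4, 4, 5, 5, 4, 5, 1, 3]   # attitudinal outcome

def least_squares(x, y):
    """Slope and intercept of the ordinary least-squares fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# A clearly positive slope suggests the attribute drives the outcome.
slope, intercept = least_squares(responsiveness, overall_sat)
```

With a real survey, each attribute from the service blueprint would be tested this way (or jointly, in a multiple regression) against the summary attitudinal measure.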
Drafting the Survey Instrument
Overall form of the survey instrument
Issues with the construction of the questions
Selecting a scale
Question formats
Question sequencing
Overall Form of the Instrument
Pre-Administration Announcement Letter
– Letter or email from a senior executive
Survey Introduction
– Set the mental state & be consistent
– Define critical terms
Initiation – First Questions
– Engage the respondent
– Get the respondent thinking
Instructions
– Even if it seems silly...
With each contact, motivate the respondent!
Science
The Need for Instructions – (Need we say more?)
Overall Form of the Instrument
Grouping strategies
– By topic, by scale, by chronology
Conditional branching – “Skip & Hit”
Routine – a response rut
– A long series of questions that read in a rhythm
– Respondents just give the same answer
Fatigue – caused by a long list of choices
– Leads to choosing the first or last item
– Especially important for telephone surveys
Art
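The “Skip & Hit” conditional branching mentioned above can be expressed as a small routing table. A Python sketch – the question ids are hypothetical, not from the presentation:

```python
# Hypothetical flow: a filter question routes respondents past
# questions that do not apply to them ("skip & hit").
SEQUENCE = ["used_fax_back", "fax_back_rating", "overall_satisfaction"]

# Branching table: (question, answer) -> next question.  Anything
# not listed falls through to the next question in sequence.
BRANCHES = {
    ("used_fax_back", "no"): "overall_satisfaction",  # skip the rating
}

def next_question(current_id, answer):
    """Return the id of the next question, or None at the end."""
    if (current_id, answer) in BRANCHES:
        return BRANCHES[(current_id, answer)]
    i = SEQUENCE.index(current_id)
    return SEQUENCE[i + 1] if i + 1 < len(SEQUENCE) else None
```

Respondents who answer "no" to the filter question never see the rating question, avoiding both non-response and meaningless N/A answers.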
Issues with the Construction of Questions
Critical Criterion: Common Interpretation
• Otherwise... you’re asking the respondents different questions
3 Key Attributes: Focus, Brevity, Clarity
Control for:
• Instrumentation Bias
• Response Bias
Avoiding Instrumentation Bias
Bias Introduced by the Survey Instrument
– Clearly stated criteria for evaluation
– Question must apply to the respondent
– Examples should not lead the response
– Reasonable recall expectations
– Unambiguous word choice
– Ask one question at a time
– Don’t ask leading or loaded questions
Scale Anchoring Options
Fully Anchored:
1 Extremely Satisfied | 2 Satisfied | 3 Undecided | 4 Dissatisfied | 5 Extremely Dissatisfied
Is this an interval scale or just an ordinal scale?
Endpoint Anchored:
1 Extremely Satisfied | 2 | 3 | 4 | 5 Extremely Dissatisfied
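The interval-versus-ordinal question matters for analysis: an interval scale (equal distances between points) licenses the mean, while a purely ordinal scale only supports order statistics such as the median or mode. A small Python illustration, with invented responses:

```python
import statistics

# Hypothetical responses on a 1-5 satisfaction scale.
responses = [1, 2, 2, 4, 5, 5, 5]

# Treating the scale as interval assumes the "distance" between 1 and 2
# equals the distance between 4 and 5 -- only then is the mean meaningful.
interval_summary = statistics.mean(responses)    # assumes interval

# The median relies only on ordering, so it is safe for ordinal data.
ordinal_summary = statistics.median(responses)
```

If the scale is only ordinal, reporting "average satisfaction" quietly assumes a property the data may not have.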
How Should I Solicit a Response?
Unstructured
– Free-form or open-ended response
– “Please describe...”
– “Is there anything else...”
Structured
– Response on a pre-determined list or scale
– “Check all that apply...”
– “Please rate...”
Remember the objective of a survey: maximize information gained while minimizing respondent burden.
Question Formats & Types
Unstructured Format
Structured Formats
– Multiple Choice: Single Response, Multiple Response, Adjective Checklist
– Ordinal: Forced Ranking, Paired Comparison
– Ordinal Scales: Likert Type, Verbal Frequency, Comparative
– Interval Scales: Horizontal Numerical, Semantic Differential, “Stapel”
– Ratio Scale: Fixed Sum, Fractionation
Question Format: Unstructured
Free-Form or Open-Ended Response
Advantages
– Response not constrained to predetermined categories
– May uncover unexpected answers
Disadvantages
– Very long to complete (respondent burden, cost to administer)
– Textual data difficult to analyze and summarize
Question Format: Structured
Coded Response – Multiple Choice & Scaled Data
Advantages
– Clearer responses
– Easy to summarize & analyze
– Easy to administer
Disadvantages
– Limits responses
– May bias responses
– Requires more investment in question design
Interval Rating Scales – Elements
Instructions: Listed below are several statements. Please indicate your agreement with each by selecting a number from 1 to 5, where 1 represents Strongly Disagree and 5 represents Strongly Agree.
Question item: I was on hold for a short time
Scale: N/A 1 2 3 4 5
Anchors: Strongly Disagree (1) ... Strongly Agree (5)
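One practical consequence of offering the N/A choice above: N/A answers must be excluded before computing any scale statistic. A minimal Python sketch, with invented answers, where None stands in for N/A:

```python
import statistics

# Hypothetical raw answers to "I was on hold for a short time";
# None represents the N/A response choice.
raw = [5, 4, None, 3, 5, None, 2, 4]

# Exclude N/A before computing statistics; counting N/A as 0 (or any
# other number) would silently distort the scale.
rated = [r for r in raw if r is not None]
mean_rating = statistics.mean(rated)
na_count = raw.count(None)
```

Reporting the N/A count alongside the mean also flags questions that fail the applicability test: a high N/A rate suggests the question should sit behind a skip.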
Clearly Stated Criteria for Evaluation
Wrong: How would you rate the ability of the project team to define business requirements?
Right: Compared to other projects done for you, how would you rate the ability of the project team to define business requirements?
Applicability to Respondent
Wrong: How effective did you find the FAX-Back support system?
Right: If you used the FAX-Back support system, how effective did you find it?
Include a “not applicable” response choice. Multiple NAs may lead to non-response; use skip & hit.
Do Not Lead With Examples
Wrong: What aspect of our service is most critical to you, for example, the speed of response?
Right: What aspect of our service is most critical to you?
Most critical with an open-ended question format.
Reasonable Recall Expectations
Wrong: In your support calls over the past year, how many minutes was it before the phone was answered?
Right: During the past three months, has the time for a support representative to answer the phone been reasonable?
Unambiguous Wording
Wrong: In your last support call, was the response time reasonable?
Right: Consider your last request for support. How reasonable was the time from when you called until you spoke with a support representative?
A major source of construction flaws. Avoid jargon.
Examples of Ambiguous Phrasing
– The ability of the help desk to resolve problems on the first try
– The promptness with which you received the Service Engineer’s estimated time of arrival…
– Satisfaction with the functionality of the equipment
– Responsiveness of the Customer Support Personnel
– Have you received service of consistent quality?
– Was your call answered promptly?
Ask One Question at a Time
Wrong: Was the staff technically competent and courteous?
Right: Was the staff member who handled your issue technically competent?
Right: Was the staff member who handled your issue courteous?
Avoid Loaded & Leading Wording
Wrong: How did our interest in you, our customer, match your expectations?
Right: To what extent did our concern for you match your expectations?
Thanks for Attending
Any Questions?