CHS / Epi M218
Fall 2014
Page 1
COMMUNITY HEALTH SCIENCES/EPIDEMIOLOGY M218
Questionnaire Design and Administration
Course web site: http://ccle.ucla.edu
Day & Time: Mon & Wed 8-10 A.M.
Room: CHS 41-268
ID#: 840 108 200 (CHS); 844 110 200 (EPI)
Instructor: Linda B. Bourque
Office: 41-230 CHS
Office Hrs: Mon & Wed 10:00-11:30; sign up for appointments on the sheet outside the office.
TEXTBOOKS:
A. Required books available for purchase in the Health Sciences Bookstore:
1. LuAnn Aday, Llewellyn J. Cornelius, Designing and Conducting Health Surveys,
3rd edition, Jossey-Bass, 2006.
2. Linda Bourque and Eve Fielder, How to Conduct Telephone Surveys, The Survey
Kit, 2nd edition, Sage Publications, 2003.
3. Materials available on course website and other UCLA web sites.
B. Recommended books available for purchase in the Health Sciences Bookstore.
1. Linda Bourque and Eve Fielder, How to Conduct Self-Administered and Mail
Surveys, 2nd edition, The Survey Kit, Sage Publications, 2003.
2. Linda Bourque and Virginia Clark, Processing Data: The Survey Example, Sage
Publications, 1992.
3. Jean M. Converse and Stanley Presser, Survey Questions, Sage, 1986.
4. Orlando Behling and Kenneth S. Law, Translating Questionnaires and Other
Research Instruments, Problems and Solutions, Sage Publications, 2000.
C. Recommended books available in the UCLA libraries:
1. Arlene Fink, How to Ask Survey Questions, The Survey Kit, Sage Publications,
1995; 2nd edition, 2003.
2. Arlene Fink, How to Design Surveys, The Survey Kit, Sage Publications, 1995;
2nd edition, 2003.
4. Eleanor Singer and Stanley Presser, eds., Survey Research Methods, A Reader,
The University of Chicago Press, 1989.
5. Donald Dillman, Mail & Telephone Surveys, Wiley-Interscience, 1978.
6. Peter H. Rossi, James D. Wright, Andy B. Anderson, Handbook of Survey
Research, Academic Press, 1983.
7. Seymour Sudman & Norman M. Bradburn, Asking Questions, Jossey-Bass, 1982.
8. Robert M. Groves & Robert L. Kahn, Surveys by Telephone, Academic Press,
1979.
9. Norman M. Bradburn & Seymour Sudman, Polls & Surveys, Jossey-Bass, 1988.
10. Jean M. Converse, Survey Research in the United States, University of California
Press, 1987.
11. Hubert O'Gorman, ed., Surveying Social Life, Wesleyan University Press, 1988.
12. Herbert H. Hyman, Taking Society's Measure, Russell Sage Foundation, 1991.
13. Judith M. Tanur, ed., Questions About Questions, Russell Sage Foundation, 1992.
D. Supplementary Materials
All of the following articles are available on the class website at http://ccle.ucla.edu.
When you use information from articles, please remember that they are under copyright.
Articles on the Web Site:
1. Adua L, JS Sharp. Examining survey participation and response quality: The significance
of topic salience and incentives. Survey Methodology 2010; 36: 95-109.
2. The American Association for Public Opinion Research. Standard Definitions:
Final Dispositions of Case Codes and Outcome Rates for Surveys, 7th edition.
AAPOR, 2011.
3. Ansolabehere S, BF Schaffner. Residential mobility, family structure, and the cell-only
population. Public Opinion Quarterly 2010; 74:244-259.
4. Axinn WG, D Ghimire, NE Williams. Collecting survey data during armed conflict.
Journal of Official Statistics 2012; 28:153-171.
5. Baker R, JM Brick, NA Bates, M Battaglia, MP Couper, JA Dever, KJ Gile, R
Tourangeau. Summary Report of the AAPOR Task Force on Non-probability Sampling.
Journal of Survey Statistics and Methodology 2013; 1:90-143.
a. Valliant R. Comment, 105-110.
b. Rivers D. Comment, 111-117.
c. Crawford CAG. Comment, 118-123.
d. Terhanian G., Comment, 124-129.
e. Langer G. Comment, 130-136.
f. Baker R, JM Brick, NA Bates, M Battaglia, MP Couper, JA Dever, KJ Gile, R
Tourangeau. Rejoinder, 137-143.
6. Barón JD, RV Breunig, D Cobb-Clark, T Gørgens, A Sarbayeva. Does the effect of
incentive payments on survey response rates differ by income support history? Journal of
Official Statistics 2009; 25:483-507.
7. Barton, AH. Asking the Embarrassing Question. The Public Opinion Quarterly 22: 67-
68, 1958.
8. Bates N, MH Mulry. Using a geographic segmentation to understand, predict, and plan
for Census and survey mail nonresponse. Journal of Official Statistics 2011; 27: 601-618.
9. Bhopal, Raj & Liam Donaldson, “White, European, Western, Caucasian, or What?
Inappropriate Labeling in Research on Race, Ethnicity, and Health,” American Journal of
Public Health 88(9):1303-1307, 1998.
10. Binson, D., J.A. Canchola, J.A. Catania, “Random Selection in a National Telephone
Survey: A Comparison of the Kish, Next-Birthday, and Last-Birthday Methods,” Journal
of Official Statistics 16(1):53-59, 2000.
11. Bischoping, K., J. Dykema, “Toward a Social Psychological Programme for Improving
Focus Group Methods of Developing Questionnaires,” Journal of Official Statistics
15(4):495-516, 1999.
12. Blair, E.A., G.K. Ganesh, “Characteristics of Interval-based Estimates of
Autobiographical Frequencies,” Applied Cognitive Psychology 5:237-250, 1991.
13. Blumberg SJ, JV Luke, N Ganesh, ME Davern, MH Boudreaux. Wireless Substitution:
State-level estimates from the National Health Interview Survey, 2010-2011. National
Health Statistics Reports 61; October 12, 2012.
14. Bourque, L.B. “Coding.” In M.S. Lewis-Beck, A. Bryman, T.F. Liao, Editors, The Sage
Encyclopedia of Social Science Research Methods, Volume 1, Thousand Oaks, Ca: Sage
Publications, 2003, pp. 132-136.
15. Bourque, L.B. “Coding Frame.” In M.S. Lewis-Beck, A. Bryman, T.F. Liao, Editors, The
Sage Encyclopedia of Social Science Research Methods, Volume 1, Thousand Oaks, Ca:
Sage Publications, 2003, pp. 136-137.
16. Bourque, L.B. “Cross-Sectional Design.” In M.S. Lewis-Beck, A. Bryman, T.F. Liao,
Editors, The Sage Encyclopedia of Social Science Research Methods, Volume 1,
Thousand Oaks, Ca: Sage Publications, 2003, pp. 229-230.
17. Bourque, L.B. “Self-Administered Questionnaire.” In M.S. Lewis-Beck, A. Bryman, T.F.
Liao, Editors, The Sage Encyclopedia of Social Science Research Methods, Volume 3,
Thousand Oaks, Ca: Sage Publications, 2003, pp. 1012-1013.
18. Bourque, L.B. “Transformations.” In M.S. Lewis-Beck, A. Bryman, T.F. Liao, Editors,
The Sage Encyclopedia of Social Science Research Methods, Volume 3, Thousand Oaks,
CA: Sage Publications, 2003, pp. 1137-1138.
19. Bradburn, NM, The Seventh Morris Hansen Lecture on “The Future of Federal Statistics
in the Information Age,” with commentary by TerriAnn Lowenthal, Journal of Official
Statistics 15(3):351-372, 1999.
20. Bradburn, N.M. “Understanding the Question-Answer Process,” Survey Methodology
30:5-15, 2004.
21. Bradburn, N.M., L.J. Rips, S.K. Shevell, “Answering Autobiographical Questions: The
Impact of Memory and Inference on Surveys,” Science 236:157-161, 1987.
22. Brick, J.M., J. Waksberg, S. Keeter, “Using Data on Interruptions in Telephone Service
as Coverage Adjustments,” Survey Methodology 22(2):185-197, 1996.
23. Brick JM, PD Brick, S Dipko, S Presser, C Tucker, Y Yuan. Cell phone survey feasibility
in the U.S.: Sampling and calling cell numbers versus landline numbers. Public Opinion
Quarterly 2007; 71: 23-39.
24. Brick JM, WS Edwards, S Lee. Sampling telephone numbers and adults, interview
length, and weighting in the California Health Interview Survey cell phone pilot study.
Public Opinion Quarterly 2007; 71:793-813.
25. Brick JM, D Williams, JM Montaquila. Address-based sampling for subpopulation
surveys. Public Opinion Quarterly 2011; 75:409-428.
26. Caplow, T., H.M. Bahr, V.R.A. Call. “The Polls--Trends, The Middletown Replications:
75 Years of Change in Adolescent Attitudes, 1924-1999,” Public Opinion Quarterly
68:287-313, 2004.
27. Chang L, JA Krosnick. Comparing oral interviewing with self-administered computerized
questionnaires: An experiment. Public Opinion Quarterly 2010; 74: 154-167.
28. Chang L, JA Krosnick. National surveys via RDD telephone interviewing versus the
internet: Comparing sample representativeness and response quality. Public Opinion
Quarterly 2009; 73: 641-678.
29. Childs J, P Goerman. Bilingual questionnaire evaluation and development through mixed
pretesting methods: The case of the U.S. Census nonresponse followup instrument.
Journal of Official Statistics 2012; 26: 535-557.
30. Christian, L.M., D.A. Dillman. “The Influence of Graphical and Symbolic Language
Manipulations on Response to Self-Administered Questions,” Public Opinion Quarterly
68:57-80, 2004.
31. Conrad, FG, MF Schober. Promoting Uniform Question Understanding in Today’s and
Tomorrow’s Surveys, Journal of Official Statistics 21: 215-231, 2005.
32. Conrad FG, J Blair. Sources of error in cognitive interviews. Public Opinion Quarterly
2009; 73: 32-55.
33. Converse, Philip E. & Michael W. Traugott, “Assessing the Accuracy of Polls &
Surveys,” Science 234:1094-1098, November 28, 1986.
34. Couper, M.P., “Survey Introductions and Data Quality,” Public Opinion Quarterly
61:317-338, 1997.
35. Couper, Mick P., Johnny Blair & Timothy Triplett, “A Comparison of Mail & E-mail for
a Survey of Employees in U.S. Statistical Agencies,” Journal of Official Statistics
15(1):39-56, 1999.
36. Couper, Mick P., “Web Surveys: A Review of Issues and Approaches,” Public Opinion
Quarterly 64:464-494, 2000.
37. Couper MP, C Kennedy, FG Conrad, R Tourangeau. Designing input fields for non-
narrative open-ended responses in web surveys. Journal of Official Statistics 2011; 27:65-
85.
38. Couper, M.P., R. Tourangeau. “Picture This! Exploring Visual Effects in Web Surveys,”
Public Opinion Quarterly 68:255-266, 2004.
39. Couper MP, E Singer, FG Conrad, RM Groves. Experimental studies of disclosure risk,
disclosure harm, topic sensitivity, and survey participation. Journal of Official Statistics
2010; 26:287-300.
40. Couper MP, MB Ofstedal, S Lee. Encouraging record use for financial asset questions in
a web survey. Journal of Survey Statistics and Methodology 2013; 1:171-182.
41. Curtin, R, S Presser, E Singer. “Changes in Telephone Survey Nonresponse Over the Past
Quarter Century,” Public Opinion Quarterly 69:87-98, 2005.
42. de Leeuw, ED. “To Mix or Not to Mix Data Collection Modes in Surveys,” Journal of
Official Statistics 21: 233-255, 2005.
43. Dengler, R., H. Roberts, L. Rushton, “Lifestyle Surveys--The Complete Answer?”
Journal of Epidemiology and Community Health 51:46-51, 1997.
44. Dillman, DA, A Gertseva, T Mahon-Haft. “Achieving Usability in Establishment Surveys
Through the Application of Visual Design Principles,” Journal of Official Statistics 21:
183-214, 2005.
45. Driscoll J, N Lidow. Representative surveys in insecure environments: A case study of
Mogadishu, Somalia. Journal of Survey Statistics and Methodology 2014; 2:78-95.
46. Durrant GB, RM Groves, L Staetsky, F Steele. Effects of interviewer attitudes and
behaviors on refusal in household surveys. Public Opinion Quarterly 2010; 74:1-36.
47. Dykema, Jennifer, Nora Cate Schaeffer. “Events, Instruments, and Reporting Errors,”
American Sociological Review 65:619-629, 2000.
48. Elliott MN, WS Edwards, DJ Klein, A Heller. Differences by survey language and mode
among Chinese respondents to a CAHPS health plan survey. Public Opinion Quarterly
2012; 76:238-264.
49. Erosheva EA, TA White. Issues in survey measurement of chronic disability: An example
from the national long term care survey. Journal of Official Statistics 2010; 26:317-339.
50. Fitzgerald R, S Widdop, M Gray, D Collins. Identifying sources of error in cross-national
questionnaires: Application of an error source typology to cognitive interview data.
Journal of Official Statistics 2011; 27:569-599.
51. Frankenberg, E, NR Jones. “Self-Rated Health and Mortality: Does the Relationship
Extend to a Low Income Setting?” Journal of Health and Social Behavior 45: 441-452,
2004.
52. Fricker, S, M Galesic, R Tourangeau, T Yan. “An Experimental Comparison of Web and
Telephone Surveys,” Public Opinion Quarterly 69:370-392, 2005.
53. Fullilove, Mindy Thompson, “Comment: Abandoning 'Race' as a Variable in Public
Health Research--An Idea Whose Time Has Come,” American Journal of Public Health
88(9): 1297-1298, 1998.
54. Galesic M, M Bosnjak. Effects of questionnaire length on participation and indicators of
response quality in a web survey. Public Opinion Quarterly 2009; 73: 349-360.
55. Ganz, P., Hays, R.D., Kaplan, R.M., & Litwin, M.S. Measuring health-related quality of
life and other outcomes, Chapter 11, pp. 307-341. In G.G. Kominski & T.H. Rice (eds),
Changing the U.S. Health Care System: Key Issues in Health Services Policy and
Management, 4th edition, San Francisco, CA: Jossey-Bass, 2014.
56. Gardner, W., B.L. Wilcox, “Political Intervention in Scientific Peer Review,” American
Psychologist 48:972-983, 1993.
57. Gaziano, C. “Comparative Analysis of Within-Household Respondent Selection
Techniques,” Public Opinion Quarterly 69:124-157, 2005.
58. Groen JA. Sources of error in survey and administrative data: The importance of
reporting procedures. Journal of Official Statistics 2012; 28: 173-198.
59. Groves, R.M., M.P. Couper, “Contact-Level Influences on Cooperation in Face-to-Face
Surveys,” Journal of Official Statistics 12(1):63-83, 1996.
60. Groves RM, E Peytcheva. The impact of nonresponse rates on nonresponse bias: A meta-
analysis. Public Opinion Quarterly 2008; 72: 167-189.
61. Iannacchione VG. Research Synthesis: The changing role of address-based sampling in
survey research. Public Opinion Quarterly 2011; 75:556-575.
62. ICPSR, Guidelines for Effective Data Management Plans, no date.
63. Israel GD. Effects of answer space size on responses to open-ended questions in mail
surveys. Journal of Official Statistics 2010; 26: 271-285.
64. Iverson J. Metadata-Driven Survey Design. IASSIST Quarterly, Summer 2009.
65. Keeter S, C Miller, A Kohut, RM Groves, S Presser. Consequences of reducing
nonresponse in a national telephone survey. Public Opinion Quarterly 2000; 64:125-148.
66. Khanna D, E Krishnan, EM Dewitt, PP Khanna, B Spiegel, RD Hays. The future of
measuring patient-reported outcomes in rheumatology. Arthritis Care and Research
2011; 63:S486-S490.
67. Kornhauser, Arthur, and Paul B. Sheatsley, “Questionnaire Construction and Interview
Procedure,” Appendix B, in Claire Selltiz, Lawrence S. Wrightsman, Stuart W. Cook
(eds.), Research Methods in Social Relations, 3rd Edition, Holt, Rinehart and Winston,
1976, pp. 541-573.
68. Krenzke T, L Li, K Rust. Evaluating within household selection rules under a multi-stage
design. Survey Methodology 2010; 36:111-119.
69. Krosnick, Jon A., Allyson L. Holbrook, Matthew K. Berent, Richard T. Carson, W.
Michael Hanemann, Raymond J. Kopp, Robert Cameron Mitchell, Stanley Presser, Paul
A. Ruud, V. Kerry Smith, Windy R. Moody, Melanie C. Green, Michael Conaway, “The
Impact of ‘No Opinion’ Response Options on Data Quality, Non-Attitude Reduction or
an Invitation to Satisfice?” Public Opinion Quarterly 66:371-403, 2002.
70. Krosnick, Jon A., “Survey Research,” Annual Review of Psychology 50:537-67, 1999.
71. Krosnick JA, N Malhotra, U Mittal. Public misunderstanding of political facts: How
question wording affected estimates of partisan differences in birtherism. Public Opinion
Quarterly 2014; 78:147-165.
72. Lavin, Daniele, Douglas W. Maynard. “Standardization vs. Rapport: Respondent
Laughter and Interviewer Reaction During Telephone Surveys,” American Sociological
Review 66:453-479, 2001.
73. Lee S, HA Nguyen, M Jawad, J Kurata. Linguistic minorities in a health survey. Public
Opinion Quarterly 2008; 72:470-486.
74. Lee S, N Schwarz. Question context and priming meaning of health: Effect on differences
in self-rated health between Hispanics and Non-Hispanic Whites. American Journal of
Public Health 2014; 104:179-185.
a. Kawada T. Question context, ethnic difference, and self-rated health. Letter to the
editor. American Journal of Public Health 2014; 104:e3.
b. Lee S, N Schwarz. Lee and Schwarz respond. American Journal of Public Health
2014; 104:e3-e4.
75. Lind LH, MF Schober, FG Conrad, H Reichert. Why do survey respondents disclose
more when computers ask the questions? Public Opinion Quarterly 2013; 77:888-935.
76. Link MW, JW Lai. Cell-phone-only households and problems of differential nonresponse
using an address-based sampling design. Public Opinion Quarterly 2011; 75:613-635.
77. Link MW, J Murphy, MF Schober, TD Buskirk, JH Childs, CL Tesfaye. Mobile
Technologies for Conducting, Augmenting and Potentially Replacing Surveys: Report of
the AAPOR Task Force on Emerging Technologies in Public Opinion Research.
American Association for Public Opinion Research, April 25, 2014.
78. Lynn P. Alternative sequential mixed-mode designs: Effects on attrition rates, attrition
bias, and costs. Journal of Survey Statistics and Methodology 2013; 1:183-205.
79. Macera, Caroline, Sandra Ham, Deborah A. Jones, Dexter Kinsey, Barbara Ainsworth,
Linda J. Neff. “Limitations on the Use of a Single Screening Question to Measure
Sedentary Behavior,” American Journal of Public Health 91:2010-2012, 2001.
80. Martin, E., T.J. DeMaio, P.C. Campanelli, “Context Effects for Census Measures of Race
and Hispanic Origin,” Public Opinion Quarterly 54:551-566, 1990.
81. Maxwell SE, K Kelley, JR Rausch. Sample size planning for statistical power and
accuracy in parameter estimation. Annual Review of Psychology 2008; 59:537-563.
82. McGonagle KA, RF Schoeni, MP Couper. The Effects of a Between-Wave Incentive
Experiment on Contact Update and Production Outcomes in a Panel Study. Journal of
Official Statistics 2013; 29(2):261-276.
83. Messer BL, DA Dillman. Surveying the general public over the internet using address-
based sampling and mail contact procedures. Public Opinion Quarterly 2011; 75:429-
457.
84. Millar MM, DA Dillman. Improving response to web and mixed-mode surveys. Public
Opinion Quarterly 2011; 75:249-269.
85. Mohorko A, de Leeuw E, Hox J. Coverage bias in European telephone surveys:
Developments of landline and mobile phone coverage across countries and over time.
Survey Methods: Insights from the Field 2013; Retrieved from
http://surveyinsights.org/?p=828.
86. Mohorko A, de Leeuw E, Hox J. Internet coverage and coverage bias in Europe:
Developments across countries and over time. Journal of Official Statistics 2013; 29:609-
622.
87. Montaquila JM, JM Brick, D Williams, K Kim, D Han. A study of two-phase mail survey
data collection methods. Journal of Survey Statistics and Methodology 2013; 1:66-87.
88. Morrison RL, DA Dillman, LM Christian. Questionnaire design guidelines for
establishment surveys. Journal of Official Statistics 2010; 26:43-85.
89. Olsen, Jørn on behalf of the IEA European Questionnaire Group, “Epidemiology
Deserves Better Questionnaires,” International Journal of Epidemiology 27:935, 1998.
90. Olson K, RM Groves. An examination of within-person variation in response propensity
over the data collection field period. Journal of Official Statistics 2012; 28:29-51.
91. Peter J, PM Valkenburg. The impact of “forgiving” introductions on the reporting of
sensitive behavior in surveys: The role of social desirability response style and
developmental status. Public Opinion Quarterly 2011; 75:779-787.
92. Petrolia DR, S Bhattacharjee. Revisiting incentive effects: Evidence from a random-
sample mail survey on consumer preferences for fuel ethanol. Public Opinion Quarterly
2009; 73: 537-550.
93. Pew Research Center. Assessing the Representativeness of Public Opinion Surveys.
Tuesday, May 15, 2012. Author.
94. Peytchev A. Survey breakoff. Public Opinion Quarterly 2009; 73: 74-97.
95. Peytchev A. Breakoff and unit nonresponse across web surveys. Journal of Official
Statistics 2011; 27:33-47.
96. Peytchev A, RK Baxter, LR Carley-Baxter. Not all survey effort is equal: Reduction of
nonresponse bias and nonresponse error. Public Opinion Quarterly 2009; 73: 785-806.
97. Preisendörfer P, F Wolter. Who is telling the truth? A validation study on determinants
of response behavior in surveys. Public Opinion Quarterly 2014; 78:126-146.
98. Presser, S., M.P. Couper, J.T. Lessler, E. Martin, J. Martin, J.M. Rothgeb, E. Singer,
“Methods for Testing and Evaluating Survey Questions,” Public Opinion Quarterly
68:109-130, 2004.
99. Redline C. Clarifying categorical concepts in a web survey. Public Opinion Quarterly
2013; 77:89-105.
100. Rizzo, L., J. M. Brick, I. Park, “A Minimally Intrusive Method for Sampling
Persons in Random Digit Dial Surveys,” Public Opinion Quarterly 68:267-274, 2004.
101. Sayles H, RF Belli, E Serrano. Interviewer variance between event history
calendar and conventional questionnaire interviews. Public Opinion Quarterly 2010; 74:
140-153.
102. Scheuren F. What is a Survey?: American Statistical Association; 2004.
103. Schräpler JP, J Schupp, GG Wagner. Changing from PAPI to CAPI: Introducing
CAPI in a longitudinal study. Journal of Official Statistics 2010; 233-269.
104. Shaeffer, EM, JA Krosnick, GE Langer, DM Merkle. “Comparing the Quality of
Data Obtained by Minimally Balanced and Fully Balanced Attitude Questions,” Public
Opinion Quarterly 69: 417-428, 2005.
105. Sigelman, L, S.A. Tuck, JK Martin. “What’s In a Name? Preference for ‘Black’
Versus ‘African-American’ Among Americans of African Descent,” Public Opinion
Quarterly 69: 429-438, 2005.
106. Singer E, J Van Hoewyk, MP Maher. Experiments with incentives in telephone
surveys. Public Opinion Quarterly 2000; 64:171-188.
107. Singer E, editor. Special Issue: Nonresponse bias in household surveys. Public
Opinion Quarterly 2006; 70 (5).
a. Groves RM. Nonresponse rates and nonresponse bias in household surveys, 646-
675.
b. Abraham KG, A Maitland, SM Bianchi. Nonresponse in the American time use
survey: Who is missing from the data and how much does it matter? 676-703.
c. Johnson TP, YI Cho, RT Campbell, AL Holbrook. Using community-level
correlates to evaluate nonresponse. 704-719.
d. Groves RM, MP Couper, S Presser, E Singer, R Tourangeau, GP Acosta, L
Nelson. Experiments in producing nonresponse bias. 720-736.
e. Olson K. Survey participation, nonresponse bias, measurement error bias and total
bias. 737-758.
f. Keeter S, C Kennedy, M Dimock, J Best, P Craighill. Gauging the impact of
growing nonresponse on estimates from a national RDD telephone survey. 759-779.
g. Brick JM, S Dipko, S Presser, C Tucker, Y Yuan. Nonresponse bias in a dual
frame sample of cell and landline numbers. 780-793.
h. Link MW, AH Mokdad, D Kulp, A Hyon. Has the national do not call registry
helped or hurt state-level response rates? A time series analysis. 794-809.
108. Sikkel D, R Steenbergen, S Gras. Clicking vs. dragging: Different uses of the
mouse and their implications for online surveys. Public Opinion Quarterly 2014; 78:177-
190.
109. Skalland B, M Khare. Geographic inaccuracy of cell phone samples and the effect
on telephone survey bias, variance, and cost. Journal of Survey Statistics and
Methodology 2013; 1:45-65.
110. Small ML. How to conduct a mixed methods study: Recent trends in a rapidly
growing literature. Annual Review of Sociology 2011; 37:57-86.
111. Smyth JD, DA Dillman, LM Christian, M McBride. Open-ended questions in web
surveys: Can increasing the size of answer boxes and providing extra verbal instructions
improve response quality? Public Opinion Quarterly 2009: 73: 325-337.
112. Stevens, Gillian & David L. Featherman, “A Revised Socioeconomic Index of
Occupational Status,” Social Science Research 10:364-395, 1981.
113. Stevens, Gillian & Joo Hyun Cho, “Socioeconomic Indexes and the New 1980
Census Occupational Classification Scheme,” Social Science Research 14:142-168, 1985.
114. Suchman, L., B. Jordan, “Interactional Troubles in Face-to-Face Survey
Interviews,” Journal of the American Statistical Association 85(409):232-253, 1990, with
Commentary by Stephen E. Fienberg, Mary Grace Kovar and Patricia Royston, Emanuel
A. Schegloff, and Roger Tourangeau, and Rejoinder by Lucy Suchman and Brigitte
Jordan.
115. Tambor, E.S., G.A. Chase, R.R. Faden et al, “Improving Response Rates Through
Incentive and Follow-up: The Effect on a Survey of Physicians' Knowledge of Genetics,”
American Journal of Public Health 83:1599-1603, 1993.
116. Thompson KJ, BE Oliver. Response rates in business surveys: Going beyond the
usual performance measure. Journal of Official Statistics 2012;28:221-237.
117. Todorov, A., C. Kirchner, Bias in Proxies’ Reports of Disability: Data from the
National Health Interview Survey on Disability, American Journal of Public Health
90(8):1248-1253, 2000.
118. Toepoel V, M Das, A van Soest. Design of web questionnaires: The effect of
layout in rating scales. Journal of Official Statistics 2009; 25:509-528.
119. Tourangeau R, T Yan. Sensitive Questions in Surveys. Psychological Bulletin
2007; 133(5):859-883.
Tourangeau R, MP Couper, FG Conrad. Color, labels, and interpretive
heuristics for response scales. Public Opinion Quarterly 2007; 71: 91-112.
120. Tourangeau R, RM Groves, C Kennedy, T Yan. The presentation of a web survey,
nonresponse and measurement error among members of web panel. Journal of Official
Statistics 2009; 25:299-321.
121. Tourangeau R, MP Couper, FG Conrad. “Up means good”: The effect of screen
position on evaluative ratings in web surveys. Public Opinion Quarterly 2013; 77:69-88.
122. Tourangeau R, FG Conrad, MP Couper, C Ye. The effects of providing examples
in survey questions. Public Opinion Quarterly 2014; 78:100-125.
123. Tucker, C., J. M. Brick, B. Meekins, Household Telephone Service and Usage
Patterns in the United States in 2004: Implications for Telephone Samples. Public
Opinion Quarterly 2007; 71: 3-22.
124. van Tuinen HK. Innovative statistics to improve our notion of reality. Journal of
Official Statistics 2009; 25: 431-465.
125. Vercruyssen A, B van de Putte, IAL Stoop. Are they really too busy for survey
participation? The evolution of busyness and busyness claims in Flanders. Journal of
Official Statistics 2011; 27: 619-632.
126. Wang, J.J., P. Mitchell, W. Smith, “Vision and Low Self-Rated Health: The Blue
Mountains Eye Study,” Investigative Ophthalmology and Visual Science 41(1):49-54,
2000.
127. Weinberg DH. Management challenges of the 2010 U.S. Census. Journal of
Official Statistics 2012; 28: 199-220.
128. Willis, G.B., P. Royston, D. Bercini, “The Use of Verbal Report Methods in the
Development and Testing of Survey Questionnaires,” Applied Cognitive Psychology
5:251-267, 1991.
129. Yan T, R Curtin, M Jans. Trends in income nonresponse over two decades.
Journal of Official Statistics 2010; 26: 145-164.
130. Ye C, J Fulton, R Tourangeau. Research Synthesis: More positive or more
extreme? A meta-analysis of mode differences in response choice. Public Opinion
Quarterly 2011; 75:349-365.
131. Yeager DS, JA Krosnick, L Chang, HS Javitz, MS Levendusky, A Simpser, R
Wang. Comparing the accuracy of RDD telephone surveys and internet surveys conducted
with probability and non-probability samples.
132. Yeager DS, JA Krosnick. Does mentioning “some people” and “other people” in
an opinion question improve measurement quality? Public Opinion Quarterly 2012;
76:131-141.
Course Materials Available on Course Web Site
Information about Institutional Review Boards
1. OPRR Reports, Protection of Human Subjects, Title 45, Code of Federal
Regulations, Part 46, Revised June 18, 1991, Reprinted March 15, 1994.
2. Siegel, Judith, Linda Bourque, Example of Submission, Questions Raised by the
IRB and Responses, 2002.
Materials developed at the UCLA Institute for Social Science Research
Engelhart, Rita, “The Kish Selection Procedure”
Codebooks
Example of a Codebook, December 1, 2002.
Also on earthquake web site:
http://www.sscnet.ucla.edu/issr/da/earthquake/erthqkstudies2.index.htm
The construction of scales and indices
1. Inkelas, Moira, Laurie A. Loux, Linda B. Bourque, Mel Widawski, Loc H.
Nguyen, “Dimensionality and Reliability of the Civilian Mississippi Scale for
PTSD in a Postearthquake Community,” Journal of Traumatic Stress 13, 149-167,
2000.
2. McKennel, A.C., Chapter 7, “Attitude Scale Construction,” in C.A.
O'Muircheataugh & C. Payne (eds.), Exploring Data Structures, Vol. 1, The
Analysis of Survey Data, John Wiley & Sons, 1977, pp. 183-220.
3. Bourque, L.B, H. Shen. “Psychometric Characteristics of Spanish and English
Versions of the Civilian Mississippi Scale,” Journal of Traumatic Stress 2005;
18:719-728.
Materials related to the administration and analysis of data collected with
questionnaires
1. Questionnaire for Assignment #1
2. Record for Non-respondents
3. Enlistment Letters
4. Call Record
5. Formatting Questionnaires
6. Income Questions
7. Calculating Response Rates
8. Examples of Grids
9. Codebook and Specifications
10. Constructing a Code Frame
11. Scale Construction Example
Questionnaires, Specifications and Codebooks are also available at:
http://www.sscnet.ucla.edu/issr/da/earthquake/erthqkstudies2.index.htm and
http://www.ph.ucla.edu/sciprc/3_projects.htm under Disasters.
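As a taste of the "Calculating Response Rates" handout above, the sketch below computes AAPOR's Response Rate 1 (RR1), the most conservative rate defined in the Standard Definitions (item 2 in the article list): completed interviews divided by all eligible and possibly-eligible cases. This is my own illustration, not material from the handout, and the disposition counts are invented.

```python
# Illustration of AAPOR Response Rate 1 (RR1): completed interviews
# divided by all eligible and possibly-eligible cases.
def rr1(complete, partial, refusal, noncontact, other,
        unknown_household, unknown_other):
    """RR1 = I / ((I + P) + (R + NC + O) + (UH + UO))."""
    denominator = (complete + partial                    # interviews obtained
                   + refusal + noncontact + other        # eligible, no interview
                   + unknown_household + unknown_other)  # unknown eligibility
    return complete / denominator

# Hypothetical final disposition counts for a telephone survey:
rate = rr1(complete=600, partial=50, refusal=200, noncontact=100,
           other=10, unknown_household=30, unknown_other=10)
print(f"RR1 = {rate:.1%}")  # 600 of 1,000 cases -> 60.0%
```

Because cases of unknown eligibility stay in the denominator, RR1 is a lower bound; the other AAPOR rates (RR2 through RR6) relax these assumptions in various ways.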
The books and articles listed above will give you a background on and an introduction to
surveys and questionnaires. Each book has different strengths and weaknesses. They should be
considered resources. The required books are available in the Health Sciences Bookstore. The
Recommended books are available in the various UCLA libraries. The decision as to which
books you buy and the order in which you read them is yours. I recommend reading all the
material you buy or check out as soon as possible. It will then be available to you as a resource
as we go through the quarter.
The articles on the web site provide you with examples of some of the journals where
research about questionnaires, their administration, and surveys can be found. They also provide
information about some of the “cutting-edge” issues of concern. Currently, a major focus is on
response rates, particularly for telephone interviews, and web-based administration of
questionnaires.
COURSE REQUIREMENTS AND GRADING
Subjects and Site:
Each student selects a topic on which s/he wants to design questionnaires, and the site(s)
at which s/he will conduct the interviews needed in pretesting the questionnaire. You are free to
select any site and any sample of persons with the following exceptions:
1. All respondents MUST be at least 18 years of age.
2. DO NOT collect information from respondents such as name, address, and phone
number which would enable them to be identified.
3. DO NOT interview persons in the Center for Health Sciences or persons
connected with the Center for Health Sciences.
4. DO NOT interview your fellow students, your roommates, your friends, your
relatives, or persons with whom you interact within another role (e.g., employees,
patients).
5. DO NOT ask about topics which would require the administration of a formal
Human Consent Form.
Should you violate these requirements, the data collected will not fulfill the requirements
for an assignment in this class. Only interviews, not self-administered questionnaires, can be
used for pretesting the questionnaires developed in this class.
Course Objectives and Assignments:
The objective of this course is to learn how to design respectable questionnaires.
Research data can be collected in many ways. Questionnaires represent one way data is
collected. Although usually found in descriptive, cross-sectional surveys, questionnaires can be
used in almost any kind of research setting. Questionnaires can be administered in different ways
and the questions within a particular questionnaire can assume an infinite variety of formats.
As is true of any research endeavor, there are no absolutes in questionnaire design. There
are no recipes and no cookbooks. The context of the research problem you set for yourself will
determine the variety of questionnaire strategies that are appropriate in trying to reach your
research objective; the context will not tell you the absolutely “right” way to do it.
The final “product” for the quarter is a questionnaire designed in segments and pretested
at least three times. The questionnaire will be designed to collect data to test a research objective
specified by you during the second week of the quarter. The final version of the questionnaire is
due Wednesday, December 17th at 5:00 PM. All assignments must be typed; handwritten
materials are not accepted. Every version of your questionnaire must be typed, but final versions
should be as close to “picture-ready” copy as you can manage. For Assignment 6, due on
December 17th, you will provide the final copy of your questionnaire, a full copy of
Interviewer/Administrator Specifications, a Codebook and/or coding instructions, a summary of
data collected in your last pretest, a tentative protocol that could be used to analyze data collected
with your questionnaire, and a statement of what further work, if any, you would do if time allowed.
The following six assignments will move you toward the final product.
ASSIGNMENTS
Assignment 1: Practice Interviewing (5% of Final Grade) Due October 13
This assignment is designed to expose you to the process of interviewing. Questionnaires
will be handed out on the first day of class (October 6). You are to conduct 9 interviews. On
October 13, turn in both the completed interviews and a brief write-up describing where you
went, what happened and a brief description of the data you collected. These materials are also
on the course web site.
In selecting respondents, go to a central public location such as a shopping area, the
beach, or a park. In conducting your interviews, try to obtain a range of ages, sexes, and ethnic
groups. You will be given identification letters to carry in case anybody asks who you are.
DO NOT INTERVIEW ON PRIVATE PROPERTY UNLESS YOU HAVE PERMISSION.
THIS AFFECTS MANY SHOPPING CENTERS.
Keep track of the characteristics of refusals on the “Record for Non-respondents.” A
refusal is a person you approach for an interview who turns you down.
Assignment 2: Statement of Your Research Question (5% of Final Grade)
Due October 15
Questionnaires are designed to get data that can be used to answer one or more research
questions. To help you get started, state a research question. Remember it should be relevant to
the interviewing sites available to this class. Is your research question, as written, testable?
What concepts are included in or implied by your question? Can your concepts be
operationalized into working definitions and variables for which a questionnaire is a viable data
collection procedure?
Assignment 3: Completion of Human Subjects Protection Certification (5% of final
grade) Due October 29
All UCLA faculty, staff, students and administrators who conduct research with human
subjects are required to complete the Collaborative Institutional Training Initiative (CITI)
Training Program prior to conducting research. This is required for both funded and unfunded
research. For Assignment 3, complete the CITI Training Program and turn in a copy of the
certificate that documents that you completed the training on October 29.
Some of you may already have completed CITI training as part of a job or other activity at
UCLA. If you have completed training, you do not have to redo it. Please turn in a copy of your
certificate on October 29.
For those who have not completed training, go to the main web site for the Human
Research Protection Program at http://ohrpp.research.ucla.edu/. Click on “Education and
Training” at http://ohrpp.research.ucla.edu/pages/certification. Read through the section on
Collaborative Institutional Training Initiative. You will be completing the training program for
Social and Behavioral researchers and staff. If, in fact, you were submitting an application to one
of the Institutional Review Boards, your application would go to the South General Campus IRB.
I recommend that you read through the questions and answers at “Frequently Asked
Questions and Answers” at http://ohrpp.research.ucla.edu/faq/one-faq?faq_id=7602. These give
you information about the certification process. Then click on “Collaborative Institutional
Training Initiative” at https://www.citiprogram.org/default.asp to start the training program.
After you are finished, you can click on http://www.citiprogram.org to get a copy of your
certificate.
Assignment 4: “Mini-Questionnaire” #1. (20% of Final Grade) Due November 5
Part 1
Prepare and test “Mini-Questionnaire” #1. This represents your first attempt at designing
a questionnaire to test your “Research Question.” The substantive content of the questionnaire
should focus on current status, behaviors or knowledge. You can choose any topic that interests
you, but since our focus is on “health,” you may want to consider asking about: 1) Current acute
and chronic diseases, accidents, injuries, disabilities, and impairments; and 2) Knowledge and
use of health services.
In addition to substantive content, all questionnaires must collect some demographic
information on such things as:
1. Respondent age
2. Respondent education
3. Individual, family or household income
4. Occupation
5. Respondent marital status
There is no limit to the number of questions you may include. However, you must
provide a minimum of 6 questions in addition to the demographic questions discussed above. I
expect your questionnaire to include a mixture of open-ended and closed-ended questions.
Open-ended questions are particularly useful when you are in the process of exploring an area of
research or in the initial stages of designing a questionnaire.
In preparing the questions in your questionnaire, keep in mind the problems of survey
research design which have been discussed in class and in the readings. Pay particular attention
to the following:
1. Respondent frame of reference--will it be the same as yours?
2. Level of concreteness/abstraction.
3. Question referent--is it clear, and is it what you intend?
4. Tone of question--will it stimulate yea-saying? Or nay-saying?
5. Balance--within the question and across the set of questions.
6. Problems of bias induced by wording--watch out for leading, loaded terms, etc.
7. Screening questions to reduce noise due to non-attitudes.
Indicate explicitly the format of the questions. How will each one look? Present the questions in
the order you want them to appear in the questionnaire. Pay particular attention to the following:
1. Problems of perseveration due to fatigue.
2. Problems of bias induced by contamination of responses due to ordering of
questions.
3. Problems of threatening material/invasion of privacy.
4. Skip patterns to tailor questionnaire for various respondent types.
In sum, your questionnaire should look as much as possible like a finished product, ready
to be fielded or at least pre-tested.
Part 2
In addition to your questionnaire you must provide a justification for each question. This
is the beginning of writing Specifications. For each question or set of related questions there
should be a brief statement as to why the question is included/necessary, and the rationale behind
the format selected. IT IS NOT SUFFICIENT TO SAY “IT'S SELF-EVIDENT.” It is
NEVER self-evident to someone else--like me! Specifications should also include the research
question being tested and information about how your sample was selected and from where.
Part 3
Test your questionnaire by interviewing a convenience sample of at least five
respondents.
On November 5, turn in:
1. All the completed interviews you did.
2. One copy of your specifications for me.
3. One copy of the blank questionnaire for me.
4. Fifteen copies of the blank questionnaire; these are to share with your classmates.
5. A brief report (5-7 typed pages) describing the instrument you constructed, the
data collected with it, the respondents from whom the data were collected, what
you think worked well and what you think did not, and how you would change it.
Assignment 5: “Mini-Questionnaire” #2. (25% of Final Grade) Due November 26
Revise the questionnaire you designed in Assignment 4 in accordance with your accrued
wisdom and the succinct observations from me and your classmates.
Add a new set of questions that collects at least one of the following: sensitive behaviors,
retrospective data, or attitudes and opinions. For example, you might design questions that will
elicit information about substance abuse (e.g., use of alcohol), the use of non-traditional health
practices (e.g., faith healers, curanderos, over-the-counter drugs, other people's drugs, etc.),
threatening behaviors (e.g., abortion, etc.). Retrospective data might be collected about past
health care experienced by the respondent over his/her lifetime. Finally, you might find out the
respondents' opinions of their current or past health care. If you have a good reason, you could
adopt or adapt sets of questions from other studies if they help you get to your objective.
Explicitly indicate the format of the questions. Will there be a checklist? How should it
look if presented to the respondent? Do you need a card to cue the respondent? What should be
on the card? Are other visual aids needed?
Start designing a codebook that can be used with your questionnaire. The codebook
should include information on how verbal answers are converted to numbers, where the variables
are located in the data set, and the names of the variables in the data set. I recommend using
your questionnaire as the basis for your codebook.
Whenever you write a question, you should have in mind the probable responses--if you
cannot think of the responses, then you have not thought about the question enough!! The
process of setting up categories for expected (and finally actual) responses is called code
construction. Closed-ended, pre-coded questions have already had codes constructed for them;
the respondent is presented with a specified set of alternatives which are the codes used later in
data analysis. The only additional coding problem presented by pre-codes is how to handle
residuals. For the code construction assignment, you must consider each of your pre-coded
questions, assign numbers to the alternatives following the procedures outlined in class
discussions and readings, and solve the residual problem.
For open-ended questions, you have to consider all possible responses and list these along
with code numbers. Include instructions for the coder to follow regarding how many responses
are to be coded, any precedence rules to follow and any other problems you think might arise.
Remember in this case also to provide a way of handling residual categories.
Remember to include codes for the required questions on age, education, income and
marital status. Do not attempt to set up a code for occupation; do write a paragraph outlining
your thoughts about how one would go about coding occupational data.
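The code-construction process described above (assigning numbers to the alternatives of each pre-coded question and reserving codes for residuals such as "other," "don't know," and "refused") can be sketched as a small lookup structure. The Python fragment below is only an illustration of the idea, not part of the course materials; every question ID, variable name, column location, and code number in it is hypothetical.

```python
# Hypothetical codebook sketch. A codebook maps each question to a variable
# name, its location in the data record, and the numeric codes for responses.
CODEBOOK = {
    "Q1": {
        "variable": "AGE",          # respondent age in years, entered directly
        "columns": (1, 2),          # location of the variable in the data record
        "codes": {"Refused": 98, "Don't know": 99},  # residual codes only
    },
    "Q2": {
        "variable": "MARITAL",
        "columns": (3, 3),
        "codes": {                  # pre-coded alternatives become analysis codes
            "Married": 1,
            "Widowed": 2,
            "Divorced": 3,
            "Separated": 4,
            "Never married": 5,
            "Other (specify)": 7,   # residual for unanticipated answers
            "Refused": 9,
        },
    },
}

def code_response(question_id, verbal_answer):
    """Convert a verbal answer to its numeric code; unanticipated
    answers fall back to the residual 'Other (specify)' code."""
    codes = CODEBOOK[question_id]["codes"]
    return codes.get(verbal_answer, codes.get("Other (specify)"))
```

A codebook organized this way serves both purposes the assignment names: it documents where each variable lives in the data set, and it records how verbal answers are converted to numbers, with the residual category catching answers you did not anticipate.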
You do not have to write specifications for this assignment. You may want to start
revising your old ones and writing new ones in anticipation of Assignment 6.
Test your questionnaire by interviewing at least five respondents.
On November 26, turn in:
1. A report (7-10 typed pages) describing the development of the instrument--why
items were selected, how and why they were revised; the data collected with this
instrument; the sample of respondents from whom the data were collected.
2. Sixteen copies of the blank questionnaire; one for me and 15 to share with your
classmates.
3. The codebook.
Assignment 6: Your Magnum Opus! (40% of the final grade) Due December 17 by 5:00 PM
This is the culmination of all your work! Revise your earlier questionnaires consistent
with your vastly increased wisdom. Remember that you should have a “final product” that is as
close to “picture-ready copy” as you can manage. This questionnaire should include variable
names for coding. Turned in with the questionnaire are a final set of Specifications and a final
Codebook, along with a write-up that summarizes your pretest interviews of this version of the
questionnaire with 8-10 respondents, a proposed analysis plan, and discussion of any further
changes that might be considered were you to actually use this instrument in a study.
On December 17, turn in:
1. A 7-10 page report that summarizes your pretest interviews, a proposed analysis
plan, and a discussion of any further changes that should be considered were you
to actually use this instrument in a study.
2. One blank questionnaire.
3. One set of final specifications.
4. One final codebook.
GENERAL STATEMENTS ON GRADING AND
PRESENTATION OF ASSIGNMENTS
When you enter M218, it is assumed that you will exit with a grade of “B.” A “B” is a
good, respectable grade. I write lots of letters of recommendation for people who get “B’s” in
M218. An “A” grade is earned by doing a really exceptional job. If you end up with a “C”
grade, it is probably because you did not make a serious effort in this class: you did not do the
reading, you never came to class, you left all the assignments for the night before, etc. In other
words, it is hard to get a “C” in this class, BUT if that is what you earn, then that is what you will
get.
It is expected that all assignments will be turned in on the date due. There are no
extensions. Incompletes are not given in this course.
CLASS SCHEDULE
WEEK/DATE, ASSIGNMENTS | TOPIC, RELEVANT READINGS
WEEK 1: October 6 & 8 STARTING A RESEARCH STUDY
1. Overview of M 218
2. Research Questions
3. Hypotheses, Concepts, and Working Definitions
4. Variables: Independent, dependent, control
5. Levels of Measurement
Relevant Readings: Aday, Chapters 1-5; Bourque &
Fielder, Chapter 1.
WEEK 2: October 13 CONTEXT FOR & TYPES OF QUESTIONNAIRES
ASSIGNMENT 1 DUE 1. Data Collection Options
2. Administrative Types
3. Surveys & Cross Sectional Studies
4. Question Types: Open/Closed
5. Information Obtainable by Questionnaire:
Facts, Behaviors, Attitudes
Relevant Readings: Aday, Chapters 1-5; Bourque &
Fielder, Chapter 1; Bourque in Lewis-Beck, Bryman, Liao,
pp. 229-230; Curtin, Presser, Singer; Fricker et al.; Chang,
Krosnick, 2009, 2010; Krosnick, 1999.
October 15 MANAGING DATA USING AVAILABLE SOFTWARE
ASSIGNMENT 2 DUE Elizabeth Stephenson, Director,
UCLA Social Science Archive
Relevant Readings: Iverson, J. Metadata-Driven Survey
Design, IASSIST Quarterly, Summer 2009.
ICPSR, Guidelines for Effective Data Management Plans,
No date.
http://www.icpsr.umich.edu/icpsrweb/content/deposit/guide
/index.html
WEEK 3: October 20 HUMAN SUBJECTS PROTECTION AND FORMS
http://ohrpp.research.ucla.edu. See information on pages
15-16.
Relevant Readings: OPRS web site and materials on class
web site.
“BEGINNINGS” AND “ENDS” OF QUESTIONNAIRES
1. Call Record Sheet
2. Enlistment Letters
3. Questions to Interviewer
4. Selecting the respondent
Relevant Readings: Bourque & Fielder, Chapter 6; Binson;
Couper, 1997; Gaziano; Engelhart; Krenzke, Li, Rust;
examples on websites.
October 22 QUESTIONS TO OBTAIN DEMOGRAPHIC
INFORMATION
1. Why?
2. How much?
3. How?
4. Location?
5. Household Roster
6. Selecting Questions from Other Studies
Relevant Readings: Aday, Chapters 8, 10; Bourque &
Fielder, Chapters 2, 3; Bhopal, Donaldson; Fullilove;
Kornhauser, Sheatsley; Sigelman, Tuck, Martin; Martin et
al, 1990; Yan, Curtin, Jans; examples on course web site
and earthquake web site.
WEEK 4: October 27 & 29 QUESTIONNAIRE SPECIFICATIONS
1. Functions
2. Format
Relevant Readings: Bourque and Fielder, Chapter 3;
Kornhauser, Sheatsley.
October 29 ASSIGNMENT 3 DUE
WEEK 5 November 3 ASCERTAINING INFORMATION ABOUT
RETROSPECTIVE BEHAVIORS
1. Grids
2. Histories
3. Aided Recall
4. Use of Records
Relevant Readings: Aday, Chapter 11; Blair, Ganesh;
Bradburn et al, 1987; Bradburn, 2004; Presser et al, 2004.
November 5 MEASURING ATTITUDES
ASSIGNMENT 4 DUE 1. Beginning
2. Developing Composite Measures
3. Use of Existent Measures
4. Adopting and Adapting
Relevant Readings: Aday, Chapter 11; Inkelas et al;
McKennel; Bourque, Shen; Krosnick et al, 2002; Shaeffer
et al, 2005; Yeager, Krosnick, 2012; examples on websites.
WEEK 6: November 10 & 12 WORKSHOP ON ASSIGNMENT 4
WEEK 7: November 17 & 19 CODEBOOKS AND CODE CONSTRUCTION
1. Objective
2. Types
3. Content Analysis
Relevant Readings: Aday, Chapter 13; Bourque & Fielder,
Chapter 3; Bourque and Clark; Bourque, Coding, Code
Frames; examples on web sites.
WEEK 8: November 24 CONDUCTING THE CALIFORNIA HEALTH
INTERVIEW SURVEY (CHIS)
Questionnaire Design, Sampling and Contracting
Matt Jans, Ph.D., Survey Methodologist & Data Quality
and Survey Methodology Manager, CHIS
November 26 MEASURING HEALTH-RELATED QUALITY OF LIFE
ASSIGNMENT 5 DUE Ronald Hays, Ph.D., Professor, General Internal Medicine,
and Health Policy and Management
Relevant Readings: Ganz, P., Hays, R.D., Kaplan, R.M., &
Litwin, M.S. Measuring health-related quality of life and
other outcomes, pp. 307-341. In G.G. Kominski & T.H.
Rice (eds), Changing the U.S. Health Care System 4th
edition, San Francisco, CA: Jossey-Bass, 2014.
Khanna D, E Krishnan, EM Dewitt, PP Khanna, B Spiegel,
RD Hays. The future of measuring patient-reported
outcomes in rheumatology. Arthritis Care and Research
2011; 63:S486-S490.
WEEK 9: December 1 & 3 WORKSHOP ON ASSIGNMENT 5
WEEK 10: December 8 EFFECTS OF INCENTIVES & LANGUAGE
Relevant Readings: Adua, Sharp 2010; Baron et al 2009;
Petrolia, Bhattacharjee 2009; Tambor et al 1993; Lee et al
2008; Elliott et al 2012; McGonagle, Schoeni, Couper
2013.
SENSITIVE BEHAVIORS Relevant Readings:
Tourangeau, Yan 2007; Peter, Valkenburg 2011; Barton
1958; Couper et al 2010.
WEEK 10, continued
December 10 ADMINISTRATION OF SURVEYS, DATA
PROCESSING AND ANALYSIS OF QUESTIONNAIRE
DATA
1. Raw Data vs. Processed File
2. Coding
3. Data Entry/Keypunching
4. Cleaning
5. Raw vs. Actual Variables
6. Data Quality, Missing Data, etc.
Relevant Readings: Aday, Chapters 13, 14, 15; Bourque &
Fielder, Chapter 6.
FORMATTING QUESTIONNAIRES
1. Order/Location
2. Grouping
3. Spacing
Relevant Readings: Aday, Chapter 12; Bourque & Fielder,
Chapter 4; Couper, Tourangeau, Kenyon, 2004; Krosnick
1999; Shaeffer et al; Dillman et al 2005; Christian, Dillman
2004; Macera et al 2004; Morrison et al 2010; Smyth,
Dillman; Couper et al 2004, 2011; Toepoel et al 2009;
Tourangeau et al 2007, 2009.
WEEK 11: December 17 ASSIGNMENT 6 DUE AT 5:00 PM
OBJECTIVES, ASPH COMPETENCIES, AND RELEVANT MATERIALS
Upon completing this course, students will:
1. Know how to design, develop, administer and document questionnaires.
ASPH Competencies:
E.2. Identify the causes of social and behavioral factors that affect health of
individuals and populations.
E.5. Describe steps and procedures for the planning, implementation and
evaluation of public health programs, policies and interventions.
E.6. Describe the role of social and community factors in both the onset and
solution of public health problems.
E.8. Apply evidence-based approaches in the development and evaluation of
social and behavioral science interventions.
C.1. Identify key sources of data for epidemiologic purposes.
C.10. Evaluate the strengths and limitations of epidemiologic reports.
Communication and Informatics: The ability to collect, manage and organize
data to produce information and meaning that is exchanged by use of signs
and symbols; to gather, process, and present information to different
audiences in-person, through information technologies, or through media
channels; and to strategically design the information and knowledge
exchange process to achieve specific objectives.
Program Planning: The ability to plan for the design, development,
implementation, and evaluation of strategies to improve individual and
community health.
Relevant Materials: All textbooks, readings, lectures and assignments.
2. Know how to write questionnaire specifications.
ASPH Competencies:
K.3. Explain how the findings of a program evaluation can be used.
K.7. Differentiate among goals, measurable objectives, related activities, and
expected outcomes for a public health program.
Communication and Informatics (as defined above).
Relevant Materials: Assignments 4 and 6; lectures on 10/27 & 10/29.
3. Know how to develop codebooks.
ASPH Competencies:
Communication and Informatics (as defined above).
Relevant Materials: Assignments 5 and 6; lectures on 11/17 & 11/19.
4. Know how to submit research proposals for review by Institutional Review
Boards.
ASPH Competencies:
E.9. Apply ethical principles to public health program planning,
implementation and evaluation.
J.2. Apply basic principles of ethical analysis (e.g. the Public Health Code of
Ethics, human rights framework, other moral theories) to issues of public
health practice and policy.
Relevant Materials: Assignment 3; lecture on 10/20.