Survey Practice

Articles
Vol. 10, Issue 3, 2017 | May 31, 2017 EDT

Tailored Explanation: A Strategy to Minimize Nonresponse in Demographic Items Among Low-income Racial and Ethnic Minorities

Nora Jacobson, Maichou Lor, Anna Krupp, Barbara J Bowers
Keywords: tailored explanation, motivation, item nonresponse
https://doi.org/10.29115/SP-2017-0015
Survey Practice
Jacobson, Nora, Maichou Lor, Anna Krupp, and Barbara J Bowers. 2017. “Tailored Explanation: A Strategy to Minimize Nonresponse in Demographic Items Among Low-Income Racial and Ethnic Minorities.” Survey Practice 10 (3). https://doi.org/10.29115/SP-2017-0015.

Abstract

There is a high rate of nonresponse for demographic items in survey research, particularly among racial and ethnic minority respondents. This study examined whether explaining the rationale for asking demographic questions, before the questions were asked, would increase respondents’ motivation to respond and thereby reduce nonresponse to gender, income, age, and race items. Using a cross-sectional, randomized comparison design, 99 respondents were randomly assigned to two groups. Group 1 did not receive an explanation for asking the demographic questions. Group 2 received an explanation designed to be relevant and meaningful to them, concerning the significance and potential use of demographic information for racial and ethnic minority populations. A proportional difference test was used to compare the proportion of respondents completing demographic survey items between the groups. A proportional difference effect size (Cohen’s h) was used to determine the magnitude of the difference between the two groups. Over 50% of respondents were African American. While none of the between-group differences in item nonresponse was statistically significant, there was a small (Cohen’s h=0.184) to moderate (Cohen’s h=0.342) effect of providing an explanation on reducing demographic item nonresponse. Specifically, adding an explanation made the biggest improvement in reporting income. The findings support the importance of providing participants with an explanation that is relevant and meaningful to them, increasing their motivation to respond and thereby minimizing item nonresponse.

Introduction

Incomplete data in surveys is a major concern for researchers. Data incompleteness can be associated with undercoverage, unit nonresponse, and/or item nonresponse (Groves et al. 2009). This paper focuses specifically on item nonresponse to demographic items, including income, race, gender, and age. Item nonresponse occurs when respondents in the selected sample provide some, but not all, of the requested information (Groves et al. 2009). Because item nonresponse can lead to bias and measurement error, it is essential to reduce it.

Factors that Could Explain Item Nonresponse

Researchers have reported predictors both of nonresponse in general and of income item nonresponse (Ferber 1966; Grønhaug, Gilly, and Enis 1988; Ralph 1984). For instance, Ferber (1966) and others (Ralph 1984; Riphahn and Serfling 2005) reported that item nonresponse was higher for females than for males and higher for people with low educational attainment than for those with high educational attainment. In addition, people who are self-employed, homeowners (Pleis and Dahlhamer 2004; Sousa-Poza and Henneberger 2000), in a lower social position, female, older (e.g., age 65 and over; Ralph 1984; Ross and Reynolds 1996; Schräpler 2003), and African American (Ross and Reynolds 1996) have demonstrated higher item nonresponse rates.

Researchers have suggested that cognitive and motivational factors contribute to item nonresponse, including nonresponse to income items (Loosveldt et al. 1999a, b; Tourangeau, Rips, and Rasinski 2000). Beatty, Herrmann, et al. (2002) proposed three antecedents that influence item nonresponse in surveys: (1) amount of knowledge about the topic (cognitive), (2) perceptions about the accuracy required to answer the question (cognitive), and (3) willingness to respond to the question (motivation). Similarly, Juster and Smith (1997) and Moore, Stinson, and Welniak, Jr. (1999) proposed models that include cognitive and motivational factors; Moore, Stinson, and Welniak, Jr. (1999) also added interpretation and comprehension of questions to their model.

On the other hand, Singer, Hippler, and Schwarz (1992) suggested that item nonresponse could be due to concerns about confidentiality. Questions regarding income have been reported to be difficult to answer or perceived as an invasion of privacy (Singer, Mathiowetz, and Couper 1993; Tourangeau and Yan 2007). Other researchers have suggested that item nonresponse could be due to the survey design (e.g., response format, poor question wording; Groves et al. 2009).

Strategies to Reduce Item Nonresponse

Efforts have been made to decrease item nonresponse by addressing cognitive, motivational, and design issues. For example, researchers have used a series of bracketed income questions, known as the ‘unfolding bracket’ technique (Heeringa, Hill, and Howell 1993; Juster and Smith 1997). The unfolding bracket is used after the respondent has refused to answer, or said “don’t know,” in response to an open-ended question asking for their exact income amount. The technique consists of a series of closed-ended questions with specific income ranges, in which researchers ask respondents to place their income within a given range. Unfolding brackets have been noted to reduce missing financial data by 50% or more (Heeringa, Hill, and Howell 1993; Juster and Smith 1997) and income nonresponse by 50% (Battaglia et al. 2002; McGrath 2005). However, evidence suggests that the unfolding bracket technique addresses only cognitive causes of income nonresponse, not motivational causes, because it reduces cognitive burden by providing a range of income brackets; it does not involve persuading respondents to report income (Yan, Curtin, and Jans 2010).
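To make the procedure concrete, the following is a minimal sketch, in Python, of how an unfolding-bracket follow-up for an income item might be scripted. The bracket cut points, prompts, and function names are illustrative assumptions, not those used in the studies cited above.

def ask(prompt):
    # Stand-in for however the survey instrument collects an answer.
    return input(prompt + " ").strip().lower()

def collect_income():
    # Step 1: the usual open-ended question asking for an exact amount.
    exact = ask("What was your total household income last year?")
    if exact not in ("refused", "don't know", ""):
        return {"type": "exact", "value": exact}

    # Step 2: unfolding bracket, used only after a refusal or "don't know".
    # A short series of closed-ended range questions narrows the amount
    # to a bracket instead of an exact figure (hypothetical cut points).
    lower, upper = 0, None
    for cut in (25_000, 50_000, 100_000):
        answer = ask(f"Was it less than ${cut:,}? (yes / no / refused)")
        if answer == "refused":
            break
        if answer == "yes":
            upper = cut
            break
        lower = cut  # "no": income is at least this cut point
    return {"type": "bracket", "lower": lower, "upper": upper}

The point of the sketch is the flow: the respondent is asked for an exact amount first, and only a refusal or “don’t know” triggers the bracketed ranges, which carry a lower cognitive burden.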

Moore (2006) created three procedures to elicit responses about income. The first procedure used a flexible approach in which respondents could choose how they would report their incomes. The second procedure expanded the use of closed-ended range questions as follow-up probes in the event of an initial nonresponse. The third procedure used dependent follow-up interviewing after a nonresponse occurred. Dependent interviewing is a follow-up technique in which the interviewer says, for example, “I have recorded from last time that you received [dollars] from [income source] last February. Does that still sound about right?” (Moore 2006, 3). Moore (2006) found reductions in item nonresponse with the second and third approaches.

Other solutions relate to the placement of demographic questions within questionnaires. Some researchers contend that the best placement is at the end of the questionnaire (Dillman 2007; Fink, Bourque, and Fielder 2003; Jackson 2009; Whitley 1996). Stoutenbourgh (2008) argued that placing demographic items at the end allows the primary survey questions to be answered while participants are most alert and invested, reserving the less engaging demographic questions for the end. In contrast, other researchers have argued for placing demographic items at the beginning of the survey interview to increase rapport (Babbie 2008). Despite these suggestions, results for both placements are mixed: Teclaw, Price, and Osatuke (2012) reported that placing demographic questions at the beginning of a questionnaire lowers item nonresponse rates, whereas others maintain that placement at the end is better (Dillman 2007; Fink, Bourque, and Fielder 2003; Jackson 2009; Whitley 1996).

Many of the efforts to reduce item nonresponse have focused on survey design and cognition. Fewer studies have focused on increasing respondents’ motivation to report demographic items, and reported efforts have not considered the context of respondents, including literacy and culture. Thus, we tested whether motivation to complete demographic survey items would increase among low-income, less-educated racial and ethnic minorities after they were provided a meaningful verbal explanation for asking demographic questions. The explanation described the importance of providing demographic information and explained how that information could be used to improve care for people from communities like theirs or for populations with similar characteristics. We hypothesized that if people from the same background craft the explanatory message for people from similar communities, participants’ motivation to report the desired demographic items would increase, thereby reducing item nonresponse.

Methods

This study was approved by the University of Wisconsin-Madison Institutional Review Board.

Overview of Study Design

We recently described in detail our evaluation of the impact of an established lay group, the Community Advisors on Research Design and Strategies (CARDS), on revising research materials and consent forms to improve recruitment and responsiveness (Bowers, Jacobson, and Krupp 2017; Jacobson, Bowers, and Krupp 2016). In brief, the CARDS is a community group composed largely of low-income, primarily racial and ethnic minority individuals, many of whom are unemployed (Kaiser, Thomas, and Bowers 2017). Using materials from a previously completed study of heart failure, participants were asked to assess their likelihood of responding positively to recruiting materials, consenting to participate in the study, and responding accurately to demographic items if they were actually enrolled in the study. This paper addresses participant responses to the demographic items included in that study.

Design and Sample

A cross-sectional, randomized comparison study was designed to assess whether providing a meaningful explanation to participants would increase their motivation to complete demographic survey items, thereby reducing item nonresponse for demographic items. Prior to collecting data, the researchers met with CARDS to draft a brief explanation of why responding to demographic items is important. This explanation differs from how researchers typically approach demographic items: providing either no explanation or a general explanation that the research will improve care for a particular condition or is required by the sponsor, without connecting the items to how people like the respondents might benefit from answering them.

Recruitment was done by a long-standing employee of the organization who lived in the local community. A convenience sample of 99 adult participants was recruited from food banks and other programs (e.g., parenting and childcare programs, women’s support groups, and senior meal programs) at two community centers in low-income, racially and ethnically diverse areas of the city. People were eligible to participate in the study if they were eligible for food bank support or for other community center services targeted to low-income individuals.

Data Collection

Three white researchers from a local university collected data. The researchers met face-to-face with eligible participants individually, on several different days, at the two community centers. To minimize contamination from social conversation about the study, interview assignments were randomized by day: all participants seen on the same day at the same center were placed in the same group. Groups were distinguished by either hearing the CARDS explanation of why participation in research is important (see Table 1) or hearing no explanation. The researchers chose to read the explanation aloud to all participants in group 2 so that it would be delivered in a standardized way and so that literacy would not be a barrier.

Table 1  Tailored verbal script read by the researcher.
We know that some groups of people are healthier and even live longer than others. For example, women live longer than men. People with higher incomes are healthier and live longer than people with low incomes. So a major goal of health research today is to try to eliminate this difference in health and how long people live, to get rid of these differences between groups of people. One way to do that is to learn more about these differences and to understand the reasons for the differences in health. If we knew more, we could be able to make the changes that would improve the health of people who suffer the most from health problems. So if researchers can find out who is most affected by particular health problems and the reasons they have more health problems, they would be better able to improve care of people who need it the most. This is why researchers always ask questions about the people who are doing the surveys. For instance, researchers usually ask about race, income, education, gender. We know that many people don’t answer these questions. Sometime they leave it blank and sometime they actually give an incorrect answer, thinking that income or race, for example, doesn’t have anything to do with health. This makes it difficult for researchers to learn about differences among groups of people or to develop the most effective treatments.

Participants in each group were shown four demographic questions and then read the response options (see Table 2): “If this question appeared on a survey you had agreed to complete, would you answer the question correctly, answer the question incorrectly, or leave the item blank?” Participants were asked only how they would respond to each question; they were not actually asked to answer the items.

Table 2  Demographic and motivation questions.
Age: What is your age?
What would you be most likely to do?
• Answer the question accurately
• Leave it blank
• Give an answer that is not accurate
Race: What is your race?
What would you be most likely to do?
• Answer the question accurately
• Leave it blank
• Give an answer that is not accurate
Gender: What is your gender?
What would you be most likely to do?
• Answer the question accurately
• Leave it blank
• Give an answer that is not accurate
Income: What is your annual household income?
What would you be most likely to do?
• Answer the question accurately
• Leave it blank
• Give an answer that is not accurate

Analysis of Data

The answers to the questions about whether respondents anticipated completing the demographic questions were compared between the two groups. Analysis was done in two steps. First, we examined the differences between the groups in anticipated completion of each demographic item: (a) age, (b) gender, (c) race, and (d) income. We used a proportional difference test and calculated the difference between the groups in the proportion of participants anticipating willingness to respond, with 95% confidence intervals.
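As a rough illustration of this first step, the short Python sketch below computes a between-group difference in proportions with a Wald-style 95% confidence interval. The counts in the example call are hypothetical, and this simple Wald interval is only one of several ways such an interval can be computed.

import math

def proportion_difference_ci(x1, n1, x2, n2, z=1.96):
    # x1, x2: number of respondents anticipating nonresponse to the item;
    # n1, n2: group sizes. Returns (p1 - p2) and its Wald 95% CI.
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, (diff - z * se, diff + z * se)

# Hypothetical example: 18 of 50 in one group vs. 10 of 50 in the other.
diff, (lo, hi) = proportion_difference_ci(18, 50, 10, 50)
print(f"difference = {diff:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")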

In the second step, because of the small sample size, we examined the proportional difference effect size to determine the magnitude of nonresponse differences between group 1 (no explanation) and group 2 (CARDS explanation). We used Cohen’s h (arcsine transformation) effect size (Cohen 1988). Effect sizes (Cohen’s h) for the explanation vs. no explanation differences are presented; Cohen’s h statistics are given a positive sign where the explanation outperforms no explanation.
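For reference, Cohen’s h for two proportions p1 and p2 is defined via the arcsine transformation (Cohen 1988):

h = \left|\, 2\arcsin\sqrt{p_1} \;-\; 2\arcsin\sqrt{p_2} \,\right|

so that larger values of h correspond to larger differences between the two proportions on the transformed scale.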

Results

Because the purpose of this study was to examine the impact of receiving the tailored explanation vs. no tailored explanation on reporting demographic items, we did not screen for or collect demographic data related to age, gender, race, or income prior to the interview. However, because we recruited respondents from specific neighborhoods, the respondents were predominantly African American and lower income, with an even divide between female and male respondents.

In general, there was a higher proportion of intended non-reporting on all demographic items in the group not receiving the explanation (group 1). For example, as shown in Table 3, 5.6% of participants in group 1 indicated they would not report age, compared with 2.2% of participants in group 2 (those receiving the explanation), a difference of 3.3 percentage points. However, this difference is not statistically significant [confidence interval (CI): −0.135, 0.0667].

Table 3  Proportional test and effect size of motivation to report demographic items.
Item | Group 1 (%) | Group 2 (%) | Difference in proportions (%) | 95% CI | p-value | Cohen’s h | 95% CI
Age | 5.6 | 2.2 | 3.4 | (−0.14, 0.08) | 0.30 | 0.18 | (−1.82, 2.17)
Gender | 7.4 | 2.2 | 5.5 | (−0.16, 0.05) | 0.16 | 0.25 | (−1.74, 2.25)
Race | 18.5 | 8.9 | 9.6 | (−0.24, 0.05) | 0.09 | 0.18 | (−0.34, 0.71)
Income | 35.2 | 20.0 | 15.2 | (−0.33, 0.03) | 0.05 | 0.34 | (0.09, 0.59)

While none of the between-group proportional differences in item nonresponse is statistically significant, we calculated Cohen’s h to determine the magnitude of the effect. As shown in Table 3, there is a small (Cohen’s h=0.184) to moderate (Cohen’s h=0.342) effect of providing an explanation on reducing demographic item nonresponse. Adding an explanation made a small improvement in reporting gender and age, and all of the demographic items (age, gender, race, and income) moved in the same direction, indicating some improvement after the explanation was provided. The largest improvement in item nonresponse was for the income item. For group 1, income nonresponse was 35.19%; providing the explanation was associated with a 15.2 percentage point reduction in nonresponse to the income item, constituting a moderate (Cohen’s h=0.34) effect size in the desired direction. This calculation focused on intentions to leave the item blank relative to answering accurately.
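As an arithmetic check on the reported magnitude, substituting the Table 3 income proportions into the arcsine formula given in the analysis section reproduces the reported effect size:

h_{\text{income}} = \left|\, 2\arcsin\sqrt{0.352} \;-\; 2\arcsin\sqrt{0.200} \,\right| \approx |1.270 - 0.927| = 0.343 \approx 0.34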

Discussion

These findings suggest that providing a meaningful verbal explanation to respondents may improve their motivation to respond to demographic items. The increased motivation to report the demographic items may be due to the specific explanation created by CARDS, an explanation tailored to the context of the respondents. This population has been found to mistrust researchers and research and to worry about what researchers will do with the results of studies (Yancey, Ortega, and Kumanyika 2006). The explanation focused on how respondents, or someone like them, might benefit from participation; thus, it may have motivated participants by replacing a fear of how results could be used with an anticipated benefit from the results. Furthermore, because the verbal explanation emphasized the benefits to people like themselves that might result from providing demographic information, participants may have determined that the benefits outweighed any anticipated costs, or they may have seen benefits for their community rather than just personal gain. This notion is consistent with Dillman’s (1978) social exchange approach to increasing response in mail surveys, which holds that respondents are more likely to respond if they believe that, in the long run, the anticipated benefits of responding outweigh the anticipated costs.

It is perhaps not surprising that participants in the group that did not receive a tailored explanation had higher item nonresponse. It could be that without the explanation participants are less motivated to complete the items for reasons described in the existing literature (Loosveldt et al. 1999a, 1999b). In the absence of an explanation tailored to the context of the respondents, the task of completing the demographic items may seem less attractive (Cialdini 1984).

While we did not directly measure interviewer effects on respondents or their responses, we were surprised to observe that respondents were willing to disclose their unwillingness to respond to certain items, as well as their reasons, to researchers who were not of the same racial and ethnic background. For instance, some respondents shared that they would not provide information about demographic items on a survey because they did not trust how the results would be used or because they feared that people hearing the results would make assumptions about them. This observation is inconsistent with existing research on racial and ethnic interviewer effects on respondents’ answers (Davis and Silver 2003; Samples et al. 2014). Researchers have reported that African Americans provide different responses to surveys when interviewed by African American interviewers than when interviewed by European American interviewers (Davis and Silver 2003; Lowe, Lustig, and Marrow 2011; Samples et al. 2014). Thus, it is possible that the tailored explanation created a context of trust for the respondents in this study. However, more research is needed to better understand if and how a tailored explanation can affect interviewer effects and item nonresponse rates.

Although our findings appear promising for reducing item nonresponse through a tailored, meaningful verbal explanation, we did not compare or test how the verbal explanation differed from having the respondent read a written explanation on his or her own. Future studies could therefore examine whether a verbal or a written explanation is more effective in reducing item nonresponse. Because we did not have African American interviewers deliver the tailored explanation, we do not know whether there was an interviewer effect on response; future studies could investigate how the racial and ethnic background of interviewers affects item response rates when a tailored explanation is used. It is also possible that social desirability bias occurred after we provided the explanation, but we did not evaluate this, and future research could examine the effect of a tailored explanation on social desirability. In addition, we did not control for interviewer effects; the three researchers who read the questions may have had different effects on response.

Acknowledgement

This project was partially supported by the Clinical and Translational Science Award (CTSA) program, through the NIH National Center for Advancing Translational Sciences (NCATS), grant UL1TR000427. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH.

References

Babbie, E. 2008. The Basics of Social Research. 4th ed. Belmont, CA: Thomson Wadsworth.
Battaglia, M.P., D.C. Hoaglin, D. Izrael, M. Khare, and A. Mokdad. 2002. “Improving Income Imputation by Using Partial Income Information and Ecological Variables.” In Proceedings of the American Statistical Association, Survey Research Methods Section, 152–57.
Beatty, P., D. Herrmann, et al. 2002. “To Answer or Not to Answer: Decision Processes Related to Survey Item Nonresponse.” In Survey Nonresponse, edited by R.M. Groves, 2nd ed., 71–85. Hoboken, New Jersey: John Wiley and Sons.
Bowers, B.J., N. Jacobson, and A. Krupp. 2017. “Can Lay Community Advisors Improve Recruitment of Underrepresented Populations?” Research in Nursing and Health 40 (1): 63–69.
Cialdini, R.B. 1984. Influence: How and Why People Agree to Things. New York: Quill.
Cohen, J. 1988. Statistical Power Analysis for the Behavioral Sciences. Hillsdale, NJ: Lawrence Erlbaum Associates.
Davis, D.W., and B.D. Silver. 2003. “Stereotype Threat and Race of Interviewer Effects in a Survey on Political Knowledge.” American Journal of Political Science 47 (1): 33–45.
Dillman, D.A. 1978. Mail and Telephone Surveys: The Total Design Method. New York: Wiley-Interscience.
———. 2007. Mail and Internet Surveys: The Tailored Design Method. 2nd ed. Hoboken, NJ: John Wiley and Sons, Inc.
Ferber, R. 1966. “Item Nonresponse for a Consumer Survey.” Public Opinion Quarterly 30 (3): 399–415.
Fink, A., L.B. Bourque, and E.P. Fielder. 2003. The Survey Kit: How to Conduct Self-Administered and Mail Surveys. 2nd ed. Thousand Oaks, CA: Sage Publications, Inc.
Grønhaug, K., M.C. Gilly, and B.M. Enis. 1988. “Exploring Income Non-Response: A Logit Model Analysis.” Journal of the Market Research Society 30 (3): 1–36.
Groves, R.M., F.J. Fowler, M. Couper, J.M. Lepkowski, E. Singer, and R. Tourangeau. 2009. “Nonresponse in Sample Surveys.” In Survey Methodology, 2nd ed. Hoboken, NJ: Wiley.
Heeringa, S., D.H. Hill, and D.A. Howell. 1993. “Unfolding Brackets for Reducing Item Nonresponse in Economic Surveys.” AHEAD/HRS Report, no. 94-029. Institute for Social Research.
Jackson, S.L. 2009. Research Methods and Statistics: A Critical Thinking Approach. 3rd ed. Belmont, CA: Wadsworth.
Jacobson, N., B. Bowers, and A. Krupp. 2016. “Planning for Ancillary Care Provision: Lessons from the Developing World.” Journal of Empirical Research on Human Research Ethics 11 (2): 129–34.
Juster, T., and J.P. Smith. 1997. “Improving the Quality of Economic Data: Lessons from the HRS and AHEAD.” Journal of the American Statistical Association 92 (440): 1268–78.
Kaiser, B., G. Thomas, and B.J. Bowers. 2017. “A Case Study of Engaging Hard-to-Reach Participants in the Research Process: Community Advisors on Research Design and Strategies (CARDS)®.” Research in Nursing and Health 40 (1): 70–79.
Loosveldt, G. et al. 1999a. “Evaluation of the Relationship between Data Quality and Interviewer Behavior during Doorstep Contact in Face-to-Face Interviews.” In Paper Presented at the International Conference on Survey Non-Response. Portland, OR.
———. 1999b. “Item Non-Response as a Predictor of Unit Non Response in a Panel Survey.” In Paper Presented at the International Conference on Survey Non-Response. Portland, OR.
Lowe, S.R., K. Lustig, and H.B. Marrow. 2011. “African American Women’s Reports of Racism during Hurricane Katrina: Variation by Interviewer Race.” New School Psychology Bulletin 8:46–57.
McGrath, D. 2005. “Comparison of Data Obtained by Telephone versus Personal Visit Response in the U.S. Consumer Expenditures Survey.” In Proceedings of the American Statistical Association, Survey Research Methods Section, 3368–75.
Moore, J.C. 2006. “The Effects of Questionnaire Design Changes on Asset Income Amount Nonresponse in Waves 1 and 2 of the 2004 SIPP Panel.” Survey Methodology 1:1–19.
Moore, J.C., L.L. Stinson, and E.J. Welniak, Jr. 1999. “Income Reporting in Surveys: Cognitive Issues and Measurement Error.” In Cognition and Survey Research, edited by M.G. Sirken, D.J. Herrmann, S. Schechter, N. Schwarz, J.M. Tanur, and R. Tourangeau. New York: Wiley.
Pleis, John R., and J. Dahlhamer. 2004. “Family Income Response Patterns for Varying Levels of Income Detail: An Analysis of the National Health Interview Survey (NHIS).” In JSM Proceedings, Survey Research Methods Section, 4200–4207.
Ralph, B. 1984. “Item Nonresponse in Telephone Surveys: An Analysis of Who Fails to Report Income.” Social Science Quarterly 65 (1): 207.
Riphahn, R.T., and O. Serfling. 2005. “Item Non-Response on Income and Wealth Questions.” Empirical Economics 30 (2): 521–38.
Ross, C.E., and J.R. Reynolds. 1996. “The Effects of Power, Knowledge, and Trust on Income Disclosure in Surveys.” Social Science Quarterly 77 (4): 899–911.
Samples, T.C., A. Woods, T.A. Davis, M. Rhodes, A. Shahane, and N.J. Kaslow. 2014. “Race of Interviewer Effect on Disclosures of Suicidal Low-Income African American Women.” Journal of Black Psychology 40 (1): 27–46.
Schräpler, J.-P. 2003. “Gross Income Non-Response in the German Socio-Economic Panel: Refusal or Don’t Know?” Schmollers Jahrbuch 123 (1): 109–24.
Singer, E., H.J. Hippler, and N. Schwarz. 1992. “Confidentiality Assurances in Surveys: Reassurance or Threat?” International Journal of Public Opinion Research 4 (3): 256–68.
Singer, E., N. Mathiowetz, and M. Couper. 1993. “The Impact of Privacy and Confidentiality Concerns on Survey Participation: The Case of the 1990 U.S. Census.” Public Opinion Quarterly 57 (4): 465–82.
Sousa-Poza, A., and F. Henneberger. 2000. “Wage Data Collected by Telephone Interviews: An Empirical Analysis of the Item Nonresponse Problem and Its Implications for the Estimation of Wage Functions.” Revue Suisse d’Economie Politique et de Statistique 136 (1): 79–98.
Stoutenbourgh, J.W. 2008. “Demographic Measures.” In Encyclopedia of Survey Research Methods, edited by P.J. Lavrakas, Vol. 1. Thousand Oaks, CA: Sage Publications, Inc.
Teclaw, R., M.C. Price, and K. Osatuke. 2012. “Demographic Question Placement: Effect on Item Response Rates and Means of a Veterans Health Administration Survey.” Journal of Business and Psychology 27 (3): 281–90.
Tourangeau, R., L. Rips, and K. Rasinski. 2000. The Psychology of Survey Response. Cambridge: Cambridge University Press.
Tourangeau, R., and T. Yan. 2007. “Sensitive Questions in Surveys.” Psychological Bulletin 133 (5): 859–83.
Whitley, B.E. 1996. Principles of Research in Behavioral Science. 2nd ed. New York: McGraw-Hill.
Yan, T., R. Curtin, and M. Jans. 2010. “Trends in Income Nonresponse over Two Decades.” Journal of Official Statistics 26 (1): 145–64.
Yancey, A.K., A.N. Ortega, and S.K. Kumanyika. 2006. “Effective Recruitment and Retention of Minority Research Participants.” Annual Review of Public Health 27:1–28.
