Undisclosed Privacy: The Effect of Privacy Rights Design on Response Rates

Roos Haer, University of Konstanz

Nadine Meidert1

Abstract

In this research note, we report a 2×2×2 experiment investigating how the design of privacy and confidentiality assurances affects respondents’ probability of participating in, or breaking off, a Web survey. We examined the effect of (1) emphasizing the privacy rights in the e-mail invitation, (2) placing the privacy rights in bold on the welcome screen, and (3) adding a checkbox with which respondents have to indicate that they have read and agreed to these assurances. We expected that the more these rights were emphasized, the less likely respondents would be to complete the survey. However, the way these assurances were presented in the e-mail invitation or on the welcome screen had no significant influence on the response rate.

Introduction

In general, Web surveys yield relatively low response rates compared to other modes of survey research (Cook et al. 2000; Couper 2000; Lozar Manfreda et al. 2008). Some scholars have attributed this low rate to increased public concerns about the confidentiality of the requested data and the violation of respondents’ privacy rights (Sills and Song 2002; see Singer et al. 1992 for theoretical insights).2

A common way to deal with these concerns is to reassure the respondent that confidentiality will be upheld and that privacy rights will be respected. These assurances are considered to be a crucial part of doing ethical research. Consequently, many countries have legal guidelines on the content of these assurances. However, these guidelines rarely include rules about how these assurances should be presented, which has resulted in a variety of privacy rights designs. To our knowledge, few scholars have researched how to present these assurances in such a way that respondents will not break off prematurely or fail to complete the survey.

The present study seeks to remedy this lack of research. Special attention is devoted to the effect of three ways of presenting privacy rights: (1) emphasizing the privacy rights in the e-mail invitation, (2) emphasizing these assurances on the welcome screen using a bold letter type, and (3) using a checkbox that respondents have to click to indicate that they have read these assurances.

Background

Survey researchers have always been concerned about the effect of survey introductions on the willingness of potential respondents to participate (Singer et al. 1992). Our study is by no means the first study to investigate the relationship between confidentiality and privacy right statements and the response rate. However, few studies have examined how to present these rights effectively in Web surveys.

Most of the early studies focused on the U.S. decennial census and were based on the assumption that privacy assurances were a ‘good’ thing: they increase response rates by overcoming respondents’ concerns (Singer et al. 1992, 1995). However, most of these early studies show the contrary: privacy assurances reduce respondents’ willingness to participate (Fay et al. 1991; Hillygus et al. 2006; Singer et al. 1993, 2003).

This negative effect has never been confirmed in Web survey research. This may be because survey researchers are ethically and legally obliged to give these assurances. In the United States, the Privacy Act statement determines the content of such assurances for surveys conducted by the government (Couper 2008). European laws have been even more responsive to privacy and confidentiality (Singer 2008). In Germany, after the 1983 census debacle, in which the census had to be postponed because of concerns about inadequate data protection, confidentiality and privacy rights have become highly salient and the rules have been strictly formulated (Flaherty 1989; Singer et al. 1992).

While survey researchers are legally and ethically bound to present these assurances, they know little about how different presentations of these assurances affect respondents’ willingness to participate. Consequently, Web surveys have been characterized by a variety of possible designs (see, for instance, Couper 2008). This variety of existing designs leads us to our research question: Do differences in design influence the response rate?

Design of the Experiments

To investigate how different presentations of confidentiality and privacy rights assurances affect respondents’ cooperation in participating in and completing surveys, we added experiments to a list-based sample survey of first-year students at the University of Konstanz, conducted by the university’s quality management unit. The main research questions were designed to evaluate why students chose to study in Konstanz and what problems they encountered at the beginning of their studies. Students were asked about their attitudes toward the university and their lectures, and about their intentions to break off their studies. The questionnaire was approved by the university and its legal advisory board. All first-year students who were enrolled in a study program at the university two weeks before the survey started were invited to the Web survey by an e-mail sent to their university account.

We tested the effects of three possible designs of presenting the privacy rights in a 2×2×2 experimental design. The first-year students were randomly assigned to receive one of two e-mail invitations. Once they clicked on the personalized survey link in the e-mail, they were randomly assigned to one of four further treatment groups. In the first experiment, we took into account the conclusion of Crawford et al. (2001) that the e-mail invitation plays a disproportionately important role in eliciting responses to a Web survey. Consequently, for one group (n=1311), we placed the privacy and confidentiality assurances not only on the welcome screen but also in the e-mail invitation itself. The other group (n=1312) received only a short version of these assurances in the invitation e-mail, along with the full assurances on the welcome screen. We expected that respondents who received the long e-mail invitation with the detailed assurances would be less likely to start the survey, since they would be primed to believe that the upcoming survey included sensitive questions for which privacy and confidentiality assurances are necessary (see also Haer and Meidert 2013).
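The two-stage randomization can be sketched as follows. This is our own illustrative reconstruction in Python, not the authors’ actual assignment procedure; the reported group sizes suggest the real assignment may have been balanced rather than an independent coin flip per person.

```python
import random

random.seed(42)  # for reproducibility of this illustration

# Hypothetical roster of the 2,623 invited first-year students.
students = [f"student_{i}" for i in range(2623)]

# Stage 1: every invitee is assigned to the short or the long e-mail
# invitation before the mailing goes out.
email_arm = {s: random.choice(["short", "long"]) for s in students}


def assign_welcome_screen() -> tuple:
    """Return (bold_privacy_rights, checkbox_required) for one respondent."""
    return random.choice([False, True]), random.choice([False, True])


# Stage 2: only respondents who click the personalized link are assigned
# to one of the four welcome-screen conditions (bold x checkbox).
clicked = random.sample(students, 1140)  # those who followed the link
screen_arm = {s: assign_welcome_screen() for s in clicked}
```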

In the second experiment, one group (n=536) saw the privacy rights in bold on the welcome screen, while the other group (n=505) received the same rights without the bold letter type. We expected that emphasizing these assurances would make respondents aware of possible problems with these rights and therefore less likely to start the Web survey.

In the third experiment, one group (n=498) was given a welcome screen on which respondents had to mark a checkbox, indicating that they had read the privacy rights and confidentiality assurances, before they could proceed to the Web survey. The other group (n=543) received the same assurances on the welcome screen without a checkbox and could proceed simply by pressing the next button. We expected that respondents who received the screen with the checkbox would be more likely to break off, not only because the checkbox would likely trigger a negative priming effect but also because marking it might be perceived as an extra burden. The more respondents are burdened, the more likely they are to refuse participation (Crawford et al. 2001). Screenshots of the welcome screens and the e-mail invitations can be found in Appendices I and II.

Implementation

After a pretest, in which we specifically tested the technical aspects of the survey, we sent out an e-mail invitation with a unique URL to the survey. This invitation was successfully delivered to 2,623 students on November 13, 2012. Five days later, a reminder was sent to those who had not yet participated. A final reminder was sent on November 30, 2012, to those who had not participated and to those who had not completed the survey. The reminders had the same structure as the initial invitation. The survey closed on December 17, 2012. In total, 1,140 students started the survey; 827 completed it entirely and 313 completed it partially. This yields a total response rate (i.e., including completed and partial questionnaires) of 43.5 percent using American Association for Public Opinion Research (AAPOR) RR2, and a response rate counting only completed questionnaires of 31.5 percent using AAPOR RR1 (American Association for Public Opinion Research 2011).
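The reported rates follow directly from these counts. A minimal sketch, under the simplifying assumption that all 2,623 invitees count as eligible cases (the full AAPOR formulas also handle refusals, non-contacts, and cases of unknown eligibility):

```python
def aapor_rr1(completes: int, eligible: int) -> float:
    """AAPOR RR1: completed interviews divided by all eligible cases."""
    return completes / eligible


def aapor_rr2(completes: int, partials: int, eligible: int) -> float:
    """AAPOR RR2: completed plus partial interviews divided by all eligible cases."""
    return (completes + partials) / eligible


invited, completes, partials = 2623, 827, 313

print(f"RR1 = {aapor_rr1(completes, invited):.1%}")            # 31.5%
print(f"RR2 = {aapor_rr2(completes, partials, invited):.1%}")  # 43.5%
```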

Results

Table 1 shows the number of respondents assigned to each group and the break-off rate directly on the welcome screen. Only 1,140 respondents clicked on the link in the e-mail to start the survey. The table also shows that the number of respondents per group differs slightly because of the random assignment procedure. Furthermore, the break-off rates differ across groups.

Table 1 n and overall break-off rates.

                     Short e-mail (n=1312)         Long e-mail (n=1311)
                     No bold type   Bold type      No bold type   Bold type
No checkbox          n=132          n=162          n=123          n=126
                     28.03%         20.37%         21.95%         16.67%
Checkbox             n=121          n=114          n=129          n=134
                     14.05%         14.91%         24.03%         23.13%

To analyze the data, we estimated Heckman selection models because of the two-stage structure of the experimental design (Heckman 1979). In the first stage, participation in the survey is modeled as a function of the invitation e-mail the respondent received. The resulting selection-correction term is then included as a regressor in the second-stage estimation, with the two other treatments as independent variables and break-off as the dependent variable.

Table 2 shows the two Heckman selection models. Model 1 displays the estimation with the dichotomous dependent variable break-off directly on the welcome screen. Model 2, on the other hand, reports the results with the dichotomous dependent variable overall break-off (respondents break off at any screen of the survey).

Table 2 Heckman selection models.

                                              Model 1      Model 2
First step estimation (dv: participation in the survey)
Long e-mail                                   –0.035       –0.035
                                              (0.050)      (0.050)
Constant                                      –0.204***    –0.204***
                                              (0.036)      (0.036)
Second step estimation (dv: break-off)
Checkbox                                      –0.003       –0.027
                                              (0.012)      (0.029)
Bold                                          –0.005       –0.032
                                              (0.0124)     (0.029)
Constant                                      –0.314       –0.552
                                              (0.694)      (1.635)
Lambda                                         0.354        0.833
                                              (0.735)      (1.732)
n                                              2524         2524

Reported are the coefficients and the standard errors in parentheses; ***significant at the 1 percent level; **significant at the 5 percent level; *significant at the 10 percent level.

When looking at the effect of the e-mail invitation in both models (i.e., putting the detailed privacy rights in the e-mail invitation vs. sending a short invitation with only a brief outline of these rights), the long e-mail has a negative effect on the probability of starting the survey. In other words, a detailed e-mail invitation decreases a respondent’s willingness to start the survey. Although the direction confirms our hypothesis, the effect is not statistically significant.

For the two other treatments (the use of a bold letter type and the checkbox), the table shows that both have a negative effect on the probability of breaking off prematurely. Hence, emphasizing the privacy rights in bold letter type and adding a checkbox to the welcome screen decrease an individual’s probability of breaking off, both directly on the welcome screen and on any later page. However, neither effect is strong or statistically significant. One possible explanation for this unexpected direction is that bolding the entire assurance signaled respondents to skip over that section, as it seemed less relevant; further research should examine this.

To ensure robustness, we also estimated a simple one-stage logistic regression and computed cross-tabulations with chi-squared tests. Additionally, we checked for two-way interaction effects in the one-stage logistic regression models with overall break-off as the dependent variable. Table 3 presents these results, which show that the combination of a long e-mail invitation and a checkbox on the welcome screen significantly increases a respondent’s overall break-off probability, whereas the combination of a long e-mail invitation and bold text does not.

Table 3 Logistic regression.

                                 Model 1       Model 2
Long e-mail                      –0.279         0.103
                                 (0.211)       (0.215)
Checkbox                         –0.629***     –0.166
                                 (0.231)       (0.155)
Bold                             –0.213        –0.215
                                 (0.154)       (0.219)
Long e-mail × checkbox            0.886***
                                 (0.317)
Long e-mail × bold                              0.0367
                                               (0.308)
Constant                         –1.049***     –1.227***
                                 (0.159)       (0.169)
n                                 1041          1041

Reported are the coefficients and the standard errors in parentheses; ***significant at the 1 percent level; **significant at the 5 percent level; *significant at the 10 percent level.

Overall, the above analyses and robustness checks suggest that minor differences in presenting the privacy rights do not affect respondents’ willingness to complete the survey, but that a combination of design features that increases the respondents’ burden can impact a survey’s response rate.

Conclusion

Concerns about privacy and confidentiality are among the reasons most often given by potential respondents for refusing to participate in surveys (Singer 2004). It is therefore standard practice to assure respondents of their privacy and confidentiality rights. However, the ways these assurances are presented differ considerably. This study is the first to examine these differences and how they influence participants’ willingness to complete the survey. Our results show that emphasizing these rights by placing them in bold on the welcome screen or by presenting a checkbox does not influence the response rate. Additionally, putting the assurances in the invitation e-mail does not seem to have much effect either. However, a combination of different design features of the privacy rights can increase the respondents’ burden and thereby influence the survey’s response rate.

This study has several limitations that should be addressed in future work. First, the experiments were restricted to a survey whose content was not very sensitive, and they were intended as a general test of the importance of the design of privacy and confidentiality assurances. Second, the study was conducted on a nonprobability sample of a very specific population: first-year students. Our respondents are likely to have a heightened interest in the topic compared to respondents in other surveys and might therefore be more likely to start and complete the survey. Future research should therefore examine how these assurances affect response rates in other settings.

Acknowledgement

We thank Christine Abele and Elena Sewelies for their support with the data gathering and for their collaboration.

References

American Association for Public Opinion Research. 2011. Standard definitions: final dispositions of case codes and outcome rates for surveys. Available at: http://www.aapor.org/AM/Template.cfm?Section=Standard_Definitions2&Template=/CM/ContentDisplay.cfm&ContentID=3156.
Cook, C., F. Heath and R.L. Thompson. 2000. A meta-analysis of response rates in web- or Internet-based surveys. Educational and Psychological Measurement 60(6): 821–836.
Couper, M.P. 2000. Web surveys: a review of issues and approaches. Public Opinion Quarterly 64(4): 464–494.
Couper, M.P. 2008. Designing effective web surveys. Cambridge University Press, Cambridge.
Crawford, S.D., M.P. Couper and M.J. Lamias. 2001. Web surveys: perceptions of burden. Social Science Computer Review 19(2): 146–162.
Fay, R.F., N. Bates and J. Moore. 1991. Lower mail response in the 1990 Census: a preliminary interpretation. In Proceedings of the Annual Research Conference of the U.S. Census Bureau. Census Bureau, Washington, DC.
Flaherty, D.H. 1989. Protecting privacy in surveillance societies. University of North Carolina Press, Chapel Hill.
Haer, R. and N. Meidert. 2013. Does the first impression count? The impact of Web survey’s welcome screen on response rates. Survey Methodology Journal 39(2): 419–434.
Heckman, J. 1979. Sample selection bias as a specification error. Econometrica 47(1): 153–161.
Hillygus, S., N.H. Nie, K. Prewitt and H. Pals. 2006. The hard count: the political and social challenges of census mobilization. Russell Sage Foundation, New York.
Lozar Manfreda, K., M. Bosnjak, J. Berzelak, I. Haas and V. Vehovar. 2008. Web surveys versus other survey modes: a meta-analysis comparing response rates. International Journal of Market Research 50(1): 79–104.
Sills, S.J. and C. Song. 2002. Innovations in survey research: an application of Web-based surveys. Social Science Computer Review 20: 22–30.
Singer, E. 2004. Risk, benefit, and informed consent in survey research. Survey Research 35: 1–6.
Singer, E. 2008. Ethical issues in surveys. In: (E.D. de Leeuw, J.J. Hox and D.A. Dillman, eds.) International handbook of survey methodology. Lawrence Erlbaum Associates, New York.
Singer, E., H. Hippler and N. Schwarz. 1992. Confidentiality assurances in surveys: reassurance or threat? International Journal of Public Opinion Research 4(3): 256–268.
Singer, E., N.A. Mathiowetz and M.P. Couper. 1993. The impact of privacy and confidentiality on survey participation: the case of the 1990 U.S. Census. Public Opinion Quarterly 57(4): 465–482.
Singer, E., D.R. von Thurn and E.R. Miller. 1995. Confidentiality assurances and response: a quantitative review of the experimental literature. Public Opinion Quarterly 59(1): 66–77.
Singer, E., J. van Hoewyk and R.J. Neugebauer. 2003. Attitudes and behavior: the impact of privacy and confidentiality concerns on participation in the 2000 Census. Public Opinion Quarterly 67(3): 368–384.

Appendix I. E-mail Invitation with the Long Version Privacy Rights

[German version omitted]

=== English version ===

Dear Mrs. Example,

You are now in your first or second semester of your course of studies at the University of Konstanz. In these first few weeks, you have gotten to know your department, the university, and the city. We would like to know about your experience with the university advisory services, the support received from the lecturers and your overall expectations. Therefore, we would like to invite you to participate in our First Year Students Survey. Filling in this survey will help us to improve the study conditions of the University of Konstanz.

Please click on the following link to the survey (English version):

[Personalized link]

If you cannot enter the survey via the link, please copy and paste the address in your Web browser.

Participation in this survey is voluntary. The survey is subject to the data protection regulations. The collected information will be used only for scientific purposes and within the scope of the Quality Management Unit of the University of Konstanz. Your answers to the questions will not be linked to your personal information by the staff of the Quality Management Unit. Your contact details will not be saved in the dataset together with your answers, in order to avoid linking your e-mail address to the answers you give. In theory, however, it is possible that your identity could be inferred from a combination of your answers. The staff of the Quality Management Unit does not have the organizational capacity to do so, and the University of Konstanz has pledged not to do so, even where such inference would be possible. The survey results will be analyzed only statistically. No individual cases will be reported; only group comparisons will be presented (for example, men vs. women), so that individual answers cannot be inferred. The results will be published only in anonymized form. Please note that you can always skip particular questions or indicate that you do not want to answer or do not know the answer to a question.

For questions and comments, please e-mail our survey team:

[E-mail address].

Thank you for your participation and we wish you all the best with your course of studies!

Kind regards

[Name]

Vice Rector, Study Programs

Appendix II. Welcome Screen with Bold Text on Privacy Rights and Checkbox

Welcome to the First Year Students Survey of the University of Konstanz!

You are in your first or second semester of your course of studies at the University of Konstanz. We would like to ask you about your experiences with the university advisory services, the support received from the lecturers and your overall expectations. The questions are therefore solely focused on your current studies and your major.

You can answer the questions by clicking or by typing short entries. The survey will take around 15 minutes. The Continue button will take you to the next page of the survey.

Note on privacy rights:

The collected information will be used only for scientific purposes and within the scope of the Quality Management Unit of the University of Konstanz. Participation in this study is voluntary. Your answers to the questions will not be linked to your personal information by the staff of the Quality Management Unit. Your contact details will not be saved in the dataset together with your answers, in order to avoid linking your e-mail address to the answers you give. In theory, however, it might be possible in some cases to infer your identity from a combination of your answers. The staff of the Quality Management Unit does not have the organizational capacity to do so, and the University of Konstanz has pledged not to do so, even where such inference would be possible. The survey results will be analyzed only statistically. No individual cases will be reported; only group comparisons will be presented (for example, men vs. women), so that individual answers cannot be inferred. The results will be published only in anonymized form. Please note that you can always skip particular questions or indicate that you do not want to answer or do not know the answer to a question.

[Checkbox] I have read and accept the privacy rights.

For questions and comments, please e-mail our survey team:@uni-konstanz.de.

Thank you for your participation in the survey!

Footnotes
1 The authors’ names appear in alphabetical order. They contributed equally to the article.
2 Note that confidentiality and privacy rights are two distinct concepts, but respondents often treat them as one and the same. This study therefore uses privacy rights as the term connoting both concepts.

