Introduction
In general, Web surveys are associated with relatively low response rates compared to other modes of survey research (Cook, Heath, and Thompson 2000; Couper 2000; Lozar Manfreda et al. 2008). Some scholars have attributed these low rates to increased public concern about the confidentiality of the data requested and the violation of respondents' privacy rights (Sills and Song 2002; see also Singer, Hippler, and Schwarz 1992 for some theoretical insights).[1]
A common way to deal with these concerns is to assure respondents that confidentiality will be upheld and that their privacy rights will be respected. These assurances are considered a crucial part of ethical research, and many countries consequently have legal guidelines on their content. However, these guidelines rarely include rules about how the assurances should be presented, which has resulted in a variety of designs for privacy rights statements. To our knowledge, few scholars have investigated how to present these assurances in such a way that respondents do not break off prematurely or fail to complete the survey.
The present study seeks to remedy this lack of research. In doing so, special attention is devoted to the effect of three ways of presenting privacy rights: (1) emphasizing the privacy rights in the e-mail invitation, (2) emphasizing these assurances on the welcome screen using a bold typeface, and (3) using a checkbox that the respondent has to click to indicate that he or she has read these assurances.
Background
Survey researchers have always been concerned about the effect of survey introductions on the willingness of potential respondents to participate (Singer, Hippler, and Schwarz 1992). Our study is by no means the first study to investigate the relationship between confidentiality and privacy right statements and the response rate. However, few studies have examined how to present these rights effectively in Web surveys.
Most of the early studies focused on the U.S. decennial census and were based on the assumption that privacy assurances were a 'good' thing: they increase response rates by overcoming respondents' concerns (Singer, Hippler, and Schwarz 1992; Singer, von Thurn, and Miller 1995). However, most of these early studies show the contrary: privacy assurances reduce respondents' willingness to participate (Fay, Bates, and Moore 1991; Hillygus et al. 2006; Singer, Mathiowetz, and Couper 1993; Singer, van Hoewyk, and Neugebauer 2003).
This negative effect has never been confirmed in Web survey research. This may be because survey researchers are ethically and legally obliged to give these assurances. In the United States, the Privacy Act statement determines the content of such assurances for surveys conducted by the government (Couper 2008). European laws have been even more responsive to privacy and confidentiality (Singer 2008). In Germany, after the 1983 census debacle, in which the census had to be postponed because of concerns about inadequate data protection, confidentiality and privacy rights have become highly salient issues and the rules governing them have been strictly formulated (Flaherty 1989; Singer, Hippler, and Schwarz 1992).
While survey researchers are legally and ethically bound to present these assurances, they know little about how different presentations of these assurances affect respondents' willingness to participate. Consequently, Web surveys have been characterized by a variety of possible designs (see, for instance, Couper 2008). This variety of existing designs leads us to our research question: Do differences in design influence the response rate?
Design of the Experiments
To investigate the importance of different presentations of confidentiality and privacy rights assurances for respondents' cooperation in participating in and completing surveys, we added experiments to a list-based sample survey of first-year students at the University of Konstanz conducted by the quality management unit of the university. The main research questions were formulated to evaluate why students chose to study in Konstanz and what problems they encountered at the beginning of their studies. Students were asked to provide information on their attitudes toward the university and their lectures, and about any intention to break off their studies. The questionnaire was approved by the university and its legal advisory board. All first-year students who were enrolled in a study program at the university two weeks before the survey started were invited to the Web survey by an e-mail sent to their university account.
We tested the effects of three presentation features of the privacy rights in a 2×2×2 experimental design. It is important to note that the first-year students were randomly assigned to receive one of the two e-mail invitations. Once they clicked on the personalized survey link in the e-mail, they were randomly assigned to one of the four other treatment groups. In the first experiment, we took into account the conclusion of Crawford, Couper, and Lamias (2001) that the e-mail invitation plays a disproportionately important role in eliciting responses to a Web survey. Consequently, for one group (n=1311) we placed the privacy and confidentiality assurances not only on the welcome screen but also in the e-mail invitation. The other group (n=1312) received only a short version of these assurances in the invitation e-mail and the full assurances on the welcome screen of the survey. We expected that respondents who received the long e-mail invitation with the detailed assurances would be less likely to start the survey, since they would be primed to believe that the upcoming survey included sensitive questions for which privacy and confidentiality assurances are necessary (see also Haer and Meidert 2013).
In the second experiment, one group saw the privacy rights in bold on the welcome screen (n=536), while the other group received the same rights without the bold typeface (n=505). We expected that emphasizing these assurances would make respondents aware of possible problems with these rights and thus less likely to start the Web survey.
In the third experiment, one group (n=498) was given a welcome screen on which respondents had to mark a checkbox indicating that they had read the privacy rights and confidentiality assurances before they could proceed to the Web survey. The other group (n=543) received the same assurances on the welcome screen without a checkbox and could proceed to the survey simply by pressing the next button. We expected that respondents who received the screen with the checkbox would be more likely to break off, not only because the checkbox draws additional attention to the assurances (a priming effect) but also because marking the checkbox might be perceived as an extra burden. The more respondents are burdened, the more likely they are to refuse participation (Crawford, Couper, and Lamias 2001). Screenshots of the welcome screens and the e-mail invitations can be found in Appendix I and II.
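To make the assignment procedure concrete, the following minimal sketch (in Python, with hypothetical identifiers; not the survey software actually used) illustrates how a 2×2×2 randomization of this kind can be implemented: the e-mail factor is assigned before mailing, and the two welcome-screen factors are assigned when the respondent first opens the personalized link.

```python
import random

random.seed(2012)  # hypothetical seed, for a reproducible illustration

# Hypothetical list of student identifiers (the real survey used
# personalized links generated by the survey software).
students = [f"student_{i:04d}" for i in range(1, 2624)]

# Factor 1 (assigned before mailing): long vs. short e-mail invitation
email_condition = {sid: random.choice(["long_email", "short_email"])
                   for sid in students}

def assign_welcome_screen():
    """Assigned when the respondent first opens the personalized link:
    bold privacy text (yes/no) x checkbox (yes/no)."""
    return {
        "bold_privacy_text": random.random() < 0.5,
        "privacy_checkbox": random.random() < 0.5,
    }

# Example: condition shown to one respondent who clicks the link
condition = assign_welcome_screen()
print(email_condition["student_0001"], condition)
```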
Implementation
After a pretest, in which we specifically tested the technical aspects of the survey, we sent out an e-mail invitation with a unique URL to the survey. This e-mail invitation was successfully sent to 2,623 students on November 13, 2012. Five days after this initial e-mail, a reminder was sent to those who had not yet participated in the survey. A final reminder was sent on November 30, 2012, to those who had not participated and to those respondents who had not completed the survey. The reminders had the same structure as the initial e-mail invitation. The survey closed on December 17, 2012. In total, 1,140 students started the survey, 827 completed the entire survey, and 313 completed it only partially. This gives a total response rate (i.e., including completed and partial questionnaires) of 43.5 percent using American Association for Public Opinion Research (AAPOR) RR2 and a response rate including only the completed questionnaires of 31.5 percent using AAPOR RR1 (American Association for Public Opinion Research 2011).
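The reported rates follow directly from these counts. The short sketch below (Python) reproduces the arithmetic, treating all successfully invited students as eligible sample units, which is the assumption implied by the list-based design.

```python
# Field outcomes reported above
invited = 2623    # successfully delivered e-mail invitations
complete = 827    # fully completed questionnaires
partial = 313     # partially completed questionnaires

rr1 = complete / invited               # AAPOR RR1: completes only
rr2 = (complete + partial) / invited   # AAPOR RR2: completes + partials

print(f"RR1 = {rr1:.1%}")  # 31.5%
print(f"RR2 = {rr2:.1%}")  # 43.5%
```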
Results
Table 1 shows the number of respondents assigned to each group and the break-off rate directly on the welcome screen. Only 1,140 respondents clicked on the link in the e-mail to start the survey. The table also shows that the number of respondents in each group differs slightly because of the random assignment procedure. Furthermore, the break-off rates differ across groups.
To analyze the data, we estimated Heckman selection models because of the two-stage structure of the experimental design (Heckman 1979). In the first stage, participation in the survey is modeled as a function of the invitation e-mail the respondent received. The resulting selection estimate then enters the second stage as a regressor, alongside the two other treatments as independent variables, with break-off as the dependent variable.
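For readers who want to reproduce this kind of analysis, the following is a minimal two-step sketch in Python under assumed variable and file names; it is not the authors' exact estimator (which may have been a full maximum-likelihood probit model with sample selection), but it follows the same logic: a selection equation for starting the survey, whose inverse Mills ratio is carried into the break-off equation.

```python
import pandas as pd
import statsmodels.api as sm
from scipy.stats import norm

# Hypothetical respondent-level data with one row per invited student:
#   long_email  1 = detailed assurances included in the e-mail invitation
#   started     1 = respondent reached the welcome screen
#   bold, checkbox, breakoff  observed only for respondents who started
df = pd.read_csv("firstyear_survey.csv")  # hypothetical file name

# Stage 1 (selection): probit of starting the survey on the e-mail treatment
X_sel = sm.add_constant(df[["long_email"]])
selection = sm.Probit(df["started"], X_sel).fit()

# Inverse Mills ratio from the selection equation
xb = selection.fittedvalues  # linear predictor X*beta
df["imr"] = norm.pdf(xb) / norm.cdf(xb)

# Stage 2 (outcome): break-off among starters, with the IMR as an
# additional regressor alongside the two welcome-screen treatments
starters = df[df["started"] == 1]
X_out = sm.add_constant(starters[["bold", "checkbox", "imr"]])
outcome = sm.Probit(starters["breakoff"], X_out).fit()
print(outcome.summary())
```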
Table 2 shows the two Heckman selection models. Model 1 displays the estimation with the dichotomous dependent variable break-off directly on the welcome screen. Model 2, on the other hand, reports the results with the dichotomous dependent variable overall break-off (respondents break off at any screen of the survey).
Looking at the effect of the e-mail invitation in both models (i.e., putting the detailed privacy rights in the e-mail invitation vs. sending a short e-mail invitation with only a brief outline of these rights), the long e-mail has a negative effect on the probability of starting the survey. In other words, a detailed e-mail invitation decreases a respondent's willingness to start the survey. Although the direction confirms our hypothesis, the effect is not statistically significant.
For the two other treatments (the use of a bold typeface and the checkbox), the table shows a negative effect of these treatments on the probability of breaking off prematurely. Hence, privacy rights emphasized in bold and a checkbox on the welcome screen decrease an individual's probability of breaking off, both directly on the welcome screen and on any other page. It might be that bolding the entire assurance signaled to respondents that the section could be skipped, as it seemed less relevant. Further research should, however, examine this possible explanation for the unexpected results. In any case, neither effect is strong or statistically significant.
To ensure robustness, we also estimated a simple one-stage logistic regression and calculated cross-tabulations with chi-squared tests. Additionally, we checked for two-way interaction effects in the one-stage logistic regression models with overall break-off as the dependent variable. Table 3 presents these results, which show that the combination of a long e-mail invitation and a checkbox on the welcome screen significantly increases a respondent's overall break-off probability, whereas the combination of a long e-mail invitation and bold text does not.
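An interaction specification of this kind can be written compactly with a model formula; the sketch below (Python, statsmodels formula API, hypothetical variable and file names) shows the one-stage logistic regression of overall break-off with the two-way interactions described above. It is illustrative only and does not reproduce the authors' exact model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical respondent-level data, restricted to those who started
df = pd.read_csv("firstyear_survey.csv")  # hypothetical file name
starters = df[df["started"] == 1]

# One-stage logistic regression of overall break-off on the treatments,
# including two-way interactions with the e-mail condition
model = smf.logit(
    "breakoff_overall ~ long_email * checkbox + long_email * bold",
    data=starters,
).fit()
print(model.summary())
```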
Overall, the above analyses and robustness checks suggest that minor differences in presenting the privacy rights do not affect respondents' willingness to complete the survey, but that a combination of features that increases respondents' burden can impact a survey's response rate.
Conclusion
Concerns about privacy and confidentiality are among the reasons most often given by potential respondents for refusing to participate in surveys (Singer 2004). It is therefore standard practice to assure respondents of their privacy and confidentiality rights. However, the ways these assurances are presented differ considerably. This study is among the first to examine these differences and how they influence participants' willingness to complete the survey. Our results show that emphasizing these rights by placing them in bold on the welcome screen or by presenting a checkbox does not influence the response rate. Putting the assurances in the invitation e-mail does not seem to have much effect either. However, a combination of different design features of the privacy rights can increase the respondent's burden and thereby influence the survey's response rate.
This study has several limitations that need to be addressed in further studies. First, the experiments were restricted to a survey whose content was not particularly sensitive, and they were intended as a general test of the importance of the design of the privacy and confidentiality assurances. Second, the study was conducted on a nonprobability sample of a very specific population: first-year students. Our respondents are likely to have a heightened interest in the topic compared to respondents in other surveys and might therefore be more likely to start the survey and complete it. Future research should therefore also examine how these assurances affect response rates in other settings.
Acknowledgement
We thank Christine Abele and Elena Sewelies for their support with the data gathering and for their collaboration.
Note
The authors’ names appear in alphabetical order. They contributed equally to the article.
Appendix I. E-mail Invitation with the Long Version Privacy Rights
=== English version see below ===
[German version]
=== English version ===
Dear Mrs. Example,
You are now in your first or second semester of your course of studies at the University of Konstanz. In these first few weeks, you have gotten to know your department, the university, and the city. We would like to know about your experience with the university advisory services, the support received from the lecturers, and your overall expectations. Therefore, we would like to invite you to participate in our First Year Students Survey. Filling in this survey will help us to improve the study conditions at the University of Konstanz.
Please click on the following link to the survey (English version):
[Personalized link]
If you cannot enter the survey via the link, please copy and paste the address in your Web browser.
Participation in this survey is voluntary. The survey is subject to the data protection regulations. The collected information will only be used for scientific purposes and within the scope of the Quality Management Unit of the University of Konstanz. Your answers to the questions will not be linked to your personal information by the staff of the Quality Management Unit. Your contact details will not be saved in the same dataset as your answers, so that your e-mail address cannot be linked to the answers you give. However, in theory it is possible that your identity could be inferred from a combination of your answers. The staff of the Quality Management Unit, however, does not have the organizational capacity to do so. In addition, the University of Konstanz has promised not to do so, even if it were possible to infer your identity on the basis of your answers. The survey results will only be analyzed statistically. No individual cases will be reported. Only group comparisons will be presented (for example, men vs. women), so individual answers cannot be inferred. The results will only be published in anonymized form. Please note that you can always skip particular questions or indicate that you do not want to answer or do not know the answer to a question.
For questions and comments, please e-mail our survey team:
[E-mail address].
Thank you for your participation and we wish you all the best with your course of studies!
Kind regards
[Name]
Vice Rector, Study Programs
Appendix II. Welcome Screen with Bold Text on Privacy Rights and Checkbox
Welcome to the First Year Students Survey of the University of Konstanz!
You are in your first or second semester of your course of studies at the University of Konstanz. We would like to ask you about your experiences with the university advisory services, the support received from the lecturers and your overall expectations. The questions are therefore solely focused on your current studies and your major.
You can answer the questions by clicking or by giving short entries. The survey will take around 15 minutes. The Continue button will bring you to the next page of the survey.
Note on privacy rights:
The collected information will only be used for scientific purposes and within the scope of the Quality Management Unit of the University of Konstanz. Participation in this study is voluntary. Your answers to the questions will not be linked to your personal information by the staff of the Quality Management Unit. Your contact details will not be saved in the same dataset as your answers, so that your e-mail address cannot be linked to the answers you give. However, in theory it might be possible in some cases for your identity to be inferred from a combination of your answers. The staff of the Quality Management Unit, however, does not have the organizational capacity to do so. In addition, the University of Konstanz has promised not to do so, even if it were possible to infer your identity on the basis of your answers. The survey results will only be analyzed statistically. No individual cases will be reported. Only group comparisons will be presented (for example, men vs. women), so individual answers cannot be inferred. The results will only be published in anonymized form. Please note that you can always skip particular questions or indicate that you do not want to answer or do not know the answer to a question.
I read and accept the privacy rights
For questions and comments, please e-mail our survey team: [e-mail address]@uni-konstanz.de.
Thank you for your participation in the survey!
[1] Note that confidentiality and privacy rights are two distinct concepts, but respondents often treat them as one and the same. This study therefore uses privacy rights as the term connoting both concepts.