Articles
Vol. 10, Issue 1, 2017 · December 31, 2016 EDT

Improving survey response rates: The effect of embedded questions in web survey email invitations

Mingnan Liu and Nick Inchausti
Keywords: response rate, embedded question, email invitation, experiment, web survey
https://doi.org/10.29115/SP-2017-0005
Survey Practice
Liu, Mingnan, and Nick Inchausti. 2017. “Improving Survey Response Rates: The Effect of Embedded Questions in Web Survey Email Invitations.” Survey Practice 10 (1). https://doi.org/10.29115/SP-2017-0005.

Abstract

Survey response rate is one of the most critical survey data quality indicators. Many research efforts have been devoted to exploring new ways to improve response rates, especially for web surveys. When inviting participants to web surveys through email, the email invitation is the first point of contact with the survey participants. Several previous studies have examined the impact of email content and design on response rates. However, to our knowledge, none of them tested the effect of presenting a survey question within the email itself. In this study, we report findings from a web survey experiment focused on embedding the first survey question within the email invitation. In this condition, survey takers can see and answer the first question of the survey directly within the email invitation. The results show that, compared to the standard email invite (a link without any survey questions shown), the embedded-question email invite improves the email click rate and the survey completion rate, at a small cost to the survey drop-out rate. Additionally, the responses to the first question of the survey showed no difference between the embedded and standard email conditions. The implications of this study and future research directions are also discussed.

Introduction

Inviting survey participants through email is one of the most widely used methods in data collection. Several studies have tested different ways of drafting the email in order to maximize the survey response rate. An earlier study examined the impact of personalization, the email address of the sender, the authority of the email signatory, and the profile of the requesting office on response rate and found that none of them had a significant impact (Porter and Whitcomb 2003). In another study, however, personalization was shown to improve the email click rate and completion rate (Heerwegh 2005). Another study by Whitcomb and Porter examined the impact of background color (white vs. black) and the use of header (simple vs. complex) and found that a white background and simple header yielded higher response rates than the other conditions (Whitcomb and Porter 2004). Mentioning the purpose of the email (requesting survey participation) and the sponsor of the survey in the email subject line also had an impact on survey participation (Porter and Whitcomb 2005). Several other factors, including the length of the email, placement of the URL, and the estimated time of the survey, have also been explored in survey experiments in order to improve survey participation and response rate (Kaplowitz et al. 2012; Keusch 2012; Trouteaud 2004).

In this study, we report findings from a web survey experiment on a new design feature for email invitations. As described below, we included the first question of the survey inside the email invitation sent to the survey participants. We evaluated the success of the experiment on several metrics, including the email click rate, completion rate, and response to the first question.

Methods

The experiment was conducted on the SurveyMonkey platform among a group of SurveyMonkey customers who agreed to participate in research projects and provided their email addresses. In total, 8,876 emails inviting customers to participate in a research project (i.e., responding to a survey) were sent on July 27, 2016. A reminder was sent out four days after the initial invite; its content was identical to the initial email. The survey was closed on August 8, 2016. An identical survey was sent to all participants. The survey contained 13 questions asking about customers’ experience with the survey platform, their satisfaction, and additional features.

The participants were randomly assigned to one of two email conditions. In Condition 1, the email started with a short message requesting participation in a customer feedback survey (Figure 1). Immediately following the message, the first question of the survey was presented in the email. The question asked, “How likely is it that you would recommend SurveyMonkey to a friend or colleague?” using a scale of 0 (Not at all likely) to 10 (Extremely likely). This question is usually referred to as the Net Promoter Score (NPS) question (Reichheld 2010). By clicking on one of the answer options, respondents were directed to the survey webpage with their answer to the NPS question already registered.

Figure 1   Screenshot email invite for Condition 1: first question embedded.
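The mechanism described above, where clicking an answer option both records the response and opens the survey, can be implemented by rendering each of the 11 answer options as a hyperlink whose URL carries the chosen value. The sketch below is a minimal illustration in Python, not SurveyMonkey's actual implementation; the collector URL and the `token`/`nps` query parameters are hypothetical stand-ins.

```python
from urllib.parse import urlencode

# Hypothetical survey collector URL; the real SurveyMonkey link format
# is not documented in this article.
COLLECTOR_URL = "https://example.com/r/customer-feedback"

def embedded_nps_html(respondent_token: str) -> str:
    """Build an HTML fragment for an email with the NPS question embedded.

    Each 0-10 option is a link that pre-registers the answer via a query
    parameter and then lands the respondent on the rest of the survey.
    """
    cells = []
    for score in range(11):
        query = urlencode({"token": respondent_token, "nps": score})
        cells.append(f'<td><a href="{COLLECTOR_URL}?{query}">{score}</a></td>')
    return (
        "<p>How likely is it that you would recommend SurveyMonkey "
        "to a friend or colleague?</p>"
        "<table><tr>" + "".join(cells) + "</tr></table>"
        "<p>0 = Not at all likely &nbsp;&nbsp; 10 = Extremely likely</p>"
    )

print(embedded_nps_html("abc123"))
```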

Condition 2 was the standard email invite (Figure 2) with the same short message as Condition 1. However, the NPS question was not included in the email invite. Instead, respondents had to click on the “Begin survey” button to start the survey on the survey webpage.

Figure 2   Screenshot email invite for Condition 2: standard email.

Results

For the embedded email condition, 4,436 emails were sent and 103 of them were invalid (opted out or bounced), which resulted in 4,333 valid emails. For the standard condition, 4,440 emails were sent, of which 93 were invalid, leaving 4,347 valid emails. As Figure 3 shows, the email invite click rate was 32.0 percent for the embedded condition and 26.2 percent for the standard condition, a statistically significant difference (t=5.93, p<0.001). This means that respondents in the embedded condition were much more likely to click on the embedded question and start the survey than respondents in the standard condition were to click on the “Begin survey” button.

Figure 3   Email invite click rate and completion rate by experimental condition.

Figure 3 also shows the final completion rate for the two conditions. The completion rate for the embedded condition was 29.1 percent, significantly higher than the completion rate for the standard condition, which was 24.4 percent (t=4.99, p<0.001).
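As a sketch of how these comparisons can be computed, the snippet below runs a standard two-proportion test (a z-test, essentially equivalent for large samples to the t-statistics reported above) on counts back-calculated from the published rates and valid-email totals; the counts are therefore approximations, not the raw data. The same function applies to the conditional completion comparison reported below.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two proportions (pooled SE)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, 2 * (1 - NormalDist().cdf(abs(z)))

# Valid emails per condition, from the Results section.
n_emb, n_std = 4333, 4347

# Clicks back-calculated from the reported 32.0% and 26.2% rates (approximate).
print(two_proportion_ztest(round(0.320 * n_emb), n_emb,
                           round(0.262 * n_std), n_std))   # z roughly 5.9, p < 0.001

# Completions back-calculated from the reported 29.1% and 24.4% rates.
print(two_proportion_ztest(round(0.291 * n_emb), n_emb,
                           round(0.244 * n_std), n_std))   # z roughly 5.0, p < 0.001
```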

In addition, we also examined the completion rate among those who clicked on the survey email invite and started the survey. The rates for the embedded and standard conditions were 90.8 percent and 92.8 percent, respectively (t=1.78, p=0.07). In other words, respondents in the embedded condition were about 2 percentage points more likely to drop out of the survey than those in the standard condition, although this difference was not statistically significant.

Lastly, we examined the response to the first NPS question by condition. NPS has 11 response options (0–10). Typically, respondents are grouped into three categories, namely promoters (9–10), passives (7–8), and detractors (0–6). Then, the NPS score is calculated as follows:

NPS = (% Promoters − % Detractors) × 100
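As a worked example of this formula, the short function below computes an NPS from a list of 0–10 responses; the sample responses are invented for illustration, since the study's actual distribution is proprietary.

```python
def net_promoter_score(responses):
    """NPS = (% promoters - % detractors) x 100, with responses on a 0-10 scale."""
    n = len(responses)
    promoters = sum(1 for r in responses if r >= 9)    # scores 9-10
    detractors = sum(1 for r in responses if r <= 6)   # scores 0-6
    return (promoters / n - detractors / n) * 100

# Hypothetical responses, for illustration only.
print(net_promoter_score([10, 9, 9, 8, 7, 7, 6, 5, 3, 10]))  # 4 promoters, 3 detractors -> 10.0
```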

We are not able to release the NPS scores for these proprietary data. However, the ratio of the NPS scores between the embedded and standard email invites was 0.98, suggesting the responses to the first question were almost identical across the two conditions.

We also compared the response to the first question in a more standard way, that is, by examining the distribution between the two conditions. We calculated the ratio of the two conditions for each category. The ratio for the promoters is 0.976. The ratios for the passives and detractors are 1.089 and 0.911, respectively. A chi-square test showed that the distributions of this question were not significantly different between the two conditions (χ2=1.85, p=0.39).
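The chi-square comparison above can be reproduced mechanically from a 2×3 table of promoter/passive/detractor counts per condition. Because the study's actual counts are proprietary, the table below uses made-up counts purely to show the procedure.

```python
from scipy.stats import chi2_contingency

# Hypothetical promoter / passive / detractor counts per condition;
# the study's real counts are not released.
table = [
    [620, 310, 330],  # embedded condition
    [600, 290, 350],  # standard condition
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")
```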

Discussion

In this study, we reported findings from a survey experiment testing whether embedding a survey question in the email invitation can improve the survey response rate. In the embedded condition, the first question of the survey appeared in the email invite. Both the survey click-through rate and the completion rate were higher in the embedded condition than in the standard condition. There was a slight increase in the drop-out rate in the embedded condition as well, but it was very small and not statistically significant. The responses to the first (embedded) question were very similar in both conditions, suggesting that embedding the question did not change the question’s measurement properties. Given these findings, embedding a survey question in an email invite improves the survey response rate without any apparent disadvantages. One additional advantage of embedding the question in the email is that even if respondents drop out of the survey, their answer to the first question is still recorded. In the standard condition, if respondents drop out before completing the first page, all data are lost.

In addition to the practical implications, this study also opens up a few other research opportunities. First, this study embedded only one question in the email. Future studies should explore how the number of questions embedded in the email affects survey participation. Second, survey length may interact with the effect of embedding questions, and this should be examined through experiments. Third, other question types should also be examined as embedded questions in future research. Fourth, the survey population in this study is unique, and future research should replicate the experiment among other survey populations.

Disclaimer

The research was conducted when Mingnan Liu was at SurveyMonkey, before he joined Facebook.

References

Heerwegh, D. 2005. “Effects of Personal Salutations in E-Mail Invitations to Participate in a Web Survey.” Public Opinion Quarterly 69 (4): 588–98.
Kaplowitz, M.D., F. Lupi, M.P. Couper, and L. Thorp. 2012. “The Effect of Invitation Design on Web Survey Response Rates.” Social Science Computer Review 30 (3): 339–49.
Keusch, F. 2012. “How to Increase Response Rates in List-Based Web Survey Samples.” Social Science Computer Review 30 (3): 380–88.
Porter, S.R., and M.E. Whitcomb. 2003. “The Impact of Contact Type on Web Survey Response Rates.” Public Opinion Quarterly 67 (4): 579–88.
———. 2005. “E-Mail Subject Lines and Their Effect on Web Survey Viewing and Response.” Social Science Computer Review 23 (3): 380–87.
Reichheld, F. 2010. “The Ultimate Question.” http://dentalwealthbuilder.com/dwb-wp/wp-content/uploads/2014/05/DWB_The_Ultimate_Question.pdf.
Trouteaud, A.R. 2004. “How You Ask Counts: A Test of Internet-Related Components of Response Rates to a Web-Based Survey.” Social Science Computer Review 22 (3): 385–92.
Whitcomb, M.E., and S.R. Porter. 2004. “E-Mail Contacts: A Test of Complex Graphical Designs in Survey Research.” Social Science Computer Review 22 (3): 370–76.
