Survey practitioners have spent decades conducting experiments to better understand why people respond to surveys. Although using the internet to conduct surveys is not new, researchers are still exploring why people respond to web surveys. Many factors can influence a person's decision to take a web survey. As Keusch (2015) notes in his meta-analysis, characteristics of the person, societal-level factors, attributes of the survey design, and questionnaire design all affect whether a person participates. As the use of the web to administer surveys increases, it is important to understand how to effectively use the email subject line and email invitation when communicating with potential respondents.
The subject line plays an important role in the decision-making process individuals undergo when deciding whether to open an email. A person uses the subject line as a filtering mechanism, deciding whether to open the email based on the information it provides (Wainer, Dabbish, and Kraut 2011). An experiment among Michigan State University (MSU) students found that an authoritative subject line (a specific request from the MSU Vice President) yielded higher response rates than a subject line without an authoritative tone (Kaplowitz et al. 2012). Pleas for help, provided they did not sound too desperate, were found to improve response rates (Trouteaud 2004). Sappleton and Lourenço (2016) tested whether a blank subject line could increase email open rates by creating a sense of curiosity. Their experiment found that blank and non-blank subject lines performed similarly in influencing a person to open the email.
A person has two options after opening an email: deleting it without reading it or evaluating the request (Keusch 2015). If the person decides to evaluate the request, its content can influence whether the person takes the survey. Much of the literature on email invitations focuses on how the content is structured. For example, several studies reached conflicting results on the use of personalization in email messages. An experiment among first-year college students found that personalization in email invitations increased the response rate by 7.8 percentage points (Heerwegh 2005). In a different experiment among college students, Trespalacios and Perkins (2016) found that neither the degree of personalization nor the length of the invitation affected survey response or completion. Sauermann and Roach (2013) found that including the recipient's name increased response rates. However, they cautioned that their study was limited to their sample of scientists and engineers working in the United States, and that experiments among different populations could yield different results. Additionally, survey participants in these experiments had an existing relationship with the sponsor, and recognition of the sponsor and perceptions of how the results would be used could influence the decision to participate (McCarthy, Johnson, and Ott 1999).
There is a substantial amount of research on communicating with prospective respondents to increase response rates. However, much of this research uses Social Exchange theory as its foundation, and experiments assessing the effectiveness of alternative communication strategies have not been as widely explored. This paper focuses on two communication strategies that mirror research conducted by Greenberg and Dillman (2023). In their experiment, they used mail to solicit survey responses from households in southwest West Virginia, testing two communication theories – Social Exchange theory and Pre-Suasion theory – against each other to assess whether one produced higher response rates or differed in item nonresponse. This research mirrors their work but uses email as the mode of contact. The goal is to better understand whether one of these communication frameworks increases the likelihood of a person opening an email and, ultimately, deciding to join a business survey panel.
About the Communication Theories in the Experiment
Social Exchange theory was adapted for survey researchers by Dillman, Smyth, and Christian (2014) to connect multiple design issues – which had often been dealt with in isolation – in order to obtain high survey response rates. The theory holds that a respondent is more likely to comply with a survey request if they believe and trust that the benefits of taking the survey will exceed the costs of complying. Survey professionals have used aspects of Social Exchange theory in their surveys; however, Greenberg and Dillman (2023) argue that Social Exchange has not been experimentally tested against another general approach to assess whether one performs better than the other.
Pre-Suasion theory, developed by Robert Cialdini (2016), guides people through a sequence of actions to influence their decisions. A tenet of Pre-Suasion theory is establishing personal trust. Social Exchange theory also relies on trust, but the trust in Social Exchange theory centers on fulfilling promises made to the respondent (for example, providing incentives or assuring that the information will be used to make a decision). Pre-Suasion theory also states that the requester needs to command continued attention, such as by using a "magnetizer" to keep the person's focus. Before asking the person to make a decision (i.e., join a survey panel), the requester needs to create a privileged moment. The purpose of the privileged moment is to put the recipient in the right mindset before they are asked to take an action. Once the privileged moment is established, the requester can transition to the request.
Email Communications
The Social Exchange email in this experiment follows the structure set forth in Greenberg and Dillman's experiment, which follows the tenets of that theory. The email, which can be found in the supplementary materials, opens with a clear statement of why I am reaching out ("I am reaching out to you about joining our Monthly Survey of Business Conditions"). It then describes the benefits of joining the survey panel ("The results of the survey provide critical input into monetary policy decisions"). The email aims to provide the relevant information a person needs to decide whether to join our survey panel. The subject line was meant to clearly explain the purpose of the email ("Survey of South Carolina Businesses").
The Pre-Suasion email, which can be found in the supplementary materials, opens with a "magnetizing" question, similar to Greenberg and Dillman's experiment ("What does the future hold for South Carolina businesses?"). The purpose of the magnetizing question is to gain the reader's attention. To build personal trust, our email conveys that the Richmond Federal Reserve cares about their business and understands what it is like to run a business in South Carolina ("From the Lowcountry all the way through the Upstate, we have heard from all types of businesses about the opportunities and challenges of running a business."). We create a privileged moment with the reader by stating that we represent South Carolina in monetary policy discussions and that those decisions have a direct impact on business conditions. The email then transitions to the request to join the business survey panel. The Pre-Suasion subject line uses the magnetizing question: "What Does the Future Hold for South Carolina Businesses?"
Background and Experimental Design
The Federal Reserve Bank of Richmond maintains a survey panel of manufacturing businesses in the Fifth Federal Reserve District, which includes Washington D.C., Maryland, North Carolina, South Carolina, Virginia, and West Virginia. This monthly survey provides the bank with real-time information about changes in economic conditions. One method of recruiting new manufacturing firms into the survey is by using commercial lists of businesses that contain email addresses.
The bank purchased a list of email addresses from a commercial vendor that specializes in the manufacturing sector. This experiment uses only South Carolina manufacturing firms to control for geography. Of the 3,526 firms in the sample, about 87 percent, or 3,064, had an email address. Of the 3,064 firms with an email address, about 65 percent, or 1,999 firms, had a personal business email address, such as joe.smith@company.com. The remaining 1,065 firms had a generic business email address, such as sales@company.com. Businesses were randomly assigned to one of six segments, each balanced to have a similar distribution of firm sizes, regions of South Carolina, and proportion of personal business email addresses. Segments five and six were held out from this experiment for future research. Thirty-four firms were removed from the sample because they were already part of the bank's survey panel. The final number of businesses in the experiment was 2,003. Email delivery was spread out between April 17 and April 21, 2023.
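As an illustration of the balancing step, below is a minimal sketch of one way such an assignment could be implemented, assuming a pandas DataFrame of firms with hypothetical firm_size, region, and email_type columns; the bank's actual procedure is not documented here.

```python
import numpy as np
import pandas as pd

def assign_segments(firms: pd.DataFrame, n_segments: int = 6, seed: int = 42) -> pd.DataFrame:
    """Randomly assign firms to segments, balancing firm size, region,
    and email address type by shuffling within each stratum and cycling
    through the segment labels."""
    rng = np.random.default_rng(seed)
    firms = firms.copy()
    firms["segment"] = 0
    for _, stratum in firms.groupby(["firm_size", "region", "email_type"]):
        shuffled = stratum.sample(frac=1, random_state=rng)
        firms.loc[shuffled.index, "segment"] = np.arange(len(shuffled)) % n_segments + 1
    return firms
```

Shuffling within each stratum before cycling through segment labels gives every segment a near-equal share of each firm-size, region, and email-type combination.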
To test the effect of both theories on email open rates and conversion rates, a 2x2 factorial design was used, crossing each subject line with each email invitation. Table 1 details the subject line and email invitation each segment received.
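The crossing itself is simple to enumerate. The sketch below lists the four cells of the design; the segment numbering is illustrative, and the actual mapping of segments to conditions is given in Table 1.

```python
from itertools import product

subject_lines = ["Social Exchange", "Pre-Suasion"]
invitations = ["Social Exchange", "Pre-Suasion"]

# Each experimental segment receives one combination of subject line
# and email invitation.
for segment, (subject, invitation) in enumerate(product(subject_lines, invitations), start=1):
    print(f"Segment {segment}: {subject} subject line, {invitation} invitation")
```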
Defining Email Success Metrics
The emails in this experiment were distributed through the Qualtrics platform. Once the emails were sent, the Qualtrics platform automatically generated an email summary report. This report tracks the outcome of each email. For example, if the email was opened by the recipient, the email summary report marked the email status as “Email Opened.” This summary report also tracks if the recipient clicked on the survey link within the email, allowing me to see if the recipient joined the survey panel. Additionally, the email summary report specifies if the email failed to reach the recipient (a definition of email failure is found below). For example, if the email address was not valid, the summary report marked the email status as “Email Hard Bounce” (referred to as “Email Failed” in this experiment).
There were instances when the summary report did not capture that an email failed to reach the recipient. Because the emails were addressed from me, I received an automated notification in my inbox when an email was "undeliverable." In these cases, the email summary report marked the email as "Email Sent," even though I received a notification that it was not delivered. Of the 237 emails that failed to reach the recipient, 168 were flagged by the email summary report. The remaining 69 emails – or 29.1 percent – were not flagged by the summary report, but I received an automated failure notice. Emails were coded as failures if the Qualtrics summary report marked the email as "Email Hard Bounce" or if I received an undeliverable email notification.
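This coding rule can be summarized in a small function. The sketch below mirrors the logic described above; the bounced_externally flag is a hypothetical stand-in for the undeliverable notices received outside of Qualtrics.

```python
def code_email_outcome(qualtrics_status: str, bounced_externally: bool) -> str:
    """Collapse the Qualtrics email status and out-of-band bounce
    notices into the outcome categories used in this experiment."""
    if qualtrics_status == "Email Hard Bounce" or bounced_externally:
        return "Email Failed"
    if qualtrics_status == "Email Opened":
        return "Email Opened"
    # Status "Email Sent" with no bounce notice: assumed delivered but unopened.
    return "Email Not Opened"
```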
To assess the success of each email communication, the possible email outcomes are defined below.

- Email Failed: The email did not reach the inbox of the intended recipient. There are several reasons an email might fail, including the email address not existing, the inbox being full, a technical issue such as a firewall blocking unknown email addresses, or other technical issues that prevented the email from reaching the recipient's inbox. I determined that an email failed through the email summary report or through an automatic bounce-back notification indicating that the email was undeliverable.
- Email Opened: I did not receive a notification that the email failed (as defined above), and the email summary report indicated the email was opened by the recipient. There are two outcomes when an email is opened: the person joins the survey panel (email opened conversion), or they take no action (email opened non-conversion). In this experiment, both outcomes constitute an opened email. I am not able to tell the level of engagement a person had once they opened the email; it is possible that a person opened the email and immediately deleted it without reading the content.
- Email Not Opened: I did not receive a notification that the email failed (as defined above), and the email was not marked as "Email Opened" in the email summary report. The email summary report marks these emails as "Email Sent." My assumption is that these emails successfully reached the recipient but were not opened. However, I cannot be completely sure that the email reached the inbox; it is possible the email never reached the recipient's inbox but was neither flagged by the summary report as undeliverable nor reported to me through an undeliverable notification.
Table 2 shows the outcomes of all the emails that were sent. Of the 2,003 emails sent, about 88 percent did not fail to reach the recipient.
In keeping with the literature on the role of the subject line and email invitation in the survey decision-making process, two metrics were used to measure success. The first is the email open rate by subject line. The second is the conversion rate to the business survey panel among those who opened the email. The calculations for both metrics exclude emails that failed to reach the recipient, as defined above.
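Concretely, the two metrics can be computed as follows. The function names are illustrative; the example counts come from the totals reported above.

```python
def open_rate(n_opened: int, n_sent: int, n_failed: int) -> float:
    """Open rate among emails that did not fail to reach the recipient."""
    return n_opened / (n_sent - n_failed)

def conversion_rate(n_converted: int, n_opened: int) -> float:
    """Share of opened emails that led to joining the survey panel."""
    return n_converted / n_opened

# Experiment-wide denominators: 2,003 emails sent and 237 failed, so
# open rates are computed over the remaining 1,766 emails.
```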
Subject Line Results
As shown in Table 3, 37.2 percent of emails with the Social Exchange subject line and 35.8 percent of emails with the Pre-Suasion subject line were opened. An analysis of variance test on the means showed no significant differences between the communication theories on open rates (F=0.40, p=0.5294). Firms with five or fewer employees were significantly more likely to open the Social Exchange subject line email than the Pre-Suasion subject line email (F=4.86, p=0.0280). There were no significant differences among firms with six or more employees on open rates between the two communication strategies. Among subject lines sent to generic business email addresses, the Social Exchange subject line was significantly more likely to be opened than the Pre-Suasion subject line (F=5.25, p=0.0223). Among personal email addresses, the Social Exchange and Pre-Suasion subject lines did not differ significantly (F=0.55, p=0.4602).
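As a sketch of the test, a one-way ANOVA on a binary opened indicator with two groups reduces to comparing the two open-rate proportions (F is the square of the two-sample t statistic). The example below uses simulated indicators at the rates reported in Table 3; the group sizes are assumed, as the underlying data are not public.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated opened indicators (1 = opened) at the Table 3 rates; the
# split of the 1,766 non-failed emails between groups is assumed.
se_opens = rng.binomial(1, 0.372, size=883)  # Social Exchange subject line
ps_opens = rng.binomial(1, 0.358, size=883)  # Pre-Suasion subject line

f_stat, p_value = stats.f_oneway(se_opens, ps_opens)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```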
Email Invitation Results
Overall, around 3 percent of emails that were opened led to a conversion. As shown in Table 4, the Social Exchange invitation and the Pre-Suasion invitation did not differ significantly in converting businesses into the survey panel (F=0.12, p=0.7291). Additionally, panel conversion rates by email address type were not statistically different (Personal Business Email: F=0.73, p=0.3940; Generic Business Email: F=0.60, p=0.4394).
Table 5 shows the results of a logistic regression model measuring the odds that a business converts to the survey panel. The results suggest that the odds of converting increase with firm size. Specifically, businesses with 25 to 49 employees, 50 to 99 employees, and 100 or more employees had significantly greater odds of joining the survey panel than businesses with five or fewer employees. However, the model has both large standard errors and wide confidence intervals, likely because only twenty firms joined the panel. The results should be interpreted with caution due to this lack of precision.
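For readers who want to replicate this kind of model, the sketch below fits a comparable logistic regression with the statsmodels formula API. The data frame, size labels, and conversion probabilities are simulated stand-ins, and the paper's actual specification may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Simulated stand-in data: one row per non-failed email, a categorical
# firm-size class, and a rare binary conversion outcome. Size labels
# and conversion probabilities are hypothetical.
sizes = ["1-5", "6-24", "25-49", "50-99", "100+"]
df = pd.DataFrame({"size_class": rng.choice(sizes, size=1766)})
df["converted"] = rng.binomial(
    1, np.where(df["size_class"].isin(["25-49", "50-99", "100+"]), 0.03, 0.005)
)

# Logit model with firms of five or fewer employees as the reference group.
model = smf.logit("converted ~ C(size_class, Treatment(reference='1-5'))", data=df).fit()

# Exponentiate coefficients and confidence bounds to report odds ratios.
ci = model.conf_int()
print(pd.DataFrame({"OR": np.exp(model.params),
                    "2.5%": np.exp(ci[0]),
                    "97.5%": np.exp(ci[1])}))
```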
Discussion
This experiment was limited in several ways. There is an infinite number of ways an email can be written within the Pre-Suasion and Social Exchange frameworks, so the results of this study are limited to how these particular subject lines and emails were written and should not be generalized to all possible iterations of the frameworks. Another consideration for similar future experiments is that the content of the emails should have a similar structure. For example, the reason for contacting the recipient was not explained until the fourth paragraph of the Pre-Suasion email, compared to the second paragraph of the Social Exchange email. It is possible that this unbalanced structure contributed to differences in the decision-making process individuals experienced when deciding whether to join the survey panel.
This experiment attempted to determine whether one communication strategy performed better than the other when using email as a recruitment method. Overall, the Social Exchange and Pre-Suasion subject lines performed similarly in motivating a person to open an email. A possible explanation is that both subject lines were straightforward and informative, which has been found to be a preference when receiving email communications about surveys (Brenner, Cosenza, and Fowler 2020). Interestingly, this experiment found that the smallest firms were more sensitive to the subject line: those with five or fewer employees were more likely to open the Social Exchange subject line email than the Pre-Suasion subject line email. Research has shown that small businesses are more reluctant to give away information (Wilson et al. 2014), and it is possible that the Pre-Suasion subject line did not seem as trustworthy as the Social Exchange subject line.
Conversion rates were low and did not differ significantly by the type of email invitation. This was expected, as low response is common in voluntary business surveys, especially compared to mandatory government surveys (Küfner, Sakshaug, and Zins 2022). This research found that larger firms had higher conversion rates than smaller firms. Similarly, Seiler (2014) found that larger firms were more likely to respond to a survey than their smaller counterparts and hypothesized that larger firms may benefit more from participating, resulting in higher response rates. Email invitation language may need to be crafted differently for smaller firms, since their motivations to participate in a survey could differ from those of larger firms.
Conclusion
The public is flooded with communications from all parts of society. Survey researchers compete in a marketplace with marketers, spammers, and other groups vying for attention through the same modes we use to solicit survey participation. Given this, it is not surprising that email open rates and conversion rates were low. Recipients of surveys administered via email may not know whether the survey comes from a legitimate organization, possibly leading them not to participate. It is important for survey researchers to think about how to break through the clutter to get the attention of potential respondents, both by further exploring the visual design of our communications and by better understanding how to talk about our surveys. This research focused on the latter and found that neither Social Exchange theory nor Pre-Suasion theory led to noticeable differences in convincing a person to open an email or converting them into the panel. However, some differences did emerge by firm size that are worth considering when communicating with businesses.
The views and opinions expressed herein are those of the author. They do not represent an official position of the Federal Reserve Bank of Richmond or the Federal Reserve System.
Jason Kosakow, Federal Reserve Bank of Richmond, 701 E Byrd St, Richmond, VA 23219, Jason.Kosakow@rich.frb.org