Introduction
Response rates for population surveys have been steadily declining, raising concerns among researchers and policy makers about the future of survey research (Czajka and Beyler 2016; National Research Council of the National Academies 2013). Seeking to stanch this trend, remain at the cutting edge of survey research, and work within the constraints of ever-shrinking budgets, an increasing number of federal surveys have explored the use of web components in their data collection efforts (Baumgardner, Doborah, and Raglin 2014; Bergeson et al. 2013; Labovitz, Patel, and Santander 2017; U.S. Census Bureau 2015).
When adding a web component to a survey protocol, survey researchers should consider not only respondents’ preferred device for web-based survey completion, but also the method(s) of survey notification. How respondents receive the web survey invitation and reminder(s) may influence whether they complete the survey and the nature of their responses. Understanding how the mode of invitation to participate in a web survey influences respondent behavior throughout the survey process is therefore critical in survey planning.
Paradata refers to auxiliary process information acquired during a survey’s data collection (West 2011). In the case of web-based surveys, paradata can provide insight into web users’ preferences, such as device type, access method, number of access attempts, and time spent on each question—valuable information that can inform survey design and analysis. While paradata was originally used primarily for interviewer performance monitoring, additional uses have emerged (Lynn and Nicolaas 2010). For example, researchers have used paradata to explore differential response tendencies in ‘check-all-that-apply’ item response option sequencing (Stern, Bilgen, and Rookey 2012) and to estimate future survey costs (Wagner 2019). In addition, survey methodologists have examined and quantified the impact of mixed device use, finding different rates of nonresponse, different lengths of time spent on items, and differences in number of access attempts depending on device type (Bosnjak, Bauer, and Weyandt 2017; Buskirk and Andrus 2014; Callegaro 2010).
Using the paradata acquired from an emergency department (ED) experience of care survey in which a web component was the initial mode in several mixed-mode protocols, we aimed to answer four questions: (1) how does the method of invitation to a web survey (mailed letter with URL, email, or text) affect device use, (2) what is the association between how a respondent accessed the survey and their device selection, (3) what, if any, respondent characteristics differ by device type used, and (4) how many web survey invitations are needed?
Our contributions to the literature are threefold. First, we focus on a unique population—ED patients discharged home after receiving care—in which, to our knowledge, an extensive examination of paradata is currently unavailable. Second, we tested and compared nine different complex protocols for survey administration, eight of which included a web component, which gives us the ability to describe observed differences in paradata under multiple conditions. Third, our study had a relatively large sample size, which allows us to statistically examine differences in respondent characteristics by device type. These contributions have the potential to inform future web survey development and utilization, particularly, though not exclusively, in health care-related surveys.
Methods
This research is based on a randomized experiment that sampled 26,991 ED patients discharged to the community (i.e., not admitted to the hospital or another care facility) between January and March 2018 from 16 large, geographically dispersed hospital-based EDs and randomized patients within EDs to nine survey arms. The primary goal of the study was to examine response rates and compare the utilization of various protocols across the survey arms. Eight of the nine arms included a web component option; arm 1 did not. Information about the study design and results from the full study, including overall response rates by arm, protocol comparisons, comparisons of respondent characteristics by completion mode, and characteristics associated with the ability to email/text patients, are presented elsewhere (Parast et al. 2019). Prior analyses of these data have demonstrated that inclusion of a telephone mode increased response rates, that older patients and male patients were significantly less likely to be reached by email or text, and that telephone and web respondents were more representative of the sampled population’s demographic characteristics than mail respondents (Parast et al. 2019).
This paper focuses specifically on the patients who interacted with the web survey in the eight experimental arms with a web component, in which the sequence of paper, email, and text push-to-web notifications and reminders varied, as shown in Table 1. All of these protocols involved combinations of a mailed introduction letter (containing a URL link, with or without a Quick Response [QR][1] code, directed to the survey), a text message, and/or an email inviting or reminding prospective respondents to complete the web survey. Sampled patients who did not complete the survey via the initial web mode outreach(es) within the allotted time were then contacted via a subsequent non-web mode: a mailed paper and/or telephone survey.
All patients assigned to arms with a mailed paper invitation to the web survey (arms 2, 3, 8, and 9) had the opportunity to access the web survey.[2] Thirty percent of sampled patients provided an email address and consented to email contact; 59% provided a valid cell telephone number and consented to text contact. As such, patients assigned to a survey arm with only email and/or text invitations to the web survey (arms 4, 5, 6, and 7) who did not provide the appropriate contact information or did not consent did not have the opportunity to access the web survey.
A total of 16,157 eligible[3] sampled patients were provided the option to access the web survey (61.5% of the eligible sampled population); of those, 1,581 interacted with the web survey in some way, even if only by opening it and viewing the introductory screen (9.8% of the population who could have accessed the survey).
Our analyses focus on web paradata for eligible sampled patients, particularly patients who interacted with the web survey (N=1,581) and those who completed[4] the survey by web (N=1,408, or 89.1% of those who interacted with the web survey). Confirmit, a web survey software, was used for web survey administration and collection of web paradata (Confirmit 2019). Paradata collected included the number of times the web survey was accessed, the device used to access the web survey (i.e., smartphone, tablet, computer), and the method and invitation/reminder by which the web survey was accessed (i.e., link in an email, link in a text, URL typed into browser, scanned QR code). Distributional frequencies of paradata were descriptively examined. Chi-squared tests were used to test for differences in web respondent self-reported characteristics (age, gender, race/ethnicity, primary language spoken at home, education, general health and overall mental health status, arrival by ambulance, and whether a proxy helped in completing the survey) by device type and access method used.
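As an illustrative sketch of the chi-squared tests described above, the statistic for a contingency table of device type by a respondent characteristic can be computed directly. The counts below are hypothetical placeholders, not the study’s data:

```python
# Pure-Python sketch of the chi-squared statistic used to compare
# web respondent characteristics by device type.

def chi_squared_statistic(observed):
    """Compute the chi-squared statistic for a 2-D table of counts."""
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(observed):
        for j, obs in enumerate(row):
            # Expected count under independence of rows and columns
            exp = row_totals[i] * col_totals[j] / grand_total
            stat += (obs - exp) ** 2 / exp
    return stat

# Hypothetical counts: rows = age 18-54 vs. 55+,
# columns = smartphone, computer, tablet completers
table = [[520, 190, 25],
         [310, 320, 43]]
stat = chi_squared_statistic(table)
dof = (len(table) - 1) * (len(table[0]) - 1)
print(f"chi2 = {stat:.1f} on {dof} df")
```

In practice the statistic would be compared against a chi-squared distribution with the given degrees of freedom (e.g., via scipy.stats.chi2_contingency) to obtain a p-value.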
Results
Web survey participation
Of the 16,157 sampled patients who were given the opportunity to interact with the web survey, 1,581 (9.8%) did so. Of these 1,581 patients, 89% completed the survey by web, less than 1% completed by either mail or telephone during the non-web mode follow-up, and 9% did not complete the survey by any mode. Among the patients who interacted with the web survey but did not complete it, the majority (91%) did not engage with the survey beyond the introduction screen.
How does the method of invitation to a web survey affect device use?
Table 2 displays the distribution of device type used among respondents who completed the survey by web across the eight arms involving a web component. Smartphones (44% to 73% of completions across the 8 arms and 59% of all completions) far outnumbered computers (24% to 49% of completions across the 8 arms; 36% of all completions), and tablets were the least-used device type (4% to 8% of completions; 5% overall). Text invitations (arms 7 and 8) to the web survey clearly boosted smartphone use (69% to 73% of completions) compared to paper and email web invitations (arms 2, 3, and 9; 44% to 53% of completions) or email only web invitations (arms 4, 5, and 6; 57% to 58% of completions). A paper invitation without a text component (arms 2, 3, and 9) resulted in the largest proportions of web completions by computer (42% to 49%) compared to email only web invitations (arms 4, 5, and 6; 35% to 38% of completions) or text and email invitations (arms 7 and 8; 24% to 27% of completions).
What is the association between how a respondent accessed the survey and their device selection?
Sixty-nine percent of web respondents who typed in the URL completed the survey via a computer (results not shown). For all other access methods (email, text, and QR code), the majority (56% to 99%) completed the web survey via smartphone. Tablets were the least-chosen device for every access method except QR code access, for which no web respondent completed by computer.
What, if any, respondent characteristics differ by device type used?
Significant differences in web respondent characteristics were identified by both the device type and access method used to complete the survey. Table 3 shows the distribution of web respondent characteristics by device type. Both computer and tablet completers were more likely than smartphone completers to be older (55 and older), non-Hispanic white, more educated (with a bachelor’s degree or more), and in excellent or very good self-reported mental health. Computer completers also had a more even distribution of male and female respondents than smartphone and tablet completers, who were largely female.
Significant differences in respondent characteristics were also observed by access method (results not shown); this is expected given the previously discussed relationships between access method and device type. For instance, respondents who typed in a URL (mainly completing via computer) were more often older, non-Hispanic white, more educated, and more evenly distributed by gender than those accessing the web survey via an emailed or texted link (mainly completing via smartphone). Trends by QR code access are not discussed given the small number of web respondents who completed the survey via this method.
How many web survey invitations are needed?
Table 4 shows the distribution of web respondents by survey arm and the last protocol invitation or reminder sent prior to completion of the survey. These results are useful for considering whether all invitations are needed; for example, if the large majority of web respondents complete the survey after the first or second email, then subsequent reminder emails may not be necessary. Table 4 shows that across all experimental arms, the majority of web respondents completed the web survey before the third outreach attempt, regardless of the protocol component. For example, for arms with an initial email invitation plus three email reminders (arms 4, 5, and 6), 36% to 41% of web respondents completed after receiving the initial invitation link and 60% to 61% completed after the first email reminder was sent. In the arms that included a text component (arms 7 and 8), 55% to 64% of web respondents completed the survey after receiving a texted link. For arm 7 (initial text invitation followed by an email, a text, and another email reminder), nearly half of web respondents completed the survey after the first text invitation.
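The tallying behind this kind of table is straightforward: from paradata recording the last outreach sent before each completion, one can compute the cumulative share of web completions reached by each point in the protocol. A minimal sketch, using hypothetical record counts rather than the study’s data:

```python
# Sketch: cumulative share of web completions by last outreach sent,
# as tallied from paradata. Counts are hypothetical, not the study's data.
from collections import Counter

# Last outreach sent before each web completion (one entry per respondent)
last_outreach = (["invitation"] * 140 + ["reminder 1"] * 85 +
                 ["reminder 2"] * 40 + ["reminder 3"] * 15)

counts = Counter(last_outreach)
total = sum(counts.values())

cumulative_share = {}
running = 0
for step in ["invitation", "reminder 1", "reminder 2", "reminder 3"]:
    running += counts[step]
    cumulative_share[step] = running / total
    print(f"{step}: {cumulative_share[step]:.0%} of web completions by this point")
```

Reading such cumulative shares across arms makes it easy to see where additional reminders stop paying off.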
Discussion
This research examined ED web respondent paradata across a number of mixed-mode protocols using varying invitation approaches and found that text message notifications pushed the largest proportion of patients to complete the survey via web, primarily using smartphones. We also found that smartphone users tended to be younger (18–54 years old), racial and ethnic minorities, less educated, and in worse self-rated mental health than computer and tablet respondents. We recommend text and email invitations and reminders to help reach and represent younger and ethnically diverse populations. Even though a mailed invitation with a URL link may bolster push-to-web efforts for older and more-educated patients, we do not recommend this invitation method be used alone given cost considerations, lower response rates, and less representativeness than other push-to-web protocols.
Access method was strongly correlated with device type. We have demonstrated that text message notifications (either as an invitation or a reminder) in a web survey contact protocol yielded the highest proportion of web respondents on a smartphone. Slightly more than half of patients who responded to an emailed invitation used a smartphone to complete the survey. Mailed letters that required typing in a URL appeared to drive respondents to complete the web survey on a computer. Although the analyses presented here were descriptive, it appears that the ED population—similar to populations included in general population web surveys—uses a range of devices to complete a web questionnaire, with a substantial portion opting for smartphones (Bosnjak, Bauer, and Weyandt 2017). Given the large number of smartphone users, future web survey work should focus on mobile optimization of the web survey.
Of those ED patients invited to the web survey, only 9.8% interacted with the web survey. However, it is notable that the large majority (89%) of those who chose to interact with the web survey eventually completed it by web, suggesting that a web survey is a well-received administration mode for those who can be contacted in this manner.
Our study had several limitations. Hospitals that participated in this experiment were not nationally representative and not all patients had access to the web survey, so these results may not generalize to all ED patients in the United States, nor do they necessarily generalize outside of the ED population. In addition, because our research questions focused only on patients who interacted with the web survey, we did not discuss mail or telephone respondents; detailed results for all respondents can be found in Parast et al. (2019). While our examination of the paradata generated from this research was exploratory and descriptive, it offers a glimpse of the value paradata can provide to researchers. Future investigations could build on these insights with larger-scale examinations of response patterns by device type and alternative push-to-web protocols designed to increase web survey completion rates. We hypothesize that even in other settings with potentially higher response rates and/or a different mix of patient characteristics, our lessons learned using paradata could be useful in terms of informing the selection of methods for invitation to a web survey and the design of a future study using similar modes.
Conflict of Interests
This work was supported by the Centers for Medicare & Medicaid Services, Department of Health and Human Services (HHSM-500-2016-00093G). The content of this publication does not necessarily reflect the views or policies of the Department of Health and Human Services, nor does the mention of trade names, commercial products, or organizations imply endorsement by the U.S. government. The authors assume full responsibility for the accuracy and completeness of the ideas presented.
A QR code consists of multiple black squares within a square grid on a white background; it can be read by an imaging device, such as a camera on a mobile device, and points to a website or an application.
99.9 percent of patients had a mailing address available.
Patient eligibility requirements are described further in Parast et al. (2019) and were applied prior to sampling as well as post-sampling, using updated administrative data that were provided approximately 2–5 months after discharge. The overall post-sampling ineligibility rate for the full study was 2.6% (N=704).
A completed survey was defined as one in which at least 50% of the items that were applicable to all patients were completed.