Introduction
Longitudinal survey designs are an important resource for studying development because they gather data from the same people over time (Binder 1998). Retention in longitudinal studies has declined in recent years, raising concerns about the representativeness of longitudinal samples and possible bias in estimates of long-term outcomes (Lynn 2009, 2018).
In the U.S. national Monitoring the Future study, we explore data collection mode as one avenue to boost sample retention over time. The main question addressed here is whether the mixed-mode design results in differential attrition at the second follow-up survey. That is, are there differences in participant retention at the second follow-up survey by data collection mode approach (i.e. web push vs. mail push) or by the mode of response (web or paper) at the first follow-up survey? Evidence of longer-term gains in retention, or of cost savings from mode changes beyond any concurrent cross-sectional effects of study design, would advance understanding of the benefits of mode changes in longitudinal studies.
This research note examines the longitudinal outcomes of a mixed-mode experiment (Patrick et al. 2018) within the national Monitoring the Future (MTF) study. MTF includes a longitudinal component in which a nationally representative sample of 12th grade students is followed through adulthood (Miech et al. 2018; Schulenberg et al. 2018). In 2014, the standard first follow-up (at modal ages 19/20) using mailed paper questionnaires was compared to three experimental mixed-mode conditions, described below: Condition 1 (Mail Push), Condition 2 (Web Push), and Condition 3 (Web Push + Email). Results from the first experimental follow-up data collection indicated that Condition 3 (Web Push + Email) was promising, with similar response rates and lower costs compared to the other two conditions (Patrick et al. 2018). At the second follow-up at ages 21/22 in 2016, respondents were assigned to the same condition they received at the first follow-up. The resulting data allow us to examine how the experimental mixed-mode condition prospectively affects participation at later waves, including overall retention, response mode, type of web response (i.e. device), and cumulative cost across two waves of data collection.
Mixed-mode data collection is increasingly being adopted for both cross-sectional and longitudinal surveys (Couper 2017). While there is a growing literature on the effect of mixing modes on response rates and data quality in cross-sectional surveys, there is a paucity of research on longitudinal surveys. Bianchi et al. (2017) report on the introduction of mixed-mode (web and face-to-face) data collection in Understanding Society, and Al Baghal and Kelley (2017) focus on changes in mode preference in the same study. Similarly, Voorpostel and Ryser (2011) look at the introduction of web to the Swiss Household Panel. However, we know of no studies looking at web and mail mixed-mode surveys across waves of data collection. This note aims to address this gap in the literature.
Research Aims
Research aims for the current study were to examine: (1) whether experimental condition (randomly assigned in 2014 and continued in 2016) and the type of response in 2014 predicted response in 2016, (2) how response mode and device type changed from 2014 to 2016, and (3) cumulative cost comparisons by condition for the 2014 and 2016 data collections.
Method
Monitoring the Future Main Study
MTF is an annual series of surveys of nationally representative samples of 12th grade students in the 48 contiguous United States (n ≈ 15,000 per year, modal age 18 years; Miech et al. 2018). For the MTF follow-up study, approximately 2,450 students from each 12th grade cohort are randomly selected for mailed follow-up surveys; half of the sample begin the survey one year after high school at modal age 19 and the other half two years after high school at modal age 20 (Schulenberg et al. 2018). Subsequent follow-up surveys are conducted every two years, all via mail. Prior to the first follow-up survey at modal ages 19/20, participants are contacted several times. They are sent (1) a notification letter (informing them of their selection for follow-up); (2) a newsletter (including a cover letter and a summary of selected MTF study results); (3) a change of address card for updating contact information; (4) an advance letter (alerting them that the survey is coming soon); and (5) the actual questionnaire mailing. At the last step, a paper questionnaire is mailed to each participant along with a pencil, prepaid return envelope, and a check for $25. Reminders are sent one week later (postcard) and three weeks later (letter); nonresponse phone calls begin one week after the second reminder, and a final mailing with a second copy of the paper questionnaire is sent about six weeks after the first questionnaire mailing. Subsequent follow-up surveys follow the same protocol, except that the notification letter and newsletter are mailed together and no advance letter is sent.
The MTF control group includes participants who completed their in-school baseline survey in 2012 or 2013 and who were mailed the first follow-up survey in 2014 and the second follow-up survey in 2016 (N=2,451). MTF oversamples drug users in 12th grade to ensure the representativeness of drug-using populations in the follow-up surveys; weights are used to compensate for the differential sampling probabilities (Bachman et al. 2015; Schulenberg et al. 2018).
Experimental Design
A supplementary sample of participants who completed the MTF baseline year survey in 2012 or 2013 but were not randomly selected for the main MTF follow-up was used for the experimental conditions (N=4,950). Participants in the supplementary sample were randomly assigned to one of three experimental conditions in 2014 (as described in Patrick et al. 2018), and this assignment was maintained for the second experimental data collection in 2016 (at modal ages 21 or 22).
Condition 1 (the Mail Push Condition) initially mirrored the mailing procedure of the MTF main study (i.e. selection letter, newsletter, advance letter, paper questionnaire with a check for $25, and reminder postcard at the same time as the MTF main study). The reminder letter in Condition 1, however, provided the option to complete the survey online; web survey login information (i.e. survey URL and a personal identification number [PIN]) was included. Nonresponse phone calls also provided information about both the paper and web response modes. A final mailing included information about the web survey option in addition to a second copy of the paper questionnaire.
In Condition 2 (the Web Push Condition), participants were sent the same selection letter and newsletter as the MTF control group, without any indication of a paper survey. An advance letter in Condition 2 stated that participants would be sent an invitation to complete an online survey. Instead of a paper questionnaire, participants in Condition 2 were sent web survey login information (i.e. survey URL and PIN) and a check for $25. The reminder postcard sent by mail was the same as in the MTF main study except that it requested completion of the online survey. The reminder letter included the online survey login information and also a paper questionnaire as an additional option. Nonresponse phone calls provided information about both the paper and web response modes. A final mailing also included a paper questionnaire and online survey information.
Condition 3 (the Web Push + Email Condition) followed procedures identical to Condition 2, with the addition of emailed versions of the advance letter, web survey login information, reminder postcard, and reminder letter. Participants selected for this condition who did not provide an email address (23%) or who provided an unusable email address (i.e. the email was undeliverable; 13%) received the same protocol as Condition 2. Further details on the experimental design are available elsewhere (Patrick et al. 2018).
At the second follow-up in 2016, at modal ages 21/22, participants were assigned to the same experimental condition as at the first follow-up (regardless of their response mode at ages 19/20 in 2014). All eligible respondents from 2014 were invited to participate in 2016, whether or not they responded previously.
Measures
To examine response in 2016 to the second follow-up, we controlled for baseline characteristics (from 12th grade reports via in-school surveys) that were available for both respondents and nonrespondents. These include class year (1=2013, 0=2012); gender (1=male, 0=female); race/ethnicity (black, Hispanic, other, and white [reference category]); parent education (1=at least one parent had at least some college education, 0=no parent had college education); four-year college plans (1=will definitely graduate from a four-year college program, 0=probably will, probably won’t, or definitely won’t); living in urban area (1=urban, 0=rural); and lifetime substance use (i.e. alcohol use, cigarette use, marijuana use, and illicit drug use other than marijuana) in 12th grade (each separately coded as 1=yes, 0=no).
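For concreteness, the sketch below shows one way the baseline covariate coding described above could be implemented. The column names, value labels, and cut points are hypothetical illustrations, not the actual MTF variable definitions.

```python
import pandas as pd

def code_baseline_covariates(df: pd.DataFrame) -> pd.DataFrame:
    """Dummy-code 12th-grade (baseline) covariates as described in the text.

    Input column names and value labels are assumed for illustration only.
    """
    out = pd.DataFrame(index=df.index)
    out["class_2013"] = (df["class_year"] == 2013).astype(int)   # 1=2013, 0=2012
    out["male"] = (df["gender"] == "male").astype(int)           # 1=male, 0=female
    # Race/ethnicity dummies, with white as the reference category
    for group in ["black", "hispanic", "other"]:
        out[f"race_{group}"] = (df["race_ethnicity"] == group).astype(int)
    out["parent_some_college"] = (df["parent_education"] == "some college or more").astype(int)
    out["college_plans"] = (df["four_year_college_plans"] == "definitely will").astype(int)
    out["urban"] = (df["residence"] == "urban").astype(int)
    # Lifetime substance use at 12th grade, each coded 1=yes, 0=no
    for substance in ["alcohol", "cigarettes", "marijuana", "other_illicit"]:
        out[f"lifetime_{substance}"] = (df[f"ever_{substance}"] == "yes").astype(int)
    return out
```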
Response mode was coded as paper or web. Device type was coded from the paradata available through the software used for web survey implementation (i.e. DatStat Illume, Seattle, WA). Smartphones included Android, Windows, and iOS phones. Android tablets and iPads were coded as tablets. Personal computers (PCs) included laptop and desktop computers running full-screen browsers.
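As an illustration of this device-type coding, the function below classifies a browser user-agent string into the three categories. The actual coding relied on the device information recorded in the DatStat Illume paradata, so these rules are a generic approximation rather than the vendor's algorithm.

```python
def classify_device(user_agent: str) -> str:
    """Approximate device-type coding from a user-agent string (illustrative only)."""
    ua = user_agent.lower()
    # iPads and Android tablets (Android without the "mobile" token) -> tablet
    if "ipad" in ua or ("android" in ua and "mobile" not in ua):
        return "tablet"
    # iOS, Android, and Windows phones -> smartphone
    if "iphone" in ua or "windows phone" in ua or "android" in ua:
        return "smartphone"
    # Laptops and desktops running full-screen browsers -> PC
    return "pc"
```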
Estimated cumulative costs included survey mailing materials, postage, staff labor (including fringe benefits), incentives, paper survey scanning, and nonresponse calling for both the first and second follow-up surveys. The calculation did not include (1) costs associated with survey design (for the web survey or the paper survey) and (2) staff time expected to be approximately equal across conditions (i.e. respondent tracking and data cleaning). The costs at both follow-up surveys in 2014 and 2016 were summed for the total cumulative costs.
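A minimal sketch of the cumulative cost calculation is shown below. The cost components follow the list above, the figures are placeholders rather than the published MTF costs, and the excluded items (design costs and staff time assumed equal across conditions) are simply left out of the sum.

```python
# Cost components included in the cumulative estimates (per wave, per condition).
COMPONENTS = ("materials", "postage", "labor", "incentives", "scanning", "nonresponse_calls")

def cumulative_cost(wave_costs: list[dict[str, float]]) -> float:
    """Sum the included components over all waves (2014 and 2016) for one condition."""
    return sum(costs[c] for costs in wave_costs for c in COMPONENTS)

# Placeholder figures for one condition's two waves (not the actual MTF costs).
example_condition = [
    dict(materials=1.0, postage=1.0, labor=1.0, incentives=1.0, scanning=1.0, nonresponse_calls=1.0),
    dict(materials=1.0, postage=1.0, labor=1.0, incentives=1.0, scanning=1.0, nonresponse_calls=1.0),
]
total_two_wave_cost = cumulative_cost(example_condition)
```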
Results
Table 1 shows response rates by experimental condition at both waves. At the first follow-up survey, in 2014, the response rate in Condition 2 (38.0%; Web Push) was significantly lower than in the MTF control group (43.6%) and Condition 1 (42.5%; Mail Push). At the second follow-up survey, in 2016, no statistically significant differences in response rates were found by condition; 36.4–38.3% of participants in each condition responded. Among those who did not respond in 2014, 14.8–18.1% responded in 2016; among those who did respond in 2014, 64.3–68.4% responded in 2016. The only significant difference by condition was that 2014 nonresponders had a higher 2016 response rate in Condition 2 (18.1%) than in the MTF control group (14.8%).
Table 2 reports odds ratios of response at the second follow-up survey in 2016 (at modal ages 21/22). Web response in 2014, compared to paper response in 2014, was associated with significantly greater odds of response at the second follow-up survey in 2016. The experimental conditions were not significantly associated with response in 2016, controlling for 2014 response mode. Regarding other control variables measured at baseline (modal age 18), blacks and other racial/ethnic groups had significantly lower odds of response in 2016 compared to whites. Those who had 4-year college plans at age 18 were significantly more likely to respond in 2016. Lifetime use of illicit drugs other than marijuana at age 18 was significantly associated with lower odds of response in 2016.
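A hedged sketch of the kind of model behind Table 2 is given below: a weighted logistic regression of 2016 response on experimental condition, 2014 response mode, and the baseline covariates, with coefficients exponentiated to odds ratios. The variable names, category labels, and use of freq_weights are assumptions for illustration; this is not the authors' estimation code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_response_model(df: pd.DataFrame) -> pd.Series:
    """Logistic regression of 2016 response; returns odds ratios (illustrative)."""
    formula = (
        "responded_2016 ~ C(condition, Treatment('control')) "
        "+ C(mode_2014, Treatment('paper')) "             # paper / web / no response in 2014
        "+ class_2013 + male + race_black + race_hispanic + race_other "
        "+ parent_some_college + college_plans + urban "
        "+ lifetime_alcohol + lifetime_cigarettes + lifetime_marijuana + lifetime_other_illicit"
    )
    model = smf.glm(formula, data=df, family=sm.families.Binomial(),
                    freq_weights=df["sample_weight"])     # MTF sampling weights (assumed column)
    return np.exp(model.fit().params)                     # exponentiate coefficients to odds ratios
```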
Table 3 presents response mode choices in 2014 and 2016 among those who responded in both waves. In 2016, 56.0% responded by web overall, with significant differences by condition: only 12.0% in Condition 1 responded by web compared to 76.0% in Condition 2 and 82.2% in Condition 3. Most respondents chose the same mode at both waves; 72.4% of 2014 paper responders and 81.4% of 2014 web responders used the same mode in 2016. However, there was a noticeable effect of the “push” strategy, with greater percentages of 2014 paper responders moving to the web in 2016 in Conditions 2 and 3 and more web responders in 2014 moving to paper in 2016 in Condition 1.
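The mode-persistence pattern in Table 3 can be reproduced with a simple row-percentage crosstab among two-wave responders, as sketched below; the column names are hypothetical.

```python
import pandas as pd

def mode_switching_table(df: pd.DataFrame) -> pd.DataFrame:
    """Row percentages of 2016 response mode by 2014 response mode (illustrative)."""
    both_waves = df[df["responded_2014"] & df["responded_2016"]]
    return (pd.crosstab(both_waves["mode_2014"], both_waves["mode_2016"],
                        normalize="index") * 100).round(1)
```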
Table 4 shows device types among web responders at both follow-up waves. In 2016, 77.5% of web respondents used a PC, 19.7% used a smartphone, and 2.9% used a tablet. PC responders in 2014 largely responded by PC again in 2016 (80.9%); in contrast, more than 70% of tablet responders and 60% of smartphone responders in 2014 responded using a different device type in 2016.
The total cost of each experimental design across the two waves was calculated as the cost per targeted participant (the total cost divided by the number of targeted participants at both waves) and as the cost per responder (the total cost divided by the number of people who actually responded) (Table 5). Condition 3 saved 10.7% of the cost per responder (and 11.2% of the cost per targeted participant) compared to Condition 1, which is similar to the MTF Control procedures. Condition 2 incurred a 3.4% greater cost per responder (although a 5.4% lower cost per targeted participant) than Condition 1.
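These two cost metrics and the percentage comparisons reduce to simple arithmetic, sketched below with placeholder inputs rather than the published Table 5 figures.

```python
def cost_per_targeted(total_cost: float, n_targeted: int) -> float:
    return total_cost / n_targeted

def cost_per_responder(total_cost: float, n_responders: int) -> float:
    return total_cost / n_responders

def pct_cheaper(reference_cost: float, comparison_cost: float) -> float:
    """Percent by which `comparison_cost` is lower than `reference_cost`."""
    return 100.0 * (reference_cost - comparison_cost) / reference_cost

# Placeholder example: if Condition 1 cost $50.00 per responder and Condition 3
# cost $44.65, Condition 3 would be about 10.7% cheaper per responder.
savings = pct_cheaper(50.00, 44.65)   # ~10.7
```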
Discussion
We examined the effect of different mail and web mixed-mode designs across two waves of data collection following baseline. The 2014 experiment found that the response rate was lower in the Web Push condition, whereas the Mail Push and Web Push + Email conditions did not differ significantly from the MTF Control. In the 2016 wave, using the same mode assignments as in 2014, no significant differences in response rates by condition were found, although the Web Push + Email condition did slightly better than the control. Furthermore, the Web Push condition brought in significantly more 2016 responders among those who had not responded in 2014 than did the control.
We also found that responders tended to continue using the same mode in both waves (77.4% of 2014 paper responders also used paper in 2016; 81.4% of 2014 web responders used web again in 2016), but the Web Push conditions moved more paper responders to the web in 2016, and the Mail Push condition moved more web responders to paper in 2016, suggesting that the choice of mode is influenced by the invitation in the current wave. Among web responders, we found considerable device switching between waves, particularly for mobile device users, consistent with Lugtig and Toepoel (2016). Specifically, those who used a PC in 2014 tended to use a PC again in 2016, but more than half of those who used a mobile device (smartphone or tablet) in 2014 used a different device in 2016.
Finally, the Web Push+Email condition resulted in cost savings over the mail control group across two waves of follow-up data collection. The primary reason for this is the increased proportion of responders using the web mode, consistent with the emerging literature on this topic (Cernat and Lynn 2018; Millar and Dillman 2011). The finding points to the advantages of soliciting email addresses at baseline for use in follow-up surveys.
Strengths of the study include the longitudinal extension of a randomized experimental design based on a nationally representative sample of high school students in the United States. Limitations include that the original study did not include high school dropouts and that comparable cost estimates for the MTF Control group were not available. In summary, based on this experiment using nationally representative samples of high school graduates, we find continuing benefits of using a web push strategy for follow-up data collection and of using email addresses (where available) to invite participants.
Acknowledgements. This work was supported by the National Institute on Drug Abuse grants [R01DA001411 to R. Miech and L. Johnston, and R01DA016575 to J. Schulenberg and L. Johnston]. The content here is solely the responsibility of the authors and does not necessarily represent the official views of the sponsors.
Correspondence may be sent to Megan E. Patrick at the University of Minnesota Twin Cities, 1100 Washington Ave. S, Suite 101, Minneapolis MN 55415, 612-624-8063, or via email at: mpatrick@umn.edu.