Sample attrition, a constant concern in longitudinal studies, occurs when the burden of participation begins to outweigh the rewards (Apodaca, Lea, and Edwards 1998). Over the past several years, an increasing number of facilities in the annual National Survey of Substance Abuse Treatment Services (N-SSATS) have complained about the burden of yearly participation. In response, Mathematica embedded an experiment in the 2008 N-SSATS that tested whether prefilling some fields in the annual questionnaire would encourage continued participation. The appeal of such an option was unclear, since prefilled questions must still be read and, if incorrect, updated.
N-SSATS is sponsored by the Substance Abuse and Mental Health Services Administration (SAMHSA). The project surveys facilities that provide substance abuse treatment services, approximately 17,000 facilities annually.
The Experiment
The experiment embedded in the 2008 survey had the following features:
- Sample. Prefilling questions requires responses from the previous year. Consequently, only facilities that had responded to the 2007 survey, which had a 95 percent response rate, were eligible for the prefill experiment. These eligible facilities were stratified by facility type and randomly assigned to either the treatment group (N=6,187) or the control group (N=6,092).
- Prefill questions. Facility-level questions—such as those asking about type of facility and services offered—were good candidates for prefilling. These questions accounted for about half the items in this 30-minute questionnaire.
- Experimental treatment. Treatment group facilities were offered a prefill option that promised some of their questions would be prefilled with their 2007 responses if they chose to complete the questionnaire online. Treatment group members who chose to complete a paper questionnaire or respond by phone, however, would not be provided with prefilled responses.
- Procedures. Data collection procedures were identical for the treatment and control groups, except for the cover and reminder letters. Treatment group letters offered the prefill option for those who chose to respond online and asked facilities that responded online to carefully review the prefill information, updating it as needed.
- Hypotheses. Since measuring sample attrition over a one-year span is difficult, we chose to examine any potential impact of a prefill option by looking at: 1) response rates; 2) proportion of Web completes; 3) Web completion times; 4) rate of return (measured by the number of facilities that needed a second mailing or telephone follow-up); 5) phone follow-up effort (measured by the mean number of telephone attempts); 6) data quality (measured by comparing how often the treatment and control groups changed their 2007 responses to the questions designated for prefilling in 2008); and 7) overall impressions (measured by responses to debriefing questions).
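The assignment step described above can be sketched as follows. This is a minimal illustration of stratified random assignment, not the survey's actual procedure; the facility types and random seed are assumptions for the example.

```python
import random

def assign_groups(facilities, seed=2008):
    """Stratify facilities by type, then randomly split each stratum
    between the treatment and control groups.

    facilities: list of (facility_id, facility_type) tuples.
    Returns (treatment_ids, control_ids).
    """
    rng = random.Random(seed)  # fixed seed for a reproducible sketch
    # Group facility IDs into strata by facility type.
    strata = {}
    for fac_id, fac_type in facilities:
        strata.setdefault(fac_type, []).append(fac_id)
    treatment, control = [], []
    # Shuffle each stratum and split it in half, so both groups
    # contain a similar mix of facility types.
    for members in strata.values():
        rng.shuffle(members)
        half = len(members) // 2
        treatment.extend(members[:half])
        control.extend(members[half:])
    return treatment, control

# Hypothetical example: 100 facilities of two assumed types.
facs = [(i, "residential" if i % 2 else "outpatient") for i in range(100)]
treatment_ids, control_ids = assign_groups(facs)
```

Because each stratum is split separately, the two groups stay balanced on facility type even when the types are unevenly represented in the sample.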
Key Findings
All of the comparisons, except for Web completion times, were made on the group level. Table 1 provides the results for:
- Response rates. Less than one percentage point separated the two groups. However, given the sample sizes, even this small difference is statistically significant, although not meaningful enough to suggest a change in procedures.
- Proportion of Web completes. Web completes were six percentage points higher in the treatment group, suggesting that the prefill option had affected survey mode choice.
- Web completion times. They averaged four minutes less in the treatment group, suggesting the prefill option reduced respondent burden slightly.
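A two-proportion z-test illustrates why a gap of under one percentage point can still be statistically significant with roughly 6,000 facilities per group. The response counts below are hypothetical, chosen only to produce a sub-one-point difference at the experiment's group sizes; they are not the survey's actual figures.

```python
from math import sqrt

def two_proportion_z(successes1, n1, successes2, n2):
    """z statistic for the difference between two proportions,
    using the pooled standard error."""
    p1, p2 = successes1 / n1, successes2 / n2
    pooled = (successes1 + successes2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical response rates of about 95.9% vs. 95.0%,
# a difference of less than one percentage point.
z = two_proportion_z(5933, 6187, 5787, 6092)
# |z| > 1.96 indicates significance at the 5 percent level.
```

With these group sizes the pooled standard error is only about 0.4 percentage points, so even a 0.9-point difference clears the conventional 1.96 threshold.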
Given N-SSATS’ historically high response rates, it is not too surprising that the prefill intervention made little difference. However, the nearly six percentage point difference in the proportion of Web completes helped minimize paper questionnaire editing and data processing costs.
Cost Issues. Our hope was that offering a prefill option would create an early interest in responding, leading to a smaller second mailing and fewer cases requiring telephone follow-up. We had also assumed that once telephone follow-up began, a reminder about the prefill option might cause treatment group members to respond more readily, resulting in fewer follow-up calls.
As shown in Table 2 below, although all three outcomes were in the expected direction, less than one percentage point separated the two groups on all three measures. Given that half of the 2008 facilities had been in N-SSATS a decade or longer, the absence of speedier responses may reflect well-established response routines that are not easily changed.
Data Quality. We were concerned that a prefill option might cause respondents to be less careful about the information they submitted. To gauge this, we added a few debriefing questions at the end of the treatment group Web questionnaire. One asked how carefully respondents had reviewed the prefilled questions before submitting their responses. Nearly 90 percent said they had reviewed “all” of the prefilled responses, and another 8.7 percent said they had reviewed “most” of the information. To assess differences in reporting patterns more objectively, we compared how often the treatment and control groups changed the information they had reported in 2007 for the questions selected for prefilling in 2008. As shown in Table 3, the percentage of changed or updated questions was higher in the control group (16 percent) than in the treatment group (11 percent), suggesting respondents may have been more careful about responding when the prefill option was not available.
Perceived Burden. The most important outcome of this experiment may have been more perceptual than factual. Treatment group members who completed by Web were asked several debriefing questions at the end of the questionnaire. One asked if the respondent would like to see the prefill option offered in the future; 98.7 percent replied “Yes.” Moreover, hundreds wrote to say how “very helpful” the prefill option had been. Examples of specific comments included: “Very helpful!!!!”, “Saved a lot of time.”, “This was fabulous!” and “EXTREMELY EFFICIENT.”
Summary
As noted by Apodaca et al. (1998), the burden of participation has two components: the immediate and the longitudinal. The immediate burden is the length of the current interview; the longitudinal burden is the perception of future burden. In reality, the prefill option shortened the average completion time by only four minutes, but the perception of time saved appears to have been far greater. This perception of time saved may therefore work to minimize future sample attrition.
Postscript (2010)
Based on the 2008 N-SSATS experiment, the prefill option was offered as part of the 2009 survey. To assess the potential impact, we looked at the response rates and proportion of Web completes in the two years immediately preceding the 2008 experiment and then in 2009, the year immediately following the experiment. The results are in Table 4 below.
In the years immediately preceding the prefill option, the response rate was dropping. In fact, the 1.8 percentage point drop between 2006 and 2007 was the largest single-year decrease since Mathematica began conducting N-SSATS in 1997. During the experiment year (2008) and the following year (2009), the response rates still dropped, but by only about a quarter of the declines seen in the previous two years. Thus, it appears the prefill option may have stabilized sample attrition, at least temporarily. While the impact of the prefill option on Web completes is less clear, the jump between 2008 and 2009 was the largest single-year increase since the Web survey was added in 2002.