In an effort to prevent non-response, survey practitioners are increasingly offering refusal conversion incentives to reluctant respondents (Singer 2002). However, despite the potential for preventing attrition, these types of differential incentives are generally avoided in longitudinal studies. Of particular concern to survey practitioners in deciding to offer additional incentives is the possibility respondents will come to expect higher incentives in future waves of the study (Laurie and Lynn 2009). If this is the case, the introduction of refusal conversion incentives to a panel study could lead to greater reliance on such incentives in future waves (Singer 2002; Steeh et al. 1999), resulting in escalating respondent incentive costs over time.
There is relatively limited knowledge of the long-term effects of tailoring incentive strategies to respondent behavior (Laurie and Lynn 2009). In what appears to be the only existing investigation of refusal conversion incentives in a panel study, Lengacher and colleagues (1995) examined the influence that refusal conversion incentives had on respondents’ future participation. They found that, among respondents needing refusal conversion, those receiving additional incentives and those receiving non-monetary conversion approaches had nearly indistinguishable participation rates in a later wave.
This paper expands on this research by exploring the effect of refusal conversion incentives on both participation and incentive costs in multiple waves of a panel study. We conducted our analysis in the context of the National Social Life, Health, and Aging Project (NSHAP), a longitudinal, population-based study of health and social factors. NSHAP data collection includes three components: a two-hour in-home CAPI interview, extensive biomeasure collection, and a leave-behind self-administered questionnaire.
Wave 1 was completed in 2005 and 2006 by NORC at the University of Chicago and Principal Investigators at the University of Chicago. From a national multi-stage area probability sample of community-dwelling adults aged 57 to 85, we collected data from 3,005 respondents (AAPOR RR2 of 75.5 percent; AAPOR 2011). Wave 2, conducted in 2010 and 2011, returned to Wave 1 respondents as well as Wave 1 sample members who had not been interviewed but had not been hostile refusals. We also interviewed the spouses and cohabiting romantic partners of Wave 1 respondents. Data from 3,377 respondents were collected in the second wave (AAPOR RR2 of 76.9 percent; AAPOR 2011), including 2,260 respondents first interviewed in Wave 1 (an 87.8 percent retention rate).
Wave 1 Background
At the outset of Wave 1, we offered an initial incentive of $100 to all NSHAP respondents. Toward the end of Wave 1 data collection, we conducted a split-ballot incentive experiment in which 250 “final refusal” cases were randomly assigned to two groups. We offered $200 to one group of 125 respondents and $400 to the other group of 125. Respondents were notified by mail of the incentive offer and encouraged to call a toll-free number if they wanted to participate. Twenty-three calls were received regarding the increased incentives, and 13 interviews were completed at the increased incentive levels. Following the experiment, we introduced a $400 incentive for all remaining Wave 1 pending cases. During the final month of data collection, we increased the incentive offer to $500. Ultimately, 221 Wave 1 NSHAP respondents received an incentive payment greater than the original $100. By offering the increased incentive, we gained an additional seven percentage points in the final response rate (Smith et al. 2009; Wargo et al. 2007).
Wave 2 Methods
Based on the success of the Wave 1 experiment, we offered Wave 2 respondents the same $100 initial incentive followed by a phased increase. After 26 weeks of fieldwork, the incentive offer was raised to $300 for all pending Wave 1 respondents who had received more than $100 in Wave 1. Five weeks later, the offer was raised to $400 and extended to all pending cases.
In analyzing the incentive distributions across Waves 1 and 2, we were interested in three main questions: 1) What socio-demographic factors, if any, were correlated with Wave 2 incentive acceptance?; 2) Is there a relationship between Wave 1 incentive acceptance and Wave 2 participation?; and 3) What effect, if any, did Wave 1 refusal conversion incentives have on Wave 2 incentive costs? In the following discussion of the results, we refer to respondents who accepted $100 as “compliant,” and to those who declined to participate for $100 but ultimately participated for a larger amount as “reluctant.”
In the first wave, 2,784 respondents received $100, 18 received $200, 4 received $300, 163 received $400, and 36 received $500. Of the 3,005 respondents who participated in the first wave, 2,533 were eligible to participate in Wave 2, and 2,260 participated again in the second wave. In Wave 2, 2,097 returning Wave 1 respondents received $100, 25 received $300, and 138 received $400. In the analysis that follows, we focus on those respondents who participated in Wave 1 and were eligible to participate in Wave 2.
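As a consistency check, the incentive distributions reported above can be tallied directly. This minimal sketch uses only the counts stated in the text; the variable names are illustrative:

```python
# Wave 1 incentive distribution among the 3,005 Wave 1 respondents (reported above)
wave1 = {100: 2784, 200: 18, 300: 4, 400: 163, 500: 36}
# Wave 2 incentive distribution among returning Wave 1 respondents (reported above)
wave2 = {100: 2097, 300: 25, 400: 138}

wave1_n = sum(wave1.values())           # total Wave 1 respondents
wave2_n = sum(wave2.values())           # total returning respondents in Wave 2
wave1_reluctant = wave1_n - wave1[100]  # Wave 1 respondents converted above $100

print(wave1_n, wave2_n, wave1_reluctant)  # 3005 2260 221
```

The 221 figure recovered here matches the count of increased-incentive recipients reported in the Wave 1 background above.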
First, we analyzed socio-demographic characteristics (gender, race, education, and income) and found none to be significantly correlated with Wave 2 compliance.
Next, we looked at the relationship between Wave 1 incentives and Wave 2 participation. Table 1 shows the relationship between Wave 1 and Wave 2 incentive acceptance among those respondents who participated in Wave 1.
In comparing W1 Compliant and W1 Reluctant Respondents, W1 Compliant Respondents were more likely to be compliant again in Wave 2 (85 percent), while W1 Reluctant Respondents were more likely to be reluctant (26 percent) or refuse (15 percent) in Wave 2. However, looking only at W1 Reluctant Respondents, 59 percent were compliant in Wave 2 (accepting the $100 initially refused in Wave 1) and 26 percent participated in Wave 2 once presented with an additional incentive. In total, 90 percent (n=2,097) of W1 Compliant Respondents and 85 percent (n=163) of W1 Reluctant Respondents participated in Wave 2.
We also examined the net change in incentives accepted in Wave 1 and Wave 2 for respondents participating in both waves. These changes are summarized in Table 2.
Among W1 Compliant Respondents participating in Wave 2, 95 percent completed with no net change in incentive between waves, receiving the same incentive of $100 in each wave. Only five percent of W1 Compliant Respondents received a higher incentive (an increase of $300) in Wave 2. However, almost 89 percent of W1 Reluctant Respondents participating in Wave 2 did so for less money (12 percent accepted $400 less, 50 percent accepted $300 less, 3 percent accepted $200 less, and 24 percent accepted $100 less) and 10 percent participated for the same amount as they had in Wave 1. Only one percent of W1 Reluctant Respondents participating in Wave 2 did so at a higher incentive. In total, 95 percent of all Wave 1 Respondents that participated in Wave 2 did so for the same incentive or less than they had accepted in Wave 1.
Table 3 translates these net changes into the difference in dollars spent between waves.
W1 Compliant Respondents received a total of $34,200 more in Wave 2. However, since 144 of the 163 W1 Reluctant Respondents participated at a lower incentive and 17 participated for the same amount, a total of $36,400 in incentives was saved on W1 Reluctant Respondents in Wave 2. Overall, $2,200 in incentives was saved interviewing Wave 1 respondents in Wave 2.
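These dollar figures can be reconstructed as a short piece of arithmetic. The sketch below assumes, per the counts reported elsewhere in this paper, that 114 W1 Compliant Respondents each received a $300 increase in Wave 2, and takes the $36,400 reluctant-group savings as reported:

```python
# Extra Wave 2 spending on W1 Compliant Respondents: 114 respondents
# (the count reported in this paper) each moved from $100 to $400.
compliant_extra = 114 * 300   # $34,200 in additional incentives

# Savings on W1 Reluctant Respondents, as reported in the text
reluctant_saved = 36_400

net_saved = reluctant_saved - compliant_extra
print(compliant_extra, net_saved)  # 34200 2200
```

The net figure matches the $2,200 overall savings reported above.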
Another way to compare the incentive costs between waves is to look at incentive dollars spent per case. For W1 Compliant Respondents, we spent an average of $100 in incentives per case in Wave 1. In Wave 2, we spent an average of $116.24 in incentives for this group, or roughly 16 percent more than we had in Wave 1. For W1 Reluctant Respondents, we spent an average of $398.16 in incentives in Wave 1. In Wave 2, we spent an average of $174.85 in incentives for this group, or roughly 56 percent less than in Wave 1. When combining these groups, we spent on average $121.50 in incentives per case in Wave 1 and $120.52 in incentives per case in Wave 2.
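The per-case averages can likewise be recomputed from the Wave 2 incentive distribution. The sketch below assumes the split implied by the counts in this paper: of the 2,097 returning W1 Compliant Respondents, 114 received $400 and the rest $100, with the remainder of each Wave 2 payment level going to the 163 W1 Reluctant Respondents. The reluctant-group average reproduces the reported $174.85 exactly; the compliant-group average comes out to $116.31, a few cents off the reported $116.24, presumably reflecting rounding in the underlying counts:

```python
# Assumed Wave 2 incentive splits among returning Wave 1 respondents
compliant_w2 = {100: 2097 - 114, 400: 114}          # 2,097 W1 Compliant Respondents
reluctant_w2 = {100: 114, 300: 25, 400: 138 - 114}  # 163 W1 Reluctant Respondents

def per_case(dist):
    """Average incentive dollars per case for a {amount: count} distribution."""
    return sum(amt * n for amt, n in dist.items()) / sum(dist.values())

print(round(per_case(reluctant_w2), 2))  # 174.85, matching the reported figure
print(round(per_case(compliant_w2), 2))  # 116.31, close to the reported $116.24
```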
The results of our analysis show the majority of W1 Compliant and Reluctant Respondents, 90 percent (n=2,097) and 85 percent (n=163) respectively, participated in Wave 2. In addition, 93 percent (n=2,097) of returning respondents participated at the base incentive level of $100, and 88 percent (n=2,000) had no change in the incentive received across waves. Results suggest graduated incentives in Wave 1 did not condition respondents to delay participation until an additional incentive was offered in Wave 2. Indeed, 88 percent of W1 Reluctant Respondents accepted a lower incentive in Wave 2, resulting in a net savings of $36,400 for incentives paid to W1 Reluctant Respondents.
Of particular interest was the finding that five percent (n=114) of W1 Compliant Respondents received the largest increase in Wave 2. We hypothesize that all surveys contain a minority of difficult-to-convert respondents and that these respondents are not the same individuals across waves. Given no significant socio-demographic differences (race, gender, education, and income) between compliant and reluctant respondents, we suggest intangible factors such as timing of the survey or experience with the interviewer or survey may play a larger role than demographic variables or prior incentive received in influencing participation decisions. It is also important to note that even with increased incentives, some respondents refuse participation; 11 percent (n=273) of eligible Wave 1 respondents refused to participate in Wave 2 even after the highest incentive was offered.
Overall, using graduated incentives produced essentially no difference in incentive costs between waves ($2,200 was saved on the panel, translating into a savings of $0.98 per case). These findings, coupled with NSHAP’s comprehensive data collection activities, support our plans to continue using graduated incentives on NSHAP. We acknowledge the incentives used in NSHAP may be cost-prohibitive for many surveys. We hope survey practitioners will be encouraged by the principle behind our findings and use these results to inform their use and cost decisions regarding graduated incentives in longitudinal studies.
NSHAP is supported by the National Institutes of Health (R01-AG021487, R37-AG030481, and R01-AG033903), including the National Institute on Aging, the Office of Research on Women’s Health, and the Office of AIDS Research.
The initial $100 incentive (and any subsequent increases in both waves) was reviewed by the Institutional Review Boards (IRBs) of both NORC and the University of Chicago Social and Behavioral Sciences Division.
A total of 472 Wave 1 respondents were ineligible at the time of the Wave 2 interview: deceased (n=432), in a nursing home (n=7), unlocatable (n=12), or outside the United States (n=7). A total of 273 Wave 1 respondents were eligible but did not participate in Wave 2, either because they were in too poor health to participate (n=120) or because they refused for other reasons (n=153).