Keeping track of sample persons between waves of data collection helps minimize attrition in longitudinal studies. All things being equal, the longer the time between data collection waves, the greater the likelihood that sample persons have moved, and the greater the difficulty in locating movers (Couper and Ofstedal 2009; Duncan and Kalton 2000). In 1997, the Panel Study of Income Dynamics (PSID) changed from annual to biennial interviewing. To capture residential changes between waves, PSID began sending families a “contact information update” mailing in the year between data collection waves. Families who update or verify their address and telephone information and return the card receive a $10 postpaid check. About half of the families responded to this mailing in recent waves. During 2007, families who provided this information required far less tracking and refusal conversion effort, and half as many contacts to complete an interview, underscoring the cost effectiveness of the mailing.
Given these advantages, we designed a study, fielded before 2009 production interviewing, to improve the response rate of the contact update mailing. Families were randomly assigned on the following conditions: $10 as a prepaid versus postpaid incentive, mailing design (traditional versus updated), being sent a study newsletter, and timing and frequency of the mailing (July versus October versus both times). This paper reports initial findings on response rates to the mailing under these conditions. Overall, there was no effect of incentive type, implying that postpaid incentives are more cost effective. The traditional design performed better than the updated design. Families receiving a second mailing had higher response rates than those receiving one mailing. There were some interaction effects with timing of mailing: October-only families had higher response rates when they also received a prepaid incentive and the newsletter. Hypotheses for these findings and next steps for the analysis are described.
Background
Various types of contact strategies to improve panel retention have been used in many studies (Couper and Ofstedal 2009; Laurie, Smith, and Scott 1999). However, we know of no experimental test of particular strategies. Although contact strategy effectiveness is understudied, the survey methods literature offers suggestions for improving response rates. First, under some conditions a prepaid incentive increases cooperation by heightening the salience of the incentive as well as the respondent’s sense of reciprocity (Singer et al. 1999). Second, an updated design of respondent contact materials should increase the salience of the request, which may enhance cooperation (Dillman 1987).
Further, the timing and frequency of the request may affect cooperation. Ideally, the request occurs close enough to the upcoming data collection that most residential changes are captured, but not so close that perceptions of respondent burden increase, as could occur with too many contacts. PSID interviewing occurs in odd years between March and November. Thus, two times of year were chosen for the mailing: midway between the end of the prior wave and the beginning of the next (i.e., July) and as far into the year as was feasible for updating addresses before production began (i.e., October). A third condition, a July mailing with an October re-mail to nonresponders, was chosen to examine the effect of two contacts versus one on response rates.
Finally, evidence on the ideal amount of respondent contact is scant. In the two-year timeline of a biennial survey, respondents participate in a lengthy interview, receive a study newsletter, are asked to update contact information, and receive a letter alerting them to the upcoming interview. At what point do these multiple contacts become burdensome, or do they in fact enhance perceptions of identification with the survey? It is likely that these perceptions vary by characteristics of sample members. Thus, in this study the manipulation of the mailing involved modifying aspects of the incentive, the design, the timing, and the amount of contact being made, with guidance from evidence in the survey methods literature.
Methods
Conditions
PSID families eligible for the 2009 interview (n=8,929) were randomly assigned on four factors, which defined a 2 (“newsletter”) × 3 (“timing”) × 2 (“design”) × 2 (“incentive”) experimental design (Figure 1). To manipulate the number of respondent contacts and the associated burden, half of the families were sent a study newsletter a year before interviewing began. The second factor was the timing of the mailings: one-third of families were mailed in July, one-third in October, and one-third initially in July with a follow-up mailing in October for nonresponders. Mailing design was the third factor, with half of the families receiving the traditional black-and-white design and half an updated design (Appendix I). The final factor varied whether the $10 incentive was prepaid or postpaid.
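To make the design concrete, the sketch below shows one way such a factorial random assignment could be implemented. It is a minimal illustration only: the variable names, level labels, seed, and cell-balancing scheme are our assumptions, not the actual PSID assignment procedure.

```python
# Illustrative sketch only -- not PSID production code. Shows one way
# families could be randomly assigned to the 2 x 3 x 2 x 2 factorial
# design described above. Names, labels, and the seed are assumptions.
import itertools
import random

FACTORS = {
    "newsletter": ["sent", "not_sent"],                  # 2 levels
    "timing": ["july", "october", "july_oct_followup"],  # 3 levels
    "design": ["traditional", "updated"],                # 2 levels
    "incentive": ["prepaid", "postpaid"],                # 2 levels
}

def assign_conditions(family_ids, seed=2009):
    """Assign each family to one of the 24 factorial cells, keeping
    cell sizes as balanced as possible."""
    cells = list(itertools.product(*FACTORS.values()))  # 24 cells
    rng = random.Random(seed)
    ids = list(family_ids)
    rng.shuffle(ids)  # random order of families
    return {
        fid: dict(zip(FACTORS, cell))
        for fid, cell in zip(ids, itertools.cycle(cells))
    }

# Example: the 8,929 families eligible for the 2009 interview.
assignments = assign_conditions(range(1, 8930))
print(assignments[1])
```

Cycling through the cells after shuffling keeps cell sizes nearly equal; drawing a cell independently for each family would be an equally valid, if less balanced, randomization.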
Results
Sixty percent of families provided updated or verified contact information. This rate did not vary by whether the family was sent a newsletter or by incentive type. There was a significant effect of timing, with a higher response rate for July-October follow-up cases (67.2 percent) than for July-only (57.8 percent) and October-only (55.8 percent) cases. Finally, the traditional design had a significantly higher response rate than the updated design (62.3 percent versus 58.0 percent).
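As an illustration of how such response-rate comparisons might be computed, the hedged sketch below tabulates response rates by an experimental factor and applies a chi-square test of independence. The data frame and column names are hypothetical stand-ins for the PSID analysis files, and the actual test procedures used are not described here.

```python
# Hedged sketch of the response-rate tabulation; the data frame `df`
# and its column names are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

def response_rates_by(df, factor):
    """Response rate per level of `factor`, plus a chi-square test of
    independence between the factor and response (0/1)."""
    table = pd.crosstab(df[factor], df["responded"])  # levels x {0, 1}
    rates = table[1] / table.sum(axis=1)              # share responding
    chi2, p, dof, _ = chi2_contingency(table)
    return rates, chi2, p

# e.g. response_rates_by(df, "timing") or response_rates_by(df, "design")
```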
The traditional design performed significantly better regardless of newsletter or incentive type, and there were no differential effects of incentive by newsletter or design type.
Logistic regression models examined the effects of newsletter, incentive, and design within each timing condition. Among July-only cases, the traditional design performed significantly better than the updated design, and there was no effect of incentive or newsletter. Among October-only cases, significantly higher response rates were associated with the traditional design, the prepaid incentive, and being mailed the newsletter. Finally, there were no significant effects of newsletter, incentive, or design among July-October follow-up cases.
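A minimal sketch of this stratified analysis, assuming the hypothetical family-level data frame and factor names from the earlier sketches, might look as follows; it mirrors the structure of the models described above rather than reproducing their exact specification.

```python
# Hedged sketch: logistic regressions of response on the experimental
# factors, fit separately within each timing condition. Column names
# are assumptions carried over from the previous sketches.
import statsmodels.formula.api as smf

def fit_within_timing(df):
    """Fit response ~ newsletter + incentive + design separately within
    each timing condition; return the fitted models keyed by timing."""
    models = {}
    for timing, sub in df.groupby("timing"):
        models[timing] = smf.logit(
            "responded ~ C(newsletter) + C(incentive) + C(design)",
            data=sub,
        ).fit(disp=0)  # disp=0 suppresses convergence output
    return models

# e.g. fit_within_timing(df)["october"].summary()
```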
Discussion
What lessons can be learned from this experiment? First, a follow-up mailing to nonresponders is an effective, low-cost strategy that may ultimately reduce the need for expensive tracking during production. This condition yielded response rates 9.4 to 11.4 percentage points higher than the one-time mailing conditions (67.2 percent versus 57.8 and 55.8 percent).
Second, and unexpectedly, the traditional design performed better. In both designs, the last known contact information was preprinted on a card that respondents folded over, sealed with an attached sticker, and mailed back. The updated design also included lengthy instructions on tearing away the card before folding it. Perhaps these additional instructions made returning the card seem complicated and discouraged compliance. We will test this in 2010 by removing the “tear step” and its instructions and modifying only the color and design.
Third, the prepaid incentive performed better than the postpaid incentive only in the October condition. Perhaps a payment in hand had become comparatively more appealing by October, as the U.S. economy began to unwind. Overall, however, prepaid incentives were not cost effective, as they did not increase response rates relative to postpaid incentives. The finding that October-only families who were sent the newsletter had higher response rates than those who were not may reflect the importance of maintaining contact with study families. October-only families who were not sent the newsletter had had no study contact for at least 10 months, and most had had none for 16 months or longer; their response rate was four percentage points lower than that of families who received the newsletter. Taken together with the higher response rates in the follow-up condition, these findings suggest that the most promising timing is an initial contact approximately eight months before production starts, with a follow-up three months later if needed.
We will next examine whether these conditions affected operational burden during 2009 production interviewing, including tracking and refusal conversion rates and the number of contact attempts needed to obtain a final result. This information will help us design an effective strategy for keeping track of panel families.