For many years, the telephone was the mode of choice for survey practitioners because it offered high coverage, quick turnaround, and relatively low cost. However, falling response rates and concerns about coverage have led researchers to consider other modes of data collection and sampling, and some are turning to address based samples and mail surveys to deal with these issues. Converting an existing Random Digit Dial (RDD) frame survey to an Address Based Sample (ABS) requires careful consideration. Several papers at the 2009 American Association for Public Opinion Research (AAPOR) Annual Meeting examined ABS, comparing its advantages and disadvantages relative to other sampling schemes while searching for the optimal implementation of each design.
One of the first concerns a researcher may have when considering whether or not to use ABS is the quality of coverage that can be achieved. All RDD surveys must contend with problematic phone numbers in the sampling frame: a sampled person may have multiple phone numbers, nonworking numbers may be sampled, and business telephones may be sampled when only residential numbers are desired. ABS faces analogous issues, except that the problems stem from addresses. Throwbacks, cases in which mail is delivered to a person’s post office box rather than their residential address, may occur. Undeliverable addresses pepper the typical sample because of vacant dwellings and “no-stat” dwellings, which are not counted as possible mail delivery points. Seasonal, temporary, or second residences can enter the sample when only primary residences are desired (Fahimi et al., 2009).
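The frame-cleaning step these problems imply can be sketched as a simple filter applied before sampling. The record layout and flag names below are purely illustrative, not actual USPS DSF field names:

```python
# Hypothetical sketch of cleaning an address frame before sampling.
# Flag names (is_vacant, is_no_stat, is_seasonal, is_po_box) are
# illustrative stand-ins, not actual USPS DSF field names.

def eligible_for_frame(record):
    """Keep only deliverable, primary, residential street addresses."""
    if record.get("is_vacant") or record.get("is_no_stat"):
        return False  # undeliverable delivery points
    if record.get("is_seasonal"):
        return False  # seasonal/second residences excluded
    if record.get("is_po_box"):
        return False  # throwback-style PO box delivery
    return True

frame = [
    {"address": "12 Main St", "is_vacant": False, "is_no_stat": False,
     "is_seasonal": False, "is_po_box": False},
    {"address": "9 Lake Rd", "is_vacant": False, "is_no_stat": False,
     "is_seasonal": True, "is_po_box": False},
    {"address": "PO Box 4", "is_vacant": False, "is_no_stat": False,
     "is_seasonal": False, "is_po_box": True},
]

clean_frame = [r for r in frame if eligible_for_frame(r)]
print([r["address"] for r in clean_frame])  # → ['12 Main St']
```

In practice the exclusion rules would depend on the survey's target population; for instance, a study of vacation-home owners would keep rather than drop seasonal residences.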
Additionally, just as telephone samples omit people without identifiable telephone numbers, ABS omits people without mailable addresses. ABS relies on sampling frames of addresses, most often derived from the national United States Postal Service (USPS) Delivery Sequence File (DSF). Two of the presentations at the AAPOR Annual Meeting focused on the quality of coverage achievable using the DSF for ABS. English et al. (2009) conducted a study to validate addresses from the DSF. Their work indicated that the DSF provides better coverage in urban areas than in rural areas, though rural coverage has improved over time. Fahimi et al. (2009) note that 911 services have driven the improvement in rural DSF coverage by forcing addresses to convert from rural routes to city-style street addresses. To supplement gaps in rural DSF coverage, McMichael et al. (2009) developed an in-person system for adding addresses to a given sample. The system, called Check for Housing Units Missed (CHUM), may help strengthen ABS for future surveys.
Response Rates and Other Concerns among Address Based Samples
A few of the studies presented at the AAPOR Annual Meeting compared RDD and ABS samples. Recently, serious coverage and response problems have emerged with RDD designs, such as the inability to contact cell phone only (CPO) households. Estimates from the National Center for Health Statistics indicate that 22.7 percent of American homes had only cellular telephones and no landline as of June 2009 (Blumberg and Luke, 2009). When comparing ABS samples to RDD ones, it was generally found that ABS had a higher sample yield, a more efficient production rate, and better access to hard-to-reach populations such as racial and ethnic minority groups (Barron, 2009; Johnson et al., 2009; Fleeman and Wasikowski, 2009; Link et al., 2009; Sherr et al., 2009). However, ABS samples require a longer turnaround period than RDD samples; Barron (2009) concludes that the sequential nature of the modes of contact can easily double or triple time in the field compared with a straight CATI system. One study comparing the two designs also obtained a lower response rate from the ABS sample (Sherr et al., 2009). It is therefore critical to find the best ways of implementing ABS in order to truly reap any benefits over RDD.
Implementing ABS: Pre-recruitment by Mail
One design issue associated with ABS surveys is how to get a sampled individual to complete an instrument when several people at one address, or no people at all, may be eligible for the survey. Presentations at the AAPOR Annual Meeting demonstrated differing approaches to pre-recruiting or screening would-be respondents.
In Dekker and Murphy’s (2009) study, a postcard was sent to households for which a phone number could not be matched. The purpose of these postcards was to obtain a valid telephone number from the respondent, and no study suggested that respondents were reluctant to share this information. Minimal additional information about the household was collected, so as to keep respondent burden low and the propensity to return the postcard high. Overall response rates for initial postcards used as a pre-recruitment approach (i.e., screeners) were generally low. However, Johnson, Fawson, and Williams (2009) noted that of the mere 2.1 percent of their ABS sample who returned the screener and consented to participate, a relatively high percentage (67.4 percent) also completed the full survey. This may indicate that individuals who did not want to be burdened with a second survey chose not to return the first one at all, which would present a problem for a study trying to produce population estimates.
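The compounding cost of a two-stage contact can be seen in a back-of-the-envelope calculation (ours, not from the paper) using the rates Johnson, Fawson, and Williams report:

```python
# Back-of-the-envelope yield for a two-stage (screener + survey) design,
# using the rates reported by Johnson, Fawson, and Williams (2009).
screener_return_rate = 0.021       # 2.1% of the ABS sample returned the screener
completion_given_screener = 0.674  # 67.4% of returners finished the full survey

# Overall yield: fraction of sampled addresses producing a completed survey.
overall_yield = screener_return_rate * completion_given_screener
print(f"{overall_yield:.4f}")  # → 0.0142, i.e. ~1.4% of sampled addresses
```

In other words, despite the strong conditional completion rate, fewer than 2 completed surveys result from every 100 sampled addresses, which illustrates why the screener stage, not the survey itself, is the binding constraint in this design.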
One potential solution to the screener problem is to eliminate the need for the respondent to complete multiple questionnaires. Link et al. (2009) conclude that it may be more effective to send sampled addresses all of the survey materials at once, forgoing a pre-recruitment contact altogether. Elver, Dykema, and Stevenson (2009) did just that: they mailed the full survey package (which included a postcard to be returned if no one in the household was eligible) and simply asked that the survey be completed by someone within the desired gender and age categories. They acknowledge that this method would not suffice had they been aiming to estimate accurate prevalence rates, but they did not want to risk losing potential respondents through a complicated screening procedure. In total, 32 percent of all households with a deliverable address returned the screenout postcard. Clearly, more research is needed to find an effective and accurate way of handling a screening process by mail.
Implementing ABS: Survey Mode
Another decision facing any researcher who opts for an ABS design is how to contact sample members in a way that yields high response rates. On the one hand, materials sent by mail can easily be lost, ignored, or never returned, as evidenced by the common practices of offering replacement surveys and contacting sampled addresses multiple times. On the other hand, phone calls in RDD surveys are often easy to ignore, too. For example, in the 2007 National Household Education Survey, a national RDD survey conducted by the National Center for Education Statistics, over 10 percent of the numbers determined to be working residential numbers had been contacted at least 10 times with no response (Hagedorn et al., 2009).
The majority of the ABS papers presented at the Annual Meeting discussed using the telephone for contacting respondents. Researchers tried to match the households sampled with a telephone number (through telephone directories, commercial listings, etc.; see Dekker and Murphy, 2009), and make the telephone the initial (and primary) method of collecting information.
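A minimal sketch of that matching step, assuming a simple lookup table standing in for the telephone directories and commercial listing services (all names and numbers here are illustrative):

```python
# Illustrative address-to-phone matching for an ABS sample.
# The directory dict stands in for telephone directories or commercial
# listing services; addresses and numbers are hypothetical examples.

directory = {
    "12 main st, springfield": "555-0101",
    "34 oak ave, springfield": "555-0102",
}

sample = ["12 Main St, Springfield", "78 Elm Dr, Springfield"]

matched, unmatched = [], []
for addr in sample:
    phone = directory.get(addr.lower())  # crude normalization: lowercase only
    if phone:
        matched.append((addr, phone))  # route to telephone (CATI) contact
    else:
        unmatched.append(addr)         # route to mail contact, e.g. a postcard
                                       # requesting a phone number, as in
                                       # Dekker and Murphy (2009)

print(len(matched), len(unmatched))  # → 1 1
```

Real matching services apply far heavier address standardization than the lowercase normalization shown here; the point is simply that the match outcome partitions the sample into a telephone-first group and a mail-first group.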
Messer and Dillman (2009) investigated the use of ABS for completion of a Web survey. They found that response rates were higher when respondents were mailed a request to complete the survey by mail than when they were asked to complete it on the Web.
The body of research devoted to the transition to ABS is still limited and new. However, since many household surveys require an initial screening or within-household sampling, the advantages of contacting respondents by phone seem to remain even when using an address based frame. It is much easier to learn about household composition for a screening process from a phone call than to rely on a mail screener with a lengthy turnaround time and generally low response rates. It is also desirable to limit the burden of response to a single session when possible. Looking ahead, it will be crucial for researchers to ensure that households obtained through ABS can be matched to phone numbers, since phone interviews have not lost their usefulness.
In conclusion, the papers presented at the 2009 AAPOR Annual Meeting related to ABS revealed the following key findings:
ABS coverage is generally good and has been improving. Traditionally, the USPS DSF allowed for better coverage of urban areas than rural areas. The Check for Housing Units Missed (CHUM), an in-person address verification system, may help mitigate some of the gaps in coverage from the DSF.
Response rates for surveys using ABS vary by survey design, and can be higher than RDD.
ABS, compared to RDD frame samples, has been found to increase production rates and to provide access to hard-to-reach populations such as racial and ethnic minority groups.
Screening for eligibility for a survey by mail presents challenges. Screener mailings yield low response rates, though those who respond to a mail screener complete the full survey at higher rates than may be expected. Eliminating the need for a separate pre-recruitment contact yields better response.
Survey mode varies. A common ABS survey design is to use mail to collect telephone numbers for addresses that cannot be matched ahead of time and complete the survey by telephone.
Paul, Jamie, and Sarah Grady. 2010. “Summary of Address Based Sampling Papers from 2009 AAPOR Conference.” Survey Practice, April: www.surveypractice.org.
Barron, M. (May, 2009). Multimode Surveys Using Address-Based Sampling: The Design of the REACH U.S. Risk Factor Survey. Paper presented at the annual conference of the American Association for Public Opinion Research, Hollywood, FL.
Blumberg, S. and Luke, J. (December 2009). Wireless Substitution: Early Release of Estimates from the National Health Interview Survey, January-June 2009. National Center for Health Statistics. Retrieved December 23, 2009 from http://www.cdc.gov/nchs/data/nhis/earlyrelease/wireless200912.htm.
Dekker, K. and Murphy, W. (May, 2009). Address-Based Sampling and Address Matching: Experience from REACH US. PowerPoint presented at the annual conference of the American Association for Public Opinion Research, Hollywood, FL.
Elver, K., Dykema, J., Stevenson, J. (May, 2009). Effects of Alternative Forms of Addressing Households on Response Rates and Costs in a Mail Survey. PowerPoint presented at the annual conference of the American Association for Public Opinion Research, Hollywood, FL.
English, N., O’Muircheartaigh, C., Latterner, M., Eckman, S., Dekker, K. (May, 2009). Modeling the Need for Traditional Vs. Commercially-Available Address Listings for In-Person Surveys: Results from a National Validation of Addresses. Paper presented at the annual conference of the American Association for Public Opinion Research, Hollywood, FL.
Fahimi, M., Kulp, D., Malarek, D. (May, 2009). A Short Course in Address-Based Sampling (Merits, Design & Implementation, and Review of Case Studies). Short course presented at the annual conference of the American Association for Public Opinion Research, Hollywood, FL.
Fleeman, A. and Wasikowski, N. (May, 2009). Performance Rates of CPO Subsequent Survey Households Identified Via Address Frames. Paper presented at the annual conference of the American Association for Public Opinion Research, Hollywood, FL.
Hagedorn, M., Roth, S.B., Carver, P., Van de Kerckhove, W., and Smith, S. (2009). National Household Education Surveys Program of 2007: Methodology Report. (NCES 2009-047). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education. Washington, DC.
Johnson, P., Fawson, B., Williams, D. (May, 2009). Comparing Response Rates of Multi-Mode Recruit Single Mode Address-Based Sampling and Random-Digit Dialing Sampling. Paper presented at the annual conference of the American Association for Public Opinion Research, Hollywood, FL.
Kudela, M., Wivagg, J., Choinière, C., Lando, A. (May, 2009). Making the Most of a Traditional Landline RDD Survey in the 21st Century: A Case Study. Paper presented at the annual conference of the American Association for Public Opinion Research, Hollywood, FL.
Link, M., Daily, G., Shuttles, C., Yancey, L., Thu Burks, A., Bourquin, H.C. (May, 2009). Building a New Foundation: Transitioning to Address Based Sampling After Nearly 30 Years of RDD. Paper presented at the annual conference of the American Association for Public Opinion Research, Hollywood, FL.
McMichael, J., Ridenhour, J., Shook-Sa, B., Iannacchione, V. (May, 2009). Evaluation of Address Based Sampling (ABS) Frame Supplementation Methods for In-Person Household Surveys. Paper presented at the annual conference of the American Association for Public Opinion Research, Hollywood, FL.
Messer, B. and Dillman, D. (May, 2009). Using Address-Based Sampling to Survey the General Public by Mail Vs. Web Plus Mail. Paper presented at the annual conference of the American Association for Public Opinion Research, Hollywood, FL.
Rockwood, T., Constantine, M., Davern, M., Beebe, T., Swaney, S. (May, 2009). Using the U.S. Postal Delivery Sequence File for Mixed-Mode Studies: Report on Measurement Differences Between Mail and Telephone Responses in the SHAPE Study. PowerPoint presented at the annual conference of the American Association for Public Opinion Research, Hollywood, FL.
Sherr, S., Dutwin, D., Triplett, T., Wissoker, D., Long, S. (May, 2009). Comparing Random Digit Dial (RDD) and United States Postal Service (USPS) Address-Based Sample Designs for a General Population Survey: The 2008 Massachusetts Health Insurance Survey. PowerPoint presented at the annual conference of the American Association for Public Opinion Research, Hollywood, FL.