Survey Practice

Articles
Vol. 9, Issue 3, 2016 | July 31, 2016 EDT

Random Digit Dialing versus Address-Based Sampling using Telephone Data Collection

Bonnie E. Shook-Sa, Douglas Currivan, David Roe, and Lauren Klein Warren
Keywords: oversampling, ABS, RDD
https://doi.org/10.29115/SP-2016-0015
Survey Practice
Shook-Sa, Bonnie E., Douglas Currivan, David Roe, and Lauren Klein Warren. 2016. “Random Digit Dialing versus Address-Based Sampling Using Telephone Data Collection.” Survey Practice 9 (3). https://doi.org/10.29115/SP-2016-0015.

Abstract

Dual-frame random digit dialing (RDD) and address-based sampling (ABS) are typically considered for household survey frames. Inaccurate geographic assignments using RDD can lead to undercoverage of the target population and challenges in oversampling geographically-clustered subpopulations. Geographic targeting is straightforward with an ABS design, but ABS poses challenges related to contact for telephone data collection. The Aligning Forces for Quality: Assessment of Consumer Engagement (AF4Q) survey was originally based on an RDD design and was migrated to an ABS design due to concerns with coverage, sampling efficiencies, and data collection costs. We discuss the challenges associated with each sampling method and compare key measures between the designs. We find that the ABS design offered higher frame coverage, lower costs, and improved sampling efficiency, while the RDD design had higher response rates. ABS should be considered as a viable option for telephone-only data collection efforts that feature geographic targeting or oversampling of geographically-clustered populations.

Introduction

The selection of an appropriate sampling frame depends on multiple factors, including expected coverage of the target population, mode(s) of data collection, anticipated response rates, and impact on data collection costs. The two sampling frames typically considered for telephone studies are dual-frame random digit dialing (RDD) and address-based sampling (ABS). When the constraints of a study require that data be collected solely by telephone and a multimode approach is not feasible, both of these frames have advantages and challenges, and these trade-offs are amplified for studies that require geographic targeting or oversampling of subpopulations.

Examples in the literature show that ABS designs offer better coverage and a more efficient frame than RDD designs (Burks and Link 2012; Johnson and Williams 2010; Link et al. 2006, 2008; Losch et al. 2011). However, many of the studies that switched sampling strategies also switched modes. What is lacking from the literature is an examination of the impact of moving from an RDD to an ABS design for studies that cannot or do not change modes.

The Aligning Forces for Quality: Assessment of Consumer Engagement (AF4Q) survey covers chronically ill consumers of healthcare residing in participating, geographically-defined markets. It transitioned from a dual-frame RDD design to an ABS design while remaining constrained to a telephone-only data collection protocol due to the complexity of the survey, and it therefore presents an opportunity to compare key sampling and data collection metrics between the designs to assess the advantages and limitations of each approach. RTI International conducted the sampling, data collection, and weighting for the second round of the AF4Q. The AF4Q 2.1 included 16 markets and was based on an RDD design; the AF4Q 2.2, conducted a year later, included three markets and was based on an ABS design.

This paper compares the use of RDD and ABS within a telephone data collection context. We introduce each design, explain the decision to move from an RDD to an ABS design for the AF4Q 2.2, and note key comparisons between the two designs.

Sample Designs

RDD Design

The AF4Q 2.1 was based on an RDD design. Sampled phone numbers were contacted via computer-assisted telephone interviewing (CATI) to complete the screening interview and, if found eligible, the full interview. Respondents found eligible for the full interview were offered a $20 incentive for their participation. Approximately 18,000 completed screeners and 5,000 completed interviews were obtained across the 16 AF4Q markets and the national comparison sample (Figure 1).

Figure 1  AF4Q 2.1 markets.

The dual-frame RDD design included sample from both landline and cell phone frames. The cell phone frame was developed based on switch center locations. While rate centers have more recently become available and have been shown to be more accurate than switch centers (Marketing Systems Group 2014), they were not available at the time the sample was selected. The switch center locations associated with each market were targeted and included on the sampling frame. Approximately 25 percent of completed screeners and 20 percent of completed interviews were conducted by cell phone.

The AF4Q 2.1 design oversampled minority respondents (defined as Hispanics, non-Hispanic African Americans, and non-Hispanic Asians) within the landline frame. Oversampling was achieved by stratifying the frame based on the mapped location of phone numbers to geographic areas (census block groups for listed numbers and counties for unlisted numbers) and using surname samples. Because of the geographic uncertainties associated with cell phone numbers, minority oversampling within the cell phone frame was not feasible.

Transition to ABS

While the RDD design achieved the goals of the study, several challenges were encountered. Inaccuracies in the locations associated with telephone numbers led to concerns about geographic coverage. These inaccuracies also raised concerns about meeting the necessary minority targets and introduced statistical inefficiencies due to significant undersampling of the cell phone population. Finally, because of lower geographic eligibility rates in the cell phone frame, substantial data collection costs were incurred. An ABS design was implemented in the AF4Q 2.2 to alleviate these challenges.

ABS Design

The AF4Q 2.2 was conducted in three markets (Figure 2) and was based on an ABS sampling frame. ABS frames are based on commercially-available versions of the United States Postal Service Computerized Delivery Sequence (USPS CDS) file. The CDS file is made available to the public through licensing agreements with qualified private companies. The USPS also makes available the No-Stat file, which contains over 8 million primarily rural mailing addresses that supplement the CDS file (Shook-Sa et al. 2013). The union of the CDS and No-Stat files accounts for all postal delivery points serviced by the USPS, giving ABS frames near-complete coverage of the household population (Iannacchione 2011).

Figure 2  AF4Q 2.2 markets.

ABS frame elements are mailing addresses, some of which can be linked to phone numbers. The availability of mailing addresses for each frame element allows for more effective stratification and oversampling of minorities because frame elements can be targeted at small levels of geography. For the AF4Q 2.2, oversampling of minorities was achieved by grouping census block groups into six strata based on the proportion of minorities, which were again defined as Hispanics, non-Hispanic African Americans, and non-Hispanic Asians. Sample was allocated across the strata to achieve minority targets while controlling the anticipated design effects.

The availability of mailing addresses for each frame element also allows advance letters to be mailed to all sample cases rather than just a subset of cases. However, with an ABS design, persons in households without linked phone numbers (approximately 55 percent of the AF4Q sample) must provide their phone numbers or call in to complete the screening interview. The data collection process for the AF4Q 2.2 is displayed in Figure 3. During the AF4Q 2.2 data collection, about 3,000 screeners and 1,000 interviews were completed, with about 30 percent of completes coming from cases for which a phone number could not be appended to the frame.

Figure 3  AF4Q 2.2 ABS design.
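As a rough illustration of the stratified allocation described above, the sketch below allocates a hypothetical ABS sample across six minority-density strata and checks the expected minority yield and the unequal-weighting design effect. The stratum boundaries, frame sizes, minority proportions, oversampling factors, and total sample size are placeholders, not the AF4Q 2.2 values.

```python
# Illustrative sketch of allocating an ABS sample across minority-density strata.
# All frame sizes, minority proportions, and oversampling factors below are
# hypothetical placeholders, not the AF4Q 2.2 allocation.

strata = [
    # (name, frame_size, minority_proportion, oversampling_factor)
    ("0-10% minority",   40000, 0.05, 1.0),
    ("10-25% minority",  25000, 0.18, 1.0),
    ("25-40% minority",  15000, 0.32, 1.5),
    ("40-60% minority",  10000, 0.50, 2.0),
    ("60-80% minority",   6000, 0.70, 2.5),
    ("80-100% minority",  4000, 0.90, 3.0),
]

total_sample = 6000  # total addresses to select (hypothetical)

# Proportional allocation scaled by the oversampling factor, renormalized so
# the stratum allocations sum to the total sample size.
raw = [size * factor for _, size, _, factor in strata]
alloc = [total_sample * r / sum(raw) for r in raw]

# Expected number of sampled addresses in minority households, and the design
# effect due to unequal weighting (Kish: n * sum(w^2) / (sum(w))^2) implied by
# the stratum base weights w_h = N_h / n_h.
expected_minority = sum(a * p for a, (_, _, p, _) in zip(alloc, strata))
weights = []
for a, (_, size, _, _) in zip(alloc, strata):
    weights.extend([size / a] * round(a))
n = len(weights)
deff_w = n * sum(w * w for w in weights) / sum(weights) ** 2

print(f"Stratum allocations: {[round(a) for a in alloc]}")
print(f"Expected minority selections: {expected_minority:.0f}")
print(f"Design effect due to unequal weighting: {deff_w:.2f}")
```

In practice, the oversampling factors would be tuned so that the expected minority yield meets the study's targets while keeping the unequal-weighting design effect, and hence the loss of effective sample size, acceptable.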

Comparison of Designs

Following data collection, comparisons were made between the dual-frame RDD (AF4Q 2.1) sample and the ABS (AF4Q 2.2) sample. These comparisons include frame coverage, data collection costs, sampling efficiency, and response rates. The primary limitation of these comparisons is that the AF4Q 2.1 and 2.2 samples were conducted in different geographic areas, and data collection occurred during different time frames.

Frame Coverage

The cell phone portion of the RDD frame suffered from significant undercoverage of the target population due to inaccuracies in identifying relatively small geographic areas through switch centers. Even if the sample design were transitioned to rate centers, however, sizeable undercoverage of the cell-phone-only population would remain. As previously discussed, the ABS frame eliminates the inaccuracies associated with geographic targeting and provides near-complete coverage of the household population.

Data Collection Costs

For both the RDD and ABS designs, we calculated the cost per completed interview. These costs included direct labor charges as well as other direct costs such as the costs of mailings and incentives. ABS completed interviews were approximately 20 percent cheaper than RDD completes, even taking into account costs of additional mailings and pre-incentives that were not incurred with the RDD design.

Sampling Efficiencies

We compared the sampling efficiency of the RDD and ABS samples by evaluating the variability in analysis weights, measured with the design effect due to unequal weighting (DEFFw). Because the two rounds were conducted in markets with differing characteristics, we conducted a simulation study to enhance comparability between the designs: the ABS design was simulated for the 10 AF4Q 2.1 markets that incorporated minority oversampling and were originally conducted via RDD. This simulation allocated the ABS sample within the 10 RDD markets, holding constant the number of interviews and the number of minority interviews. Sample yield rates were assumed to be similar to those observed in the AF4Q 2.2 ABS markets, and the inflation in DEFFw from design weights to final analysis weights observed in the AF4Q 2.2 markets was applied.
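The paper does not state its computational formula for DEFFw; the conventional definition (Kish's unequal-weighting effect), computed from the analysis weights w_i of the n respondents in a market, is:

```latex
\mathrm{DEFF}_w \;=\; 1 + \mathrm{CV}^2(w) \;=\; \frac{n \sum_{i=1}^{n} w_i^{2}}{\left( \sum_{i=1}^{n} w_i \right)^{2}}
```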

Effective sample sizes, equal to the ratio of the total number of respondents to the DEFFw, were calculated for both the original (RDD) and simulated (ABS) designs. Results are shown in Table 1. For all markets except two (Memphis and Cleveland), the ABS-simulated effective sample sizes were higher than the RDD-observed effective sample sizes. This simulation study provided evidence that the ABS design was more statistically efficient than the RDD design.

Table 1  Effective sample sizes: observed RDD vs. simulated ABS.
Market | RDD ESS | Simulated ABS ESS | ABS − RDD | Relative difference (%)
Minneapolis/St. Paul, MN | 86 | 132 | 47 | 54.7
Willamette Valley, OR | 122 | 176 | 54 | 44.3
Western New York | 79 | 110 | 31 | 39.2
Puget Sound, WA | 142 | 191 | 48 | 33.8
Greater Cincinnati, OH/KY | 104 | 128 | 25 | 24.0
Kansas City, MO/KS | 115 | 132 | 17 | 14.8
Detroit, MI | 101 | 110 | 9 | 8.9
National comparison | 183 | 194 | 11 | 6.0
Memphis, TN | 106 | 97 | -9 | -8.5
Cleveland, OH | 118 | 97 | -21 | -17.8

ESS=effective sample size.
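As a minimal sketch of how the quantities in Table 1 relate (the function names and numbers below are illustrative, not values from the study):

```python
# Minimal sketch: effective sample size and the percent relative difference
# between two designs. The numbers below are illustrative, not AF4Q values.

def effective_sample_size(n_respondents: float, deff_w: float) -> float:
    """Effective sample size: respondents divided by the unequal-weighting design effect."""
    return n_respondents / deff_w

def pct_relative_difference(ess_rdd: float, ess_abs: float) -> float:
    """Percent relative difference of the ABS ESS with respect to the RDD ESS."""
    return 100.0 * (ess_abs - ess_rdd) / ess_rdd

# Hypothetical market: 300 respondents under each design, DEFFw of 2.0 (RDD)
# versus 1.5 (ABS) gives ESS of 150 versus 200, a relative difference of +33.3%.
ess_rdd = effective_sample_size(300, 2.0)
ess_abs = effective_sample_size(300, 1.5)
print(ess_rdd, ess_abs, round(pct_relative_difference(ess_rdd, ess_abs), 1))
```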

Response Rates

Table 2 contains the ranked weighted AAPOR4 response rates in the AF4Q 2.1 (RDD) and AF4Q 2.2 (ABS) markets. While the response rates in all markets were relatively low, the response rates in the RDD markets were higher than the response rates in the ABS markets with the exception of the national comparison sample. However, response rates are not necessarily indicative of which design produced higher quality estimates (Groves 2006; Lee et al. 2009).
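For reference, AAPOR Response Rate 4 counts partial interviews as respondents and applies an estimated eligibility rate e to cases of unknown eligibility; the weighted rates reported here apply base weights to the disposition counts:

```latex
\mathrm{RR4} \;=\; \frac{I + P}{(I + P) + (R + NC + O) + e\,(UH + UO)}
```

where I denotes complete interviews, P partial interviews, R refusals and break-offs, NC non-contacts, O other eligible non-interviews, and UH and UO cases of unknown household eligibility and other unknown eligibility.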

Table 2  Ranked weighted response rates by market: AF4Q 2.1 (RDD) and AF4Q 2.2 (ABS).
Market | AAPOR4 weighted response rate (%)
Oregon, remainder | 31.0
Western Michigan | 29.0
Detroit, MI | 27.6
Minnesota | 25.9
Cleveland, OH | 24.9
Minneapolis/St. Paul, MN | 24.3
Wisconsin | 22.1
Greater Cincinnati, OH/KY | 20.6
Willamette Valley, OR | 19.6
Western New York | 19.4
Memphis, TN | 19.1
Maine | 18.8
Humboldt County, CA | 17.9
Puget Sound, WA | 17.8
South Central Pennsylvania | 16.7
Kansas City, MO/KS | 16.5
Indianapolis, IN | 11.2
Albuquerque, NM | 9.9
National comparison | 9.7
Boston, MA | 6.4

White=AF4Q 2.1 (RDD) markets; Grey=AF4Q 2.2 (ABS) markets.

Discussion

For telephone-only data collection designs, dual-frame RDD and ABS are two viable frame options. RDD offers a consistent mode of recruitment and contact for all sampled cases. However, it does not allow advance letters to be mailed to all cases because mailing addresses can be appended to only a subset of telephone numbers. Inaccuracies in geographically linking cell phone numbers to subnational geographic areas make RDD challenging for subnational studies and for studies that aim to oversample based on geographically-clustered characteristics of the target population such as race/ethnicity. These inaccuracies result in undercoverage, sampling inefficiencies, and increased data collection costs.

ABS offers a high-coverage frame for subnational studies. The ABS frame allows for more efficient and accurate targeting of subnational populations and oversampling of geographically-clustered characteristics. From a data collection standpoint, ABS allows advance letters to be mailed to all sampled cases. However, ABS poses challenges for telephone designs because it requires inconsistent modes of study recruitment, and different recruitment methods can lead to differential nonresponse and the potential for nonresponse bias.

In this comparative analysis, we found that both designs were able to achieve the goals of the AF4Q study. However, key differences were found in terms of frame coverage, data collection costs, sampling efficiency, and response rates. The ABS design offered higher frame coverage, lower costs, and improved sampling efficiency over the RDD design. The only metric evaluated on which the RDD design performed better than the ABS design was response rates, which do not necessarily correlate with bias in key estimates such as health outcomes.

Until better methods are established for linking cell phone numbers to subnational areas, we recommend that subnational telephone studies and telephone studies incorporating oversampling of geographically-correlated characteristics consider ABS as a viable frame option. ABS will be an even better option if and when techniques are developed to accurately link cell phone numbers to mailing addresses.

Acknowledgements

The Aligning Forces for Quality (AF4Q) initiative is sponsored by the Robert Wood Johnson Foundation. The Center for Healthcare and Policy Research at Pennsylvania State University serves as the AF4Q evaluation team. The authors would like to acknowledge the contributions of Dennis Scanlon and David Johnson at Pennsylvania State University. Thanks also to RTI International staff member G. Lance Couzens for developing the graphics in this paper and to Brian Head and Barbara Bibb for their valuable contributions to the AF4Q project.

References

Burks, A.T., and M.W. Link. 2012. “Address Based Sampling: Census Block Group Data Used to Define Incentive Structure.” Presented at the Annual Conference of the American Association for Public Opinion Research, Orlando, FL.
Groves, R.M. 2006. “Nonresponse Rates and Nonresponse Bias in Household Surveys.” Public Opinion Quarterly 70 (5): 646–75.
Iannacchione, V. 2011. “The Changing Role of Address-Based Sampling in Survey Research.” Public Opinion Quarterly 75 (3): 556–75.
Johnson, P., and D. Williams. 2010. “Comparing ABS vs. Landline RDD Sampling Frames on the Phone Mode.” Survey Practice 3 (3).
Lee, S., R.E. Brown, D. Grant, T.R. Belin, and M.J. Brick. 2009. “Exploring Nonresponse Bias in a Health Survey Using Neighborhood Characteristics.” American Journal of Public Health 99 (10): 1811–17.
Link, M.W., M.P. Battaglia, M.R. Frankel, L. Osborn, and A.H. Mokdad. 2006. “Address-Based versus Random-Digit-Dial Surveys: Comparison of Key Health and Risk Indicators.” American Journal of Epidemiology 164 (10): 1019–25.
———. 2008. “Address-Based versus Random-Digit Dial Sampling: Comparison of Data Quality from BRFSS Mail and Telephone Surveys.” Public Opinion Quarterly 72 (1): 6–27.
Losch, M.E., P. Damiano, J. Willard, A.B. Hoekstra, K.H. Park, and D. Nguyen. 2011. “A Direct Comparison of ABS and Telephone Sampling in a Pilot Study of Children’s Health.” Survey Practice 4 (5).
Marketing Systems Group. 2014. “Cellular Sampling - a New Approach - Rate Centers.” http://www.m-s-g.com/CMS/ServerGallery/MSGWebNew/Documents/GENESYS/whitepapers/CellSampling-RateCenters.pdf.
Shook-Sa, B.E., D.B. Currivan, J.P. McMichael, and V.G. Iannacchione. 2013. “Extending the Coverage of Address-Based Sampling Frames: Beyond the USPS Computerized Delivery Sequence File.” Public Opinion Quarterly 77 (4): 994–1005.
