Description of the problem
In late 2021, our study teams undertook two separate qualitative studies, distinct in their funding streams, study objectives, and target populations, that shared several attributes: both sought to recruit hard-to-reach populations, lacked a definitive sample frame from which to recruit, and screened potential participants for eligibility across multiple dimensions. Both studies were exploratory and sought to gather a spectrum of responses rather than produce representative results.

The first study (Study 1) explored parent perceptions of fairness in child support. Study 1's design called for semi-structured, in-depth interviews with parents who owe or are owed child support in each of two Wisconsin counties. To be eligible, participants needed to be at least 18 years of age, have a current order to pay or to receive child support in either of the two target counties, and be a parent to at least one child under age 18. We aimed to recruit 20–32 participants in total: in each of the two counties, 5–8 participants who owe child support and 5–8 who are owed child support. Interviews were conducted by Zoom or telephone, as participants preferred, and generally lasted 1.5–2 hours.

The second study (Study 2) also used a semi-structured interview format and aimed to improve understanding of how lower-income individuals define, experience, and manage their debts. To be eligible, participants needed to be at least 18 years of age, have an annual gross income of less than $40,000, live in one of five counties in southcentral and southeast Wisconsin, and have at least one debt of any kind (including, but not limited to, child support debt). Study 2 aimed to recruit 16 participants, with quotas for equal representation of women and men, participants ages 18–29 and 30 and older, and non-white and white participants. Interviews were conducted by Zoom and generally lasted 1–2 hours.
When researchers need to recruit hard-to-reach populations, they often turn to community-based recruitment methods. These include strategies such as disseminating study information to trusted intermediaries via informational/promotional materials (MacDougall and Fudge 2001; Sadler et al. 2010; Sixsmith, Boneham, and Goldring 2003); snowball sampling through eligible sample members to identify other potentially eligible individuals within their networks (Browne 2005; Sadler et al. 2010; Sixsmith, Boneham, and Goldring 2003); and recruiting from public spaces where the target population is expected to be found (Sixsmith, Boneham, and Goldring 2003).
Given the timing of our studies, the pandemic posed practical recruitment challenges: how to locate hard-to-reach, potentially eligible individuals at a time when traditional in-person recruitment methods were hampered; how to convey key information about the studies; and how to encourage participation at a time of hardship and competing priorities for so many. Addressing these questions informs recruitment strategies under pandemic conditions and in other situations that prevent in-person contact, as well as research recruitment generally, given the increasing prevalence of virtual ways of working and connecting.
To reach potential participants when in-person methods were constrained, both study teams ultimately implemented recruitment strategies that approximated the three aforementioned community-based recruitment strategies (leveraging trusted intermediaries; snowball sampling; and recruiting from public spaces) as closely as possible using virtual means.
To promote the studies, we first developed promotional materials that contained information about eligibility, the approximate interview length (90 minutes for Study 1 and 60 minutes for Study 2), and the incentive amount to be paid upon interview completion ($50 for both studies). Study 1 used flyers as promotional materials from the outset; Study 2 initially developed recruitment letters and later developed promotional flyers containing the same information as the letters. To allow for participation among potential participants with varying degrees of technical literacy, the materials provided options for completing a screener by phone with a study team member or filling out a web-based (Qualtrics) screening tool accessible by URL link or QR code.
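When a single web screener is reachable both by a printed short link and by a QR code, one lightweight way to distinguish the two access routes is to tag each with its own query parameter. The sketch below is illustrative only: the base URL and the `src` parameter name are hypothetical, not details from either study, which used Qualtrics' own tooling.

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical survey link; a real study would use its actual Qualtrics URL.
BASE_URL = "https://example.qualtrics.com/jfe/form/SV_abc123"

def tagged_link(source: str) -> str:
    """Return the screener URL with an access-method tag appended."""
    return f"{BASE_URL}?{urlencode({'src': source})}"

# One URL is printed on flyers as text; the other is embedded in the QR code.
direct_link = tagged_link("link")
qr_payload = tagged_link("qr")

def access_method(url: str) -> str:
    """Recover the access method recorded with a completed screener."""
    return parse_qs(urlparse(url).query).get("src", ["unknown"])[0]
```

Because the tag travels with the URL itself, no change to the survey instrument is needed; the access method arrives as embedded data with each response.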
Study 1 launched recruitment in January 2022 by reaching out to community providers serving parents in the two Wisconsin counties, in the hope that these trusted intermediaries would help us connect with potentially interested and eligible parents. We engaged county child support agencies, and we conducted web searches to identify an initially narrow list of other community providers we expected might serve parents, including public and private non-profit human services agencies; parenting support groups and parenting programs; employment programs aimed at parents; the United Way; YMCAs and YWCAs; and local educational institutions with parent outreach components. In addition to leveraging these trusted intermediaries, Study 1 sought to approximate the strategy of recruiting from public spaces by asking these programs to post flyers in publicly available spaces, such as lobbies and waiting rooms. Study 1 also leveraged a third recruitment method: snowball sampling. Pre-COVID, we would have interviewed study participants in person and then provided hard copies of study materials for participants to share with other potentially eligible individuals. In the virtual modality driven by COVID, however, we adapted our snowball sampling process by providing electronic copies of the study flyer to participants at the conclusion of each Study 1 interview and asking them to forward it to potentially interested parents in their networks. To allow participants to share study information without fear of inadvertently sharing their incentive information with others, we sent participants' electronic gift card incentive payments in a second, separate email.
The Study 2 team sought to interview lower-income individuals earning a gross annual income of less than $40,000 and owing at least one debt of any kind. However, the team lacked a definitive sample frame containing a reliable measure of this income threshold. Therefore, as an initial step, the Study 2 team constructed a sample of individuals identified in Wisconsin administrative data as presently paying or receiving child support under specific thresholds roughly expected to correspond to a gross annual income of less than $40,000. Sample members were not required to have an order to pay or receive child support as a condition of participation; eligibility depended only on having a debt of any kind and meeting the aforementioned age and income criteria.
In contrast to Study 1, Study 2 initially planned to recruit participants by mail rather than through community-based recruitment strategies. Study 2 began recruitment in November 2021 by sending recruitment letters via the United States Postal Service to individuals who met these criteria. After sending 100 letters and receiving a lower response than hoped in the first month of the field period, we began direct outreach to community agencies serving the target population of lower-income individuals who might hold any kind of debt, including county service agencies in the areas of economic support, human services, university outreach, and child support. Study 2 continued to send recruitment letters through the end of March 2022, sending a total of 730 letters, but we increasingly turned to community-based recruitment methods to reach the study's recruitment goals.
For Study 1, the study team had initially planned to visit community providers in both target counties to share information about the study in person, ask each agency to disseminate study information to its networks, and provide hard copies of study flyers. When Study 2 expanded recruitment efforts beyond letters to include community-based strategies, the Study 2 team also initially considered in-person visits to community providers. However, both study teams realized through their respective early searches that staff at many of these organizations were working remotely all or some of the time, and that in-person efforts were unlikely to be productive. Instead, both study teams looked up contact information for leaders within these organizations and made initial contact attempts by email. The email described the study and the population we sought to reach, provided an electronic copy of the recruitment flyer, and asked the agency to share the flyer through its listservs or on social media. For every outreach, both study teams offered to send hard copies of the flyer, printed by the research team, for posting within the agency, to avoid the agency incurring printing costs and to alleviate potential inconvenience. Across both studies, our initial outreach efforts to community agencies were somewhat fruitful. When initial contacts did not respond within 2 to 3 business days, both study teams followed up with a telephone call and a second email to remind contacts of the initial message and answer any questions about the study. When an agency's director did not respond, we broadened our outreach within the organization to include people in roles such as facilities director and administrative assistant.
However, after 7 to 8 weeks of pursuing these initial strategies, both study teams independently recognized that their studies required even broader outreach. Responses from parents for Study 1 were coming in too slowly to meet the project timeline, and we struggled to meet targets for one subgroup: parents who owe child support. For Study 2, our recruitment quotas were proving difficult to fill. For these reasons, both study teams broadened outreach to include a wider array of community providers, including legal aid offices, public libraries, re-entry programs for the formerly incarcerated, community centers, and community colleges. With this new batch of contacts, we escalated our efforts to recruit from public spaces, beyond leveraging the trusted-intermediary approach. Whereas pre-COVID we would have visited these providers in person to provide hard-copy materials or ask permission to post flyers, travel restrictions led us instead to ask leaders or staff working physically on site to post the materials in public places on our behalf. Our outreach efforts followed a similar pattern: initial attempts by email, followed by a telephone call, followed by an additional email or emails. In total, we reached out to 45 organizations for Study 1 (40% of which agreed to share study materials) and 24 for Study 2 (25% of which agreed), making approximately three contact attempts per organization.
Effectiveness of methods used
For Study 1, we recruited and interviewed 26 eligible participants in total, compared to our original target of 20–32 participants. We achieved both of our subgroup targets by recruiting and completing interviews with at least five parents who owe child support and at least five who are owed child support in each of our two target counties. For Study 2, we recruited and interviewed 16 eligible individuals as intended and achieved our quota targets for equal representation from women and men, ages 18–29 and 30 and older, and non-white and white participants.
To gain insight into the effectiveness of community-based strategies relative to more traditional methods such as letters, the Study 2 screener asked responders to indicate how they learned about the study, using a check-all-that-apply question (Figure 1). Over half (63.6%) of screener responses corresponded with learning of the study through a community-based recruitment method: a flyer (36.5% of all screener responders); a friend, relative, or acquaintance (15.9%); or an "other" response consistent with a community-based recruitment strategy (11.2%; e.g., a caseworker, another community professional, or a workplace). Over a third (35.5%) indicated learning of the study through Study 2's targeted, name-sample mailing (and not a community-based recruitment method), and 0.9% provided an "other" response that could not be categorized. We note that we can only compare the relative effectiveness of Study 2's initial mail-based strategy to the community-based approaches implemented later. We cannot parse out the effectiveness of the snowball sampling, public-space, and trusted-intermediary approaches relative to one another, as screener responses potentially overlap. (For example, a screener responder might have encountered the flyer in a public space or received it from a trusted intermediary.)
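Because the screener item was check-all-that-apply, a single responder can contribute to more than one category, so tabulation must count selections rather than responders. A minimal sketch of that kind of tabulation follows; the response records below are invented for illustration and are not study data.

```python
from collections import Counter

# Hypothetical multi-select screener records: each responder may
# check several recruitment sources.
responses = [
    {"flyer"},
    {"friend_or_relative", "flyer"},
    {"mailing"},
    {"other: caseworker"},
]

# Count every selection made, then express each source's count as a
# share of all selections (not of responders).
selection_counts = Counter(s for resp in responses for s in resp)
total_selections = sum(selection_counts.values())
shares = {src: n / total_selections for src, n in selection_counts.items()}
```

Denominator choice matters for reporting: shares of selections sum to 100%, while shares of responders can exceed 100% because of overlapping selections.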
Providing a QR code on the flyer for accessing the screener survey, in addition to a shortened-URL direct link, was new to the study teams. To understand uptake of the QR code option, we tracked which method responders used to access the screener in both studies. Across the two studies, 101 of 169 (60%) screener responders used the shortened-URL direct link, and 68 (40%) used the QR code. Use of these options varied across studies, however: most (82%) Study 1 responders used the direct link, while most (57%) Study 2 responders used the QR code (Figure 2).
Recruiting hard-to-reach populations can be challenging even in the best of circumstances, and the COVID-19 pandemic presented new challenges for reaching potential research participants due to limitations on in-person contact and traditional means of communication. Despite these constraints, pivoting to virtual approximations of traditional community-based approaches allowed us to recruit enough eligible participants to reach the recruitment targets for both studies.
Our experiences across these studies highlight the benefits of community-based recruitment methods. A limitation of this analysis is that we cannot parse out the unique effectiveness of each of the three approaches (recruiting through trusted intermediaries, from public spaces, and through snowball methods) utilized by the study teams. We also sought to understand whether potential participants prefer QR codes to direct links. We offered both options on recruitment materials and found that, combined across both studies, most responders used the direct link; preferences varied across studies, however, with most Study 1 responders using the direct link and most Study 2 responders using the QR code. Future research could explore the factors driving these differences in responder preferences, as well as how preferences might shift as technology continues to evolve.
Lisa Klein Vogel, PhD
1180 Observatory Drive #3412; email@example.com; 608-265-9377.
This research was supported by the Child Support Research Agreement between the Wisconsin Department of Children and Families and the Institute for Research on Poverty at the University of Wisconsin–Madison (PI: Judith Bartfeld; co-I: Lisa Klein Vogel).
Support for this study was provided by the National Institute of Child Health and Human Development (R01HD103356; MPIs Lawrence M. Berger and Rachel Dwyer).
The Study 1 team did not include an analogous measure to Study 2’s screener question on the Study 1 screener; therefore, comparable summary statistics for Study 1 are not available.