Articles
August 12, 2025 EDT

Increasing Participation in Survey Research among Veterans: A Tailored Mailing Experiment

Vince Welch, MA, Evan Herring-Nathan, MS, Lindsey L. Monteith, PhD, Evgenia Kapousouz, PhD, Erin Fordyce, MS, MEd, Theresa T. Morano, MS, Claire A. Hoffmire, PhD

Keywords: Survey Recruitment, Survey Research, Veterans, Suicide prevention, Recruitment materials

https://doi.org/10.29115/SP-2025-0009
Welch, Vince, Evan Herring-Nathan, Lindsey L. Monteith, Evgenia Kapousouz, Erin Fordyce, Theresa T. Morano, and Claire A. Hoffmire. 2025. “Increasing Participation in Survey Research among Veterans: A Tailored Mailing Experiment.” Survey Practice 18 (August). https://doi.org/10.29115/SP-2025-0009.

Abstract

In a pilot survey aiming to inform Veteran suicide prevention, Veterans who were Black/African American, Hispanic, Asian American/Native Hawaiian/Pacific Islander (AANHPI), female, and recently separated responded at lower rates. To increase response in a subsequent survey, an experiment was conducted to examine the effect of receiving tailored inserts (i.e., recruitment materials with images of Veterans from one’s demographic group), compared to a generic insert (i.e., images from a variety of demographic groups) or no insert. We hypothesized that receiving tailored inserts with images of Veterans from respondents’ demographic group would increase survey response. The experimental manipulation yielded non-significant results, except for significant findings in the opposite direction from that hypothesized among Black/African American Veterans, for whom yield (i.e., the percentage of sampled cases who returned a completed survey) was lowest among those who received the tailored insert. Conversely, while not statistically significant, yields were higher in the tailored insert group for AANHPI, Hispanic, and recently separated Veterans. Findings suggest that the impact of visual representation in survey recruitment materials may differ across Veteran groups. Alternatively, more effective images may be needed to optimize tailoring of recruitment materials. Additional research is warranted to better understand whether tailored inserts can increase response among Veterans who are harder to engage in research.

Background

To conduct valid, generalizable survey research, investigators must collect data from all population segments. However, some groups (e.g., Black and Hispanic Americans) have historically been less likely to respond to surveys (Jang and Vorderstrasse 2019; Sheldon et al. 2007; Moorman et al. 1999). Among Veterans, women respond at lower rates (Harrington et al. 2019).

In a study on suicidal ideation and attempts among Veterans (ASCEND; Hoffmire et al. 2022), Black/African American, Hispanic, Asian American/Native Hawaiian/Pacific Islander (AANHPI), female, and younger (18-34 years) Veterans responded at lower rates than non-Hispanic White, male, and older (≥35 years) Veterans (Herring-Nathan et al. 2021). Veterans separated within the past 3 years also responded at lower rates than those separated more than 3 years ago (Herring-Nathan et al. 2021). The decreased responding among younger Veterans and Veterans of color mirrors response patterns observed in general population studies (Jang and Vorderstrasse 2019; Morgan 2020).

Understanding how to increase survey response among harder-to-reach respondents is critical. Ensuring representative Veteran samples improves generalizability (Groves 2006), which is essential for research designed to improve Veteran health and well-being. While nonprobability sampling can increase the number of participants from hard-to-reach populations (Shaghaghi, Bhopal, and Sheikh 2011), further efforts are needed to increase these groups’ response rates. One study with female Veterans found that personalized information about the research team can increase response rates (Gaeddert et al. 2020). Additionally, including culturally reflective imagery and language in survey invitations and materials has been found to increase participation among racial/ethnic and linguistic minority groups (Yancey, Ortega, and Kumanyika 2006; Laganà et al. 2013; Sullivan et al. 2011; Banducci, Donovan, and Karp 2004). It is unknown if this strategy is effective when tailored for other demographics (e.g., by sex or age) or among Veterans.

As it is critical to understand how to increase survey response from Veteran groups that have experienced divergent patterns in their suicide rates and trends over time (U.S. Department of Veterans Affairs, n.d.), we examined if including tailored inserts was effective in a suicide prevention study. We hypothesized that including images in recruitment materials that reflected aspects of one’s identity would increase survey response within these groups (i.e., Black, Hispanic, AANHPI, female, and recently separated).

Data and Methods

Participants and Procedures

This experiment was embedded into ASCEND Wave 1 data collection, fielded over 16 weeks between February and June 2022 (Hoffmire et al. 2024). Standard strategies for optimizing response were employed, including mixed contact and response modes, pre- and post-incentives, and reminders (Dillman, Smyth, and Christian 2014). A sequential multi-mode, push-to-web design was used (Appendices A and C). A total of 97,287 Veterans residing in the United States (all 50 states, the District of Columbia, and Puerto Rico) were invited to participate; 17,383 completed the survey (yield = 17.9%; AAPOR RR3 = 19.1%).

Insert Experiment

We conducted an experiment in which tailored inserts were included in the initial postal mail invitation and the Week 6 paper survey mailing, to examine whether their inclusion improved response rates among specific demographic subgroups. Veterans were randomly assigned to a control or tailored insert condition. The only difference between conditions was the insert included; there were no outward differences in the envelopes or mailings.

Control conditions. To provide relevant comparison groups, Veterans in the control conditions received either no insert or a generic insert in which images were not limited to any specific demographic group (Appendix B). This allowed testing the effectiveness of the tailored insert (experimental condition) relative to no insert or a generic insert.

Tailored condition. In this condition, the postcard front featured three images of Veterans resembling members of the recipient’s demographic group (Appendix B). Inserts tailored by race or ethnicity (Black/African American, AANHPI, and Hispanic Veterans) were included in Replicates 1 and 3; inserts tailored to female and recently separated Veterans were included in Replicates 2 and 4.

To reduce potentially confounding effects, images across insert conditions were selected to display similar general characteristics; each included images of a family, plus a range of ages, races/ethnicities, and sexes, when these characteristics were not aligned with the demographic group of interest. Images of individuals in uniform were avoided, to align with the Veteran focus. Members of a project Veterans Engagement Board (Wendleton et al. 2023) provided input on images and informed expectations of insert performance and feedback for future experiments. Images were purchased from stock image sites. The backs of all inserts included an illustration of how to access the web survey (Appendix B). Inserts were printed on 8x5.5 inch, full-color, two-sided postcards.

Some demographic groups were not mutually exclusive (e.g., females were also part of all other groups; recently separated Veterans are of all races and ethnicities); thus, decisions about which group to place a Veteran into were necessary. In Replicates 1 and 3, where the insert experiment focused on race and ethnicity, Hispanic ethnicity was prioritized: Hispanic Veterans who also indicated another race/ethnicity were coded as Hispanic, meaning all other groups were comprised exclusively of non-Hispanic respondents. Veterans who reported more than one race or whose race was listed as “Other” in the sampling frame were placed in the “All Other Races” group and not sent a tailored insert. For each replicate, Veterans in the “All Others” category, i.e., those not in the demographic group of interest for that replicate, received a generic insert.
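For illustration, the mutually exclusive grouping rules described above can be expressed as a pair of short functions. This is a minimal sketch assuming hypothetical frame fields and category labels (ethnicity, race, sex, years since separation); the actual ASCEND sampling-frame variables and codes are not published here.

```python
# Hypothetical sketch of the mutually exclusive group-assignment rules described above.
# Field names and category labels are assumptions, not the actual ASCEND frame codes.

def assign_race_ethnicity_group(ethnicity: str, race: str) -> str:
    """Replicates 1 and 3: Hispanic ethnicity takes priority over race."""
    if ethnicity == "Hispanic":
        return "Hispanic"
    if race in ("Two or More Races", "Other"):
        return "All Other Races"  # not sent a tailored insert
    if race == "Black/African American":
        return "Black/African American"
    if race in ("Asian American", "Native Hawaiian", "Pacific Islander"):
        return "AANHPI"
    return "All Others"  # e.g., non-Hispanic White

def assign_sex_separation_group(sex: str, years_since_separation: float, replicate: int) -> str:
    """Replicate 2 prioritizes female sex; Replicate 4 prioritizes recent separation."""
    recently_separated = years_since_separation <= 3
    if replicate == 2:
        if sex == "Female":
            return "Female"
        if recently_separated:
            return "Recently Separated"
    else:  # Replicate 4
        if recently_separated:
            return "Recently Separated"
        if sex == "Female":
            return "Female"
    return "All Others"
```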

In Replicate 2, where the insert experiment focused on sex and recency of separation, female sex was prioritized (i.e., recently separated female Veterans were considered female in Replicate 2). Conversely, in Replicate 4, recently separated status was prioritized (i.e., recently separated female Veterans were considered recently separated in Replicate 4). This design promoted balance and representativeness. Among the demographics of interest, the design allocated 50% to the tailored condition, 25% to the generic insert control, and 25% to the no-insert control (see the allocation sketch below). The number of Veterans sampled in each group and condition is provided in Tables 1 and 2. The non-experimental group primarily included non-Hispanic White Veterans, but also Veterans who indicated more than one race/ethnicity or American Indian/Alaska Native in Replicates 1 and 3, and older male Veterans in Replicates 2 and 4.
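A minimal sketch of the 50/25/25 random allocation within each demographic group of interest follows, assuming a simple list of sampled IDs; the actual randomization procedure used across ASCEND replicates is not described here.

```python
import random

def allocate_conditions(veteran_ids, seed=2022):
    """Allocate a demographic group of interest to 50% tailored insert,
    25% generic insert, and 25% no insert, as described above."""
    rng = random.Random(seed)  # fixed seed only for a reproducible illustration
    ids = list(veteran_ids)
    rng.shuffle(ids)
    n_tailored = len(ids) // 2
    n_generic = len(ids) // 4
    return {
        "tailored_insert": ids[:n_tailored],
        "generic_insert": ids[n_tailored:n_tailored + n_generic],
        "no_insert": ids[n_tailored + n_generic:],
    }
```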

Table 1. Sample Sizes for Inserts Tailored by Race and Ethnicity, Replicates 1 and 3 (number of sampled Veterans, Invitation mailing / Paper Survey mailing²)

Demographic Group                                               Control, No Insert    Control, Generic Insert    Experimental, Tailored Insert
Black/African American                                          2,146 / 1,914         2,194 / 1,957              4,369 / 3,897
Asian American, Native Hawaiian, or Pacific Islander (AANHPI)     363 / 324             376 / 335                  688 / 614
Hispanic                                                        1,152 / 1,028         1,106 / 987                2,262 / 2,018
All Others¹                                                    31,987 / 28,532        2,000 / 1,784                  0 / 0

¹ All Others includes White and Two or More Races.
² Actual counts for the paper survey groups varied depending on whether respondents participated in the survey through web prior to the paper survey mailing.

Table 2. Sample Sizes for Inserts Tailored to Female and Recently Separated Veterans, Replicates 2 and 4 (number of sampled Veterans, Invitation mailing / Paper Survey mailing²)

Demographic Group          Control, No Insert    Control, Generic Insert    Experimental, Tailored Insert
Female                      2,510 / 2,239         2,510 / 2,239              5,031 / 4,488
Recently separated            531 / 474             532 / 475                1,074 / 958
All Others¹                34,456 / 30,735        2,000 / 1,784                  0 / 0

¹ All Others includes White and Two or More Races.
² Actual counts for the paper survey groups varied depending on whether respondents participated in the survey through web prior to the paper survey mailing.

Statistical Analysis

To evaluate the impact of inserts on response, we calculated yield (percentage of sampled cases who returned a completed survey) for each demographic group, overall and by condition (tailored insert, generic insert, no insert). Because this experiment was embedded in the larger ASCEND study, its design was largely driven by practical considerations and operations related to printing and mailing materials; we therefore conducted a post-hoc power analysis to evaluate the minimum effect size (i.e., difference in yield) we could detect with 80% power, a nominal Type I error of 0.05, and an assumed 15% yield. For AANHPI and recently separated Veterans, we had 80% power to detect a difference of at least 8% across experimental conditions. For all other demographic groups, we had 80% power to detect a difference of at least 4% across conditions.
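A calculation of this kind can be reproduced approximately with standard power routines. The sketch below uses statsmodels and treats the comparison as two equal-sized arms, which is a simplification of the actual 50/25/25 design, so exact numbers will differ somewhat from those reported; the example arm sizes are illustrative round numbers, not the study's.

```python
import numpy as np
from statsmodels.stats.power import NormalIndPower

def min_detectable_difference(n_per_arm, baseline=0.15, power=0.80, alpha=0.05):
    """Smallest detectable increase in yield over a 15% baseline for a
    two-sided, two-sample comparison with equal arm sizes (a simplification)."""
    # Solve for the required effect size (Cohen's h), then convert it back to a
    # difference in proportions relative to the baseline yield.
    h = NormalIndPower().solve_power(effect_size=None, nobs1=n_per_arm,
                                     alpha=alpha, power=power, ratio=1.0)
    p1 = np.sin(np.arcsin(np.sqrt(baseline)) + h / 2) ** 2
    return p1 - baseline

# Illustrative arm sizes: a smaller (AANHPI-sized) arm versus a larger arm.
print(round(min_detectable_difference(700), 3))
print(round(min_detectable_difference(4000), 3))
```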

Next, logistic regression models examined the relationship between experimental group and the likelihood of survey completion. Age was included as a potential confounder, as preliminary analyses revealed a significant association between age and survey response across demographic groups (p<.001 for female, Black/African American, Hispanic, recently separated, and AANHPI), with older Veterans more likely to participate. An a priori decision was also made to evaluate whether age was an effect modifier of the relationship between experimental group and survey completion given prior research suggesting that response relationships often differ by age (Kolaja et al. 2023; Wu, Zhao, and Fils-Aime 2022). Thus, for each demographic group, three sequential logistic regression models were fit: (1) experimental condition as the only independent variable to assess for overall unadjusted associations with responding; (2) experimental condition and age as independent variables to investigate whether the effect of the experimental manipulation was independent of age; and (3) experimental condition, age, and their interaction to evaluate whether the relationship between the experimental condition and responding differed by age. The independent experiment variable for all models was categorical (no insert [reference], generic insert, tailored insert). The dependent variable was binary survey completion status (yes/no); age (in years) was included as a continuous variable.
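A sketch of the three sequential models, fit separately within each demographic group, using statsmodels formulas; the data frame and its column names (completed, condition, age) are hypothetical placeholders rather than the study's actual analysis code.

```python
import statsmodels.formula.api as smf

def fit_sequential_models(df):
    """Fit the three logistic regressions described above for one demographic group.
    'completed' is a 0/1 completion indicator; 'condition' has levels
    no_insert, generic_insert, tailored_insert; 'age' is in years."""
    cond = 'C(condition, Treatment(reference="no_insert"))'  # no insert as reference
    m1 = smf.logit(f"completed ~ {cond}", data=df).fit()            # unadjusted
    m2 = smf.logit(f"completed ~ {cond} + age", data=df).fit()      # age-adjusted
    m3 = smf.logit(f"completed ~ {cond} * age", data=df).fit()      # age interaction
    return m1, m2, m3
```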

Results

A total of 97,287 Veterans were invited to participate in the ASCEND Wave 1 survey; 26,843 were randomly assigned to an experimental condition. Overall, 17.9% (N=17,383) of invited Veterans completed the survey, compared to 13.6% (N=3,642) of those included in the experiment. Figure 1 illustrates overall yield (ranging from 11.6% to 14.9%) for those in the experimental conditions, by demographic group. Each group had significantly lower overall response (difference in yield, p<0.001) compared to the non-experimental group (N=70,444; yield=19.5%).

Figure 1. Overall Yield for Experimental Cases, by Demographic Group, Compared to Non-Experimental Group

Within the experimental sample, we compared the yield of insert types across all demographic groups (Table 3). Yield was highest for the no-insert condition (14.2%; 95%CI=13.4-15.1), followed by the tailored insert condition (13.5%; 95%CI=12.9-14.1), and lowest for the generic insert condition (13.0%; 95%CI=12.2-13.9). The no-insert control group had a significantly higher yield than the generic insert condition (p<.05). However, yields were not significantly different between the no-insert and tailored insert conditions, nor between tailored and generic insert conditions.

Table 3. Yield rate by experimental condition across demographic groups
Experimental Condition Total Sample (N) Total Completes (n) Yield (%) 95% Confidence Interval
Control (no insert) 6,701 953 14.2%* (13.4%, 15.1%)
Control (generic insert) 6,718 875 13.0% (12.2%, 13.9%)
Tailored insert 13,424 1,814 13.5% (12.9%, 14.1%)
Total 26,843 3,642 13.6% (13.2%, 14.0%)

Note. * p<0.05 for the comparison of the no-insert and generic insert conditions, using a two-sample chi-square test for equality of proportions.
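The starred comparison in Table 3 can be checked, for instance, with a two-sample proportions test on the published counts. The z-test below is equivalent to the chi-square test without continuity correction, and the normal-approximation confidence interval is an assumption, so results may differ slightly from those the authors report.

```python
from statsmodels.stats.proportion import proportions_ztest, proportion_confint

completes = [953, 875]   # completed surveys: no insert, generic insert (Table 3)
sampled = [6701, 6718]   # sampled Veterans: no insert, generic insert (Table 3)

z, p = proportions_ztest(completes, sampled)    # two-sided test of equal yields
lo, hi = proportion_confint(953, 6701)          # normal-approximation 95% CI
print(f"z = {z:.2f}, p = {p:.4f}")
print(f"no-insert yield 95% CI: ({lo:.3f}, {hi:.3f})")
```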

Figure 2 illustrates survey yields by experimental condition and demographic group. The pattern of findings regarding the effect of insert type differed across groups. Significant differences by insert type were only observed for Black/African Americans, among whom both insert conditions were associated with a statistically significant difference in yield; those who received tailored (11.4%; 95%CI=10.4-12.3) or generic (11.9%, 95%CI=10.6-13.4) inserts responded at significantly lower rates, compared to those who received no insert (14.0%; 95%CI=12.6-15.6; p=.02). No other demographic groups displayed this pattern, irrespective of statistical significance.

Additionally, no demographic group, including Black/African American Veterans, experienced significant differences in responding between those who received generic and tailored inserts. However, the magnitude of yield was notably larger in the tailored insert condition, compared to both the no insert and generic insert conditions among AANHPI and recently separated Veterans, and compared to the generic insert condition among Hispanic Veterans.

Figure 2. Yield by Experimental Condition, within each Demographic Group

Results from logistic regression modeling are presented in Tables 4 and 5[1]. A negative effect of the tailored insert, relative to no insert, for the Black/African American group remained after accounting for age. No other significant effects were observed after accounting for age, nor was the interaction between age and tailored insert condition significant for any group, indicating no differential effect by age.

Table 4. Logistic regression models evaluating the relationship between experimental group and survey completion for racial and ethnic groups; cells show coefficient (SE)

                          Black/African American               Hispanic                             AANHPI
                          Model 1           Model 2            Model 1           Model 2            Model 1           Model 2
Intercept                 -1.81 (0.06)***   -3.57 (0.16)***    -1.75 (0.08)***   -3.32 (0.18)***    -1.96 (0.16)***   -3.38 (0.30)***
Experimental Condition
  No Insert               Ref               Ref                Ref               Ref                Ref               Ref
  Tailored Insert         -0.24 (0.08)**    -0.24 (0.08)**      0.07 (0.10)       0.09 (0.10)        0.18 (0.20)       0.18 (0.19)
  Generic Insert          -0.19 (0.09)*     -0.16 (0.09)       -0.14 (0.12)      -0.12 (0.12)       -0.05 (0.23)      -0.07 (0.23)
Age                       NA                0.03 (<.001)***    NA                0.03 (<.001)***    NA                0.03 (0.01)***

Note. ***p<0.001; **p<0.01; *p<0.05. SE = standard error.
Model 1 includes experimental condition as the only independent variable to assess overall, unadjusted associations with responding; Model 2 includes experimental condition and age as independent variables; Model 3 also included the interaction between age and experimental insert group but is not displayed, as no interactions were statistically significant.

Table 5. Logistic regression models evaluating the relationship between experimental group and survey completion for female and recently separated Veterans; cells show coefficient (SE)

                          Female                               Recently Separated
                          Model 1           Model 2            Model 1           Model 2
Intercept                 -1.74 (0.05)***   -3.49 (0.12)***    -2.18 (0.12)***   -4.57 (0.24)***
Experimental Condition
  No Insert               Ref               Ref                Ref               Ref
  Tailored Insert         -0.04 (0.07)      -0.06 (0.07)        0.24 (0.15)       0.24 (0.16)
  Generic Insert          -0.05 (0.08)      -0.03 (0.08)       -0.07 (0.18)      -0.11 (0.19)
Age                       NA                0.03 (0.00)***     NA                0.06 (0.00)***

Note. ***p<0.001; **p<0.01; *p<0.05. SE = standard error.
Model 1 includes experimental condition as the only independent variable to assess overall, unadjusted associations with responding; Model 2 includes experimental condition and age as independent variables; Model 3 also included the interaction between age and experimental insert group but is not displayed, as no interactions were statistically significant.

Discussion

While prior research suggests that tailored recruitment materials increase responding among different groups in the general population (Cunningham-Erves et al. 2022; Taani et al. 2020; Hughson et al. 2016), our findings suggest this may not be an effective approach to increase response among U.S. Veterans. Although we did not observe a significant increase in responding in the experimental condition for any Veteran group in this study, we did observe a non-significant trend towards a positive effect of tailored imagery, compared to the no insert and/or generic insert conditions, among AANHPI, recently separated, and Hispanic Veterans. Conversely, a statistically significant negative impact of tailored images was observed among Black/African American Veterans, for whom the no-insert control group had a higher yield rate than those who received inserts with tailored imagery. Consequently, the present findings do not support the effectiveness of using tailored imagery in recruitment materials to increase responding to a Veteran suicide prevention survey, though they do suggest that further research is warranted. While the current research pertains to a Veteran population, the lessons learned about the use of tailored recruitment materials for harder-to-survey subgroups can inform practices in other populations, since, as noted above, response patterns among Veterans by race/ethnicity and age mirror those in the general population (Jang and Vorderstrasse 2019; Morgan 2020).

A few possible explanations exist for these findings. First, recruitment materials representing one aspect of respondents’ identity may not positively influence survey response among Veterans. Individuals possess multiple identities, and the experimental manipulation focused on a singular assumed aspect of identity, which may have been inconsistent with their self-identity. It is also possible that the imagery did not highlight the most salient aspect(s) of identity; Veteran status or aspects of military service (e.g., branch) not featured may have produced results consistent with prior research in non-Veteran populations in which representation was associated with higher responding (Cunningham-Erves et al. 2022; Hughson et al. 2016; Taani et al. 2020). It is additionally possible that the imagery on the tailored inserts was better suited for some groups (i.e., AANHPI, recently separated, Hispanic Veterans) than for others (i.e., Black/African American Veterans) and/or that the generic insert influenced responding differentially across groups. Specifically, while the generic insert clearly included recently separated, female, and Black Veterans, other images were more racially and ethnically ambiguous and thus AANHPI and Hispanic Veterans may or may not have identified with the generic insert images. Additional research to identify optimal imagery for survey research with Veterans is warranted.

Another possible explanation is that different groups respond differently to visual representations of themselves in recruitment materials due to differential exposure to their group (e.g., in popular culture). If the novelty of seeing one’s group represented in recruitment materials motivates responding, members of less well-represented groups might respond better to tailored recruitment material than those of better represented groups. AANHPI representation in broader US culture, especially portraying such individuals as multidimensional, lags behind that of Black/African Americans and females (Wickliffe 2020). The lack of visibility among AANHPI individuals is a concern for AANHPI communities and Veterans (Yip et al. 2021; Polzer et al. 2025). Thus, one possibility is that Black/African American Veterans responded differently than AANHPI Veterans because images of AANHPI individuals are less commonly encountered. If supported by subsequent research, this would suggest that group-specific imagery highlighting race/ethnicity is unlikely to result in higher survey participation for some groups. Research is needed to evaluate this hypothesis.

Limitations

While this experiment had a sound design and robust overall sample size, there are limitations. We were only able to use three images on each insert and chosen images may not have been the most effective. Additionally, the AANHPI and recently separated subgroups had smaller sample sizes; post-hoc power analysis indicates this may explain why some larger differences observed for these groups were not statistically significant. Moreover, inserts were not the only material in the recruitment envelope, which also included a letter and, in the second insert mailing, a paper survey; such materials might have distracted respondents from the insert, limiting exposure to the stimulus. Nonetheless, this would be the case for most studies using tailored inserts; further efforts to isolate the insert effect may not optimally convey true effectiveness. Finally, the lack of additional covariates from the sampling frame (e.g., education) limited adjustment for potential confounding.

Conclusions

Given the myriad interpretations of our findings, which run somewhat contrary to existing literature, continued research is needed to understand the most effective recruitment materials for Veteran survey research. Such research could be qualitative (e.g., focus groups to understand the recruitment material characteristics most responsive to individual group needs) or quantitative (e.g., discrete choice analyses to understand the most influential features of recruitment materials). Such future research is critical to ensure that studies with Veterans are representative and generalizable.


Corresponding author contact information

Claire A. Hoffmire, PhD, Rocky Mountain Regional VA Medical Center, Rocky Mountain MIRECC for Suicide Prevention, 1700 North Wheeling St., Aurora, CO 80045. Telephone: 585-301-1442; E-mail: Claire.Hoffmire@va.gov

Acknowledgements

This material is based on work supported by the U.S. Department of Veterans Affairs (VA) Office of Mental Health and Suicide Prevention and the VA Rocky Mountain Mental Illness Research, Education and Clinical Center (MIRECC) for Suicide Prevention. The views expressed are those of the authors and do not necessarily represent the views or policy of the VA or the United States Government. The authors would like to thank all the ASCEND Veterans Engagement Board members for their contributions and input.


  1. The interaction of age and experiment condition was not found to be significant in any of the regression models. For this reason, those models have not been included as they should not be interpreted.

Submitted: February 19, 2025 EDT

Accepted: June 11, 2025 EDT

References

Banducci, Susan A., Todd Donovan, and Jeffrey A. Karp. 2004. “Minority Representation, Empowerment, and Participation.” The Journal of Politics 66 (2): 534–56. https://doi.org/10.1111/j.1468-2508.2004.00163.x.
Cunningham-Erves, Jennifer, Sheila V. Kusnoor, Victoria Villalta-Gil, Sarah C. Stallings, Jabari S. Ichimura, Tiffany L. Israel, Paul A. Harris, and Consuelo H. Wilkins. 2022. “Development and Pilot Implementation of Guidelines for Culturally Tailored Research Recruitment Materials for African Americans and Latinos.” BMC Medical Research Methodology 22 (1). https://doi.org/10.1186/s12874-022-01724-4.
Dillman, Don A., Jolene D. Smyth, and Leah Melani Christian. 2014. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. Hoboken, N.J.: Wiley & Sons. https://doi.org/10.1002/9781394260645.
Gaeddert, Laurel A., Alexandra L. Schneider, Christin N. Miller, Lindsey L. Monteith, Lisa A. Brenner, Jodie Katon, and Claire A. Hoffmire. 2020. “Recruitment of Women Veterans into Suicide Prevention Research: Improving Response Rates with Enhanced Recruitment Materials and Multiple Survey Modalities.” Research in Nursing & Health 43 (5): 538–47. https://doi.org/10.1002/nur.22065.
Groves, Robert M. 2006. “Nonresponse Rates and Nonresponse Bias in Household Surveys.” Public Opinion Quarterly 70 (5): 646–75. https://doi.org/10.1093/poq/nfl033.
Harrington, Kelly M., Xuan-Mai T. Nguyen, Rebecca J. Song, Keri Hannagan, Rachel Quaden, David R. Gagnon, Kelly Cho, et al. 2019. “Gender Differences in Demographic and Health Characteristics of the Million Veteran Program Cohort.” Women’s Health Issues 29 (June): S56-66. https://doi.org/10.1016/j.whi.2019.04.012.
Herring-Nathan, Evan R., Christopher H. Johnson, Lindsey L. Monteith, Alexandra L. Schneider, and Claire A. Hoffmire. 2021. “ASCEND for Veteran Suicide Prevention: Evaluating Sample Design and Response Characteristics of a National Veteran Survey Surveillance System.” In JSM Proceedings. Alexandria, VA: American Statistical Association.
Hoffmire, Claire A., Sean M. Barnes, Ryan Holliday, Julie A. Kittel, Alexandra L. Schneider, Lisa A. Brenner, Jamie L. Tock, and Lindsey L. Monteith. 2024. “Non-Fatal Suicidal Self-Directed Violence Among U.S. Veterans (2022): The Assessing Social and Community Environments With National Data (ASCEND) for Veteran Suicide Prevention Study.” American Journal of Epidemiology, December. https://doi.org/10.1093/aje/kwae461.
Hoffmire, Claire A., Nathan V. Mohatt, Ryan Holliday, Sean M. Barnes, Lisa A. Brenner, and Lindsey L. Monteith. 2022. “ASCEND for Veteran Suicide Prevention: Enhancing Surveillance to Save Lives.” Psychiatry Research 310 (April). https://doi.org/10.1016/j.psychres.2022.114432.
Hughson, Jo-anne, Robyn Woodward-Kron, Anna Parker, John Hajek, Agnese Bresin, Ute Knoch, et al. 2016. “A Review of Approaches to Improve Participation of Culturally and Linguistically Diverse Populations in Clinical Trials.” Trials 17 (1). https://doi.org/10.1186/s13063-016-1384-3.
Jang, Myoungock, and Allison Vorderstrasse. 2019. “Socioeconomic Status and Racial or Ethnic Differences in Participation: Web-Based Survey.” JMIR Research Protocols 8 (4): e11865. https://doi.org/10.2196/11865.
Kolaja, Claire A., Jennifer N. Belding, Satbir K. Boparai, Sheila F. Castañeda, Toni Rose Geronimo-Hara, Teresa M. Powell, Xin M. Tu, Jennifer L. Walstrom, Beverly D. Sheppard, and Rudolph P. Rull. 2023. “Survey Response over 15 Years of Follow-up in the Millennium Cohort Study.” BMC Medical Research Methodology 23 (1). https://doi.org/10.1186/s12874-023-02018-z.
Laganà, Francesco, Guy Elcheroth, Sandra Penic, Brian Kleiner, and Nicole Fasel. 2013. “National Minorities and Their Representation in Social Surveys: Which Practices Make a Difference?” Quality & Quantity 47 (3): 1287–1314. https://doi.org/10.1007/s11135-011-9591-1.
Moorman, Patricia G., Beth Newman, Robert C. Millikan, Chiu-Kit J. Tse, and Dale P. Sandler. 1999. “Participation Rates in a Case-Control Study.” Annals of Epidemiology 9 (3): 188–95. https://doi.org/10.1016/s1047-2797(98)00057-x.
Morgan, S. L. 2020. “Response Rates and Representativeness: A Benchmark Comparison of the General Social Surveys to the American Community Surveys, 2012–2018.” GSS Methodological Report 131. NORC at the University of Chicago. https://gss.norc.org/content/dam/gss/get-documentation/pdf/reports/methodological-reports/MR131%20Response%20Rates.pdf.
Polzer, Evan R., Carly M. Rohs, Christe’An D. Iglesias, Joseph Mignogna, Lauren S. Krishnamurti, Ryan Holliday, and Lindsey L. Monteith. 2025. “‘You Wished the Ground Would Open and Swallow You Up’: Expert Opinions on Shame, the Collective, and Other Cultural Considerations for Suicide Prevention among Asian American and Pacific Islander Veterans.” Injury Epidemiology 12 (1). https://doi.org/10.1186/s40621-025-00560-6.
Shaghaghi, Abdolreza, Raj S. Bhopal, and Aziz Sheikh. 2011. “Approaches to Recruiting ‘Hard-To-Reach’ Populations into Research: A Review of the Literature.” Health Promotion Perspectives 1 (2): 86–94. https://doi.org/10.5681/hpp.2011.009.
Sheldon, Helen, Chris Graham, Nick Pothecary, and Farhat Rasul. 2007. “Increasing Response Rates amongst Black and Minority Ethnic and Seldom Heard Groups.” Europe: Picker Institute.
Sullivan, Patrick F., Christine M. Khosropour, Nicole Luisi, Matthew Amsden, Tom Coggia, Gina M. Wingood, and Ralph J. DiClemente. 2011. “Bias in Online Recruitment and Retention of Racial and Ethnic Minority Men Who Have Sex with Men.” Journal of Medical Internet Research 13 (2): e38. https://doi.org/10.2196/jmir.1797.
Taani, Murad H., Bev Zabler, Michael Fendrich, and Rachel Schiffman. 2020. “Lessons Learned for Recruitment and Retention of Low-Income African Americans.” Contemporary Clinical Trials Communications 17 (March): 100533. https://doi.org/10.1016/j.conctc.2020.100533.
U.S. Department of Veterans Affairs. n.d. “2023 National Veteran Suicide Prevention Annual Report.” Accessed February 11, 2025. https://www.mentalhealth.va.gov/docs/data-sheets/2023/2023-National-Veteran-Suicide-Prevention-Annual-Report-FINAL-508.pdf.
Wendleton, Leah, Claire A. Hoffmire, Stephan Drop, Mark Flower, Randall Jones, John P. Nolan, Camille Parham, and Lindsey L. Monteith. 2023. “Veteran Engagement in Survey Research to Prevent Suicide.” Progress in Community Health Partnerships 17 (1): 129–34. https://doi.org/10.1353/cpr.2023.0017.
Wu, Meng-Jia, Kelly Zhao, and Francisca Fils-Aime. 2022. “Response Rates of Online Surveys in Published Research: A Meta-Analysis.” Computers in Human Behavior Reports 7 (2): 1–11. https://doi.org/10.1016/j.chbr.2022.100206.
Yancey, Antronette K., Alexander N. Ortega, and Shiriki K. Kumanyika. 2006. “Effective Recruitment and Retention of Minority Research Participants.” Annual Review of Public Health 27 (1): 1–28. https://doi.org/10.1146/annurev.publhealth.27.021405.102113.
Yip, Tiffany, Charissa S. L. Cheah, Lisa Kiang, and Gordon C. Nagayama Hall. 2021. “Rendered Invisible: Are Asian Americans a Model or a Marginalized Minority?” American Psychologist 76 (4): 575–81. https://doi.org/10.1037/amp0000857.
