Survey Practice
Vol. 8, Issue 4, 2015. Published August 31, 2015 EDT.

Logos and Inserts can Reduce Survey Return Rates: An Experiment in California

Matt Jans, Royce Park, John Rauch, David Grant, Sherman Edwards
Keywords: address-based sample, mailing design, logo, mail survey
https://doi.org/10.29115/SP-2015-0017
Jans, Matt, Royce Park, John Rauch, David Grant, and Sherman Edwards. 2015. “Logos and Inserts Can Reduce Survey Return Rates: An Experiment in California.” Survey Practice 8 (4). https://doi.org/10.29115/SP-2015-0017.


Abstract

Survey methodologists seek techniques that make the mail they send more likely to be opened, completed, and returned than missed, ignored, or thrown away. Traditional design wisdom recommends against using color and logos, which can make the delivery envelope look like marketing, fundraising, and other “junk mail.” Yet there have been few experimental tests of this assertion. Theoretically, a distinctive graphic representing a recognizable and beloved sponsor, combined with a prominent motivational message, should serve the dual roles of a) distinguishing the survey from marketing, advertising, and fundraising, and b) encouraging the respondent to participate. In this experiment, a random half of envelopes had a sponsor logo on the outside and a motivational insert affixed to the cover letter. The response task was a one-page screener form asking for basic household information and a telephone number for a subsequent telephone survey. The experiment used a random sample of addresses from the USPS Delivery Sequence File (DSF) in two California communities with hard-to-survey characteristics. The logo envelope received about a 1.3 percentage point lower return rate in the first mailing in which it was used, but not in the second mailing. Further, that difference was only significant in one of the two communities (a 2.8 point difference in that community v. 0.7 points in the other). Traditional advice about avoiding logos in mail surveys seems sound, but effects are not universal. Comparing experimental and control conditions at their first use, forms returned from logo/insert packets were less likely to report adults age 41 and over (70.8% v. 78.2% in non-logo/non-insert mailings). In their second use, logo/insert packets were more likely to report at least one adult age 18–40 in the household (76.8% v. 66.9% for non-logo/non-insert mailings). Suggestions for future research are discussed.


Introduction

Survey design guidance recommends using professional yet plain mailing materials to maximize response rates (Dillman 1978; Dillman, Smyth, and Christian 2014; Fowler 2014). Dillman et al. (2014) recommend a “recognized and respected logo with the return address” and to “limit print to standard colors such as black or charcoal gray” (p. 384, Guideline 10.13). These recommendations imply that unknown logos and unusual colors indicate marketing or fundraising to sampled households, rather than a survey in service of the public good.

The few studies testing markings and messaging on envelopes show decreased response (Dykema et al. 2012; Levin 2015). Yet, logos and markings can vary on many dimensions, and their effects may not be universal across populations. Government agency logos may lend status to a survey, but they do not necessarily increase response rates (Edwards et al. 2006). Even printing “Thank you! A cash gift is enclosed” on the envelope may suppress response, suggesting that any markings should be avoided (Dykema et al. 2012).

We hypothesized that a funder logo could work in this survey because the funder was conducting community-wide health promotion programs in the sampled communities. Households not actively participating might still recognize the logo from local advertising. The logo also had an explicit health promotion message (see Figure 1), which we thought might induce proactive behavior and motivate response. A sponsor endorsement was also inserted into the mailing.

Figure 1  Logo and non-logo envelopes used in the screener packet mailing.

Research Question:

“Will the logo and insert make respondents more likely to open, complete, and return the questionnaire? If yes, among whom?”

Methods and Data

We conducted this experiment in an address-based sampling (ABS) pilot test with a mail-to-phone design. A self-administered screener questionnaire was mailed to sampled addresses in two California communities (Jans et al. 2013), which had several hard-to-survey characteristics, including moderate-to-high Spanish use, high rates of renter occupancy and multi-unit dwellings, younger age, a higher proportion of Hispanic/Latino ethnicity, and moderate rates of poverty (see Table 1).

Table 1  Description of sampled communities.

Community characteristic | Boyle Heights (n=4,466) | Southwest Merced and East Merced County (n=3,811)
Location | Los Angeles, CA | California Central Valley
Description | One of the oldest LA neighborhoods, in Central/East LA | Agricultural and small college town with the youngest UC campus
Adults age 18–39 years | 35% | 33%
Hispanic | 92–98% | 60–63%
Speak Spanish at home | 84% | 44%
Families below poverty line | 33% | 30%
Renter-occupied housing units | 78% | 54%

Source: Estimates are from www.HealthyCity.org (reporting Census 2010 data), BHC 2009 health profiles (http://healthpolicy.ucla.edu/chis/bhc/Pages/default.aspx), and 2008 Claritas data based on U.S. Census 2000 urbanized areas.

Mailing Protocol and Screener Questionnaire

A three-mailing protocol was used, testing the logo and insert in the second and third mailings (hereafter called “experimental mailings 1 and 2”). We did not manipulate the first full-protocol mailing, to minimize any negative impact on the final results. The screener questionnaire asked household demographic and health questions and requested phone numbers at which the household could be contacted for an interview (see Appendix). The outgoing envelope, used in full-protocol mailing 1 and as the “control” condition in experimental mailings 1 and 2, was white, 9”×12”, with a UCLA logo in the return address field. A cover letter, a one-page FAQ, a one-page screener form, and a postage-paid return envelope were enclosed, printed in English and Spanish. Mailings were conducted between October 12 and November 29, 2012.

A random half of mailings in experimental mailings 1 and 2 included the sponsor’s logo (Figure 1) on the envelope and a sponsor endorsement insert glue-tacked to the cover letter (Figure 2). The same random half of addresses received the logo and insert at both experimental mailings 1 and 2 (see Table 2). At experimental mailing 2, the entire packet, including the original outgoing envelope, was placed in a Priority Mail envelope for delivery.

Figure 2  Insert glue-tacked to cover letter (Spanish on reverse side).
Table 2  Sample sizes and returns at each mailing.

Screener form mailing | Standard packet (no logo) | Logo/insert | Returns (participation rate)
Mailing 1 (no experiment): cover letter, FAQ, screener form, $2 incentive, business reply envelope | 7,274 | – | 780* (10.7%**)
Reminder postcards sent to all sampled addresses after mailing 1.
Mailing 2 (experimental mailing 1): cover letter, FAQ, screener form, business reply envelope | 3,260 | 3,253 | 425 (6.8%)
Mailing 3 (experimental mailing 2): cover letter, FAQ, and screener form enclosed in a USPS Priority Mail envelope, plus business reply envelope | 2,883 | 2,893 | 342 (6.0%)

*Form returned with some information filled in.

**Excludes cases returned by the post office as not deliverable as addressed (including these cases obtains 11.4 percent, 6.5 percent, and 5.9 percent for mailings 1, 2, and 3, respectively).

Results

We analyzed experimental mailings 1 and 2 separately because the Priority Mail envelope creates a different stimulus for householders. Return rates for the two experimental mailings differed by less than one percentage point (n=83 returns, or 1.3 percent of the addresses in experimental mailing 1). Non-deliverable cases were excluded from these analyses. Analyses were unweighted, but simple random sampling was used within each community.

Overall, logo/insert packets had significantly lower participation (χ2=4.20, p=0.04; see Figure 3) in the first experimental mailing only. The effect appeared only in Merced (χ2=4.21, p=0.04).
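The chi-square tests above compare return rates across the two randomized conditions. As a sketch of the calculation, the following Python code runs a Pearson chi-square test of independence on a 2×2 table of returned vs. not-returned counts. The per-condition return counts below are illustrative assumptions chosen to be consistent with the reported sample sizes (3,260 and 3,253) and the ~1.3-point return rate gap; the article does not publish the exact counts, so the resulting statistic only approximates the reported χ2=4.20.

```python
import math

def chi_square_2x2(table):
    """Pearson chi-square test of independence for a 2x2 table.

    table: [[a, b], [c, d]] of observed counts.
    Returns (chi2, p), where p is the upper-tail probability of a
    chi-square distribution with 1 degree of freedom, computed as
    P(X > x) = erfc(sqrt(x / 2)).
    """
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / grand
            chi2 += (table[i][j] - expected) ** 2 / expected
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Illustrative (assumed) return counts: 234/3,260 for the standard
# packet vs. 191/3,253 for the logo/insert packet, a ~1.3-point gap.
control = [234, 3260 - 234]   # [returned, not returned]
logo = [191, 3253 - 191]
chi2, p = chi_square_2x2([control, logo])
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```

With these assumed counts the test rejects independence at the 0.05 level, mirroring the direction and rough magnitude of the reported result.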

Figure 3  Effect of logo/insert at experimental mailing 1 overall and by community.

At experimental mailing 1, forms from logo/insert packets were marginally less likely to indicate someone age 41 or older in the household (χ2=2.98, p=0.08; see Figure 4). At experimental mailing 2 (i.e., Priority Mail), forms from logo/insert packets were significantly more likely to indicate adults age 18–40 in the household (χ2=4.15, p=0.04). Chi-square tests of association between logo/insert condition and the presence and number of children and teens, number of adults, language in which the form was completed, and preferred language for the call were conducted, but none reached even marginal significance.

Figure 4  Age differences in logo effect by mailing.

Single-family and multi-family dwelling units (identified from sampling frame data) may be differentially affected by the logo/insert manipulation. At experimental mailing 1, single-family dwellings were marginally less likely to return a completed form when the logo was used (5.42 percent for logo/insert v. 6.69 percent for control, χ2=3.46, p=0.063). Multi-family dwellings showed a difference in the same direction, though the effect was non-significant. There were no differences in effects between housing unit types at experimental mailing 2.

Discussion

Findings Summary

We find evidence that a sponsor logo on the outgoing envelope can suppress the return rate in a screener questionnaire mailed to an ABS sample, but only in the first experimental packet mailing, and only in one community. Clearly, logo effects are not universal. We cannot disentangle community, housing unit, and householder effects in this study, but several demographic features distinguish the two communities. Boyle Heights is more urban, more Hispanic, more Spanish-speaking, and has more renters than Merced, each of which is expected to reduce response. Thus, results may be due to community factors more than individual factors. Printing the logo only in English may partly explain why we see results in Merced, and not Boyle Heights. Linguistically-isolated households may not have processed the message of the logo. The low absolute rate of response in Boyle Heights could also lead to a lack of significant effect.

We found that single-family dwellings were marginally less likely to return a completed form when the logo and insert were used, and multi-family dwellings showed a non-significant difference in the same direction. When mail is delivered to a central location, as is common in multi-unit dwellings, there are more opportunities for mis-delivered mail or for someone other than the sampled residence to receive the package, which could contaminate experimental effects.

While we cannot infer about the effect of Priority Mail per se, we can gain some insight from the age difference in response from experimental mailing 1 (First-Class mail) and 2 (Priority Mail). Adults age 41+ were reported less often on forms from logo/insert mailings when the mailing was sent first-class (i.e., experimental mailing 1; logo visible from the outside). Younger adults (age 18–40) were reported more often on forms from logo/insert mailings when mailed via Priority Mail (i.e., experimental mailing 2; logo not visible until Priority Mail envelope opened). This may mean that younger adults are more impacted by Priority Mail, or simply that they respond later. With respect to the logo, older adults may be deterred by the logo (experimental mailing 1) and younger households attracted by it (but only after opening the Priority Mail envelope). We cannot conclusively assess the cause of this age difference in response across mailings because logo/insert (vs. none), Priority Mailing (vs. First-Class), and mailing order were not factorially-manipulated.

Limitations, Design/Analysis Caveats, and Future Directions

Limited Languages: The logo was only in English, so its message (“Health Happens Here”) may not have been understood by linguistically-isolated households. Only English and Spanish were used for the questionnaire and insert. Future studies should translate all messages into all relevant languages.

Results are Few: Most of the associations we tested between demographics and experimental conditions were non-significant.

Confounding of Logo and Insert: Logo and insert effects could not be assessed independently because they were both part of the same experimental condition. Future research should manipulate these factors (and others, like Priority vs. First-Class mailing) independently. Further, our “control” condition had a UCLA logo in the return address field, so it is not a true control condition with respect to the use of logos.

Truncated Sample: The experiment was implemented in the second and third mailing for practical reasons. Stronger or different effects may be seen in the full sample.

Recommendations for Practice

Practice recommendations are simple: do not use logos in general population surveys. Our findings and others’ (Dykema et al. 2012; Levin 2015) strongly suggest that logos and other envelope messages reduce response. A design that appeals to a graphic designer or an organization’s members may carry negative connotations for the general population. While logos do not seem to work for general population surveys, they may work for surveys of organizational membership lists or other situations where the logo has a more direct and personal association.

We are left with the stark reminder that mail response is a “black box” relative to face-to-face or telephone response. We never know what happens to unreturned mailings once they leave our hands. Did sampled households even notice the mailing? Did they notice the envelope and choose to throw it away? Did they open the envelope and then decide not to participate? Sorting and opening mail, like considering a survey participation request, is a dynamic cognitive/perceptual process that likely happens very quickly and subconsciously. Without data on the steps between mailing and response (e.g., receiving, opening, and responding), it is impossible to isolate the mechanism of the logo effect and know how to improve designs. Yet this experience has given us a few insights for practitioners facing similar design decisions.

Know the subjective assessment of the sponsor and logo in the population: Envelope designs will benefit from qualitative or quantitative measures of familiarity with, and opinions of, the logo and sponsor.

Balance sleekness with simplicity: Impressive graphics and inspiring messages may reduce response because they imply that the organization has money, is looking for money, or is something other than a non-profit research operation that truly needs the household’s help. Researchers should seek uniqueness in plainness and try to design mailings that distinguish themselves from junk mail by being plain.

Create design-robust experiments: Practitioners who want to explore envelope visual design should plan experiments with strong manipulations that isolate design features (e.g., logo vs. insert vs. Priority Mailing). When such experiments are impractical in the field, lab-based experiments may be helpful, particularly when they focus on perceptual processes of mail handling.

Consider intensive cognitive and perceptual testing of materials and the mail receipt/decision process: Usability-like tests and cognitive interviews of respondents’ reactions to the envelope design are priceless, since we cannot observe respondents opening their mail in the field. Behavioral debriefings are also possible (e.g., “Why did you choose to open the envelope with the TCE logo?”; see Jans and Cosenza 2002) but suffer from the general limitations of retrospective reporting (Willis 2005). Eye tracking studies also hold promise for understanding how people perceive envelope design elements (Romano Bergstrom and Schall 2014).

Experiment with other uses of color, markings, and images that could visually draw the eye to the mailing but not suggest marketing or fundraising: For example, a plain mailing with a brightly-colored mark on it could draw the eye to the envelope while not creating the feel of marketing or implying anything about the sponsor. Local images seem to positively affect response (Smyth et al. 2010). Size, shape, weight, material, and other perceptual dimensions of the mailing’s “look and feel” should also be further tested, following recommendations of Dillman et al. (2014).

Acknowledgements

The authors thank The California Endowment for support of this study and continued support of the California Health Interview Survey. At Westat, Matt Regan and Howard King assisted with the data collection, and Ismael Flores-Cervantes conducted the sampling.

Appendix

SP-Vol8_Jans_fig5.jpg

References

Dillman, D.A. 1978. Mail and Telephone Surveys: The Total Design Method. New York: John Wiley & Sons.
Dillman, D.A., J.D. Smyth, and L.M. Christian. 2014. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. 4th ed. Hoboken, NJ: Wiley.
Dykema, J., K. Cyffka, K. Jaques, R. Ganci, K. Elver, and J. Stevenson. 2012. “SHOW Me the Money?: Effects of Preincentives, Differential Incentives, and Envelope Messaging in an ABS Mail Survey.” In Proceedings of the Survey Research Methods Section, American Statistical Association. Orlando, FL. http://www.amstat.org/sections/srms/proceedings/y2012/files/400221_500635.pdf.
Edwards, S., M. Brick, J. Kurata, and D. Grant. 2006. “Effects on Response Rates of Multiple Sponsors on Advance Letters for an RDD Survey.” Presented at the American Association for Public Opinion Research Annual Conference, Montreal, Canada.
Fowler, F.J. 2014. Survey Research Methods. 5th ed. Thousand Oaks, CA: Sage.
Jans, M., and C. Cosenza. 2002. “Respondent Debriefings of Mail Surveys: Real People in the Real World.” In Proceedings of the Survey Research Methods Section, American Statistical Association. St. Pete Beach, FL. https://www.amstat.org/sections/srms/Proceedings/y2002/Files/JSM2002-000859.pdf.
Jans, M., D. Grant, A. Lee, R. Park, S. Edwards, J. Rauch, and I. Flores-Cervantes. 2013. “Address-Based Sampling (ABS) as an ‘Alternative’ to RDD: A Test in California.” In Proceedings of the American Statistical Association, Section on Survey Research Methods. Boston, MA. https://www.amstat.org/sections/SRMS/Proceedings/y2013/Files/400278_500765.pdf.
Levin, K.Y. 2015. “Formal or Friendly: Does Messaging Style Impact Survey Response?” Presented at the American Association for Public Opinion Research Annual Conference, Hollywood, FL.
Romano Bergstrom, J., and A. Schall, eds. 2014. Eye Tracking in User Experience Design. 1st ed. Amsterdam: Morgan Kaufmann.
Smyth, J.D., D.A. Dillman, L.M. Christian, and A.C. O’Neill. 2010. “Using the Internet to Survey Small Towns and Communities: Limitations and Possibilities in the Early 21st Century.” American Behavioral Scientist 53 (9): 1423–48. https://doi.org/10.1177/0002764210361695.
Willis, G.B. 2005. Cognitive Interviewing: A Tool for Improving Questionnaire Design. Thousand Oaks, CA: Sage Publications.
