Survey Practice

In-Brief Notes
April 28, 2025 EDT

Diary method to assess survey invitation frequency and survey experience

Kenneth E. Wallen, Ph.D., Abbey E. Hammell, M.S., Katherine E. Dentzman, Ph.D.
Keywords: Survey fatigue, Response burden, Survey climate, Survey commons
https://doi.org/10.29115/SP-2025-0006
Wallen, Kenneth E., Abbey E. Hammell, and Katherine E. Dentzman. 2025. “Diary Method to Assess Survey Invitation Frequency and Survey Experience.” Survey Practice 18 (April). https://doi.org/10.29115/SP-2025-0006.

Abstract

The survey commons is a shared resource, but our collective practices affect survey fatigue, response burden, and survey experience. These issues are influenced by survey invitation frequency, survey source and purpose, and survey design quality, yet few methods have been established to assess these factors together. We therefore developed and tested a web-based diary method to explore them collectively. Participants reported on the survey invitations they received over a one-month period, including the number of invitations, the contact and response modes, sponsor, topic (purpose), completion, and the perceived impact of the survey on society and on the participant. These exploratory results demonstrate a scalable methodology for evaluating survey invitation frequency, the characteristics of those invitations, and participants’ perceptions of surveys, all relevant to the contemporary survey commons and to nonresponse issues.

Problem Description

Common concerns such as survey fatigue, response burden, and survey experience are influenced by, among other factors, the frequency of survey invitations, the source and perceived purpose of the survey, and the quality of survey design (Dillman, Smyth, and Christian 2014). However, there is no established method to empirically measure the frequency and characteristics of the surveys people are invited to participate in. Moreover, the development and diversification of survey techniques call into question any fixed definition of what constitutes a survey, particularly from the perspective of potential participants and their experiences with its variants. To address these issues, we developed and tested a web-based diary method as a potentially effective way for survey researchers and practitioners to assess survey invitation frequency and relate it to factors associated with survey fatigue, burden, and experience. Below, we describe the rationale for a diary method to assess those factors and outline the procedure and instrument used to test its effectiveness at providing empirical insights.

Potential Solution

A diary method is an event sampling technique. Event sampling asks participants to monitor and record daily or event-contingent behavior, thoughts, feelings, and experiences in response to open-ended and/or close-ended questions, variables, and formats defined by the researcher. In this sense, a survey invitation is an event that elicits certain behaviors, thoughts, and experiences. The term “diary” refers to a self-report instrument administered repeatedly at set intervals to empirically document the individual’s record of events (Bolger, Davis, and Rafaeli 2003). Compared to traditional self-administered survey questionnaires, a diary method provides additional structure, i.e., data collection at set and repeated time intervals. For event-contingent phenomena such as survey invitations, a diary method provides recency, salience, distinction, and presence advantages. Those advantages tend to be diminished with a traditional questionnaire: asking individuals to recall the number and characteristics of each survey invitation received in the past month, for instance, increases retrospection and recall bias (Nezlek 2012).

Pilot Study

To explore the effectiveness of a diary method, we piloted a one-month electronic diary study in 2019 and 2023 with a convenience sample recruited from members of WERA-1010, a U.S. Department of Agriculture Multistate Research Coordinating Committee (n=30). Corresponding with our primary objective to demonstrate the effectiveness of a diary method, we assessed (a) survey invitation frequency, (b) invitation and questionnaire characteristics, and (c) relevant factors related to nonresponse. One week prior to the data collection period, participants were emailed a prenotification and information sheet describing the purpose of the study and participation details. Data collection intervals (i.e., diary entries) were event-contingent: participants recorded a diary entry whenever a predefined event occurred, i.e., whenever they were solicited to participate in a survey. Additionally, participants were sent a weekly reminder to document survey invitations received. Participation was voluntary and no incentives were used; human-subjects research was approved by the University of Idaho (#23-004).

Diary Instrument

To track survey invitation events and characteristics, we developed an electronic spreadsheet-based diary instrument of sequenced close- and open-ended questions (available at https://bit.ly/SrvyDiary). Participants were provided access to an individual web-based Google Sheet diary instrument (or offline Microsoft Excel file, if requested). The instrument consisted of the following two worksheet tabs: (1) study description and instructions and (2) diary instrument. As part of the study description and instructions, participants were provided the following definition: “By survey invitation, we mean communications to you—by email, postal mail, telephone, in-person, etc.—that request you to provide feedback, opinions, or other forms of information on a questionnaire (i.e., a single or multiple question form).” The diary instrument consisted of two sections based on whether the survey solicitation was a first/initial invitation or a reminder/follow-up. The first section included the following diary entry prompts: (a) invite/reminder, (b) title/topic, (c) date, (d) sponsor, (e) contact mode, and (f) response mode. The second section included the following diary entry prompts for only a first/initial invitation: (g) perceived purpose, (h) perceived impact on you, (i) perceived impact on society, (j) completion, and (k) experience (Table 1).

Table 1. Diary instrument variables, prompts, and response formats.

Variable | Prompt | Response format
Invite or Reminder | Is this invitation an initial request or a reminder for a previous survey invitation? | Close-ended: First Invite (1), Reminder (Follow-Up Invite) (2)
Survey Title/Topic | Please provide the survey title or general topic indicated by the survey invitation that you received. | Open-ended response
Date of Survey Invite | What day did you receive this survey invitation? | Close-ended: YYYY-MM-DD
Survey Sponsor | Who sent you this survey invitation? | Close-ended: My employer (1), My school/university (of which I am a student) (2), An organization in my government (local/state/national/etc.) (3), A political candidate (4), A professional society or organization I am a member of (5), A company I was a customer of (6), A scientist or academic researcher (7), Other (8), Unsure (9)
Contact Mode | By what mode(s) were you contacted with an invitation to take this survey? | Close-ended: Email (1), Internet (2), Mail (3), Phone (4), Text (SMS) (5), In-person (6), Receipt (7), Kiosk (8), Other (9)
Response Mode | By what mode(s) were you asked to respond to this survey? |
Perceived Survey Purpose | What do you think the purpose of this survey is? | Open-ended response
Impact on You | Overall, how much of a positive or negative impact do you expect this survey to have on your life? | Close-ended: Very Negative (1), Somewhat Negative (2), Neutral (3), Somewhat Positive (4), Very Positive (5)
Impact on Society | Overall, how much of a positive or negative impact do you expect this survey to have on society? |
Completion | Which best describes your response to this survey? | Close-ended: I did not respond to the survey (1), I filled out part of the survey but dropped out part way through (2), I completed this survey (3)
Respondent Experience | To what extent do you agree or disagree that the survey creators had your (the respondent's) experience in mind when they designed this survey, i.e., that it was a user-friendly design? (If you did not respond to the survey, please skip this question) | Close-ended: Strongly disagree (1), Somewhat disagree (2), Somewhat agree (3), Strongly agree (4)
Notes | Please provide any additional information that you feel is pertinent or that is not captured by the column categories. | Open-ended response
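For practitioners adapting the instrument programmatically (e.g., when porting it from a spreadsheet to a survey platform), one diary entry can be modeled as a simple record mirroring the Table 1 variables. The sketch below is our illustration only; the field names and integer codes are assumptions that follow the close-ended response formats above, not part of the distributed instrument:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical encoding of one diary entry, mirroring the Table 1 variables.
# Integer codes follow the close-ended response formats in the table;
# None marks second-section items that apply only to first invitations.
@dataclass
class DiaryEntry:
    is_reminder: bool                      # First Invite (False) or Reminder (True)
    title_topic: str                       # open-ended survey title or topic
    invite_date: date                      # date the invitation was received
    sponsor: int                           # 1-9 per the Survey Sponsor codes
    contact_mode: int                      # 1-9 per the Contact Mode codes
    response_mode: int                     # mode the survey asked respondents to use
    purpose: Optional[str] = None          # first invitations only
    impact_self: Optional[int] = None      # 1 (Very Negative) to 5 (Very Positive)
    impact_society: Optional[int] = None   # same 1-5 scale
    completion: Optional[int] = None       # 1 no response, 2 partial, 3 complete
    experience: Optional[int] = None       # 1-4 agreement; skipped if no response
    notes: str = ""

# Example: a first invitation from an employer (sponsor 1) by email (contact 1),
# completed, with neutral perceived impacts.
entry = DiaryEntry(
    is_reminder=False,
    title_topic="Annual workplace climate survey",
    invite_date=date(2023, 3, 14),
    sponsor=1,
    contact_mode=1,
    response_mode=1,
    purpose="Gauge employee satisfaction",
    impact_self=3,
    impact_society=3,
    completion=3,
    experience=3,
)
```

A structure like this makes the event-contingent records straightforward to aggregate, e.g., counting invitations per participant or tabulating completion by sponsor.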

Effectiveness of the Solution

Analyses examined descriptive statistics and correlations. No participant attrition occurred, and the 30 participants received, on average, 10 invitations per month but only completed 26% of invitations. Marketing surveys (49%), email contact mode (72%), and web-based response mode (89%) were the most common. Marketing surveys had the lowest completion rate while academic (46%), membership (44%), and employer (40%) surveys had the highest. Additional results indicate that:

  • 73% of participants perceived surveys to have a neutral impact on society and 26% a positive impact.

  • 79% of participants perceived surveys to have a neutral personal impact and 18.5% a positive personal impact.

  • 82% of participants agreed that survey design had the participants’ experience in mind.

  • A negative, though not statistically significant, correlation was observed between survey invitations and survey completion (τ = -0.29, t = -1.5, p = 0.12).

  • A trend of a higher completion rate was observed if perceived impact on society or self was rated more positively, irrespective of survey sponsor.
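The invitation/completion correlation reported above is a Kendall rank statistic; a minimal pure-Python sketch of Kendall's tau-a is shown below. The per-participant numbers are fabricated for illustration and are not the study's data:

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's tau-a: (concordant - discordant pairs) / total pairs."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        s = (x1 - x2) * (y1 - y2)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n = len(x)
    return (concordant - discordant) / (n * (n - 1) / 2)

# Hypothetical per-participant data: invitations received in the month,
# and the share of those invitations the participant completed.
invitations = [4, 6, 8, 10, 12, 15, 18, 22]
completion_rate = [0.50, 0.33, 0.38, 0.30, 0.25, 0.20, 0.22, 0.14]

tau = kendall_tau(invitations, completion_rate)
print(f"tau = {tau:.2f}")  # negative: more invitations, lower completion
```

In practice a library routine (e.g., scipy.stats.kendalltau, which also returns a p-value and handles ties via tau-b) would be used rather than this hand-rolled version.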

Our initial work to develop a diary method to evaluate survey invitation frequency and the characteristics of participants’ experience yielded useful insights. The rationale and methodology of a diary method appear to contribute effectively to our knowledge of survey invitation frequency, survey characteristics, and participants’ perceptions of surveys relevant to contemporary survey commons and nonresponse issues. Given that diary studies are commonly used in medicine, public health, and user experience fields, we expect the methodology to be scalable in the context of survey research (Bolger, Davis, and Rafaeli 2003; Nezlek 2012).

That is, we expect the general rationale and instrument to be adaptable to various contexts; our spreadsheet approach can likewise be programmed into a web-based computer-assisted self-interviewing platform (e.g., Qualtrics) or adapted as a paper-based instrument (e.g., if a target population has limited experience with spreadsheets or limited internet or computer access). We also anticipate the methodology is scalable to larger populations. However, we expect common study limitations and data quality issues associated with large or “general” populations to emerge, such as increased participant attrition and incomplete data, or the need for incentives, additional reminders, and data monitoring. Potential next steps include testing probability sampling of a defined population or survey panel to further demonstrate efficacy among various populations and larger sample sizes. In conclusion, we encourage other survey practitioners with an interest in evaluating survey invitations and their characteristics (and their relation to survey burden and response decisions) to consider this research design and method.


Acknowledgements

We thank all participants and members of the USDA WERA-1010 committee; and special thanks to the late Don A. Dillman for facilitating our intellectual curiosity into the topic. We also thank the editor and reviewer for their professionalism and valuable comments that improved the quality of the manuscript.

Lead author contact information

Kenneth E. Wallen, Ph.D., Department of Natural Resources and Society, University of Idaho, 875 Perimeter Dr., Moscow, ID 83844, USA

Email: kwallen@uidaho.edu

Submitted: February 03, 2025 EDT

Accepted: March 27, 2025 EDT

References

Bolger, Niall, Angelina Davis, and Eshkol Rafaeli. 2003. “Diary Methods: Capturing Life as It Is Lived.” Annual Review of Psychology 54:579–616.
Dillman, Don A., Jolene D. Smyth, and Leah M. Christian. 2014. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. Hoboken: John Wiley & Sons. https://doi.org/10.1002/9781394260645.
Nezlek, John B. 2012. Diary Methods for Social and Personality Psychology. London: Sage. https://doi.org/10.4135/9781446287903.
