Problem Description
Common concerns such as survey fatigue, response burden, and survey experience are influenced by, among other factors, the frequency of survey invitations, the source and perceived purpose of the survey, and the quality of survey design (Dillman, Smyth, and Christian 2014). However, there is no established method to empirically measure the frequency and characteristics of the surveys people are invited to participate in. Moreover, the ongoing development and diversification of survey techniques call into question any fixed definition of what constitutes a survey, particularly from the perspective of potential participants and their experiences with its variants. To address these issues, we developed and tested a web-based diary method that survey researchers and practitioners can use to assess survey invitation frequency and relate it to factors associated with survey fatigue, burden, and experience. Below, we describe the rationale for a diary method to assess those factors and outline the procedure and instrument used to test its effectiveness at providing empirical insights.
Potential Solution
A diary method is an event sampling technique. Event sampling asks participants to monitor and record daily or event-contingent behavior, thoughts, feelings, and experiences in response to open-ended and/or close-ended questions, variables, and formats defined by the researcher. In this sense, a survey invitation is an event that elicits certain behaviors, thoughts, and experiences. The term “diary” refers to a self-report instrument administered repeatedly at set intervals to empirically document the individual’s record of events (Bolger, Davis, and Rafaeli 2003). Compared to traditional self-administered survey questionnaires, a diary method provides additional structure, i.e., data collection at set and repeated time intervals. For event-contingent phenomena such as survey invitations, a diary method provides recency, salience, distinction, and presence advantages. Those advantages tend to diminish with a traditional questionnaire: asking individuals to recall, for instance, the number and characteristics of each survey invitation received in the past month demands greater retrospection and increases recall bias (Nezlek 2012).
Pilot Study
To explore the effectiveness of a diary method, we piloted a one-month electronic diary study in 2019 and 2023 with a convenience sample recruited from members of WERA-1010, a U.S. Department of Agriculture Multistate Research Coordinating Committee (n=30). Consistent with our primary objective to demonstrate the effectiveness of a diary method, we assessed (a) survey invitation frequency, (b) invitation and questionnaire characteristics, and (c) relevant factors related to nonresponse. One week prior to the data collection period, participants were emailed a prenotification and information sheet describing the purpose of the study and participation details. Data collection intervals (i.e., diary entries) were based on event-contingent responding, which asked participants to record a diary entry whenever a predefined event occurred, i.e., when they were solicited to participate in a survey effort. Additionally, participants were sent a weekly reminder to document survey invitations received. Participation was voluntary and no incentives were used; the human-subjects research protocol was approved by the University of Idaho (#23-004).
Diary Instrument
To track survey invitation events and characteristics, we developed an electronic spreadsheet-based diary instrument of sequenced close- and open-ended questions (available at https://bit.ly/SrvyDiary). Participants were provided access to an individual web-based Google Sheet diary instrument (or offline Microsoft Excel file, if requested). The instrument consisted of the following two worksheet tabs: (1) study description and instructions and (2) diary instrument. As part of the study description and instructions, participants were provided the following definition: “By survey invitation, we mean communications to you—by email, postal mail, telephone, in-person, etc.—that request you to provide feedback, opinions, or other forms of information on a questionnaire (i.e., a single or multiple question form).” The diary instrument consisted of two sections based on whether the survey solicitation was a first/initial invitation or a reminder/follow-up. The first section included the following diary entry prompts: (a) invite/reminder, (b) title/topic, (c) date, (d) sponsor, (e) contact mode, and (f) response mode. The second section included the following diary entry prompts for only a first/initial invitation: (g) perceived purpose, (h) perceived impact on you, (i) perceived impact on society, (j) completion, and (k) experience (Table 1).
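To make the instrument's structure concrete, the diary entry fields described above can be sketched as a simple record type. This is an illustrative rendering only: the field names, value examples, and defaults below are our assumptions based on the prompts listed, not the exact layout of the spreadsheet instrument.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DiaryEntry:
    # Section 1: recorded for every solicitation (initial invitation or reminder)
    entry_type: str      # "initial invitation" or "reminder/follow-up"
    title_topic: str     # title or topic of the survey
    date: str            # date the solicitation was received, e.g., "2023-05-01"
    sponsor: str         # e.g., "marketing", "academic", "membership", "employer"
    contact_mode: str    # e.g., "email", "postal mail", "telephone", "in-person"
    response_mode: str   # e.g., "web", "paper", "telephone"
    # Section 2: recorded only for an initial invitation
    perceived_purpose: Optional[str] = None
    impact_on_you: Optional[str] = None      # e.g., "negative", "neutral", "positive"
    impact_on_society: Optional[str] = None
    completed: Optional[bool] = None
    experience: Optional[str] = None

# Example entry for a hypothetical marketing survey invitation
entry = DiaryEntry("initial invitation", "Customer satisfaction", "2023-05-01",
                   "marketing", "email", "web", completed=False)
```

A structured record like this also suggests how the spreadsheet could later be exported and analyzed programmatically, with Section 2 fields left empty for reminder entries.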
Effectiveness of the Solution
Analyses examined descriptive statistics and correlations. No participant attrition occurred, and the 30 participants received, on average, 10 invitations per month but completed only 26% of them. Marketing surveys (49%), the email contact mode (72%), and the web-based response mode (89%) were the most common. Marketing surveys had the lowest completion rate, while academic (46%), membership (44%), and employer (40%) surveys had the highest. Additional results indicate that:
- 73% of participants perceived surveys to have a neutral impact on society and 26% a positive impact.
- 79% of participants perceived surveys to have a neutral personal impact and 18.5% a positive personal impact.
- 82% of participants agreed that survey design had the participants’ experience in mind.
- A negative, though statistically nonsignificant, correlation was observed between the number of survey invitations and survey completion (τ = -0.29, t = -1.5, p = 0.12).
- A trend toward higher completion rates was observed when perceived impact on society or self was rated more positively, irrespective of survey sponsor.
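As a hedged illustration of the correlation analysis reported above, the sketch below computes Kendall's τ (the tau-a variant, without tie correction) between per-participant invitation counts and completion rates. The data are invented for demonstration and do not reproduce the study's data or its reported τ value.

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's tau-a: (concordant - discordant) / total pairs."""
    assert len(x) == len(y) and len(x) > 1
    concordant = discordant = 0
    for i, j in combinations(range(len(x)), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1    # pair moves in the same direction
        elif s < 0:
            discordant += 1    # pair moves in opposite directions
    n_pairs = len(x) * (len(x) - 1) // 2
    return (concordant - discordant) / n_pairs

# Hypothetical data: monthly invitation counts and completion rates
invitations = [4, 6, 8, 10, 12, 15]
completion_rate = [0.50, 0.40, 0.35, 0.30, 0.33, 0.20]
tau = kendall_tau(invitations, completion_rate)  # negative: more invitations, lower completion
```

In practice one would likely use a statistical package (e.g., the tau-b variant, which corrects for ties) and a significance test; this sketch only shows the mechanics of the coefficient.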
Our initial work to develop a diary method to evaluate survey invitation frequency and the characteristics of participants’ experience yielded useful insights. The rationale and methodology of a diary method seem to contribute effectively to our knowledge of survey invitation frequency, survey characteristics, and participants’ perceptions of surveys relevant to contemporary survey-commons and nonresponse issues. Given that diary studies are commonly used in medicine, public health, and user-experience research, we expect the methodology to be scalable in the context of survey research (Bolger, Davis, and Rafaeli 2003; Nezlek 2012).
That is, we expect the general rationale and instrument to be adaptable to various contexts; our spreadsheet approach, for example, could be programmed as a web-based computer-assisted self-interviewing instrument on an online survey platform (e.g., Qualtrics) or converted to a paper-based instrument (e.g., if a target population has limited experience with spreadsheets or limited internet or computer access). Likewise, we anticipate the methodology is scalable to larger populations. However, we expect common study limitations and data-quality issues associated with large or “general” populations to emerge, such as increased rates of participant attrition and incomplete data, or the need to include incentives, additional reminders, and data monitoring. Potential next steps include testing probability sampling of a defined population or survey panel to further demonstrate efficacy among various populations and larger sample sizes. In conclusion, we encourage other survey practitioners interested in evaluating survey invitations and their characteristics (and their relation to survey burden and response decisions) to consider this research design and method in the future.
Acknowledgements
We thank all participants and members of the USDA WERA-1010 committee; and special thanks to the late Don A. Dillman for facilitating our intellectual curiosity into the topic. We also thank the editor and reviewer for their professionalism and valuable comments that improved the quality of the manuscript.
Lead author contact information
Kenneth E. Wallen, Ph.D., Department of Natural Resources and Society, University of Idaho, 875 Perimeter Dr., Moscow, ID 83844, USA
Email: kwallen@uidaho.edu